The PyDSTool project has grown rapidly from simple prototypes at the beginning of 2005. The package remains at the Beta stage of development partly because it contains so much legacy code from its prototype stage, and because many parts of it have not been rigorously tested. Technically the package is currently "research code" and should not yet be treated with expectations typical of a distributable package. We are working to change this situation through NSF grant support.
As research code, our approach so far has been to get the code working for the applications we have been writing it for, rather than to make it optimally efficient. You are likely to find bugs and limitations when using the package; please report them to us so that we can address them. If you look at the code you may well find some "interesting" implementations of the algorithms, which may not be the cleanest, most efficient, or most Pythonic approach to have taken. We address these issues as we come across them, and as our needs or conscience force us to.
PyDSTool is compatible with Python 2.7, SciPy 0.9, NumPy 1.4, and Matplotlib 0.99, as well as earlier versions of those libraries. Version 0.83.3 of PyDSTool is the last to be compatible with "old SciPy" (v0.3.2) and will no longer be maintained. All versions will continue to be available at the SourceForge repository.
The latest version available is 0.88. The current, stable sub-release can be found at SourceForge on the project files page. Release notes are now provided at SourceForge too:
It is recommended that you delete any prior version before unzipping a new one. There is no installer for PyDSTool at this time (one is pending release in Spring 2013): simply unzip the files and follow the steps on the GettingStarted page.
The latest development version can always be obtained by cloning the master branch of the GitHub repository; the master branch is intended to remain stable at all times, and is likely to contain many minor fixes and improvements over the latest SourceForge release. The GitHub page also offers a ZIP download of the master branch, from which you can follow the instructions shown there.
You can see the most recent changes online at the GitHub page, where you can also download individual files if necessary.
An up-to-date release will shortly be available in the NeuroDebian distribution, now that a proper (Linux) installer has been prepared. An older PyDSTool release is actively maintained in the Arch Linux distribution by independent developers.
v 0.88, 11 Dec 2009
New numeric_to_traj and pointset_to_traj utility functions
Major improvements to intelligent expr2func (symbolic -> python function conversion)
Cleanup of global imports, especially: entire numpy.random and linalg namespaces no longer imported by default
Added support for 'min' and 'max' keywords in functional specifications (for ODE right-hand sides, for instance)
Optimization tools from third-party genericOpt (included with permission) - improved parameter estimation examples
Numerical phase-response calculations now possible in PRC toolbox (see ToolboxDocumentation)
Fully-fledged DSSRT toolbox for neural modeling (see ToolboxDocumentation)
New tests and demonstrations in PyDSTool/tests
Improved compatibility with cross-platform use and with recent python versions and associated libraries
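As an aside on the 'min'/'max' item above: the value of allowing these in a right-hand side can be sketched in plain Python. This is a conceptual illustration using a hand-rolled Euler loop, not PyDSTool's string-based specification syntax.

```python
import math

# Conceptual sketch only -- not PyDSTool's API.  An ODE right-hand
# side that uses max() to rectify its forcing term, of the kind that
# 'max' in a functional specification now allows:
#     dx/dt = max(0, sin(t)) - 0.5*x
def rhs(t, x):
    drive = max(0.0, math.sin(t))  # clipped (non-negative) forcing
    return drive - 0.5 * x

def euler(x0, t0, t1, dt=0.001):
    """Crude forward-Euler integration of rhs from t0 to t1."""
    t, x = t0, x0
    while t < t1:
        x += dt * rhs(t, x)
        t += dt
    return x

x_end = euler(0.0, 0.0, 10.0)  # stays within [0, 2] by construction
```

In PyDSTool itself one would instead write the `max(...)` call directly inside the RHS string of the variable specification.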
v 0.87, 13 Nov 2008
ModelInterface class and new-style Models: a major restructuring of
how Models are organized and hierarchically embedded in each other
Feature and condition classes used with ModelInterface ease the specification
of hybrid models and of complex parameter estimation and model inference problems
Support for 2nd-order interpolated polynomial representation of
Trajectory objects (using Anne Archibald's poly_int classes). These are off by default as they take a large amount of time to compute; see tests/poly_interp_test.py. (Auxiliary variables cannot be included and remain linearly interpolated, as their derivatives are not generally available.)
Descriptors, Transform classes for automated model building and
manipulation of specifications
MProject sub-package for managing model estimation/inference
All generators now return trajectories that continue only until
any user-defined state bounds are reached. These bounds are not located with the same accuracy as terminal events; defining terminal events to trap bounds errors (e.g., through ModelConstructor options) is the recommended way to implement this.
Default interval checking is now off (use 'checklevel' option 1
or 2 to restore consistency checking with warnings), which improves the speed of ODE solving
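The bounds-truncation behaviour described above can be sketched as follows; this is a conceptual forward-Euler loop, not PyDSTool's implementation. Note that the exit point is only resolved to one time step, which is exactly why terminal events are recommended for accurate bound detection.

```python
# Conceptual sketch (not PyDSTool code): integrate until the state
# leaves its user-defined domain, then truncate the trajectory.
def integrate_until_bounds(rhs, x0, t0, dt, bounds, t_max):
    lo, hi = bounds
    t, x = t0, x0
    traj = [(t, x)]
    while t < t_max:
        x += dt * rhs(t, x)
        t += dt
        traj.append((t, x))
        if not (lo <= x <= hi):
            break  # state bound reached: stop, keeping the exit point
    return traj

# Exponential growth dx/dt = x leaves the domain [0, 2] near t = ln 2,
# well before the requested end time t_max = 1.
traj = integrate_until_bounds(lambda t, x: x, 1.0, 0.0, 0.01, (0.0, 2.0), 1.0)
```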
v 0.86, 01 June 2008
Now compatible with Python 2.5 and Numpy 1.0.4 / Scipy 0.6.0
Decreased overhead for simulating hybrid models
Improved efficiency of VODE Generator in computing trajectories
Interval class now supports discrete valued intervals
Improved diagnostic reporting structure in Generator and Model classes
Inclusion of intuitive arithmetic operations for Point and Pointset classes
Various bug fixes and other API tidying
v 0.85, 11 July 2007
PyCont can now perform continuation according to the zeros of user-defined
functions or bounds (see the examples tests/PyCont_vanDerPol.py and PyCont_Hopfield.py)
Support for scipy.special functions in Generator definitions (Python targets
only; C targets, with a more limited range of functions, will be supported soon)
A few phase plane analysis tools have been added to Toolbox/phaseplane.py
-- in particular, the finding of fixed points, nullclines, and some simple code to compute saddle manifolds in planar systems
Phase response curves can now be calculated for limit cycles using the adjoint
method (see tests/ML_adjointPRC.py and HH_adjointPRC.py for examples)
Generators now include auxiliary variable data in recorded event data
Pointset comparisons using arithmetic operators now consistent with the
norm-based comparisons used by Points
Time mesh points for external inputs to integrators now forced to be integration
mesh points for greater numerical accuracy and stability
VODE now supports 'use_special' for forcing values of output trajectory mesh
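The fixed-point finding mentioned for Toolbox/phaseplane.py can be illustrated with a self-contained 2D Newton iteration. This is an illustrative sketch, not the toolbox's actual API, using a finite-difference Jacobian.

```python
# Illustrative sketch only -- not the Toolbox/phaseplane.py API.
# Find a fixed point of a planar vector field f(x, y) -> (dx, dy)
# by Newton iteration with a finite-difference Jacobian.
def newton_fixed_point(f, xy, steps=50, h=1e-6):
    x, y = xy
    for _ in range(steps):
        fx, fy = f(x, y)
        # finite-difference Jacobian entries of the vector field
        a = (f(x + h, y)[0] - fx) / h
        b = (f(x, y + h)[0] - fx) / h
        c = (f(x + h, y)[1] - fy) / h
        d = (f(x, y + h)[1] - fy) / h
        det = a * d - b * c
        # Newton step: solve J * delta = -F by Cramer's rule
        x -= (d * fx - b * fy) / det
        y -= (a * fy - c * fx) / det
    return x, y

# Damped linear field with its only fixed point at the origin
fp = newton_fixed_point(lambda x, y: (y, -x - 0.5 * y), (0.3, -0.2))
```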
Patch 1, 13 Mar 2007
Unzip outside PyDSTool directory to overwrite necessary files.
v 0.84, 20 Feb 2007
This is primarily a release to bring PyDSTool up to date with numpy 1.0, "new SciPy" and Python versions later than 2.4. Major code improvements are still in development and will be released in a 0.90 version as soon as we can.
Compatible with Scipy 0.5.1 and Python 2.4 onwards.
64-bit CPU compatible.
New ModelSpec building functionality, in particular "standard events"
for threshold crossings and turning points in every variable are automatically generated (but detection turned off by default).
Improved symbolic evaluation behavior (but in this pre-0.90 version
we have switched off some of the advanced simplification code, as it contains a bug we have not yet traced).
PyCont functionality improved and interface cleaned up.
ExtrapolateTable Generator class added
Various other minor fixes and improvements.
v 0.83.3, 20 Sep 2006
Point and Pointset behaviour modified to make access to values or arrays
of values (respectively) easier. Now, a Point pt['x'] is not the same as pt[['x']]. The former now returns the numeric value, the latter a new Point containing only the coordinate 'x'. Behaviour with a Pointset pts is altered similarly, except pts['x'] would be an array of numeric values.
Pointsets can now be initialized with independent variable arrays in
reverse order (e.g., from a backwards integration). The initializer method now automatically detects the reverse order and reverses both the independent and dependent variable arrays before creating the Pointset.
Updated FuncSpec and Generators to allow specs to be provided by lists
of symbolic quantities.
Provided better support for bounds events in C and Python specifications.
See tests/HH_model_testbounds.py and HH_model_Cintegrator_testbounds.py
Provided helper functions to automatically create turning-point and zero-crossing
events for ODE right-hand sides and other auxiliary functions from ModelSpec definitions.
Fixed Symbolic.py to properly process string literals.
Fixed bugs in defining Jacobian with respect to parameters.
Other bug fixes for ModelConstructor, ModelSpec, Symbolic.
Fixed bugs in Model/Trajectory sample methods to properly use tlo and thi
limits with precise=False option and eventdata option turned off.
Added data analysis toolbox for use in data-driven dynamical systems
modelling. Moved some functions over from fracdim.py toolbox.
Fixed bug in using global time with hybrid systems that have external
inputs.
Event definitions now support external inputs.
Suppressed messages to stdout from compilers and calls to minpack
Two plotting methods were added to plot_cycles: 'stack' and 'highlight'.
Added continuation argument 'StopAtPoints', allowing computation to stop
at specified special points.
Added domain checking through introduction of a new special point labeled
'B' for 'Boundary'. Note that if 'B' is active (through specification in
LocBifPoints), domain information is saved along the curve with the labels 'inside' and 'outside'.
Added continuation argument 'description' allowing the user to give details
on the specific curve computation. When the info() method is called from the curve class, the description will be displayed.
Added argument SPOut to the LimitCycle class, allowing the user to stop at
specified values of variables or parameters.
Jacobians with respect to variables are now implemented in the AUTO interface.
Jacobians with respect to parameters are currently not working, but will be added very soon in a minor release.
Added get() method to plot class.
Bug fixes (see PyCont/README for details).
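The revised Point indexing described at the top of this release's notes (pt['x'] versus pt[['x']]) can be mimicked by a minimal standalone class. This is an illustrative sketch, not PyDSTool's actual implementation.

```python
# Minimal sketch of the 0.83.3 indexing semantics (illustration only,
# not PyDSTool's Point class): a string key returns the bare numeric
# value, while a list of keys returns a new Point restricted to them.
class Point:
    def __init__(self, coords):
        self.coords = dict(coords)

    def __getitem__(self, key):
        if isinstance(key, list):
            # pt[['x']] -> new Point containing only those coordinates
            return Point({k: self.coords[k] for k in key})
        # pt['x'] -> the plain numeric value
        return self.coords[key]

pt = Point({'x': 1.5, 'y': -2.0})
val = pt['x']    # a plain float
sub = pt[['x']]  # a Point holding just the 'x' coordinate
```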
Patch 1, 15 Jun 2006
(Unzip to PyDSTool\ directory to update files)
Fixed AUTO compilation bug on Linux
Fixed data file reference in tests\PyCont_Lorentz.py
Minor updates to toolbox utilities
Minor changes to common functions
v0.83.2, 05 Jun 2006
Provided support for 1D Jacobians for generators (previously only
multi-dimensional Jacobians were supported).
Fixed bug in symbolic differentiation and simplification of expressions
involving division of compound expressions where braces are not explicitly given.
Points, pointsets, and parameter estimation objects are initialized
with keywords now, not dictionaries (although points and pointsets retain dictionary backwards compatibility ... for now).
Parameterized pointsets now have the option to be referenced by values
that are within a small tolerance of the defined independent variable values.
Improved syntax for common object methods: 'computeTraj' is now 'compute',
'sampleTraj' is now 'sample', 'setPars' for generators is now 'set', 'setParsModel' for models is now 'set'. Previous method names are still valid for now.
Allowed C-target FuncSpec objects to use multi-references in definitions
Model trajectories can now be referenced and deleted using the
m[traj_name] notation, where m is a Model object.
Fixed EventMappings bug for multiple assignments specified by dictionary.
Other minor bug fixes.
Overhaul of plotting so that bifurcation diagrams can be adjusted
through a curve's 'plot' attribute.
AUTO is interfaced for limit cycle continuation. Requires external
C compiler access in the same way as the Dopri and Radau integrators.
Improvements to the PyCont API.
Support for bifurcations of discrete maps.
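The tolerance-based referencing of parameterized pointsets noted above amounts to a fuzzy lookup of the independent variable. A hypothetical helper (not PyDSTool code) conveys the idea:

```python
# Hypothetical helper (not PyDSTool's implementation): look up the
# index of a stored independent-variable value lying within `tol` of
# the query, so slightly inexact times still resolve to a mesh point.
def find_index(tvals, t, tol=1e-8):
    for i, tv in enumerate(tvals):
        if abs(tv - t) <= tol:
            return i
    raise KeyError("no mesh point within tolerance of %r" % t)

tvals = [0.0, 0.1, 0.2]
i = find_index(tvals, 0.1 + 1e-12)  # resolves despite rounding error
```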
v0.83.1, 12 Mar 2006
More bifurcations can be detected
Improved user interface / graphing capabilities
Exporting data to Pointsets now possible
Pointsets and Points:
Arithmetic and comparison operations now supported
Both classes now have an associated norm (defaults
to the 2-norm) in order to implement point-wise comparison
Labelling of points better implemented
Calling Pointsets now applies only if they are
parameterized, in which case the argument must be an independent variable value. Other functionality previously associated with calling has been moved to referencing. Calling is now more efficient for its original intended purpose
Setitem for Pointsets now allows setting of entire
coordinate or independent variable array
Full support for Points to be used as dictionaries
Better support for Pointsets to be used like arrays
Fixed handling of functions under differentiation to
make more sense
Can now declare symbolic functions without defining them,
and can differentiate them symbolically
Introduction of symbolic vectors and vector functions
Symbolic eval() more efficient, more intelligent
Fixed minor bugs and added many minor features
Definition can now be done using Quantity objects as well
Initialization keyword syntax simplified, partly to reflect
the fact that instances of class args() can be provided instead of dictionaries
'specstrdict' -> 'varspecs'
'auxfndict' -> 'fnspecs'
'xdatadict' -> 'ics'
'parameters' -> 'pars'
External input signals (as arrays) can now be passed to
the vector field in Dopri and Radau integrators
New ADMC++ target for ODETools (to facilitate automatic differentiation
and parameter sensitivity calculations)
Fixed some Radau compilation issues on non-Windows platforms
Added 'method' key to InterpTable to allow piecewise-constant
interpolation of data (key value = 'constant' or 'linear')
Added a couple of new test and example scripts, e.g. for
a SloppyCell model, and for symbolic differentiation and manipulation.
Used cPickle on non-Windows platforms to improve
efficiency of object __copy__ methods
No longer need to provide initial condition to
computeTraj call for single vector field model
Copyright (C) 2008-2012, Department of Mathematics and Statistics, Georgia State University. All rights reserved. The following BSD license applies to versions released 2008-2012.
[Previously: Copyright (C) 2005-2007, Department of Mathematics, Cornell University. All rights reserved. ]
Parts of this distribution that originate from different authors are individually marked as such. Copyright and licensing of those parts remains with the original authors.
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
The name of Georgia State University and its representatives may not be used to endorse or promote products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY GEORGIA STATE UNIVERSITY "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL CORNELL UNIVERSITY BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
We use a form of the standard GNU major.minor.patch version numbering system for each module -- see this rant about good version numbering practice. PyDSTool as a whole can have its own version number based on the incremental changes in its constituent modules. At the "alpha" release of a well-tested core, for the purposes of wider testing and external feedback, we started at a global version below 1.0. Version 1.0 is reserved for the first "beta"-type general release that has been tested in new SciPy.
For the PyDSTool package's version numbering, we use a major.minor number followed by a patch level made up of a YYMMDD date. Note the ordering, which makes the numbering sort helpfully! For example, a package version before alpha release would look like 0.5.050628. As of June 28th 2005 this scheme will also be adopted for backup directory names.
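The claim that the YYMMDD patch level "sorts helpfully" is easy to verify: plain string sorting of such version strings agrees with chronological order.

```python
# Versions using the major.minor.YYMMDD scheme sort chronologically
# under ordinary string comparison:
versions = ["0.5.050628", "0.5.050102", "0.5.051230"]
ordered = sorted(versions)  # oldest date first
```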
Before contributing code to the project, please read Guido van Rossum's Style Guide on the PythonResources page.
There are many common utilities available in the common.py and utils.py modules, which you are encouraged to make use of in order to re-use existing code, and make the internal interfaces more consistent. The common.py module contains many utilities, some of which are of little relevance to casual end-users. These include functions to help deal with string parsing of user specifications, "sorting" dictionaries, verification of object properties (such as that a list of numbers is increasing, or that it only contains unique items), and a host of useful name mappings (such as NumArray type names to type codes).
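As a flavour of the verification helpers described, here are two sketches of the kind of property checks common.py provides; the function names below are illustrative, not the actual common.py API.

```python
# Illustrative sketches of the kind of property checks common.py
# provides; the names here are hypothetical, not the real API.
def is_increasing(seq):
    """True if the sequence is strictly increasing."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def all_unique(seq):
    """True if the sequence contains no repeated items."""
    return len(set(seq)) == len(seq)
```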
Dozens of other utilities are provided by SciPy and NumPy, which should be imported by every major PyDSTool module. It is worth browsing their API documentation to see what can be used before you re-invent the wheel. Particularly useful functions include:
array_import, loadtxt, and savetxt
allclose(x, y, rtol=1e-5, atol=1e-8), which tests |x - y| <= atol + rtol*|y|
fromfunction
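The allclose condition, written out in plain Python for scalars (NumPy's version applies it element-wise over arrays):

```python
# The scalar form of the test numpy.allclose performs, making the
# roles of atol (absolute) and rtol (relative) tolerance explicit:
def close(x, y, rtol=1e-5, atol=1e-8):
    return abs(x - y) <= atol + rtol * abs(y)

near = close(1.00000001, 1.0)  # within combined tolerance
far = close(1.1, 1.0)          # well outside it
```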
There are a set of PyDSTool-specific exceptions listed in the errors.py module. These are generally close counterparts of standard Python exceptions. For instance, the PyDSTool_KeyError should be raised when an end-user provides an incorrect dictionary key in the initialisation of a PyDSTool object, whereas a standard KeyError should be raised when there are inconsistencies in the dictionaries passed internally between objects (i.e. because of an intrinsic programming error).
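The convention is simple to follow: each PyDSTool exception subclasses its standard-library counterpart, so user-facing errors remain distinguishable from internal ones. A sketch of the pattern (the lookup helper here is hypothetical):

```python
# Sketch of the errors.py pattern: a package-specific exception
# subclassing the standard exception it mirrors.
class PyDSTool_KeyError(KeyError):
    """Raised when an end-user supplies an unknown dictionary key."""

# Hypothetical helper showing where the user-facing exception belongs
def lookup(opts, key):
    if key not in opts:
        raise PyDSTool_KeyError(key)
    return opts[key]

val = lookup({'a': 1}, 'a')
try:
    lookup({'a': 1}, 'b')
    raised = False
except PyDSTool_KeyError:
    raised = True
```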
All module .py files should include some basic test code for all of the major features implemented in that module. This code goes in the __main__ block at the end of the file. Look at the existing code for examples. Whenever changes are made to a module, the module should be re-run to check that these tests are still met. This is a simplified form of "unit testing" (see the link on the PythonResources page), known as "regression testing".
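A minimal module following this convention might end like this (an illustrative sketch, not taken from any PyDSTool module):

```python
# Illustrative module with the in-file regression tests described
# above: re-running the module re-checks its behaviour.
def midpoint(a, b):
    """Return the midpoint of a and b."""
    return 0.5 * (a + b)

if __name__ == '__main__':
    # simple regression tests, run whenever the module is executed
    assert midpoint(0, 2) == 1.0
    assert midpoint(-1, 1) == 0.0
    print("All tests passed")
```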
Additionally, more sophisticated example scripts are in the sub-directory /tests/. These test multiple features simultaneously. Please feel free to add to these.