
Tools for making neural simulations using the Neural Engineering Framework

Project description


Nengo: Large-scale brain modelling in Python

An illustration of the three principles of the NEF

Installation

Nengo depends on NumPy, and we recommend that you install NumPy before installing Nengo. If you’re not sure how to do this, we recommend using Anaconda.

To install Nengo:

pip install nengo

If you have difficulty installing Nengo or NumPy, please read the more detailed Nengo installation instructions first.

If you’d like to install Nengo from source, please read the development installation instructions.

Nengo is tested to work on Python 2.7 and 3.4+.

Examples

Here are six of many examples showing how Nengo enables the creation and simulation of large-scale neural models in a few lines of code.

  1. 100 LIF neurons representing a sine wave

  2. Computing the square across a neural connection

  3. Controlled oscillatory dynamics with a recurrent connection

  4. Learning a communication channel with the PES rule

  5. Simple question answering with the Semantic Pointer Architecture

  6. A summary of the principles underlying all of these examples

Documentation

Usage and API documentation can be found at https://pythonhosted.org/nengo/.

Development

Information for current or prospective developers can be found at https://pythonhosted.org/nengo/dev_guide.html.

Release History

2.2.0 (September 12, 2016)

API changes

  • It is now possible to pass a NumPy array to the function argument of nengo.Connection. The values in the array are taken to be the targets in the decoder solving process, which means that the eval_points must also be set on the connection. (#1010)

  • nengo.utils.connection.target_function is now deprecated, and will be removed in Nengo 3.0. Instead, pass the targets directly to the connection through the function argument. (#1010)

Behavioural changes

  • Dropped support for NumPy 1.6. Oldest supported NumPy version is now 1.7. (#1147)

Improvements

  • Added a nengo.backends entry point to make the reference simulator discoverable for other Python packages. In the future all backends should declare an entry point accordingly. (#1127)

  • Added ShapeParam to store array shapes. (#1045)

  • Added ThresholdingPreset to configure ensembles for thresholding. (#1058, #1077, #1148)

  • Tweaked rasterplot so that spikes from different neurons don’t overlap. (#1121)
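For backend authors, declaring the entry point mentioned above happens in the package's setup script. A hypothetical setup.py sketch (the package and class names here are illustrative, not a real backend):

```python
# Hypothetical setup.py for a third-party Nengo backend.
from setuptools import setup

setup(
    name="nengo-mybackend",
    version="0.1",
    py_modules=["nengo_mybackend"],
    entry_points={
        # Nengo 2.2 discovers simulators registered under "nengo.backends"
        "nengo.backends": [
            "mybackend = nengo_mybackend:Simulator",
        ],
    },
)
```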

Documentation

  • Added a page explaining the config system and preset configs. (#1150)

Bug fixes

  • Fixed some situations where the cache index becomes corrupt by writing the updated cache index atomically (in most cases). (#1097, #1107)

  • The synapse methods filt and filtfilt now support lists as input. (#1123)

  • Added a registry system so that only stable objects are cached. (#1054, #1068)

2.1.2 (June 27, 2016)

Bug fixes

  • The DecoderCache is now more robust when used improperly, and no longer requires changes to backends in order to use properly. (#1112)

2.1.1 (June 24, 2016)

Improvements

  • Improved the default LIF neuron model to spike at the same rate as the LIFRate neuron model for constant inputs. The older model has been moved to nengo_extras under the name FastLIF. (#975)

  • Added y0 attribute to WhiteSignal, which adjusts the phase of each dimension to begin with absolute value closest to y0. (#1064)

  • Allow the AssociativeMemory to accept Semantic Pointer expressions as input_keys and output_keys. (#982)

Bug fixes

  • The DecoderCache is used as context manager instead of relying on the __del__ method for cleanup. This should solve problems with the cache’s file lock not being removed. It might be necessary to manually remove the index.lock file in the cache directory after upgrading from an older Nengo version. (#1053, #1041, #1048)

  • If the cache index is corrupted, we now fail gracefully by invalidating the cache and continuing rather than raising an exception. (#1110, #1097)

  • The Nnls solver now works for weights. The NnlsL2 solver has been improved: values are now clipped to be non-negative before forming the Gram system. (#1027, #1019)

  • Eliminate memory leak in the parameter system. (#1089, #1090)

  • Allow recurrence of the form a=b, b=a in basal ganglia SPA actions. (#1098, #1099)

  • Support a greater range of Jupyter notebook and ipywidgets versions with the ipynb extensions. (#1088, #1085)

2.1.0 (April 27, 2016)

API changes

  • A new class for representing stateful functions called Process has been added. Node objects are now process-aware, meaning that a process can be used as a node’s output. Unlike non-process callables, processes are properly reset when a simulator is reset. See the processes.ipynb example notebook, or the API documentation for more details. (#590, #652, #945, #955)

  • Spiking LIF neuron models now accept an additional argument, min_voltage. Voltages are clipped such that they do not drop below this value (previously, this was fixed at 0). (#666)

  • The PES learning rule no longer accepts a connection as an argument. Instead, error information is transmitted by making a connection to the learning rule object (e.g., nengo.Connection(error_ensemble, connection.learning_rule)). (#344, #642)

  • The modulatory attribute has been removed from nengo.Connection. This was only used for learning rules to this point, and has been removed in favor of connecting directly to the learning rule. (#642)

  • Connection weights can now be probed with nengo.Probe(conn, 'weights'), and these are always the weights that will change with learning, regardless of the type of connection. Previously, either decoders or transform may have changed depending on the type of connection; it is no longer possible to probe decoders or transform. (#729)

  • A version of the AssociativeMemory SPA module is now available as a stand-alone network in nengo.networks. The AssociativeMemory SPA module also has an updated argument list. (#702)

  • The Product and InputGatedMemory networks no longer accept a config argument. (#814)

  • The EnsembleArray network’s neuron_nodes argument is deprecated. Instead, call the new add_neuron_input or add_neuron_output methods. (#868)

  • The nengo.log utility function now takes a string level parameter to specify any logging level, instead of the old binary debug parameter. Cache messages are logged at DEBUG instead of INFO level. (#883)

  • Reorganised the Associative Memory code, including removing many extra parameters from nengo.networks.assoc_mem.AssociativeMemory and modifying the defaults of others. (#797)

  • Added a close method to Simulator. Simulator can now be used as a context manager. (#857, #739, #859)

  • Most exceptions that Nengo can raise are now custom exception classes that can be found in the nengo.exceptions module. (#781)

  • All Nengo objects (Connection, Ensemble, Node, and Probe) now accept a label and seed argument if they didn’t previously. (#958)

  • In nengo.synapses, filt and filtfilt are deprecated. Every synapse type now has filt and filtfilt methods that filter using the synapse. (#945)

  • Connection objects can now accept a Distribution for the transform argument; the transform matrix will be sampled from that distribution when the model is built. (#979)

Behavioural changes

  • The sign on the PES learning rule’s error has been flipped to conform with most learning rules, in which error is minimized. The error should be actual - target. (#642)

  • The PES rule’s learning rate is invariant to the number of neurons in the presynaptic population. The effective speed of learning should now be unaffected by changes in the size of the presynaptic population. Existing learning networks may need to be updated; to achieve identical behavior, scale the learning rate by pre.n_neurons / 100. (#643)

  • The probeable attribute of all Nengo objects is now implemented as a property, rather than a configurable parameter. (#671)

  • Node functions receive x as a copied NumPy array (instead of a readonly view). (#716, #722)

  • The SPA Compare module produces a scalar output (instead of a specific vector). (#775, #782)

  • Bias nodes in spa.Cortical, and gate ensembles and connections in spa.Thalamus are now stored in the target modules. (#894, #906)

  • The filt and filtfilt functions on Synapse now use the initial value of the input signal to initialize the filter output by default. This provides more accurate filtering at the beginning of the signal, for signals that do not start at zero. (#945)

Improvements

  • Added Ensemble.noise attribute, which injects noise directly into neurons according to a stochastic Process. (#590)

  • Added a randomized_svd subsolver for the L2 solvers. This can be much quicker for large numbers of neurons or evaluation points. (#803)

  • Added PES.pre_tau attribute, which sets the time constant on a lowpass filter of the presynaptic activity. (#643)

  • EnsembleArray.add_output now accepts a list of functions to be computed by each ensemble. (#562, #580)

  • LinearFilter now has an analog argument which can be set through its constructor. Linear filters with digital coefficients can be specified by setting analog to False. (#819)

  • Added SqrtBeta distribution, which describes the distribution of semantic pointer elements. (#414, #430)

  • Added Triangle synapse, which filters with a triangular FIR filter. (#660)

  • Added utils.connection.eval_point_decoding function, which provides a connection’s static decoding of a list of evaluation points. (#700)

  • Resetting the Simulator now resets all Processes, meaning the injected random signals and noise are identical between runs, unless the seed is changed (which can be done through Simulator.reset). (#582, #616, #652)

  • An exception is raised if SPA modules are not properly assigned to an SPA attribute. (#730, #791)

  • The Product network is now more accurate. (#651)

  • NumPy arrays can now be used as indices for slicing objects. (#754)

  • Config.configures now accepts multiple classes rather than just one. (#842)

  • Added an add method to spa.Actions, which allows actions to be added after the module has been initialized. (#861, #862)

  • Added an SPA wrapper for circular convolution networks, spa.Bind. (#849)

  • Added the Voja (Vector Oja) learning rule type, which updates an ensemble’s encoders to fire selectively for its inputs (see examples/learning/learn_associations.ipynb). (#727)

  • Added a clipped exponential distribution useful for thresholding, in particular in the AssociativeMemory. (#779)

  • Added a cosine similarity distribution, which is the distribution of the cosine of the angle between two random vectors. It is useful for setting intercepts, in particular when using the Voja learning rule. (#768)

  • nengo.synapses.LinearFilter now has an evaluate method to evaluate the filter response to sine waves of given frequencies. This can be used to create Bode plots, for example. (#945)

  • nengo.spa.Vocabulary objects now have a readonly attribute that can be used to disallow adding new semantic pointers. Vocabulary subsets are read-only by default. (#699)

  • Improved performance of the decoder cache by writing all decoders of a network into a single file. (#946)

Bug fixes

  • Fixed issue where setting Connection.seed through the constructor had no effect. (#724)

  • Fixed issue in which learning connections could not be sliced. (#632)

  • Fixed issue when probing scalar transforms. (#667, #671)

  • Fix for SPA actions that route to a module with multiple inputs. (#714)

  • Corrected the rmses values in BuiltConnection.solver_info when using the Nnls and NnlsL2 solvers, and the reg argument for NnlsL2. (#839)

  • spa.Vocabulary.create_pointer now respects the specified number of creation attempts, and returns the most dissimilar pointer if none can be found below the similarity threshold. (#817)

  • Probing a Connection’s output now returns the output of that individual Connection, rather than the input to the Connection’s post Ensemble. (#973, #974)

  • Fixed thread-safety of using networks and config in with statements. (#989)

  • The decoder cache will only be used when a seed is specified. (#946)

2.0.4 (April 27, 2016)

Bug fixes

  • The cache now fails gracefully if the legacy.txt file cannot be read. This can occur if the cache was written by a later version of Nengo. (#1097)

2.0.3 (December 7, 2015)

API changes

  • The spa.State object replaces the old spa.Memory and spa.Buffer. These old modules are deprecated and will be removed in 2.2. (#796)

2.0.2 (October 13, 2015)

2.0.2 is a bug fix release to ensure that Nengo continues to work with more recent versions of Jupyter (formerly known as the IPython notebook).

Behavioural changes

  • The IPython notebook progress bar has to be activated with %load_ext nengo.ipynb. (#693)

Improvements

  • Added [progress] section to nengorc which allows setting progress_bar and updater. (#693)

Bug fixes

  • Fixed compatibility issues with newer versions of IPython and Jupyter. (#693)

2.0.1 (January 27, 2015)

Behavioural changes

  • Node functions receive t as a float (instead of a NumPy scalar) and x as a readonly NumPy array (instead of a writeable array). (#626, #628)

Improvements

  • rasterplot works with 0 neurons, and generates much smaller PDFs. (#601)

Bug fixes

  • Fix compatibility with NumPy 1.6. (#627)

2.0.0 (January 15, 2015)

Initial release of Nengo 2.0! Supports Python 2.6+ and 3.3+. Thanks to all of the contributors for making this possible!
