
A Python Library for Continuous Attractor Neural Networks

Project description

CANNs: Continuous Attractor Neural Networks Toolkit

For the Chinese version of this README, see README_zh.md

Overview

CANNs (Continuous Attractor Neural Networks toolkit) is a research toolkit built on BrainPy and JAX, with optional Rust-accelerated canns-lib for selected performance-critical routines (e.g., TDA/Ripser and task generation). It bundles model collections, task generators, analyzers, and the ASA pipeline (GUI/TUI) so researchers can run simulations and analyze results in a consistent workflow. The API separates models, tasks, analyzers, and trainers to keep experiments modular and extensible.

Architecture

CANNs Architecture
Layer hierarchy of the CANNs library showing five levels: Application (Pipeline orchestration), Functional (Task, Trainer, Analyzer, Utils modules), Core Models (CANN implementations), Foundation (BrainPy/JAX and Rust FFI backends), and Hardware (CPU/GPU/TPU support)

The CANNs library follows a modular architecture guided by two core principles: separation of concerns and extensibility through base classes. The design separates functional responsibilities into five independent modules:

  1. Models (canns.models) define neural network dynamics;
  2. Tasks (canns.task) generate experimental paradigms and input data;
  3. Analyzers (canns.analyzer) provide visualization and analysis tools;
  4. Trainers (canns.trainer) implement learning rules for brain-inspired models;
  5. Pipeline (canns.pipeline) orchestrates complete workflows.

Each module focuses on a single responsibility—models don't generate input data, tasks don't analyze results, and analyzers don't modify parameters. This separation ensures maintainability, testability, and extensibility. All major components inherit from abstract base classes (BasicModel, BrainInspiredModel, Trainer) that define standard interfaces, enabling users to create custom implementations that seamlessly integrate with the built-in ecosystem.
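The base-class extension pattern described above can be sketched in plain Python. This is an illustrative stand-in only: `BasicModel` below is a simplified assumption, not the actual canns interface, and the leaky-integrator model is a made-up example of a custom implementation a user might plug into the ecosystem.

```python
from abc import ABC, abstractmethod

# Illustrative stand-in for the library's abstract base classes.
# The real BasicModel/BrainInspiredModel/Trainer interfaces live in
# the canns package; the names and signatures here are assumptions.
class BasicModel(ABC):
    @abstractmethod
    def update(self, stimulus):
        """Advance the model state by one time step."""

class LeakyModel(BasicModel):
    """A custom model: leaky integration of the input stimulus."""
    def __init__(self, tau=10.0, dt=0.1):
        self.tau, self.dt, self.u = tau, dt, 0.0

    def update(self, stimulus):
        # du/dt = (-u + stimulus) / tau, integrated with forward Euler
        self.u += (-self.u + stimulus) * self.dt / self.tau
        return self.u

model = LeakyModel()
for _ in range(1000):
    u = model.update(1.0)
print(round(u, 3))  # → 1.0 (state converges to the fixed point u = 1)
```

Because every component honors the same abstract interface, the surrounding tasks, trainers, and analyzers can drive a user-defined model without any special-casing.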

Core Features

  • Model collections: basic CANNs (1D/2D, SFA), hierarchical path integration, theta-sweep models, brain-inspired models (e.g., Amari-Hopfield, linear/spiking layers)
  • Task generators: smooth tracking, population coding, template matching, open/closed-loop navigation
  • Analyzer suite: energy landscapes, tuning curves, raster/firing-rate plots, TDA and decoding utilities, cell classification
  • ASA pipeline & GUI/TUI: end-to-end workflow for preprocessing, TDA, decoding, and result visualization (e.g., CohoMap/CohoSpace/PathCompare/FR/FRM/GridScore)
  • Training & extensibility: HebbianTrainer plus base classes for consistent extension
  • Optional acceleration: canns-lib for selected performance-critical routines
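As an illustration of the learning rule behind a trainer like HebbianTrainer, here is the classic Hebbian outer-product rule for an Amari-Hopfield-style network, sketched in pure Python. This is not the canns API — function names and the synchronous update scheme are assumptions for exposition.

```python
# Hebbian outer-product learning for a Hopfield-style network
# (illustrative sketch, not the canns HebbianTrainer interface).
def hebbian_weights(patterns, n):
    # W[i][j] = mean over patterns of x_i * x_j, with zero diagonal
    W = [[0.0] * n for _ in range(n)]
    for x in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += x[i] * x[j] / len(patterns)
    return W

def recall(W, x, steps=5):
    # Synchronous sign-threshold updates toward a stored fixed point
    for _ in range(steps):
        x = [1 if sum(W[i][j] * x[j] for j in range(len(x))) >= 0 else -1
             for i in range(len(x))]
    return x

pattern = [1, -1, 1, -1, 1, -1]
W = hebbian_weights([pattern], len(pattern))
noisy = [-1, -1, 1, -1, 1, -1]   # flip the first unit
print(recall(W, noisy))          # → [1, -1, 1, -1, 1, -1]
```

The stored pattern acts as an attractor: starting from a corrupted input, the threshold dynamics fall back to it, which is the same completion behavior the brain-inspired models in the library exploit.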

Analyzer Visuals

Model Analysis Overview
Overview of Neural Dynamics Models. Comparison of three basic models: (A) 1D CANN, (B) 2D CANN, and (C) Grid Cell Network

Analyzer Display
Rich Analyzer Visualization Results

ASA GUI Preview
ASA GUI preview

ASA GUI Demo (YouTube)
ASA GUI demo video

Smooth Tracking 1D

Smooth Tracking 1D
Activity bump following a moving stimulus

CANN2D Encoding

CANN2D Encoding
2D population encoding patterns over time

Theta Sweep Animation

Theta Sweep Animation
Theta-modulated sweep dynamics

Bump Analysis

Bump Analysis Demo
Bump fitting and stability diagnostics

Torus Bump

Torus Bump
Bump dynamics projected onto a torus manifold

Quick Start

1D CANN smooth tracking (imports → simulation → visualization)

import brainpy.math as bm
from canns.analyzer.visualization import PlotConfigs, energy_landscape_1d_animation
from canns.models.basic import CANN1D
from canns.task.tracking import SmoothTracking1D

# simulation time step
bm.set_dt(0.1)

# build model
cann = CANN1D(num=512)

# build tracking task (Iext length = duration length + 1)
task = SmoothTracking1D(
    cann_instance=cann,
    Iext=(0.0, 0.5, 1.0, 1.5),
    duration=(5.0, 5.0, 5.0),
    time_step=bm.get_dt(),
)
task.get_data()


# one-step simulation callback
def step(t, stimulus):
    cann(stimulus)
    return cann.u.value, cann.inp.value


# run simulation loop
us, inputs = bm.for_loop(
    step,
    operands=(task.run_steps, task.data),
)

# visualize with energy landscape animation
config = PlotConfigs.energy_landscape_1d_animation(
    time_steps_per_second=int(1 / bm.get_dt()),
    fps=20,
    title="Smooth Tracking 1D",
    xlabel="State",
    ylabel="Activity",
    show=True,
)

energy_landscape_1d_animation(
    data_sets={"u": (cann.x, us), "Iext": (cann.x, inputs)},
    config=config,
)

Installation

# CPU-only
pip install canns

# Optional accelerators (Linux)
pip install canns[cuda12]
pip install canns[cuda13]
pip install canns[tpu]

# GUI (ASA Pipeline)
pip install canns[gui]

Optional (uv):

uv pip install canns
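A quick post-install sanity check (this assumes the distribution installs an importable `canns` package, consistent with the Quick Start imports above):

```python
import importlib.util

# Report whether the 'canns' package is importable in the current environment
status = "installed" if importlib.util.find_spec("canns") else "not installed"
print(status)
```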

Docs & Examples

  • Documentation and tutorials: https://routhleck.com/canns/
  • Local scripts: examples/
  • Sphinx docs and notebooks: docs/
  • ASA GUI entry: canns-gui

Citation

If you use CANNs in your research, please cite:

@software{he_2026_canns,
  author       = {He, Sichao and
                  Tuerhong, Aiersi and
                  She, Shangjun and
                  Chu, Tianhao and
                  Wu, Yuling and
                  Zuo, Junfeng and
                  Wu, Si},
  title        = {CANNs: Continuous Attractor Neural Networks Toolkit},
  month        = feb,
  year         = 2026,
  publisher    = {Zenodo},
  version      = {v1.0.0},
  doi          = {10.5281/zenodo.18453893},
  url          = {https://doi.org/10.5281/zenodo.18453893}
}

Plain text:

He, S., Tuerhong, A., She, S., Chu, T., Wu, Y., Zuo, J., & Wu, S. (2026). CANNs: Continuous Attractor Neural Networks Toolkit (v1.0.0). Zenodo. https://doi.org/10.5281/zenodo.18453893

Contributing & License

Contributions are welcome. Please read CONTRIBUTING.md before opening a PR.

Apache License 2.0. See LICENSE.
