
Spiking neural network models of V1 development using Brian2


briar

Large-Scale SNN framework with dendrites, built on Brian2

Research by Zubin Kane -- Makin Lab at Purdue University

Overview

briar provides pre-built architectures for studying how orientation selectivity and phase diversity emerge in visual cortex through spike-timing-dependent plasticity (STDP). It wraps Brian2 with a declarative layer for defining neuron pools, synapse pools, and learning rules, then handles device setup, compilation, monitoring, and result serialization automatically.

Key features:

  • Declarative architecture definition -- define pools and synapses as dataclasses, briar generates the Brian2 equations
  • Built-in architectures for common V1 models (feedforward, two-layer simple/complex, efficient coding)
  • Custom architecture for building networks from scratch
  • cpp_standalone support with incremental compilation for fast repeated runs
  • Parameter sweeps with automatic grid search
  • Rich result objects with summary dashboards, diffs, and architecture-aware plotting
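To give a feel for the declarative idea, here is a rough, hypothetical sketch (the class and field names below are illustrative, not briar's actual internals) of how a pool described as a dataclass could be expanded into a Brian2-style equation string:

```python
from dataclasses import dataclass

# Hypothetical sketch of a declarative pool description; a framework
# like briar could expand such a dataclass into Brian2 model equations.
@dataclass
class LIFPool:
    name: str
    n_neurons: int
    tau_ms: float = 10.0   # membrane time constant (ms)
    v_rest: float = 0.0    # resting potential (dimensionless here)

    def to_equations(self) -> str:
        # Brian2 accepts model equations as strings like this one.
        return f"dv/dt = ({self.v_rest} - v) / ({self.tau_ms}*ms) : 1"

pool = LIFPool(name='simple_layer', n_neurons=64)
print(pool.to_equations())
```

The appeal of this style is that the user states *what* the pool is, and the framework owns the translation into equations, device setup, and monitoring.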

Installation

pip install briar

Quick Start

The simplest way to run an experiment is to pick an architecture and a task, then call buildrun():

from briar import SimpleComplex, NaturalImageTask

task = NaturalImageTask(n_patterns=500, image_size=32)
arch = SimpleComplex(task)
results = arch.buildrun()

This builds the full Brian2 network, runs the simulation, and returns a Results object containing all parameters, spike monitors, weight histories, and connectivity.

Inspecting results

# In Jupyter, just evaluate `results` to see the rich dashboard
results

# In a script, print() gives the same dashboard in plain text
print(results)

# results.summary() also prints the dashboard (same as print)
results.summary()

# Show only parameters that differ from defaults
results.diff()

# Architecture-aware default plots
results.plot()
results.plot(full=True)    # additional diagnostics
results.plot(debug=True)   # low-level debug panels

Modifying parameters

Every architecture parameter can be overridden at construction. The diff() method shows exactly what changed:

arch = SimpleComplex(
    task,
    eta_ff_simple=1e-3,       # increase simple cell learning rate
    ff_complex_radius=6.0,    # widen complex cell receptive fields
)
results = arch.buildrun()

# diff() highlights only the non-default values
results.diff()

Adding pools to an existing architecture

Any architecture can be extended with additional pools after construction. Added pools are automatically discovered and built:

from briar import SimpleComplex, NaturalImageTask, NeuronPool, SynapsePool

task = NaturalImageTask(n_patterns=500, image_size=32)
arch = SimpleComplex(task)

# Add a new neuron pool and connect it
arch.add(NeuronPool(name='readout', n_neurons=8))
arch.add(SynapsePool(
    name='ff_readout',
    source=arch.simple_layer,
    target=arch.readout,
))

results = arch.buildrun()

Architectures

Simple

Feedforward-only: LGN ON/OFF inputs connect to simple cells via STDP. Tests whether feedforward learning alone can produce orientation selectivity with phase diversity.

input_pool (LGN ON/OFF) -> simple_layer (STDP)

from briar import Simple, RetinalWaveTask

task = RetinalWaveTask(n_waves=200, image_size=32)
arch = Simple(task, eta_ff_simple=5e-4)
results = arch.buildrun()
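The learning rule here is standard pair-based STDP. As a framework-independent sketch of that rule (illustrative time constants and amplitudes, not briar's exact implementation or defaults):

```python
import math

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for a spike-time difference.

    dt_ms = t_post - t_pre: positive (pre fires before post) gives
    potentiation, negative (post before pre) gives depression, both
    decaying exponentially with the time difference.
    """
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_plus)
    elif dt_ms < 0:
        return -a_minus * math.exp(dt_ms / tau_minus)
    return 0.0

# Pre 5 ms before post -> potentiation; the reverse -> depression.
print(stdp_dw(5.0) > 0, stdp_dw(-5.0) < 0)
```

Causal pre-then-post pairings strengthen the synapse, which is what lets feedforward weights organize around repeated input structure.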

SimpleComplex

Two-layer architecture inspired by Antolik & Bednar (2011) with dendritic predictive coding from Mikulasch et al. (2021). Simple cells learn from LGN input via somatic STDP; complex cells learn from simple cells via dendritic predictive coding. Includes Mexican hat recurrent connections and feedback.

input_pool (LGN ON/OFF) -> simple_layer (somatic, STDP)
simple_layer -> complex_layer (dendritic, predictive coding)
complex_layer -> complex_layer (Mexican hat recurrent)
complex_layer -> simple_layer (Mexican hat feedback, fixed)

from briar import SimpleComplex, NaturalImageTask

task = NaturalImageTask(n_patterns=500, image_size=32)
arch = SimpleComplex(task)
results = arch.buildrun()
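The "Mexican hat" profile used for the recurrent and feedback connections is a difference of Gaussians: net excitation at short range, net inhibition at intermediate range. A minimal sketch (the widths and amplitudes below are illustrative, not briar defaults):

```python
import math

def mexican_hat(d, sigma_e=1.0, sigma_i=3.0, a_e=1.0, a_i=0.5):
    """Difference-of-Gaussians weight at distance d: a narrow
    excitatory Gaussian minus a broader inhibitory one."""
    exc = a_e * math.exp(-d**2 / (2 * sigma_e**2))
    inh = a_i * math.exp(-d**2 / (2 * sigma_i**2))
    return exc - inh

# Positive (net excitation) near d=0, negative at intermediate distance.
print(mexican_hat(0.0), mexican_hat(3.0))
```

This local-excitation / surround-inhibition shape is what encourages nearby complex cells to develop similar, smoothly varying tuning.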

EfficientEncoder

Single-layer efficient coding model with feedforward predictive coding, Hebbian recurrent connections, and a decoder for reconstruction loss monitoring.

input_pool -> layer1 (dendritic, predictive coding)
layer1 -> layer1 (dendritic, Hebbian recurrent)
layer1 -> decoder (reconstruction loss)

from briar import EfficientEncoder, BarTask

task = BarTask(image_size=8, n_patterns=1000)
arch = EfficientEncoder(task)
results = arch.buildrun()
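Reconstruction loss in an efficient-coding setup is typically the mean squared error between the input and a linear decode of the population activity. A framework-independent sketch of that computation (not briar's monitor code):

```python
def reconstruction_mse(x, rates, decoder):
    """MSE between input vector x and its linear reconstruction D @ r.

    x:       input vector (length n_in)
    rates:   population firing rates (length n_neurons)
    decoder: n_in x n_neurons weight matrix, given as a list of rows
    """
    x_hat = [sum(d_ij * r_j for d_ij, r_j in zip(row, rates))
             for row in decoder]
    return sum((xi - xh) ** 2 for xi, xh in zip(x, x_hat)) / len(x)

# Identity decoder with matching rates reconstructs perfectly.
print(reconstruction_mse([1.0, 2.0], [1.0, 2.0],
                         [[1.0, 0.0], [0.0, 1.0]]))  # → 0.0
```

Tracking this quantity during training is a direct readout of how well the learned code preserves the stimulus.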

Custom

A blank architecture with no pre-defined pools. Build any network from scratch by adding pools and wiring them manually:

from briar import (
    Custom, NeuronPool, SynapsePool, PoissonPool,
    DecoderPool, SimConfig, PlasticityRule,
)
from briar.tasks import BarTask
from briar.datastructures import Compartment, DendriticRule

task = BarTask(image_size=8, n_patterns=500)
arch = Custom(task)

# Input layer
arch.add(PoissonPool(name='input', task=task))

# Hidden layer
arch.add(NeuronPool(name='hidden', n_neurons=16))

# Feedforward synapses with STDP
arch.add(SynapsePool(
    name='ff',
    source=arch.input,
    target=arch.hidden,
    eta=5e-4,
    plasticity_rule=PlasticityRule.STDP,
))

# Spike monitor
arch.hidden.add_monitor('spikes')

results = arch.buildrun()
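The PoissonPool drives the network with rate-coded Poisson spike trains derived from the task's stimuli. The generation idea can be sketched in plain Python, independently of Brian2 (a Bernoulli approximation, not briar's implementation):

```python
import random

def poisson_spikes(rate_hz, duration_ms, dt_ms=1.0, seed=0):
    """Approximate a Poisson spike train: in each time step of length
    dt, emit a spike with probability rate*dt (valid for rate*dt << 1).
    Returns the list of spike times in ms."""
    rng = random.Random(seed)
    p = rate_hz * dt_ms / 1000.0
    return [t * dt_ms
            for t in range(int(duration_ms / dt_ms))
            if rng.random() < p]

spikes = poisson_spikes(rate_hz=50.0, duration_ms=1000.0)
print(len(spikes))  # roughly rate * duration = 50 spikes expected
```

Higher pixel intensity maps to a higher rate, so brighter parts of the stimulus fire more often on average.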

Parameter Sweeps

Run multiple experiments varying one or more parameters:

from briar import SimpleComplex, NaturalImageTask, sweep

task = NaturalImageTask(n_patterns=500)

# Sweep a single parameter
results = sweep(
    SimpleComplex, task, use_cpp=False,
    eta_ff_simple=[5e-5, 5e-4, 5e-3, 5e-2, 5e-1],
)

# Mix fixed overrides with swept parameters
results = sweep(
    SimpleComplex, task,
    ff_complex_radius=6.0,                          # scalar -> fixed
    eta_ff_simple=[5e-5, 5e-4, 5e-3, 5e-2, 5e-1],  # list -> swept
)

# Multi-parameter grid (Cartesian product)
results = sweep(
    SimpleComplex, task,
    eta_ff_simple=[1e-4, 1e-3],
    eta_ff_complex=[5e-5, 5e-4],
)  # 4 runs total
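The grid expansion itself is just a Cartesian product over the list-valued parameters, with scalars held fixed. A minimal sketch of that separation and expansion (a hypothetical helper, not briar's sweep internals):

```python
from itertools import product

def expand_grid(**params):
    """Split scalar (fixed) from list (swept) parameters and yield one
    complete parameter dict per point of the Cartesian product."""
    fixed = {k: v for k, v in params.items() if not isinstance(v, list)}
    swept = {k: v for k, v in params.items() if isinstance(v, list)}
    for combo in product(*swept.values()):
        yield {**fixed, **dict(zip(swept.keys(), combo))}

runs = list(expand_grid(ff_complex_radius=6.0,
                        eta_ff_simple=[1e-4, 1e-3],
                        eta_ff_complex=[5e-5, 5e-4]))
print(len(runs))  # → 4
```

Each yielded dict is one experiment's full override set, which is why two swept parameters with two values each produce four runs.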

Comparison Plots

Compare results from a sweep (or manually loaded pickles) side-by-side:

from briar import plot_raster_comparison, plot_rate_ridgeline, Results

# Stacked raster plots
plot_raster_comparison(results, layer='simple')
plot_raster_comparison(results, layer='complex')

# Ridgeline firing rate distributions
plot_rate_ridgeline(results, layer='simple')
plot_rate_ridgeline(results, layer='complex')

# Works with manually loaded results too
r1 = Results.load('dumps/SimpleComplex/NaturalImageTask/20260304_091654.pkl')
r2 = Results.load('dumps/SimpleComplex/NaturalImageTask/20260304_131307.pkl')
plot_raster_comparison([r1, r2], labels=['eta=1e-4', 'eta=1e-3'])

Testing

# All tests
pytest

# Fast tests only (runtime mode); fastest when run sequentially, not in parallel
pytest -m "not slow" --override-ini="addopts="

# Slow tests only - cpp_standalone compile+run
pytest -m slow

# Slow example tests for a specific architecture
pytest -m encoder
pytest -m "slow and simple_complex"
