
⚡ Spark: Modular Spiking Neural Networks

Spark Logo

Build, train, and deploy state-of-the-art Spiking Neural Networks with a powerful visual interface and JAX.


Spark is a next-generation framework designed to simplify and accelerate the research, development, and deployment of non-gradient-based Spiking Neural Networks (SNNs). Our goal is to make SNNs more accessible to researchers, engineers, and enthusiasts by abstracting away boilerplate code and providing intuitive tools for model creation and experimentation, while maintaining state-of-the-art performance.

Key Features

🚀 High-Performance Backend: Powered by JAX and Flax NNX, Spark enables just-in-time (JIT) compilation and state management of entire models.

🧩 Modular & Extensible: Modular by construction. Everything worth interacting with in Spark is a self-contained module. Easily create, modify, and share custom neuron models, synapses, and learning rules. Ever wanted a neuron with 3 somas, 2 sets of synapses, and 2.5 learning rules? As long as it can spike, you've come to the right place!

🔄 Seamless Workflow: Spiking neural networks are not special, so why should they require special data? One of Spark's core features is its input and output interfaces: simple modules that transform regular datasets into streams of spikes, and turn streams of spikes back into boring data formats like floats.

🧠 Graph Editor: Design complex SNN architectures by dragging, dropping, and connecting pre-built neural components. No coding required for model design.
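To give a flavor of the input/output interface idea, here is a minimal rate-coding sketch in plain JAX. This is an illustration of the general concept, not Spark's actual interface API; the function names `rate_encode` and `rate_decode` are hypothetical:

```python
import jax
import jax.numpy as jnp

def rate_encode(x, num_steps, key):
    """Encode values in [0, 1] as Bernoulli spike trains (rate coding)."""
    probs = jnp.clip(x, 0.0, 1.0)
    return jax.random.bernoulli(key, probs, shape=(num_steps, *x.shape))

def rate_decode(spikes):
    """Decode spike trains back into firing rates by averaging over time."""
    return spikes.mean(axis=0)

key = jax.random.PRNGKey(0)
spikes = rate_encode(jnp.array([0.1, 0.9]), num_steps=200, key=key)
rates = rate_decode(spikes)  # roughly recovers [0.1, 0.9]
```

An input interface plays the role of `rate_encode` (ordinary data in, spikes out), and an output interface plays the role of `rate_decode` (spikes in, ordinary data out).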

Getting Started

To try Spark, clone this repository and install it via:

git clone https://github.com/nogarx/spark.git
cd spark
pip install -e .

The Spark Graph Editor

Design your network's structure, set parameters for each component, and connect them to create a model.

Graph Editor screenshot

The Spark visual interface for building SNNs.

You can start the Graph Editor from a Python session,

import spark

editor = spark.GraphEditor()
editor.launch()

or inside a Notebook!

%gui qt
import spark

editor = spark.GraphEditor()
editor.launch()

(Note: The editor is currently at an early stage of development, and it is not recommended for building neurons from scratch due to a small latency introduced when the model is "compiled".)

Afterwards you can simply export your model and build it.

import spark

# Initialize the configuration class
brain_config = spark.nn.BrainConfig.from_file('./my_awesome_model.scfg')

# Construct the brain.
brain = spark.nn.Brain(config=brain_config)

# Build the brain.
brain(**my_awesome_inputs)

To harness the true power of Spark, your model needs to be JIT compiled. JAX requires your model to be traceable, which can sometimes be quite unintuitive. Fortunately, this is simple in Spark, and most of the time it will look very similar to the following function.

import jax
import spark

@jax.jit
def run_model(graph, state, **my_awesome_inputs):
    # Reconstruct the model from its graph definition and state.
    model = spark.merge(graph, state)
    # Compute the outputs.
    out = model(**my_awesome_inputs)
    # Split the model and recover its updated state.
    _, state = spark.split(model)
    return out, state

Finally, do the initial split of the model and use your function.

import jax.numpy as jnp

# Some dummy data
my_awesome_inputs = {
    'x_0': jnp.ones((64,), dtype=jnp.float16),
    ...
    'x_k': jnp.ones((128,), dtype=jnp.float16),
}

# Split the model
graph, state = spark.split(model)

# Use run_model and reuse the state!
out, state = run_model(graph, state, **my_awesome_inputs)

# state now contains the updated state of the model!
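The essence of this pattern, threading an explicit state pytree through a jitted step function, can be shown with a self-contained toy example. This is not Spark's API; it is a sketch of a leaky integrator written directly in JAX, with the state passed in and returned on every call:

```python
import jax
import jax.numpy as jnp

# Toy stand-in for the split/merge pattern: the model state is an
# explicit pytree threaded through a jitted step function.
@jax.jit
def step(state, x):
    # Leaky integrator: decay the membrane potential and add the input.
    v = 0.9 * state["v"] + x
    spikes = (v > 1.0).astype(jnp.float32)
    # Reset the potential wherever a spike occurred.
    return {"v": v * (1.0 - spikes)}, spikes

state = {"v": jnp.zeros((4,))}
for x in jnp.ones((10, 4)) * 0.3:
    state, out = step(state, x)  # reuse the returned state every step
```

Because the state is an explicit argument and return value, `jax.jit` can trace `step` once and reuse the compiled function for every timestep.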

There is much more to Spark than what we can show here; if you are ready to learn more, head to the tutorial section!

Roadmap

We have many exciting features planned.

🔥 Components, a lot of them: Spark is built around the idea of modular neurons. The literature is full of really interesting ideas, but integrating them into existing code is often annoying and error-prone. One of our goals is to turn those ideas into modular, reusable, and pluggable code.

📊 Built-in Visualization: (Maybe Coming Soon?) Tools for visualizing spike trains, membrane potentials, and network activity in real-time.

🧮 Custom kernels: Spark is fast, but it can be faster! Several operations can be further optimized using custom kernels.

Contributing

Contributions are what make the open-source community such an amazing place. Any contributions you make are greatly appreciated.

Citing Spark

You can use the following reference to cite this repository:

@software{spark_snn_github,
  author = {Franco, Mario and Gershenson, Carlos},
  title = {Spark: Modular Spiking Neural Networks},
  url = {https://github.com/nogarx/spark/},
  version = {0.1.0},
  year = {2025},
}
