
A modular and asynchronous framework for neuromorphic optimisation using spike-driven metaheuristics and heuristic-controlled spiking dynamics.


NeurOptimiser

NeurOptimiser is a neuromorphic optimisation framework in which metaheuristic search emerges from asynchronous spiking dynamics. It defines optimisation as a decentralised process executed by interconnected Neuromorphic Heuristic Units (NHUs), each embedding a spiking neuron model and a spike-triggered heuristic rule.

This framework enables fully event-driven, low-power optimisation by integrating spiking computation with local heuristic adaptation. It supports multiple neuron models, perturbation operators, and network topologies.
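To make the idea concrete, here is a minimal, purely hypothetical sketch of a spike-triggered heuristic unit: a leaky linear neuron integrates a drive signal, and each spike triggers a local random perturbation followed by greedy selection. All names, the toy objective, and the threshold/decay values are illustrative assumptions, not the package's actual NHU implementation (which is built on Lava processes).

```python
import random

def sphere(x):
    """Toy objective: minimise the sum of squares."""
    return sum(xi * xi for xi in x)

class ToyNHU:
    """Hypothetical Neuromorphic Heuristic Unit: spiking neuron + local heuristic."""

    def __init__(self, dim, threshold=1.0, decay=0.9, seed=0):
        self.rng = random.Random(seed)
        self.x = [self.rng.uniform(-5, 5) for _ in range(dim)]
        self.fx = sphere(self.x)
        self.v = 0.0              # membrane potential
        self.threshold = threshold
        self.decay = decay

    def step(self, drive):
        # Linear neuron dynamics: leaky integration of the input drive
        self.v = self.decay * self.v + drive
        if self.v >= self.threshold:
            self.v = 0.0          # reset on spike
            self._on_spike()

    def _on_spike(self):
        # Spike-triggered heuristic: random perturbation + greedy selection
        cand = [xi + self.rng.gauss(0.0, 0.5) for xi in self.x]
        fc = sphere(cand)
        if fc < self.fx:
            self.x, self.fx = cand, fc

unit = ToyNHU(dim=3)
for _ in range(200):
    unit.step(drive=0.4)
```

In the actual framework, many such units run asynchronously and exchange candidate solutions through their neighbourhoods rather than operating in isolation.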


✨ Key Features

  • Modular and extensible architecture using Intel’s Lava.
  • Supports linear and Izhikevich neuron dynamics.
  • Implements random, fixed, directional, and Differential Evolution operators as spike-triggered perturbations.
  • Includes asynchronous neighbourhood management, tensor contraction layers, and greedy selectors.
  • Compatible with the BBOB (COCO) benchmark suite.
  • Designed for scalability, reusability, and future deployment on Loihi-class neuromorphic hardware.
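As an example of the perturbation operators listed above, the Differential Evolution "rand/1" rule combines three distinct population members as x' = x_r1 + F · (x_r2 − x_r3). A self-contained sketch follows; the function name, signature, and the value of F are assumptions for illustration, not the package's operator API.

```python
import random

def de_rand_1(population, F=0.8, rng=None):
    """DE/rand/1 perturbation: x_r1 + F * (x_r2 - x_r3).

    population: list of candidate vectors (lists of floats).
    Returns a new trial vector built from three distinct random members.
    """
    rng = rng or random.Random(42)
    r1, r2, r3 = rng.sample(range(len(population)), 3)
    return [a + F * (b - c)
            for a, b, c in zip(population[r1], population[r2], population[r3])]

pop = [[0.0, 0.0], [1.0, 1.0], [2.0, -1.0], [-1.0, 3.0]]
trial = de_rand_1(pop)
```

In NeurOptimiser such a rule would fire only when the hosting unit spikes, which is what makes the search event-driven rather than iteration-driven.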

📖 Documentation

For detailed documentation, examples, and API reference, please visit the Neuroptimiser Documentation.

📦 Installation

pip install neuroptimiser

Ensure you have Python ≥ 3.10 and the Lava-NC environment configured.

Alternatively, clone the repository and install it in editable mode with pip install -e .. Check the Makefile for additional options.

🚀 Example Usage

from neuroptimiser import NeuroOptimiser

neuropt = NeuroOptimiser(
    problem=my_bbob_function,
    dimension=10,
    num_units=30,
    neuron_model="izhikevich",
    spike_operator="de_rand",
    spike_condition="l2"
)

neuropt.run(max_steps=1000)
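The snippet above assumes my_bbob_function is a callable objective. Its exact expected signature is not documented here, but a plain vector-to-scalar function such as the BBOB f1 (sphere) landscape is the natural stand-in; treat the signature as an assumption.

```python
# Illustrative stand-in for `my_bbob_function`: the BBOB f1 (sphere) function,
# written as a plain vector -> scalar callable. Whether NeuroOptimiser expects
# exactly this signature is an assumption.
def my_bbob_function(x):
    return sum(xi * xi for xi in x)
```

For real benchmarking, the COCO (cocoex) bindings provide the official BBOB problem instances instead of hand-written objectives.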

📊 Benchmarking

Neuroptimiser has been validated on the BBOB suite, showing:

  • Competitive convergence versus Random Search
  • Consistent results across function types and dimensions
  • Linear runtime scaling with number of units and problem size
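For context, the Random Search baseline referred to above can be reproduced in a few lines. This is a hedged sketch on a toy sphere objective; the reported benchmarks used the COCO/BBOB tooling, not this snippet.

```python
import random

def random_search(objective, dim, budget=1000, bounds=(-5.0, 5.0), seed=0):
    """Uniform random search: the baseline the framework is compared against."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(budget):
        x = [rng.uniform(*bounds) for _ in range(dim)]
        fx = objective(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

best_x, best_f = random_search(lambda x: sum(xi * xi for xi in x), dim=5)
```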

🔬 Citation

@misc{neuroptimiser2025,
  author       = {Cruz-Duarte, Jorge M. and Talbi, El-Ghazali},
  title        = {Neuroptimiser: A neuromorphic optimisation framework},
  year         = {2025},
  url          = {https://github.com/neuroptimiser/neuroptimiser},
  note         = {Version 1.0.X, accessed on 20XX-XX-XX}
}

🔗 Resources

🛠️ License

MIT License — see LICENSE

🧑‍💻 Authors

Jorge M. Cruz-Duarte and El-Ghazali Talbi



Download files

Download the file for your platform.

Source Distribution

neuroptimiser-1.0.1.tar.gz (37.6 kB)

Uploaded Source

Built Distribution


neuroptimiser-1.0.1-py3-none-any.whl (33.4 kB)

Uploaded Python 3

File details

Details for the file neuroptimiser-1.0.1.tar.gz.

File metadata

  • Download URL: neuroptimiser-1.0.1.tar.gz
  • Upload date:
  • Size: 37.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.14

File hashes

Hashes for neuroptimiser-1.0.1.tar.gz:

  • SHA256: 341a4e4e92d8f467d4e8ad6ad9aa1d9e0eae9c33f199d35a7b6b8a87330fd2cf
  • MD5: 43ae8a89ea0c176c270b8d76480c9b89
  • BLAKE2b-256: f29a0e16296e8905a8e94f851e2dd54006911e4928383a9bb7a700ef889d1718


File details

Details for the file neuroptimiser-1.0.1-py3-none-any.whl.

File metadata

  • Download URL: neuroptimiser-1.0.1-py3-none-any.whl
  • Upload date:
  • Size: 33.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.10.14

File hashes

Hashes for neuroptimiser-1.0.1-py3-none-any.whl:

  • SHA256: 04c0934d5fb709020f15ae525f92530d2c245433684bc7d234c0b23bea0d03ad
  • MD5: fe1f793d9d184ea9608c3daddb399aff
  • BLAKE2b-256: 2978e52a50a492a9978ea1b5042cd2bd42c283f86b434e6a5f4b80cb35e13bad

