
A modular and asynchronous framework for neuromorphic optimisation using spike-driven metaheuristics and heuristic-controlled spiking dynamics.


NeurOptimiser

NeurOptimiser is a neuromorphic optimisation framework in which metaheuristic search emerges from asynchronous spiking dynamics. It defines optimisation as a decentralised process executed by interconnected Neuromorphic Heuristic Units (NHUs), each embedding a spiking neuron model and a spike-triggered heuristic rule.

This framework enables fully event-driven, low-power optimisation by integrating spiking computation with local heuristic adaptation. It supports multiple neuron models, perturbation operators, and network topologies.
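To make the idea concrete, here is a minimal, self-contained sketch of an NHU-style unit: a leaky integrate-and-fire neuron whose spikes trigger a local heuristic perturbation with greedy selection. This is an illustration of the concept only; the class name, parameters, and dynamics are assumptions, not the package's API.

```python
import numpy as np

class ToyNHU:
    """Hypothetical sketch of a Neuromorphic Heuristic Unit (NHU):
    a leaky integrate-and-fire neuron whose spike events trigger a
    random perturbation of a candidate solution, kept greedily."""

    def __init__(self, x0, obj_func, threshold=1.0, leak=0.9, step=0.1, seed=0):
        self.x = np.asarray(x0, dtype=float)   # candidate solution
        self.obj_func = obj_func
        self.f = obj_func(self.x)              # current fitness
        self.v = 0.0                           # membrane potential
        self.threshold = threshold
        self.leak = leak
        self.step = step
        self.rng = np.random.default_rng(seed)

    def tick(self, input_current):
        """Advance the neuron one step; on a spike, perturb the solution."""
        self.v = self.leak * self.v + input_current
        if self.v >= self.threshold:           # spike event
            self.v = 0.0                       # reset membrane potential
            trial = self.x + self.step * self.rng.standard_normal(self.x.shape)
            f_trial = self.obj_func(trial)
            if f_trial < self.f:               # greedy selection
                self.x, self.f = trial, f_trial
            return True
        return False
```

Driving such a unit with a constant input current produces periodic spikes, each of which attempts one local search move; a network of these units, exchanging states asynchronously, is the decentralised process the framework describes.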


✨ Key Features

  • Modular and extensible architecture using Intel’s Lava.
  • Supports linear and Izhikevich neuron dynamics.
  • Implements random, fixed, directional, and Differential Evolution operators as spike-triggered perturbations.
  • Includes asynchronous neighbourhood management, tensor contraction layers, and greedy selectors.
  • Compatible with the BBOB (COCO) benchmark suite.
  • Designed for scalability, reusability, and future deployment on Loihi-class neuromorphic hardware.
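As an illustration of the spike-triggered perturbation operators listed above, a DE/rand/1 mutation can be fired on each spike: the unit builds a trial vector from the states of three distinct neighbours. A minimal sketch, where the function name and signature are assumptions rather than the package's API:

```python
import numpy as np

def de_rand_1(neighbour_states, rng, F=0.8):
    """Illustrative DE/rand/1 step used as a spike-triggered perturbation:
    combine three distinct neighbour states into a trial vector.
    (Sketch only; not the package's operator interface.)"""
    n = np.asarray(neighbour_states, dtype=float)
    a, b, c = rng.choice(len(n), size=3, replace=False)
    return n[a] + F * (n[b] - n[c])

rng = np.random.default_rng(42)
neighbours = [rng.uniform(-5.0, 5.0, size=2) for _ in range(5)]
trial = de_rand_1(neighbours, rng)
```

Because the operator only reads neighbour states, it fits naturally into an asynchronous, event-driven network: each unit perturbs on its own spike times using whatever neighbour states it last received.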

📖 Documentation

For detailed documentation, examples, and API reference, please visit the Neuroptimiser Documentation.

📦 Installation

pip install neuroptimiser

Ensure you have Python ≥ 3.10 and the Lava-NC environment configured.

You can also clone the repository and install it in editable mode with pip install -e . from the project root. Check the Makefile for additional options.

🚀 Example Usage

from neuroptimiser import NeurOptimiser
import numpy as np

problem_function    = lambda x: np.linalg.norm(x)
problem_bounds      = np.array([[-5.0, 5.0], [-5.0, 5.0]])

optimiser = NeurOptimiser()

optimiser.solve(
    obj_func=problem_function,
    search_space=problem_bounds,
    debug_mode=True,
    num_iterations=1000,
)

For more examples, please visit the Neuroptimiser Usage guide.

📊 Benchmarking

Neuroptimiser has been validated on the BBOB suite, showing:

  • Competitive convergence versus Random Search
  • Consistent results across function types and dimensions
  • Linear runtime scaling with the number of units and the problem size

🔬 Citation

@misc{neuroptimiser2025,
  author       = {Cruz-Duarte, Jorge M. and Talbi, El-Ghazali},
  title        = {Neuroptimiser: A neuromorphic optimisation framework},
  year         = {2025},
  url          = {https://github.com/neuroptimiser/neuroptimiser},
  note         = {Version 1.0.X, accessed on 20XX-XX-XX}
}

🔗 Resources

🛠️ License

MIT License — see LICENSE

🧑‍💻 Authors



Download files

Source Distribution

  • neuroptimiser-1.0.2.tar.gz (41.3 kB)

Built Distribution

  • neuroptimiser-1.0.2-py3-none-any.whl (37.3 kB)

File details: neuroptimiser-1.0.2.tar.gz

  • Size: 41.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing: No
  • Uploaded via: twine/6.2.0 CPython/3.10.14

Hashes:

  • SHA256: db91d5530692c76aeb8caf6b4d32c90904ae4caaba8482ef53deb9a4a84023f3
  • MD5: 135dbb3c6591339862af8e6de54ff649
  • BLAKE2b-256: 3252beaa1251d565fe08ffe4b9e63ffc772239a1030bee99e43d6514d361ba04

File details: neuroptimiser-1.0.2-py3-none-any.whl

  • Size: 37.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing: No
  • Uploaded via: twine/6.2.0 CPython/3.10.14

Hashes:

  • SHA256: e84baba98aa4ede25518625ca782042fd8c70c0d7aa28063005e1af811eea6dc
  • MD5: 9653b7ac3e27db387aba3fb753d0f8ea
  • BLAKE2b-256: c690c373fa170260d324c849d040c3f960472122db4927a6acda2a7c6976cdf3
