A modular and asynchronous framework for neuromorphic optimisation using spike-driven metaheuristics and heuristic-controlled spiking dynamics.
NeurOptimiser
NeurOptimiser is a neuromorphic optimisation framework in which metaheuristic search emerges from asynchronous spiking dynamics. It defines optimisation as a decentralised process executed by interconnected Neuromorphic Heuristic Units (NHUs), each embedding a spiking neuron model and a spike-triggered heuristic rule.
This framework enables fully event-driven, low-power optimisation by integrating spiking computation with local heuristic adaptation. It supports multiple neuron models, perturbation operators, and network topologies.
✨ Key Features
- Modular and extensible architecture using Intel’s Lava.
- Supports linear and Izhikevich neuron dynamics.
- Implements random, fixed, directional, and Differential Evolution operators as spike-triggered perturbations.
- Includes asynchronous neighbourhood management, tensor contraction layers, and greedy selectors.
- Compatible with BBOB (COCO) suite.
- Designed for scalability, reusability, and future deployment on Loihi-class neuromorphic hardware.
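To make the idea concrete, here is a minimal, self-contained sketch of what a single Neuromorphic Heuristic Unit does conceptually: a leaky linear neuron integrates input, and each spike triggers a random perturbation of a candidate solution that a greedy selector accepts or rejects. This is an illustration only, not the library's API; all names and parameter values are assumptions.

```python
import numpy as np

# Illustrative sketch of one Neuromorphic Heuristic Unit (NHU):
# a leaky linear neuron whose spikes trigger a random perturbation
# of a candidate solution, kept only if it improves (greedy selector).
# All names and parameter values are assumptions, not the library API.

rng = np.random.default_rng(42)

def sphere(x):
    """Toy objective: squared Euclidean norm."""
    return float(np.dot(x, x))

dim, lo, hi = 2, -5.0, 5.0
x = rng.uniform(lo, hi, size=dim)   # candidate solution held by the unit
fx = sphere(x)

v, threshold, leak = 0.0, 1.0, 0.9  # membrane potential, spike threshold, linear leak

for _ in range(500):
    v = leak * v + rng.uniform(0.0, 0.3)            # linear integration of input
    if v >= threshold:                              # spike event
        v = 0.0                                     # reset membrane potential
        trial = x + rng.normal(0.0, 0.5, size=dim)  # spike-triggered perturbation
        trial = np.clip(trial, lo, hi)              # respect the search bounds
        f_trial = sphere(trial)
        if f_trial < fx:                            # greedy selection
            x, fx = trial, f_trial
```

In the actual framework, many such units run asynchronously and exchange candidate solutions through their neighbourhood connections, rather than operating in isolation as above.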
📖 Documentation
For detailed documentation, examples, and API reference, please visit the Neuroptimiser Documentation.
📦 Installation
Install the latest release from PyPI:

pip install neuroptimiser

Ensure you have Python ≥ 3.10 and the Lava-NC environment configured. You can also clone the repository and install it in editable mode with pip install -e .; check the Makefile for additional options.
🚀 Example Usage
from neuroptimiser import NeurOptimiser
import numpy as np

problem_function = lambda x: np.linalg.norm(x)
problem_bounds = np.array([[-5.0, 5.0], [-5.0, 5.0]])

optimiser = NeurOptimiser()
optimiser.solve(
    obj_func=problem_function,
    search_space=problem_bounds,
    debug_mode=True,
    num_iterations=1000,
)
For more examples, please visit the Neuroptimiser Usage page.
📊 Benchmarking
Neuroptimiser has been validated on the BBOB (COCO) suite, showing:
- Competitive convergence versus random search
- Consistent results across function types and dimensions
- Linear runtime scaling with the number of units and the problem size
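As a toy illustration of the first point, the sketch below contrasts plain random search with a spike-driven greedy perturbation scheme on a 5-D sphere function under the same evaluation budget. It is a pedagogical example, not a BBOB experiment; all names and parameters are illustrative assumptions.

```python
import numpy as np

# Toy comparison (not a BBOB run): plain random search vs a spike-driven
# greedy perturbation scheme on a 5-D sphere, same evaluation budget.

def sphere(x):
    return float(np.dot(x, x))

dim, budget, lo, hi = 5, 2000, -5.0, 5.0

# Baseline: uniform random search, keep the best of `budget` samples
rng = np.random.default_rng(0)
best_rs = min(sphere(rng.uniform(lo, hi, dim)) for _ in range(budget))

# Spike-driven scheme: objective evaluations happen only on spike events
rng = np.random.default_rng(0)
x = rng.uniform(lo, hi, dim)
fx = sphere(x)
v, threshold, leak = 0.0, 1.0, 0.9
evals = 0
while evals < budget:
    v = leak * v + rng.uniform(0.0, 0.5)      # linear integration
    if v >= threshold:                        # spike event
        v = 0.0
        trial = np.clip(x + rng.normal(0.0, 0.3, dim), lo, hi)
        f_trial = sphere(trial)
        evals += 1
        if f_trial < fx:                      # greedy selection
            x, fx = trial, f_trial

print(f"random search best: {best_rs:.3f}, spike-driven best: {fx:.3f}")
```

Because evaluations are tied to spike events, the scheme is naturally event-driven: compute is spent only when a unit fires, which is the property that makes the framework amenable to neuromorphic hardware.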
🔬 Citation
@misc{neuroptimiser2025,
author={Cruz-Duarte, Jorge M. and Talbi, El-Ghazali},
title = {Neuroptimiser: A neuromorphic optimisation framework},
year = {2025},
url = {https://github.com/neuroptimiser/neuroptimiser},
note = {Version 1.0.X, accessed on 20XX-XX-XX}
}
🔗 Resources
- 📘 Documentation
- 📜 Paper
- 🧠 Intel Lava-NC
- 🧪 COCO Platform
🛠️ License
MIT License — see LICENSE
🧑‍💻 Authors
- Jorge M. Cruz-Duarte — University of Lille
- El-Ghazali Talbi — University of Lille
Download files
Source Distribution
Built Distribution
File details
Details for the file neuroptimiser-1.0.3.tar.gz.
File metadata
- Download URL: neuroptimiser-1.0.3.tar.gz
- Upload date:
- Size: 41.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4ea6ed94a55cbf6a547c6224e52a3f9ea5824ab06d80e992937c43aae6f57977 |
| MD5 | c4ad3f6f6c3ef279d65006a34a11eae6 |
| BLAKE2b-256 | c4d1045769aac1b30cf51a8dcdb897989c3e6be8a7afc8dade1418b1a6f88310 |
File details
Details for the file neuroptimiser-1.0.3-py3-none-any.whl.
File metadata
- Download URL: neuroptimiser-1.0.3-py3-none-any.whl
- Upload date:
- Size: 37.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.14
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 4403513b2b66678e4713e97b076283f87e093dd6a5cd981c07a979f8f0d7360c |
| MD5 | 7d89e658299bcb77538beb6742d52ed2 |
| BLAKE2b-256 | be716fb17f592f996088b6813eb17c6dd5d77e215daa475c76d5114556f97f8f |