# ENOMAD

Evolutionary optimizer with NOMAD local search.
ENOMAD is a flexible evolutionary optimizer that couples global search (crossover + mutation) with local, derivative‑free refinement via the NOMAD solver, exposed through PyNomad.
## Why ENOMAD?
- **Derivative‑free continuous control** – no action discretisation, no back‑prop through time; handles dense recurrence gracefully.
- **Prior‑aware search** – starts from a biological connectome (or any strong prior).
  - EA mode: many tiny edits → minimal drift.
  - rEA mode: < 50 edits → structural fidelity.
- **Embarrassingly parallel** – flip on Ray to scale linearly with CPU cores.
## Installation

```bash
pip install EANOMAD

# Development version (clone + editable install)
git clone https://github.com/greenfire0/EA-NOMAD.git
cd EA-NOMAD
pip install -e .[ray]   # optional: Ray for parallel NOMAD calls
```
**Requirements:** Python >= 3.9 · NumPy >= 1.23 · tqdm · PyNomad >= 0.9 · (optional: ray)
## Quick start

A minimal example:
```python
import numpy as np
from EANOMAD import EANOMAD

obj = lambda x: -np.sum(x**2)  # maximise => global optimum at x = 0

opt = EANOMAD(
    "EA",                # or "rEA"
    population_size=32,
    dimension=10,
    objective_fn=obj,
    subset_size=5,
    bounds=0.2,
    max_bb_eval=100,
    n_mutate_coords=2,
)
best_x, best_fit = opt.run(generations=100)
print(f"Best fitness: {best_fit:.4f}")
```
## API

```python
EANOMAD(
    optimizer_type: Literal["EA", "rEA"],
    population_size: int,
    dimension: int,
    objective_fn: Callable[[np.ndarray], float],
    subset_size: int = 20,
    bounds: float = 0.1,
    max_bb_eval: int = 200,
    n_elites: int | None = None,
    n_mutate_coords: int = 5,
    crossover_rate: float = 0.5,
    crossover_type: Literal["uniform", "fitness"] = "uniform",
    crossover_exponent: float = 1.0,
    init_pop: np.ndarray | None = None,
    init_vec: np.ndarray | None = None,
    low: float = -1.0,
    high: float = 1.0,
    use_ray: bool | None = None,
    seed: int | None = None,
)
```
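The `crossover_type` and `crossover_exponent` parameters control how parents are weighted during reproduction. One plausible reading (an illustrative NumPy sketch, not the library's actual implementation; `parent_probs` is a hypothetical helper) is fitness‑proportional selection, with the exponent sharpening the weights:

```python
import numpy as np

def parent_probs(fitnesses, crossover_type="fitness", exponent=1.0):
    """Selection weights for crossover: uniform, or fitness-proportional
    raised to `exponent` (sharper selection for exponent > 1).
    Illustrative reading of the parameters, not library code."""
    f = np.asarray(fitnesses, dtype=float)
    if crossover_type == "uniform":
        return np.full(len(f), 1.0 / len(f))
    w = (f - f.min() + 1e-12) ** exponent   # shift so weights are nonnegative
    return w / w.sum()

p = parent_probs([-4.0, -1.0, -0.25], crossover_type="fitness", exponent=2.0)
assert np.isclose(p.sum(), 1.0)
assert p[2] > p[0]   # fitter individuals are picked more often
```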
## Modes
EA‑NOMAD offers two training strategies that differ only in when and how NOMAD is invoked within the evolutionary loop.
### EA mode
Every generation, each individual in the population is passed to NOMAD for local refinement:
1. **Slice selection** – pick `subset_size` coordinates at random (≤ 49, per NOMAD's convergence guarantees).
2. **Local search** – run PyNomad in a ±`bounds` hyper‑rectangle around that slice with a budget of `max_bb_eval` evaluations.
3. **Replacement** – if the refined individual improves its fitness, it replaces the original.
4. **Reproduction** – select the top `n_elites` by fitness, then fill the rest of the population via fitness‑proportional crossover (probability `crossover_rate`) followed by random‑reset mutation (`n_mutate_coords` coordinates).
EA mode tends to make many small synaptic adjustments, keeping the overall L2 distance to the original connectome low while steadily improving reward.
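The per‑generation flow above can be sketched with NumPy alone. Here a simple random search stands in for the actual PyNomad call, and `refine_slice` is a hypothetical helper for illustration, not part of the library:

```python
import numpy as np

rng = np.random.default_rng(0)

def refine_slice(x, fitness_fn, idx, bounds=0.1, budget=50):
    """Stand-in for the NOMAD call: sample points in a +/-bounds box
    around the selected coordinates and keep the best."""
    best_x, best_f = x.copy(), fitness_fn(x)
    for _ in range(budget):
        cand = x.copy()
        cand[idx] += rng.uniform(-bounds, bounds, size=len(idx))
        f = fitness_fn(cand)
        if f > best_f:                       # maximisation, as in the quick start
            best_x, best_f = cand, f
    return best_x, best_f

fitness = lambda x: -np.sum(x ** 2)          # optimum at x = 0
x = rng.normal(size=10)
idx = rng.choice(10, size=5, replace=False)  # "slice selection": subset_size = 5
x2, f2 = refine_slice(x, fitness, idx)
assert f2 >= fitness(x)                      # "replacement": never worse
```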
### rEA mode
An evolutionary mutation proposes a sparse change‑set first; NOMAD then fine‑tunes only those altered weights:
1. **Mutation** – each offspring mutates a random subset of weights (usually < 50).
2. **Targeted NOMAD** – if the diff mask is novel and < 50 coords, run PyNomad only on that mask.
3. **Evaluation & elitism** – update fitness, retain the best individuals, proceed with crossover/mutation.
rEA mode yields comparable rewards to EA mode while changing far fewer synapses – ideal when biological plausibility demands minimal rewiring.
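The key property of rEA mode, refining only the mutated coordinates so every untouched weight stays identical to the prior, can be sketched in plain NumPy (random search again stands in for PyNomad; `sparse_mutate` is a hypothetical helper):

```python
import numpy as np

rng = np.random.default_rng(1)

def sparse_mutate(x, n_coords=3, low=-1.0, high=1.0):
    """Propose a sparse change-set: reset a few random coordinates."""
    mask = rng.choice(len(x), size=n_coords, replace=False)
    child = x.copy()
    child[mask] = rng.uniform(low, high, size=n_coords)
    return child, mask

fitness = lambda x: -np.sum(x ** 2)
parent = rng.normal(size=20)
child, mask = sparse_mutate(parent)

# "Targeted NOMAD": refine only the masked coordinates (random-search stand-in)
best = child.copy()
for _ in range(100):
    cand = best.copy()
    cand[mask] += rng.uniform(-0.2, 0.2, size=len(mask))
    if fitness(cand) > fitness(best):
        best = cand

# All untouched weights are exactly the parent's: minimal rewiring.
untouched = np.setdiff1d(np.arange(20), mask)
assert np.array_equal(best[untouched], parent[untouched])
```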
**Key hyper‑parameters (shared):**

| name | effect |
|---|---|
| `subset_size` | # parameters NOMAD refines per call (≤ 49) |
| `bounds` | half‑width of the NOMAD search box |
| `max_bb_eval` | NOMAD evaluations per call |
| `n_mutate_coords` | coordinates reset per mutation |
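To make the geometry of `bounds` concrete: it is the half‑width of the box NOMAD searches around the current slice. The sketch below also clips that box to the global `[low, high]` range; the clipping behaviour is an assumption for illustration, and `nomad_box` is a hypothetical helper, not a library function:

```python
import numpy as np

def nomad_box(x, idx, bounds=0.1, low=-1.0, high=1.0):
    """The +/-bounds search box around the refined coordinates,
    clipped to the global [low, high] range (clipping is an assumption)."""
    lb = np.clip(x[idx] - bounds, low, high)
    ub = np.clip(x[idx] + bounds, low, high)
    return lb, ub

x = np.array([0.95, -0.2, 0.0])
lb, ub = nomad_box(x, idx=np.array([0, 1, 2]), bounds=0.1)
assert np.all(ub - lb <= 2 * 0.1 + 1e-12)  # width never exceeds 2 * bounds
assert ub[0] == 1.0                        # clipped at the global upper bound
```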
## Testing

```bash
pip install -e .[dev]   # includes pytest, ruff, black, etc.
pytest -q               # run smoke + reproducibility tests
```
## Contributing

- Fork and create a feature branch
- Run `pre-commit install`
- Add unit tests for new behavior
- Open a PR with a short summary of the change
## License
MIT License — see LICENSE file
## Acknowledgements

- Hi, my name is Miles. I hope you enjoy these algorithms and optimize some cool stuff using them <3
- PyNomad
- NOMAD team at Polytechnique Montréal / GERAD