
Evolutionary optimizer with NOMAD local search


EA‑NOMAD

EA‑NOMAD is a flexible evolutionary optimizer that couples global search (crossover + mutation) with local, derivative‑free refinement via the NOMAD solver, exposed through PyNomad.

Why EA‑NOMAD?

  • Derivative‑free continuous control – no action discretisation, no back‑prop through time; handles dense recurrence gracefully.

  • Prior‑aware search – starts from a biological connectome (or any strong prior).

    • Pure mode: many tiny edits → minimal drift.
    • Hybrid mode: < 50 edits → structural fidelity.
  • Embarrassingly parallel – flip on Ray to scale linearly with CPU cores.


Installation

pip install EANOMAD

# Development version (clone + editable install)
git clone https://github.com/greenfire0/EA-NOMAD.git
cd EA-NOMAD
pip install -e .[ray]  # optional: Ray for parallel NOMAD calls

Requirements: Python >= 3.9 · NumPy >= 1.23 · tqdm · PyNomad >= 0.9 · (optional: ray)


Quick start

import numpy as np
from EANOMAD import EANOMAD

obj = lambda x: -np.sum(x**2)  # maximise => global optimum at x = 0

opt = EANOMAD(
    "pure",                          # or "hybrid"
    population_size=32,
    dimension=10,
    objective_fn=obj,
    subset_size=5,
    bounds=0.2,
    max_bb_eval=100,
    n_mutate_coords=2,
)

best_x, best_fit = opt.run(generations=100)
print(f"Best fitness: {best_fit:.4f}")

API

EANOMAD(
    optimizer_type: Literal["pure", "hybrid"],
    population_size: int,
    dimension: int,
    objective_fn: Callable[[np.ndarray], float],
    subset_size: int = 20,
    bounds: float = 0.1,
    max_bb_eval: int = 200,
    n_elites: int | None = None,
    n_mutate_coords: int = 5,
    crossover_rate: float = 0.5,
    crossover_type: Literal["uniform", "fitness"] = "uniform",
    crossover_exponent: float = 1.0,
    init_pop: np.ndarray | None = None,
    init_vec: np.ndarray | None = None,
    low: float = -1.0,
    high: float = 1.0,
    use_ray: bool | None = None,
    seed: int | None = None,
)
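
For example, a hybrid-mode run seeded from an existing weight vector and parallelised with Ray might look like the sketch below. The prior vector and fitness function are illustrative placeholders, and it is assumed here that init_vec seeds the population from a single prior:

import numpy as np
from EANOMAD import EANOMAD

# Hypothetical prior: a flattened weight vector (e.g. a connectome) of length 300.
rng = np.random.default_rng(0)
prior = rng.uniform(-1.0, 1.0, size=300)

def fitness(w: np.ndarray) -> float:
    # Placeholder objective: penalise drift away from the prior (higher is better).
    return -float(np.sum((w - prior) ** 2))

opt = EANOMAD(
    "hybrid",
    population_size=64,
    dimension=prior.size,
    objective_fn=fitness,
    subset_size=20,        # ≤ 49 coordinates refined per NOMAD call
    bounds=0.1,            # half-width of each local search box
    max_bb_eval=200,
    n_mutate_coords=5,
    init_vec=prior,        # assumed to seed the population from this prior
    use_ray=True,          # requires the optional ray extra
    seed=42,
)

best_x, best_fit = opt.run(generations=50)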

Methods

EA‑NOMAD offers two training strategies that differ only in when and how NOMAD is invoked within the evolutionary loop.

Pure mode

Every generation, each individual in the population is passed to NOMAD for local refinement:

  1. Slice selection – Pick subset_size coordinates at random (at most 49, matching the small problem sizes NOMAD is designed for).
  2. Local search – Run PyNomad with a ±bounds hyper‑rectangle around that slice and a budget of max_bb_eval evaluations.
  3. Replacement – If the refined individual improves its fitness, it replaces the original.
  4. Reproduction – Select the top n_elites by fitness, then fill the rest of the population via fitness‑proportional crossover (probability crossover_rate) followed by random‑reset mutation (n_mutate_coords coordinates).

Pure mode tends to make many small synaptic adjustments, keeping the overall L2 distance to the original connectome low while steadily improving reward.
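
The loop below is a minimal, self-contained sketch of one Pure-mode generation. Here local_refine is a plain random-search stand-in for the PyNomad call (the real optimizer uses NOMAD's MADS algorithm), and reproduction is reduced to elitism; it illustrates the four steps above rather than mirroring the library's internals:

import numpy as np

def local_refine(obj, x, idx, bounds, budget, rng):
    # Stand-in for the PyNomad call: random search inside a ±bounds box
    # around the selected coordinate slice (the real code uses NOMAD/MADS).
    best, best_f = x.copy(), obj(x)
    for _ in range(budget):
        cand = x.copy()
        cand[idx] = x[idx] + rng.uniform(-bounds, bounds, size=idx.size)
        f = obj(cand)
        if f > best_f:
            best, best_f = cand, f
    return best, best_f

def pure_generation(pop, fits, obj, *, subset_size, bounds, max_bb_eval,
                    n_elites, rng):
    dim = pop.shape[1]
    # Steps 1-3: slice selection, local search, replacement on improvement.
    for i in range(len(pop)):
        idx = rng.choice(dim, size=subset_size, replace=False)
        cand, f = local_refine(obj, pop[i], idx, bounds, max_bb_eval, rng)
        if f > fits[i]:
            pop[i], fits[i] = cand, f
    # Step 4 (simplified): keep the n_elites best; crossover + mutation
    # would normally refill the rest of the population.
    order = np.argsort(fits)[::-1][:n_elites]
    return pop[order], fits[order]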

Hybrid mode

An evolutionary mutation proposes a sparse change‑set first; NOMAD then fine‑tunes only those altered weights:

  1. Mutation – Each offspring mutates a random subset of weights (usually < 50).
  2. Targeted NOMAD – If the diff mask is novel and < 50 coords, run PyNomad only on that mask.
  3. Evaluation & elitism – Update fitness, retain best individuals, proceed with crossover/mutation.

Hybrid mode yields comparable rewards to Pure mode while changing far fewer synapses – ideal when biological plausibility demands minimal rewiring.
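
A comparable sketch for a single Hybrid-mode offspring is shown below, reusing the local_refine stand-in from the Pure-mode sketch and assuming the change-set is tracked as a tuple of mutated indices:

import numpy as np

def hybrid_offspring(parent, obj, *, n_mutate_coords, bounds, max_bb_eval,
                     rng, seen_masks, low=-1.0, high=1.0):
    # Step 1: an evolutionary mutation proposes a sparse change-set.
    child = parent.copy()
    mask = rng.choice(child.size, size=n_mutate_coords, replace=False)
    child[mask] = rng.uniform(low, high, size=mask.size)
    # Step 2: targeted refinement, only if the diff mask is novel and < 50 coords.
    key = tuple(sorted(mask.tolist()))
    if key not in seen_masks and mask.size < 50:
        seen_masks.add(key)
        child, _ = local_refine(obj, child, mask, bounds, max_bb_eval, rng)
    # Step 3: evaluate; elitism and crossover happen at the population level.
    return child, obj(child)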

Key hyper‑parameters (shared):

Name              Effect
subset_size       number of parameters NOMAD refines per call (≤ 49)
bounds            half-width of the NOMAD search box
max_bb_eval       NOMAD evaluations per call
n_mutate_coords   coordinates reset per mutation
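
As a rough illustration of how these interact (illustrative numbers, not recommendations from the library): small bounds with a generous max_bb_eval favours careful polishing close to the prior, while a larger subset_size and bounds buys broader exploration per call:

from EANOMAD import EANOMAD

# fitness: any callable mapping a length-300 vector to a float (see the API example).

# Conservative refinement: tiny boxes, many evaluations per NOMAD call.
opt_fine = EANOMAD("hybrid", population_size=64, dimension=300,
                   objective_fn=fitness, subset_size=10, bounds=0.05,
                   max_bb_eval=400, n_mutate_coords=2)

# Broader search: bigger slices and boxes, fewer evaluations per call.
opt_broad = EANOMAD("pure", population_size=64, dimension=300,
                    objective_fn=fitness, subset_size=40, bounds=0.3,
                    max_bb_eval=100, n_mutate_coords=8)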

Testing

pip install -e .[dev]  # includes pytest, ruff, black, etc.
pytest -q              # run smoke + reproducibility tests

Contributing

  1. Fork + create a feature branch
  2. Run pre-commit install
  3. Add unit tests for new behavior
  4. PR + short summary of the change

License

MIT License — see LICENSE file


Acknowledgements

  • Hi, my name is Miles. I hope you enjoy these algorithms and optimize some cool shit using them <3
  • PyNomad
  • NOMAD team at Polytechnique Montréal / GERAD
