pyperch
A lightweight, modular library for neural network weight optimization using randomized search algorithms, built directly on top of PyTorch. Pyperch provides flexible alternatives to gradient-based optimization, enabling experimentation with Randomized Hill Climbing, Simulated Annealing, and Genetic Algorithms, as well as hybrid approaches where these methods work alongside traditional optimizers such as Adam or SGD.
The current version introduces a unified training API, an improved configuration system, deterministic behavior, and a cleaner separation between optimization logic and the training loop.
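To make the idea concrete, here is a minimal sketch of randomized hill climbing over network weights, written in plain PyTorch. It is not pyperch's API; the model, data, and step size are placeholders chosen only to illustrate the perturb-and-keep-if-better loop that RHC-style optimizers perform.

```python
# Illustrative only: a bare-bones randomized hill climbing loop in
# plain PyTorch. This is NOT pyperch's API; the model, data, and
# step size are placeholder assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))
loss_fn = nn.CrossEntropyLoss()
X, y = torch.randn(64, 4), torch.randint(0, 3, (64,))

@torch.no_grad()
def rhc_step(step_size=0.1):
    """Perturb one random weight; keep the move only if the loss improves."""
    params = list(model.parameters())
    p = params[torch.randint(len(params), (1,)).item()].view(-1)
    idx = torch.randint(p.numel(), (1,)).item()
    old = p[idx].item()
    before = loss_fn(model(X), y).item()
    p[idx] = old + step_size * torch.randn(1).item()
    after = loss_fn(model(X), y).item()
    if after >= before:
        p[idx] = old  # no improvement: revert the perturbation
    return min(before, after)

for _ in range(1000):
    rhc_step()
```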
Key Features
- Randomized Optimization Algorithms
  - Randomized Hill Climbing (RHC)
  - Simulated Annealing (SA)
  - Genetic Algorithm (GA)
- Hybrid Training Support: combine layer-wise modes (freeze, grad, meta) to mix gradient-free and gradient-based optimization in the same network (see the first sketch after this list).
- Unified Trainer API: an interface for classification, regression, batching, metrics, early stopping, and reproducibility.
- Pure PyTorch (No Skorch Dependency): all examples are built on native PyTorch modules and DataLoader.
- Modern Configuration System: structured configs (TrainConfig, OptimizerConfig, etc.) keep experiments consistent and explicit.
- Utility Functions Included: metrics, plotting helpers, seed control, and structured outputs.
- Search Integration: Optuna-based hyperparameter grid search (parallel-ready) for RHC/SA/GA tuning (see the second sketch after this list).
- Modern Project Tooling
  - Poetry for dependencies, builds, and publishing
  - Black for code formatting
  - Ruff for linting and import sorting
  - CircleCI for automated testing
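The layer-wise hybrid idea can be approximated in plain PyTorch. The first sketch below is only an analogy for what the freeze/grad modes enable (pyperch's actual mode API may look quite different): one layer is frozen for autograd and moved by random perturbation, while the remaining layers train with Adam.

```python
# First sketch: hybrid gradient-free / gradient-based training in
# plain PyTorch. This approximates the idea behind pyperch's
# layer-wise modes; it is NOT the library's actual API.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))
loss_fn = nn.CrossEntropyLoss()
X, y = torch.randn(64, 4), torch.randint(0, 3, (64,))

rhc_layer = model[0]
rhc_layer.requires_grad_(False)  # frozen for autograd; updated by RHC below
opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=1e-3)

for _ in range(200):
    # Gradient step for the layers that still require grad.
    opt.zero_grad()
    loss_fn(model(X), y).backward()
    opt.step()

    # RHC step for the frozen layer: keep the perturbation only if it helps.
    with torch.no_grad():
        noise = 0.05 * torch.randn_like(rhc_layer.weight)
        before = loss_fn(model(X), y).item()
        rhc_layer.weight += noise
        if loss_fn(model(X), y).item() >= before:
            rhc_layer.weight -= noise  # revert
```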
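The second sketch shows the general shape of Optuna-based tuning. The search integration shipped with pyperch may wrap this differently; in particular, `train_and_score` below is a hypothetical stand-in for a full training run, and the hyperparameter names are assumptions.

```python
# Second sketch: generic Optuna hyperparameter search. Only the
# Optuna calls are real; `train_and_score` and the parameter names
# are hypothetical placeholders, not pyperch's search API.
import optuna

def train_and_score(step_size: float, t0: float) -> float:
    # Hypothetical stand-in: train with SA (initial temperature t0,
    # neighborhood step_size) and return a validation loss. A dummy
    # quadratic keeps the sketch runnable end to end.
    return (step_size - 0.1) ** 2 + (t0 - 10.0) ** 2

def objective(trial: optuna.Trial) -> float:
    step_size = trial.suggest_float("step_size", 1e-3, 1.0, log=True)
    t0 = trial.suggest_float("t0", 0.1, 100.0, log=True)
    return train_and_score(step_size, t0)

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50, n_jobs=4)  # parallel-ready
print(study.best_params)
```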
Installation
pip install pyperch
If developing locally:
poetry install
Quick Start and Examples (Coming Soon)
Worked examples will be added after the refactor merge. Until then, the sketch below gives a rough, unofficial feel for the intended API.
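Only the plain-PyTorch parts of this preview are real; the Trainer class and every TrainConfig/OptimizerConfig argument are assumptions (the config names appear in the feature list, their fields do not), so the pyperch lines are left as comments.

```python
# Hypothetical preview only. TrainConfig and OptimizerConfig are
# named in the feature list, but the Trainer class and all of the
# arguments below are assumptions about the eventual API.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))
loader = DataLoader(
    TensorDataset(torch.randn(256, 4), torch.randint(0, 3, (256,))),
    batch_size=32,
    shuffle=True,
)

# from pyperch import Trainer, TrainConfig, OptimizerConfig  # assumed imports
# opt_cfg = OptimizerConfig(algorithm="rhc", step_size=0.1)    # assumed fields
# train_cfg = TrainConfig(epochs=50, seed=42, early_stopping=True)
# trainer = Trainer(model, optimizer=opt_cfg, config=train_cfg)
# history = trainer.fit(loader)
```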
Legacy Standalone Optimizers (RHC, SA, GA)
If you are upgrading from Pyperch ≤ 0.1.6, the original standalone (functional) optimizers have been preserved for backward compatibility.
You can find the previous implementations here:
- Git tag: v1-legacy
- Directory: pyperch/optim/

The new refactored optimizers can be found under pyperch.optim.*.
Contributing
Contributions are welcome. To submit a change:
- Fork the repository
- Create a feature branch: git checkout -b feature/my-change
- Commit your work: git commit -m "feat: describe your change"
- Push your branch: git push origin feature/my-change
- Open a pull request on GitHub
Code Style
Before opening a PR:
poetry run black pyperch
poetry run ruff check pyperch --fix
This ensures consistent formatting and linting across the project.
License
MIT License