
pyperch


A lightweight, modular library for neural network weight optimization using randomized search algorithms, built directly on top of PyTorch. Pyperch provides flexible alternatives to gradient-based optimization, enabling experimentation with Randomized Hill Climbing, Simulated Annealing, and Genetic Algorithms, as well as hybrid approaches where these methods work alongside traditional optimizers like Adam or SGD.

The current version introduces a unified training API, improved configuration system, deterministic behavior, and a cleaner separation between optimization logic and training.
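To make the idea concrete, here is a generic textbook simulated-annealing loop over a small network's weights, written in plain PyTorch. This is only an illustration of the technique, not pyperch's implementation; every name is local to the example:

```python
import math

import torch
import torch.nn as nn

# Generic simulated annealing over a network's weights (illustration only).
torch.manual_seed(0)
net = nn.Linear(2, 1)
X = torch.randn(128, 2)
y = X @ torch.tensor([[2.0], [-3.0]]) + 1.0      # noise-free linear target

def evaluate():
    with torch.no_grad():
        return nn.functional.mse_loss(net(X), y).item()

temp, cooling = 1.0, 0.99
initial = current = evaluate()
for step in range(2000):
    with torch.no_grad():
        # Propose a random perturbation of every parameter.
        deltas = [0.1 * torch.randn_like(p) for p in net.parameters()]
        for p, d in zip(net.parameters(), deltas):
            p += d
        candidate = evaluate()
        # Always accept improvements; accept worse moves with prob e^(-delta/T).
        if candidate < current or torch.rand(1).item() < math.exp(-(candidate - current) / temp):
            current = candidate
        else:
            for p, d in zip(net.parameters(), deltas):
                p -= d                            # revert the rejected move
        temp *= cooling                           # cool the temperature
```

As the temperature decays, the loop degenerates into pure hill climbing, which is why a cooling schedule is the key tuning knob for SA-style optimizers.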


Key Features

  • Randomized Optimization Algorithms

    • Randomized Hill Climbing (RHC)
    • Simulated Annealing (SA)
    • Genetic Algorithm (GA)
  • Hybrid Training Support
    Combine layer-wise modes (freeze, grad, meta) to mix gradient-free and gradient-based optimization in the same network.

  • Unified Trainer API
    A single interface covering classification, regression, batching, metrics, early stopping, and reproducibility.

  • Pure PyTorch (No Skorch Dependency)
    All examples are built on native PyTorch modules and DataLoader.

  • Modern Configuration System
    Structured configs (TrainConfig, OptimizerConfig, etc.) keep experiments consistent and explicit.

  • Utility Functions Included
    Metrics, plotting helpers, seed control, and structured outputs.

  • Search Integration
    Optuna-based hyperparameter grid search (parallel-ready) for RHC/SA/GA tuning.

  • Modern Project Tooling

    • Poetry for dependencies, builds, and publishing
    • Black for code formatting
    • Ruff for linting and import sorting
    • CircleCI for automated testing
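As an illustration of the hybrid idea, the layer-wise modes named above (freeze, grad, meta) amount to training some layers by gradient descent while updating others gradient-free. The sketch below uses only plain PyTorch, since pyperch's own API is not shown on this page: most layers train with Adam while the last layer, frozen for autograd, is updated by randomized hill climbing.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
X, y = torch.randn(64, 4), torch.randn(64, 1)
loss_fn = nn.MSELoss()

# "grad"-style layers train with Adam; the last layer is frozen for
# autograd ("meta"-style) and updated by randomized hill climbing instead.
net[2].requires_grad_(False)
opt = torch.optim.Adam([p for p in net.parameters() if p.requires_grad], lr=1e-2)

def evaluate():
    with torch.no_grad():
        return loss_fn(net(X), y).item()

initial_loss = evaluate()
for step in range(100):
    # Gradient step on the trainable layers.
    opt.zero_grad()
    loss_fn(net(X), y).backward()
    opt.step()
    # Hill-climbing step on the frozen layer: keep only improving moves.
    with torch.no_grad():
        before = evaluate()
        delta = 0.05 * torch.randn_like(net[2].weight)
        net[2].weight += delta
        if evaluate() > before:
            net[2].weight -= delta               # revert a worsening move
final_loss = evaluate()
```

Because the hill-climbing step only accepts improving moves, it can never undo the progress made by the gradient steps on the other layers.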


Installation

pip install pyperch

If developing locally:

poetry install

Quick Start and Examples (Coming Soon)

Examples will be added after the refactor merge.
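Until official examples land, here is a rough sketch of what structured configs can look like. The class names TrainConfig and OptimizerConfig are mentioned above, but every field below is illustrative only and not confirmed pyperch API:

```python
from dataclasses import dataclass

# Hypothetical config shapes; field names are assumptions, not pyperch's API.
@dataclass
class OptimizerConfig:
    algorithm: str = "rhc"       # e.g. "rhc", "sa", or "ga"
    step_size: float = 0.1
    restarts: int = 0

@dataclass
class TrainConfig:
    epochs: int = 100
    batch_size: int = 32
    seed: int = 42
    early_stopping: bool = True

cfg = TrainConfig(epochs=50, seed=0)
```

Keeping every experiment knob in explicit, typed dataclasses like these is what makes runs reproducible and easy to diff.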


Legacy Standalone Optimizers (RHC, SA, GA)

If you are upgrading from Pyperch ≤ 0.1.6, the original standalone (functional) optimizers have been preserved for backward compatibility.

The previous implementations remain available in the repository.

The new refactored optimizers can be found under:

pyperch.optim.*
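For reference, the genetic-algorithm family these modules implement looks roughly like the following textbook sketch over a flat parameter vector (a generic illustration, not pyperch's code):

```python
import torch

torch.manual_seed(0)
target = torch.tensor([2.0, -3.0, 1.0])        # weights the search should recover

def fitness(pop):                              # lower is better
    return ((pop - target) ** 2).sum(dim=1)

pop = torch.randn(50, 3)                       # random initial population
init_score = fitness(pop).min().item()
for gen in range(100):
    elite = pop[fitness(pop).argsort()[:10]]   # selection: keep the 10 best
    idx = torch.randint(0, 10, (50, 2))        # pick random parent pairs
    children = (elite[idx[:, 0]] + elite[idx[:, 1]]) / 2   # crossover: average parents
    pop = children + 0.05 * torch.randn_like(children)     # mutation: small noise
best_score = fitness(pop).min().item()
```

Selection, crossover, and mutation are the three operators any GA-based optimizer exposes as hyperparameters (elite size, crossover scheme, mutation scale).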

Contributing

Contributions are welcome. To submit a change:

  1. Fork the repository
  2. Create a feature branch:
     git checkout -b feature/my-change
  3. Commit your work:
     git commit -m "feat: describe your change"
  4. Push your branch:
     git push origin feature/my-change
  5. Open a pull request on GitHub

Code Style

Before opening a PR:

poetry run black pyperch
poetry run ruff check pyperch --fix

This ensures consistent formatting and linting across the project.


License

MIT License

Project details


Download files

Download the file for your platform.

Source Distribution

pyperch-0.2.0.tar.gz (27.0 kB view details)

Uploaded Source

File details

Details for the file pyperch-0.2.0.tar.gz.

File metadata

  • Download URL: pyperch-0.2.0.tar.gz
  • Upload date:
  • Size: 27.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 colorama/0.4.4 importlib-metadata/4.6.4 keyring/23.5.0 pkginfo/1.8.2 readme-renderer/34.0 requests-toolbelt/0.9.1 requests/2.25.1 rfc3986/1.5.0 tqdm/4.66.1 urllib3/1.26.5 CPython/3.10.12

File hashes

Hashes for pyperch-0.2.0.tar.gz

  • SHA256: c608bd09d5abc0b669a3bfd46f4bdcaabc42e36fcd0b294a305827c2354f4e08
  • MD5: c95e5dccd35a9b6879201ca556f0037f
  • BLAKE2b-256: 668a473ce078ab57420136ddddf077300b16e1cf0eed15a2b78c889a2dcd631f
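A downloaded sdist can be checked against the published SHA256 with the standard library; the filename in the final comment assumes the archive was saved to the current directory:

```python
import hashlib

# SHA256 digest copied from the hash list above.
EXPECTED = "c608bd09d5abc0b669a3bfd46f4bdcaabc42e36fcd0b294a305827c2354f4e08"

def sha256_of(path, chunk=8192):
    """Stream a file through SHA256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for buf in iter(lambda: f.read(chunk), b""):
            h.update(buf)
    return h.hexdigest()

# sha256_of("pyperch-0.2.0.tar.gz") == EXPECTED   # run after downloading
```

Reading in chunks keeps memory flat regardless of archive size, which matters more for larger distributions than this 27 kB one.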

