
pyperch


pyperch is a lightweight, modular, research- and teaching-oriented library for neural network weight optimization, built directly on top of PyTorch. It trains networks using randomized optimization methods (RHC, SA, GA), gradient-based methods, and hybrid combinations of the two.


Key Features

  • Randomized Optimization Algorithms

    • Randomized Hill Climbing (RHC)
    • Simulated Annealing (SA)
    • Genetic Algorithm (GA)

  • Hybrid Training Support
    Combine layer-wise modes (freeze, grad, meta) to mix gradient-free and gradient-based optimization in the same network (see the sketch after this list).

  • Unified Trainer API
    A single interface for classification, regression, batching, metrics, early stopping, and reproducibility.

  • Pure PyTorch (No Skorch Dependency)
    All examples are built on native PyTorch modules and DataLoader.

  • Modern Configuration System
    Structured configs (TrainConfig, OptimizerConfig, etc.) keep experiments consistent and explicit.

  • Search Integration
    Optuna-based hyperparameter grid search (parallel-ready) for RHC/SA/GA tuning.

  • Utilities Included
    Metrics, plotting helpers, consistent seed control, and structured training outputs.

  • Modern Project Tooling

    • Poetry for dependencies, builds, and publishing
    • Black for code formatting
    • Ruff for linting and import sorting
    • CircleCI for automated testing
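
As a taste of how these pieces fit together, here is a minimal sketch of a hybrid experiment. Everything below is illustrative: the module paths, the fields of TrainConfig and OptimizerConfig, and the Trainer entry point are assumptions made for the sake of the sketch, not pyperch's confirmed API; see the API documentation below for the real signatures.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical imports -- these module paths are assumptions.
from pyperch.config import TrainConfig, OptimizerConfig
from pyperch.trainer import Trainer

# Toy data and a small network.
X, y = torch.randn(256, 20), torch.randint(0, 2, (256,))
loader = DataLoader(TensorDataset(X, y), batch_size=64)
model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))

# Hybrid setup: hidden layer trained with gradients ("grad"), output
# layer with simulated annealing ("meta"). All field and argument
# names below are illustrative assumptions.
opt_cfg = OptimizerConfig(algorithm="sa", t=1000, cooling=0.95)
train_cfg = TrainConfig(epochs=50, batch_size=64, seed=42)
trainer = Trainer(model, optimizer=opt_cfg, train=train_cfg,
                  layer_modes={"0": "grad", "2": "meta"})
results = trainer.fit(loader)  # structured training outputs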


Installation

pip install pyperch

If developing locally:

poetry install

API and Examples

The public API and usage examples are documented and organized as follows:

API Documentation

Key entry points in the user-facing API include:

  • Perch Builder API - experiment construction, training, and hybrid optimization
  • Optuna Search API - hyperparameter search using an adapter-based Optuna integration

Examples

Notebook/Colab examples showing common workflows are included in the repository.

The examples cover:

  • Classification and regression
  • Randomized optimization (RHC, SA, GA)
  • Gradient and hybrid optimization
  • Layer freezing and meta-optimization
  • Optuna-based hyperparameter search
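
For the Optuna-based search, the shape of a study is standard Optuna; only the objective body, where pyperch would do the training, is left as a placeholder. A minimal sketch, with a stand-in score so it runs as written:

import optuna

def objective(trial):
    # Standard Optuna search space; the hyperparameters here
    # (learning rate and SA temperature) are just examples.
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)
    t = trial.suggest_int("t", 100, 10_000, log=True)
    # Stand-in objective so the sketch is runnable; replace with a
    # pyperch training run that returns a validation metric.
    return -abs(lr - 0.01) - abs(t - 1000) / 1000

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=50, n_jobs=4)  # parallel-ready
print(study.best_params)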

Legacy Standalone Optimizers (RHC, SA, GA)

If you are upgrading from Pyperch ≤ 0.1.6, the original standalone (functional) optimizers have been preserved for backward compatibility.

The previous implementations remain available in the repository for reference. The new, refactored optimizers live under:

pyperch.optim.*
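
A minimal sketch of the refactored style, assuming the new optimizers follow torch.optim's closure-based step interface; the class name SA and its keyword arguments are assumptions, not the confirmed API:

import torch
from pyperch.optim import SA  # hypothetical class name

X, y = torch.randn(64, 20), torch.randint(0, 2, (64,))
model = torch.nn.Linear(20, 2)
loss_fn = torch.nn.CrossEntropyLoss()
opt = SA(model.parameters(), t=1000, cooling=0.95)  # args assumed

for _ in range(100):
    def closure():
        # Randomized optimizers evaluate candidate weights, so the
        # closure only needs to return the current loss; no backward
        # pass is required.
        return loss_fn(model(X), y)
    opt.step(closure)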

Contributing

Contributions are welcome. To submit a change:

  1. Fork the repository
  2. Create a feature branch:
     git checkout -b feature/my-change
  3. Commit your work:
     git commit -m "feat: describe your change"
  4. Push your branch:
     git push origin feature/my-change
  5. Open a pull request on GitHub

Code Style

Before opening a PR:

poetry run black pyperch
poetry run ruff check pyperch --fix

This ensures consistent formatting and linting across the project.


License

MIT License

