
Hyperparameter Optimizer

A lightweight and flexible Python library for hyperparameter tuning using metaheuristic techniques.
Currently, Particle Swarm Optimization (PSO) is the only algorithm provided.

Designed for Scikit-learn-compatible models, this package offers an easy-to-use interface for optimizing model performance in just a few lines.



📦 Installation

Install from PyPI:

pip install hyperparameter-optimizer

⚙️ Features

  • ✅ Metaheuristic optimization using Particle Swarm Optimization (PSO)
  • ✅ Supports continuous, integer, and categorical hyperparameters
  • ✅ Compatible with any Scikit-learn estimator
  • ✅ Custom scoring metrics
  • ✅ Cross-validation built-in
  • ✅ Verbose logging and full traceability

📚 API Reference

HyperparameterOptimizer

HyperparameterOptimizer(
    obj_func,           # Scikit-learn estimator or pipeline to optimize
    params,             # dict whose keys name the hyperparameters to tune
    scoring,            # scoring metric (e.g., 'accuracy')
    opt_type="max",     # "max" (default) to maximize the score, "min" to minimize it
    cv=5,               # number of cross-validation folds
    verbose=1           # 1 (default) to log per-iteration progress, 0 for silent
)

optimizePS

optimizePS(
    features,           # training features
    target,             # training target
    nParticles,         # number of particles in the swarm
    bounds,             # per parameter: a (min, max) tuple or a list of categorical choices;
                        # if either endpoint is non-integer, the parameter is treated as continuous (float)
    w=0.5,              # inertia weight
    c1=1,               # cognitive weight
    c2=1,               # social weight
    maxIter=20,         # maximum number of iterations
    mutation_prob=0.1   # mutation probability (for discrete hyperparameters)
)
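To make the `bounds` typing rules concrete, here is a minimal sketch of how each entry could be interpreted. This illustrates the rules stated above; the `classify_bound` helper is illustrative and is not part of the library's API:

```python
# Illustrative sketch: classifying each dimension of a `bounds` list.
# This mirrors the documented rules; it is not the library's own code.

def classify_bound(bound):
    """Return 'categorical', 'integer', or 'continuous' for one bounds entry."""
    if isinstance(bound, list):        # a list of choices -> categorical
        return "categorical"
    low, high = bound                  # otherwise a (min, max) tuple
    if isinstance(low, int) and isinstance(high, int):
        return "integer"
    return "continuous"                # any non-integer endpoint -> float

bounds = [(10, 1000), (0.01, 0.5), ["gini", "entropy"]]
print([classify_bound(b) for b in bounds])
# -> ['integer', 'continuous', 'categorical']
```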

🚀 Quickstart

Here's a basic example of how to use Hyperparameter Optimizer with a Scikit-learn model (e.g., RandomForestClassifier):

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from hyperparameter_optimizer import HyperparameterOptimizer

# Load data
X, y = load_iris(return_X_y=True)

# Define parameter space: keys name the hyperparameters to tune
# (the search ranges themselves are supplied via `bounds` in optimizePS)
param_space = {
    'n_estimators': [],
    'max_depth': [],
    'criterion': []
}

# Initialize the optimizer
optimizer = HyperparameterOptimizer(
    obj_func=RandomForestClassifier(),
    params=param_space,
    scoring='accuracy',
    opt_type='max',
    cv=3,
    verbose=1
)

# Run optimization
particles, Gbest_history, Gbest_pos, Gbest_score = optimizer.optimizePS(
    features=X,
    target=y,
    nParticles=10,
    bounds=[(10, 1000), (1, 10), ['gini', 'entropy']],
    maxIter=10
)

print("Best Params:", Gbest_pos)
print("Best Score:", Gbest_score)

🧠 How It Works

This library uses Particle Swarm Optimization (PSO) to explore the hyperparameter space by simulating a group of candidate solutions ("particles") moving through the search space. Each particle adjusts its position based on personal and global bests discovered during the search, converging toward an optimal solution.
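The dynamics described above can be sketched with a generic PSO loop on a toy 1-D function. This is a minimal illustration of the algorithm, not this library's implementation; the `w`, `c1`, and `c2` names mirror the `optimizePS` arguments:

```python
import random

# Minimal PSO sketch: minimize f(x) = (x - 3)^2 over a 1-D search space.
random.seed(0)
f = lambda x: (x - 3) ** 2

n_particles, max_iter = 10, 50
w, c1, c2 = 0.5, 1.0, 1.0            # inertia, cognitive, and social weights

pos = [random.uniform(-10, 10) for _ in range(n_particles)]
vel = [0.0] * n_particles
pbest = pos[:]                        # each particle's personal best position
gbest = min(pbest, key=f)             # best position found by the whole swarm

for _ in range(max_iter):
    for i in range(n_particles):
        r1, r2 = random.random(), random.random()
        # velocity = inertia + pull toward personal best + pull toward global best
        vel[i] = (w * vel[i]
                  + c1 * r1 * (pbest[i] - pos[i])
                  + c2 * r2 * (gbest - pos[i]))
        pos[i] += vel[i]
        if f(pos[i]) < f(pbest[i]):
            pbest[i] = pos[i]
            if f(pos[i]) < f(gbest):
                gbest = pos[i]

print(round(gbest, 2))  # converges near the optimum x = 3
```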


📜 License

This project is licensed under the MIT License.
© 2025 Dr. Ahmed Moussa


🤝 Contributing

Pull requests are welcome.
For major changes, please open an issue first to discuss what you would like to change.


📫 Contact

For feedback, bugs, or collaboration ideas:


⭐️ Show Your Support

If you find this project useful, consider giving it a ⭐️ on GitHub!
