
Hyperparameter Optimizer

A lightweight, flexible Python library for hyperparameter tuning using metaheuristic techniques.
It currently implements two algorithms: Particle Swarm Optimization (PSO) and Pattern Search (PS).

Designed for Scikit-learn-compatible models, this package offers an easy-to-use interface for optimizing model performance in just a few lines.



📦 Installation

Install from PyPI:

pip install hyperparameter-optimizer

⚙️ Features

  • ✅ Metaheuristic optimization using Particle Swarm Optimization (PSO) and Pattern Search (PS)
  • ✅ Supports continuous, integer, and categorical hyperparameters
  • ✅ Compatible with any Scikit-learn estimator
  • ✅ Custom scoring metrics
  • ✅ Cross-validation built-in
  • ✅ Verbose logging and full traceability

📚 API Reference

HyperparameterOptimizer

HyperparameterOptimizer(
                            obj_func,                       # the machine-learning model or pipeline to optimize
                            params,                         # dict whose keys name the hyperparameters to tune
                            scoring,                        # scoring metric (e.g., 'accuracy')
                            opt_type="max",                 # "max" for maximization (default) or "min" for minimization
                            cv=5,                           # number of cross-validation folds
                            verbose=1                       # 1 (default) to print per-iteration progress; 0 to run silently
)

optimizePSO

optimizePSO(
            features,                       # training features
            target,                         # training target
            nParticles,                     # number of particles in the swarm
            bounds,                         # list of (min, max) tuples or categorical lists of choices;
                                            # if either min or max is non-integer, the parameter is treated as continuous (float)
            w=0.5,                          # inertia weight
            c1=1,                           # cognitive weight
            c2=1,                           # social weight
            maxIter=20,                     # maximum number of iterations
            mutation_prob=0.1               # mutation probability (applies to discrete hyperparameters)
)

optimizePS

optimizePS(
            features,                       # training features
            target,                         # training target
            bounds,                         # list of (min, max) tuples or categorical lists of choices;
                                            # if either min or max is non-integer, the parameter is treated as continuous (float)
            mesh_size_coeff=0.6,            # initial mesh size coefficient
            acc_coeff=1,                    # acceleration/mesh expansion coefficient
            contr_coeff=0.5,                # mesh contraction coefficient
            search_method="gps",            # polling method for new points; either 'gps' or 'mads'
            min_mesh_ratio=0.001,           # minimum mesh size as a fraction of the parameter range
            maxIter=25                      # maximum number of iterations
)

🚀 Quickstart

Here's a basic example of how to use Hyperparameter Optimizer with a Scikit-learn model (e.g., RandomForestClassifier):

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from hyperparameter_optimizer import HyperparameterOptimizer

# Load data
X, y = load_iris(return_X_y=True)

# Define parameter space
param_space = {
    'n_estimators': [],   # keys only; the search bounds are passed to the optimize call below
    'max_depth': [],
    'criterion': []
}

# Initialize the optimizer
optimizer = HyperparameterOptimizer(
                                        obj_func=RandomForestClassifier(),
                                        params=param_space,
                                        scoring='accuracy',
                                        opt_type='max',
                                        cv=3,
                                        verbose=1
                                    )

# Run optimization
# [1] Particle Swarm Optimization
particles, Gbest_history, Gbest_pos, Gbest_score = optimizer.optimizePSO(
                                                                            features = X,
                                                                            target = y,
                                                                            nParticles = 10,
                                                                            bounds = [(10, 1000), (1, 10), ['gini', 'entropy']],
                                                                            maxIter = 10
                                                                    )
print("Best Params:", Gbest_pos)
print("Best Score:", Gbest_score)

# [2] Pattern Search Optimization
Gbest_history2, Gbest_pos2, Gbest_score2 = optimizer.optimizePS(
                                                                    features = X,
                                                                    target = y,
                                                                    bounds = [(10, 1000), (1, 10), ['gini', 'entropy']],
                                                                    mesh_size_coeff = 0.1,
                                                                    acc_coeff = 1,
                                                                    contr_coeff = 0.50,
                                                                    maxIter = 10
                                                            )
print("Best Params:", Gbest_pos2)
print("Best Score:", Gbest_score2)

🧠 How It Works

This library leverages two optimization techniques—Particle Swarm Optimization (PSO) and Pattern Search (PS)—to explore the hyperparameter space effectively.

Particle Swarm Optimization simulates a group of particles (candidate solutions) moving through the search space. Each particle adjusts its position based on its own best-known position and the best-known position among its peers, enabling a balance between exploration and exploitation as the swarm converges toward an optimal solution.
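The core update can be sketched in a few lines. This is a minimal, continuous-only illustration of the PSO idea described above (maximization over box bounds), not the library's implementation; the parameter names `w`, `c1`, and `c2` mirror the `optimizePSO` defaults, while `obj` stands in for the cross-validated score:

```python
import random

def pso_sketch(obj, bounds, n_particles=10, w=0.5, c1=1.0, c2=1.0, max_iter=20):
    """Minimal PSO sketch: maximize obj over continuous box bounds."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # each particle's best-known position
    pbest_val = [obj(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm's best-known position
    for _ in range(max_iter):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity blends inertia, pull toward personal best, pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)  # clip to bounds
            val = obj(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

On a smooth one-dimensional objective such as `lambda p: -(p[0] - 3) ** 2`, the swarm converges near the maximizer at 3.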

Pattern Search is a derivative-free method that explores the neighborhood of a current solution by systematically evaluating a set of search directions. If an improved solution is found, the algorithm moves in that direction and repeats the process, enabling efficient local refinement without relying on gradient information.
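The poll-and-contract loop above can likewise be sketched compactly. This is an illustrative GPS-style polling scheme (maximization), not the library's implementation; the defaults mirror `optimizePS` (`mesh_size_coeff=0.6`, `contr_coeff=0.5`, `min_mesh_ratio=0.001`), and `x0` is an assumed starting point:

```python
def pattern_search_sketch(obj, x0, bounds, mesh_size_coeff=0.6,
                          contr_coeff=0.5, min_mesh_ratio=0.001, max_iter=25):
    """Minimal GPS-style pattern search sketch: maximize obj from x0."""
    x = list(x0)
    fx = obj(x)
    step = [mesh_size_coeff * (hi - lo) for lo, hi in bounds]  # initial mesh per dimension
    for _ in range(max_iter):
        improved = False
        for d in range(len(x)):
            for sign in (+1, -1):                  # poll +/- along each coordinate axis
                cand = x[:]
                lo, hi = bounds[d]
                cand[d] = min(max(cand[d] + sign * step[d], lo), hi)
                fc = obj(cand)
                if fc > fx:                        # move to the first improving point
                    x, fx, improved = cand, fc, True
                    break
        if not improved:
            step = [s * contr_coeff for s in step]  # no improvement: contract the mesh
            if max(s / (hi - lo) for s, (lo, hi) in zip(step, bounds)) < min_mesh_ratio:
                break                               # mesh below minimum ratio: stop
    return x, fx
```

Because only function values are compared, the method needs no gradient, which is what makes it suitable for cross-validated scores.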

Together, these methods provide robust global and local search capabilities for hyperparameter tuning.


📜 License

This project is licensed under the MIT License.
© 2025 Dr. Ahmed Moussa


🤝 Contributing

Pull requests are welcome.
For major changes, please open an issue first to discuss what you would like to change.


📫 Contact

For feedback, bugs, or collaboration ideas:


⭐️ Show Your Support

If you find this project useful, consider giving it a ⭐️ on GitHub!

