An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.
Welcome to hyperactive
A unified interface for optimization algorithms and problems.
Hyperactive implements a collection of optimization algorithms, accessible through a unified experiment-based interface that separates optimization problems from algorithms. The library provides native implementations of algorithms from the Gradient-Free-Optimizers package alongside direct interfaces to Optuna and scikit-learn optimizers, supporting discrete, continuous, and mixed parameter spaces.
Overview • Installation • Tutorial • API reference • Citation
Installation
```
pip install hyperactive
```
Key Concepts
Experiment-Based Architecture
Hyperactive v5 introduces a clean separation between optimization algorithms and optimization problems through the experiment abstraction:
- Experiments define what to optimize (the objective function and evaluation logic)
- Optimizers define how to optimize (the search strategy and algorithm)
This design allows you to:
- Mix and match any optimizer with any experiment type (see the sketch after this list)
- Create reusable experiment definitions for common ML tasks
- Easily switch between different optimization strategies
- Build complex optimization workflows with consistent interfaces
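For instance, the same problem definition can be handed to two interchangeable search strategies without any other changes. A minimal sketch: `HillClimbing` is taken from the examples below, while `RandomSearch` is an assumed second optimizer class in `hyperactive.opt.gfo` — substitute any optimizer listed in the API reference:

```python
import numpy as np

# NOTE: RandomSearch is an assumed optimizer name; check the API reference
from hyperactive.opt.gfo import HillClimbing, RandomSearch

# one problem definition ("experiment") ...
def sphere(params):
    return -(params["x"] ** 2)

search_space = {"x": np.arange(-1, 1, 0.01)}

# ... solved by two interchangeable search strategies
for Optimizer in (HillClimbing, RandomSearch):
    best = Optimizer(search_space=search_space, n_iter=50, experiment=sphere).solve()
    print(Optimizer.__name__, best)
```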
Built-in experiments include:
- `SklearnCvExperiment` - Cross-validation for sklearn estimators
- `SktimeForecastingExperiment` - Time series forecasting optimization
- Custom function experiments (pass any callable as experiment)
Quickstart
Maximizing a custom function
```python
import numpy as np

# function to be maximized
def problem(params):
    x = params["x"]
    y = params["y"]
    return -(x**2 + y**2)

# discrete search space: dict of iterables, scikit-learn-like grid space
# (valid search space types depend on the optimizer)
search_space = {
    "x": np.arange(-1, 1, 0.01),
    "y": np.arange(-1, 2, 0.1),
}

from hyperactive.opt.gfo import HillClimbing

hillclimbing = HillClimbing(
    search_space=search_space,
    n_iter=100,
    experiment=problem,
)

# running the hill climbing search:
best_params = hillclimbing.solve()
```
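The returned `best_params` is a parameter dict of the same form the objective receives, so the optimum can be re-checked directly, e.g., `problem(best_params)`.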
Experiment abstraction - example: scikit-learn CV experiment
The "experiment" abstraction represents a parametrized optimization problem. Hyperactive provides a number of common experiments, e.g., scikit-learn cross-validation experiments:
```python
import numpy as np
from hyperactive.experiment.integrations import SklearnCvExperiment
from sklearn.datasets import load_iris
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score
from sklearn.model_selection import KFold

X, y = load_iris(return_X_y=True)

# create experiment
sklearn_exp = SklearnCvExperiment(
    estimator=SVC(),
    scoring=accuracy_score,
    cv=KFold(n_splits=3, shuffle=True),
    X=X,
    y=y,
)

# experiments can be evaluated via "score"
params = {"C": 1.0, "kernel": "linear"}
score, add_info = sklearn_exp.score(params)

# they can be used in optimizers like above
from hyperactive.opt.gfo import HillClimbing

search_space = {
    "C": np.logspace(-2, 2, num=10),
    "kernel": ["linear", "rbf"],
}

hillclimbing = HillClimbing(
    search_space=search_space,
    n_iter=100,
    experiment=sklearn_exp,
)

best_params = hillclimbing.solve()
```
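The best configuration found can then be re-evaluated through the experiment itself, e.g., `best_score, _ = sklearn_exp.score(best_params)`.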
Full ML toolbox integration - example: scikit-learn
Any hyperactive optimizer can be combined with the ML toolbox integrations!
`OptCV` tunes scikit-learn estimators with any hyperactive optimizer:
```python
# 1. defining the tuned estimator:
from sklearn.svm import SVC
from hyperactive.integrations.sklearn import OptCV
from hyperactive.opt.gfo import HillClimbing

search_space = {"kernel": ["linear", "rbf"], "C": [1, 10]}
optimizer = HillClimbing(search_space=search_space, n_iter=20)
tuned_svc = OptCV(SVC(), optimizer)

# 2. fitting the tuned estimator:
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

tuned_svc.fit(X_train, y_train)
y_pred = tuned_svc.predict(X_test)

# 3. obtaining best parameters and best estimator
best_params = tuned_svc.best_params_
best_estimator = tuned_svc.best_estimator_
```
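Since `OptCV` exposes the standard scikit-learn estimator API (`fit`, `predict`, `best_params_`), the tuned model should also compose with the usual sklearn evaluation utilities. A minimal sketch, reusing the objects from above and assuming `OptCV` supports cloning so it can be nested inside cross-validation:

```python
from sklearn.metrics import accuracy_score
from sklearn.model_selection import cross_val_score

# hold-out accuracy of the tuned model
print(accuracy_score(y_test, y_pred))

# assumption: OptCV is clonable, so tuning can be nested in cross-validation
print(cross_val_score(OptCV(SVC(), optimizer), X, y, cv=3))
```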
Citing Hyperactive
```bibtex
@Misc{hyperactive2021,
  author = {{Simon Blanke}},
  title = {{Hyperactive}: An optimization and data collection toolbox for convenient and fast prototyping of computationally expensive models.},
  howpublished = {\url{https://github.com/SimonBlanke}},
  year = {since 2019}
}
```