Conformal hyperparameter optimization tool
Built for machine learning practitioners requiring flexible and robust hyperparameter tuning, ConfOpt delivers superior optimization performance through conformal uncertainty quantification and a wide selection of surrogate models.
📦 Installation
Install ConfOpt from PyPI using pip:
```shell
pip install confopt
```
For the latest development version:
```shell
git clone https://github.com/rick12000/confopt.git
cd confopt
pip install -e .
```
🎯 Getting Started
The example below shows how to optimize hyperparameters for a RandomForest classifier. You can find more examples in the documentation.
Step 1: Import Required Libraries
```python
from confopt.tuning import ConformalTuner
from confopt.wrapping import IntRange, FloatRange, CategoricalRange
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
```
We import the libraries needed for tuning and model evaluation. The load_wine function loads the wine dataset, which serves as the example data for optimizing the RandomForest classifier's hyperparameters. Note that this dataset is trivial (100% accuracy is easily reached) and is used for illustration purposes only.
Step 2: Define the Objective Function
```python
def objective_function(configuration):
    X, y = load_wine(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42, stratify=y
    )
    model = RandomForestClassifier(
        n_estimators=configuration['n_estimators'],
        max_features=configuration['max_features'],
        criterion=configuration['criterion'],
        random_state=42
    )
    model.fit(X_train, y_train)
    predictions = model.predict(X_test)
    return accuracy_score(y_test, predictions)
```
This function defines the objective we want to optimize. It loads the wine dataset, splits it into training and testing sets, and trains a RandomForest model using the provided configuration. The function returns test accuracy, which will be the objective value ConfOpt will optimize for.
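The only contract the tuner relies on is that the objective takes a configuration dictionary and returns a single float score. As a minimal stdlib-only sketch of that contract (a toy stand-in objective, not the RandomForest one above):

```python
def toy_objective(configuration):
    """Toy stand-in for an objective function: takes a dict of
    hyperparameter values, returns a single float score to maximize."""
    # Score peaks at x = 2.0; any dict-in/float-out function works.
    x = configuration["x"]
    return 1.0 - (x - 2.0) ** 2

score = toy_objective({"x": 2.0})
print(score)  # 1.0
```

Any expensive computation (training, cross-validation, simulation) can sit inside the function, as long as it resolves to one scalar value per configuration.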
Step 3: Define the Search Space
```python
search_space = {
    'n_estimators': IntRange(min_value=50, max_value=200),
    'max_features': FloatRange(min_value=0.1, max_value=1.0),
    'criterion': CategoricalRange(choices=['gini', 'entropy', 'log_loss'])
}
```
Here, we specify the search space for hyperparameters. In this Random Forest example, this includes defining the range for the number of estimators, the proportion of features to consider when looking for the best split, and the criterion for measuring the quality of a split.
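To build intuition for what these range types represent, here is a hypothetical stdlib-only stand-in for how a random-search phase might draw one configuration from the bounds above (this is an illustration, not ConfOpt's internal sampler):

```python
import random

random.seed(0)

def sample_configuration():
    # Hypothetical sampler: draw one value per hyperparameter from the
    # same bounds declared in the IntRange/FloatRange/CategoricalRange above.
    return {
        "n_estimators": random.randint(50, 200),    # integer in [50, 200]
        "max_features": random.uniform(0.1, 1.0),   # float in [0.1, 1.0]
        "criterion": random.choice(["gini", "entropy", "log_loss"]),
    }

config = sample_configuration()
```

Each sampled dictionary has exactly the keys the objective function reads, so configurations can be passed straight through to it.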
Step 4: Create and Run the Tuner
```python
tuner = ConformalTuner(
    objective_function=objective_function,
    search_space=search_space,
    minimize=False
)

tuner.tune(max_searches=50, n_random_searches=10)
```
We initialize the ConformalTuner with the objective function and search space, setting minimize=False because higher accuracy is better. The tune method then runs the hyperparameter search: the first n_random_searches evaluations are sampled randomly, and the remaining searches are guided by the surrogate model to find the hyperparameters that maximize test accuracy.
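Conceptually, the two-phase search can be sketched as the loop below. This is a hypothetical stdlib-only illustration on a toy one-dimensional objective; ConfOpt's model-guided phase uses a conformal surrogate, which we crudely replace here with perturbation of the best configuration found so far:

```python
import random

random.seed(42)

def toy_objective(config):
    # Toy stand-in objective: maximized at x = 0.5.
    return -(config["x"] - 0.5) ** 2

def tune(max_searches=50, n_random_searches=10):
    best_config, best_value = None, float("-inf")
    for i in range(max_searches):
        if i < n_random_searches:
            # Warm-up phase: pure random sampling.
            config = {"x": random.uniform(0.0, 1.0)}
        else:
            # Guided phase: ConfOpt would query a conformal surrogate here;
            # perturbing the incumbent is a crude stand-in for illustration.
            x = best_config["x"] + random.gauss(0.0, 0.05)
            config = {"x": min(1.0, max(0.0, x))}
        value = toy_objective(config)
        if value > best_value:  # minimize=False: keep the maximum
            best_config, best_value = config, value
    return best_config, best_value

best_config, best_value = tune()
```

The key structural points carried over from the real API are the random warm-up budget, the scalar objective, and the maximize/minimize switch.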
Step 5: Retrieve and Display Results
```python
best_params = tuner.get_best_params()
best_score = tuner.get_best_value()

print(f"Best accuracy: {best_score:.4f}")
print(f"Best parameters: {best_params}")
```
Finally, we retrieve the best parameters and the best test accuracy found during optimization and print them to the console.
For detailed examples and explanations see the documentation.
📚 Documentation
User Guide
- Classification Example: RandomForest hyperparameter tuning on a classification task.
- Regression Example: RandomForest hyperparameter tuning on a regression task.
Developer Resources
- Architecture Overview: System design and module interactions.
- API Reference: Complete reference for main classes, methods, and parameters.
📈 Benchmarks
ConfOpt not only outperforms plain random search, it also beats established tools such as Optuna and traditional Gaussian processes.
These benchmarks cover neural architecture search on complex image recognition datasets (JAHS-201) and neural network tuning on tabular classification datasets (LCBench-L).
For a fuller analysis of caveats and benchmarking results, refer to the latest methodological paper.
🔬 Theory
ConfOpt implements surrogate models and acquisition functions from the following papers:
- Adaptive Conformal Hyperparameter Optimization (arXiv, 2022)
- Optimizing Hyperparameters with Conformal Quantile Regression (PMLR, 2023)
- Enhancing Performance and Calibration in Quantile Hyperparameter Optimization (arXiv, 2025)
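The common building block in these papers is conformal prediction: calibrating a surrogate model's errors on held-out evaluations to obtain distribution-free uncertainty intervals. A minimal split-conformal sketch (stdlib only, with a toy calibration set; this illustrates the general technique, not ConfOpt's implementation):

```python
import math

def split_conformal_interval(calib_truths, calib_preds, new_pred, alpha=0.1):
    """Split conformal prediction: turn a point prediction into an
    interval calibrated on held-out (truth, prediction) pairs."""
    # Absolute residuals of the surrogate on the calibration set.
    residuals = sorted(abs(y - p) for y, p in zip(calib_truths, calib_preds))
    n = len(residuals)
    # Conformal quantile rank: ceil((n + 1) * (1 - alpha)), clipped to n.
    k = min(n, math.ceil((n + 1) * (1 - alpha)))
    q = residuals[k - 1]
    # Under exchangeability of calibration and test points, this interval
    # covers the true value with probability >= 1 - alpha.
    return new_pred - q, new_pred + q

lo, hi = split_conformal_interval(
    calib_truths=[0.9, 1.1, 1.0, 0.8, 1.2, 1.05, 0.95, 1.15, 0.85, 1.0],
    calib_preds=[1.0] * 10,
    new_pred=1.0,
    alpha=0.2,
)
```

An acquisition function can then favor configurations whose upper interval bound is high, trading off the predicted score against calibrated uncertainty.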
🤝 Contributing
If you'd like to contribute a feature, please email r.doyle.edu@gmail.com with a quick summary of what you'd like to add so we can discuss it before you open a PR!
If you'd like to contribute a fix for a new bug, first raise an issue on GitHub, then email r.doyle.edu@gmail.com referencing the issue. Issues are monitored regularly, so only send an email if you intend to contribute the fix yourself.
📄 License