
Evolutune

A Genetic Algorithm-based hyperparameter tuner for machine learning models.

Introduction

Evolutune implements a hyperparameter tuner based on the principles of genetic algorithms. The tuner evolves a population of hyperparameter sets over several generations, searching for the set that optimizes a given scoring metric, and is designed to work with a wide range of machine learning models.
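
Conceptually, the search follows the classic generational loop sketched below. This is only a schematic illustration of the idea, not Evolutune's actual internals; evolve and fitness are hypothetical names.

import random

def evolve(param_grid, fitness, population_size=10, generations=100, mutation_rate=0.1):
    # Start from random draws out of the search space.
    population = [
        {name: random.choice(values) for name, values in param_grid.items()}
        for _ in range(population_size)
    ]
    for _ in range(generations):
        # Rank candidates by fitness (maximization assumed here)
        # and keep the better half as parents.
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: max(2, population_size // 2)]
        children = []
        while len(children) < population_size:
            p1, p2 = random.sample(parents, 2)
            # Uniform crossover: each hyperparameter comes from either parent.
            child = {name: random.choice((p1[name], p2[name])) for name in param_grid}
            # Mutation: occasionally resample a hyperparameter from the grid.
            for name in param_grid:
                if random.random() < mutation_rate:
                    child[name] = random.choice(param_grid[name])
            children.append(child)
        population = children
    return max(population, key=fitness)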

Dependencies

Make sure you have the following dependencies installed:

  • numpy
  • joblib (Parallel and delayed)
  • scikit-learn (sklearn.metrics.get_scorer)

Installation

pip install evolutune

Usage

from evolutune import GeneticTuner

# Define your machine learning model
# model = ...

# Define the hyperparameter search space
param_grid = {
    'param1': [value1, value2, ...],
    'param2': [value3, value4, ...],
    # Add more hyperparameters as needed
}

# Define the scoring metric to optimize
scoring_metric = 'accuracy'  # Replace with your preferred metric

# Instantiate the GeneticTuner
genetic_tuner = GeneticTuner(
    model=model,
    param_grid=param_grid,
    scoring=scoring_metric,
    population_size=10,
    generations=100,
    mutation_rate=0.1,
    random_state=None,
    n_jobs=None
)

Fitting the Tuner

# Define your training and evaluation sets
train_set = [X_train, y_train]
eval_set = [X_eval, y_eval]  # Set to None to use the training set for evaluation

# Specify the optimization direction ('maximize' or 'minimize')
direction = 'maximize'

# Fit the tuner on the training set
genetic_tuner.fit(train_set, eval_set, direction)

Accessing Results

# Access the best score and corresponding hyperparameters
best_score = genetic_tuner.best_score_
best_params = genetic_tuner.best_params_

print(f"Best Score: {best_score}")
print("Best Hyperparameters:")
for param, value in best_params.items():
    print(f"{param}: {value}")

Methods

  • initialize_population(population_size: int) -> list
    Initialize a population of individuals with random hyperparameters.
  • crossover(parent1: dict, parent2: dict) -> tuple
    Perform crossover between two parents to generate two children.
  • mutate(individual: dict, mutation_rate: float) -> dict
    Introduce random mutations to an individual's hyperparameters.
  • calculate_fitness(train_set: list, eval_set: list, parameters: dict) -> float
    Evaluate the fitness (scoring metric) of a set of hyperparameters.
  • fit(train_set: list, eval_set: list = None, direction: str = "maximize")
    Fit the GeneticTuner on the training set and an optional evaluation set.
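
These methods are used internally by fit, but the documented signatures also allow a single generation to be stepped through by hand. A sketch, reusing the genetic_tuner, train_set, and eval_set objects from the sections above:

# Step through one generation manually, using the signatures above.
population = genetic_tuner.initialize_population(population_size=10)
parent1, parent2 = population[0], population[1]

# Recombine two parents into two children, then mutate one of them.
child1, child2 = genetic_tuner.crossover(parent1, parent2)
child1 = genetic_tuner.mutate(child1, mutation_rate=0.1)

# Score the mutated child's hyperparameters on the evaluation set.
score = genetic_tuner.calculate_fitness(train_set, eval_set, child1)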

Example

An example script demonstrating the usage of the GeneticTuner class is provided in the example.py file.
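
For reference, a minimal end-to-end run might look as follows. The dataset and estimator are illustrative choices; the GeneticTuner calls follow the documentation above.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

from evolutune import GeneticTuner

# Illustrative data and model; any scikit-learn-style estimator should work.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_eval, y_train, y_eval = train_test_split(
    X, y, test_size=0.2, random_state=42
)

param_grid = {
    "n_estimators": [50, 100, 200],
    "max_depth": [None, 5, 10],
    "min_samples_split": [2, 5, 10],
}

tuner = GeneticTuner(
    model=RandomForestClassifier(random_state=42),
    param_grid=param_grid,
    scoring="accuracy",
    population_size=10,
    generations=20,
    mutation_rate=0.1,
    random_state=42,
    n_jobs=-1,
)

# Maximize accuracy on the held-out evaluation set.
tuner.fit([X_train, y_train], [X_eval, y_eval], direction="maximize")

print(f"Best Score: {tuner.best_score_}")
print(f"Best Hyperparameters: {tuner.best_params_}")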

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

evolutune-0.0.3.tar.gz (8.7 kB)

Built Distribution

evolutune-0.0.3-py3-none-any.whl (9.3 kB)

File details

Details for the file evolutune-0.0.3.tar.gz.

File metadata

  • Download URL: evolutune-0.0.3.tar.gz
  • Size: 8.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.9.18

File hashes

Hashes for evolutune-0.0.3.tar.gz
  • SHA256: a7c782e20709b1d3419a536e5939ba6bff43c27438abd06ed273312cf22ce780
  • MD5: 1f2c1628faf7f5ee3723b3b76095d71f
  • BLAKE2b-256: 06317b44ef24b2e12a34d120048850068145b672c04812f985311ea8e5591392


File details

Details for the file evolutune-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: evolutune-0.0.3-py3-none-any.whl
  • Size: 9.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.9.18

File hashes

Hashes for evolutune-0.0.3-py3-none-any.whl
  • SHA256: 0e625cc7a46c9cd5761297ba5a2fef6bdb59a47fd3ebf1c1af114dd208e37d9d
  • MD5: 0b12ab13bfc64f0f3f314015f0450878
  • BLAKE2b-256: 8510ecb229951c647789580d777e413d0a42d6e68872d92d65c22c5adf4bfb0e

