
Project description

Based on the paper: J. S. Saravanan and A. Mahadevan, "AI based parameter estimation of ML model using Hybrid of Genetic Algorithm and Simulated Annealing," 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT), Delhi, India, 2023, pp. 1-5, doi: 10.1109/ICCCNT56998.2023.10308077.

Hyperparameter Optimization with Genetic Algorithm and Simulated Annealing

This repository contains a Python package, jss_optimizer, for optimizing the hyperparameters of machine-learning models with a hybrid of a genetic algorithm (GA) and simulated annealing (SA).
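
Following the paper's outline, the GA explores the hyperparameter space globally, and SA then refines the GA's best candidate, occasionally accepting worse neighbours so the search can escape local optima. The sketch below illustrates that loop in miniature; the bounds, toy fitness function, and operator choices are illustrative assumptions, not the package's internals.

# Minimal sketch of the GA -> SA hybrid idea. Everything here (bounds, the
# toy fitness, operator choices) is an assumption for illustration only.
import math
import random

BOUNDS = {'max_depth': (2, 20), 'min_samples_leaf': (1, 10), 'n_estimators': (10, 200)}
TARGET = [8, 2, 120]  # arbitrary "good" configuration for the toy fitness

def fitness(solution):
    # Toy stand-in for a validation score; in practice you would train the
    # model with these hyperparameters and score it on held-out data.
    return -sum((a - b) ** 2 for a, b in zip(solution, TARGET))

def random_solution():
    return [random.randint(lo, hi) for lo, hi in BOUNDS.values()]

def crossover(a, b):
    # Uniform crossover: each gene is taken from either parent.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(solution):
    # Perturb one randomly chosen gene, clamped to its bounds.
    i = random.randrange(len(solution))
    lo, hi = list(BOUNDS.values())[i]
    child = solution[:]
    child[i] = min(hi, max(lo, child[i] + random.randint(-3, 3)))
    return child

def genetic_algorithm(pop_size=20, generations=30):
    # Global exploration: evolve a population toward higher fitness.
    population = [random_solution() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop_size // 2]  # truncation selection
        offspring = [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(pop_size - len(parents))]
        population = parents + offspring
    return max(population, key=fitness)

def simulated_annealing(solution, temp=1.0, cooling=0.95, steps=200):
    # Local refinement: accept worse neighbours with a probability that
    # shrinks as the temperature cools (Metropolis criterion).
    best = current = solution
    for _ in range(steps):
        candidate = mutate(current)
        delta = fitness(candidate) - fitness(current)
        if delta > 0 or random.random() < math.exp(delta / temp):
            current = candidate
            if fitness(current) > fitness(best):
                best = current
        temp *= cooling
    return best

print(simulated_annealing(genetic_algorithm()))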

Installation

You can install the package using pip:

pip install jss_optimizer

Usage Example

from jss_optimizer.jss_optimizer import HyperparameterOptimizer
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier

# Load dataset
df = pd.read_csv('dataset/heart_v2.csv')
X = df.drop('heart disease', axis=1)
y = df['heart disease']
X_train, X_test, y_train, y_test = train_test_split(X, y, train_size=0.7, random_state=42)

# Define model and parameters
model = RandomForestClassifier
params = ['max_depth', 'min_samples_leaf', 'n_estimators']

# Create an instance of HyperparameterOptimizer
optimizer = HyperparameterOptimizer(model, params)

# Optimize hyperparameters using genetic algorithm
best_solution_genetic = optimizer.optimize(X_train, y_train, X_test, y_test)
print('Best solution found by genetic algorithm:', best_solution_genetic)

# NOTE
# The genetic algorithm alone will often reach an optimal solution, but it
# can also get caught in a local optimum. To avoid such scenarios, further
# optimization with simulated annealing is recommended.
# Use whichever of the two solutions you judge to be optimal.

# Perform simulated annealing 
best_solution_simulated_annealing = optimizer.simulate_annealing(best_solution_genetic, X_train, y_train, X_test, y_test)
print('Best solution found by GA-SA hybrid optimization algorithm:', best_solution_simulated_annealing)
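
Once you have settled on a solution, you would typically refit the model with it and evaluate on the test split. A small sketch, assuming the optimizer returns the tuned values in the same order as params (the actual return type may differ):

# Assumes the returned solution is an ordered sequence of values matching
# `params`; adapt this if the package returns a dict or another structure.
tuned = dict(zip(params, best_solution_simulated_annealing))
final_model = RandomForestClassifier(**tuned, random_state=42)
final_model.fit(X_train, y_train)
print('Test accuracy with tuned hyperparameters:', final_model.score(X_test, y_test))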

Work is in progress to extend the package to work with any dataset and model in any given scenario.

Version Logs

version: 0.1.1 - Works only with RandomForestClassifier, on any dataset.

version: 0.1.2 - Same as version 0.1.1, with added improvements and support for train-test splitting with proper score metrics.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

jss_optimizer-0.1.2.tar.gz (6.3 kB)

Uploaded Source

Built Distribution

jss_optimizer-0.1.2-py3-none-any.whl (4.1 kB)

Uploaded Python 3

File details

Details for the file jss_optimizer-0.1.2.tar.gz.

File metadata

  • Download URL: jss_optimizer-0.1.2.tar.gz
  • Upload date:
  • Size: 6.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.8

File hashes

Hashes for jss_optimizer-0.1.2.tar.gz
  • SHA256: 8d38ebb23d569c6fb2b51d8ebb3e6337b5a655ce8ff72f1f48dea17d586955c4
  • MD5: fa9777a8a40f89da8d035cc9ed530978
  • BLAKE2b-256: 2fbdcb5840e5c26311050ec295e021a109e0999649c37641718009b4501f0467

See more details on using hashes here.
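
To verify a download against the published digest, hash the file locally and compare. A minimal sketch using the standard-library hashlib; the local file path is an assumption about where the archive was saved:

# Verify the downloaded sdist against the published SHA256 digest.
import hashlib

EXPECTED = '8d38ebb23d569c6fb2b51d8ebb3e6337b5a655ce8ff72f1f48dea17d586955c4'

with open('jss_optimizer-0.1.2.tar.gz', 'rb') as f:  # path is an assumption
    digest = hashlib.sha256(f.read()).hexdigest()

print('OK' if digest == EXPECTED else 'MISMATCH: ' + digest)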

File details

Details for the file jss_optimizer-0.1.2-py3-none-any.whl.

File metadata

File hashes

Hashes for jss_optimizer-0.1.2-py3-none-any.whl
  • SHA256: 650bf3a473880b11ef90df3e12f324c440f96490c252557a0ba4433a98932d69
  • MD5: 5c95064aa2a6d5c1c04a71e696acb914
  • BLAKE2b-256: f6c391654d25a5a63928273a3c356671c8237835be5fac3ad2cadb2f87350a42

