Metis

Enterprise AutoML with Quantum-Enhanced Optimization

Metis automates machine learning model selection, hyperparameter tuning, and feature selection using classical optimization and optional quantum-enhanced sampling.

Features

  • Simple API: One-line model training with metis.fit()
  • Flexible Input: Accepts file paths (CSV, JSON, Parquet) or pandas DataFrames
  • Quantum-Enhanced: Optional QAOA-based quantum sampling for exploration
  • Multiple Models: Supports Random Forest, XGBoost, SVM, and Logistic Regression (Ridge for regression)
  • Custom Models: Register your own models with metis.add()
  • Automatic Feature Selection: Selects optimal feature subsets
  • Production-Ready: Comprehensive error handling and validation

Installation

pip install metis-automl

Quick Start

Basic Usage

import metis
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

# Register a custom Gradient Boosting model
def create_gbm(hyperparameters, is_classification):
    if is_classification:
        return GradientBoostingClassifier(**hyperparameters, random_state=42)
    else:
        return GradientBoostingRegressor(**hyperparameters, random_state=42)

metis.add(
    'gradient_boosting',
    create_gbm,
    {'n_estimators': [50, 100], 'learning_rate': [0.1, 0.3]}
)

print("Registered models:", metis.list_models())

# Run the AutoML search (100 trials, quantum-enhanced sampling enabled)
model = metis.fit(
    dataset="iris.csv",
    config={
        "metric": "accuracy",
        "objective": "maximize",
        "search_budget": 100,
        "use_quantum": True,
    }
)
print("Best model:", model.metadata['model_name'])
print("Hyperparameters:", model.hyperparameters)
print("Score:", model.metrics['validation_score'])

Configuration Options

model = metis.fit(
    dataset="data.csv",
    config={
        "metric": "accuracy",        # 'accuracy', 'f1', 'precision', 'recall', 'roc_auc', 'r2', 'mse', 'mae'
        "objective": "maximize",      # 'maximize' or 'minimize'
        "search_budget": 50,          # Number of optimization trials
        "max_features": 20,           # Maximum features to select (optional)
        "target_column": "target",    # Target column name (auto-detects 'target', 'label', 'y', or 'class' if not provided)
        "use_quantum": True,          # Enable quantum sampling (default: True)
    }
)

Or use keyword arguments:

model = metis.fit("data.csv", metric="f1", search_budget=100, use_quantum=True)
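Since metis.fit() also accepts pandas DataFrames, a regression run might look like the sketch below. The table, column names, and values are made up for illustration; the config keys are the ones documented above. Note that mse is an error metric, so the objective flips to 'minimize':

```python
import pandas as pd

# Tiny illustrative regression table; columns and values are made up.
df = pd.DataFrame({
    "sqft":  [800, 950, 1100, 1300, 1500],
    "rooms": [2, 2, 3, 3, 4],
    "price": [150, 180, 210, 260, 300],
})

config = {
    "metric": "mse",
    "objective": "minimize",     # mse is an error metric: lower is better
    "search_budget": 25,
    "target_column": "price",    # not one of the auto-detected names
    "use_quantum": False,        # classical sampling only
}

# With metis installed:
# model = metis.fit(dataset=df, config=config)
```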

Accessing Results

# Model metadata
print(model.hyperparameters)      # Best hyperparameters
print(model.selected_features)    # Selected feature names
print(model.metrics)              # Train/validation/test scores
print(model.metadata)             # Additional metadata

Supported Metrics

Classification

  • accuracy: Classification accuracy
  • f1: F1 score (weighted)
  • precision: Precision score (weighted)
  • recall: Recall score (weighted)
  • roc_auc: ROC AUC score

Regression

  • r2: R² score
  • mse: Mean squared error (minimized)
  • mae: Mean absolute error (minimized)
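These names follow the standard definitions (the weighted classification metrics presumably map to their scikit-learn counterparts). As a quick sanity check of the regression formulas in plain Python:

```python
y_true = [3.0, 5.0, 7.0]
y_pred = [2.0, 5.0, 9.0]
n = len(y_true)

# Mean squared / absolute error: lower is better, hence objective 'minimize'.
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n
mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n

# R^2 = 1 - (residual sum of squares / total sum of squares): higher is better.
mean_t = sum(y_true) / n
ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
ss_tot = sum((t - mean_t) ** 2 for t in y_true)
r2 = 1 - ss_res / ss_tot

print(mse, mae, r2)  # 1.666... 1.0 0.375
```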

Supported Models

Built-in Models

  • Random Forest: Ensemble of decision trees
  • XGBoost: Gradient boosting framework
  • SVM: Support Vector Machine
  • Logistic Regression: Linear classification (Ridge regression for regression tasks)

Custom Models

You can register your own custom models using metis.add():

import metis
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

def create_gbm(hyperparameters, is_classification):
    """Create a Gradient Boosting model."""
    if is_classification:
        return GradientBoostingClassifier(**hyperparameters, random_state=42)
    else:
        return GradientBoostingRegressor(**hyperparameters, random_state=42)

# Register the custom model
metis.add(
    model_name='gradient_boosting',
    model_creator=create_gbm,
    hyperparameter_space={
        'n_estimators': [50, 100, 200],
        'learning_rate': [0.01, 0.1, 0.3],
        'max_depth': [3, 5, 7],
        'subsample': [0.8, 0.9, 1.0]
    },
    description='Gradient Boosting Machine'
)

# Now use it in AutoML
model = metis.fit("data.csv", search_budget=50)

Requirements for custom models:

  • Must be sklearn-compatible (implement fit(), predict(), and optionally predict_proba())
  • The model_creator function must accept (hyperparameters: Dict, is_classification: bool) and return a model instance
  • Hyperparameter space must be a dictionary mapping parameter names to lists of possible values
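To make the first two requirements concrete, here is a minimal sklearn-compatible regressor written from scratch; the class and its behavior are purely illustrative, not part of Metis. It implements fit() and predict(), and its creator matches the (hyperparameters, is_classification) signature:

```python
class ConstantRegressor:
    """Toy model: predicts a shifted mean of the training targets."""

    def __init__(self, shift=0.0):
        self.shift = shift

    def fit(self, X, y):
        self.mean_ = sum(y) / len(y)
        return self  # sklearn convention: fit() returns self

    def predict(self, X):
        return [self.mean_ + self.shift for _ in X]


def create_constant(hyperparameters, is_classification):
    # This toy model only handles regression.
    if is_classification:
        raise ValueError("ConstantRegressor supports regression only")
    return ConstantRegressor(**hyperparameters)


# Hyperparameter space: each parameter name maps to a list of candidates.
space = {"shift": [-1.0, 0.0, 1.0]}

model = create_constant({"shift": 1.0}, is_classification=False)
model.fit([[0], [1], [2]], [1.0, 2.0, 3.0])
print(model.predict([[5]]))  # [3.0]
```

Registering it would then follow the same pattern as above, e.g. metis.add('constant', create_constant, space).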

Managing custom models:

# List all registered models (includes built-in models)
all_models = metis.list_models()
print(all_models)  # ['random_forest', 'xgboost', 'svm', 'logistic_regression', 'gradient_boosting']

# List only custom models
custom_models = metis.list_models(include_builtin=False)
print(custom_models)  # ['gradient_boosting']

# Remove a custom model
metis.remove('gradient_boosting')

Error Handling

Metis provides custom exceptions for better error handling:

from metis import MetisError, MetisDataError, MetisConfigError, MetisTrainingError, MetisQuantumError

try:
    model = metis.fit("data.csv")
except MetisDataError as e:
    print(f"Data issue: {e}")
except MetisConfigError as e:
    print(f"Configuration issue: {e}")
except MetisTrainingError as e:
    print(f"Training issue: {e}")
except MetisQuantumError as e:
    print(f"Quantum sampling issue: {e}")

Requirements

  • Python >= 3.11
  • scikit-learn >= 1.3.0
  • pandas >= 2.0.0
  • numpy >= 1.24.0
  • optuna >= 3.5.0
  • joblib >= 1.3.0
  • xgboost >= 2.0.0
  • pennylane >= 0.35.0 (for quantum features)
  • scipy >= 1.11.0

License

MIT License

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
