
A library implementation of the Iljicevs model.

Project description

Iljicevs ML

Iljicevs ML is a Python library designed to simplify the process of selecting, training, and optimizing multiple machine learning models in an ensemble approach. It provides functionalities for dynamic model selection, hyperparameter tuning, feature importance, cross-validation with multiple metrics, and more.

Features

  • Model Hyperparameter Tuning: Automatically searches for the best hyperparameters using GridSearchCV.
  • Dynamic Model Selection: Selects the best performing models based on cross-validation results.
  • Weighted Average Predictions: Combines predictions from multiple models by averaging their outputs, weighted by each model's accuracy (see the sketch after this list).
  • Feature Importance Visualization: Displays the importance of features across ensemble models.
  • Class Balance Checking: Automatically checks class balance and suggests solutions for unbalanced datasets.
  • Cross-validation with Metrics: Supports evaluation with multiple metrics (Accuracy, F1, ROC AUC, etc.).
  • AutoML Support: Automatically selects models based on dataset characteristics.
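
To make the weighted-averaging idea concrete, here is a minimal NumPy sketch of accuracy-weighted probability averaging. It only illustrates the concept; the helper name and weighting scheme are assumptions, not iljicevs_ml's internal code.

import numpy as np

# Hypothetical helper (illustration only, not iljicevs_ml's own code):
# combine class-probability predictions from several fitted classifiers,
# weighting each model by its validation accuracy.
def weighted_average_predict(fitted_models, accuracies, X):
    weights = np.asarray(accuracies, dtype=float)
    weights /= weights.sum()                      # normalize so the weights sum to 1
    # stack probabilities into shape (n_models, n_samples, n_classes)
    probas = np.stack([m.predict_proba(X) for m in fitted_models])
    avg = np.tensordot(weights, probas, axes=1)   # weighted mean over the model axis
    return avg.argmax(axis=1)                     # most probable class per sample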

Installation

To install the package from PyPI, use pip:

pip install iljicevs_ml

Or, if you want to install directly from the source:

git clone https://github.com/yourusername/iljicevs_ml.git
cd iljicevs_ml
python setup.py install

Usage

Basic Example

Here's an example of how you can use iljicevs_ml to select models, tune their hyperparameters, and evaluate their performance.

from iljicevs_ml import IljicevsModel
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Load data
data = load_iris()
X = data.data
y = data.target

# Train-test split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Define models and hyperparameters
models = {
    'logistic_regression': LogisticRegression(max_iter=200),
    'random_forest': RandomForestClassifier(),
    'svm': SVC(probability=True),
    'gradient_boosting': GradientBoostingClassifier()
}

param_grids = {
    'logistic_regression': {'C': [0.1, 1, 10]},
    'random_forest': {'n_estimators': [50, 100], 'max_depth': [5, 10]},
    'svm': {'C': [0.1, 1, 10], 'kernel': ['linear', 'rbf']},
    'gradient_boosting': {'n_estimators': [50, 100], 'learning_rate': [0.01, 0.1]}
}

# Initialize and train ensemble model
iljicevs = IljicevsModel(models, param_grids)

# Check class balance
iljicevs.check_class_balance(y_train)

# Tune hyperparameters
iljicevs.tune_hyperparameters(X_train, y_train)

# Select the best performing models
iljicevs.select_best_models(X_train, y_train)

# Fit the selected models
iljicevs.fit(X_train, y_train)

# Evaluate the ensemble's accuracy on the test set
accuracy = iljicevs.score(X_test, y_test)
print(f"Ensemble accuracy: {accuracy}")

# Display feature importance
iljicevs.feature_importance()

# Perform cross-validation with multiple metrics
iljicevs.cross_validate_with_metrics(X_train, y_train, metrics=['accuracy', 'f1', 'roc_auc'])

Class Balance Checking

To avoid issues with unbalanced datasets, you can use the built-in method check_class_balance() to get a summary of the class distribution and suggestions for handling imbalances:

iljicevs.check_class_balance(y_train)
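
Conceptually, this boils down to counting the samples per class and flagging large disparities. The standalone sketch below shows the idea; it is not the library's implementation, and the 0.5 ratio threshold is an arbitrary assumption.

import numpy as np

# Rough sketch of a class-balance check (illustration only).
def summarize_class_balance(y, min_ratio=0.5):   # min_ratio is an arbitrary threshold
    classes, counts = np.unique(y, return_counts=True)
    for cls, count in zip(classes, counts):
        print(f"class {cls}: {count} samples ({count / len(y):.1%})")
    if counts.min() / counts.max() < min_ratio:
        print("Classes look imbalanced: consider resampling (e.g. SMOTE) or class weights.")

summarize_class_balance(y_train)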

Feature Importance

To visualize the importance of features across models, use:

iljicevs.feature_importance()
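
If you prefer to inspect importances by hand, a rough, library-independent equivalent is shown below. It assumes the estimators from the basic example are reachable through the models dict, have already been fitted, and expose scikit-learn's feature_importances_ (true for the tree-based ones); it is not the library's own plotting code.

import numpy as np
import matplotlib.pyplot as plt

# Average feature_importances_ across the fitted tree-based estimators
# from the basic example above (illustration only).
tree_models = [m for m in models.values() if hasattr(m, "feature_importances_")]
mean_importance = np.mean([m.feature_importances_ for m in tree_models], axis=0)

plt.bar(data.feature_names, mean_importance)
plt.ylabel("mean importance across models")
plt.xticks(rotation=45, ha="right")
plt.tight_layout()
plt.show()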

Cross-Validation with Metrics

For more detailed model evaluation, you can use cross-validation with custom metrics:

iljicevs.cross_validate_with_metrics(X_train, y_train, metrics=['accuracy', 'f1', 'roc_auc'])
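
This kind of evaluation maps onto scikit-learn's cross_validate. A library-independent sketch for a single model is shown below; note that on a multiclass dataset such as Iris, scikit-learn's scorers need the f1_macro and roc_auc_ovr variants.

from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_validate

# Multi-metric 5-fold cross-validation of a single model (illustration only).
scores = cross_validate(
    RandomForestClassifier(random_state=42),
    X_train, y_train,
    cv=5,
    scoring=["accuracy", "f1_macro", "roc_auc_ovr"],
)
for metric in ("accuracy", "f1_macro", "roc_auc_ovr"):
    print(f"{metric}: {scores['test_' + metric].mean():.3f}")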

Contributing

Contributions are welcome! Please feel free to submit a Pull Request or open an Issue if you find bugs or want to suggest new features.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/my-feature)
  3. Commit your changes (git commit -m 'Add some feature')
  4. Push to the branch (git push origin feature/my-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Project details


Download files

Download the file for your platform.

Source Distribution

iljicevs_ml-0.2.tar.gz (5.4 kB)

Built Distribution

iljicevs_ml-0.2-py3-none-any.whl (6.2 kB)

File details

Details for the file iljicevs_ml-0.2.tar.gz.

File metadata

  • Download URL: iljicevs_ml-0.2.tar.gz
  • Upload date:
  • Size: 5.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.8.6

File hashes

Hashes for iljicevs_ml-0.2.tar.gz

  • SHA256: f7cd8b2986ba4f79c1b328deb0323de76ce39f6f815fe455a1b59700c269bb3c
  • MD5: 9f3dd2d5e70f4975d22eb8f23601d726
  • BLAKE2b-256: 33b9e6b97048d11e06e907de10193272b931eb57c7a8eb31990147c4db114f97

File details

Details for the file iljicevs_ml-0.2-py3-none-any.whl.

File metadata

  • Download URL: iljicevs_ml-0.2-py3-none-any.whl
  • Upload date:
  • Size: 6.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.8.6

File hashes

Hashes for iljicevs_ml-0.2-py3-none-any.whl

  • SHA256: de8b39adf9275c7dd0d43bb6d2d014dfc82c70737ee16e53c18fcac57c288e0c
  • MD5: 45246f6ab695b9bd9364ebfd6ed8c5c6
  • BLAKE2b-256: 8711acbd550c78f21f77722b804a4adc7c23f71c2617b20b48c476993edd4544

