xplainable

Real-time explainable machine learning for business optimisation


Xplainable makes tabular machine learning transparent, fair, and actionable.

Why Was Xplainable Created?

In machine learning, there has long been a trade-off between accuracy and explainability. This trade-off has led to the creation of explainable ML libraries such as SHAP and LIME, which estimate how models arrive at their decisions. These estimations can be computationally expensive and often present steep learning curves, making them challenging to implement effectively in production environments.

To solve this problem, we created xplainable. xplainable presents a suite of novel machine learning algorithms specifically designed to match the performance of popular black box models like XGBoost and LightGBM while providing complete transparency, all in real-time.

Simple Interface

You can interface with xplainable either through a typical Pythonic API, or using a notebook-embedded GUI in your Jupyter Notebook.

Models

Xplainable provides the fundamental tabular models used by data science teams. They are fast, accurate, and easy to use.

Model                         Python API    Jupyter GUI
Regression                    ✓             ✓
Binary Classification         ✓             ✓
Multi-Class Classification    ✓             🔜

Installation

You can install the core features of xplainable with:

pip install xplainable

To use the xplainable GUI in a Jupyter Notebook, install with:

pip install xplainable[gui]
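
Note that some shells (zsh, for example) interpret square brackets as glob patterns, so the extra may need to be quoted:

pip install "xplainable[gui]"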

Getting Started

Basic Example

import xplainable as xp
from xplainable.core.models import XClassifier
import pandas as pd
from sklearn.model_selection import train_test_split

# Load data
data = xp.load_dataset('titanic')

X, y = data.drop(columns=['Survived']), data['Survived']

X_train, X_test, y_train, y_test = train_test_split(
     X, y, test_size=0.25, random_state=42)

# Train a model
model = XClassifier()
model.fit(X_train, y_train)

# Explain the model
model.explain()

Features

Xplainable streamlines development by making model tuning and deployment straightforward.

Preprocessing

We built a comprehensive suite of preprocessing transformers for rapid and reproducible data preprocessing.

Feature                     Python API    Jupyter GUI
Data Health Checks          ✓             ✓
Transformers Library        ✓             ✓
Preprocessing Pipelines     ✓             ✓
Pipeline Persistence        ✓             ✓

Using the API

from xplainable.preprocessing.pipeline import XPipeline
from xplainable.preprocessing import transformers as xtf

pipeline = XPipeline()

# Add stages for specific features
pipeline.add_stages([
    {"feature": "age", "transformer": xtf.Clip(lower=18, upper=99)},
    {"feature": "balance", "transformer": xtf.LogTransform()}
])

# add stages on multiple features
pipeline.add_stages([
    {"transformer": xtf.FillMissing({'job': 'mode', 'age': 'mean'})},
    {"transformer": xtf.DropCols(columns=['duration', 'campaign'])}
])

# Fit and transform the training data (train and test are pandas DataFrames)
train_transformed = pipeline.fit_transform(train)

# Apply the same transformations to new data
test_transformed = pipeline.transform(test)
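
Pipeline Persistence is listed as a feature above; the exact persistence API isn't shown in this README, so the following is a minimal sketch that assumes the fitted XPipeline is an ordinary picklable Python object and uses joblib (an assumption, not necessarily the library's own mechanism):

import joblib

# Persist the fitted pipeline to disk (assumes the pipeline object is picklable)
joblib.dump(pipeline, 'preprocessing_pipeline.joblib')

# Reload it later and apply the same transformations to new data
pipeline = joblib.load('preprocessing_pipeline.joblib')
new_transformed = pipeline.transform(test)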

Using the GUI

pp = xp.Preprocessor()

pp.preprocess(train)

Modelling

Xplainable models can be developed, optimised, and re-optimised using Pythonic APIs or the embedded GUI.

Feature                                   Python API    Jupyter GUI
Classic Vanilla Data Science APIs         ✓             -
AutoML                                    ✓             ✓
Hyperparameter Optimisation               ✓             ✓
Partitioned Models                        ✓             ✓
Rapid Refitting (novel to xplainable)     ✓             ✓
Model Persistence                         ✓             ✓

Using the API

import xplainable as xp
from xplainable.core.models import XClassifier
from xplainable.core.optimisation.bayesian import XParamOptimiser
from sklearn.model_selection import train_test_split
import pandas as pd

# Load your data
data = xp.load_dataset('titanic')

# note: the data requires preprocessing, so results may be poor
X, y = data.drop('Survived', axis=1), data['Survived']

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

# Optimise params
opt = XParamOptimiser(metric='roc-auc')
params = opt.optimise(X_train, y_train)

# Train your model
model = XClassifier(**params)
model.fit(X_train, y_train)

# Predict on the test set
y_pred = model.predict(X_test)

# Explain the model
model.explain()
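
Since y_pred above comes from a standard predict call, the hold-out set can be scored with ordinary scikit-learn metrics. A minimal sketch (an addition, not part of the original example):

from sklearn.metrics import accuracy_score, classification_report

# Score the hold-out predictions with standard scikit-learn metrics
print(accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred))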

Using the GUI

model = xp.classifier(train)

Rapid Refitting

Fine tune your models by refitting model parameters on the fly, even on individual features.

Using the API

new_params = {
    "features": ['Age'],
    "max_depth": 6,
    "min_info_gain": 0.01,
    "min_leaf_size": 0.03,
    "weight": 0.05,
    "power_degree": 1,
    "sigmoid_exponent": 1,
    "x": X_train,
    "y": y_train
}

model.update_feature_params(**new_params)
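
After updating the parameters, the explanation can be regenerated with the same call shown earlier to inspect the effect of the refit:

# Regenerate the explanation to see the effect of the updated parameters
model.explain()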

Using the GUI


Explainability

Models are explainable in real time, right out of the box, without having to fit surrogate models such as SHAP or LIME.

Feature                      Python API    Jupyter GUI
Global Explainers            ✓             ✓
Regional Explainers          ✓             ✓
Local Explainers             ✓             ✓
Real-time Explainability     ✓             ✓
model.explain()

Action & Optimisation

We leverage the explainability of our models to provide real-time recommendations on how to optimise predicted outcomes at a local and global level.

Feature
Automated Local Prediction Optimisation
Automated Global Decision Optimisation 🔜

Deployment

Xplainable brings transparency to API deployments, and it's easy. By the time your finger leaves the mouse, your model is on a secure server and ready to go.

Feature                                    Python API    Xplainable Cloud
< 1 Second API Deployments                 ✓             ✓
Explainability-Enabled API Deployments     ✓             ✓
A/B Testing                                -             🔜
Champion Challenger Models (MAB)           -             🔜

#FairML

We promote fair and ethical use of technology for all machine learning tasks. To help encourage this, we're working on additional bias detection and fairness testing classes to ensure that everything you deploy is safe, fair, and compliant.

Feature                      Python API    Xplainable Cloud
Bias Identification          ✓             ✓
Automated Bias Detection     🔜            🔜
Fairness Testing             🔜            🔜
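
Bias Identification is listed as available through the Python API, but the specific calls aren't shown in this README. As a purely illustrative, hypothetical sketch of the kind of check involved (plain pandas, not the xplainable API), you could compare positive prediction rates across groups of a sensitive attribute, reusing X_test and y_pred from the modelling example above and assuming the Titanic data includes a 'Sex' column:

# Hypothetical bias check using plain pandas (not the xplainable API).
# Compares positive prediction rates across groups of a sensitive attribute.
results = X_test.copy()
results['prediction'] = y_pred

rate_by_group = results.groupby('Sex')['prediction'].mean()
print(rate_by_group)

# A large gap between groups is a prompt for closer fairness testing
print("Max rate gap:", rate_by_group.max() - rate_by_group.min())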

Xplainable Cloud

This Python package is free and open-source. To add more value to data teams within organisations, we also created Xplainable Cloud, which brings your models into a collaborative environment.

import xplainable as xp
import os

xp.initialise(api_key=os.environ['XP_API_KEY'])

Contributors

We'd love to welcome contributors to xplainable to keep driving forward more transparent and actionable machine learning. We're working on our contributor docs at the moment, but if you're interested in contributing, please send us a message at contact@xplainable.io.





Thanks for trying xplainable!

Made with ❤️ in Australia


© Xplainable Pty Ltd
