This repository provides tools for comparing explainability methods, enhancing the interpretation of machine learning models.

Description

xai-compare is an open-source library that provides a suite of tools to systematically compare and evaluate the quality of explanations generated by different Explainable AI (XAI) methods. This package facilitates the development of new XAI methods and promotes transparent evaluations of such methods.

xai-compare includes a variety of XAI techniques, such as SHAP, LIME, and Permutation Feature Importance, and introduces comparison techniques such as consistency measurement and feature selection analysis. It is designed to be flexible and easy to integrate, with the goal of enhancing model transparency and interpretability across a range of applications.

Installation

The package can be installed from PyPI using pip:

pip install xai-compare

Explainers

xai-compare supports three popular model-agnostic XAI methods:

SHAP

  • SHAP values attribute each feature's contribution to an individual prediction; aggregated across a dataset, they provide a global interpretation of the model's behavior.
  • Depending on the model type, xai-compare initializes an appropriate explainer, such as shap.TreeExplainer for tree-based models, shap.LinearExplainer for linear models, or shap.KernelExplainer for arbitrary models, and uses it to analyze and explain the model's behavior (see the sketch below).
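A minimal sketch of this explainer selection, using the shap library directly on a tree-based scikit-learn model. The model and dataset here are illustrative; xai-compare's internal wiring may differ:

```python
import shap
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor

X, y = fetch_california_housing(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Tree-based model -> shap.TreeExplainer; shap.LinearExplainer suits linear
# models, and shap.KernelExplainer covers the general, model-agnostic case.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X.iloc[:100])  # one attribution per feature
```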

LIME

  • LIME provides local interpretations of individual predictions by approximating the model's behavior in the neighborhood of a specific data point.
  • xai-compare initializes a LimeTabularExplainer and uses it to explain the model's individual predictions (see the sketch below).
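A hedged sketch of the same idea using the lime library directly; the classifier and dataset are placeholders:

```python
from lime.lime_tabular import LimeTabularExplainer
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier

data = load_breast_cancer()
model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

explainer = LimeTabularExplainer(
    data.data,
    feature_names=data.feature_names,
    class_names=data.target_names,
    mode="classification",
)
# Fit a local surrogate around one instance and report the top features.
explanation = explainer.explain_instance(
    data.data[0], model.predict_proba, num_features=5
)
print(explanation.as_list())
```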

Permutation Feature Importance

  • Permutation Feature Importance assesses the impact of each feature on a model’s prediction by measuring the decrease in the model’s performance when the values of a feature are randomly shuffled.
  • xai-compare calculates the decrease in model performance after permuting each feature, averaged over multiple permutations so that a single unlucky shuffle does not dominate the estimate (see the sketch below).
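A sketch of permutation importance using scikit-learn's built-in implementation; xai-compare's own computation may differ in detail:

```python
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = fetch_california_housing(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_train, y_train)

# Shuffle each feature n_repeats times and record the average drop in score.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for name, drop in zip(X.columns, result.importances_mean):
    print(f"{name}: {drop:.4f}")
```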

Comparison techniques

Feature selection

The FeatureSelection class helps optimize machine learning models by identifying and prioritizing the most influential features. It uses any of the supported explainers (SHAP, LIME, or Permutation Importance) to score feature relevance, then iteratively removes the least significant features, letting users observe the impact of each removal on model performance. This both improves model efficiency and enhances interpretability, making model decisions easier to understand and justify. The sketch below illustrates the underlying loop.
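A hedged sketch of the iterative feature-removal workflow described above, implemented here with scikit-learn's permutation importance rather than xai-compare's own API (which is not shown on this page):

```python
import numpy as np
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = fetch_california_housing(return_X_y=True, as_frame=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

features = list(X.columns)
while len(features) > 3:  # stopping rule chosen arbitrarily for illustration
    model = RandomForestRegressor(n_estimators=50, random_state=0)
    model.fit(X_tr[features], y_tr)
    importances = permutation_importance(
        model, X_te[features], y_te, n_repeats=5, random_state=0
    ).importances_mean
    weakest = features[int(np.argmin(importances))]
    print(f"score={model.score(X_te[features], y_te):.3f}  dropping: {weakest}")
    features.remove(weakest)  # drop the least influential feature and repeat
```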

Feature Selection Workflow

Consistency

The Consistency class assesses the stability and reliability of the explanations each explainer produces across different splits of the data. This is crucial for determining whether the insights provided by model explainers hold regardless of variations in the data (see the sketch below).
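An illustrative consistency check: compute mean-|SHAP| feature importances on several data splits and compare them via pairwise correlation. The Consistency class automates a comparable procedure across explainers; the correlation metric used here is an assumption, not the class's documented behavior:

```python
import numpy as np
import shap
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

X, y = fetch_california_housing(return_X_y=True, as_frame=True)

importances = []
for _, idx in KFold(n_splits=3, shuffle=True, random_state=0).split(X):
    X_split, y_split = X.iloc[idx], y.iloc[idx]
    model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_split, y_split)
    shap_values = shap.TreeExplainer(model).shap_values(X_split.iloc[:200])
    importances.append(np.abs(shap_values).mean(axis=0))  # global importance per split

# Pairwise correlation of the importance vectors: values near 1 suggest the
# explanations are consistent across splits.
print(np.corrcoef(np.vstack(importances)))
```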

Consistency Measurement Workflow

Sample notebooks

The notebooks below demonstrate different use cases for the xai-compare package. For hands-on experience and to explore the notebooks in detail, visit the notebooks directory in the repository.

Feature Selection Test Notebook

Consistency Test Notebook

Main Demo Notebook

Call for Contributors

We're seeking contributors with expertise in machine learning, ideally in explainable AI (XAI), and proficiency in Python. If you have a background in these areas and are passionate about improving machine learning model transparency, we welcome your contributions. Join us in shaping the future of interpretable AI.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • The California housing dataset is sourced from scikit-learn.
  • SHAP and LIME libraries are used for model interpretability.
