This repository aims to provide tools for comparing different explainability methods, enhancing the interpretation of machine learning models.

Project description


Description

xai-compare is an open-source library that provides a suite of tools to systematically compare and evaluate the quality of explanations generated by different Explainable AI (XAI) methods. This package facilitates the development of new XAI methods and promotes transparent evaluations of such methods.

xai-compare includes a variety of XAI techniques like SHAP, LIME, and Permutation Feature Importance, and introduces advanced comparison techniques such as consistency measurement and feature selection analysis. It is designed to be flexible, easy to integrate, and ideal for enhancing model transparency and interpretability across various applications.

Installation

The package can be installed from PyPI:

Using pip:

pip install xai-compare

Explainers

xai-compare supports three popular model-agnostic XAI methods:

SHAP

  • SHAP values attribute each feature's contribution to an individual prediction; aggregating these attributions across many predictions yields a global view of the model's behavior.
  • Depending on the model type, xai-compare initializes an appropriate explainer: shap.TreeExplainer for tree-based models, shap.LinearExplainer for linear models, or shap.KernelExplainer for more general models (see the sketch below).
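A minimal sketch of this dispatch using the shap library directly; the xai-compare wrapper API may differ, so treat this as an illustration of the underlying calls rather than the package's own interface:

import shap
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
# Fit a tree-based model on the California housing data.
X, y = fetch_california_housing(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
# Tree-based model, so TreeExplainer is the appropriate choice.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X.iloc[:100])  # per-feature attributions
shap.summary_plot(shap_values, X.iloc[:100])       # aggregate into a global view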

LIME

  • LIME provides local interpretations of individual predictions by approximating the model's behavior around specific data points.
  • xai-compare initializes a LimeTabularExplainer and uses it to explain individual model predictions (see the sketch below).
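A minimal sketch using the lime package directly, again illustrating the underlying calls rather than the xai-compare wrapper:

from lime.lime_tabular import LimeTabularExplainer
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
X, y = fetch_california_housing(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
explainer = LimeTabularExplainer(X.values, feature_names=list(X.columns), mode="regression")
# LIME fits a local surrogate model around this single row.
explanation = explainer.explain_instance(X.values[0], model.predict, num_features=5)
print(explanation.as_list())  # (feature rule, weight) pairs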

Permutation Feature Importance

  • Permutation Feature Importance assesses the impact of each feature on a model’s prediction by measuring the decrease in the model’s performance when the values of a feature are randomly shuffled.
  • xai-compare computes this as the average decrease in model performance over multiple random permutations of each feature (see the sketch below).
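A minimal sketch of the same computation using scikit-learn's permutation_importance, which implements the procedure described above:

from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
X, y = fetch_california_housing(return_X_y=True, as_frame=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_tr, y_tr)
# Average performance drop over 10 random shuffles of each feature.
result = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for name, mean in zip(X.columns, result.importances_mean):
    print(f"{name}: {mean:.4f}")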

Comparison techniques

Feature selection

The FeatureSelection class in xai-compare is a robust tool for optimizing machine learning models by identifying and prioritizing the most influential features. This class leverages a variety of explainers, including SHAP, LIME, and Permutation Importance, to evaluate feature relevance systematically. It facilitates the iterative removal of less significant features, allowing users to understand the impact of each feature on model performance. This approach not only improves model efficiency but also enhances interpretability, making it easier to understand and justify model decisions.
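The snippet below is a conceptual sketch of this iterative loop using scikit-learn directly; it is not the FeatureSelection API itself (see the Feature Selection Comparison Notebook below for the package's own interface):

from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
X, y = fetch_california_housing(return_X_y=True, as_frame=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
features = list(X.columns)
while len(features) > 3:  # the stopping rule here is arbitrary
    model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_tr[features], y_tr)
    score = model.score(X_te[features], y_te)
    imp = permutation_importance(model, X_te[features], y_te, n_repeats=5, random_state=0)
    weakest = features[imp.importances_mean.argmin()]
    print(f"{len(features)} features, R^2={score:.3f}, dropping {weakest!r}")
    features.remove(weakest)

Tracking the score at each step makes the trade-off explicit: performance that holds steady as features are removed indicates those features contributed little to the model.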

[Figure: Feature Selection Workflow]

Consistency

The Consistency class assesses the stability and reliability of explanations produced by different explainers across multiple splits of the data. This is crucial for determining whether an explainer's insights hold up under variation in the data.
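As a conceptual sketch of what such a consistency check involves (not the Consistency API itself), one might compare feature-importance rankings across cross-validation splits:

from scipy.stats import spearmanr
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import KFold
X, y = fetch_california_housing(return_X_y=True, as_frame=True)
rankings = []
for train_idx, test_idx in KFold(n_splits=3, shuffle=True, random_state=0).split(X):
    model = RandomForestRegressor(n_estimators=50, random_state=0)
    model.fit(X.iloc[train_idx], y.iloc[train_idx])
    imp = permutation_importance(model, X.iloc[test_idx], y.iloc[test_idx], n_repeats=5, random_state=0)
    rankings.append(imp.importances_mean)
# High pairwise rank correlation suggests stable explanations across splits.
rho, _ = spearmanr(rankings[0], rankings[1])
print(f"Spearman correlation between splits 0 and 1: {rho:.3f}")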

[Figure: Consistency Measurement Workflow]

Sample notebooks

The notebooks below demonstrate different use cases for the xai-compare package. For hands-on experience, explore the notebooks directory in the repository.

Feature Selection Comparison Notebook

Consistency Comparison Notebook

Main Demo Notebook

Call for Contributors

We're seeking individuals with expertise in machine learning, preferably explainable artificial intelligence (XAI), and proficiency in Python programming. If you have a background in these areas and are passionate about enhancing machine learning model transparency, we welcome your contributions. Join us in shaping the future of interpretable AI.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • The California housing dataset is sourced from scikit-learn.
  • SHAP and LIME libraries are used for model interpretability.
