Intel® Explainable AI Tools

Project description

Intel® Explainable AI Tools

This repository provides tools for data scientists and MLOps engineers who have requirements specific to AI model interpretability.

Overview

The Intel Explainable AI Tools are designed to help users detect and mitigate fairness and interpretability issues, while running best on Intel hardware. There are two Python* components in the repository:

  • Model Card Generator
    • Creates interactive HTML reports containing model performance and fairness metrics
  • Explainer
    • Runs post-hoc model distillation and visualization methods to examine predictive behavior for both TensorFlow* and PyTorch* models via a simple Python API including the following modules:

      • Attributions: Visualize negative and positive attributions of tabular features, pixels, and word tokens for predictions
      • CAM (Class Activation Mapping): Create heatmaps for CNN image classifications using gradient-weighted class activation mapping (Grad-CAM)
      • Metrics: Gain insight into models with the measurements and visualizations needed during the machine learning workflow
    • ShapUI: A user interface to explore and compare impact scores of model predictions for each record of a tabular data set and discover insights into a model's behavior (see the attribution sketch after this list):

      • Error Analysis allows users to filter data points based on their error type.
      • Impact Analysis allows users to filter data points based on their associated SHAP impact score for top important features.
      • Feature Analysis allows users to filter data points by feature values for top important features.
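
The Explainer's attribution features and the ShapUI impact scores are SHAP-style feature attributions. As a rough, standalone illustration of the kind of per-feature attribution these modules visualize, the sketch below uses the shap package directly on a scikit-learn model; it is not the intel-xai API, whose module and function names differ.

import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Train a small model on a tabular regression dataset
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# Compute SHAP attributions and plot global feature importance
explainer = shap.Explainer(model, X)    # dispatches to a tree explainer for this model
shap_values = explainer(X.iloc[:100])   # per-feature attributions for the first 100 rows
shap.plots.bar(shap_values)             # mean |SHAP value| per feature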

Get Started

Requirements

  • Linux system or WSL2 on Windows (validated on Ubuntu* 20.04/22.04 LTS)
  • Python 3.9 or 3.10
  • Install required OS packages with apt-get install build-essential python3-dev
  • git (only required for the "Developer Installation")

Create and activate a Python3 virtual environment

We encourage you to use a Python virtual environment (virtualenv or conda) for consistent package management. There are two ways to do this:

Using virtualenv:

python3.9 -m virtualenv xai_env
source xai_env/bin/activate

Or using conda:

conda create --name xai_env python=3.9
conda activate xai_env

Basic Installation

pip install intel-xai
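
To confirm the installation, you can print the installed version by distribution name (a quick sanity check; importlib.metadata is available in the Python 3.9/3.10 versions required above):

python -c "from importlib.metadata import version; print(version('intel-xai'))"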

Developer Installation

Use these instructions to install the Intel Explainable AI Tools from a clone of the GitHub repository. Choose this route instead of the basic pip install if you plan to make code changes.

  1. Clone this repo and navigate to the repo directory:
    git clone https://github.com/IntelAI/intel-xai-tools.git
    
    cd intel-xai-tools
    
  2. Install the Intel Explainable AI Tools using the following command:
    make install
    

Running Notebooks

The repository includes Jupyter* notebooks showing how to use the Explainer and Model Card Generator APIs in various ML domains and use cases.
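
If Jupyter is not already installed in your virtual environment, a typical way to run the notebooks locally is shown below (a generic sketch; individual notebooks may list additional dependencies):

pip install notebook
jupyter notebook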

Support

The Intel Explainable AI Tools team tracks bugs and enhancement requests using GitHub issues. Before submitting a suggestion or bug report, search the existing GitHub issues to see if your issue has already been reported.

*Other names and brands may be claimed as the property of others.

DISCLAIMER:

These scripts are not intended for benchmarking Intel platforms. For any performance and/or benchmarking information on specific Intel platforms, visit https://www.intel.ai/blog.

Intel is committed to the respect of human rights and avoiding complicity in human rights abuses, a policy reflected in the Intel Global Human Rights Principles. Accordingly, by accessing the Intel material on this platform you agree that you will not use the material in a product or application that causes or contributes to a violation of an internationally recognized human right.

License:

Intel® Explainable AI Tools is licensed under Apache License Version 2.0.

Datasets:

To the extent that any public datasets are referenced by Intel or accessed using tools or code on this site, those datasets are provided by the third party indicated as the data source. Intel does not create the data or datasets, and does not warrant their accuracy or quality. By accessing the public dataset(s) you agree to the terms associated with those datasets and that your use complies with the applicable license.

Intel expressly disclaims the accuracy, adequacy, or completeness of any public datasets, and is not liable for any errors, omissions, or defects in the data, or for any reliance on the data. Intel is not liable for any liability or damages relating to your use of public datasets.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

intel_xai-0.3.1.dev20230608-py3-none-any.whl (46.1 kB)

Uploaded: Python 3

File details

Details for the file intel_xai-0.3.1.dev20230608-py3-none-any.whl.

File hashes

Hashes for intel_xai-0.3.1.dev20230608-py3-none-any.whl:

Algorithm    Hash digest
SHA256       8c97eec5f924a7f18cb2abd638e145c487464ad4ce09c403c23866db1b4c649a
MD5          a0a2cb440f1c801085c1abbcc1a722eb
BLAKE2b-256  731db0ec5be8d0961f51f47bb2b0bccd70713aa4abdfecb3d99935ed665a2307

See the PyPI documentation for more details on using hashes.
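
For example, to check a downloaded wheel against the SHA256 digest above on Linux (assuming GNU coreutils is available):

sha256sum intel_xai-0.3.1.dev20230608-py3-none-any.whl
# Compare the printed digest with the SHA256 value listed in the table above.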
