shapiq: Shapley Interactions for Machine Learning


An interaction may speak more than a thousand main effects.

Shapley Interaction Quantification (shapiq) is a Python package for (1) approximating any-order Shapley interactions, (2) benchmarking game-theoretic algorithms for machine learning, and (3) explaining feature interactions of model predictions. shapiq extends the well-known shap package both for researchers working on game theory in machine learning and for end users explaining their models. SHAP-IQ extends individual Shapley values by quantifying synergy effects between entities (called players in game-theoretic jargon) such as explanatory features, data points, or weak learners in ensemble models. These synergies between players give a more comprehensive view of machine learning models than individual attributions alone.

🛠️ Install

shapiq is intended to work with Python 3.9 and above. Installation can be done via pip:

pip install shapiq
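
Once installed, a quick sanity check is to import the package and print its version; this sketch assumes shapiq exposes a __version__ attribute, as most Python packages do:

import shapiq
print(shapiq.__version__)  # should print the installed version, e.g. 1.0.1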

⭐ Quickstart

You can explain a model with shapiq.explainer and visualize Shapley interactions with shapiq.plot. If you are interested in the underlying game-theoretic algorithms, check out the shapiq.approximator and shapiq.games modules.

📈 Compute any-order feature interactions

Explain your models with Shapley interaction values such as the k-SII (k-Shapley Interaction Index) values:

import shapiq
# load data
X, y = shapiq.load_california_housing(to_numpy=True)
# train a model
from sklearn.ensemble import RandomForestRegressor
model = RandomForestRegressor()
model.fit(X, y)
# set up an explainer with k-SII interaction values up to order 4
explainer = shapiq.TabularExplainer(
    model=model,
    data=X,
    index="k-SII",
    max_order=4
)
# explain the model's prediction for the first sample
interaction_values = explainer.explain(X[0], budget=256)
# analyse interaction values
print(interaction_values)

>> InteractionValues(
>>     index=k-SII, max_order=4, min_order=0, estimated=False,
>>     estimation_budget=256, n_players=8, baseline_value=2.07282292,
>>     Top 10 interactions:
>>         (0,): 1.696969079  # attribution of feature 0
>>         (0, 5): 0.4847876
>>         (0, 1): 0.4494288  # interaction between features 0 & 1
>>         (0, 6): 0.4477677
>>         (1, 5): 0.3750034
>>         (4, 5): 0.3468325
>>         (0, 3, 6): -0.320  # interaction between features 0 & 3 & 6
>>         (2, 3, 6): -0.329
>>         (0, 1, 5): -0.363
>>         (6,): -0.56358890
>> )
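
The returned InteractionValues object can also be inspected programmatically. The sketch below assumes that InteractionValues supports indexing by a tuple of player indices; get_n_order_values (also used in the plotting example below) collects all values of a given order:

# attribution of feature 0 (assuming tuple indexing is supported)
print(interaction_values[(0,)])
# pairwise interaction between features 0 and 1
print(interaction_values[(0, 1)])
# all first-order values collected into an array
print(interaction_values.get_n_order_values(1))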

📊 Visualize feature interactions

A handy way of visualizing interaction scores up to order 2 is a network plot; an example is shown below. The nodes represent feature attributions and the edges represent interactions between features. The size of the nodes and the strength of the edges are proportional to the absolute values of the attributions and interactions, respectively.

shapiq.network_plot(
    first_order_values=interaction_values.get_n_order_values(1),
    second_order_values=interaction_values.get_n_order_values(2)
)
# or use
interaction_values.plot_network()
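
To keep the figure for later use, it can be saved with matplotlib. This is a minimal sketch that assumes the network plot is drawn onto the current matplotlib figure; depending on the shapiq version, the plot call may show and close the figure automatically, in which case an option to suppress displaying would be needed:

import matplotlib.pyplot as plt

interaction_values.plot_network()
plt.savefig("network_plot.png", dpi=300, bbox_inches="tight")  # save the current figure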

The code above produces a network plot like the following:

[network_plot_example]

📖 Documentation with tutorials

The documentation for shapiq, including tutorials and an API reference, is available at https://shapiq.readthedocs.io.

💬 Citation

If you enjoy shapiq, consider starring ⭐ the repository. If the package has been useful to you and you would like to cite it in a scientific publication, please refer to our paper:

@inproceedings{shapiq,
  title        = {{SHAP-IQ}: Unified approximation of any-order Shapley interactions},
  author       = {Fabian Fumagalli and
                  Maximilian Muschalik and
                  Patrick Kolpaczki and
                  Eyke H{\"{u}}llermeier and
                  Barbara Hammer},
  booktitle    = {NeurIPS},
  year         = {2023}
}

Changelog

v1.0.1 (2024-06-05)

  • add max_order=1 to TabularExplainer and TreeExplainer
  • fix TreeExplainer.explain_X(..., njobs=2, random_state=0)

v1.0.0 (2024-06-04)

Major release of the shapiq Python package including (among others):

  • approximator module implements over 10 approximators of Shapley values and interaction indices.
  • exact module implements exact computation of over 10 game-theoretic concepts, such as interaction indices and generalized values.
  • games module implements over 10 application benchmarks for the approximators.
  • explainer module includes a TabularExplainer and a TreeExplainer for explaining any-order feature interactions of machine learning model predictions.
  • interaction_values module implements a data class to store and analyze interaction values.
  • plot module allows visualizing interaction values.
  • datasets module loads datasets for testing and examples.

Documentation of shapiq with tutorials and API reference is available at https://shapiq.readthedocs.io
