A unified approach to explain the output of any machine learning model.


SHAP (SHapley Additive exPlanations) is a game theoretic approach to explain the output of any machine learning model. It connects optimal credit allocation with local explanations using the classic Shapley values from game theory and their related extensions (see papers for details and citations).
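For intuition, the classic Shapley value averages each player's marginal contribution over every possible ordering of the players. A brute-force sketch for a toy two-feature value function (illustrative only — this O(n!) enumeration is not how the library computes values):

```python
from itertools import permutations
from math import factorial

def shapley_values(value, players):
    """Brute-force Shapley values: average each player's marginal
    contribution to `value` over all orderings (O(n!), toy sizes only)."""
    phi = dict.fromkeys(players, 0.0)
    for order in permutations(players):
        coalition = set()
        for p in order:
            before = value(coalition)
            coalition.add(p)
            phi[p] += value(coalition) - before
    return {p: total / factorial(len(players)) for p, total in phi.items()}

# toy value function: two features with main effects and an interaction
v = lambda s: 10.0 * ('a' in s) + 5.0 * ('b' in s) + 2.0 * ('a' in s and 'b' in s)
phi = shapley_values(v, ['a', 'b'])   # the interaction term is split evenly
```

Note that the interaction effect (2.0) is shared equally between the two features, and the attributions sum exactly to the value of the full coalition — the "optimal credit allocation" property the paragraph above refers to.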

Install

SHAP can be installed from either PyPI or conda-forge:

pip install shap
or
conda install -c conda-forge shap

Tree ensemble example (XGBoost/LightGBM/CatBoost/scikit-learn/pyspark models)

While SHAP can explain the output of any machine learning model, we have developed a high-speed exact algorithm for tree ensemble methods (see our Nature MI paper). Fast C++ implementations are supported for XGBoost, LightGBM, CatBoost, scikit-learn and pyspark tree models:

import xgboost
import shap

# train an XGBoost model
X, y = shap.datasets.california()
model = xgboost.XGBRegressor().fit(X, y)

# explain the model's predictions using SHAP
# (same syntax works for LightGBM, CatBoost, scikit-learn, transformers, Spark, etc.)
explainer = shap.Explainer(model)
shap_values = explainer(X)

# visualize the first prediction's explanation
shap.plots.waterfall(shap_values[0])

The above explanation shows how each feature contributes to pushing the model output from the base value (the average model output over the training dataset we passed) to the final model output. Features pushing the prediction higher are shown in red, those pushing the prediction lower are in blue. Another way to visualize the same explanation is to use a force plot (these are introduced in our Nature BME paper):

# visualize the first prediction's explanation with a force plot
shap.plots.force(shap_values[0])

If we take many force plot explanations such as the one shown above, rotate them 90 degrees, and then stack them horizontally, we can see explanations for an entire dataset (in the notebook this plot is interactive):

# visualize all the training set predictions
shap.plots.force(shap_values[:500])
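All of these plots depend on the additivity property: the SHAP values for a prediction sum to the difference between that prediction and the base value. A minimal sketch of this property on a toy linear model, where (assuming independent features) the exact SHAP values are simply w_i(x_i − E[x_i]):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w = np.array([2.0, -1.0, 0.5])
predict = lambda X: X @ w + 4.0            # toy linear "model"

base_value = predict(X).mean()             # average model output
phi = w * (X - X.mean(axis=0))             # exact SHAP values for a linear model

# additivity: base value + sum of SHAP values reproduces every prediction
assert np.allclose(base_value + phi.sum(axis=1), predict(X))
```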

To understand how a single feature affects the output of the model, we can plot the SHAP value of that feature vs. the value of the feature for all the examples in a dataset. Since SHAP values represent a feature's responsibility for a change in the model output, the plot below represents the change in predicted house price as the latitude changes. Vertical dispersion at a single value of latitude represents interaction effects with other features. To help reveal these interactions we can color by another feature. If we pass the whole explanation tensor to the color argument, the scatter plot will pick the best feature to color by. In this case it picks longitude.

# create a dependence scatter plot to show the effect of a single feature across the whole dataset
shap.plots.scatter(shap_values[:, "Latitude"], color=shap_values)

To get an overview of which features are most important for a model, we can plot the SHAP values of every feature for every sample. The plot below sorts features by the sum of SHAP value magnitudes over all samples, and uses SHAP values to show the distribution of the impacts each feature has on the model output. The color represents the feature value (red high, blue low). This reveals, for example, that a higher median income increases the predicted home price.

# summarize the effects of all the features
shap.plots.beeswarm(shap_values)

We can also just take the mean absolute value of the SHAP values for each feature to get a standard bar plot (produces stacked bars for multi-class outputs):

shap.plots.bar(shap_values)
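The bar plot's ranking is simply the mean absolute SHAP value per feature, which you can reproduce directly. A sketch using a synthetic attribution matrix standing in for `shap_values.values`:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic (n_samples, n_features) attribution matrix standing in for
# shap_values.values; feature 0 has the largest impact, feature 2 the smallest
attributions = rng.normal(size=(100, 3)) * np.array([3.0, 1.0, 0.1])

importance = np.abs(attributions).mean(axis=0)   # global importance per feature
ranking = np.argsort(importance)[::-1]           # most important feature first
```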

Natural language example (transformers)

SHAP has specific support for natural language models like those in the Hugging Face transformers library. By adding coalitional rules to traditional Shapley values we can form games that explain large modern NLP models using very few function evaluations. Using this functionality is as simple as passing a supported transformers pipeline to SHAP:

import transformers
import shap

# load a transformers pipeline model
model = transformers.pipeline('sentiment-analysis', return_all_scores=True)

# explain the model on a sample input
explainer = shap.Explainer(model)
shap_values = explainer(["What a great movie! ...if you have no taste."])

# visualize the first prediction's explanation for the POSITIVE output class
shap.plots.text(shap_values[0, :, "POSITIVE"])

Deep learning example with DeepExplainer (TensorFlow/Keras models)

Deep SHAP is a high-speed approximation algorithm for SHAP values in deep learning models that builds on a connection with DeepLIFT described in the SHAP NIPS paper. The implementation here differs from the original DeepLIFT by using a distribution of background samples instead of a single reference value, and using Shapley equations to linearize components such as max, softmax, products, divisions, etc. Note that some of these enhancements have since been integrated into DeepLIFT as well. TensorFlow models and Keras models using the TensorFlow backend are supported (there is also preliminary support for PyTorch):

# ...include code from https://github.com/keras-team/keras/blob/master/examples/demo_mnist_convnet.py

import shap
import numpy as np

# select a set of background examples to take an expectation over
background = x_train[np.random.choice(x_train.shape[0], 100, replace=False)]

# explain predictions of the model on four images
e = shap.DeepExplainer(model, background)
# ...or pass tensors directly
# e = shap.DeepExplainer((model.layers[0].input, model.layers[-1].output), background)
shap_values = e.shap_values(x_test[1:5])

# plot the feature attributions
shap.image_plot(shap_values, -x_test[1:5])

The plot above explains ten outputs (digits 0-9) for four different images. Red pixels increase the model's output while blue pixels decrease the output. The input images are shown on the left, and as nearly transparent grayscale backings behind each of the explanations. The sum of the SHAP values equals the difference between the expected model output (averaged over the background dataset) and the current model output. Note that for the 'zero' image the blank middle is important, while for the 'four' image the lack of a connection on top makes it a four instead of a nine.

Deep learning example with GradientExplainer (TensorFlow/Keras/PyTorch models)

Expected gradients combines ideas from Integrated Gradients, SHAP, and SmoothGrad into a single expected value equation. This allows an entire dataset to be used as the background distribution (as opposed to a single reference value) and allows local smoothing. If we approximate the model with a linear function between each background data sample and the current input to be explained, and we assume the input features are independent, then expected gradients will compute approximate SHAP values. In the example below we explain how the 7th intermediate layer of the VGG16 ImageNet model impacts the output probabilities.

from keras.applications.vgg16 import VGG16
from keras.applications.vgg16 import preprocess_input
import keras.backend as K
import numpy as np
import json
import shap

# load pre-trained model and choose two images to explain
model = VGG16(weights='imagenet', include_top=True)
X,y = shap.datasets.imagenet50()
to_explain = X[[39,41]]

# load the ImageNet class names
url = "https://s3.amazonaws.com/deep-learning-models/image-models/imagenet_class_index.json"
fname = shap.datasets.cache(url)
with open(fname) as f:
    class_names = json.load(f)

# explain how the input to the 7th layer of the model explains the top two classes
def map2layer(x, layer):
    feed_dict = dict(zip([model.layers[0].input], [preprocess_input(x.copy())]))
    return K.get_session().run(model.layers[layer].input, feed_dict)
e = shap.GradientExplainer(
    (model.layers[7].input, model.layers[-1].output),
    map2layer(X, 7),
    local_smoothing=0 # std dev of smoothing noise
)
shap_values,indexes = e.shap_values(map2layer(to_explain, 7), ranked_outputs=2)

# get the names for the classes
index_names = np.vectorize(lambda x: class_names[str(x)][1])(indexes)

# plot the explanations
shap.image_plot(shap_values, to_explain, index_names)

Predictions for two input images are explained in the plot above. Red pixels represent positive SHAP values that increase the probability of the class, while blue pixels represent negative SHAP values that reduce the probability of the class. By using ranked_outputs=2 we explain only the two most likely classes for each input (this spares us from explaining all 1,000 classes).

Model agnostic example with KernelExplainer (explains any function)

Kernel SHAP uses a specially-weighted local linear regression to estimate SHAP values for any model. Below is a simple example for explaining a multi-class SVM on the classic iris dataset.

import sklearn
import shap
from sklearn.model_selection import train_test_split

# print the JS visualization code to the notebook
shap.initjs()

# train a SVM classifier
X_train,X_test,Y_train,Y_test = train_test_split(*shap.datasets.iris(), test_size=0.2, random_state=0)
svm = sklearn.svm.SVC(kernel='rbf', probability=True)
svm.fit(X_train, Y_train)

# use Kernel SHAP to explain test set predictions
explainer = shap.KernelExplainer(svm.predict_proba, X_train, link="logit")
shap_values = explainer.shap_values(X_test, nsamples=100)

# plot the SHAP values for the Setosa output of the first instance
shap.force_plot(explainer.expected_value[0], shap_values[0][0,:], X_test.iloc[0,:], link="logit")

The above explanation shows how four features each contribute to pushing the model output from the base value (the average model output over the training dataset we passed) towards zero. If there were any features pushing the class label higher they would be shown in red.

If we take many explanations such as the one shown above, rotate them 90 degrees, and then stack them horizontally, we can see explanations for an entire dataset. This is exactly what we do below for all the examples in the iris test set:

# plot the SHAP values for the Setosa output of all instances
shap.force_plot(explainer.expected_value[0], shap_values[0], X_test, link="logit")

SHAP Interaction Values

SHAP interaction values are a generalization of SHAP values to higher order interactions. Fast exact computation of pairwise interactions is implemented for tree models with shap.TreeExplainer(model).shap_interaction_values(X). This returns a matrix for every prediction, where the main effects are on the diagonal and the interaction effects are off-diagonal. These values often reveal interesting hidden relationships, such as how the increased risk of death peaks for men at age 60 (see the NHANES notebook for details).
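The structure of the result is easy to check on a toy example: each per-prediction matrix is symmetric, its row sums recover the ordinary SHAP values, and its total equals the prediction minus the base value. A sketch with a hand-made 3×3 matrix standing in for one prediction's interaction values:

```python
import numpy as np

# hand-made interaction matrix for one prediction: main effects on the
# diagonal, each pairwise interaction split evenly across its two cells
inter = np.array([[0.80, 0.10, -0.05],
                  [0.10, -0.30, 0.00],
                  [-0.05, 0.00, 0.20]])

assert np.allclose(inter, inter.T)   # symmetric for each prediction
shap_vals = inter.sum(axis=1)        # row sums recover the SHAP values
total = inter.sum()                  # equals prediction minus base value
```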

Sample notebooks

The notebooks below demonstrate different use cases for SHAP. Look inside the notebooks directory of the repository if you want to try playing with the original notebooks yourself.

TreeExplainer

An implementation of Tree SHAP, a fast and exact algorithm to compute SHAP values for trees and ensembles of trees.

DeepExplainer

An implementation of Deep SHAP, a faster (but only approximate) algorithm to compute SHAP values for deep learning models that is based on connections between SHAP and the DeepLIFT algorithm.

GradientExplainer

An implementation of expected gradients to approximate SHAP values for deep learning models. It is based on connections between SHAP and the Integrated Gradients algorithm. GradientExplainer is slower than DeepExplainer and makes different approximation assumptions.
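The underlying estimator is simple to sketch: sample a background point x′ and an interpolation coefficient α, then average (x − x′) · ∇f evaluated along the line between x′ and x. A minimal NumPy Monte Carlo version for a toy quadratic model (illustrative only, not the library's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: (x ** 2).sum(axis=-1)      # toy model
grad_f = lambda x: 2 * x                 # its analytic gradient

background = rng.normal(size=(64, 2))    # background distribution
x = np.array([1.0, -2.0])                # input to explain

# expected gradients: E[(x - x') * grad_f(x' + a * (x - x'))]
# over x' drawn from the background and a drawn uniformly from [0, 1]
n = 20000
xp = background[rng.integers(0, len(background), size=n)]
a = rng.uniform(size=(n, 1))
phi = ((x - xp) * grad_f(xp + a * (x - xp))).mean(axis=0)

# completeness: attributions sum to roughly f(x) - E[f(background)]
```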

LinearExplainer

For a linear model with independent features we can analytically compute the exact SHAP values. We can also account for feature correlation if we are willing to estimate the feature covariance matrix. LinearExplainer supports both of these options.

KernelExplainer

An implementation of Kernel SHAP, a model-agnostic method to estimate SHAP values for any model. Because it makes no assumptions about the model type, KernelExplainer is slower than the model-specific algorithms above.
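The "specially-weighted" regression uses the Shapley kernel from the SHAP paper, which weights a coalition of size s out of M features as (M − 1) / (C(M, s) · s · (M − s)). A direct transcription:

```python
from math import comb

def shapley_kernel_weight(M, s):
    """Shapley kernel weight for a coalition of s out of M features.
    Infinite at s == 0 and s == M, which pins the regression so the empty
    coalition maps to the base value and the full coalition to f(x)."""
    if s == 0 or s == M:
        return float('inf')
    return (M - 1) / (comb(M, s) * s * (M - s))

weights = [shapley_kernel_weight(4, s) for s in range(5)]
```

The weights are symmetric in coalition size and grow toward the extremes, so very small and very large coalitions dominate the regression — which is why Kernel SHAP can get useful estimates from relatively few function evaluations.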

  • Census income classification with scikit-learn - Using the standard adult census income dataset, this notebook trains a k-nearest neighbors classifier using scikit-learn and then explains predictions using shap.

  • ImageNet VGG16 Model with Keras - Explain the classic VGG16 convolutional neural network's predictions for an image. This works by applying the model agnostic Kernel SHAP method to a super-pixel segmented image.

  • Iris classification - A basic demonstration using the popular iris species dataset. It explains predictions from six different models in scikit-learn using shap.

Documentation notebooks

These notebooks comprehensively demonstrate how to use specific functions and objects.

Methods Unified by SHAP

  1. LIME: Ribeiro, Marco Tulio, Sameer Singh, and Carlos Guestrin. "Why should I trust you?: Explaining the predictions of any classifier." Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM, 2016.

  2. Shapley sampling values: Strumbelj, Erik, and Igor Kononenko. "Explaining prediction models and individual predictions with feature contributions." Knowledge and information systems 41.3 (2014): 647-665.

  3. DeepLIFT: Shrikumar, Avanti, Peyton Greenside, and Anshul Kundaje. "Learning important features through propagating activation differences." arXiv preprint arXiv:1704.02685 (2017).

  4. QII: Datta, Anupam, Shayak Sen, and Yair Zick. "Algorithmic transparency via quantitative input influence: Theory and experiments with learning systems." Security and Privacy (SP), 2016 IEEE Symposium on. IEEE, 2016.

  5. Layer-wise relevance propagation: Bach, Sebastian, et al. "On pixel-wise explanations for non-linear classifier decisions by layer-wise relevance propagation." PloS one 10.7 (2015): e0130140.

  6. Shapley regression values: Lipovetsky, Stan, and Michael Conklin. "Analysis of regression in game theory approach." Applied Stochastic Models in Business and Industry 17.4 (2001): 319-330.

  7. Tree interpreter: Saabas, Ando. Interpreting random forests. http://blog.datadive.net/interpreting-random-forests/

Citations

The algorithms and visualizations used in this package came primarily out of research in Su-In Lee's lab at the University of Washington, and Microsoft Research. If you use SHAP in your research we would appreciate a citation to the appropriate paper(s).
