
A unified approach to explain the output of any machine learning model.

Project description



SHAP (SHapley Additive exPlanations) is a game theoretic approach to explain the output of any machine learning model. It connects optimal credit allocation with local explanations using the classic Shapley values from game theory and their related extensions (see papers for details and citations).
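As a concrete (if naive) illustration of the game-theoretic idea, the Shapley value averages a player's marginal contribution over every possible coalition. The sketch below brute-forces that definition for a hypothetical three-player additive game; the `payoffs` and value function are made up for illustration, and this O(2^n) enumeration is exactly the cost that SHAP's specialized algorithms avoid:

```python
from itertools import combinations
from math import comb

def shapley_values(value_fn, n_players):
    """Exact Shapley values by enumerating every coalition (O(2^n))."""
    phi = [0.0] * n_players
    for i in range(n_players):
        others = [p for p in range(n_players) if p != i]
        for size in range(len(others) + 1):
            for S in combinations(others, size):
                # |S|! * (n - |S| - 1)! / n!  ==  1 / (n * C(n - 1, |S|))
                weight = 1.0 / (n_players * comb(n_players - 1, size))
                phi[i] += weight * (value_fn(set(S) | {i}) - value_fn(set(S)))
    return phi

# hypothetical additive game: a coalition's value is the sum of its members' payoffs
payoffs = [3.0, 1.0, 2.0]
v = lambda S: sum(payoffs[p] for p in S)
phi = shapley_values(v, 3)
print(phi)  # for an additive game each player's Shapley value equals its own payoff
```

Shapley values satisfy efficiency: they always sum to v(all players) − v(∅), which is the additivity that SHAP's explanation plots rely on.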

Install

SHAP can be installed from either PyPI or conda-forge:

pip install shap
or
conda install -c conda-forge shap

Tree ensemble example (XGBoost/LightGBM/CatBoost/scikit-learn/pyspark models)

While SHAP can explain the output of any machine learning model, we have developed a high-speed exact algorithm for tree ensemble methods (see our Nature MI paper). Fast C++ implementations are supported for XGBoost, LightGBM, CatBoost, scikit-learn and pyspark tree models:

import xgboost
import shap

# train an XGBoost model
X, y = shap.datasets.california()
model = xgboost.XGBRegressor().fit(X, y)

# explain the model's predictions using SHAP
# (same syntax works for LightGBM, CatBoost, scikit-learn, transformers, Spark, etc.)
explainer = shap.Explainer(model)
shap_values = explainer(X)

# visualize the first prediction's explanation
shap.plots.waterfall(shap_values[0])

The above explanation shows how each feature contributes to pushing the model output from the base value (the average model output over the training dataset we passed) to the final model output. Features pushing the prediction higher are shown in red, and those pushing it lower are shown in blue. Another way to visualize the same explanation is to use a force plot (these are introduced in our Nature BME paper):

# visualize the first prediction's explanation with a force plot
shap.plots.force(shap_values[0])
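Both plots rely on additivity: the base value plus a row's SHAP values equals the model's output for that row. This can be checked by hand on a model whose attributions have a closed form; for a linear model over (roughly) independent features, feature i's SHAP value is w_i · (x_i − E[x_i]). A toy numpy sketch with synthetic data, not the California housing model above:

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.array([2.0, -1.0, 0.5])     # weights of a toy linear model
X = rng.normal(size=(200, 3))      # background dataset
f = lambda x: x @ w                # model output: f(x) = w . x

x = np.array([1.0, 2.0, 3.0])      # the row being explained
base_value = f(X).mean()           # average model output over the background
phi = w * (x - X.mean(axis=0))     # closed-form SHAP values for a linear model

# the SHAP values push the output from the base value to f(x)
assert np.isclose(base_value + phi.sum(), f(x))
```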

If we take many force plot explanations such as the one shown above, rotate them 90 degrees, and then stack them horizontally, we can see explanations for an entire dataset (in the notebook this plot is interactive):

# visualize all the training set predictions
shap.plots.force(shap_values[:500])

To understand how a single feature affects the output of the model, we can plot the SHAP value of that feature vs. the value of the feature for all the examples in a dataset. Since SHAP values represent a feature's responsibility for a change in the model output, the plot below represents the change in predicted house price as the latitude changes. Vertical dispersion at a single value of latitude represents interaction effects with other features. To help reveal these interactions we can color by another feature. If we pass the whole explanation tensor to the color argument, the scatter plot picks the best feature to color by; in this case it picks longitude.

# create a dependence scatter plot to show the effect of a single feature across the whole dataset
shap.plots.scatter(shap_values[:, "Latitude"], color=shap_values)

To get an overview of which features are most important for a model we can plot the SHAP values of every feature for every sample. The plot below sorts features by the sum of SHAP value magnitudes over all samples, and uses SHAP values to show the distribution of the impacts each feature has on the model output. The color represents the feature value (red high, blue low). This reveals, for example, that a higher median income increases the predicted home price.

# summarize the effects of all the features
shap.plots.beeswarm(shap_values)

We can also just take the mean absolute value of the SHAP values for each feature to get a standard bar plot (produces stacked bars for multi-class outputs):

shap.plots.bar(shap_values)
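What the bar plot summarizes can be reproduced directly from the matrix of SHAP values; a minimal numpy sketch with a synthetic matrix:

```python
import numpy as np

# synthetic (n_samples, n_features) matrix of SHAP values
shap_matrix = np.array([[ 0.5, -2.0, 0.1],
                        [-1.5,  1.0, 0.2],
                        [ 1.0, -3.0, 0.3]])

# bar height = mean absolute SHAP value of each feature
importance = np.abs(shap_matrix).mean(axis=0)
order = np.argsort(importance)[::-1]   # plot order: most important feature first
print(importance, order)
```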

Natural language example (transformers)

SHAP has specific support for natural language models like those in the Hugging Face transformers library. By adding coalitional rules to traditional Shapley values we can form games that explain large modern NLP models using very few function evaluations. Using this functionality is as simple as passing a supported transformers pipeline to SHAP:

import transformers
import shap

# load a transformers pipeline model
model = transformers.pipeline('sentiment-analysis', return_all_scores=True)

# explain the model on two sample inputs
explainer = shap.Explainer(model)
shap_values = explainer(["What a great movie! ...if you have no taste."])

# visualize the first prediction's explanation for the POSITIVE output class
shap.plots.text(shap_values[0, :, "POSITIVE"])

Deep learning example with DeepExplainer (TensorFlow/Keras models)

Deep SHAP is a high-speed approximation algorithm for SHAP values in deep learning models that builds on a connection with DeepLIFT described in the SHAP NIPS paper. The implementation here differs from the original DeepLIFT by using a distribution of background samples instead of a single reference value, and by using Shapley equations to linearize components such as max, softmax, products, divisions, etc. Note that some of these enhancements have since been integrated into DeepLIFT itself. TensorFlow models and Keras models using the TensorFlow backend are supported (there is also preliminary support for PyTorch):

# ...include code from https://github.com/keras-team/keras/blob/master/examples/mnist_cnn.py

import shap
import numpy as np

# select a set of background examples to take an expectation over
background = x_train[np.random.choice(x_train.shape[0], 100, replace=False)]

# explain predictions of the model on four images
e = shap.DeepExplainer(model, background)
# ...or pass tensors directly
# e = shap.DeepExplainer((model.layers[0].input, model.layers[-1].output), background)
shap_values = e.shap_values(x_test[1:5])

# plot the feature attributions
shap.image_plot(shap_values, -x_test[1:5])

The plot above explains ten outputs (digits 0-9) for four different images. Red pixels increase the model's output while blue pixels decrease the output. The input images are shown on the left, and as nearly transparent grayscale backings behind each of the explanations. The sum of the SHAP values equals the difference between the expected model output (averaged over the background dataset) and the current model output. Note that for the 'zero' image the blank middle is important, while for the 'four' image the lack of a connection on top makes it a four instead of a nine.

Deep learning example with GradientExplainer (TensorFlow/Keras/PyTorch models)

Expected gradients combines ideas from Integrated Gradients, SHAP, and SmoothGrad into a single expected value equation. This allows an entire dataset to be used as the background distribution (as opposed to a single reference value) and allows local smoothing. If we approximate the model with a linear function between each background data sample and the current input to be explained, and assume the input features are independent, then expected gradients will compute approximate SHAP values. In the example below we explain how the 7th intermediate layer of the VGG16 ImageNet model impacts the output probabilities.

from keras.applications.vgg16 import VGG16
from keras.applications.vgg16 import preprocess_input
import keras.backend as K
import numpy as np
import json
import shap

# load pre-trained model and choose two images to explain
model = VGG16(weights='imagenet', include_top=True)
X,y = shap.datasets.imagenet50()
to_explain = X[[39,41]]

# load the ImageNet class names
url = "https://s3.amazonaws.com/deep-learning-models/image-models/imagenet_class_index.json"
fname = shap.datasets.cache(url)
with open(fname) as f:
    class_names = json.load(f)

# explain how the input to the 7th layer of the model explains the top two classes
def map2layer(x, layer):
    feed_dict = dict(zip([model.layers[0].input], [preprocess_input(x.copy())]))
    return K.get_session().run(model.layers[layer].input, feed_dict)
e = shap.GradientExplainer(
    (model.layers[7].input, model.layers[-1].output),
    map2layer(X, 7),
    local_smoothing=0 # std dev of smoothing noise
)
shap_values,indexes = e.shap_values(map2layer(to_explain, 7), ranked_outputs=2)

# get the names for the classes
index_names = np.vectorize(lambda x: class_names[str(x)][1])(indexes)

# plot the explanations
shap.image_plot(shap_values, to_explain, index_names)

Predictions for two input images are explained in the plot above. Red pixels represent positive SHAP values that increase the probability of the class, while blue pixels represent negative SHAP values that reduce the probability of the class. By using ranked_outputs=2 we explain only the two most likely classes for each input (this spares us from explaining all 1,000 classes).
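The expected-value equation described above can be written out directly when the model's gradient is available in closed form; for a linear model it recovers the analytic SHAP values w_i · (x_i − E[x′_i]). A toy numpy sketch under that linear-model assumption (not the VGG16 example):

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.array([1.0, -2.0, 3.0])
grad = lambda x: w                      # gradient of the toy linear model f(x) = w . x

background = rng.normal(size=(500, 3))  # whole background dataset, not one reference
x = np.array([0.5, 1.5, -1.0])          # input being explained

# expected gradients: E[(x - x') * grad(x' + a * (x - x'))], x' ~ background, a ~ U(0, 1)
samples = [(x - x_ref) * grad(x_ref + rng.uniform() * (x - x_ref))
           for x_ref in background]
eg = np.mean(samples, axis=0)

# for a linear model this recovers the closed-form SHAP values
assert np.allclose(eg, w * (x - background.mean(axis=0)))
```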

Model agnostic example with KernelExplainer (explains any function)

Kernel SHAP uses a specially-weighted local linear regression to estimate SHAP values for any model. Below is a simple example for explaining a multi-class SVM on the classic iris dataset.

import sklearn
import shap
from sklearn.model_selection import train_test_split

# print the JS visualization code to the notebook
shap.initjs()

# train a SVM classifier
X_train,X_test,Y_train,Y_test = train_test_split(*shap.datasets.iris(), test_size=0.2, random_state=0)
svm = sklearn.svm.SVC(kernel='rbf', probability=True)
svm.fit(X_train, Y_train)

# use Kernel SHAP to explain test set predictions
explainer = shap.KernelExplainer(svm.predict_proba, X_train, link="logit")
shap_values = explainer.shap_values(X_test, nsamples=100)

# plot the SHAP values for the Setosa output of the first instance
shap.force_plot(explainer.expected_value[0], shap_values[0][0,:], X_test.iloc[0,:], link="logit")

The above explanation shows how four features each contribute to pushing the model output from the base value (the average model output over the training dataset we passed) towards zero. If any features were pushing the class label higher, they would be shown in red.

If we take many explanations such as the one shown above, rotate them 90 degrees, and then stack them horizontally, we can see explanations for an entire dataset. This is exactly what we do below for all the examples in the iris test set:

# plot the SHAP values for the Setosa output of all instances
shap.force_plot(explainer.expected_value[0], shap_values[0], X_test, link="logit")
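Because link="logit" is passed above, the expected value and SHAP values live in log-odds space, where additivity holds before mapping back to a probability with the logistic function. A numpy sketch with hypothetical numbers (these are not the SVM's actual values):

```python
import numpy as np

logit = lambda p: np.log(p / (1 - p))     # probability -> log-odds
expit = lambda z: 1 / (1 + np.exp(-z))    # log-odds -> probability

base = logit(0.3)                         # hypothetical expected_value in log-odds space
phi = np.array([-0.8, -0.5, -0.2, -0.1])  # hypothetical SHAP values in log-odds space

pred = expit(base + phi.sum())            # additivity holds in the link space
print(pred)                               # a probability below the 0.3 base rate
```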

SHAP Interaction Values

SHAP interaction values are a generalization of SHAP values to higher order interactions. Fast exact computation of pairwise interactions is implemented for tree models with shap.TreeExplainer(model).shap_interaction_values(X). This returns a matrix for every prediction, where the main effects are on the diagonal and the interaction effects are off-diagonal. These values often reveal interesting hidden relationships, such as how the increased risk of death peaks for men at age 60 (see the NHANES notebook for details):
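Following the definition in the Tree SHAP paper, each off-diagonal entry averages the pairwise difference δ_ij(S) = v(S∪{i,j}) − v(S∪{i}) − v(S∪{j}) + v(S) over coalitions, with half of the total interaction assigned to (i, j) and half to (j, i). A brute-force sketch on a hypothetical three-player game with a pure product interaction (illustration only, not the tree algorithm):

```python
from itertools import combinations
from math import factorial

def interaction_value(value_fn, n, i, j):
    """Pairwise SHAP interaction value (one off-diagonal entry), brute force."""
    others = [p for p in range(n) if p not in (i, j)]
    phi_ij = 0.0
    for size in range(len(others) + 1):
        for S in combinations(others, size):
            # |S|! * (n - |S| - 2)! / (2 * (n - 1)!)
            weight = factorial(size) * factorial(n - size - 2) / (2 * factorial(n - 1))
            S = set(S)
            delta = (value_fn(S | {i, j}) - value_fn(S | {i})
                     - value_fn(S | {j}) + value_fn(S))
            phi_ij += weight * delta
    return phi_ij

# hypothetical game with a pure interaction between players 0 and 1
v = lambda S: 2.0 if {0, 1} <= S else 0.0
phi_01 = interaction_value(v, 3, 0, 1)
print(phi_01)  # half of the total 0-1 interaction; the symmetric half sits at (1, 0)
```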

Sample notebooks

The notebooks below demonstrate different use cases for SHAP. Look inside the notebooks directory of the repository if you want to try playing with the original notebooks yourself.

TreeExplainer

An implementation of Tree SHAP, a fast and exact algorithm to compute SHAP values for trees and ensembles of trees.

DeepExplainer

An implementation of Deep SHAP, a faster (but only approximate) algorithm to compute SHAP values for deep learning models that is based on connections between SHAP and the DeepLIFT algorithm.

GradientExplainer

An implementation of expected gradients to approximate SHAP values for deep learning models. It is based on connections between SHAP and the Integrated Gradients algorithm. GradientExplainer is slower than DeepExplainer and makes different approximation assumptions.

LinearExplainer

For a linear model with independent features we can analytically compute the exact SHAP values. We can also account for feature correlation if we are willing to estimate the feature covariance matrix. LinearExplainer supports both of these options.

KernelExplainer

An implementation of Kernel SHAP, a model agnostic method to estimate SHAP values for any model. Because it makes no assumptions about the model type, KernelExplainer is slower than the other, model-specific algorithms.
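Kernel SHAP's specially-weighted regression uses the Shapley kernel, which weights a coalition z of size |z| drawn from M features by (M − 1) / (C(M, |z|) · |z| · (M − |z|)). A short sketch of that weight (M = 4 chosen arbitrarily for illustration):

```python
from math import comb

def shapley_kernel_weight(M, s):
    """Shapley kernel weight for a coalition of size s out of M features."""
    if s == 0 or s == M:
        return float("inf")  # empty/full coalitions are enforced as hard constraints
    return (M - 1) / (comb(M, s) * s * (M - s))

M = 4  # arbitrary feature count
weights = [shapley_kernel_weight(M, s) for s in range(1, M)]
print(weights)  # near-empty and near-full coalitions get the largest finite weights
```

Fitting a linear regression on coalition samples under these weights yields coefficients that converge to the Shapley values.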

  • Census income classification with scikit-learn - Using the standard adult census income dataset, this notebook trains a k-nearest neighbors classifier using scikit-learn and then explains predictions using shap.

  • ImageNet VGG16 Model with Keras - Explain the classic VGG16 convolutional neural network's predictions for an image. This works by applying the model agnostic Kernel SHAP method to a super-pixel segmented image.

  • Iris classification - A basic demonstration using the popular iris species dataset. It explains predictions from six different models in scikit-learn using shap.

Documentation notebooks

These notebooks comprehensively demonstrate how to use specific functions and objects.

Methods Unified by SHAP

  1. LIME: Ribeiro, Marco Tulio, Sameer Singh, and Carlos Guestrin. "Why Should I Trust You?: Explaining the predictions of any classifier." Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM, 2016.

  2. Shapley sampling values: Štrumbelj, Erik, and Igor Kononenko. "Explaining prediction models and individual predictions with feature contributions." Knowledge and Information Systems 41.3 (2014): 647-665.

  3. DeepLIFT: Shrikumar, Avanti, Peyton Greenside, and Anshul Kundaje. "Learning important features through propagating activation differences." arXiv preprint arXiv:1704.02685 (2017).

  4. QII: Datta, Anupam, Shayak Sen, and Yair Zick. "Algorithmic transparency via quantitative input influence: Theory and experiments with learning systems." Security and Privacy (SP), 2016 IEEE Symposium on. IEEE, 2016.

  5. Layer-wise relevance propagation: Bach, Sebastian, et al. "On pixel-wise explanations for non-linear classifier decisions by layer-wise relevance propagation." PloS one 10.7 (2015): e0130140.

  6. Shapley regression values: Lipovetsky, Stan, and Michael Conklin. "Analysis of regression in game theory approach." Applied Stochastic Models in Business and Industry 17.4 (2001): 319-330.

  7. Tree interpreter: Saabas, Ando. Interpreting random forests. http://blog.datadive.net/interpreting-random-forests/

Citations

The algorithms and visualizations used in this package came primarily out of research in Su-In Lee's lab at the University of Washington, and Microsoft Research. If you use SHAP in your research we would appreciate a citation to the appropriate paper(s):


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

shap-0.43.0.tar.gz (389.1 kB): Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

  • shap-0.43.0-cp311-cp311-win_amd64.whl (447.3 kB): CPython 3.11, Windows x86-64
  • shap-0.43.0-cp311-cp311-musllinux_1_1_x86_64.whl (1.1 MB): CPython 3.11, musllinux: musl 1.1+ x86-64
  • shap-0.43.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (530.7 kB): CPython 3.11, manylinux: glibc 2.17+ ARM64
  • shap-0.43.0-cp311-cp311-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (532.9 kB): CPython 3.11, manylinux: glibc 2.12+/2.17+ x86-64
  • shap-0.43.0-cp311-cp311-macosx_11_0_arm64.whl (445.4 kB): CPython 3.11, macOS 11.0+ ARM64
  • shap-0.43.0-cp311-cp311-macosx_10_9_x86_64.whl (450.3 kB): CPython 3.11, macOS 10.9+ x86-64
  • shap-0.43.0-cp310-cp310-win_amd64.whl (447.3 kB): CPython 3.10, Windows x86-64
  • shap-0.43.0-cp310-cp310-musllinux_1_1_x86_64.whl (1.1 MB): CPython 3.10, musllinux: musl 1.1+ x86-64
  • shap-0.43.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (530.6 kB): CPython 3.10, manylinux: glibc 2.17+ ARM64
  • shap-0.43.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (532.9 kB): CPython 3.10, manylinux: glibc 2.12+/2.17+ x86-64
  • shap-0.43.0-cp310-cp310-macosx_11_0_arm64.whl (445.4 kB): CPython 3.10, macOS 11.0+ ARM64
  • shap-0.43.0-cp310-cp310-macosx_10_9_x86_64.whl (450.3 kB): CPython 3.10, macOS 10.9+ x86-64
  • shap-0.43.0-cp39-cp39-win_amd64.whl (447.3 kB): CPython 3.9, Windows x86-64
  • shap-0.43.0-cp39-cp39-musllinux_1_1_x86_64.whl (1.1 MB): CPython 3.9, musllinux: musl 1.1+ x86-64
  • shap-0.43.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (530.5 kB): CPython 3.9, manylinux: glibc 2.17+ ARM64
  • shap-0.43.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (532.7 kB): CPython 3.9, manylinux: glibc 2.12+/2.17+ x86-64
  • shap-0.43.0-cp39-cp39-macosx_11_0_arm64.whl (445.4 kB): CPython 3.9, macOS 11.0+ ARM64
  • shap-0.43.0-cp39-cp39-macosx_10_9_x86_64.whl (450.3 kB): CPython 3.9, macOS 10.9+ x86-64
  • shap-0.43.0-cp38-cp38-win_amd64.whl (447.3 kB): CPython 3.8, Windows x86-64
  • shap-0.43.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (533.6 kB): CPython 3.8, manylinux: glibc 2.17+ ARM64
  • shap-0.43.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (535.7 kB): CPython 3.8, manylinux: glibc 2.12+/2.17+ x86-64
  • shap-0.43.0-cp38-cp38-macosx_11_0_arm64.whl (445.4 kB): CPython 3.8, macOS 11.0+ ARM64
  • shap-0.43.0-cp38-cp38-macosx_10_9_x86_64.whl (450.3 kB): CPython 3.8, macOS 10.9+ x86-64

File details

Details for the file shap-0.43.0.tar.gz.

File metadata

  • Download URL: shap-0.43.0.tar.gz
  • Upload date:
  • Size: 389.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for shap-0.43.0.tar.gz
Algorithm Hash digest
SHA256 1eabe01444a24e181ef6a7c9593b4d7c7143eefaeb1fa4d97bd5d9fdc96c4c1e
MD5 ff8e99a5a79c7a9dcf56c678528ca68b
BLAKE2b-256 65334e8c9c800a10bb428787339a07fb05a8ccbfa6015f5423a849e6d2f8bacd

See more details on using hashes here.

