
A unified approach to explain the output of any machine learning model.


SHAP (SHapley Additive exPlanations) is a game theoretic approach to explain the output of any machine learning model. It connects optimal credit allocation with local explanations using the classic Shapley values from game theory and their related extensions (see papers for details and citations).
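
For reference, the classic Shapley value that SHAP builds on assigns to feature i the weighted average of its marginal contributions over all subsets S of the remaining features in N; here v is the coalition value function that the linked papers define for model explanations (this is the standard game-theory formula, not SHAP-specific notation):

\phi_i(v) = \sum_{S \subseteq N \setminus \{i\}} \frac{|S|!\,(|N|-|S|-1)!}{|N|!} \left( v(S \cup \{i\}) - v(S) \right)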

Install

SHAP can be installed from either PyPI or conda-forge:

pip install shap
or
conda install -c conda-forge shap

Tree ensemble example (XGBoost/LightGBM/CatBoost/scikit-learn/pyspark models)

While SHAP can explain the output of any machine learning model, we have developed a high-speed exact algorithm for tree ensemble methods (see our Nature MI paper). Fast C++ implementations are supported for XGBoost, LightGBM, CatBoost, scikit-learn and pyspark tree models:

import xgboost
import shap

# train an XGBoost model
X, y = shap.datasets.california()
model = xgboost.XGBRegressor().fit(X, y)

# explain the model's predictions using SHAP
# (same syntax works for LightGBM, CatBoost, scikit-learn, transformers, Spark, etc.)
explainer = shap.Explainer(model)
shap_values = explainer(X)

# visualize the first prediction's explanation
shap.plots.waterfall(shap_values[0])

The above explanation shows how each feature contributes to pushing the model output from the base value (the average model output over the training dataset we passed) to the final prediction. Features pushing the prediction higher are shown in red, and those pushing it lower are shown in blue. Another way to visualize the same explanation is a force plot (these are introduced in our Nature BME paper):

# visualize the first prediction's explanation with a force plot
shap.plots.force(shap_values[0])
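
Because SHAP values are additive, the base value plus the sum of a sample's SHAP values reconstructs that sample's prediction. A quick numeric sanity check of this property, assuming the XGBoost model and data from the example above:

import numpy as np

# the base value plus the SHAP values should match the model's raw prediction
reconstructed = shap_values[0].base_values + shap_values[0].values.sum()
print(np.isclose(reconstructed, model.predict(X.iloc[[0]])[0], atol=1e-3))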

If we take many force plot explanations such as the one shown above, rotate them 90 degrees, and then stack them horizontally, we can see explanations for an entire dataset (in the notebook this plot is interactive):

# visualize all the training set predictions
shap.plots.force(shap_values[:500])

To understand how a single feature affects the output of the model, we can plot the SHAP value of that feature vs. the value of the feature for all the examples in a dataset. Since SHAP values represent a feature's responsibility for a change in the model output, the plot below represents the change in predicted house price as the latitude changes. Vertical dispersion at a single value of latitude represents interaction effects with other features. To help reveal these interactions we can color by another feature. If we pass the whole explanation tensor to the color argument, the scatter plot will pick the best feature to color by. In this case it picks longitude.

# create a dependence scatter plot to show the effect of a single feature across the whole dataset
shap.plots.scatter(shap_values[:, "Latitude"], color=shap_values)

To get an overview of which features are most important for a model, we can plot the SHAP values of every feature for every sample. The plot below sorts features by the sum of SHAP value magnitudes over all samples, and uses SHAP values to show the distribution of the impacts each feature has on the model output. The color represents the feature value (red high, blue low). This reveals, for example, that a higher median income increases the predicted home price.

# summarize the effects of all the features
shap.plots.beeswarm(shap_values)

We can also just take the mean absolute value of the SHAP values for each feature to get a standard bar plot (produces stacked bars for multi-class outputs):

shap.plots.bar(shap_values)

Natural language example (transformers)

SHAP has specific support for natural language models like those in the Hugging Face transformers library. By adding coalitional rules to traditional Shapley values, we can form games that explain large modern NLP models using very few function evaluations. Using this functionality is as simple as passing a supported transformers pipeline to SHAP:

import transformers
import shap

# load a transformers pipeline model
# (newer transformers releases deprecate return_all_scores in favor of top_k=None)
model = transformers.pipeline('sentiment-analysis', return_all_scores=True)

# explain the model on two sample inputs
explainer = shap.Explainer(model)
shap_values = explainer(["What a great movie! ...if you have no taste."])

# visualize the first prediction's explanation for the POSITIVE output class
shap.plots.text(shap_values[0, :, "POSITIVE"])

Deep learning example with DeepExplainer (TensorFlow/Keras models)

Deep SHAP is a high-speed approximation algorithm for SHAP values in deep learning models that builds on a connection with DeepLIFT described in the SHAP NIPS paper. The implementation here differs from the original DeepLIFT by using a distribution of background samples instead of a single reference value, and by using Shapley equations to linearize components such as max, softmax, products, and divisions. Note that some of these enhancements have since been integrated into DeepLIFT itself. TensorFlow models and Keras models using the TensorFlow backend are supported (there is also preliminary support for PyTorch):

# ...include code from https://github.com/keras-team/keras/blob/master/examples/mnist_cnn.py

import shap
import numpy as np

# select a set of background examples to take an expectation over
background = x_train[np.random.choice(x_train.shape[0], 100, replace=False)]

# explain predictions of the model on four images
e = shap.DeepExplainer(model, background)
# ...or pass tensors directly
# e = shap.DeepExplainer((model.layers[0].input, model.layers[-1].output), background)
shap_values = e.shap_values(x_test[1:5])

# plot the feature attributions
shap.image_plot(shap_values, -x_test[1:5])

The plot above explains ten outputs (digits 0-9) for four different images. Red pixels increase the model's output while blue pixels decrease the output. The input images are shown on the left, and as nearly transparent grayscale backings behind each of the explanations. The sum of the SHAP values equals the difference between the expected model output (averaged over the background dataset) and the current model output. Note that for the 'zero' image the blank middle is important, while for the 'four' image the lack of a connection on top makes it a four instead of a nine.

Deep learning example with GradientExplainer (TensorFlow/Keras/PyTorch models)

Expected gradients combines ideas from Integrated Gradients, SHAP, and SmoothGrad into a single expected value equation. This allows an entire dataset to be used as the background distribution (as opposed to a single reference value) and allows local smoothing. If we approximate the model with a linear function between each background data sample and the current input to be explained, and we assume the input features are independent, then expected gradients will compute approximate SHAP values. In the example below we explain how the 7th intermediate layer of the VGG16 ImageNet model impacts the output probabilities.

from keras.applications.vgg16 import VGG16
from keras.applications.vgg16 import preprocess_input
import keras.backend as K  # note: this example relies on the TF1-style Keras session API (K.get_session())
import numpy as np
import json
import shap

# load pre-trained model and choose two images to explain
model = VGG16(weights='imagenet', include_top=True)
X,y = shap.datasets.imagenet50()
to_explain = X[[39,41]]

# load the ImageNet class names
url = "https://s3.amazonaws.com/deep-learning-models/image-models/imagenet_class_index.json"
fname = shap.datasets.cache(url)
with open(fname) as f:
    class_names = json.load(f)

# explain how the input to the 7th layer of the model explains the top two classes
def map2layer(x, layer):
    feed_dict = dict(zip([model.layers[0].input], [preprocess_input(x.copy())]))
    return K.get_session().run(model.layers[layer].input, feed_dict)
e = shap.GradientExplainer(
    (model.layers[7].input, model.layers[-1].output),
    map2layer(X, 7),
    local_smoothing=0 # std dev of smoothing noise
)
shap_values,indexes = e.shap_values(map2layer(to_explain, 7), ranked_outputs=2)

# get the names for the classes
index_names = np.vectorize(lambda x: class_names[str(x)][1])(indexes)

# plot the explanations
shap.image_plot(shap_values, to_explain, index_names)

Predictions for two input images are explained in the plot above. Red pixels represent positive SHAP values that increase the probability of the class, while blue pixels represent negative SHAP values that reduce the probability of the class. By using ranked_outputs=2 we explain only the two most likely classes for each input (this spares us from explaining all 1,000 classes).

Model agnostic example with KernelExplainer (explains any function)

Kernel SHAP uses a specially-weighted local linear regression to estimate SHAP values for any model. Below is a simple example for explaining a multi-class SVM on the classic iris dataset.

import sklearn.svm
import shap
from sklearn.model_selection import train_test_split

# print the JS visualization code to the notebook
shap.initjs()

# train a SVM classifier
X_train,X_test,Y_train,Y_test = train_test_split(*shap.datasets.iris(), test_size=0.2, random_state=0)
svm = sklearn.svm.SVC(kernel='rbf', probability=True)
svm.fit(X_train, Y_train)

# use Kernel SHAP to explain test set predictions
explainer = shap.KernelExplainer(svm.predict_proba, X_train, link="logit")
shap_values = explainer.shap_values(X_test, nsamples=100)

# plot the SHAP values for the Setosa output of the first instance
shap.force_plot(explainer.expected_value[0], shap_values[0][0,:], X_test.iloc[0,:], link="logit")

The above explanation shows four features each contributing to push the model output from the base value (the average model output over the training dataset we passed) towards zero. If there were any features pushing the class label higher, they would be shown in red.

If we take many explanations such as the one shown above, rotate them 90 degrees, and then stack them horizontally, we can see explanations for an entire dataset. This is exactly what we do below for all the examples in the iris test set:

# plot the SHAP values for the Setosa output of all instances
shap.force_plot(explainer.expected_value[0], shap_values[0], X_test, link="logit")

SHAP Interaction Values

SHAP interaction values are a generalization of SHAP values to higher-order interactions. Fast exact computation of pairwise interactions is implemented for tree models via shap.TreeExplainer(model).shap_interaction_values(X). This returns a matrix for every prediction, where the main effects are on the diagonal and the interaction effects are off-diagonal. These values often reveal interesting hidden relationships, such as how the increased risk of death peaks for men at age 60 (see the NHANES notebook for details).
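
As a minimal sketch, reusing the XGBoost model and the California housing data X from the tree ensemble example above (a row subset keeps the computation quick):

# compute pairwise SHAP interaction values for the tree model trained above
shap_interaction_values = shap.TreeExplainer(model).shap_interaction_values(X[:2000])

# one matrix per prediction: main effects on the diagonal, interactions off-diagonal
print(shap_interaction_values.shape)  # (n_samples, n_features, n_features)

# summarize the strongest pairwise interactions across the subset
shap.summary_plot(shap_interaction_values, X[:2000])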

Sample notebooks

The notebooks below demonstrate different use cases for SHAP. Look inside the notebooks directory of the repository if you want to try playing with the original notebooks yourself.

TreeExplainer

An implementation of Tree SHAP, a fast and exact algorithm to compute SHAP values for trees and ensembles of trees.

DeepExplainer

An implementation of Deep SHAP, a faster (but only approximate) algorithm to compute SHAP values for deep learning models that is based on connections between SHAP and the DeepLIFT algorithm.

GradientExplainer

An implementation of expected gradients to approximate SHAP values for deep learning models. It is based on connections between SHAP and the Integrated Gradients algorithm. GradientExplainer is slower than DeepExplainer and makes different approximation assumptions.

LinearExplainer

For a linear model with independent features we can analytically compute the exact SHAP values. We can also account for feature correlation if we are willing to estimate the feature covariance matrix. LinearExplainer supports both of these options.
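
Both options can be exercised with a short sketch like the following; this assumes a Ridge model on the California housing data, and the feature_perturbation option name is taken from the LinearExplainer documentation (verify it against your installed version):

import shap
from sklearn.linear_model import Ridge

# train a simple linear model on the California housing data
X, y = shap.datasets.california()
model = Ridge().fit(X, y)

# option 1: assume independent features (exact, analytic SHAP values)
explainer = shap.LinearExplainer(model, X)
shap_values = explainer.shap_values(X)

# option 2: estimate the feature covariance from the background data to
# account for correlated features (option name assumed from the docs)
explainer_corr = shap.LinearExplainer(model, X, feature_perturbation="correlation_dependent")
shap_values_corr = explainer_corr.shap_values(X)

# summarize the independent-features explanation
shap.summary_plot(shap_values, X)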

KernelExplainer

An implementation of Kernel SHAP, a model-agnostic method to estimate SHAP values for any model. Because it makes no assumptions about the model type, KernelExplainer is slower than the other model-type-specific algorithms.

  • Census income classification with scikit-learn - Using the standard adult census income dataset, this notebook trains a k-nearest neighbors classifier using scikit-learn and then explains predictions using shap.

  • ImageNet VGG16 Model with Keras - Explain the classic VGG16 convolutional neural network's predictions for an image. This works by applying the model agnostic Kernel SHAP method to a super-pixel segmented image.

  • Iris classification - A basic demonstration using the popular iris species dataset. It explains predictions from six different models in scikit-learn using shap.

Documentation notebooks

These notebooks comprehensively demonstrate how to use specific functions and objects.

Methods Unified by SHAP

  1. LIME: Ribeiro, Marco Tulio, Sameer Singh, and Carlos Guestrin. "Why should I trust you?: Explaining the predictions of any classifier." Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM, 2016.

  2. Shapley sampling values: Strumbelj, Erik, and Igor Kononenko. "Explaining prediction models and individual predictions with feature contributions." Knowledge and information systems 41.3 (2014): 647-665.

  3. DeepLIFT: Shrikumar, Avanti, Peyton Greenside, and Anshul Kundaje. "Learning important features through propagating activation differences." arXiv preprint arXiv:1704.02685 (2017).

  4. QII: Datta, Anupam, Shayak Sen, and Yair Zick. "Algorithmic transparency via quantitative input influence: Theory and experiments with learning systems." Security and Privacy (SP), 2016 IEEE Symposium on. IEEE, 2016.

  5. Layer-wise relevance propagation: Bach, Sebastian, et al. "On pixel-wise explanations for non-linear classifier decisions by layer-wise relevance propagation." PloS one 10.7 (2015): e0130140.

  6. Shapley regression values: Lipovetsky, Stan, and Michael Conklin. "Analysis of regression in game theory approach." Applied Stochastic Models in Business and Industry 17.4 (2001): 319-330.

  7. Tree interpreter: Saabas, Ando. Interpreting random forests. http://blog.datadive.net/interpreting-random-forests/

Citations

The algorithms and visualizations used in this package came primarily out of research in Su-In Lee's lab at the University of Washington and at Microsoft Research. If you use SHAP in your research, we would appreciate a citation to the appropriate paper(s).
