
An introduction to explainable AI with Shapley values — SHAP latest documentation
We will take a practical, hands-on approach, using the shap Python package to explain progressively more complex models. This is a living document that serves as an introduction to the shap Python package.
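The Shapley value underneath all of this has a compact exact definition: a feature's attribution is its marginal contribution to the model output, averaged over all subsets of the other features with the classic combinatorial weights. A self-contained sketch (toy value function, exponential-time enumeration, fine for a handful of features; names here are illustrative, not the shap library's internals):

```python
from itertools import combinations
from math import factorial

def shapley_values(value, players):
    """Exact Shapley values for a coalition value function.

    value: maps a frozenset of players to a number; value(frozenset())
    is the baseline with every feature absent.
    """
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for k in range(n):
            for subset in combinations(others, k):
                S = frozenset(subset)
                # Classic Shapley weight: |S|! (n - |S| - 1)! / n!
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                total += weight * (value(S | {i}) - value(S))
        phi[i] = total
    return phi

# Toy "model": f(x) = 3*x1 + 2*x2 at x = (1, 1); absent features drop to 0.
x = {"x1": 1.0, "x2": 1.0}
coef = {"x1": 3.0, "x2": 2.0}

def value(S):
    return sum(coef[p] * x[p] for p in S)

print(shapley_values(value, list(x)))  # → {'x1': 3.0, 'x2': 2.0}
```

For a linear value function the Shapley values collapse to each feature's own term, which is why the answer is exactly the coefficients here; the enumeration only becomes interesting (and expensive) once features interact.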
shap.Explainer — SHAP latest documentation
This is the primary explainer interface for the SHAP library. It takes any combination of a model and masker and returns a callable subclass object that implements the chosen estimation algorithm.
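The calling convention described above — construct an explainer from a model plus background data, then call it on inputs to get an explanation object — can be sketched in pure NumPy. This is a minimal stand-in, not shap's implementation: `Explanation` and `LinearExplainer` below are illustrative names, and the closed form `coef * (x - background mean)` is the exact SHAP value only for a linear model with independent features.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Explanation:
    """Illustrative stand-in for the object an explainer call returns."""
    values: np.ndarray   # per-feature attributions, one row per sample
    base_values: float   # expected model output over the background data
    data: np.ndarray     # the inputs being explained

class LinearExplainer:
    """Callable-explainer sketch for f(x) = coef @ x + b. With independent
    features the exact SHAP values are coef * (x - background.mean(0))."""
    def __init__(self, coef, intercept, background):
        self.coef = np.asarray(coef, dtype=float)
        self.intercept = float(intercept)
        self.mean = np.asarray(background, dtype=float).mean(axis=0)

    def __call__(self, X):
        X = np.asarray(X, dtype=float)
        return Explanation(
            values=(X - self.mean) * self.coef,
            base_values=float(self.coef @ self.mean + self.intercept),
            data=X,
        )

rng = np.random.default_rng(0)
background = rng.normal(size=(100, 3))
explainer = LinearExplainer([1.0, -2.0, 0.5], 0.1, background)
exp = explainer(np.array([[1.0, 1.0, 1.0]]))
# Base value plus attributions recovers the model output: 1 - 2 + 0.5 + 0.1
assert np.isclose(exp.base_values + exp.values.sum(), -0.4)
```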
decision plot — SHAP latest documentation
SHAP decision plots show how complex models arrive at their predictions (i.e., how models make decisions). This notebook illustrates decision plot features and use cases with examples.
shap.plots.force — SHAP latest documentation
The base value is the reference that the feature contributions start from; for SHAP values, it should be the value of explainer.expected_value. However, it is recommended to pass in a SHAP Explanation object instead (in which case shap_values is not needed).
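The relationship the force plot visualizes can be checked numerically: the base value is the mean model output over the background data, and the per-feature values push from there to the prediction. A small sketch under simplifying assumptions (a linear model with independent features, so the SHAP values have a closed form; no call into the shap library itself):

```python
import numpy as np

rng = np.random.default_rng(1)
background = rng.normal(size=(200, 2))
coef = np.array([2.0, -1.0])

def model(X):
    return X @ coef

# What explainer.expected_value holds: the mean model output over the
# background data. This is the value a force plot starts from.
expected_value = model(background).mean()

x = np.array([0.5, -0.5])
# Exact SHAP values for a linear model with independent features
# (closed form used for illustration):
shap_values = coef * (x - background.mean(axis=0))

# The force plot draws each value pushing the output away from the base
# value; together they land exactly on the model's prediction for x.
assert np.isclose(expected_value + shap_values.sum(), model(x))
```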
Image examples — SHAP latest documentation
These examples explain machine learning models applied to image data, covering topics such as image classification. They are all generated from Jupyter notebooks available on GitHub.
Explaining quantitative measures of fairness — SHAP latest documentation
By using SHAP (a popular explainable AI tool) we can decompose measures of fairness and allocate responsibility for any observed disparity among the model's input features.
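The decomposition works because SHAP values sum to the model output: any between-group gap in the average prediction splits exactly into per-feature gaps in average SHAP values. A sketch under simplifying assumptions (synthetic data, a linear model whose SHAP values have a closed form; not the shap library's own fairness tooling):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
group = rng.integers(0, 2, size=n)  # 0/1 protected attribute
# Feature 1 correlates with group membership; feature 2 does not.
X = np.column_stack([rng.normal(size=n) + 0.8 * group,
                     rng.normal(size=n)])
coef = np.array([1.5, 0.7])
pred = X @ coef

# Closed-form SHAP values for this linear model (illustrative):
shap_vals = (X - X.mean(axis=0)) * coef

# Disparity in the average prediction between the two groups...
disparity = pred[group == 1].mean() - pred[group == 0].mean()
# ...decomposes exactly into per-feature SHAP disparities, attributing
# the gap to the feature that actually correlates with group membership.
per_feature = shap_vals[group == 1].mean(axis=0) - shap_vals[group == 0].mean(axis=0)

assert np.isclose(disparity, per_feature.sum())
```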
shap.DeepExplainer — SHAP latest documentation
Meant to approximate SHAP values for deep learning models. This is an enhanced version of the DeepLIFT algorithm (Deep SHAP) where, similar to Kernel SHAP, we approximate the conditional expectations of SHAP values using a selection of background samples.
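The DeepLIFT idea underneath Deep SHAP can be shown on a single hidden layer: each unit gets a multiplier (change in activation per unit change in pre-activation, relative to a baseline input), multipliers are chained back to the inputs, and the resulting attributions satisfy completeness. This is a simplified one-layer illustration of the rescale rule, not the full Deep SHAP algorithm, and all names are illustrative:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def deeplift_rescale(x, baseline, W1, b1, w2):
    """Attributions for f(x) = w2 @ relu(W1 @ x + b1) via the DeepLIFT
    rescale rule (simplified single-hidden-layer sketch)."""
    z, z0 = W1 @ x + b1, W1 @ baseline + b1
    dz = z - z0
    safe = np.where(np.abs(dz) > 1e-9, dz, 1.0)
    # Multiplier: change in activation per unit change in pre-activation,
    # falling back to the local derivative where the change is ~0.
    m = np.where(np.abs(dz) > 1e-9,
                 (relu(z) - relu(z0)) / safe,
                 (z > 0).astype(float))
    # Chain multipliers back to each input's difference from the baseline.
    return ((w2 * m) @ W1) * (x - baseline)

rng = np.random.default_rng(3)
W1, b1, w2 = rng.normal(size=(4, 3)), rng.normal(size=4), rng.normal(size=4)
x, baseline = rng.normal(size=3), np.zeros(3)

def f(v):
    return w2 @ relu(W1 @ v + b1)

attributions = deeplift_rescale(x, baseline, W1, b1, w2)
# Completeness: attributions sum to the change in model output.
assert np.isclose(attributions.sum(), f(x) - f(baseline))
```

Deep SHAP goes further by averaging such baseline-relative attributions over a distribution of background samples rather than a single reference input.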
Release notes — SHAP latest documentation
Nov 11, 2025: This release incorporates many changes that were originally contributed by the SHAP community via @dsgibbons's Community Fork, which has now been merged into the main shap repository.
Tabular examples — SHAP latest documentation
Notebooks include: loading a custom tree model into SHAP, explaining a simple OR function, explaining the loss of a tree model, fitting a linear simulation with XGBoost, force plot colors, and the front page example, among others.
shap.plots.waterfall — SHAP latest documentation
The waterfall plot is designed to visually display how the SHAP values (evidence) of each feature move the model output from our prior expectation under the background data distribution to the final model prediction.
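Numerically, that walk is just a running sum: start at the expected value and add one feature's SHAP value at a time until the model's prediction is reached. A sketch with illustrative numbers (not a call into shap.plots.waterfall):

```python
import numpy as np

# Illustrative inputs a waterfall plot would draw for one sample:
expected_value = 0.25                        # prior expectation under the background data
shap_values = np.array([0.40, -0.15, 0.10])  # each feature's evidence

# Each bar starts where the previous one ended: a running sum walking
# from the base value to the final model output.
positions = expected_value + np.cumsum(shap_values)
print(positions)

# The last position is the model's prediction for this sample:
# 0.25 + 0.40 - 0.15 + 0.10 = 0.60.
assert np.isclose(positions[-1], expected_value + shap_values.sum())
```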