public interface SHAP<T>
SHAP (SHapley Additive exPlanations) leverages local methods designed to explain a prediction f(x) based on a single input x. The local methods are defined as any interpretable approximation of the original model. In particular, SHAP employs additive feature attribution methods.
SHAP values attribute to each feature the change in the expected model prediction when conditioning on that feature. They explain how to get from the base value E[f(z)], which would be predicted if we did not know any features, to the current output f(x).
In game theory, the Shapley value is the average expected marginal contribution of one player after all possible combinations have been considered.
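The Shapley value definition above can be sketched in plain Java. The 3-player characteristic function v below is entirely hypothetical, chosen only to illustrate the weighted average of marginal contributions v(S ∪ {i}) − v(S) over all coalitions S that exclude player i.

```java
import java.util.ArrayList;
import java.util.List;

public class ShapleyDemo {
    // Hypothetical characteristic function v(S): singletons are worth 0,
    // any pair is worth 60, and the grand coalition is worth 100.
    static double v(List<Integer> s) {
        if (s.size() <= 1) return 0.0;
        if (s.size() == 2) return 60.0;
        return 100.0;
    }

    static double factorial(int n) {
        double f = 1;
        for (int i = 2; i <= n; i++) f *= i;
        return f;
    }

    // Shapley value of player i in an n-player game: the marginal
    // contribution v(S ∪ {i}) - v(S), averaged over all coalitions S
    // not containing i with the weight |S|! (n - |S| - 1)! / n!.
    static double shapley(int i, int n) {
        double phi = 0.0;
        for (int mask = 0; mask < (1 << n); mask++) {
            if ((mask & (1 << i)) != 0) continue; // S must exclude player i
            List<Integer> s = new ArrayList<>();
            for (int j = 0; j < n; j++) if ((mask & (1 << j)) != 0) s.add(j);
            List<Integer> sWithI = new ArrayList<>(s);
            sWithI.add(i);
            double weight = factorial(s.size()) * factorial(n - s.size() - 1) / factorial(n);
            phi += weight * (v(sWithI) - v(s));
        }
        return phi;
    }

    public static void main(String[] args) {
        int n = 3;
        double total = 0;
        for (int i = 0; i < n; i++) {
            double phi = shapley(i, n);
            total += phi;
            System.out.printf("player %d: %.2f%n", i, phi);
        }
        System.out.printf("sum: %.2f%n", total);
    }
}
```

By symmetry each player receives 100/3, and the three values sum to v of the grand coalition (the efficiency property), mirroring how SHAP values sum to the difference between f(x) and the base value E[f(z)].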
double[] shap(T x)
Returns the SHAP values as an array of length p x k, where p is the number of features and k is the number of classes. The first k elements are the SHAP values of the first feature over the k classes, respectively; the remaining features follow accordingly.
x - an instance.
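The feature-major p x k layout described above can be indexed as sketched below. The array contents and the dimensions p = 2, k = 3 are hypothetical; only the indexing scheme comes from the description.

```java
public class ShapLayoutDemo {
    // In the feature-major layout described above, the first k entries
    // belong to feature 0, the next k to feature 1, and so on, so the
    // SHAP value of a given feature for a given class sits at
    // feature * k + cls.
    public static double shapOf(double[] shap, int feature, int cls, int k) {
        return shap[feature * k + cls];
    }

    public static void main(String[] args) {
        int k = 3; // hypothetical: 3 classes
        double[] shap = {0.1, -0.2, 0.1,   // feature 0 over classes 0..2
                         0.4,  0.0, -0.4}; // feature 1 over classes 0..2
        // SHAP value of feature 1 for class 0:
        System.out.println(shapOf(shap, 1, 0, k)); // prints 0.4
    }
}
```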
default double[] shap(java.util.stream.Stream<T> data)
Returns the average of absolute SHAP values over a data set.
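A Stream-based aggregation over a data set could look like the sketch below. It is a minimal illustration, assuming the per-instance shap method returns a double[] of per-feature values; the toy explainer here simply echoes its input and stands in for a real model.

```java
import java.util.stream.Stream;

public class ShapAverageDemo {
    // Toy stand-in for a real explainer: treats the instance itself as
    // its per-feature SHAP values. A real implementation would delegate
    // to the model being explained.
    static double[] shap(double[] x) {
        return x;
    }

    // Averages the absolute per-instance SHAP values over a data set,
    // feature by feature, yielding a global importance score per feature.
    static double[] averageAbsShap(Stream<double[]> data) {
        double[][] rows = data.map(ShapAverageDemo::shap).toArray(double[][]::new);
        double[] mean = new double[rows[0].length];
        for (double[] row : rows)
            for (int i = 0; i < row.length; i++)
                mean[i] += Math.abs(row[i]);
        for (int i = 0; i < mean.length; i++)
            mean[i] /= rows.length;
        return mean;
    }

    public static void main(String[] args) {
        Stream<double[]> data = Stream.of(
            new double[]{ 1.0, -2.0},
            new double[]{-3.0,  0.0});
        double[] avg = averageAbsShap(data);
        System.out.printf("%.1f %.1f%n", avg[0], avg[1]); // mean |SHAP| per feature
    }
}
```

Taking absolute values before averaging keeps positive and negative attributions from cancelling, which is why mean |SHAP| is a common global feature-importance summary.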