
Towards Efficient Shapley Value Estimation via Cross-contribution Maximization


guanchuwang/SHEAR


Accelerating Shapley Explanation via Contributive Cooperator Selection

Research Motivation

Explaining the behavior of deep neural networks (DNNs) is important not only for practical requirements but also for regulatory compliance in many domains. Among the state-of-the-art techniques for DNN interpretation, the Shapley value provides a natural and effective explanation from the perspective of cooperative game theory. However, computing exact Shapley values is known to be NP-hard, with extremely high computational complexity. To address this problem, this repo provides SHEAR for efficient Shapley value estimation.

Research Challenges

The brute-force algorithm for computing exact Shapley values requires enumerating all possible feature coalitions, so its complexity grows exponentially with the number of features. To address this issue, SHEAR involves only a few feature coalitions in the estimation. In this way, enumeration over the whole feature space is avoided, and computational efficiency is significantly improved.
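To illustrate why brute-force enumeration is infeasible, here is a minimal sketch (not the repo's code) of exact Shapley computation for a toy value function: each of the M features averages marginal contributions over all 2^(M-1) coalitions of the remaining features.

```python
# Brute-force exact Shapley values for a toy cooperative game.
# For M players, each player's value averages over all 2^(M-1)
# coalitions of the other players, hence exponential cost in M.
from itertools import combinations
from math import factorial

def exact_shapley(v, M):
    """Exact Shapley value of each of M players for value function v,
    where v maps a frozenset of player indices to a real number."""
    phi = [0.0] * M
    for i in range(M):
        others = [j for j in range(M) if j != i]
        for r in range(len(others) + 1):
            for S in combinations(others, r):
                S = frozenset(S)
                # Shapley weight |S|! (M - |S| - 1)! / M!
                weight = factorial(len(S)) * factorial(M - len(S) - 1) / factorial(M)
                phi[i] += weight * (v(S | {i}) - v(S))
    return phi

# Toy additive game: v(S) is the sum of per-feature payoffs in S,
# so the Shapley values recover the payoffs exactly.
payoff = [1.0, 2.0, 3.0]
print(exact_shapley(lambda S: sum(payoff[j] for j in S), 3))  # → [1.0, 2.0, 3.0]
```

Even for this exact procedure with M = 20 features, each feature already requires evaluating over one million coalitions, which motivates SHEAR's selective strategy.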

SHEAR Framework

As shown in the following figure, given a DNN model f and a feature vector x = [x1, ..., xM], SHEAR estimates the contribution of each feature independently. Specifically, for each feature, SHEAR first computes its cross-contributions with the other features; it then greedily selects the contributive cooperators that maximize the cumulative cross-contribution; finally, it estimates the feature's contribution over the coalitions of the selected cooperators.
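The cooperator-selection step can be sketched as follows. The function name, the toy scores, and the simplifying assumption that cross-contributions are nonnegative and additive are ours, not the repo's; under that assumption, greedy selection reduces to picking the n largest scores for the target feature.

```python
# Hedged sketch of greedy contributive-cooperator selection
# (illustrative only; the repo estimates cross-contributions from gradients).
import numpy as np

def select_cooperators(cross_contrib, i, n):
    """Greedily select n cooperators for feature i.

    cross_contrib: (M, M) array; cross_contrib[i, j] is an assumed
    nonnegative, additive cross-contribution between features i and j.
    """
    M = cross_contrib.shape[0]
    candidates = [j for j in range(M) if j != i]
    selected = []
    for _ in range(n):
        # Pick the remaining candidate with the largest marginal gain
        # in cumulative cross-contribution.
        best = max(candidates, key=lambda j: cross_contrib[i, j])
        selected.append(best)
        candidates.remove(best)
    return selected

# Toy symmetric cross-contribution matrix for M = 4 features.
C = np.array([[0.0, 0.9, 0.1, 0.5],
              [0.9, 0.0, 0.2, 0.3],
              [0.1, 0.2, 0.0, 0.4],
              [0.5, 0.3, 0.4, 0.0]])
print(select_cooperators(C, i=0, n=2))  # → [1, 3]
```

The feature's contribution is then estimated only over coalitions drawn from the selected cooperators, rather than over all M - 1 other features.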

Time Complexity of SHEAR

SHEAR performs one model backward pass to calculate the gradient for cross-contribution estimation, and N model forward passes for feature contribution estimation. Hence, the time cost of estimating a single feature's contribution is TSHEAR ≈ tbackward + N tforward. When interpreting a model f with M features, this repo processes the M features consecutively, so the overall time cost is M times that of a single feature. The structure can be improved via parallel processing of the M features.
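The time model above can be written out directly. The timing numbers below are placeholders for illustration, not measurements from this repo.

```python
# Back-of-envelope cost model for SHEAR: one backward pass plus
# N forward passes per feature, times M features when processed
# consecutively (timings are illustrative placeholders).
def shear_time(t_backward, t_forward, N, M):
    per_feature = t_backward + N * t_forward  # T_SHEAR for one feature
    return M * per_feature                    # sequential over M features

print(shear_time(t_backward=2e-3, t_forward=1e-3, N=8, M=13))
```

Parallelizing over the M features would reduce the wall-clock cost back toward the single-feature term.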

Dependency

numpy >= 1.19.5
torch >= 1.10.0
pandas >= 1.1.5
scipy >= 1.5.4

Train a classifier on the Adult dataset

python train_adult.py

Benchmark the ground-truth (GT) Shapley values for the Adult dataset

python benchmark_shap_adult.py

Benchmark the feature cross-contribution for SHEAR

python grad_benchmark_adult.py

Run SHEAR and Baseline Methods:

cd exp_adult
bash run_exp_eff_shap.sh     # SHEAR
bash run_exp_ks.sh           # Kernel-SHAP
bash run_exp_softks.sh       # Kernel-SHAP with Welford algorithm
bash run_exp_ks_pair.sh      # Kernel-SHAP with Pair Sampling
bash run_exp_permutation.sh  # (Antithetical) Permutation Sampling
cd ../

Evaluate SHEAR and Baseline Methods:

cd exp_adult
python err_plot.py
python monotonicity_plot.py
python run_time_plot.py
cd ../

Illustrate the feature contribution generated by SHEAR

cd exp_adult
python interpret_plot.py
cd ../

Reproduce our experimental results on the Adult dataset:

Absolute error of the estimated Shapley values, accuracy of the feature-importance ranking, and algorithmic throughput of SHEAR and the baseline methods:

Illustration of Model Interpretation:

Acknowledgement

We gratefully acknowledge Meta Platforms, Inc. for their contributions to this project.

Cite this Work:

If you find this work useful, please cite:

@article{wang2022accelerating,
  title={Accelerating Shapley Explanation via Contributive Cooperator Selection},
  author={Wang*, Guanchu and Chuang*, Yu-Neng and Du, Mengnan and Yang, Fan and Zhou, Quan and Tripathi, Pushkar and Cai, Xuanting and Hu, Xia},
  journal={arXiv preprint arXiv:2206.08529},
  year={2022}
}
