Saliency Evaluation
Python implementation of the explanation evaluation metrics presented in "On the (In)fidelity and Sensitivity of Explanations" (NeurIPS 2019), which can be applied to any saliency explanation.
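For intuition, the two metrics can be estimated by Monte Carlo: infidelity compares the change in the model output under a random perturbation I with the change predicted by the explanation (I^T phi), and max-sensitivity measures how much the explanation itself moves under small input perturbations. The sketch below is a minimal illustration of those definitions, not the repository's implementation; `model` and `explainer` are hypothetical callables, and the Gaussian/uniform perturbation distributions are assumptions.

```python
# Minimal sketch of the two metrics (illustrative only).
# `model(x)` is assumed to return a scalar score; `explainer(x)` an attribution map.
import numpy as np

def infidelity(model, explainer, x, n_samples=100, noise_scale=0.1):
    """Monte Carlo estimate of E_I[(I^T phi(x) - (f(x) - f(x - I)))^2]."""
    phi = explainer(x).ravel()          # saliency map for input x
    fx = model(x)                       # model output at x
    errs = []
    for _ in range(n_samples):
        I = np.random.normal(0.0, noise_scale, size=x.shape)   # random perturbation
        pred_drop = fx - model(x - I)                           # actual change in output
        attr_drop = I.ravel() @ phi                             # change predicted by the explanation
        errs.append((attr_drop - pred_drop) ** 2)
    return float(np.mean(errs))

def max_sensitivity(explainer, x, radius=0.1, n_samples=50):
    """Monte Carlo lower bound on max_{||d|| <= r} ||phi(x + d) - phi(x)||."""
    phi = explainer(x).ravel()
    worst = 0.0
    for _ in range(n_samples):
        d = np.random.uniform(-radius, radius, size=x.shape)    # small perturbation of x
        worst = max(worst, float(np.linalg.norm(explainer(x + d).ravel() - phi)))
    return worst
```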
Get Started
Run vis_mnist.ipynb to see example explanations on MNIST along with their sensitivity and infidelity scores.
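As a quick sanity check independent of the notebook, the sketch above can be exercised on a toy linear model, where the gradient saliency is exact and both metrics should be (near) zero:

```python
# Toy usage of the sketch functions; purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=(784,))                   # weights of a toy linear "classifier"
model = lambda x: float(w @ x.ravel())        # scalar score
explainer = lambda x: w.copy()                # gradient saliency of a linear model is w

x = rng.normal(size=(784,))
print("infidelity:", infidelity(model, explainer, x))       # ~0: explanation matches the model exactly
print("max-sensitivity:", max_sensitivity(explainer, x))    # 0: explanation is constant in x
```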
Acknowledgements
Our visualization tools build on code from the following repositories:
- https://github.com/PAIR-code/saliency
- https://github.com/marcoancona/DeepExplain