
Saliency Evaluation

Python implementation of the evaluation metrics presented in "On the (In)fidelity and Sensitivity of Explanations" (NeurIPS 2019), applicable to any saliency explanation.

Get Started

Run vis_mnist.ipynb to see examples of explanations in MNIST along with their sensitivity and infidelity.
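The notebook reports the two metrics from the paper: infidelity, the expected squared gap between the explanation's dot product with a perturbation and the resulting change in model output, and max-sensitivity, the largest change in the explanation under small input perturbations. The sketch below is a minimal Monte Carlo estimate of both, not the repository's implementation; the function names, Gaussian/uniform perturbation choices, and sample counts are illustrative assumptions.

```python
import numpy as np

def infidelity(explanation, f, x, n_samples=1000, noise_std=0.1, rng=None):
    """Monte Carlo estimate of E_I[(I . Phi(x) - (f(x) - f(x - I)))^2]
    with Gaussian perturbations I (one illustrative choice of I)."""
    rng = np.random.default_rng(rng)
    fx = f(x)
    errs = []
    for _ in range(n_samples):
        I = rng.normal(0.0, noise_std, size=x.shape)
        errs.append((I.ravel() @ explanation.ravel() - (fx - f(x - I))) ** 2)
    return float(np.mean(errs))

def max_sensitivity(explain, x, radius=0.1, n_samples=50, rng=None):
    """Monte Carlo estimate of max-sensitivity: the largest norm of
    Phi(x + delta) - Phi(x) over sampled perturbations with ||delta|| small."""
    rng = np.random.default_rng(rng)
    base = explain(x)
    worst = 0.0
    for _ in range(n_samples):
        delta = rng.uniform(-radius, radius, size=x.shape)
        worst = max(worst, float(np.linalg.norm(explain(x + delta) - base)))
    return worst
```

As a sanity check, for a linear model f(x) = w·x the gradient explanation is w itself, so I·Φ(x) equals f(x) - f(x - I) exactly and the estimated infidelity is zero; a constant explanation likewise has zero max-sensitivity.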

Acknowledgements

We build our visualization tools on code available in the following repositories:

  1. https://github.com/PAIR-code/saliency
  2. https://github.com/marcoancona/DeepExplain