shap
Question - Deep and Gradient Explainers with PyTorch and Embeddings
Hello, I am trying to use SHAP with a PyTorch model that uses embeddings, and I was wondering whether there are any suggestions or examples for how to accomplish this. I found a couple of issues from 2019/2020 (#530 and #1039) that address the limitation with PyTorch embeddings: `nn.Embedding` requires long-integer inputs, while gradients are only supported for float tensors. Have there been any developments on this front?
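For context, here is a minimal sketch of the kind of mismatch I mean (the model names `TinyModel`/`Head` are hypothetical, just for illustration): the embedding layer consumes long token ids, so a gradient-based explainer cannot differentiate with respect to the raw inputs, but splitting the model after the embedding layer exposes a float tensor that is differentiable.

```python
import torch
import torch.nn as nn

# Hypothetical minimal model: nn.Embedding requires long (integer) token
# ids, but gradient-based explainers need float inputs to differentiate.
class TinyModel(nn.Module):
    def __init__(self, vocab_size=10, emb_dim=4):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.fc = nn.Linear(emb_dim, 1)

    def forward(self, token_ids):  # token_ids: LongTensor of shape (B, T)
        return self.fc(self.emb(token_ids).mean(dim=1))

# One workaround sketched in the older issues: split the model after the
# embedding layer and explain the float embedding outputs instead of the
# raw token ids.
class Head(nn.Module):
    def __init__(self, base):
        super().__init__()
        self.base = base

    def forward(self, embedded):  # embedded: FloatTensor of shape (B, T, E)
        return self.base.fc(embedded.mean(dim=1))

model = TinyModel()
ids = torch.randint(0, 10, (2, 5))
embedded = model.emb(ids)  # float tensor, differentiable
# The split head reproduces the full model's output exactly.
assert torch.allclose(model(ids), Head(model)(embedded))
```

In principle one could then pass `Head(model)` and a background batch of embedded tensors to a gradient-based explainer, attributing to embedding dimensions rather than token ids, though I am not sure this is the recommended approach.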
Following the lead in #1039, I also looked at the Kernel and Permutation explainers, but my model does not really fit the tabular-data setting, since it uses an encoder/decoder structure with varying sequence lengths.
Thank you!