Mingyu Wang
You could try https://github.com/WAMAWAMA/trans_attention_vis, but a more novel way to visualize the attention maps is:
https://github.com/hila-chefer/Transformer-Explainability
https://github.com/hila-chefer/Transformer-MM-Explainability
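As a rough illustration of the kind of computation such tools perform, here is a minimal attention-rollout sketch (the technique of Abnar & Zuidema, which the linked repositories build on and extend). This is an assumption-laden sketch, not code from those repositories: the function name, the synthetic attention tensors, and the layer/head counts are all illustrative.

```python
import numpy as np

def attention_rollout(attentions, residual=True):
    """Attention rollout: multiply head-averaged attention matrices
    across layers, optionally mixing in the residual connection.
    `attentions` is a list of (heads, tokens, tokens) arrays, one per layer.
    Illustrative sketch, not the API of the repos linked above."""
    n_tokens = attentions[0].shape[-1]
    rollout = np.eye(n_tokens)
    for attn in attentions:
        a = attn.mean(axis=0)                      # average over heads
        if residual:
            a = 0.5 * a + 0.5 * np.eye(n_tokens)   # account for skip connection
        a = a / a.sum(axis=-1, keepdims=True)      # keep rows row-stochastic
        rollout = a @ rollout                      # compose layer by layer
    return rollout

# Synthetic example: 4 layers, 3 heads, 5 tokens (all made up for the demo).
rng = np.random.default_rng(0)
attns = [rng.random((3, 5, 5)) for _ in range(4)]
attns = [a / a.sum(axis=-1, keepdims=True) for a in attns]

r = attention_rollout(attns)
# The row for a chosen query token (e.g. a CLS-like token at index 0)
# gives a per-token importance map you can reshape and plot over the input.
cls_map = r[0, 1:]
```

In a real ViT you would collect the per-layer attention tensors with forward hooks and reshape the resulting row into the image patch grid before overlaying it; the gradient-weighted relevance method in Transformer-Explainability refines this basic rollout.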
It's probably a problem with your path; I'd suggest checking it carefully.