How to visualize the attention map
I am attempting to visualize results, which is mostly handled by main.visualize(). However, the code that retrieves the attention map has been commented out and replaced with np.zeros.
My general question is: what is the intuition behind the commented-out code? Some specifics:
- What is i_datum?
- What is mod_layout_choice?
- Why is att_blob_name created the way it is?
Understanding this would be helpful, as we are also attempting to connect an additional model to the final attention map, before the softmax activation. Thanks.
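For what it's worth, once the raw attention scores are recovered from the network, turning them into an overlayable heatmap is straightforward. Below is a minimal sketch of that last step only; it assumes the attention is a square grid of pre-softmax logits (e.g. 14x14) and that the image side is divisible by the grid side. The function name `attention_heatmap` and the shapes are my own, not from the repo:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the flattened attention grid
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_heatmap(att_logits, image_hw):
    """Upsample a (g, g) pre-softmax attention grid to image size.

    att_logits: raw attention scores, shape (g, g)
    image_hw:   (H, W) of the image to overlay on; assumed divisible by g
    """
    g = att_logits.shape[0]
    att = softmax(att_logits.ravel()).reshape(g, g)
    h, w = image_hw
    # nearest-neighbor upsampling via np.kron: each grid cell becomes an
    # (h//g, w//g) block of constant attention weight
    return np.kron(att, np.ones((h // g, w // g)))

# hypothetical example: a 14x14 attention grid over a 224x224 image
logits = np.random.randn(14, 14)
heat = attention_heatmap(logits, (224, 224))
```

The resulting `heat` array can then be blended over the input image, e.g. with matplotlib's `imshow` using an `alpha` overlay.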
Same question here. When I uncomment the visualize lines, it fails with this error:
Traceback (most recent call last):
File "main.py", line 260, in
I have no idea where numbers such as "101" come from. It seems that "Find_101_softmax" is not in the layer list.
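A guess, since I haven't traced this code either: att_blob_name looks like it is built by string formatting from a module name, a numeric index, and an activation suffix, so the error would mean the lookup constructs a name for a layer instance that was never actually created for the current batch (the index refers to a different layout/datum than the one the network was assembled from). A hypothetical reconstruction of that failure mode, with made-up names:

```python
# Hypothetical naming scheme: "<module>_<index>_<suffix>". The index might be
# i_datum or something derived from mod_layout_choice; this is an assumption,
# not the repo's actual logic.
def att_blob_name(module, index, suffix="softmax"):
    return "%s_%d_%s" % (module, index, suffix)

# made-up list of layers actually instantiated for a batch
instantiated = {"Find_3_softmax", "Find_7_softmax"}

name = att_blob_name("Find", 101)     # -> "Find_101_softmax"
missing = name not in instantiated    # True: same symptom as the traceback
```

If that guess is right, printing the network's actual layer/blob names right before the lookup and comparing them against the constructed name should show which index the live layers use.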