Jesse Vig
Good idea, thanks for the suggestion.
Hi, thanks for the suggestion and question. I don't currently have a suggestion on how to do that, but I agree it would be a nice feature.
Thanks for your feedback. Agreed, it would be nice to have such a feature rather than taking a screenshot.
Hi @abhik-99, unfortunately still not available, but your feedback is duly noted. Thanks.
Thanks @Serbernari. I had removed it to simplify the repo and reduce the maintenance requirements. But I can see that it would be helpful to have notebooks for specific models...
This feature isn't yet available unfortunately, but I hope to add it at some point. Are you using the attention-head view?
Hi @tom68-ll thanks for your kind words, and for the question. Unfortunately we don't support this feature yet. So I believe the best you can do currently is a screen...
Hi @cgray1117, sorry for the troubles. Can you verify that the `attention` parameter is of this format: a list of ``torch.FloatTensor`` (one for each layer) of shape ``(batch_size (must be 1), num_heads, sequence_length,...
Hmm, it looks good on the surface. What is the output of that last print statement?
Hi @cgray1117 sorry for the delayed response. The attention returned from a TF model is a TF tensor and needs to be converted to a torch tensor. I'll update the...
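A rough sketch of the TF-to-torch bridge mentioned above. The names here (`tf_attention_to_arrays`, `_FakeTFTensor`) are hypothetical, not BertViz API; the only assumption is that each per-layer TF tensor exposes a `.numpy()` method (which `tf.Tensor` does), so it can be converted to a NumPy array and then wrapped with `torch.from_numpy(...)`:

```python
import numpy as np

class _FakeTFTensor:
    """Stand-in for a tf.Tensor, which exposes .numpy() (illustration only)."""
    def __init__(self, arr):
        self._arr = arr
    def numpy(self):
        return self._arr

def tf_attention_to_arrays(attention):
    """Convert each per-layer TF tensor to a NumPy array.
    In real code, wrap each result with torch.from_numpy(...)
    before passing the list to BertViz."""
    return [layer.numpy() for layer in attention]

# Example: 6 layers, 12 heads, sequence length 8
tf_attention = [_FakeTFTensor(np.zeros((1, 12, 8, 8), dtype=np.float32)) for _ in range(6)]
arrays = tf_attention_to_arrays(tf_attention)
print(len(arrays), arrays[0].shape)  # 6 (1, 12, 8, 8)
```

With torch installed, the full conversion would be a one-liner along the lines of `attention = [torch.from_numpy(a.numpy()) for a in outputs.attentions]`.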