Universal-Transformer-Pytorch
if-statement on projecting embedding to hidden size
I found that in models/UTransformer.py:110&194, you have the following code:
self.proj_flag = False
if(embedding_size == hidden_size):
    self.embedding_proj = nn.Linear(embedding_size, hidden_size, bias=False)
    self.proj_flag = True
I'm confused: you project the embedding to hidden_size only when embedding_size == hidden_size, but what happens when embedding_size != hidden_size? Is nothing done? Wouldn't that lead to a size mismatch?
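For reference, here is a minimal sketch of the check I would expect, wrapped in a hypothetical `EmbeddingProj` module of my own (not the actual code in the repo), assuming the projection is only meant to apply when the two sizes differ:

```python
import torch
import torch.nn as nn

class EmbeddingProj(nn.Module):
    """Hypothetical illustration: project the embedding to hidden_size
    only when embedding_size and hidden_size differ."""

    def __init__(self, embedding_size: int, hidden_size: int):
        super().__init__()
        # Note the != check, the opposite of the condition in the repo.
        self.proj_flag = embedding_size != hidden_size
        if self.proj_flag:
            self.embedding_proj = nn.Linear(embedding_size, hidden_size, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Apply the projection only if it was created in __init__.
        return self.embedding_proj(x) if self.proj_flag else x
```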
Hi,
Yes, that's a bug. I will fix it this weekend. Thanks for letting me know.
Best
Andrea