pytorch-original-transformer
Sorry, but I couldn't figure out where the concatenation layer after the multi-head self-attention is. Shouldn't there be one?
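To clarify what I mean: in the paper, the per-head outputs are concatenated before the output projection W_O. In many PyTorch implementations this concatenation happens implicitly via a transpose and reshape, since all heads already live in one tensor. A minimal sketch of that pattern (my own illustration with made-up names like `head_outputs`, not this repo's exact code):

```python
import torch

batch, seq_len, n_heads, d_head = 2, 5, 8, 16
d_model = n_heads * d_head  # 128

# Suppose attention has already produced per-head outputs,
# stacked in a single tensor of shape (batch, n_heads, seq_len, d_head).
head_outputs = torch.randn(batch, n_heads, seq_len, d_head)

# The "concat" step: move the head dim next to the feature dim,
# then merge the two into d_model = n_heads * d_head.
concatenated = (
    head_outputs.transpose(1, 2)               # (batch, seq_len, n_heads, d_head)
                .reshape(batch, seq_len, d_model)  # (batch, seq_len, d_model)
)

# The output projection W_O then mixes the concatenated heads.
w_o = torch.nn.Linear(d_model, d_model)
out = w_o(concatenated)
print(out.shape)  # torch.Size([2, 5, 128])
```

So my question is whether this repo does the concatenation this way (fused into a reshape) or whether I'm missing an explicit `torch.cat` somewhere.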