An error about TransformerWeightGenerator

Open • Jack-bo1220 opened this issue 9 months ago • 2 comments

encoder_layer = nn.TransformerEncoderLayer(
TypeError: __init__() got an unexpected keyword argument 'norm_first'
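
If I understand it correctly, 'norm_first' is only accepted by newer PyTorch releases, so an older install fails right at this call. A minimal sketch that reproduces the same TypeError on an old version (the d_model/nhead values below are placeholders, not the actual DOFA settings):

```python
# Minimal repro sketch, assuming the cause is the installed PyTorch version:
# as far as I can tell, `norm_first` was only added to nn.TransformerEncoderLayer
# in a later release (1.10, I believe), so older installs raise this TypeError.
import torch
import torch.nn as nn

print(torch.__version__)

encoder_layer = nn.TransformerEncoderLayer(
    d_model=128,       # placeholder sizes, not the actual DOFA settings
    nhead=4,
    norm_first=False,  # rejected by older PyTorch: unexpected keyword argument
)
```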

After I delete the 'norm_first' argument, I get another error:

transformer_output = self.transformer_encoder(x)
tgt_len, bsz, embed_dim = query.size()
ValueError: not enough values to unpack (expected 3, got 2)

I found that the shape of x in "transformer_output = self.transformer_encoder(x)" is torch.Size([132, 128]).
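
My guess is that, without batch_first, this older PyTorch expects a 3-D [seq_len, batch, embed] input for the transformer encoder, so the 2-D [132, 128] tensor cannot be unpacked into three dimensions inside multi-head attention; newer releases, I believe, accept unbatched 2-D inputs, which would explain why the unmodified code works there. A small sketch of the difference, with placeholder layer sizes:

```python
# Placeholder layer sizes; shapes mirror the ones from my run.
import torch
import torch.nn as nn

layer = nn.TransformerEncoderLayer(d_model=128, nhead=4)
encoder = nn.TransformerEncoder(layer, num_layers=1)

x = torch.randn(132, 128)      # the 2-D shape I observed
# encoder(x)                   # older PyTorch: ValueError: not enough values to unpack (expected 3, got 2)

out = encoder(x.unsqueeze(1))  # add a batch dimension -> [132, 1, 128]
print(out.shape)               # torch.Size([132, 1, 128])
```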

Jack-bo1220 • May 23 '24, 18:05