
Error when not using any distance penalty

Open · gegallego opened this issue 3 years ago · 2 comments

Hello @mattiadg

When I don't use the --distance-penalty flag I get the following error:

File "~/FBK-Fairseq-ST/fairseq/models/s_transformer.py", line 472, in __init__
    init_variance=(args.init_variance if args.distance_penalty == 'gauss' else None)
TypeError: __init__() got an unexpected keyword argument 'penalty'

The problem comes from the following lines in the constructor of TransformerEncoderLayer:

attn = LocalMultiheadAttention if args.distance_penalty != False else MultiheadAttention	
self.self_attn = attn(
	self.embed_dim, args.encoder_attention_heads,
	dropout=args.attention_dropout, penalty=args.distance_penalty,
	init_variance=(args.init_variance if args.distance_penalty == 'gauss' else None)
)
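When --distance-penalty is not given, args.distance_penalty is False, so the ternary selects MultiheadAttention, yet the penalty and init_variance keywords are still passed to its constructor. A minimal standalone sketch of that failure pattern (the classes below are hypothetical stand-ins, not the actual fairseq modules):

class PlainAttention:
    def __init__(self, embed_dim, num_heads, dropout=0.0):
        self.embed_dim, self.num_heads, self.dropout = embed_dim, num_heads, dropout

class PenalizedAttention(PlainAttention):
    def __init__(self, embed_dim, num_heads, dropout=0.0, penalty=False, init_variance=None):
        super().__init__(embed_dim, num_heads, dropout)
        self.penalty, self.init_variance = penalty, init_variance

distance_penalty = False  # i.e. --distance-penalty was not passed
attn = PenalizedAttention if distance_penalty != False else PlainAttention
# The extra keywords are passed no matter which class was selected:
attn(512, 8, dropout=0.1, penalty=distance_penalty, init_variance=None)
# TypeError: __init__() got an unexpected keyword argument 'penalty'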

The arguments penalty and init_variance do not exist in MultiheadAttention, so I replaced these lines with:

if args.distance_penalty != False:
    self.self_attn = LocalMultiheadAttention(
        self.embed_dim, args.encoder_attention_heads,
        dropout=args.attention_dropout, penalty=args.distance_penalty,
        init_variance=(args.init_variance if args.distance_penalty == 'gauss' else None)
    )
else:
    self.self_attn = MultiheadAttention(
        self.embed_dim, args.encoder_attention_heads,
        dropout=args.attention_dropout,
    )
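An equivalent, slightly more compact fix (a sketch only, assuming the same constructor signatures as in the snippet above) would keep a single call and build the extra keyword arguments only when the penalty is enabled:

attn_kwargs = {'dropout': args.attention_dropout}
if args.distance_penalty != False:
    attn_cls = LocalMultiheadAttention
    attn_kwargs['penalty'] = args.distance_penalty
    attn_kwargs['init_variance'] = (
        args.init_variance if args.distance_penalty == 'gauss' else None
    )
else:
    attn_cls = MultiheadAttention
self.self_attn = attn_cls(
    self.embed_dim, args.encoder_attention_heads, **attn_kwargs
)

Either way, MultiheadAttention no longer receives keywords it does not accept.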

gegallego · Mar 09 '21 10:03

Do you want to submit a pull request?

mattiadg · Mar 09 '21 14:03

👍

gegallego · Mar 09 '21 17:03