Faton
You need to specify `model.layer_type`, `model.attn_type`, and `model.pos_enc_type` in the launch command. Otherwise fairseq doesn't recognize that you are trying to train a conformer, and tries to build...
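As a sketch, the overrides can be passed as Hydra-style arguments on the command line. The config file name and the `espnet`/`rel_pos` values below are illustrative assumptions; the values your fairseq version accepts may differ:

```
# Hypothetical launch command: override the three model fields so
# fairseq builds a conformer encoder instead of the default transformer.
fairseq-hydra-train \
    --config-dir /path/to/config \
    --config-name wav2vec2_conformer \
    model.layer_type=conformer \
    model.attn_type=espnet \
    model.pos_enc_type=rel_pos
```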
Read the comment above. Change `build_encoder_layer` to accept a `**kwargs` argument. So from this:

```python
class ConformerEncoder(TransformerEncoder):
    def build_encoder_layer(self, args):
        layer = ConformerWav2Vec2EncoderLayer(
            embed_dim=self.embedding_dim,
            ffn_embed_dim=args.encoder_ffn_embed_dim,
            attention_heads=args.encoder_attention_heads,
            dropout=args.dropout,
            depthwise_conv_kernel_size=args.depthwise_conv_kernel_size,
            ...
```
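A minimal sketch of the suggested change, using stand-in classes rather than fairseq's real ones (the class and argument names mirror the snippet above, but the bodies here are placeholders just to show how `**kwargs` gets forwarded):

```python
from types import SimpleNamespace

class ConformerWav2Vec2EncoderLayer:
    # Stand-in for fairseq's layer class; only stores what it receives.
    def __init__(self, embed_dim, ffn_embed_dim, **kwargs):
        self.embed_dim = embed_dim
        self.ffn_embed_dim = ffn_embed_dim
        self.extra = kwargs  # everything the caller passed through

class ConformerEncoder:
    embedding_dim = 768  # example value

    def build_encoder_layer(self, args, **kwargs):
        # Accept **kwargs and forward it, so callers can pass
        # layer-specific options without changing this signature again.
        return ConformerWav2Vec2EncoderLayer(
            embed_dim=self.embedding_dim,
            ffn_embed_dim=args.encoder_ffn_embed_dim,
            **kwargs,
        )

# Usage: extra options flow through untouched.
args = SimpleNamespace(encoder_ffn_embed_dim=3072)
layer = ConformerEncoder().build_encoder_layer(
    args, depthwise_conv_kernel_size=31
)
```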
Could you point to where you'd like to see this information? Is there somewhere specific where this clarification is needed? The [bot's website](https://lauler.github.io/sprakpolisen/)? The bot's comments?
Is this issue related to loading pretrained Llama2/Llama3 weights and using them as a checkpoint? I was going to start a separate issue asking for some docs that explain how to...
I want to join the list of people here who were negatively affected by the change, and I wish it did not affect users who do not explicitly specify a language...