
FAIR Sequence Modeling Toolkit 2

Results 177 fairseq2 issues

**What does this PR do? Please describe:** The `max_num_warnings` tests have been skipped, and the feature does not function as expected. **Does your PR introduce any breaking changes? If yes, please...

CLA Signed

**Describe the bug:** When calling `dynamic_bucket` on a data pipeline, I am getting an "incompatible function arguments" error:

```
TypeError: dynamic_bucket(): incompatible function arguments. The following argument types are supported:...
```

bug
data pipeline

**Describe the bug:** The config for LLaMA 3.2 3B currently does not use tied weights; only the 1B model does (https://github.com/facebookresearch/fairseq2/blob/main/src/fairseq2/models/llama/_config.py#L257). **Describe how to reproduce:** Loaded LLaMA 3.2...

bug
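
For context on the report above, here is a minimal sketch of what "tied weights" means for an LM config. This is plain Python, not fairseq2 code, and the `TinyLM` class is hypothetical: the point is only that a tied output projection reuses the token-embedding matrix instead of allocating its own.

```python
# Hypothetical illustration (not fairseq2 code) of tied vs. untied weights.
class TinyLM:
    def __init__(self, vocab_size, dim, tie_weights):
        # Token-embedding matrix: vocab_size x dim (nested lists for simplicity).
        self.embed = [[0.0] * dim for _ in range(vocab_size)]
        if tie_weights:
            # Tied: the output projection *is* the embedding matrix,
            # so the two share one set of parameters.
            self.proj = self.embed
        else:
            # Untied: an independent matrix of the same shape.
            self.proj = [[0.0] * dim for _ in range(vocab_size)]

tied = TinyLM(8, 4, tie_weights=True)
untied = TinyLM(8, 4, tie_weights=False)
assert tied.proj is tied.embed        # one shared parameter matrix
assert untied.proj is not untied.embed
```

Tying roughly halves the embedding-related parameter count, which is why smaller configs such as the 1B model use it.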

Are you planning to add ASR training recipes in the near future? Currently, it seems that only the fairseq2 ASR eval recipe is supported.

question

**Describe the bug:** When specifying `max_gen_len`, the `SamplingSequenceGenerator` can generate more than `max_gen_len` tokens for any batched sequence whose prompt is shorter than the longest prompt in the batch....

bug
generation
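
A hedged, pure-Python sketch of the arithmetic behind this report (not fairseq2's actual generator; `generated_counts` is a hypothetical helper): if the decode loop budgets its total length from the longest prompt in the batch, every shorter-prompt sequence receives extra generated tokens beyond `max_gen_len`.

```python
# Hypothetical model of the reported behavior: the total sequence length
# is budgeted as max(prompt_lens) + max_gen_len, and each sequence
# generates everything past its own prompt.
def generated_counts(prompt_lens, max_gen_len):
    total_len = max(prompt_lens) + max_gen_len
    return [total_len - p for p in prompt_lens]

counts = generated_counts([10, 4], max_gen_len=5)
assert counts == [5, 11]  # the short-prompt sequence overshoots max_gen_len
```

With equal-length prompts the budget is exact; the overshoot only appears for the shorter prompts in a ragged batch.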

I was trying to install fairseq2 for the SONAR model on Google Colab and received this error: https://github.com/pytorch/audio/issues/62 Note: it goes away with PyTorch version 2.5.1 installed alongside torchaudio....

bug

**What does this PR do? Please describe:** This is a draft implementation of the BestRQ algorithm from https://arxiv.org/pdf/2202.01855 Fixes #{issue number} **Does your PR introduce any breaking changes? If yes,...

CLA Signed

Hi! How can I train an NLLB model using an existing NLLB-200 model (for example, the 3.3B one) as a checkpoint?

question

I am trying to run `mt train` with the nllb_dense_3b arch on an A6000 GPU, but I receive a "CUDA out of memory" error immediately at step 1. The dataset is small - it...

question

There is a piece of code in these doc pages:

```
CKPT_PATH="/checkpoint/$USER/experiments/$EXPERIMENT_NAME/checkpoints/step_1000"
CKPT_DIR=$(dirname "$CKPT_PATH")
CKPT="checkpoint_$(basename "$CKPT_DIR")" # e.g., checkpoint_step_1000
```

I guess the third line should be: `CKPT="checkpoint_$(basename "$CKPT_PATH")"`

documentation
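
The fix proposed above is easy to verify in a shell (the path here is illustrative, not the real checkpoint location):

```shell
# Reproduce the doc snippet with an illustrative path.
CKPT_PATH="/tmp/experiments/demo/checkpoints/step_1000"
CKPT_DIR=$(dirname "$CKPT_PATH")

# As currently written in the docs: basename of the *directory*
# yields the directory name, not the step name.
echo "checkpoint_$(basename "$CKPT_DIR")"   # checkpoint_checkpoints (not intended)

# Proposed fix: basename of the full path yields the step name.
echo "checkpoint_$(basename "$CKPT_PATH")"  # checkpoint_step_1000
```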