robotsp

Results 51 comments of robotsp

@kauterry Would you please take a look at https://github.com/facebookresearch/fairseq/issues/4989? I prepared my data with the latest code of stopes and the modified config, but ran into a new issue of...

I couldn't find the nllb module in fairseq/examples of version ==0.12.1, which is recommended by the new version of Stopes (https://github.com/facebookresearch/stopes/tree/main). But when I reinstalled the nllb version of fairseq, some...

@edvardasast > > @edvardasast Did you find any git repository for finetuning? > > unfortunately not :( I have successfully preprocessed the data by using this command: python preprocess.py -s eng_Latn...

How do you decide the value of knntarget_size? @Hannibal046 @FadedCosine

> Hi @madlag @julien-c @co42 @srush @Narsil > > I am trying to use `nn_pruning` for Pruning different transformer models. > > Code: > > ``` > model_checkpoint = "t5-small"...

> Hi, I am working to prune BART model for seq2seq purpose. Currently, I have replaced this [code](https://github.com/huggingface/nn_pruning/blob/main/notebooks/01-sparse-trainer.ipynb) with BART based functionalities. After executing I am getting drop in number...

@Azure-Tang Hi Tang, thanks for your work implementing TeraPipe on Megatron-LM. Did you try comparing the performance against a baseline without TeraPipe? How much benefit does it bring?

Did you reproduce the pruning strategy when fine-tuning RoBERTa? @achen353

I have the same issue when loading the SFT LLM. I found it's because custom modules can't be saved via named_parameters. Did you solve the problem? @wangzhao88 @qiancheng99
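For context, a minimal sketch of the general pitfall (the `Adapter` class and its attribute names are hypothetical, not from the repo in question): in PyTorch, `named_parameters()` and `state_dict()` only see tensors registered as `nn.Parameter` (or buffers), so a plain tensor attribute on a custom module is silently dropped from checkpoints.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Hypothetical custom module illustrating the checkpointing pitfall."""
    def __init__(self):
        super().__init__()
        # Registered: shows up in named_parameters() and state_dict().
        self.weight = nn.Parameter(torch.zeros(4))
        # NOT registered: a plain tensor attribute, invisible to both.
        self.scale = torch.ones(4)

m = Adapter()
param_names = [name for name, _ in m.named_parameters()]
print(param_names)                    # only 'weight' is listed
print("scale" in m.state_dict())      # False: 'scale' would not be saved
```

Registering the tensor with `self.register_buffer("scale", torch.ones(4))` (or making it an `nn.Parameter`) is the usual fix, since buffers are included in `state_dict()` even though they are not trained.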