
Crosslingual Generalization through Multitask Finetuning

11 xmtf issues, sorted by recently updated

I am trying to extract the Arabic instructions from the xP3 dataset, and I want to put them in the format “Instruction”, “Input”, and “Output”. Currently, the data is...
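A minimal sketch of one way to reshape xP3 rows into that layout. On the Hugging Face Hub, xP3 rows expose `inputs` (the full prompt) and `targets` fields; xP3 does not store the instruction and the input separately, so the split below is a heuristic, and the `"ar"` config name for the Arabic subset is an assumption:

```python
# Sketch: convert an xP3-style record ({"inputs": ..., "targets": ...})
# into {"Instruction", "Input", "Output"}. xP3 keeps the whole prompt in
# "inputs", so separating instruction from input is a heuristic: here we
# treat everything before the final newline as the instruction (assumption).

def to_instruction_format(record: dict) -> dict:
    prompt = record["inputs"].strip()
    # Heuristic split: the last line is often the bare input; adjust as needed.
    head, _sep, tail = prompt.rpartition("\n")
    if head:
        instruction, inp = head.strip(), tail.strip()
    else:
        instruction, inp = prompt, ""
    return {
        "Instruction": instruction,
        "Input": inp,
        "Output": record["targets"].strip(),
    }

# Usage with the datasets library (config name "ar" assumed for Arabic):
# from datasets import load_dataset
# ds = load_dataset("bigscience/xP3", "ar", split="train")
# formatted = ds.map(to_instruction_format, remove_columns=ds.column_names)
```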

Hello! Thanks a lot for your work! I want to finetune bloomz-mt with your Megatron-DeepSpeed fork, but I cannot find a universal-format checkpoint of bloomz-mt or bloomz. I only found...

Hello! Thanks a lot for your work! I'm using mT0-xxl for a question answering task, but it does not perform with the quality I expected. So I'm trying...

Can you provide continued fine-tuning code for mT0 on specific downstream tasks? We want to test it in specific scenarios, e.g. retrieval and recommendation. We found a similar version...

Hi! Thanks for the amazing work! I have a couple of quick questions. I'm trying to use mT0-xxl-mt for QA. When I provide the context and ask a question, the subject of...
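A common cause of weak QA answers from mT0-class models is the prompt format: the models were finetuned on xP3-style natural-language instructions, so phrasing the query as an explicit instruction with the context inline often helps. Below is a minimal sketch of a prompt builder; the exact wording is an assumption (xP3 uses many templates, so it is worth experimenting), and the generation call is shown only as comments since it requires downloading the checkpoint:

```python
# Sketch: build an xP3-style QA prompt for mT0. The wording is an
# assumption, not the exact xP3 template; trying several phrasings helps.

def build_qa_prompt(context: str, question: str) -> str:
    return (
        "Answer the question based on the passage.\n"
        f"Passage: {context.strip()}\n"
        f"Question: {question.strip()}\n"
        "Answer:"
    )

# Generation (not run here; needs the bigscience/mt0-xxl-mt checkpoint):
# from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("bigscience/mt0-xxl-mt")
# model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/mt0-xxl-mt")
# ids = tok(build_qa_prompt(ctx, q), return_tensors="pt").input_ids
# out = model.generate(ids, max_new_tokens=32)
# print(tok.decode(out[0], skip_special_tokens=True))
```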

Is it possible to use Petals for inference/prompt tuning without sharing my GPU?

Thanks for the great work! I have a few questions regarding the data creation of xP3 after following the guide [here](https://github.com/bigscience-workshop/xmtf#create-xp3) to create instruction data for the `code` language subset. 1....

https://github.com/bigscience-workshop/Megatron-DeepSpeed/blob/main/megatron/data/mtf_dataset.py#L34 The MTFDataset class takes `documents` as an argument but doesn't use it (except in an assert statement). I think `documents` is the train/valid/test split index; is it OK to ignore `documents`?

https://arxiv.org/pdf/2212.09535.pdf I was reading this paper and am really interested in trying this myself, but I can't find the model weights (bloom-3b) anywhere. Can you link them? That would be great.

Hello, thanks for your work! I want to implement this work myself, but I can't achieve the high performance with xP3 and mT0-xxl shown in the paper...