
Finetuned T5 checkpoints

i-am-neo opened this issue on May 27, 2023 · 1 comment

Very exciting development - thanks for sharing your paper and this repo. Would it be possible for your team to release the fine-tuned T5 checkpoints (Super-NaturalInstructions), from small through xxl? We can upload them to the HF hub.

Thank you.

i-am-neo · May 27, 2023

Hello @i-am-neo,

I agree with your sentiment! The progress shown by @artidoro and the team in this repository is truly exciting and holds huge potential for the community.

As for releasing the fine-tuned T5 checkpoints (Super-NaturalInstructions): it's a great idea and would be valuable for researchers and developers looking to explore this model's capabilities further.

While I'm not in a position to release these checkpoints myself, I strongly support the suggestion and hope the maintainers will consider it. As a stopgap, you could fine-tune T5 on your own, following the original T5 paper's guidelines.

One possible approach would be as follows:

from transformers import T5Tokenizer, T5ForConditionalGeneration

# Load a pretrained T5 checkpoint and its tokenizer
# ('t5-base' can be swapped for 't5-small', 't5-large', 't5-3b', or 't5-11b')
model = T5ForConditionalGeneration.from_pretrained('t5-base')
tokenizer = T5Tokenizer.from_pretrained('t5-base')

# Your fine-tuning code goes here
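
To make that placeholder concrete, a single supervised training step might look like the sketch below. This is a minimal illustration, not the paper's recipe: the example input/target pair and the learning rate are assumptions.

import torch

# Illustrative instruction/response pair (placeholder data, not from the paper)
batch = tokenizer("Classify the sentiment: I loved this movie.", return_tensors="pt")
labels = tokenizer("positive", return_tensors="pt").input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)  # assumed learning rate

# Passing `labels` makes T5 compute the cross-entropy loss internally
loss = model(input_ids=batch.input_ids,
             attention_mask=batch.attention_mask,
             labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()

In practice you would loop this over batches from Super-NaturalInstructions (or your dataset of choice), ideally via a DataLoader, with padding positions in the labels masked to -100 so they are ignored by the loss.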

Keep in mind that fine-tuning requires a substantial amount of data and compute. If you're constrained on either front, cloud-based services can help, or you can cut the memory cost directly with the QLoRA approach this repo implements; see the sketch below.
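
Since this repository is specifically about QLoRA, a memory-efficient variant would load the base model quantized to 4-bit and train only LoRA adapters on top. The sketch below assumes recent versions of transformers, bitsandbytes, accelerate, and peft; the LoRA rank, alpha, dropout, and target modules are illustrative choices, not necessarily the paper's exact configuration.

import torch
from transformers import T5ForConditionalGeneration, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit NF4 quantization with double quantization, as introduced in the QLoRA paper
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = T5ForConditionalGeneration.from_pretrained(
    't5-large',                 # illustrative size; the same code applies to larger checkpoints
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(
    r=16,                       # illustrative rank
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q", "v"],  # T5's attention query/value projections
    task_type="SEQ_2_SEQ_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable

The quantized base model stays frozen; only the small LoRA matrices receive gradients, which is what makes fine-tuning the larger T5 sizes feasible on a single GPU.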

Once again, releasing their checkpoints is up to the maintainers, but this might help you get started in the meantime.

Best, @hemangjoshi37a

hemangjoshi37a · May 28, 2023