Add support for T5 models
The bug
Could you add support, or provide some guidance (pun intended), so that I can add support for the T5 family of models?
To Reproduce
import guidance
model_id = 'google/flan-t5-xl'
flant5 = guidance.llms.Transformers(model_id, device=0)
Error: ValueError: Unrecognized configuration class <class 'transformers.models.t5.configuration_t5.T5Config'> for this kind of AutoModel: AutoModelForCausalLM. Model type should be one of BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, etc.
Possible fix: I believe this occurs because guidance does not support HuggingFace's AutoModelForSeq2SeqLM. A quick fix could be something like the following:
from transformers import AutoModelForCausalLM, AutoModelForSeq2SeqLM

try:
    # most models guidance supports today are decoder-only
    model = AutoModelForCausalLM.from_pretrained(model_id).to(device)
except ValueError:
    # fall back to encoder-decoder models such as T5
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id).to(device)
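A slightly more explicit variant (just a sketch on my part, using the standard transformers AutoConfig API rather than relying on the exception) would be to check the config's is_encoder_decoder flag:

from transformers import AutoConfig, AutoModelForCausalLM, AutoModelForSeq2SeqLM

config = AutoConfig.from_pretrained(model_id)
if getattr(config, "is_encoder_decoder", False):
    # T5, BART, and similar encoder-decoder models need the seq2seq auto class
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id).to(device)
else:
    model = AutoModelForCausalLM.from_pretrained(model_id).to(device)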
It's important to note that some seq2seq T5 models have a limited vocabulary. However, I definitely think support is still feasible as long as Guidance can maintain its specialized token-healing process for these models. A small illustration of the vocabulary gap is sketched below.
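For example, here is a quick check (assuming the google/flan-t5-xl tokenizer) of how a few characters that commonly appear in prompt templates round-trip through the tokenizer; characters missing from the SentencePiece vocabulary will not come back unchanged, which is the kind of gap token healing would have to work around:

from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained('google/flan-t5-xl')
print(len(tok))  # vocabulary size (around 32k tokens for T5)

# see how a few template-relevant characters round-trip through encode/decode
for text in ['{', '}', '\n', 'hello']:
    ids = tok(text, add_special_tokens=False)['input_ids']
    print(repr(text), ids, repr(tok.decode(ids)))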
I am looking forward to an update that incorporates this suggestion.