Wiktor Ligeza
> ```python
> import torch
> from transformers import pipeline
> from langchain.llms.base import LLM
>
> class FlanLLM(LLM):
>     model_name = "google/flan-t5-xl"
>     pipeline = pipeline(
>         "text2text-generation",
>         model=model_name,
>         device="cpu",
>         model_kwargs={"torch_dtype": torch.bfloat16},
>     )
>
>     def _call(self, prompt, stop=None):
>         return self.pipeline(prompt, max_length=9999)[0]["generated_text"]
>
>     def _identifying_params(self):
>         ...
> ```
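If you just want to sanity-check the wrapper's `_call` logic without downloading flan-t5-xl (or installing LangChain), here is a minimal sketch with the Hugging Face pipeline stubbed out. The names `FakePipeline` and `FakeFlanLLM` are mine, for illustration only; the stub only mimics the list-of-dicts return shape of a real `text2text-generation` pipeline.

```python
class FakePipeline:
    """Stand-in for transformers.pipeline("text2text-generation", ...)."""
    def __call__(self, prompt, max_length=9999):
        # A real pipeline returns a list of dicts keyed by "generated_text".
        return [{"generated_text": f"echo: {prompt}"}]

class FakeFlanLLM:
    # Same structure as the wrapper above, minus the LLM base class.
    model_name = "google/flan-t5-xl"
    pipeline = FakePipeline()

    def _call(self, prompt, stop=None):
        # Pull the generated text out of the first (only) result dict.
        return self.pipeline(prompt, max_length=9999)[0]["generated_text"]

print(FakeFlanLLM()._call("hello"))  # -> echo: hello
```

This makes it easy to confirm the indexing (`[0]["generated_text"]`) is right before paying the cost of loading the actual model.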
Well, it turned out that it wasn't working on Google Colab, but it works just fine on my PC.
Same here, any updates?