merveeozbayy

1 comment from merveeozbayy

> I think that you can use `max_length=512` 😉

@ArthurZucker Hello Arthur, thank you for your answer. First, I added `max_length` as you suggested:

```python
llama_llm = transformers.pipeline(
    model=model,
    tokenizer=tokenizer,
    return_full_text=True,
    ...
```
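For context, a minimal sketch of where `max_length` fits in such a pipeline call. The `build_pipeline` helper and its `model`/`tokenizer` arguments are hypothetical (assumed to be loaded elsewhere), and the function is not invoked here, so no model download occurs; the plain-Python lines below it only illustrate the effect `max_length` has, namely capping the token sequence at 512.

```python
def build_pipeline(model, tokenizer):
    # Hypothetical helper: `model` and `tokenizer` are assumed to be
    # already-loaded transformers objects. max_length caps the total
    # sequence length (prompt + generated tokens).
    import transformers
    return transformers.pipeline(
        "text-generation",
        model=model,
        tokenizer=tokenizer,
        return_full_text=True,
        max_length=512,
    )

# What max_length enforces, in plain Python: sequences longer than the
# limit are cut down to at most 512 tokens.
MAX_LENGTH = 512
token_ids = list(range(600))        # hypothetical over-long token id list
truncated = token_ids[:MAX_LENGTH]  # keep only the first 512 ids
print(len(truncated))
```

Note that `max_length` counts the prompt tokens as well; if only the number of newly generated tokens should be limited, `max_new_tokens` is the parameter to use instead.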