gemma_pytorch

Output with higher max_length is repetition of base text

Open azrael05 opened this issue 1 year ago • 7 comments

When generating text with a specified max_length, the generated text keeps repeating until the output spans the full max_length. An example of this behavior is produced by the following code:

import keras_nlp

gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset("gemma_2b_en")
single_prompt_result = gemma_lm.generate("Keras is a", max_length=4096)
print(single_prompt_result)

As you can observe, the sentence keeps repeating to fill the max_length, while it should ideally stop once it has completed the base text.

The code was run on Kaggle with the "gemma_2b_en" model on a P100 GPU. To recreate the issue, run the code above.

azrael05 avatar Feb 23 '24 21:02 azrael05

Could you please try the instruction-tuned model instead? It should give you better results.

pengchongjin avatar Feb 23 '24 22:02 pengchongjin
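
For reference, a minimal sketch of what that suggestion looks like. The preset name "gemma_instruct_2b_en" and the turn template below are taken from the KerasNLP Gemma docs and are assumptions here; verify them against your installed keras_nlp version.

# Sketch: load the instruction-tuned preset instead of the base model.
import keras_nlp

gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset("gemma_instruct_2b_en")

# The instruction-tuned checkpoints expect a chat-style turn template.
prompt = "<start_of_turn>user\nKeras is a<end_of_turn>\n<start_of_turn>model\n"
print(gemma_lm.generate(prompt, max_length=256))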

Could you please try the instruction-tuned model instead? It should give you better results.

Thanks, with the instruction-tuned model the output is perfect.

By the way, is there any reason why the gemma_2b_en model produced repetitive output instead of stopping?

azrael05 avatar Feb 23 '24 22:02 azrael05

It's kind of expected that the pre-trained models only try to complete text. One thing you could try is tuning the sampling parameters to see if you can get a bit more diversity in the output.

pengchongjin avatar Feb 24 '24 01:02 pengchongjin
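
A minimal sketch of what tuning the sampling parameters could look like. The sampler class and arguments (TopKSampler, k, temperature) are assumptions based on the keras_nlp.samplers API; adjust them to your installed version.

# Sketch: swap the sampler before generation to reduce exact repetition.
import keras_nlp

gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset("gemma_2b_en")

# Greedy-style decoding is prone to repetition loops; a stochastic sampler adds diversity.
gemma_lm.compile(sampler=keras_nlp.samplers.TopKSampler(k=50, temperature=0.7))
print(gemma_lm.generate("Keras is a", max_length=256))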

I am just happy to be a part of this chat

AbhishekJ24 avatar Feb 25 '24 14:02 AbhishekJ24

It's kind of expected that the pre-trained models only try to complete text. One thing you could try is tuning the sampling parameters to see if you can get a bit more diversity in the output.

Yeah, it's expected to complete the text, but it still shouldn't repeat itself, right? For example, other text-generation models might produce outputs that end mid-sentence depending on the max_length, but they don't produce repeating outputs.

azrael05 avatar Feb 25 '24 15:02 azrael05

I've noticed the 2b model repeating itself as well, although I found it does this when the context of my prompt would be hard even for a human to figure out.

Ittiz avatar Feb 27 '24 04:02 Ittiz

These repetitions are expected with the PT (pre-trained) models. It would be better to fine-tune them or use the IT (instruction-tuned) models.

gustheman avatar Jul 15 '24 14:07 gustheman
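
For completeness, a rough sketch of the fine-tuning route on the PT model using LoRA, following the pattern in the KerasNLP Gemma guides. The enable_lora() call and its rank argument are assumptions about that API, and the training data, sequence length, and optimizer settings are placeholders.

# Sketch: lightweight LoRA fine-tuning of the pre-trained model.
import keras
import keras_nlp

gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset("gemma_2b_en")
gemma_lm.backbone.enable_lora(rank=4)  # train only small adapter weights

# Placeholder instruction-style training example.
data = ["Instruction:\nWhat is Keras?\n\nResponse:\nKeras is a deep learning API."]
gemma_lm.preprocessor.sequence_length = 256
gemma_lm.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer=keras.optimizers.Adam(5e-5),
    weighted_metrics=[keras.metrics.SparseCategoricalAccuracy()],
)
gemma_lm.fit(data, epochs=1, batch_size=1)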