
Inference output text keeps running on...

Open · lxe opened this issue 3 years ago · 1 comment

Model: Vanilla LLaMA

Input:

Why did the chicken cross the road?

Output:

Why did the chicken cross the road? To get to the other side.
Why did the chicken cross the road? To get to the other side. Why did the chicken cross the road? To get to the other side. Why did the chicken cross the road? To

Using text-generation-webui:

python server.py --load-in-8bit --listen --model llama-7B
Why did the chicken cross the road?? To get to the other side.
Why did the chicken cross the road? Because it was a free range chicken and it wanted to go home!
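text-generation-webui's cleaner output most likely comes from its default decoding settings rather than a different model. When calling Hugging Face `model.generate(...)` directly, the same effect can be approximated with known `transformers` generation parameters; the specific values below are illustrative, not the webui's actual defaults:

```python
# Decoding kwargs for Hugging Face `model.generate(**inputs, **gen_kwargs)`
# that commonly curb runaway repetition. Values are illustrative.
gen_kwargs = {
    "max_new_tokens": 64,          # hard cap on generated length
    "do_sample": True,             # sample instead of greedy decoding
    "temperature": 0.7,
    "top_p": 0.9,
    "repetition_penalty": 1.15,    # down-weight already-generated tokens
    "no_repeat_ngram_size": 4,     # forbid exact 4-gram repeats
    # "eos_token_id": tokenizer.eos_token_id,  # let generation stop at EOS
}
```

Greedy decoding with no repetition penalty is the classic recipe for the loop shown above, so `repetition_penalty` and sampling are usually the first knobs to try.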

I need to tweak the inference code.
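Runaway output like the example above usually means generation never samples an EOS token, so the model keeps completing past the answer and loops back to the prompt. Until the decoding parameters are fixed, the raw output can be truncated at the first point where the prompt recurs. A minimal sketch (the function name is hypothetical, not part of this repo):

```python
def truncate_repetition(text: str, prompt: str) -> str:
    """Cut generated text off where the model starts repeating the
    prompt verbatim -- a common failure mode when EOS is never hit."""
    # Strip the leading prompt so we only search the completion.
    completion = text[len(prompt):] if text.startswith(prompt) else text
    # Truncate at the first re-occurrence of the prompt.
    idx = completion.find(prompt)
    if idx != -1:
        completion = completion[:idx]
    return (prompt + completion).strip()
```

Applied to the output above, this keeps only the first "To get to the other side." answer. It is a post-processing stopgap; fixing the generation parameters themselves is the real solution.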

lxe avatar Mar 22 '23 06:03 lxe

How did you get the finetuned model running in text-generation-webui? Please share your magic sauce :)

IIIIIIIllllllllIIIII avatar Mar 23 '23 11:03 IIIIIIIllllllllIIIII