alpaca-lora
Generator Object
I am having an issue with generate.py. I cloned the repo and then fine-tuned the model: python finetune.py --base_model="decapoda-research/llama-7b-hf" --data-path "alpaca_data_tr.json"
I saved the output to the ./lora_alpaca directory. But whenever I try to generate a response: python generate.py --base_model='decapoda-research/llama-7b-hf' --lora_weights='./lora-alpaca/' --load_8bit
The Instruction is: Tell me about alpacas.
The output is: <generator object main.
I am working in an offline environment and downloaded the decapoda-research/llama-7b-hf from https://huggingface.co/decapoda-research/llama-7b-hf/tree/main
I already checked whether the issue was with Gradio by requesting the response from the terminal instead, and I got the same result.
Same issue
Use return output instead of yield. That works for me.
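The reason this fixes it: any Python function that contains a yield is a generator function, so calling it returns a generator object instead of running the body, which is exactly what the <generator object main...> output shows. A minimal illustration (toy code, not the actual generate.py):

```python
# With `yield`, calling the function hands back a generator object.
def evaluate_with_yield(instruction):
    output = f"A response about: {instruction}"
    yield output

# With `return`, the caller gets the string itself.
def evaluate_with_return(instruction):
    output = f"A response about: {instruction}"
    return output

print(evaluate_with_yield("Tell me about alpacas."))   # -> <generator object ...>
print(evaluate_with_return("Tell me about alpacas."))  # -> A response about: Tell me about alpacas.
```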
Didn't work out for me, but thank you.
Same issue
Did you solve your issue?
Remove the code for the case where stream_output is True, and replace yield prompter.get_response(output) with return prompter.get_response(output) (the gr.Interface() code is also removed). That works for me.
However, it is really strange, because this part of the code should be unused since the parameter stream_output is set to False.
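In case it helps anyone reading later, here is a toy sketch of that edit. The generation step is stubbed out and FakePrompter only stands in for the repo's real Prompter helper, but the shape of the change is the same: drop the stream_output branch and return instead of yield.

```python
# Toy sketch of the suggested edit to evaluate() in generate.py.
# Only the control flow matters here; the model call is faked.

class FakePrompter:
    """Stand-in for the repo's Prompter; only get_response is mimicked."""
    def get_response(self, output: str) -> str:
        return output.split("### Response:")[-1].strip()

prompter = FakePrompter()

def evaluate(instruction: str) -> str:
    # ... tokenization and model.generate(...) would happen here ...
    output = f"### Response: A stub answer about {instruction}"
    # Removed: the whole `if stream_output:` branch that yielded partial text.
    # Replaced `yield prompter.get_response(output)` with a plain return,
    # so the caller gets a string instead of a generator object.
    return prompter.get_response(output)

print(evaluate("Tell me about alpacas."))  # prints the response text directly
```

With a plain return, printing the result in the terminal (or passing it to Gradio without streaming) shows the text itself rather than <generator object ...>.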
@CadenShi Exactly! Your little fix works, but I cannot understand why the stream_output path was being executed in the first place even though it is set to False. Thanks anyway!