Sebastian

Results: 5 comments by Sebastian

> It's trying to import utils.py from FastChat. If you clone lmsys/FastChat and copy utils.py (and rename train_freeform.py to train.py) it should work. Did that. Still...

I have the same problem as https://github.com/huggingface/chat-ui/issues/693. It retrieves correctly when looking at the prompt and parameters, but the model answers as if it had received only the question.

Same problem here, although I installed the right version: pip install git+https://github.com/huggingface/transformers.git@refs/pull/21955/merge

Apparently they have a typo in their code; the second L should be lowercase:

tokenizer = transformers.LlamaTokenizer.from_pretrained("decapoda-research/llama-7b-hf")
model = transformers.LlamaForCausalLM.from_pretrained("decapoda-research/llama-7b-hf").to(device)

Install this: pip install git+https://github.com/huggingface/transformers.git@refs/pull/21955/merge and pip install sentencepiece and...