FastChat
Error when executing fastchat.serve.model_worker
command: python -m fastchat.serve.model_worker --model-path /model_worker/lmsys/vicuna-13b-delta-v1.1
Error Info:
I have the same issue. For now, FastChat is not usable at all because of this error on the worker side. On top of that, FastChat responds with garbled words that make no sense and are unrelated to the questions.
I have re-read the documentation, and it seems my model needs to be converted; there is also a problem with my command. I am working on this now and am not yet sure whether it will work.
Seems like a Windows issue. I had the same error as well, but was able to run the converted model in Linux containers.
You are using delta weights. You need to convert them to the HF format by applying them to the original LLaMA weights.
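For reference, the conversion is usually done with FastChat's apply_delta script and then pointing the worker at the merged output instead of the delta directory. The exact flag names can differ between FastChat versions (check python -m fastchat.model.apply_delta --help), and the base LLaMA and output paths below are placeholders for your own directories:

command: python -m fastchat.model.apply_delta --base-model-path /path/to/llama-13b-hf --delta-path /model_worker/lmsys/vicuna-13b-delta-v1.1 --target-model-path /path/to/vicuna-13b

command: python -m fastchat.serve.model_worker --model-path /path/to/vicuna-13b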
@ch930410 this looks like a non-issue - it was a problem with the model version, not with FastChat, and it matters even less now that the latest Vicuna releases are based on Llama 2. Let's close this one?