
python chat.py cannot be run

Open lucasjinreal opened this issue 2 years ago • 6 comments

This does not actually provide a loadable model in a Hugging Face repo, so how can the tokenizer load?

OSError: ./dist/models/vicuna-v1-7b does not appear to have a file named config.json. Checkout 'https://huggingface.co/./dist/models/vicuna-v1-7b/None' for available files.
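For context, this OSError comes from the local-directory loading path in `transformers`: when given a directory, it requires a `config.json` inside. A minimal stdlib sketch of that check (`check_local_model` is a hypothetical helper written for illustration, not a real `transformers` API):

```python
import os

def check_local_model(model_dir: str) -> str:
    """Mimic the check transformers performs when loading from a local
    directory: a config.json must exist alongside the weights."""
    config_path = os.path.join(model_dir, "config.json")
    if not os.path.isfile(config_path):
        raise OSError(
            f"{model_dir} does not appear to have a file named config.json."
        )
    return config_path

# A directory without config.json triggers the same class of error:
try:
    check_local_model("./dist/models/vicuna-v1-7b")
except OSError as e:
    print("OSError:", e)
```

So the fix is to make sure the directory actually contains the original model files (including `config.json`), not just compiled artifacts.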

Also, do I need TVM built with Vulkan enabled to run the demo .so?

  File "E:\codes\libs\relax\src\runtime\c_runtime_api.cc", line 131
TVMError:
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (allow_missing) is false: Device API vulkan is not enabled.
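The "Device API vulkan is not enabled" check fires because the TVM runtime looks up the Vulkan backend in its device-API registry and finds nothing when TVM was compiled without Vulkan support. A rough stdlib sketch of that lookup (names and structure are illustrative only, not TVM's actual internals):

```python
# Illustrative set of device backends compiled into a runtime build,
# e.g. a TVM build without USE_VULKAN.
ENABLED_DEVICE_APIS = {"cpu", "cuda"}

def get_device_api(name: str, allow_missing: bool = False):
    """Mimic the runtime's device-API lookup: return the backend if it
    was compiled in, otherwise raise unless allow_missing is set."""
    if name in ENABLED_DEVICE_APIS:
        return name
    if allow_missing:
        return None
    raise RuntimeError(f"Check failed: Device API {name} is not enabled.")
```

In other words, a module compiled for a Vulkan target can only be loaded by a runtime whose build enabled that backend.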

lucasjinreal avatar May 05 '23 03:05 lucasjinreal

Hi @lucasjinreal, to build models on your own, you should clone the model's original Hugging Face repository. For example, for LLaMA:

git clone https://huggingface.co/decapoda-research/llama-7b-hf.git dist/models/llama-7b-hf

The Hugging Face repositories under mlc-ai store compiled models, not original ones. The build script is only applicable to original models.

Regarding the second question: no, not if your compilation target is not Vulkan.

yzh119 avatar May 05 '23 05:05 yzh119

@yzh119 Actually I don't want to compile it myself; I just want to load the libraries you generated, such as vulkan_x86.dll on Windows. But apparently I can't even use that DLL unless TVM is built against the target device. How can I just run the Python script with the prebuilt DLL on Windows? (I can run the prebuilt binary successfully.)

lucasjinreal avatar May 05 '23 06:05 lucasjinreal

chat.py needs some changes before it can run on a MacBook Air.

hfyydd avatar May 09 '23 02:05 hfyydd

@hfyydd please specify which parts need to change. In my opinion, this Python code logically cannot run on any platform, unless the people who wrote it know how to prepare it.

lucasjinreal avatar May 09 '23 13:05 lucasjinreal

@lucasjinreal go to the link and see the changes to chat.py for MacBook Air.

hfyydd avatar May 09 '23 13:05 hfyydd

@hfyydd thank you, will try!

lucasjinreal avatar May 09 '23 13:05 lucasjinreal

Please use mlc_chat_cli instead

junrushao avatar Jun 05 '23 15:06 junrushao