ChatLLM-Web

Does it support custom models?

[Open] yuehengwu opened this issue 1 year ago · 1 comment

I tried to add the DoctorGPT model: I replaced the model files under /public/lib/vicuna-7b and modified config.json so that cacheUrl points to the local model at http://localhost:3000/lib/WebLLM/vicuna-7b/doctorGPT/.
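For reference, the override described above would look something like the config.json fragment below. Only cacheUrl is taken from the issue itself; the other field name (modelId) is a hypothetical placeholder, so check it against the project's actual config.json. Note also that web-llm generally requires models compiled into its own format (a WASM model library plus sharded weights), so pointing cacheUrl at raw, uncompiled weights is a likely cause of a load error:

```json
{
  "modelId": "doctorGPT",
  "cacheUrl": "http://localhost:3000/lib/WebLLM/vicuna-7b/doctorGPT/"
}
```

The directory served at cacheUrl would need to contain the artifacts the runtime expects (e.g. the weight shards and their manifest), not just the original model checkpoint.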

However, the webpage shows an error:

(Screenshot of the error, 2023-09-13 18:57:08)

— yuehengwu, Sep 13 '23

Consider trying https://github.com/mlc-ai/web-llm-chat, and file issues in the main web-llm repo for new-model support requests.

— Neet-Nestor, May 16 '24