jing xu

Results: 3 comments by jing xu

> (gh_web-llm) amd00@asus00:~/llm_dev/web-llm$ npm run build
>
> > @mlc-ai/[email protected] build
> > rollup -c
>
> src/index.ts → lib/index.js... [!] (plugin rpt2)...

> Thanks for the question! Under the hood, the weights of the selected model are downloaded from the `model_url` field (a Hugging Face link) in a model record: >...
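As a rough sketch of what such a model record might look like, here is a minimal, hypothetical entry in an app config's model list; the field names besides `model_url` and the exact URL shape are illustrative assumptions, not the library's confirmed schema:

```json
{
  "model_list": [
    {
      "model_url": "https://huggingface.co/mlc-ai/Llama-2-7b-chat-hf-q4f16_1-MLC/resolve/main/",
      "model_id": "Llama-2-7b-chat-hf-q4f16_1"
    }
  ]
}
```

When a model is selected, the runtime would fetch the weight shards from the repository that `model_url` points to.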

> They are due to the system prompt, as shown in the `mlc-chat-config.json`: https://huggingface.co/mlc-ai/Llama-2-7b-chat-hf-q4f16_1-MLC/blob/main/mlc-chat-config.json#L33-L34, which follows the specification of the official model release. > > If you'd...
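For context, a `mlc-chat-config.json` for a Llama-2 chat model typically wraps the system message in Llama-2's `<<SYS>>` delimiters. The fragment below is a hedged sketch of that shape; the exact keys and wording are those of the linked file, not reproduced here:

```json
{
  "conv_template": "llama-2",
  "conv_config": {
    "system": "[INST] <<SYS>>\n\n...system message text...\n\n<</SYS>>\n\n"
  }
}
```

Because the system prompt is baked into this config, the model's default behavior mirrors the official release unless the prompt is overridden.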