mlc-llm
Question: Prompt configuration
Hi, I have an M2 Pro with 16 GB and it runs really fast using Vicuna. I was wondering how to customize the prompt, temperature, context, etc.?
Thanks
To customize temperature you can modify the mlc-chat-config.json file in your model directory.
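For example, the config file contains sampling parameters you can edit directly. A minimal sketch of the relevant fields (values here are illustrative, and the exact set of keys may vary by MLC-LLM version and model):

```json
{
  "temperature": 0.7,
  "top_p": 0.95,
  "repetition_penalty": 1.0,
  "conv_template": "vicuna_v1.1"
}
```

After editing the file, restart the chat session so the new values take effect.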
Currently, you need to modify the MLC-LLM source code to customize the prompt; we will make this easier after PR #251 gets merged. Please stay tuned!
@yzh119 Thanks, I'll wait for it
Please refer to this page for detailed documentation: https://mlc.ai/mlc-llm/docs/tutorials/runtime/mlc_chat_config.html
@junrushao Thanks for the documentation