mlc-llm
Better instruction needed
I think the project provides too little information on adjusting the configuration.
For example, how do I load weights other than the ones in the demo? How do I adjust the temperature? I have no clue.
It would be better to write more instructions.
+1, agreed. In the meantime, the CLI code here: https://github.com/mlc-ai/mlc-llm/blob/main/mlc_llm/utils.py shows some available arguments that might do what you're asking for. There are also other models whose weights you can download, but they may or may not work depending on whether they've actually been implemented.
I don't think it's possible to adjust the temperature today, but maybe the library authors can chime in. For example:
mlc_chat_cli --model=dolly-v2-3b
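For readers unfamiliar with how such command-line options are typically wired up, here is a minimal, hypothetical argparse sketch in the spirit of mlc_llm/utils.py. This is not the actual mlc_llm code; the flag names (--model, --temperature) and their defaults are assumptions for illustration only.

# Hypothetical sketch of a CLI argument parser, showing how a model
# selection flag and a sampling temperature override could be exposed.
# This is NOT the actual mlc_llm code; names and defaults are assumptions.
import argparse

def build_arg_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Chat CLI (illustrative sketch)")
    # Which set of weights to load, e.g. "vicuna-v1-7b" or "dolly-v2-3b".
    parser.add_argument("--model", type=str, default="vicuna-v1-7b",
                        help="Name of the model/weights directory to load")
    # Sampling temperature; lower values make generation more deterministic.
    parser.add_argument("--temperature", type=float, default=0.7,
                        help="Sampling temperature for generation")
    return parser

if __name__ == "__main__":
    args = build_arg_parser().parse_args()
    print(f"Loading model '{args.model}' with temperature {args.temperature}")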
I agree. I am looking forward to being able to adjust some basic configuration in this project.
I agree. I hope more detailed documentation can be provided on compiling from source, including the environment, configuration, and so on.
Thank you for your suggestions. We would love to enable more users to DIY, and we will work on the instructions in the coming weeks.
Currently we are working on the documentation page: https://mlc.ai/mlc-llm/docs/. Still lots of work to do.
Closing this issue as it's a bit generic. Please do bring up specific documentation requests, as they are critically important.