llama-recipes
Android example with MLC-LLM can't build with mlc-llm nightly package on macOS x86-64
System Info
macOS x86-64
AMD GPU
Information
- [X] The official example scripts
- [ ] My own modified scripts
🐛 Describe the bug
The nightly pip packages for mlc-llm and mlc-ai on py311 macOS x86-64 are so old that the mlc_llm CLI does not include `package` as a subcommand.
Users on these machines will need to build mlc-llm from source.
I think we should update the demo instructions here to explain this clearly.
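As a quick sanity check before following the demo, one can probe the installed CLI's help text for the `package` subcommand. This is a minimal sketch; the `supports_package` helper and the idea of parsing the argparse choices list are my own illustration, not part of mlc-llm:

```python
import re
import subprocess
import sys


def subcommands_from_help(help_text: str) -> set[str]:
    """Extract the subcommand choices from argparse help output.

    Looks for the '{compile,convert_weight,...}' choices list that
    argparse prints in its usage line.
    """
    match = re.search(r"\{([\w,]+)\}", help_text)
    if not match:
        return set()
    return set(match.group(1).split(","))


def supports_package(help_text: str) -> bool:
    # The stale nightly wheels only expose compile/convert_weight/
    # gen_config/chat/serve/bench, so 'package' is missing from them.
    return "package" in subcommands_from_help(help_text)


if __name__ == "__main__":
    # Probe the installed mlc_llm CLI (requires mlc_llm to be installed).
    proc = subprocess.run(
        [sys.executable, "-m", "mlc_llm", "--help"],
        capture_output=True,
        text=True,
    )
    if supports_package(proc.stdout + proc.stderr):
        print("mlc_llm exposes `package`; the demo command should work.")
    else:
        print("mlc_llm is too old; build from source to get `package`.")
```

If the check reports a stale wheel, building mlc-llm from source (per its installation docs) is the workaround until the nightly wheels for this platform catch up.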
Error logs
command

```
python3 -m mlc_llm package --package-config mlc-package-config.json --output dist
```

output

```
------------------------- Usage -------------------------
usage: MLC LLM Command Line Interface. [-h] {compile,convert_weight,gen_config,chat,serve,bench}

positional arguments:
  {compile,convert_weight,gen_config,chat,serve,bench}
                        Subcommand to to run. (choices: compile, convert_weight, gen_config, chat, serve, bench)

options:
  -h, --help            show this help message and exit

------------------------- Error -------------------------
argument subcommand: invalid choice: 'package' (choose from 'compile', 'convert_weight', 'gen_config', 'chat', 'serve', 'bench')
```
Expected behavior
The command should run successfully and produce the package binaries.