Tianqi Chen
Due to the recent CI restart, we likely need to close and re-create the PR for CI to get retriggered.
@tvm-bot rerun
@tvm-bot rerun
@tvm-bot rerun
We migrated to an SDK-based approach where the library is packaged through the `mlc_llm package` command.
Thanks for asking, this is an interesting use case. I believe you can use OpenAI-style prompts (and just use the original llama3 conv template) via:

```python
user_prompt = f""" Generate a SQL query...
```
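To make the idea concrete, here is a minimal sketch of how such a prompt could be framed as OpenAI-style messages; the schema, wording, and variable names below are illustrative assumptions, not from the original comment:

```python
# Hypothetical table schema used only for illustration.
table_schema = "CREATE TABLE users (id INT, name TEXT, age INT);"

# Build the user prompt as a plain f-string, no special template needed.
user_prompt = f"""Generate a SQL query that selects all users older than 30,
given the following schema:

{table_schema}
"""

# Standard OpenAI-style chat messages; the llama3 conv template is applied
# server-side, so the client only supplies role/content pairs.
messages = [
    {"role": "system", "content": "You are a helpful assistant that writes SQL."},
    {"role": "user", "content": user_prompt},
]
```

A message list like this can then be sent to any OpenAI-compatible chat completions endpoint, including the one exposed by `mlc_llm serve`.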
Thanks @Faolain for confirming, glad it works.
You can use the pip install, but you still need to obtain a copy of the source code.
The error was due to a missing cmake dependency; can you install cmake via conda?
https://llm.mlc.ai/docs/deploy/android.html#additional-guides-for-windows-users Please try the tips above; just confirmed them on a Windows machine.