Tianqi Chen
We have done a refactoring pass over the components
Thank you for your input. Indeed there are a lot of things we can improve. This is the beginning of the release, so there are a lot of things...
Thank you for your suggestion; we will work on the instructions in the coming weeks. The current build.py pipeline should support the llama class, and there is WIP on other...
@yx-chan131 yes, check out https://github.com/mlc-ai/web-stable-diffusion
We have not yet tested Vulkan on WSL, but you can try it directly from a Windows terminal
Thanks for asking; we plan to release more detailed tutorials in the coming month.
At the moment TVM Unity (the main MLC pipeline we rely on) supports as many platforms as possible, so yes, it should work on Windows (the existing demo also should already...
Closing as there are no actionable items at the moment; feel free to bring up more in the community Discord
This is now supported.
This is about the DRAM requirement. Not exactly sure in your case though, since normally the Pro Max is OK (the app does place an extra DRAM requirement on the system).