
MLC-LLM Integration?

Open ArcturusMayer opened this issue 2 years ago • 0 comments

Perhaps it would be a good idea to add support for the MLC-AI team's new project, https://github.com/mlc-ai/mlc-llm, so models could run on any graphics card that supports the Vulkan API, just as llama.cpp support was added in the past. For example, I have an RX 570 with 8 GB of VRAM that supports Vulkan but is not supported by current ROCm versions, so for people in a similar situation this would be very important and useful.

ArcturusMayer avatar May 03 '23 20:05 ArcturusMayer