mlc-llm
Error: VK_ERROR_INCOMPATIBLE_DRIVER
My environment is Windows 10 with WSL2 (Ubuntu 22.04), and I followed these commands:
conda create -n mlc-chat
conda activate mlc-chat
conda install git git-lfs
conda install -c mlc-ai -c conda-forge mlc-chat-nightly
mkdir -p dist
git lfs install
git clone https://huggingface.co/mlc-ai/demo-vicuna-v1-7b-int3 dist/vicuna-v1-7b
git clone https://github.com/mlc-ai/binary-mlc-llm-libs.git dist/lib
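Before launching the CLI, it can help to confirm that both clones landed in the layout the commands above create (dist/vicuna-v1-7b and dist/lib). A minimal sketch using only the Python standard library; printing file sizes also makes incomplete git-lfs downloads easy to spot:

```python
from pathlib import Path

# Paths taken from the clone commands above.
model_dir = Path("dist/vicuna-v1-7b")
lib_dir = Path("dist/lib")

for d in (model_dir, lib_dir):
    if not d.is_dir():
        print(f"missing: {d} -- re-run the corresponding git clone")
        continue
    # Print entries with sizes so tiny git-lfs pointer stubs stand out.
    for p in sorted(d.iterdir()):
        print(f"{p}  ({p.stat().st_size} bytes)")
```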
However, when I run this command:
mlc_chat_cli
the error above occurs.
We have not yet tested Vulkan on WSL, but you can try a Windows terminal directly.
I actually had the same error in WSL. I'll try PowerShell next time.
WSL support for Vulkan is not there yet AFAIK, so please use CMD instead if you want to use Vulkan on Windows
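If you want to confirm this outside the CLI, here is a minimal Python sketch, assuming the TVM runtime shipped with the mlc-ai nightly packages is importable as tvm. On a machine without a working Vulkan driver this either prints False or raises the same VK_ERROR_INCOMPATIBLE_DRIVER, which narrows the problem to the driver/ICD rather than the model setup:

```python
import tvm

# tvm.vulkan(0) returns a Device handle; .exist asks the runtime
# whether it can actually reach a Vulkan implementation.
dev = tvm.vulkan(0)
print("Vulkan device visible:", dev.exist)
if dev.exist:
    print("Device name:", dev.device_name)
```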
I had the same error:
CentOS Linux release 8.0.1905 (Core) 4.18.0-348.7.1.el8_5.x86_64
Hey, were you able to get it working on CentOS?
tvm.error.InternalError: Traceback (most recent call last):
7: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<__mk_TVM1::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, __mk_TVM1, tvm::runtime::TVMRetValue)
6: tvm::runtime::DeviceAPIManager::GetAPI(int, bool)
5: tvm::runtime::DeviceAPIManager::GetAPI(std::__cxx11::basic_string<char, std::char_traits
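The frames above are TVM's DeviceAPIManager failing to obtain the Vulkan device API, which is what you see when no compatible Vulkan ICD is installed for the GPU. A quick way to see which backends your local runtime can actually reach (same bundled-tvm import assumption as above) is to probe each device type:

```python
import tvm

# .exist is False when the runtime has no working driver/ICD for that
# backend; a broken Vulkan ICD may instead raise the error seen above.
for name in ("vulkan", "cuda", "opencl", "cpu"):
    try:
        print(f"{name:>7}: {tvm.device(name, 0).exist}")
    except Exception as err:
        print(f"{name:>7}: error -- {err}")
```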
I'm getting the same error. How do I run this on Arch?