mlc-llm
Error: Vulkan Error, code=-3: VK_ERROR_INITIALIZATION_FAILED
mlc_chat_cli
terminate called after throwing an instance of 'tvm::runtime::InternalError'
  what(): [03:45:52] /home/runner/work/utils/utils/tvm/src/runtime/vulkan/vulkan_instance.cc:144:
An error occurred during the execution of TVM. For more information, please see: https://tvm.apache.org/docs/errors.html
  Check failed: (__e == VK_SUCCESS) is false: Vulkan Error, code=-3: VK_ERROR_INITIALIZATION_FAILED
Stack trace:
  [bt] (0) /home/xt/anaconda3/envs/mlc-chat/bin/../lib/libtvm_runtime.so(tvm::runtime::Backtrace[abi:cxx11]+0x27) [0x7fa4f1a06b77]
  [bt] (1) /home/xt/anaconda3/envs/mlc-chat/bin/../lib/libtvm_runtime.so(+0x3f375) [0x7fa4f19a4375]
  [bt] (2) /home/xt/anaconda3/envs/mlc-chat/bin/../lib/libtvm_runtime.so(tvm::runtime::vulkan::VulkanInstance::GetPhysicalDevices() const+0x3e9) [0x7fa4f1af20a9]
  [bt] (3) /home/xt/anaconda3/envs/mlc-chat/bin/../lib/libtvm_runtime.so(tvm::runtime::vulkan::VulkanDeviceAPI::VulkanDeviceAPI()+0x13f) [0x7fa4f1aefe2f]
  [bt] (4) /home/xt/anaconda3/envs/mlc-chat/bin/../lib/libtvm_runtime.so(tvm::runtime::vulkan::VulkanDeviceAPI::Global()+0x4c) [0x7fa4f1af00cc]
  [bt] (5) /home/xt/anaconda3/envs/mlc-chat/bin/../lib/libtvm_runtime.so(+0x18b10d) [0x7fa4f1af010d]
  [bt] (6) /home/xt/anaconda3/envs/mlc-chat/bin/../lib/libtvm_runtime.so(+0x6bf04) [0x7fa4f19d0f04]
  [bt] (7) /home/xt/anaconda3/envs/mlc-chat/bin/../lib/libtvm_runtime.so(+0x6c4a7) [0x7fa4f19d14a7]
  [bt] (8) mlc_chat_cli(+0xe4f0) [0x55772fb6f4f0]
My GPU is an NVIDIA A100. Does mlc-llm support the A100? Or which version of Vulkan should I install?
I believe Vulkan is supported according to #15. On the other hand, the A100 is an extremely powerful GPU, so why not simply run Hugging Face's PyTorch models directly?
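If you do want to stay on the Vulkan backend, note that the trace above fails inside VulkanInstance::GetPhysicalDevices(), i.e. before any model code runs, so it is worth checking whether TVM's runtime can see the GPU at all. Here is a minimal sketch using the TVM Python package, assuming it is installed in the same conda environment and that the NVIDIA driver exposes a Vulkan ICD (attribute availability can vary by TVM version):

```python
import tvm

# Probe the Vulkan backend that mlc_chat_cli relies on.
dev = tvm.vulkan(0)
print("Vulkan device visible to TVM:", dev.exist)

if dev.exist:
    # device_name is a standard device attribute in recent TVM builds.
    print("Device name:", dev.device_name)
else:
    # If this prints False, the problem is below TVM: check `vulkaninfo`
    # and whether the NVIDIA Vulkan ICD is installed on that machine.
    print("No Vulkan device found; check the driver / Vulkan ICD installation.")
```

If this reports no device while `vulkaninfo` does list the A100, the mismatch is likely in the environment the CLI runs in rather than in mlc-llm itself.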