llama.cpp
Misc. bug: Vulkan is not optional at runtime
Name and Version
With llama.cpp built from git tag b4549, if Vulkan support is compiled in and the binary is then run in an environment where Vulkan is not available (for example, when a Linux distribution ships llama.cpp with Vulkan enabled but the user's GPU has no Vulkan driver), it aborts with the following uncaught exception:
terminate called after throwing an instance of 'vk::IncompatibleDriverError'
what(): vk::createInstance: ErrorIncompatibleDriver
It would be better to detect this case at runtime, disable the Vulkan backend, and fall back to the CPU backend instead of crashing.
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
libllama (core library)
Command line
Any llama-cli command (yes, even `llama-cli -dev none`).
Problem description & steps to reproduce
- Compile with GGML_VULKAN=ON
- Run without GPU
- The binary terminates with the uncaught `vk::IncompatibleDriverError` exception shown above
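For reference, the steps above correspond roughly to the following build recipe (a sketch, assuming the standard CMake build; the model path is a placeholder and the commands are meant to be run on a machine without a Vulkan driver):

```shell
# Build with the Vulkan backend enabled
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release

# Run on a machine with no Vulkan-capable GPU/driver;
# crashes even when explicitly selecting no device
./build/bin/llama-cli -dev none -m model.gguf -p "hello"
```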
First Bad Commit
No response
Relevant log output