llama-server with CPU device is not working in the Docker image
```yaml
services:
  tabby:
    restart: always
    image: tabbyml/tabby
    entrypoint: /opt/tabby/bin/tabby-cpu
    command: serve --model StarCoder-1B --chat-model Qwen2-1.5B-Instruct
    volumes:
      - ".data/tabby:/data"
    ports:
      - 8080:8080
```
This setup, which is documented at https://tabby.tabbyml.com/docs/quick-start/installation/docker-compose/, does not work:
```
tabby-1 | 2024-07-13T12:53:36.624504Z WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:99: llama-server <embedding> exited with status code 127
tabby-1 | 2024-07-13T12:53:36.624528Z WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:111: <embedding>: /opt/tabby/bin/llama-server: error while loading shared libraries: libcuda.so.1: cannot open shared object file: No such file or directory
```
Originally posted by @b-reich in https://github.com/TabbyML/tabby/issues/2082#issuecomment-2226889985
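
Exit code 127 together with the loader message suggests that the bundled `/opt/tabby/bin/llama-server` binary is dynamically linked against `libcuda.so.1` (the NVIDIA driver library), which is not present in a CPU-only container, so the CPU entrypoint cannot start the embedding server. A minimal diagnostic sketch, assuming `ldd` is available inside the `tabbyml/tabby` image, is to temporarily override the entrypoint and inspect the binary's shared-library dependencies:

```yaml
# Hypothetical diagnostic override, not a fix: assumes `ldd` ships in the image.
# It prints llama-server's shared-library dependencies; on a host without the
# NVIDIA driver, libcuda.so.1 is expected to appear as "not found".
services:
  tabby:
    image: tabbyml/tabby
    entrypoint: ["ldd", "/opt/tabby/bin/llama-server"]
```

Running `docker compose up` with this override and reading the container output should confirm whether the missing `libcuda.so.1` is the unresolved dependency behind the crash.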