Mac M1 cannot start Tabby
Describe the bug
I followed the guide in this doc: https://tabby.tabbyml.com/docs/quick-start/installation/apple/. After installing Tabby with brew, I ran the command below:
tabby serve --device metal --model StarCoder-1B
The terminal appears frozen and shows no log output at all.
Information about your version
tabby 0.12.0
Information about your GPU
macOS, Apple M1
Hi, can you run the command again with the following env vars?
RUST_LOG=debug RUST_BACKTRACE=1 tabby serve ...
@wsxiaoys I ran the command again with those env vars. It produces many logs, and I found that it retries starting llama-server many times:
2024-06-14T08:09:54.319865Z WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:88: llama-server exited with status code -1, restarting...
@wsxiaoys do you have any idea for it?
I have the same problem when running this command:
RUST_LOG=debug RUST_BACKTRACE=1 tabby serve --device metal --model StarCoder-1B --port 9823
output
2024-06-20T09:55:00.350736Z DEBUG hyper_util::client::legacy::connect::http: /Users/runner/.cargo/registry/src/index.crates.io-6f17d22bba15001f/hyper-util-0.1.5/src/client/legacy/connect/http.rs:631: connecting to 127.0.0.1:7890
2024-06-20T09:55:00.351082Z DEBUG hyper_util::client::legacy::connect::http: /Users/runner/.cargo/registry/src/index.crates.io-6f17d22bba15001f/hyper-util-0.1.5/src/client/legacy/connect/http.rs:634: connected to 127.0.0.1:7890
2024-06-20T09:55:01.187690Z WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:88: llama-server exited with status code -1, restarting...
happens to me too with 0.12.0, I see a single log line with "Waiting for llama-server to start"
and then a ton of these:
connecting to 127.0.0.1:30888
starting new connection: http://127.0.0.1:30888/
I encountered this issue too. Is there any progress on it?
so for me this was the solution:
- upgrade to recent version of tabby (I saw release notes about better logging)
- realize there are indeed new logs, see new errors from llama.cpp
- google the errors for a while
- find the error Symbol not found: (_cblas_sgemm$NEWLAPACK$ILP64) and see someone recommending upgrading macOS
- upgrade to 14.5
- problem solved!
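For anyone hitting the same symbol error: the missing _cblas_sgemm$NEWLAPACK$ILP64 symbol belongs to Apple's Accelerate framework and only exists on newer macOS releases, so checking the OS version before launching tabby can save you from the llama-server restart loop. This is just a sketch based on this thread; the version_ge helper and the 14.5 threshold are my assumptions, not anything shipped with Tabby:

```shell
#!/bin/sh
# version_ge A B -> succeeds (exit 0) if version A >= version B.
# Uses `sort -V` (version sort): if B sorts first, A is at least B.
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# On a real Mac you would read the version with: current="$(sw_vers -productVersion)"
current="14.5"   # example value; substitute the sw_vers output on macOS

if version_ge "$current" "14.5"; then
  echo "ok: macOS $current should include the ILP64 Accelerate symbols"
else
  echo "macOS $current predates 14.5 - consider upgrading before running tabby"
fi
```

On macOS you'd replace the hardcoded `current` with the `sw_vers -productVersion` output; `sort -V` handles multi-component versions like 14.4.1 correctly, which a plain string comparison would not.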
Hi @Madd0g - thanks for sharing the process, glad the improved logging actually helped you locate the issue :)