
Cannot run on Linux with Docker

zhouqian177 opened this issue 11 months ago · 5 comments

When I try on my Windows PC, it works:

```
docker run --entrypoint /opt/tabby/bin/tabby-cpu -it -p 18080:8080 -v D:\BaiduNetdiskDownload\TabbyML\tabbyml-chatWizardCoder-3B\tabbyml-chatWizardCoder-3B.tabby:/data tabbyml/tabby serve --model TabbyML/WizardCoder-3B
2024-03-18T13:44:19.248524Z  INFO tabby::serve: crates/tabby/src/serve.rs:118: Starting server, this might take a few minutes...
2024-03-18T13:46:34.947359Z  INFO tabby::routes: crates/tabby/src/routes/mod.rs:35: Listening at 0.0.0.0:8080
```

But when I try on my Linux server, with several different commands, it still doesn't work:

  1. With the GPU and `--model`, it panics:

     ```
     docker run -it --gpus all -p 18080:8080 -v /data10/zhouqian/software/tabby/tabbyml-chatWizardCoder-3B/.tabby:/data tabbyml/tabby serve --model TabbyML/WizardCoder-3B
     thread 'main' panicked at crates/tabby-common/src/registry.rs:82:51:
     called `Result::unwrap()` on an `Err` value: Error(TrackableError { kind: Other, cause: Some(Cause(Os { code: 13, kind: PermissionDenied, message: "Permission denied" })), history: History([Location { module_path: "serdeconv::convert_json", file: "/root/.cargo/registry/src/index.crates.io-6f17d22bba15001f/serdeconv-0.4.1/src/convert_json.rs", line: 53, message: "" }]) })
     note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
     ```

  2. Without the `--model` parameter, the server starts:

     ```
     docker run -it --rm --gpus all -p 18080:8080 -v /data10/zhouqian/software/tabby/tabbyml-chatWizardCoder-3B/.tabby:/data tabbyml/tabby serve --device cuda
     2024-03-18T13:00:47.764568Z  INFO tabby::serve: crates/tabby/src/serve.rs:118: Starting server, this might take a few minutes...
     2024-03-18T13:00:48.071987Z  INFO tabby::routes: crates/tabby/src/routes/mod.rs:35: Listening at 0.0.0.0:8080
     ```

     But it doesn't work either: I tried the Tabby extension in VS Code, without success.

  3. With the CPU entrypoint:

     ```
     docker run --entrypoint /opt/tabby/bin/tabby-cpu -it --rm -p 18080:8080 -v /data10/zhouqian/software/models/tabbyml-cpu/.tabby:/data tabbyml/tabby serve
     2024-03-18T12:52:53.576451Z  INFO tabby::serve: crates/tabby/src/serve.rs:118: Starting server, this might take a few minutes...
     2024-03-18T12:52:53.848521Z  INFO tabby::routes: crates/tabby/src/routes/mod.rs:35: Listening at 0.0.0.0:8080
     ```

     The CPU case behaves the same as 2) above: the server starts, but completion doesn't work.

  4. Then I went into the container with `docker exec -it {tabby_container_id} /bin/bash` and set `RUST_BACKTRACE=1`:

```
tabby serve --model TabbyML/WizardCoder-3B
thread 'main' panicked at crates/tabby-common/src/registry.rs:82:51:
called `Result::unwrap()` on an `Err` value: Error(TrackableError { kind: Other, cause: Some(Cause(Os { code: 13, kind: PermissionDenied, message: "Permission denied" })), history: History([Location { module_path: "serdeconv::convert_json", file: "/root/.cargo/registry/src/index.crates.io-6f17d22bba15001f/serdeconv-0.4.1/src/convert_json.rs", line: 53, message: "" }]) })
stack backtrace:
   0: rust_begin_unwind
             at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:645:5
   1: core::panicking::panic_fmt
             at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/panicking.rs:72:14
   2: core::result::unwrap_failed
             at ./rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/result.rs:1649:5
   3: tabby_common::registry::ModelRegistry::save_model_info
   4: tabby_download::download_model::{{closure}}
   5: tabby::services::model::download_model_if_needed::{{closure}}
   6: tabby::main::{{closure}}
   7: tokio::runtime::park::CachedParkThread::block_on
   8: tabby::main
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
```

Then I set `RUST_BACKTRACE=full`:

```
tabby serve --model TabbyML/WizardCoder-3B
thread 'main' panicked at crates/tabby-common/src/registry.rs:82:51:
called `Result::unwrap()` on an `Err` value: Error(TrackableError { kind: Other, cause: Some(Cause(Os { code: 13, kind: PermissionDenied, message: "Permission denied" })), history: History([Location { module_path: "serdeconv::convert_json", file: "/root/.cargo/registry/src/index.crates.io-6f17d22bba15001f/serdeconv-0.4.1/src/convert_json.rs", line: 53, message: "" }]) })
stack backtrace:
   0: 0x561ad4276b56 - std::backtrace_rs::backtrace::libunwind::trace::hbee8a7973eeb6c93
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/../../backtrace/src/backtrace/libunwind.rs:104:5
   1: 0x561ad4276b56 - std::backtrace_rs::backtrace::trace_unsynchronized::hc8ac75eea3aa6899
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/../../backtrace/src/backtrace/mod.rs:66:5
   2: 0x561ad4276b56 - std::sys_common::backtrace::_print_fmt::hc7f3e3b5298b1083
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/sys_common/backtrace.rs:68:5
   3: 0x561ad4276b56 - <std::sys_common::backtrace::_print::DisplayBacktrace as core::fmt::Display>::fmt::hbb235daedd7c6190
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/sys_common/backtrace.rs:44:22
   4: 0x561ad42a78a0 - core::fmt::rt::Argument::fmt::h76c38a80d925a410
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/fmt/rt.rs:142:9
   5: 0x561ad42a78a0 - core::fmt::write::h3ed6aeaa977c8e45
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/fmt/mod.rs:1120:17
   6: 0x561ad427374f - std::io::Write::write_fmt::h78b18af5775fedb5
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/io/mod.rs:1810:15
   7: 0x561ad4276934 - std::sys_common::backtrace::_print::h5d645a07e0fcfdbb
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/sys_common/backtrace.rs:47:5
   8: 0x561ad4276934 - std::sys_common::backtrace::print::h85035a511aafe7a8
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/sys_common/backtrace.rs:34:9
   9: 0x561ad42781b7 - std::panicking::default_hook::{{closure}}::hcce8cea212785a25
  10: 0x561ad4277f19 - std::panicking::default_hook::hf5fcb0f213fe709a
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:292:9
  11: 0x561ad4278648 - std::panicking::rust_panic_with_hook::h095fccf1dc9379ee
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:779:13
  12: 0x561ad4278522 - std::panicking::begin_panic_handler::{{closure}}::h032ba12139b353db
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:657:13
  13: 0x561ad4277056 - std::sys_common::backtrace::__rust_end_short_backtrace::h9259bc2ff8fd0f76
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/sys_common/backtrace.rs:171:18
  14: 0x561ad4278280 - rust_begin_unwind
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:645:5
  15: 0x561ad2c82ce5 - core::panicking::panic_fmt::h784f20a50eaab275
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/panicking.rs:72:14
  16: 0x561ad2c83233 - core::result::unwrap_failed::h03d8a5018196e1cd
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/result.rs:1649:5
  17: 0x561ad3a7ab47 - tabby_common::registry::ModelRegistry::save_model_info::heb5929183cb617e0
  18: 0x561ad2fd6cca - tabby_download::download_model::{{closure}}::h82dec901783a1f74
  19: 0x561ad301688c - tabby::services::model::download_model_if_needed::{{closure}}::hd6b3c5bd27cec85f
  20: 0x561ad301910c - tabby::main::{{closure}}::heac96f85fea81dac
  21: 0x561ad3009059 - tokio::runtime::park::CachedParkThread::block_on::h3dfb3ecee383e3d1
  22: 0x561ad2e920e5 - tabby::main::hafb0c6f339205a49
  23: 0x561ad2facae3 - std::sys_common::backtrace::__rust_begin_short_backtrace::h56a3fa330a664b1d
  24: 0x561ad2facaf9 - std::rt::lang_start::{{closure}}::h4573bab78175c32b
  25: 0x561ad426a051 - core::ops::function::impls::<impl core::ops::function::FnOnce<A> for &F>::call_once::h37600b1e5eea4ecd
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/core/src/ops/function.rs:284:13
  26: 0x561ad426a051 - std::panicking::try::do_call::hb4bda49fa13a0c2b
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:552:40
  27: 0x561ad426a051 - std::panicking::try::h8bbf75149211aaaa
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:516:19
  28: 0x561ad426a051 - std::panic::catch_unwind::h8c78ec68ebea34cb
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panic.rs:142:14
  29: 0x561ad426a051 - std::rt::lang_start_internal::{{closure}}::hffdf44a19fd9e220
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/rt.rs:148:48
  30: 0x561ad426a051 - std::panicking::try::do_call::hcb3194972c74716d
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:552:40
  31: 0x561ad426a051 - std::panicking::try::hcdc6892c5f0dba4c
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panicking.rs:516:19
  32: 0x561ad426a051 - std::panic::catch_unwind::h4910beb4573f4776
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/panic.rs:142:14
  33: 0x561ad426a051 - std::rt::lang_start_internal::h6939038e2873596b
             at /rustc/07dca489ac2d933c78d3c5158e3f43beefeb02ce/library/std/src/rt.rs:148:20
  34: 0x561ad2e92e45 - main
  35: 0x7f8caf1bfd90 -
  36: 0x7f8caf1bfe40 - __libc_start_main
  37: 0x561ad2c93725 - _start
  38: 0x0 -
```
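The trace bottoms out in `ModelRegistry::save_model_info`, a JSON write under `/data`, so the panic is almost certainly the bind-mounted host directory rejecting writes. A minimal host-side check, assuming the mount path from the commands above; the `check_data_dir` helper is hypothetical, not part of tabby:

```shell
# Hypothetical helper: report ownership and writability of the host
# directory that gets bind-mounted as /data inside the container.
check_data_dir() {
  dir="$1"
  if [ ! -d "$dir" ]; then
    echo "missing"
    return 1
  fi
  # Ownership and mode, printed to stderr so stdout stays machine-readable.
  stat -c 'owner=%U(%u) group=%G(%g) mode=%a' "$dir" >&2
  if touch "$dir/.write-test" 2>/dev/null; then
    rm -f "$dir/.write-test"
    echo "writable"
  else
    # Possible fixes: chmod -R u+rwX "$dir", or chown it to the uid the
    # container process runs as; on NFS mounts, root_squash can also bite.
    echo "not-writable"
  fi
}

# Path taken from the failing docker command; adjust to your setup.
check_data_dir /data10/zhouqian/software/tabby/tabbyml-chatWizardCoder-3B/.tabby || true
```

If this reports `not-writable` (or `writable` as your user but the container still fails), the mismatch is between the directory's owner and the uid the container process runs as.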

zhouqian177 · Mar 18, 2024

Starting up needs to download some files from GitHub.

Thanks @chauncyAAA. I can download files from GitHub, but I still can't start the Tabby server.

```
git clone https://github.com/TabbyML/tabby
Cloning into 'tabby'...
remote: Enumerating objects: 22844, done.
remote: Counting objects: 100% (246/246), done.
remote: Compressing objects: 100% (172/172), done.
remote: Total 22844 (delta 139), reused 131 (delta 69), pack-reused 22598
Receiving objects: 100% (22844/22844), 29.17 MiB | 21.38 MiB/s, done.
Resolving deltas: 100% (13498/13498), done.
Updating files: 100% (1039/1039), done.
Filtering content: 100% (70/70), 20.58 MiB | 3.81 MiB/s, done.
```

It still doesn't work:

```
docker run -it --gpus all -p 18080:8080 -v /data10/zhouqian/software/tabby/tabbyml-chatWizardCoder-3B/.tabby:/data tabbyml/tabby serve --model TabbyML/WizardCoder-3B
thread 'main' panicked at crates/tabby-common/src/registry.rs:82:51:
called `Result::unwrap()` on an `Err` value: Error(TrackableError { kind: Other, cause: Some(Cause(Os { code: 13, kind: PermissionDenied, message: "Permission denied" })), history: History([Location { module_path: "serdeconv::convert_json", file: "/root/.cargo/registry/src/index.crates.io-6f17d22bba15001f/serdeconv-0.4.1/src/convert_json.rs", line: 53, message: "" }]) })
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
```
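One Linux-specific cause of `Permission denied` on an otherwise accessible bind mount is SELinux in enforcing mode (common on RHEL/CentOS/Fedora hosts): writes from the container are denied regardless of file ownership. Docker's `:z`/`:Z` volume options relabel the mounted content so the container may access it. A hedged sketch of checking for this:

```shell
# Is SELinux enforcing on this host? (getenforce exists on SELinux systems)
if command -v getenforce >/dev/null 2>&1; then
  getenforce
else
  echo "SELinux tools not installed; probably not the cause"
fi

# If it prints "Enforcing", retry the original command with the :z volume
# option so docker relabels the mounted content for container access:
# docker run -it --gpus all -p 18080:8080 \
#   -v /data10/zhouqian/software/tabby/tabbyml-chatWizardCoder-3B/.tabby:/data:z \
#   tabbyml/tabby serve --model TabbyML/WizardCoder-3B
```

Use `:Z` instead of `:z` if the directory should be private to this one container.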

zhouqian177 · Mar 19, 2024

I get this issue when running `docker run -it --entrypoint /opt/tabby/bin/tabby-cpu -v $HOME/.tabby:/data tabbyml/tabby scheduler --now`, but not when running `docker run -it --gpus all -p 8080:8080 -v $HOME/.tabby:/data tabbyml/tabby serve --model StarCoder-1B --device cuda`.

```
thread 'main' panicked at /root/workspace/crates/tabby-scheduler/src/lib.rs:23:14:
Must be able to retrieve repositories for sync: Config file '/data/config.toml' is missing or not valid

Caused by:
    Other (cause; No such file or directory (os error 2))
    HISTORY:
      [0] at /root/.cargo/registry/src/index.crates.io-6f17d22bba15001f/serdeconv-0.4.1/src/convert_toml.rs:17

note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
```

Running `scheduler` without the `--now` flag worked, though. Perhaps it's something to do with how the flags are parsed?

Geremia · May 28, 2024

I fixed this issue by putting the config file in the proper place: `~/.tabby/config.toml`
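For reference, a minimal sketch of creating that file. The `[[repositories]]`/`git_url` layout matches my reading of the Tabby configuration docs at the time, so verify the field names against the current documentation; the repository name and URL are placeholders:

```shell
# Write a minimal config.toml into the directory that is mounted as /data
# (~/.tabby on the host by default). Field names are assumptions from the
# Tabby docs of that era; the repository URL is a placeholder.
TABBY_DATA="${TABBY_DATA:-$HOME/.tabby}"
mkdir -p "$TABBY_DATA"
cat > "$TABBY_DATA/config.toml" <<'EOF'
# Repositories for `tabby scheduler` to index.
[[repositories]]
name = "my-project"
git_url = "https://github.com/example/my-project.git"
EOF
echo "wrote $TABBY_DATA/config.toml"
```

With the file in place, the `scheduler --now` invocation above has a config to read at `/data/config.toml` inside the container.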

Geremia · May 28, 2024

I have a problem where it seems to be trying to access the GPU even though I explicitly ran the CPU version. Here is one sample warning message:

```
⠸     0.243 s   Starting...2024-08-14T15:51:32.862384Z  WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:96: llama-server <embedding> exited with status code 127
2024-08-14T15:51:32.862604Z  WARN llama_cpp_server::supervisor: crates/llama-cpp-server/src/supervisor.rs:108: <embedding>: /opt/tabby/bin/llama-server: error while loading shared libraries: libcuda.so.1: cannot open shared object file: No such file or directory
```

It says the libcuda library failed to load. But why does the CPU version need CUDA libraries?
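One way to see which shared libraries a binary expects is `ldd`; a dependency reported as `not found` is exactly the failure mode in the warning above. A sketch, demonstrated on a local binary, with the equivalent in-image check shown commented out (the image and binary path are taken from the log, but the docker invocation itself is an untested assumption):

```shell
# ldd lists the shared libraries a binary was linked against; an entry
# marked "not found" matches the "cannot open shared object file" error.
if command -v ldd >/dev/null 2>&1; then
  ldd /bin/sh | head -n 5
else
  echo "ldd not available on this system"
fi

# Inside the image, the equivalent check would be (not run here):
# docker run --rm --entrypoint /bin/sh tabbyml/tabby \
#   -c 'ldd /opt/tabby/bin/llama-server | grep -E "cuda|not found"'
```

If `libcuda.so.1` shows up in that list, the bundled `llama-server` was linked against CUDA even though the CPU entrypoint was used.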

barisbogdan · Aug 14, 2024

Hi @barisbogdan, this is a known limitation of llama.cpp; for a workaround, please check out https://github.com/TabbyML/tabby/discussions/2867

wsxiaoys · Aug 14, 2024