Bowen
Hi @AI-Zebra. The logs show that LocalAI cannot load the model successfully, and the model does not exist: `stat /build/models/gpt-3.5-turbo: no such file or directory`.
Have you checked whether the model already exists under the `models/` path?
I'd be happy if @mudler could share more details here.
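For reference, a quick way to verify this before digging further. This is a minimal sketch: the `check_model` helper is hypothetical, and the path/name are taken from the error message in the logs.

```shell
# check_model MODEL_DIR MODEL_NAME: report whether the model file exists.
check_model() {
    if [ -f "$1/$2" ]; then
        echo "model found: $1/$2"
    else
        echo "model missing: $1/$2"
    fi
}

# The path and file name from the error message in the logs:
check_model /build/models gpt-3.5-turbo
```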
Hi guys, thanks for your discussion; I got some useful info, cool. In my case, I have medical data with several vital signs for multiple patients, so the series are multivariate...
The progress of the Rust backend:
- [x] The basic framework of the Rust gRPC backend (there may still be some issues, to be fixed in later commits)
- [ ] Implement with burn (working...
An idea of choosing default burn backend for Rust backend #1219
I got stuck on some issues like the one below (only in debug mode); it may be related to the Rust bindings for the PyTorch C++ API.

```bash
dyld[15803]: Library not loaded: @rpath/libtorch_cpu.dylib
  Referenced from: /Users/tifa/Downloads/workspace/LocalAI/backend/rust/target/debug/deps/server-bc3eca19368e3b4a
  Reason:...
```
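In case it helps anyone hitting the same `dyld` error: the usual workaround for libtorch-backed Rust crates is to point the dynamic loader at the libtorch libraries before running the binary. A sketch, assuming libtorch is unpacked under `~/libtorch` (the path is an assumption, adjust for your machine):

```shell
# Assumption: libtorch is unpacked at ~/libtorch -- change this as needed.
export LIBTORCH="$HOME/libtorch"
# macOS uses DYLD_LIBRARY_PATH; on Linux the equivalent is LD_LIBRARY_PATH.
export DYLD_LIBRARY_PATH="$LIBTORCH/lib:$DYLD_LIBRARY_PATH"

echo "loader path: $DYLD_LIBRARY_PATH"
```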
> it seems to look for libtorch and fails to find it. if you use the ndarray backend does it work?

I will try it and report back. # Update...
> On the M1 probably the wgpu backend is the nicest to use, but ndarray is the one that does not depend on the host system.

Thanks a lot. I...
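One way to make that choice configurable rather than hard-coded is to expose each burn backend as a cargo feature. A sketch of what the backend crate's `Cargo.toml` could look like; the feature names and mapping are assumptions, not the actual file:

```toml
# Hypothetical feature layout for the Rust backend crate:
[features]
default = ["ndarray"]          # no host-system dependency, works everywhere
ndarray = ["burn/ndarray"]     # pure-Rust CPU backend
wgpu    = ["burn/wgpu"]        # GPU backend, a good fit for Apple Silicon
tch     = ["burn/tch"]         # libtorch-backed, needs libtorch on the host
```

With a layout like this, `cargo build --no-default-features --features wgpu` would select the wgpu backend on an M1 while CI could stick to the default ndarray backend.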
The error message in the log is `/opt/conda/lib/libcurl:no version information available(required by /usr/bin/cmake)`. It indicates that version information is not available in the `libcurl` library that `cmake` is loading...
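This usually means the conda `libcurl` is shadowing the system one that `cmake` was linked against. One workaround is to make the loader prefer the system library directory; a sketch, assuming the usual Debian/Ubuntu library path (an assumption, adjust for your distro):

```shell
# Put the system library directory ahead of /opt/conda/lib so cmake
# picks up the libcurl it was linked against.
export LD_LIBRARY_PATH="/usr/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH"

echo "loader path: $LD_LIBRARY_PATH"
```

Alternatively, removing `/opt/conda/lib` from `LD_LIBRARY_PATH` entirely (or deactivating the conda environment before building) avoids the shadowing in the first place.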