Remember

Results 18 comments of Remember

> Try enabling a virtual environment; it can be turned on from the page.

I enabled a virtual environment and it still errors. I also tried loading the model manually with python -c "from sentence_transformers import SentenceTransformer; model = SentenceTransformer('BAAI/bge-m3'); print('Model loaded successfully')", which succeeded, but the model still cannot be loaded in xinference.

Traceback (most recent call last):
  File "/data/conda/envs/xinference_env/lib/python3.11/site-packages/xinference/core/utils.py", line 93, in wrapped
    ret = await func(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/data/conda/envs/xinference_env/lib/python3.11/site-packages/xinference/core/worker.py", line 1140, in launch_builtin_model
    await model_ref.load()
  File "/data/conda/envs/xinference_env/lib/python3.11/site-packages/xoscar/backends/context.py", line 262, in...
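Since the model loads fine through sentence_transformers directly, the failure is on the Xinference side rather than in the model files. As a minimal sketch (assuming Xinference's Python client API, xinference.client.Client.launch_model, and a placeholder endpoint URL), launching the same embedding model through the client looks like:

```python
def launch_bge_m3(client):
    """Launch the BAAI/bge-m3 embedding model via an Xinference client.

    `client` is expected to expose launch_model() like
    xinference.client.Client (e.g. Client("http://127.0.0.1:9997")).
    Returns whatever launch_model returns (normally the model UID).
    """
    return client.launch_model(model_name="bge-m3", model_type="embedding")
```

If this raises the same error, comparing the worker process's Python environment with the one where the manual import succeeded is a reasonable next step.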

I just solved this problem. It happens because RAGFlow and Ollama are on different IPs. You just need to re-add the model under Model Providers and use your host's IP (http://ip:11434).

You can solve this issue with the following steps:

sudo nano /etc/systemd/system/ollama.service
(add Environment="OLLAMA_HOST=0.0.0.0" under the [Service] section)
sudo systemctl daemon-reload
sudo systemctl restart ollama
sudo systemctl status ollama
sudo netstat -tuln | grep 11434

Test: curl http://:11434
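After applying the steps above, it helps to confirm the Ollama port is actually reachable from the machine running RAGFlow before re-adding the provider. A minimal sketch (the port_open helper is illustrative, not part of Ollama or RAGFlow; the IP below is a placeholder):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check the default Ollama port on a host (placeholder IP)
# port_open("192.168.1.10", 11434)
```

If this returns False after the systemd change, the usual culprits are a firewall rule or the service still binding to 127.0.0.1.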

Hi, could you share your Python file that uses the HTTP API?

How did you solve the login issue when using the HTTP API? I can't obtain the response data. Please help me, thanks!

> What's the version of the docker image?

Actually, it is the dev version. Which version is more reliable?