Sailfish
After deploying the container following the tutorial, I still cannot connect to the local Ollama port. Is some special configuration required? I have confirmed that Ollama is running and the service is healthy.
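A common cause of this symptom is that `localhost` inside a container refers to the container itself, not the host machine, so a URL like `http://localhost:11434` never reaches an Ollama server running on the host. A minimal sketch of the usual workarounds, assuming Ollama is on its default port 11434 (the `docker run` flags shown are standard Docker options, not anything specific to this tutorial):

```shell
# Inside a container, "localhost" is the container itself, so an Ollama
# server on the host is NOT reachable at http://localhost:11434.

# Option 1: map host.docker.internal to the host gateway (needed on Linux;
# Docker Desktop on macOS/Windows provides this name automatically),
# then configure the app to use http://host.docker.internal:11434
docker run --add-host=host.docker.internal:host-gateway ...

# Option 2 (Linux only): share the host's network namespace, so
# http://localhost:11434 works from inside the container
docker run --network host ...

# Also ensure Ollama listens on all interfaces, not just 127.0.0.1,
# so that connections from the container's network are accepted:
OLLAMA_HOST=0.0.0.0 ollama serve
```

If the container still cannot connect, checking from inside the container with `curl http://host.docker.internal:11434` helps distinguish a networking problem from an application configuration problem.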
### What problem does this PR solve? Add automation scripts to support displaying environment information such as RAGFlow repository version, operating system, Python version, etc. in a Linux environment for...
### Is there an existing issue for the same feature request? - [X] I have checked the existing issues. ### Is your feature request related to a problem? ```Markdown When...
### Is there an existing issue for the same bug? - [X] I have checked the existing issues. ### Branch name main ### Commit ID 2afe7a7 ### Other environment information...
No matter what value I change it to, only a very short snippet of content is generated, as shown in the attached screenshot. Deployed on 4 GPUs; launch command: python run_demo.py --model-path "/home/dl/data/codegeex2-6b-model" --n-gpus 4
I have completed the environment setup as required, but running it according to the documentation "How to deploy a local demo?" raises the following exception. (internvl) yushen@user-MS-7E06:~/ai/InternVL/internvl_chat_llava$ python -m llava.serve.gradio_web_server --controller http://0.0.0.0:10000 --model-list-mode reload --device auto 2024-05-24 15:14:59 | ERROR | stderr | Traceback (most recent call...