[Bug] Error when calling the ollama embedding model from lobechat
📦 Deployment environment
Docker
📌 Software version
1.55.4
💻 System environment
Windows
🌐 Browser
Edge
🐛 Problem description
An error occurs when calling ollama's embedding model.
📷 Reproduction steps
.env
OLLAMA_PROXY_URL=http://xxx:11434/v1
DEFAULT_FILES_CONFIG=embedding_model=ollama/bge-m3:latest
Calling the ollama chat model from the web UI works fine, but vectorization fails:
lobe-chat | message: '{"error":{"message":"Error 429: Too Many Requests","name":"ResponseError","status_code":429},"errorType":"OllamaBizError","provider":"ollama"}',
lobe-chat | name: 'EmbeddingError'
lobe-chat | }
🚦 Expected results
No response
📝 Supplementary information
No response
👀 @agluo
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.
Bot detected the issue body's language is not English; translated automatically.
📦 Deployment environment
Docker
📌 Software version
1.55.4
💻 System environment
Windows
🌐 Browser
Edge
🐛 Problem description
An error occurred when calling ollama's embedding model.
📷 Reproduction steps
.env
OLLAMA_PROXY_URL=http://xxx:11434
DEFAULT_FILES_CONFIG=embedding_model=ollama/bge-m3:latest
Calling the ollama chat model works normally, but vectorization fails:
lobe-chat | embeddingChunks error {
lobe-chat | message: '{"error":{"message":"[object Object]","name":"ResponseError","status_code":404},"errorType":"OllamaBizError","provider":"ollama"}',
lobe-chat | name: 'EmbeddingError'
lobe-chat | }
🚦 Expected results
No response
📝 Supplementary information
No response
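Since the log above shows a 404 from Ollama, one quick thing to rule out before touching proxy settings is whether the embedding model is actually present on the Ollama host; Ollama typically answers 404 when a requested model is missing. With the standard ollama CLI and the model tag from the config above:
ollama list
ollama pull bge-m3:latest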
@agluo Hello! I'm Dosu, a bot that can help you solve problems, answer questions, and become a contributor. I'll be helping you while you wait for a human maintainer.
The issue you're encountering with the embedding model call to Ollama resulting in a 404 error is likely due to an incorrect proxy URL configuration. This is a known issue where the URL configuration can lead to errors in the application [1].
To resolve this, you should remove the OLLAMA_PROXY_URL configuration from your settings, as it can cause URL construction errors. If you need to use a proxy, ensure that the URL is correctly formatted and includes '/v1' at the end, like so:
OLLAMA_PROXY_URL=http://[your-ip-address]:11434/v1
DEFAULT_FILES_CONFIG=embedding_model=ollama/bge-m3:latest
Make sure to verify the URL format to prevent similar issues. This should help resolve the 404 error when using the embedding model.
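A quick way to sanity-check that suggestion before restarting the app: Ollama also exposes an OpenAI-compatible API under /v1, so a curl against its embeddings route (placeholder host and the model tag from this thread) shows whether the /v1 base URL is reachable at all:
curl http://xxx:11434/v1/embeddings -H "Content-Type: application/json" -d '{"model": "bge-m3:latest", "input": "hello"}'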
For Docker deployments, add this flag and try again: --add-host=host.docker.internal:host-gateway
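For context, that flag slots into the container launch roughly like this (a minimal sketch: the lobehub/lobe-chat image and port 3210 are the commonly documented defaults and may differ from your setup). On Linux, --add-host=host.docker.internal:host-gateway maps host.docker.internal to the host's gateway IP, so the container can reach an Ollama server listening on the host:
docker run -d --name lobe-chat \
  --add-host=host.docker.internal:host-gateway \
  -p 3210:3210 \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  lobehub/lobe-chat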
Encountered the same problem; it reports a 404.
Same problem here; requesting the ollama embedding endpoint directly with curl works fine.
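For anyone who wants to repeat that check, a direct call to Ollama's native embedding endpoint looks roughly like this (placeholder host; on older Ollama versions the route is /api/embeddings with a "prompt" field instead of "input"):
curl http://xxx:11434/api/embed -d '{"model": "bge-m3:latest", "input": "test sentence"}'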
Encountered the same problem.
📦 Deployment Environment
Docker
📌 Software Version
1.55.4
💻 System environment
Windows
🌐 Browser
Edge
🐛 Problem description
An error occurred when calling ollama's embedding model.
📷 Reproduction steps
.env
OLLAMA_PROXY_URL=http://xxx:11434/v1
DEFAULT_FILES_CONFIG=embedding_model=ollama/bge-m3:latest
Calling the ollama chat model from the web UI works fine, but vectorization fails:
lobe-chat | message: '{"error":{"message":"Error 429: Too Many Requests","name":"ResponseError","status_code":429},"errorType":"OllamaBizError","provider":"ollama"}',
lobe-chat | name: 'EmbeddingError'
lobe-chat | }
🚦 Expected results
No response
📝 Supplementary information
No response
Remove /v1
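Applied to the configuration quoted above, the working .env would then read (placeholder address kept from the original report):
OLLAMA_PROXY_URL=http://xxx:11434
DEFAULT_FILES_CONFIG=embedding_model=ollama/bge-m3:latest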