[Question]: How to setup cloud OLLAMA API in demo ragflow.
Self Checks
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report (Language Policy).
- [x] Non-English title submissions will be closed directly (Language Policy).
- [x] Please do not modify this template :) and fill in all the required fields.
Describe your problem
I tried to set up the Ollama AI model on the RAGFlow demo (https://demo.ragflow.io/user-setting/model), but it could not be added successfully. An error pops up; please see the attachment.
Could you help check it and give me some suggestions?
To set up the Ollama cloud API in the RAGFlow demo, make sure you fill out the model configuration form with these key details:
- Model type: Select the type that matches your use case (e.g., "chat" or "embedding").
- Model name: Enter the exact name as shown by `ollama list`, including any version tags (for example, `gpt-oss:20b-cloud`). If the name doesn't match, RAGFlow won't connect to the model.
- Base URL: Use the public endpoint of your Ollama cloud instance (not `localhost` or `127.0.0.1`). The URL must be accessible from the RAGFlow demo environment, for example `https://your-ollama-cloud-host:11434`.
- API key: Enter your valid Ollama cloud API key. This is required for authentication; if it is missing or incorrect, you'll get an AUTH_ERROR (`litellm.APIConnectionError`). A quick way to check all three values outside RAGFlow is sketched after this list.
- Max tokens: Set this to the maximum output tokens you want for responses.
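If you want to sanity-check the model name, base URL, and API key before filling in the form, here is a minimal Python sketch using the `requests` package. The URL and key are placeholders, and it assumes your Ollama cloud instance serves the stock Ollama `GET /api/tags` model-listing route; adjust to whatever your endpoint actually exposes.

```python
import requests

BASE_URL = "https://your-ollama-cloud-host:11434"  # placeholder: your public endpoint
API_KEY = "your-ollama-api-key"                    # placeholder: your Ollama cloud key

# GET /api/tags is the standard Ollama route for listing served models.
# The names it returns (version tags included) are exactly what RAGFlow's
# "Model name" field expects.
resp = requests.get(
    f"{BASE_URL}/api/tags",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
resp.raise_for_status()
for model in resp.json().get("models", []):
    print(model["name"])  # e.g. gpt-oss:20b-cloud
```

If this request succeeds but RAGFlow still fails, the problem is on the RAGFlow side rather than with the endpoint or key.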
Troubleshooting AUTH_ERROR:
- Double-check that your API key is correct and not empty. The backend sets the Authorization header as `Bearer {api_key}` for Ollama cloud requests; a standalone request that replays this header is sketched after this list.
- Make sure the base URL points to a reachable Ollama cloud endpoint, not a local address.
- Confirm your Ollama instance is running and accessible from the internet, and that any firewall rules allow inbound connections on the required port.
- If you still see errors, try updating to the latest RAGFlow version or use the nightly Docker image, as recent releases have fixed several Ollama integration bugs.
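To separate an authentication failure from a connectivity one, you can replay the same `Bearer` header outside RAGFlow. Another sketch with placeholder values, assuming the stock Ollama `/api/chat` route:

```python
import requests

BASE_URL = "https://your-ollama-cloud-host:11434"  # placeholder endpoint
API_KEY = "your-ollama-api-key"                    # placeholder key

# Minimal non-streaming chat call with the same Authorization header
# format the RAGFlow backend sends for Ollama cloud requests.
resp = requests.post(
    f"{BASE_URL}/api/chat",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-oss:20b-cloud",  # the model name you entered in RAGFlow
        "messages": [{"role": "user", "content": "ping"}],
        "stream": False,
    },
    timeout=30,
)
# A 401/403 here points at the key (AUTH_ERROR territory); a timeout or
# connection error points at the base URL, DNS, or firewall instead.
print(resp.status_code)
print(resp.text[:200])
```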
If you follow these steps and provide the correct API key and base URL, you should be able to add your Ollama cloud model successfully. If you run into further issues, check the error message for clues; most problems are due to authentication or network connectivity.
What about https://ollama.com:11434/? You might also need an API key for the Ollama cloud services; refer to https://docs.ollama.com/cloud.
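One way to see which base URL actually answers is to probe both candidates against `GET /api/version`. That route is part of the stock Ollama API; whether the cloud service exposes it is an assumption, so treat this as a quick reachability sketch rather than a definitive test:

```python
import requests

# Both candidate base URLs mentioned in this thread.
for base in ("https://ollama.com", "https://ollama.com:11434"):
    try:
        r = requests.get(f"{base}/api/version", timeout=5)
        print(base, "->", r.status_code, r.text[:80])
    except requests.RequestException as exc:
        print(base, "-> unreachable:", exc)
```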
Yes, I also set the API key for the Ollama cloud services, but it doesn't work. I am not sure whether the URL is correct or not.
Is anyone using the Ollama cloud services with RAGFlow? Could you provide a reference configuration for me? My configuration is the following:
Model type: chat
Model name: gpt-oss:20b-cloud
Base URL: https://ollama.com/
API key: my Ollama API key