Langflow and Ollama
Bug Description
This seems to be happening because I cannot specify a model in the Ollama component due to the uneditable "Model" field.
Steps to Reproduce:
- Add an "Ollama" component to a new flow.
- Try to click and type in the "Model" field.
Expected Behavior: The "Model" field should be editable, allowing the user to select or input an Ollama model name.
Actual Behavior: The "Model" field is not editable.
Thank you for looking into this.
Reproduction
Issue: Unable to edit the "Model" field in the Ollama component
Description:
I am trying to use the Ollama component in Langflow to connect to a locally running Ollama instance. I have confirmed that Ollama is running correctly and accessible (e.g., ollama list works).
I can enter the Ollama URL (e.g., http://localhost:11434) in the "Ollama URL" field of the component. However, the "Model" field remains uneditable, preventing me from selecting or typing in a model name.
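For completeness, here is a minimal check (a sketch assuming the default port 11434) confirming that the Ollama HTTP API itself responds from the machine I am testing on:

```python
# Quick check, outside Langflow, that the Ollama HTTP API is reachable.
# Assumes the default port 11434; adjust the URL if yours differs.
import requests

base_url = "http://localhost:11434"
resp = requests.get(base_url, timeout=5)
print(resp.status_code, resp.text)  # a healthy server replies "Ollama is running"
```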
When I try to run a basic flow with a "Text Input" connected to the Ollama component (with the Ollama URL set) and then to a "Text Output", I get the following error:
Expected behavior
Thank you for your support!
Who can help?
No response
Operating System
Ubuntu 22.04 and the newer Ubuntu 25 release have both been tried
Langflow Version
Unknown (unable to find the version in the UI); installed via Docker.
Python Version
3.12
Screenshot
No response
Flow File
No response
The "Model" field in the Ollama component is designed to be a dropdown that dynamically lists available models from your Ollama server. It is not directly editable because it expects to fetch the list of models from the Ollama API endpoint /api/tags. If this API call fails or returns an empty list, the dropdown remains uneditable and empty, which matches the behavior you're seeing.
Here’s what’s happening under the hood:
- The Ollama component tries to connect to your Ollama server (defaulting to `http://localhost:11434` if not specified) and fetches available model names from `/api/tags` (see the sketch after this list).
- If the server is unreachable from the Langflow environment (for example, due to Docker networking issues), or if the API response is empty, the "Model" field will not populate and will remain uneditable.
- There is a refresh button next to the field, which attempts to re-fetch the model list in real time. If this still doesn’t populate the dropdown, it’s likely a connectivity or API issue between Langflow and your Ollama instance.
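Conceptually, the fetch that populates the dropdown amounts to something like the following. This is a minimal sketch, not Langflow's actual code; it assumes the standard `/api/tags` response shape:

```python
# Sketch of the model-list fetch the dropdown depends on (illustrative only).
import requests

def list_ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    resp = requests.get(f"{base_url}/api/tags", timeout=5)
    resp.raise_for_status()
    # /api/tags returns {"models": [{"name": "llama3:latest", ...}, ...]}
    return [m["name"] for m in resp.json().get("models", [])]

print(list_ollama_models())
```

If this call raises an exception or returns an empty list, the dropdown has nothing to show, which is exactly the uneditable, empty state described above.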
A related bug, where model names were not being fetched due to a code issue, was fixed in a recent pull request. If you are running an older version of Langflow, you may want to update to the latest version to ensure this fix is included [source].
Troubleshooting steps:
- Make sure your Ollama server is running and accessible from inside the Docker container running Langflow. Sometimes `localhost` inside Docker refers to the container itself, not your host machine. Try using `host.docker.internal` or your machine's IP address in the "Ollama URL" field (see the connectivity check after this list).
- Click the refresh button next to the "Model" field after entering the correct URL.
- If the field is still uneditable, check the Ollama server logs and ensure the `/api/tags` endpoint is returning a list of models.
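A small probe like the one below, run from inside the Langflow container, can tell you which base URL actually reaches Ollama. The candidate URLs are assumptions to adjust for your setup:

```python
# Probe candidate base URLs from inside the Langflow container.
# host.docker.internal resolves on Docker Desktop; on Linux you may need
# --add-host=host.docker.internal:host-gateway or your host's LAN IP instead.
import requests

candidates = [
    "http://localhost:11434",
    "http://host.docker.internal:11434",
]
for url in candidates:
    try:
        models = requests.get(f"{url}/api/tags", timeout=3).json().get("models", [])
        print(f"{url}: reachable, {len(models)} model(s)")
    except (requests.RequestException, ValueError) as exc:
        print(f"{url}: unreachable ({exc})")
```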
References:
- The "Model" field is implemented as a dropdown that is populated by fetching model names from the Ollama API [source].
- If the API call fails, the field remains uneditable and an error is raised [source].
- Documentation on using the Ollama component and model selection [source].
If you continue to have issues, please check your Docker networking setup and ensure the Ollama API is accessible from within the Langflow container.
Not sure if this is related, but I have the same problem with the Ollama Embeddings.
The model list populates for me in the "Ollama Models" component, but not in "Ollama Embeddings". I do see /api/tags being invoked on the Ollama instance, but the model list does not populate.
I haven't used Ollama Embedding recently, but after analyzing the recent changes, I found that Langflow has an internal dictionary that distinguishes between embedding models and chat models based on their type; this dictionary is used to filter the models. When I tested it yesterday, I confirmed that the chat models were correctly displayed in the model list. As for the model-list fetching in the OllamaChatComponent, I implemented that feature myself; it directly uses Ollama's List API. So if the model list is not loading, it may be due to a network issue.
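To illustrate the kind of filtering described here, a hypothetical sketch is below. The dictionary contents, function name, and structure are illustrative assumptions, not Langflow's actual code:

```python
# Hypothetical sketch: a dictionary mapping model-name prefixes to a type,
# used to split /api/tags results into chat vs. embedding lists.
MODEL_TYPES = {
    "nomic-embed-text": "embedding",
    "mxbai-embed-large": "embedding",
    "llama3": "chat",
    "mistral": "chat",
}

def filter_models(all_models: list[str], wanted: str) -> list[str]:
    # Drop the tag (":latest", ":7b", ...) before looking up the type.
    return [
        name for name in all_models
        if MODEL_TYPES.get(name.split(":")[0]) == wanted
    ]

models = ["llama3:latest", "nomic-embed-text:latest", "mistral:7b"]
print(filter_models(models, "embedding"))  # ['nomic-embed-text:latest']
```

Under a scheme like this, an embedding model missing from the dictionary (for example, a custom model) would be filtered out even though /api/tags returns it, which would match the symptom above.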
I will add a feature in the next version that allows users to optionally enter the model name manually. I also found it inconvenient when using our own custom models.
@rickelmore Can you please add more information on what the actual error is? Also, which Langflow version have you been using? And did you try toggling off tool mode, which helps load more models in the dropdown?