Ollama Error
Discussed in https://github.com/langflow-ai/langflow/discussions/1804
Originally posted by VinojRaj April 30, 2024

I am new to Langflow and was trying to use Llama 2 through Ollama as the model, but I am getting the following error:

`ValueError: Error building vertex Ollama: ChatOllamaComponent.build() missing 1 required positional argument: 'base_url'`
The base URL defaults to http://localhost:11434/
I have the same ...
it seems to be working now :)
Hi, I'm no expert in Langflow, but I had a similar issue. Can you describe your Langflow deployment? Are you using Docker or running on your local machine? Have you tried making POST requests to Ollama with Postman?
- Ollama API: https://github.com/ollama/ollama/blob/main/docs/api.md
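If you don't have Postman handy, a quick shell check works too. This is a minimal sketch, assuming the endpoint layout from the Ollama API docs linked above (`/api/tags` lists installed models); `OLLAMA_BASE_URL` is an illustrative variable name here, not a Langflow setting:

```shell
# Probe the Ollama API at a given base URL (default: the usual local address).
# ${BASE_URL%/} strips a trailing slash so the endpoint path joins cleanly.
BASE_URL="${OLLAMA_BASE_URL:-http://localhost:11434}"

if curl -fsS --max-time 3 "${BASE_URL%/}/api/tags" > /dev/null; then
    echo "Ollama is reachable at ${BASE_URL%/}"
else
    echo "Ollama is NOT reachable at ${BASE_URL%/} (is 'ollama serve' running?)" >&2
fi
```

If this fails on the same machine where Langflow runs, the `base_url` value in the component is not the problem; the server itself is down or listening on a different address.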
In my case it was a network-related error: I was running Langflow and Ollama in separate Docker containers. Nothing that creating a network and putting both containers on it can't fix. But I would need more details to tell what could be happening on your side (screenshots of the error would be much appreciated).
I suspect an update may have introduced (or fixed) a bug, or there is some other issue I am not familiar with. I have Ollama running on http://127.0.0.1:11434, which is typical, but yesterday there was an error, and at midnight it changed ;). One remark: I am not running the app in a container.
I am running langflow locally on my computer. I have the same problem.
What exactly is the `base_url`?
Same problem here. Trying to run a RAG flow using Ollama.
My friends, here is what helped for me: put both containers (Langflow and Ollama) on the same network.

1. Create a new network "my-net": `docker network create my-net`
2. Find the container names (last column; here, ollama and docker_example-langflow-1): `docker ps`
3. Connect the first container: `docker network connect my-net ollama`
4. Connect the second container: `docker network connect my-net docker_example-langflow-1`
5. Use the container name as the base URL, e.g. `ollama:11434` for Ollama.

Hope that helps.
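Collected into one copy-pasteable session, under the same assumptions as the steps above (the container names `ollama` and `docker_example-langflow-1` come from this thread; check `docker ps` for yours). This is a setup fragment, not something to run blindly:

```shell
# Put the Langflow and Ollama containers on one Docker network so that
# Langflow can resolve Ollama by container name.
command -v docker >/dev/null || { echo "docker not installed; skipping"; exit 0; }

docker network create my-net                              # step 1: shared bridge network
docker ps                                                 # step 2: note the NAMES column
docker network connect my-net ollama                      # step 3: attach Ollama
docker network connect my-net docker_example-langflow-1   # step 4: attach Langflow

# Step 5: in Langflow, set the Ollama base URL to the container name:
echo "Done. Point Langflow's Ollama base URL at http://ollama:11434"
```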
If you are running Ollama from the taskbar, quit it. Then go to a terminal and start the Ollama server with `ollama serve`. It will work.
Hello, Sorry for the delay. Did you try using the new version? Does the error still persist?
Hi @VinojRaj
We hope you're doing well. Just a friendly reminder that if we do not hear back from you within the next 3 days, we will close this issue. If you need more time or further assistance, please let us know.
Thank you for your understanding!
Thank you for your contribution! This issue will be closed. If you have any questions or encounter another problem, please open a new issue and we will be ready to assist you.