
Ollama Error

Open VinojRaj opened this issue 9 months ago • 9 comments

Discussed in https://github.com/langflow-ai/langflow/discussions/1804

Originally posted by VinojRaj April 30, 2024

I am new to Langflow and was trying to use Llama 2 through Ollama as the model, but I am getting the following error:

ValueError: Error building vertex Ollama: ChatOllamaComponent.build() missing 1 required positional argument: 'base_url'

The base URL is left at the default, http://localhost:11434/.
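
Before wiring anything into Langflow, it is worth confirming that Ollama actually answers at that default base URL. A minimal check from the command line (just a sketch, assuming Ollama is installed locally and listening on its default port 11434):

    # Quick sanity check: a running Ollama server replies "Ollama is running"
    # at its root endpoint.
    curl http://localhost:11434/

    # List the models that have been pulled locally; llama2 has to appear
    # here before the Langflow Ollama component can use it.
    curl http://localhost:11434/api/tags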

VinojRaj avatar Apr 30 '24 17:04 VinojRaj

I have the same ...

qwaszaq avatar Apr 30 '24 17:04 qwaszaq

it seems to be working now :)

qwaszaq avatar May 01 '24 14:05 qwaszaq

Hi, I'm no expert in Langflow, but I had a similar issue. Can you describe your Langflow deployment? Are you using Docker or running it on your local machine? Have you tried making POST requests to Ollama using Postman?

  • Ollama API: https://github.com/ollama/ollama/blob/main/docs/api.md

In my case it was a network-related error: I was running Langflow and Ollama in separate Docker containers. Nothing that creating a network and attaching both containers to it couldn't fix. But I would need more details to tell what could be happening on your side (screenshots of the error would be much appreciated).
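
For reference, the Postman test suggested above can be done with curl as well. This is only a sketch of a request against the documented /api/generate endpoint, assuming the llama2 model has already been pulled and Ollama is listening on its default port:

    # A one-off, non-streaming generation request.
    # A JSON reply here means the Ollama server and the model are fine,
    # so any remaining problem is on the Langflow or networking side.
    curl http://localhost:11434/api/generate \
      -d '{"model": "llama2", "prompt": "Say hello", "stream": false}'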

AngelDPena avatar May 01 '24 14:05 AngelDPena

I suspect an update fixed a bug or some other issue I am not familiar with. I have Ollama running on http://127.0.0.1:11434, which is typical, but yesterday there was an error, and by midnight it was gone ;). One remark: I am not running the app in a container...

qwaszaq avatar May 01 '24 14:05 qwaszaq

I am running Langflow locally on my computer. I have the same problem. What exactly is the base_url?

RaminParker avatar May 03 '24 13:05 RaminParker

Same problem here. Trying to run a RAG flow using Ollama.

JavierCCC avatar May 03 '24 21:05 JavierCCC

My friends, here is what helped for me: bring both containers (Langflow and Ollama) into the same network (see the commands below):

  1. Create a new network "my-net": docker network create my-net
  2. Find the container names (last column, here ollama and docker_example-langflow-1): docker ps
  3. Add the first container: docker network connect my-net ollama
  4. Add the second container: docker network connect my-net docker_example-langflow-1
  5. Use the container name as the base URL, e.g. 'ollama:11434' for Ollama.

Hope that helps.
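
For anyone who prefers to copy and paste, the steps above boil down to the commands below. The container names (ollama and docker_example-langflow-1) are the ones from this example; substitute whatever docker ps shows on your machine:

    # 1. Create a user-defined bridge network.
    docker network create my-net

    # 2. List running containers; the names are in the last column.
    docker ps

    # 3. Attach both containers to the network.
    docker network connect my-net ollama
    docker network connect my-net docker_example-langflow-1

    # 4. Containers on a user-defined network can resolve each other by name,
    #    so the Ollama base URL in Langflow becomes the container name,
    #    e.g. http://ollama:11434, instead of http://localhost:11434.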

maga868 avatar May 23 '24 18:05 maga868

If you are running Ollama from the taskbar, exit out of it. Then go to the terminal and start the Ollama server. It will work.
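
In other words, quit the tray/taskbar instance and run the server from a terminal, so its logs are visible while Langflow talks to it. Roughly (assuming the ollama CLI is on your PATH and port 11434 is free):

    # Start the Ollama server in the foreground; the log output shows every
    # request Langflow sends, which makes debugging much easier.
    ollama serve

    # In a second terminal, confirm the server is reachable before retrying the flow.
    curl http://localhost:11434/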

adnanbwp avatar May 24 '24 08:05 adnanbwp

Hello, sorry for the delay. Did you try using the new version? Does the error still persist?

anovazzi1 avatar Jul 01 '24 17:07 anovazzi1

Hi @VinojRaj

We hope you're doing well. Just a friendly reminder that if we do not hear back from you within the next 3 days, we will close this issue. If you need more time or further assistance, please let us know.

Thank you for your understanding!

carlosrcoelho avatar Jul 17 '24 14:07 carlosrcoelho

Thank you for your contribution! This issue will be closed. If you have any questions or encounter another problem, please open a new issue and we will be ready to assist you.

carlosrcoelho avatar Jul 22 '24 00:07 carlosrcoelho