
Error 404 looking for `gpt-4o` when configured to use `ollama`.

Open · PieBru opened this issue 1 year ago · 1 comment

Hi Mervin,

I am on Arch Linux, installed PraisonAI using:

cd Git
git clone https://github.com/MervinPraison/PraisonAI
cd PraisonAI

uv venv
source .venv/bin/activate
uv pip install praisonai
praisonai --help

# Ollama
export OPENAI_API_KEY=none
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_MODEL_NAME="llama3.2"

praisonai
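The bug Piero hits below is consistent with the model name not being picked up from the environment. As a minimal sketch (the function name `resolve_model` and the fallback logic are assumptions for illustration, not PraisonAI's actual code), this is how a hard-coded `gpt-4o` default ends up in the request to Ollama whenever the exported variable is not read:

```python
import os

# Hypothetical default, mirroring the model name seen in the 404 error.
DEFAULT_MODEL = "gpt-4o"

def resolve_model() -> str:
    # If the code checks the wrong key (or none), it silently falls back
    # to DEFAULT_MODEL and sends "gpt-4o" to the Ollama endpoint.
    return os.environ.get("OPENAI_MODEL_NAME", DEFAULT_MODEL)

os.environ.pop("OPENAI_MODEL_NAME", None)
print(resolve_model())  # gpt-4o (fallback when the variable is unset)

os.environ["OPENAI_MODEL_NAME"] = "llama3.2"
print(resolve_model())  # llama3.2
```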

The console log says:

...
           INFO     [16:46:43] migration.py:207 INFO Context impl SQLiteImpl.                                                migration.py:207
           INFO     [16:46:43] migration.py:210 INFO Will assume non-transactional DDL.                                      migration.py:210
           WARNING  [16:46:43] agents_generator.py:176 WARNING Error loading tools from                               agents_generator.py:176
                    /run/media/piero/NVMe-4TB/Piero/Git/PraisonAI/tools.py: No module named 'duckduckgo_search'                              
           INFO     [16:46:43] agents_generator.py:215 INFO Spec: ModuleSpec(name='tools',                            agents_generator.py:215
                    loader=<_frozen_importlib_external.SourceFileLoader object at 0x76d398373b50>,                                           
                    origin='/run/media/piero/NVMe-4TB/Piero/Git/PraisonAI/tools.py')                                                         
           WARNING  [16:46:43] agents_generator.py:238 WARNING Error loading tools from tools.py: No module named     agents_generator.py:238
                    'duckduckgo_search'                                                                                                      
           INFO     [16:46:43] agents_generator.py:542 INFO Loaded tools: []                                          agents_generator.py:542
           INFO     [16:46:43] agents_generator.py:577 INFO Created agent Researcher with tools: []                   agents_generator.py:577
           INFO     [16:46:43] agents_generator.py:599 INFO Created task research_task with tools: []                 agents_generator.py:599
           INFO     [16:46:43] agents_generator.py:577 INFO Created agent Narrative Designer with tools: []           agents_generator.py:577
           INFO     [16:46:43] agents_generator.py:599 INFO Created task story_concept_development with tools: []     agents_generator.py:599
           INFO     [16:46:43] agents_generator.py:577 INFO Created agent Scriptwriter with tools: []                 agents_generator.py:577
           INFO     [16:46:43] agents_generator.py:599 INFO Created task scriptwriting_task with tools: []            agents_generator.py:599
           INFO     [16:46:43] agents.py:84 INFO Executing task 0: Research about Mars, its environment, and the feasibility of  agents.py:84
                    a cat being on Mars. Also, research about cat behavior and characteristics. using Researcher                             
╭─────────────────────────────────────────────────────────────── Instruction ───────────────────────────────────────────────────────────────╮
│ Agent Researcher is processing prompt:                                                                                                    │
│ You need to do the following task: Research about Mars, its environment, and the feasibility of a cat being on Mars. Also, research about │
│ cat behavior and characteristics..                                                                                                        │
│ Expected Output: Document with research findings on Mars and cats..                                                                       │
│         Please provide only the final result of your work. Do not add any conversation or extra explanation.                              │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
           INFO     [16:46:43] _client.py:1025 INFO HTTP Request: POST http://localhost:11434/v1/chat/completions "HTTP/1.1   _client.py:1025
                    404 Not Found"                                                                                                           
╭────────────────────────────────────────────────────────────────── Error ──────────────────────────────────────────────────────────────────╮
│ Error in chat completion: Error code: 404 - {'error': {'message': 'model "gpt-4o" not found, try pulling it first', 'type': 'api_error',  │
│ 'param': None, 'code': None}}                                                                                                             │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
           INFO     [16:46:43] agents.py:159 INFO Task 0 not completed, retrying                                                agents.py:159
[16:46:44] INFO     [16:46:44] agents.py:84 INFO Executing task 0: Research about Mars, its environment, and the feasibility of  agents.py:84
                    a cat being on Mars. Also, research about cat behavior and characteristics. using Researcher                             
...

As you can see, it returns a 404 looking for `gpt-4o` at the Ollama URL. Thank you, Piero
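Note what the log shows: the request reaches the correct host (`POST http://localhost:11434/v1/chat/completions`), so `OPENAI_BASE_URL` is being honored; the 404 comes from Ollama rejecting the model *name* in the request body. A small sketch of how an OpenAI-compatible client derives that URL from the base URL (the derivation itself is standard, the variable handling is a simplified assumption):

```python
import os

# The base URL from the environment is honored, which is why the log
# shows the request hitting localhost:11434 rather than api.openai.com.
os.environ["OPENAI_BASE_URL"] = "http://localhost:11434/v1"
base = os.environ["OPENAI_BASE_URL"].rstrip("/")
url = f"{base}/chat/completions"
print(url)  # http://localhost:11434/v1/chat/completions
```

The 404 body ('model "gpt-4o" not found, try pulling it first') is Ollama's response to an unknown model name, confirming the model field, not the endpoint, is misconfigured.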

PieBru avatar Dec 28 '24 15:12 PieBru

I was having the same issue. In the code, use the following: `manager_llm="llama3"`

ciaotesla avatar Dec 30 '24 13:12 ciaotesla

I am experiencing the same problem. The agents work as expected when I export only my OpenAI key, but exporting any of the Ollama variables fails. @ciaotesla, should that variable be set in a .env file? I installed my agents on an Ubuntu 22.04 system through pipx. Thanks for the help.
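On the .env question: since these settings are read from the process environment, any mechanism that exports them before launch should be equivalent, whether `export` in the shell or a .env loader. A minimal stdlib sketch of what a .env loader does (real projects typically use python-dotenv; this toy parser is only to show the mechanism):

```python
import os
import pathlib
import tempfile

def load_env(path):
    # Toy .env parser: KEY=VALUE lines become environment variables,
    # skipping blanks and comments and stripping surrounding quotes.
    for line in pathlib.Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ[key.strip()] = value.strip().strip('"')

with tempfile.TemporaryDirectory() as d:
    env_file = pathlib.Path(d) / ".env"
    env_file.write_text(
        'OPENAI_BASE_URL="http://localhost:11434/v1"\n'
        'OPENAI_MODEL_NAME="llama3.2"\n'
    )
    load_env(env_file)

print(os.environ["OPENAI_MODEL_NAME"])  # llama3.2
```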

farkey avatar Jan 03 '25 23:01 farkey

Use `MODEL_NAME="llama3"`

gityeop avatar Jan 04 '25 16:01 gityeop

@PieBru It should work now. There was a small bug; it's now fixed.

Ollama

export OPENAI_API_KEY=none
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_MODEL_NAME="llama3.2"

MervinPraison avatar Jan 06 '25 13:01 MervinPraison