Jobs_Applier_AI_Agent

[BUG]: Uses GPT-4o when set to run Ollama with Llama 3.2 in config.yaml file

Open sonicnerd14 opened this issue 4 months ago • 13 comments

Describe the bug

The program does not seem to respect the model selected in config.yaml. I have tried Llama 3.2 via Ollama, and also Gemini, but with either of these AI Hawk insists on using GPT-4o. You also cannot omit an API key, or the program will not run at all, even when using Ollama, which requires no API key and only a URL. Entering a Gemini key does not help either, since every Gemini key is reported as invalid by the script. I would like to use Ollama where possible, and I would like to find a fix so it runs with Ollama and not GPT-4o.
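For context, the behavior I would expect from these config values is a simple dispatch on `llm_model_type`, roughly like the sketch below. This is not AIHawk's actual code, just a minimal illustration (assuming LangChain-style chat model classes and PyYAML); the bug behaves as though this kind of dispatch falls through to an OpenAI default no matter which type is configured.

```python
# Hypothetical sketch, not AIHawk's actual code: it only illustrates the
# dispatch on llm_model_type that the config.yaml values suggest should happen.
import yaml                               # PyYAML, assumed installed
from langchain_ollama import ChatOllama   # assumes the langchain-ollama package
from langchain_openai import ChatOpenAI   # assumes the langchain-openai package


def build_llm(config_path: str = "config.yaml"):
    with open(config_path) as f:
        cfg = yaml.safe_load(f)

    model_type = str(cfg.get("llm_model_type", "")).lower()
    model_name = cfg.get("llm_model")

    if model_type == "ollama":
        # Ollama needs no API key, only the local server URL from llm_api_url.
        return ChatOllama(model=model_name, base_url=cfg.get("llm_api_url"))
    if model_type == "openai":
        # Only this branch should ever require an OpenAI key.
        return ChatOpenAI(model=model_name)
    raise ValueError(f"Unsupported llm_model_type: {model_type!r}")
```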

Steps to reproduce

  1. Modify the config.yaml file with the following:

     llm_model_type: ollama
     llm_model: 'llama3.2'
     llm_api_url: http://127.0.0.1:11434/

  2. Save and run AI Hawk (a quick check that the Ollama server is actually reachable is sketched below).
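To rule out connectivity problems before blaming the model selection, the Ollama server configured in `llm_api_url` can be checked directly. A minimal sketch, assuming the standard Ollama REST endpoint `/api/tags` and the `requests` package (neither is part of AIHawk itself):

```python
# Quick connectivity check for the Ollama server configured in llm_api_url.
# Assumes the standard Ollama REST endpoint /api/tags and the requests
# package; this is a troubleshooting snippet, not part of AIHawk.
import requests

OLLAMA_URL = "http://127.0.0.1:11434"  # same host/port as llm_api_url in config.yaml

resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
resp.raise_for_status()
models = [m["name"] for m in resp.json().get("models", [])]
print("Models served by Ollama:", models)  # e.g. ['llama3.2:latest']
```

If `llama3.2` is not listed, it needs to be pulled first with `ollama pull llama3.2` before AIHawk could use it.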

Expected behavior

Running Llama 3.2 with Ollama

Actual behavior

Running GPT-4o mini

Branch

None

Branch name

No response

Python version

No response

LLM Used

No response

Model used

No response

Additional context

No response

sonicnerd14 · Oct 24 '24