
Error "Co-worker mentioned not found..." when using with local llama3

Open italovieira opened this issue 1 year ago • 6 comments

With the example below I get the following error:

Error executing tool. Co-worker mentioned not found, it must to be one of the following options:
- pilot
from crewai import Agent, Task, Crew

from langchain_community.llms import Ollama

import os

os.environ["OPENAI_API_KEY"] = "NA"


llm = Ollama(model="llama3")


# Agents
luke = Agent(
    role="pilot",
    goal="Destroy the Death Star",
    backstory="The young destined-to-be-Jedi pilot, summoned to attack the Death Star.",
    llm=llm,
)

leia = Agent(
    role="strategist",
    goal="Coordinate the attack on the Death Star",
    backstory="The Rebel leader, essential for strategy and communication.",
    llm=llm,
)

# Tasks
coordinate_attack = Task(
    description="""Leia must coordinate the mission,
    maintaining communication and providing strategic support.
    Leia must ensure that everything is in order, providing a safe path for Luke""",
    expected_output="""Successfully coordinated attack, Death Star destroyed. All units informed and aligned.""",
    agent=leia,
    allow_delegation=True,
)


destroy_death_star = Task(
    description="""Luke must pilot his X-Wing and shoot at the Death Star's weak point to destroy it.""",
    expected_output="""Death Star destroyed, mission successful.""",
    agent=luke,
)


# Crews
rebel_alliance = Crew(
    agents=[leia, luke],
    tasks=[coordinate_attack, destroy_death_star],
    verbose=2,
)


rebel_alliance.kickoff()

italovieira avatar May 15 '24 05:05 italovieira

There are multiple issues with your code.

  1. manager_llm only works with Process.hierarchical
  2. it is generally recommended to prompt in English and then just instruct the model to "translate" the response
  3. When using Ollama, use the Ollama model class, not OpenAI: from langchain_community.llms import Ollama

If I haven't missed anything, the co-worker not being found is either because of points 1-3 or because llama3 is "too dumb"

noggynoggy avatar May 15 '24 08:05 noggynoggy

Actually this might be related to #602

noggynoggy avatar May 15 '24 08:05 noggynoggy

There are multiple issues with your code.

1. manager_llm only works with Process.hierarchical

2. it is generally recommended to prompt in English and then just instruct the model to "translate" the response

3. When using Ollama, use the Ollama model class, not OpenAI: `from langchain_community.llms import Ollama`

I've updated the code in the description with what you indicated.

As for point 3, I used ChatOpenAI just like the example in the docs https://docs.crewai.com/how-to/LLM-Connections/#ollama-integration-ex-for-using-llama-2-locally, though.

Either way, the error still occurs.

italovieira avatar May 15 '24 10:05 italovieira

This might be a problem in how Ollama or LangChain formats the steps for the agents.

But I did a bisect and found that crewAI was able to cope with this before https://github.com/joaomdmoura/crewAI/commit/0b781065d277564077fdaf630d46995c210cc9d1. The error only started after this commit.

italovieira avatar May 15 '24 10:05 italovieira

Hey @italovieira have you been able to fix it? Getting same issue :(

Yazington avatar May 17 '24 02:05 Yazington

Hey @italovieira have you been able to fix it? Getting same issue :(

I've opened a PR to fix this issue, but it hasn't been merged yet.

italovieira avatar May 18 '24 01:05 italovieira

Without logs it's hard to figure out the reason. I had the same problem when the LLM (Mistral 0.3 in my case) returned the action input key as co-worker instead of the required coworker, and this caused the same misleading error about an absent agent. I fixed it in this PR and have already tested it in my fork - it works fine. Maybe you can try this fix and see the result.
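To make the failure mode concrete, here is a hypothetical sketch (not the actual PR) of the kind of key normalization that makes the delegation tool tolerant of co-worker vs. coworker in the action input an LLM emits; the function name and key spellings are illustrative.

```python
def normalize_keys(action_input: dict) -> dict:
    """Canonicalize action-input keys: lowercase, drop hyphens,
    underscores, and spaces, so "Co-Worker" and "coworker" collide."""
    return {
        key.lower().replace("-", "").replace("_", "").replace(" ", ""): value
        for key, value in action_input.items()
    }

# Mistral emitted "co-worker" where crewAI expected "coworker":
print(normalize_keys({"Co-Worker": "pilot", "task": "Destroy the Death Star"}))
```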

madmag77 avatar May 30 '24 07:05 madmag77

Sorry to ask, maybe I missed something. Did you solve this issue with a workaround? If yes, can you please let me know how? Thanks in advance!

psyq0 avatar Jun 03 '24 14:06 psyq0

@psyq0 If you're asking about the co-worker problem, then yes, I made a fork and raised a PR to this repo, but it's not merged yet. I managed to make an example with tools and delegation work on open-source LLMs - this is the example. It works with LM Studio + Mistral 0.3, Phi-3 medium, and Llama 3 7B, and also with Ollama + Llama 3 7B.

Hope it helps.

madmag77 avatar Jun 03 '24 14:06 madmag77

Sorry to ask, maybe I missed something. Did you solve this issue with a workaround? If yes, can you please let me know how? Thanks in advance!

For me the fix from @madmag77 didn't work. It seems the reason it can't match the co-worker role is that I got an extra " in the role name.

Comparing available_agents to agent in agent_tools.py:

"senior research analyst
['senior research analyst', 'tech content strategist', 'french translator']

So as a dirty fix I replaced this line with the following one:

if available_agent.role.casefold().strip() == agent.casefold().strip().replace('"', ''):

And now it works with llama3:8b without issue.

A bit dirty, but I hope it helps while waiting for a proper fix.
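The dirty fix above boils down to a small comparison helper; this sketch shows why the stray quote llama3:8b produced breaks the exact match and how the replace repairs it (the function name is mine, not crewAI's).

```python
def roles_match(available_role: str, requested_role: str) -> bool:
    """Compare a registered agent role against the role string the LLM
    emitted, stripping case, surrounding whitespace, and stray quotes."""
    requested = requested_role.casefold().strip().replace('"', '')
    return available_role.casefold().strip() == requested

# The LLM emitted '"senior research analyst' (leading quote) for delegation:
print(roles_match("senior research analyst", '"senior research analyst'))
```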

For reference, this is how I initialize the agent:

def initialize_agent(agent):
    initialized_agent = crewai.Agent(
        role=agent.role,
        goal=agent.goal,
        backstory=agent.backstory,
        verbose=agent.verbose,
        allow_delegation=agent.allow_delegation,
        llm=ChatOllama(
            model="llama3:8b",
            base_url="http://localhost:11434",
        ),
    )
    return initialized_agent

Sisif-eu avatar Jun 04 '24 01:06 Sisif-eu

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] avatar Aug 17 '24 12:08 github-actions[bot]

This issue was closed because it has been stalled for 5 days with no activity.

github-actions[bot] avatar Aug 23 '24 12:08 github-actions[bot]