Process.hierarchical forces OpenAI
This works with local Ollama:

crew = Crew(
    agents=[research_agent, summarize_agent],
    tasks=[task, task1],
    process=Process.sequential,
    memory=False,
    max_rpm=2,
    share_crew=False,
    manager_llm=mistral,
    function_calling_llm=mistral,
    verbose=2
)
This doesn't:

crew = Crew(
    agents=[research_agent, summarize_agent],
    tasks=[task, task1],
    manager_llm=mistral,  # Mandatory for hierarchical process
    process=Process.hierarchical,  # Specifies the hierarchical management approach
    memory=True,
    max_rpm=2,
    function_calling_llm=mistral,
    verbose=2
)
The agents and tasks code remains the same, so it looks like Process.hierarchical is forcing OpenAI?
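For reference, mistral here is a local model served by Ollama, wired up roughly like this (a sketch assuming the langchain_community Ollama wrapper; adjust to your setup):

from langchain_community.llms import Ollama

# Local Ollama server running the "mistral" model (default base_url shown; change if yours differs)
mistral = Ollama(model="mistral", base_url="http://localhost:11434")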
You haven't specified the manager_llm attribute in your crew - https://docs.crewai.com/core-concepts/Processes/
crew = Crew(
    agents=[research_agent, summarize_agent],
    tasks=[task, task1],
    manager_llm=mistral,  # Mandatory for hierarchical process
    process=Process.hierarchical,  # Specifies the hierarchical management approach
    memory=True,
    max_rpm=2,
    function_calling_llm=mistral,
    verbose=2
)
Set memory=False (when you use memory=True, it creates a RAG store for your data, and storing data in a RAG needs an embedding model, which by default is set to OpenAI), or you can provide your own embedder:
embedder = dict(
    provider = "google",
    config = dict(
        model = 'models/embedding-001'
    )
)

Crew(
    agents=[improver, expect],
    tasks=[improve_prompt_task, answer_question_task],
    process=Process.hierarchical,
    manager_llm=mistral,
    embedder=embedder,
    memory=True,
    function_calling_llm=mistral
)
Exactly, but you can also set it to use other providers for memory too :) https://docs.crewai.com/core-concepts/Memory/#using-google-ai-embeddings
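For a fully local setup, an Ollama embedder can be plugged in the same way (a sketch based on the Memory docs; the model name below is just an example, use whatever embedding model you have pulled locally):

embedder = dict(
    provider = "ollama",
    config = dict(
        model = "nomic-embed-text"  # example only: any embedding model available in your local Ollama
    )
)

crew = Crew(
    agents=[research_agent, summarize_agent],
    tasks=[task, task1],
    process=Process.hierarchical,
    manager_llm=mistral,
    function_calling_llm=mistral,
    embedder=embedder,
    memory=True
)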
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.