
ModuleNotFoundError: No module named 'ollama'

lusabo opened this issue 1 year ago · 3 comments

Hi guys!

My code stopped working and now I am receiving this error:

ModuleNotFoundError: No module named 'ollama'

I am running on Google Colab, and below is the part with the problem:

import os
import json
import requests
import warnings
import openrouteservice
from crewai import Agent, Task, Crew, Process
from crewai_tools import tool, SerperDevTool, JSONSearchTool, PGSearchTool
from langchain_groq import ChatGroq
import google.generativeai as genai
...
llama3 = ChatGroq(
    api_key="<add_your_key_here>",
    model="llama3-70b-8192"
)
...
json_tool = JSONSearchTool(
    json_path='/content/hotels.json',
    config={
        "llm": {
            "provider": "ollama",
            "config": {
                "model": "llama3"
            },
        },
        "embedder": {
            "provider": "google",
            "config": {
                "model": "models/embedding-001",
                "task_type": "retrieval_document"
            }
        }
    }
)

lusabo avatar Jun 22 '24 10:06 lusabo

Hi Lusabo,

I see you are using an LLM from the Groq cloud API, so the llm provider in your json_tool config should be "groq", not "ollama". Hope this resolves your issue.
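For reference, a minimal sketch of the corrected config dict under that suggestion. This reuses the keys from the original snippet; the Groq model name is the one from the question, and whether the tool accepts it verbatim depends on your crewai_tools/embedchain versions:

```python
# Hypothetical corrected config for JSONSearchTool: the llm provider now
# matches the Groq-hosted model actually in use, so the embedding/LLM
# backend should no longer try to import the ollama package.
json_tool_config = {
    "llm": {
        "provider": "groq",
        "config": {"model": "llama3-70b-8192"},
    },
    "embedder": {
        "provider": "google",
        "config": {
            "model": "models/embedding-001",
            "task_type": "retrieval_document",
        },
    },
}
```

You would then pass this dict as `config=json_tool_config` when constructing the tool.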

NPriyankaDS avatar Jun 27 '24 17:06 NPriyankaDS

Hi! In any case, if you actually use Ollama, you can run pip install ollama.

cccadet avatar Jul 12 '24 12:07 cccadet

Check whether the ollama library is installed; if not, install it: https://pypi.org/project/ollama/
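That check-then-install step can be sketched in Python (the helper name ensure_installed is my own, not from any library; in a Colab cell you could equally run `!pip install ollama`):

```python
import importlib.util
import subprocess
import sys

def ensure_installed(package: str) -> bool:
    """Install `package` with pip only if it is not already importable.

    Returns True if an install was triggered, False if the package was
    already present.
    """
    if importlib.util.find_spec(package) is not None:
        return False
    subprocess.check_call([sys.executable, "-m", "pip", "install", package])
    return True

# Guard the import that was failing in this issue:
# ensure_installed("ollama")
```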

LaksLaksman avatar Aug 08 '24 05:08 LaksLaksman

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] avatar Sep 07 '24 12:09 github-actions[bot]

This issue was closed because it has been stalled for 5 days with no activity.

github-actions[bot] avatar Sep 12 '24 12:09 github-actions[bot]

I have the same issue. I have this code:

import ollama

desiredModel = 'llama3.2:3b'
userInput = input("Ask me anything... \n")

# Note: the original assigned the chat result to ai_response but then read
# from an undefined name `response`; use one name consistently.
ai_response = ollama.chat(model=desiredModel, messages=[
    {'role': 'user', 'content': userInput},
])

OllamaResponse = ai_response['message']['content']

print(f"The AI responded {OllamaResponse}")

with open("OutputOllama.txt", "w", encoding="utf-8") as text_file:
    text_file.write(OllamaResponse)

but I get the ModuleNotFoundError: No module named 'ollama'

ABelly99 avatar Jun 21 '25 04:06 ABelly99