ModuleNotFoundError: No module named 'ollama'
Hi guys!
My code stopped working and now I am receiving this error:
ModuleNotFoundError: No module named 'ollama'
I am running on Google Colab, and below is the part with the problem:
import os
import json
import requests
import warnings
import openrouteservice
from crewai import Agent, Task, Crew, Process
from crewai_tools import tool, SerperDevTool, JSONSearchTool, PGSearchTool
from langchain_groq import ChatGroq
import google.generativeai as genai
...
llama3 = ChatGroq(
    api_key="<add_your_key_here>",
    model="llama3-70b-8192"
)
...
json_tool = JSONSearchTool(
    json_path='/content/hotels.json',
    config={
        "llm": {
            "provider": "ollama",
            "config": {
                "model": "llama3"
            },
        },
        "embedder": {
            "provider": "google",
            "config": {
                "model": "models/embedding-001",
                "task_type": "retrieval_document"
            }
        }
    }
)
Hi Lusabo,
I see you are using an LLM from the Groq Cloud API, so in your json_tool the llm provider should be "groq", not "ollama". Hope this resolves your issue.
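For reference, the corrected config would look something like this. This is only a sketch: I am assuming the embedchain-style schema that crewai_tools passes through accepts a "groq" provider, and I reused the same "llama3-70b-8192" model name from the ChatGroq call above — check the CrewAI tools docs for the exact fields your version supports:

```python
# Sketch of the corrected JSONSearchTool config, swapping the llm
# provider from "ollama" to "groq" (model name assumed to match the
# ChatGroq model used elsewhere in the script).
config = {
    "llm": {
        "provider": "groq",
        "config": {
            "model": "llama3-70b-8192",
        },
    },
    "embedder": {
        "provider": "google",
        "config": {
            "model": "models/embedding-001",
            "task_type": "retrieval_document",
        },
    },
}
```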
Hi! Also, if you do actually use ollama, you can do a pip install ollama.
Check whether the ollama library is installed, and install it if not: https://pypi.org/project/ollama/
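One way to do that check from Python itself (a sketch; the commented-out call at the bottom is the actual install). Installing via sys.executable matters because a frequent cause of ModuleNotFoundError is pip installing into a different environment than the one running the script:

```python
import importlib.util
import subprocess
import sys

def ensure_installed(package: str) -> bool:
    """Return True if `package` is importable, installing it first if missing."""
    if importlib.util.find_spec(package) is None:
        # Use the current interpreter's pip so the install lands in the
        # same environment this script runs in.
        subprocess.check_call([sys.executable, "-m", "pip", "install", package])
    return importlib.util.find_spec(package) is not None

# ensure_installed("ollama")
```

In a Colab cell you can instead simply run `!pip install ollama`.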
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.
I have the same issue. I have this code:

import ollama

desiredModel = 'llama3.2:3b'
userInput = input("Ask me anything... \n")

ai_response = ollama.chat(
    model=desiredModel,
    messages=[
        {
            'role': 'user',
            'content': userInput,
        },
    ],
)

OllamaResponse = ai_response['message']['content']
print(f"The AI responded: {OllamaResponse}")

with open("OutputOllama.txt", "w", encoding="utf-8") as text_file:
    text_file.write(OllamaResponse)

but I get the ModuleNotFoundError: No module named 'ollama'