
TXTSearchTool with ollama

fre391 opened this issue · 5 comments (status: Open)

Search is NOT limited to the given txt file.

from crewai_tools import TXTSearchTool

txt_search_tool = TXTSearchTool(
    txt="kunst.txt",
    config=dict(
        llm=dict(
            provider="ollama",
            config=dict(
                model="llama3.1",
            ),
        ),
        embedder=dict(
            provider="ollama",
            config=dict(
                model="mxbai-embed-large",
            ),
        ),
    ),
)

There is no error message and the best result is shown at the end, but in between (verbose output) it also shows snippets from other sources, e.g. from a PDF which was searched by XMLSearchTool earlier, using a separate script....
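A likely cause is that the RAG tools share the same default local vector store, so a later search can surface chunks embedded by an earlier tool. A possible workaround, sketched here under the assumption that the tool's config accepts an embedchain-style `vectordb` section (the collection name `kunst_txt` and directory `db_kunst` are made-up names), is to give each tool its own collection:

```python
# Sketch: same llm/embedder config as above, plus a hypothetical
# dedicated vector store so this tool does not share embeddings
# with other tools.
config = dict(
    llm=dict(
        provider="ollama",
        config=dict(model="llama3.1"),
    ),
    embedder=dict(
        provider="ollama",
        config=dict(model="mxbai-embed-large"),
    ),
    vectordb=dict(
        provider="chroma",
        config=dict(
            collection_name="kunst_txt",  # one collection per tool
            dir="db_kunst",               # separate on-disk directory
        ),
    ),
)
print(config["vectordb"]["config"]["collection_name"])
```

Whether this is honored depends on the crewai-tools version; deleting the old local `db/` directory between runs is a cruder way to get the same isolation.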

fre391 commented Aug 08 '24

@fre391, nice solution, though I am not yet getting it to work. I am trying to apply the text tool to a local file like so:

tool_txt_search = TXTSearchTool(
    txt=str(path / 'nonsense.txt'),  # `path` is a pathlib.Path defined earlier; cast to str to be safe
    config=config,
    verbose=True
)

My config:

config=dict(
    llm=dict(
        provider="ollama",  # Change this to your LLM provider
        config=dict(
            model="llama3.1",  # Specify the model you want to use
        ),
    ),
    embedder=dict(
        provider="ollama",  # Change this to your LLM provider
        config=dict(
            model="mxbai-embed-large",  # Specify the embedding model 
        ),
    ),
)

I am currently running this from Jupyter (on a server, via SSH). Unfortunately, the instantiation never seems to finish. I see no text output, and nothing in `journalctl -u ollama` either.

How long does it take you to instantiate the tool? Any idea why it might get stuck?
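A hang during instantiation can happen when the embedder cannot reach the Ollama server at all. Before creating the tool, it may help to sanity-check the endpoint; this is a sketch using only the standard library (the URL and port are Ollama's defaults, and the helper name is made up):

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url="http://localhost:11434", timeout=2.0):
    """Return True if an HTTP server answers at base_url within timeout."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            # Ollama's root endpoint replies 200 with "Ollama is running"
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if not ollama_reachable():
    print("Ollama is not reachable; check that `ollama serve` is running.")
```

If this returns False from the Jupyter server, the notebook and Ollama are likely not on the same host, or the port is firewalled over the SSH hop.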

iNLyze commented Oct 10 '24

+1. I am facing the same issue.

kspviswa commented Dec 01 '24

I got everything working with this:

os.environ['OPENAI_API_BASE'] = 'http://localhost:11434'
os.environ['OPENAI_MODEL_NAME'] = 'ollama/llama3.2'
os.environ['OPENAI_API_KEY'] = 'NA'

Note: this makes Ollama your default LLM, so there is no need to configure each tool separately.
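For this workaround to take effect, the variables presumably need to be set before crewai reads them, i.e. before the relevant imports (an assumption about import-time behavior, not something the thread confirms). A minimal sketch:

```python
import os

# Point the OpenAI-compatible client at the local Ollama server.
# Assumption: set these BEFORE importing crewai / crewai_tools,
# since the settings may be read at import time.
os.environ['OPENAI_API_BASE'] = 'http://localhost:11434'
os.environ['OPENAI_MODEL_NAME'] = 'ollama/llama3.2'
os.environ['OPENAI_API_KEY'] = 'NA'  # dummy value; Ollama does not check the key

# Only afterwards:
# from crewai_tools import TXTSearchTool
# txt_search_tool = TXTSearchTool(txt="kunst.txt")
```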

siddas27 commented Dec 05 '24

> I got everything working with this:
>
>     os.environ['OPENAI_API_BASE'] = 'http://localhost:11434'
>     os.environ['OPENAI_MODEL_NAME'] = 'ollama/llama3.2'
>     os.environ['OPENAI_API_KEY'] = 'NA'
>
> Note: this makes Ollama your default LLM, so there is no need to configure each tool separately.

Sorry, this is not working for me.

/.venv/lib/python3.12/site-packages/crewai/agent.py", line 161, in post_init_setup
    if env_var["key_name"] in unnacepted_attributes:
       ~~~~~~~^^^^^^^^^^^^

It is very clear that the Ollama provider doesn't have a key called `key_name`. I wonder how it worked for you. Can you elaborate?

kspviswa commented Dec 06 '24

OK, I inspected the code and found that I had to upgrade crewai to the latest version. Now it is working.
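For anyone else hitting the `key_name` KeyError: since the fix was simply a version upgrade, it is worth confirming which versions are actually installed. A small standard-library sketch (the package names are the published PyPI ones; the helper name is made up):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(package):
    """Return the installed version of `package`, or None if it is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# After `pip install --upgrade crewai crewai-tools`, confirm what you got:
for pkg in ("crewai", "crewai-tools"):
    print(pkg, installed_version(pkg) or "not installed")
```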

kspviswa commented Dec 06 '24