danielgen
This issue is possibly related to this one: https://github.com/omz/Pythonista-Issues/issues/652
For me Llama3 works as expected in the Ollama CLI. However it does not work in CrewAI, not even when specifying the same modelfile. Not sure if Ollama is at fault here,...
Not sure what the issue is with the Ollama llama3 model and CrewAI; it seems to have a hard time stopping generation. Llama3 and Llama2 work correctly from Ollama...
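For context, a "hard time stopping generation" often comes down to stop tokens. A minimal Modelfile sketch of the kind the CrewAI docs point to is below — the stop tokens shown are an assumption based on llama3's chat template, not something verified against this issue:

```
FROM llama3

# Assumed stop tokens from llama3's chat template; adjust for your model
PARAMETER stop "<|eot_id|>"
PARAMETER stop "<|start_header_id|>"
```

You would then build it with `ollama create crewai-llama3 -f ./Modelfile` and reference `crewai-llama3` from CrewAI.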
I can see the regex is now updated to the suggested version. I am still seeing this behaviour with a custom tool to execute some code. I printed `text` and...
@contractorwolf you tagged me by mistake, you meant to tag [kyuumeitai](https://github.com/kyuumeitai) based on your quote
@mustangs0786 just use a modelfile [following documentation](https://docs.crewai.com/how-to/LLM-Connections/#ollama-integration-ex-for-using-llama-2-locally) and then reference the model, e.g.:

```python
from langchain_openai import ChatOpenAI
import os

os.environ["OPENAI_API_KEY"] = ""
ollama_llm = ChatOpenAI(
    model_name='crewai-llama2',
    base_url="http://localhost:11434/v1"...
```
I am not sure I follow the example, but: the model passes only a string to tools, in my understanding (i.e. the model does not pass datatypes other than...
@brunoboto96 it does work, take a look at this example from the tools library: https://github.com/joaomdmoura/crewAI-examples/blob/main/prep-for-a-meeting/tools/ExaSearchTool.py#L23 The newest issue you are facing seems to be related to the model you are...
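As a plain-Python illustration of the pattern in that example (the function name and delimiter here are hypothetical, not CrewAI's API): the agent hands the tool a single string, and the tool itself parses out any structure it needs.

```python
import json


def search_and_summarize(text: str) -> str:
    """Hypothetical tool body: receives one string from the model.

    Since the model cannot pass structured datatypes, any structure must
    be encoded in the string itself (here: "query|num_results").
    """
    query, _, n = text.partition("|")
    # Fall back to a default when the model omits the count
    num_results = int(n) if n.strip().isdigit() else 3
    # ... perform the actual search with `query` and `num_results` ...
    return json.dumps({"query": query.strip(), "num_results": num_results})


# The model would invoke the tool with a single pipe-delimited string:
result = json.loads(search_and_summarize("latest CrewAI release | 5"))
```

The key point is that splitting and validation happen inside the tool, so the agent only ever deals in strings.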
@brunoboto96 yeah, I am just a user, not a repo maintainer. What I am trying to say to help you is: 1. I had the exact same behaviour as you...