Example Modelfile for Ollama openhermes2.5-mistral
Would this be a correct Modelfile to use with Ollama for openhermes2.5-mistral?
```
TEMPLATE """<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""
PARAMETER num_ctx 4096
PARAMETER top_p 0.5
PARAMETER temperature 0
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"
PARAMETER stop "Observation"
```
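For quick experimentation, the same tuning can also be passed per-request through the `options` field of Ollama's REST API instead of being baked into a Modelfile. A minimal sketch, assuming a default local Ollama server on port 11434 (the prompt text is just a placeholder):

```python
import json

# Mirror the Modelfile parameters in a per-request "options" object
# for Ollama's /api/generate endpoint.
payload = {
    "model": "openhermes2.5-mistral",
    "prompt": "Plan a 3-day trip to Lisbon.",  # placeholder prompt
    "options": {
        "num_ctx": 4096,
        "top_p": 0.5,
        "temperature": 0,
        "stop": ["<|im_start|>", "<|im_end|>", "Observation"],
    },
}

body = json.dumps(payload)
# To actually send it (requires a running Ollama server):
#   curl http://localhost:11434/api/generate -d @- <<< "$body"
print(body)
```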
I am trying to run the trip_planner example locally using Ollama and the openhermes2.5-mistral model. The example README says:
> Configure Ollama: Set up Ollama to work with your local model. You will probably need to tweak the model using a Modelfile, I'd recommend adding Observation as a stop word and playing with top_p and temperature.
https://github.com/joaomdmoura/crewAI-examples/tree/main/trip_planner
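Putting the README's advice together, a complete Modelfile might look like the sketch below. Note that a Modelfile also needs a `FROM` line naming the base model; the base-model name and the parameter values here are assumptions to experiment with, not a verified configuration.

```
FROM openhermes2.5-mistral
TEMPLATE """<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""
PARAMETER num_ctx 4096
PARAMETER top_p 0.5
PARAMETER temperature 0
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"
PARAMETER stop "Observation"
```

It can then be built with `ollama create <some-model-name> -f Modelfile` (the name is up to you) and referenced from the example's configuration.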
The reason I am asking is that if I use the default Ollama Modelfile for openhermes2.5-mistral, I get errors like this:
```
Retrying langchain.chat_models.openai.ChatOpenAI.completion_with_retry.
```
Maybe this is the same problem as #2:
https://github.com/joaomdmoura/crewAI-examples/issues/2