MrMWith
I am trying to run Ollama locally per the instructions (from my Docker container); however, for this example it appears to still be using OpenAI. Not sure, but I can...
Try this: `ollama_openhermes = Ollama(model="openhermes")`, or use the name of your Modelfile version if you did that step.
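If the wrapper still routes to OpenAI, one way to confirm the local model works is to talk to Ollama's REST API directly. The sketch below uses only the standard library and Ollama's default endpoint (`http://localhost:11434/api/generate`); the model name `openhermes` is taken from the comment above, and it assumes the Ollama server is already running locally.

```python
import json
import urllib.request

# Default local Ollama endpoint; adjust host/port if your Docker setup maps them differently.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body that Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a generation request to the local Ollama server (must be running)."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Uses the model named in the thread; swap in your own Modelfile name if you created one.
    print(generate("openhermes", "Say hello in one word."))
```

If this call succeeds, the local server and model are fine, and any remaining OpenAI traffic is coming from the framework's configuration rather than from Ollama itself.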