WillJesperson

Results: 5 comments by WillJesperson

I have validated that the server is running using `ollama serve` and have set the default model as directed. Still no dice. Here is a little more information to help....
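In case it helps others hitting this, here is how I am checking that the server is actually reachable. These use the standard Ollama HTTP API on its default port; the hostname is just the local default:

```sh
# Ollama listens on port 11434 by default; this should print "Ollama is running"
curl http://localhost:11434

# List the models the server has actually pulled; the model name fabric
# is configured with must match one of these exactly
curl http://localhost:11434/api/tags
```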

Any idea why Ollama has moved away from using markdown for prompts?

I've run into the same issue while using my local model. I am, however, able to use a remote Ollama server. I think this gives credence to the idea that...

If I make a `--remoteOllamaServer` call on the machine that is actually running the model, then it works. But it refuses to run natively on that machine regardless of what...
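To be concrete, this is roughly the shape of the call that works for me. The host address, model, and pattern names below are placeholders, and I'm assuming the flag takes the server URL directly:

```sh
# Works: run fabric on a different machine, pointed at the box serving the model
fabric --remoteOllamaServer http://192.168.1.50:11434 --model llama2 --pattern summarize < input.txt

# Refuses to run for me: the same call natively on the serving machine, no remote flag
fabric --model llama2 --pattern summarize < input.txt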

You'd have to implement some sort of program to web-scrape that for you (rough sketch below). Fabric is simply the infrastructure that allows you to use consistent prompts with a given AI model....
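As a rough sketch of what I mean (the URL and pattern name here are just examples), you could do the fetching yourself and pipe the text into fabric, which then just applies the pattern to whatever arrives on stdin:

```sh
# Do the scraping yourself, then let fabric apply a consistent prompt to the result
curl -s https://example.com/article | fabric --pattern summarize
```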