Before:

```
litellm-1  | POST Request Sent from LiteLLM:
litellm-1  | curl -X POST \
litellm-1  | http://ollama:11434/api/generate \
litellm-1  | -d '{'model': 'mistral', 'prompt': 'why is the sky blue?', 'options':...
```
Hi Jakob, base64-encoded data shouldn't contain commas; it's restricted to the character set [A-Za-z0-9/+=]: https://en.wikipedia.org/wiki/Base64
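For reference, a quick sanity check along these lines might look like the sketch below (the regex and helper name are just for illustration, not from any existing code in this thread):

```python
import re

# Standard base64 alphabet plus '=' padding; a comma is never a valid character.
BASE64_RE = re.compile(r"^[A-Za-z0-9+/]*={0,2}$")

def looks_like_base64(s: str) -> bool:
    """Rough check that a string uses only base64 characters and has a valid length."""
    return bool(BASE64_RE.fullmatch(s)) and len(s) % 4 == 0
```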
Just as a follow-up, the test only checks that the ollama server is responding to connections. In my own case, and this is likely overkill for most people, I check...
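As a rough illustration of the difference, a connection-only check versus a heavier end-to-end check could look something like this (host/port taken from the logs above; the function names, timeouts, and use of `stream: false` are assumptions, not taken from the actual test):

```python
import requests

OLLAMA_URL = "http://ollama:11434"  # assumed host/port, matching the logs above

def ollama_is_up(timeout: float = 2.0) -> bool:
    """Connection-level check: is the server answering HTTP at all?"""
    try:
        return requests.get(OLLAMA_URL, timeout=timeout).ok
    except requests.RequestException:
        return False

def ollama_can_generate(model: str = "mistral", timeout: float = 30.0) -> bool:
    """Heavier check: ask the model for a short completion via /api/generate."""
    try:
        r = requests.post(
            f"{OLLAMA_URL}/api/generate",
            json={"model": model, "prompt": "ping", "stream": False},
            timeout=timeout,
        )
        return r.ok
    except requests.RequestException:
        return False
```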


phi3:3.8b-instruct is the q4_0 quantized version of the model. If you use the fp16 version, it follows instructions better.

```
$ ollama run phi3:3.8b-mini-128k-instruct-fp16
>>> /set system As a knowledge...
```
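If you drive the same model over the HTTP API instead of the interactive CLI, the system prompt can be sent alongside the request. A minimal sketch, assuming the `/api/generate` payload shape shown earlier in the thread (the system text here is only a placeholder):

```python
import requests

payload = {
    "model": "phi3:3.8b-mini-128k-instruct-fp16",  # fp16 tag from the CLI example above
    "system": "As a knowledge assistant, answer concisely.",  # placeholder system prompt
    "prompt": "why is the sky blue?",
    "stream": False,
}

resp = requests.post("http://ollama:11434/api/generate", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json().get("response", ""))
```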
See https://github.com/BerriAI/litellm/pull/2888
This should be fixed.