akiradev0x
Ok, there's a big refactor we're packaging today that should fix the hanging. It will also include OpenAI support. @mberman84 @businistry
Interesting. Another person tried it and had an issue on an M3. Taking a deeper dive tomorrow.
Hey @mberman84, did it end up working for you? We just pushed a new version that seems to be better.
Hey all, here's a couple of things you can try: 1) uninstall via pipx and reinstall the latest version, 2) upgrade your Node.js version to 22. I tested this yesterday (I found...
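For anyone following along, the two steps above roughly translate to the commands below. This is just a sketch: `your-package` is a placeholder for the actual package name (check `pipx list`), and the Node.js upgrade assumes you manage Node versions with nvm.

```sh
# Reinstall the tool via pipx ("your-package" is a placeholder; see `pipx list`)
pipx uninstall your-package
pipx install your-package

# Upgrade Node.js to 22 (assumes nvm is installed)
nvm install 22
nvm use 22
node --version   # should print v22.x
```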
Awesome! @businistry glad to hear it. I think for now we can close this issue.
Ok so `litellm proxy` is deprecated, @antoninoLorenzo. You should just be able to run the completion as normal with `api_base` set to the Ollama base URL, as it...
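Something like this minimal sketch should work, assuming litellm is installed and Ollama is serving on its default port; the model name is just an example, swap in whatever you've pulled into Ollama:

```python
from litellm import completion

# Call an Ollama-hosted model directly, no proxy needed.
response = completion(
    model="ollama/llama3",               # example; any model pulled into Ollama
    api_base="http://localhost:11434",   # Ollama's default base URL
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```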
Sounds good to me! @antoninoLorenzo
@cyborgmarina yeah that seems correct
Ollama support merged. Closing.
I think right now this is a context-management problem; to me that looks like a hallucination from the model. Just made some updates to the tools the model has...