Christian Weyer
Actually, I see this with any function-capable model and the latest LiteLLM release @krrishdholakia. I tried several (though admittedly *not* Gemma 😉). We always get that wrongly nested result...
Not quite, @krrishdholakia. This is what I see via LiteLLM:

```json
"tool_calls" : [ {
    "function" : {
        "arguments" : "{\n \"name\": \"get_current_weather\", \n \"arguments\": {\"location\": \"Boston, MA\"}\n}\n",
        "name" :...
```
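To make the bug concrete, here is a minimal sketch (not LiteLLM's actual code) showing that the `arguments` string above is itself a serialized function call, i.e. the whole call got nested one level too deep, and how a flattened version would look:

```python
import json

# The malformed "arguments" value as returned via LiteLLM (from the snippet above):
raw_arguments = '{\n "name": "get_current_weather", \n "arguments": {"location": "Boston, MA"}\n}\n'

# Parsing it reveals a complete function call nested inside "arguments":
nested = json.loads(raw_arguments)
print(nested["name"])       # the function name ended up inside the arguments string
print(nested["arguments"])  # the real arguments are one level deeper

# What a correctly-shaped OpenAI-style tool call would carry instead
# (name at the top level, arguments as a JSON string of only the parameters):
fixed = {
    "name": nested["name"],
    "arguments": json.dumps(nested["arguments"]),
}
print(fixed)
```

The fix would be for the Ollama handler to unwrap this structure before building the `tool_calls` entry.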
Maybe the issue is here, @krrishdholakia:
https://github.com/BerriAI/litellm/blob/4913ad41db9f10261790917ccdf73dbb535c0366/litellm/llms/ollama.py#L221
https://github.com/BerriAI/litellm/blob/4913ad41db9f10261790917ccdf73dbb535c0366/litellm/llms/ollama.py#L318
BTW: I also think it should be `"finish_reason" : "tool_calls"`. With LiteLLM it is `"stop"`.
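A hedged sketch of the expected mapping (illustrative names, not LiteLLM's actual internals): per the OpenAI response format, a choice whose message contains tool calls should report `finish_reason: "tool_calls"` rather than the provider's `"stop"`:

```python
def map_finish_reason(message: dict, provider_reason: str) -> str:
    """Override the provider's finish reason when the message carries tool calls."""
    if message.get("tool_calls"):
        return "tool_calls"
    return provider_reason

# A message with tool calls should not surface as "stop":
print(map_finish_reason(
    {"tool_calls": [{"function": {"name": "get_current_weather"}}]},
    "stop",
))

# A plain text completion keeps the provider's reason:
print(map_finish_reason({"content": "Hi there"}, "stop"))
```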
> I think this was resolved in `v1.35.34+` by PR #1526 as discussed in related issue #3333 . Requires using the `ollama_chat/` prefix in place of `ollama/`. Streaming responses remain...
> Hey @ChristianWeyer Which PR are you referring to? I might've missed it. > > We have finish reason mapping here - > > https://github.com/BerriAI/litellm/blob/918367cc7bdc9e8e01477243ebc963709ac8178e/litellm/utils.py#L188 This: https://github.com/BerriAI/litellm/pull/2597
Hi @jostFT - interesting. @johan-v-r I am still having issues with `LibSassBuilder` on my M1 ARM MacBook Pro. Error: `/Users/xyz/.nuget/packages/libsassbuilder/2.0.1/build/LibSassBuilder.targets(95,5): error MSB3073: The command "dotnet "/Users/xyz/.nuget/packages/libsassbuilder/2.0.1/build/../tool/LibSassBuilder.dll" files "/Users/xyz/Sources/blazor-wasm-things-to-know/IsolatedJS/Client/Shared/MainLayout.razor.scss" --outputstyle compressed...
It stays like this, yes.
Any more ideas on how to fix this?
> https://github.com/abetlen/llama-cpp-python

@imartinez When we run privateGPT on an M1, it only uses the CPU, right? There is currently no GPU support (e.g. via MPS)?