Zarinah Casanova
Thank you. I had the same issue and this fixed it.
My workaround was to have the litellm SDK use Ollama's OpenAI-compatible endpoint: instead of using `ollama/llama3.2:latest`, I prefix the model name with `openai/` and provide the `api_base`:

```python
responses = completion(
    model="openai/llama3.2:latest",
    api_base="http://localhost:11434/v1",
    ...
```
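For reference, a minimal runnable sketch of that workaround, assuming Ollama is serving on its default port (11434) and `llama3.2` has already been pulled; the prompt content here is just illustrative:

```python
from litellm import completion

# The "openai/" prefix makes litellm route the call through its OpenAI
# provider, pointed at Ollama's OpenAI-compatible /v1 endpoint.
response = completion(
    model="openai/llama3.2:latest",
    api_base="http://localhost:11434/v1",  # Ollama's default address (assumed)
    messages=[{"role": "user", "content": "Hello!"}],  # illustrative prompt
)
print(response.choices[0].message.content)
```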
It seems like this project is no longer maintained. What do you all think? I really love the vscode integration but worry about investing more time in it since it...
> In version 1.0.18, traces were working fine. It looks like something introduced in v19 broke things. I see the [tracing service](https://github.com/langflow-ai/langflow/blob/main/src/backend/base/langflow/services/tracing/service.py) logic was slightly changed when introducing ruff...
> @italojohnny Other than the Python code being added into the input, the output is not showing empty. See the screenshot above. Were you able to reproduce the empty output?...