skodavalla
Same error when using base_url='http://192.168.10.8:11434/api':

2025-03-26 19:18:58,428 - LiteLLM - INFO - LiteLLM completion() model= llama3.2:latest; provider = ollama
2025-03-26 19:18:58,493 - root - ERROR - LiteLLM call failed: litellm.APIConnectionError:...
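For context, here is a minimal sketch of the kind of direct LiteLLM call I believe this maps to (my assumption, not the project's actual code). LiteLLM's Ollama provider is normally addressed via the "ollama/" model prefix and an api_base pointing at the Ollama root, without the "/api" suffix:

```python
# Hypothetical minimal reproduction, assuming LiteLLM's "ollama/" provider
# prefix and the api_base parameter; the host/IP is the one from the log above.
import litellm

response = litellm.completion(
    model="ollama/llama3.2:latest",
    messages=[{"role": "user", "content": "Hello"}],
    api_base="http://192.168.10.8:11434",  # assumption: Ollama root, no /api suffix
)
print(response.choices[0].message.content)
```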
Please let me know if any additional info is needed. Also, the container size is more than 40 GB; can it be reduced?