
API Exceptions when attempting task using Ollama & bytebot-llm-proxy

Open radiantone opened this issue 3 months ago • 2 comments

litellm-config.yaml

  - model_name: llama-deepseek-r1
    litellm_params:
      drop_params: true
      model: ollama/deepseek-r1:8b
      api_base: http://host.docker.internal:11434

Ollama Log

time=2025-09-05T15:30:54.687-04:00 level=INFO source=app_windows.go:272 msg="starting Ollama" app=C:\Users\VYRAL\AppData\Local\Programs\Ollama version=0.11.10 OS=Windows/10.0.26100
time=2025-09-05T15:30:54.688-04:00 level=INFO source=app.go:212 msg="initialized tools registry" tool_count=0
time=2025-09-05T15:30:54.691-04:00 level=INFO source=app.go:227 msg="starting ollama server"
time=2025-09-05T15:30:54.749-04:00 level=INFO source=app.go:256 msg="starting ui server" port=56034
time=2025-09-05T15:30:56.066-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/settings http.pattern="GET /api/v1/settings" http.status=200 http.d=0s request_id=1757100656066995400 version=0.11.10
time=2025-09-05T15:30:56.067-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/chats http.pattern="GET /api/v1/chats" http.status=200 http.d=0s request_id=1757100656067499700 version=0.11.10
time=2025-09-05T15:30:56.069-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/models http.pattern="GET /api/v1/models" http.status=200 http.d=1.5067ms request_id=1757100656067499700 version=0.11.10
time=2025-09-05T15:30:56.070-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/me http.pattern="GET /api/v1/me" http.status=200 http.d=21.4762ms request_id=1757100656049046500 version=0.11.10
time=2025-09-05T15:30:56.083-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/chats http.pattern="GET /api/v1/chats" http.status=200 http.d=503.7µs request_id=1757100656082986300 version=0.11.10
time=2025-09-05T15:30:56.088-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/models/turbo http.pattern="GET /api/v1/models/turbo" http.status=200 http.d=20.6865ms request_id=1757100656067499700 version=0.11.10
time=2025-09-05T15:30:56.111-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/model/deepseek-r1:8b/capabilities http.pattern="GET /api/v1/model/{model}/capabilities" http.status=200 http.d=27.6397ms request_id=1757100656083490000 version=0.11.10
time=2025-09-05T15:30:56.214-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=POST http.path=/api/v1/model/upstream http.pattern="POST /api/v1/model/upstream" http.status=200 http.d=131.8353ms request_id=1757100656082986300 version=0.11.10
time=2025-09-05T15:30:57.749-04:00 level=INFO source=updater.go:252 msg="beginning update checker" interval=1h0m0s
time=2025-09-05T15:31:00.023-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/chat/01991868-ebb1-72f1-a856-86b1e53656cc http.pattern="GET /api/v1/chat/{id}" http.status=200 http.d=563.7µs request_id=1757100660023429600 version=0.11.10
time=2025-09-05T15:31:00.056-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/chat/01991b56-1b5f-7b99-8a96-4d6a64394edf http.pattern="GET /api/v1/chat/{id}" http.status=200 http.d=596µs request_id=1757100660055864700 version=0.11.10
time=2025-09-05T15:31:00.530-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/settings http.pattern="GET /api/v1/settings" http.status=200 http.d=999.1µs request_id=1757100660529487700 version=0.11.10
time=2025-09-05T15:31:00.531-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/models http.pattern="GET /api/v1/models" http.status=200 http.d=1.0152ms request_id=1757100660530486800 version=0.11.10
time=2025-09-05T15:31:00.554-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/models/turbo http.pattern="GET /api/v1/models/turbo" http.status=200 http.d=23.7919ms request_id=1757100660530990300 version=0.11.10
time=2025-09-05T15:31:02.124-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/chat/01991868-ebb1-72f1-a856-86b1e53656cc http.pattern="GET /api/v1/chat/{id}" http.status=200 http.d=533.5µs request_id=1757100662123684300 version=0.11.10
time=2025-09-05T15:31:02.239-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/chat/01991b56-1b5f-7b99-8a96-4d6a64394edf http.pattern="GET /api/v1/chat/{id}" http.status=200 http.d=765.3µs request_id=1757100662238831600 version=0.11.10
time=2025-09-05T15:31:03.356-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/settings http.pattern="GET /api/v1/settings" http.status=200 http.d=0s request_id=1757100663356428000 version=0.11.10
time=2025-09-05T15:31:03.357-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/models http.pattern="GET /api/v1/models" http.status=200 http.d=1.0264ms request_id=1757100663356428000 version=0.11.10
time=2025-09-05T15:31:03.377-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/models/turbo http.pattern="GET /api/v1/models/turbo" http.status=200 http.d=21.0236ms request_id=1757100663356428000 version=0.11.10
time=2025-09-05T15:31:05.741-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/chats http.pattern="GET /api/v1/chats" http.status=200 http.d=512.6µs request_id=1757100665741409900 version=0.11.10
time=2025-09-05T15:31:05.748-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/settings http.pattern="GET /api/v1/settings" http.status=200 http.d=512.4µs request_id=1757100665747804100 version=0.11.10
time=2025-09-05T15:31:05.749-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/models http.pattern="GET /api/v1/models" http.status=200 http.d=1.6197ms request_id=1757100665748316500 version=0.11.10
time=2025-09-05T15:31:05.772-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/models/turbo http.pattern="GET /api/v1/models/turbo" http.status=200 http.d=23.5545ms request_id=1757100665748831200 version=0.11.10
time=2025-09-05T15:31:09.079-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=POST http.path=/api/v1/chat/new http.pattern="POST /api/v1/chat/{id}" http.status=200 http.d=3.3403321s request_id=1757100665738849800 version=0.11.10
time=2025-09-05T15:31:09.080-04:00 level=INFO source=ui.go:167 msg=site.serveHTTP http.method=GET http.path=/api/v1/chat/01991b5c-6b8a-7cf7-910a-ea64d858ff99 http.pattern="GET /api/v1/chat/{id}" http.status=200 http.d=0s request_id=1757100669080473400 version=0.11.10
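
One caveat about the log above: it is from the Ollama desktop app's UI server (port 56034, the /api/v1/* routes), not from the inference API on port 11434 that LiteLLM actually calls, so the failing completions never show up in it. To rule out basic connectivity, the inference API can be probed directly; a minimal sketch (assumes the api_base from the config above and Node 18+ for global fetch):

    // Probe the Ollama inference API directly (a sketch; adjust OLLAMA_BASE to your setup).
    const OLLAMA_BASE = "http://host.docker.internal:11434";

    async function checkOllama(): Promise<void> {
      // GET /api/tags lists locally installed models; a 200 means the API is reachable.
      const res = await fetch(`${OLLAMA_BASE}/api/tags`);
      if (!res.ok) throw new Error(`Ollama unreachable: HTTP ${res.status}`);
      const body = (await res.json()) as { models: Array<{ name: string }> };
      console.log("available models:", body.models.map((m) => m.name).join(", "));
    }

    checkOllama().catch(console.error);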

bytebot-agent and bytebot-llm-proxy logs

bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM     LOG [TasksService] Found existing task with ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2, and status PENDING. Resuming.
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM     LOG [TasksService] Updating task with ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM   DEBUG [TasksService] Update data: {"status":"RUNNING","executedAt":"2025-09-05T19:31:30.005Z"}
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM     LOG [TasksService] Retrieving task by ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM   DEBUG [TasksService] Retrieved task with ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM     LOG [TasksService] Successfully updated task ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM   DEBUG [TasksService] Updated task: {"id":"41d988d4-4bf4-4624-a3a7-7781bbea7ca2","description":"Hi","type":"IMMEDIATE","status":"RUNNING","priority":"MEDIUM","control":"ASSISTANT","createdAt":"2025-09-05T19:31:26.966Z","createdBy":"USER","scheduledFor":null,"updatedAt":"2025-09-05T19:31:30.007Z","executedAt":"2025-09-05T19:31:30.005Z","completedAt":null,"queuedAt":null,"error":null,"result":null,"model":{"name":"ollama/deepseek-r1:8b","title":"llama-deepseek-r1","provider":"proxy","contextWindow":128000}}
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM   DEBUG [AgentScheduler] Processing task ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM     LOG [AgentProcessor] Starting processing for task ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM     LOG [TasksService] Retrieving task by ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM   DEBUG [TasksService] Retrieved task with ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM     LOG [AgentProcessor] Processing iteration for task ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:30 PM   DEBUG [AgentProcessor] Sending 1 messages to LLM for processing
bytebot-llm-proxy  | INFO:     172.18.0.5:51514 - "POST /chat/completions HTTP/1.1" 200 OK
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:31 PM   DEBUG [AgentProcessor] Received 1 content blocks from LLM
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:31 PM   DEBUG [AgentProcessor] Token usage for task 41d988d4-4bf4-4624-a3a7-7781bbea7ca2: 4124/128000 (3%)
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:31 PM     LOG [TasksService] Retrieving task by ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:31 PM   DEBUG [TasksService] Retrieved task with ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:31 PM     LOG [AgentProcessor] Processing iteration for task ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:31 PM   DEBUG [AgentProcessor] Sending 2 messages to LLM for processing
bytebot-llm-proxy  | 19:31:33 - LiteLLM Proxy:ERROR: common_request_processing.py:642 - litellm.proxy.proxy_server._handle_llm_api_exception(): Exception occured - litellm.APIConnectionError: list index out of range
bytebot-llm-proxy  | Traceback (most recent call last):
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/main.py", line 3162, in completion
bytebot-llm-proxy  |     response = base_llm_http_handler.completion(
bytebot-llm-proxy  |         model=model,
bytebot-llm-proxy  |     ...<13 lines>...
bytebot-llm-proxy  |         client=client,
bytebot-llm-proxy  |     )
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 329, in completion
bytebot-llm-proxy  |     data = provider_config.transform_request(
bytebot-llm-proxy  |         model=model,
bytebot-llm-proxy  |     ...<3 lines>...
bytebot-llm-proxy  |         headers=headers,
bytebot-llm-proxy  |     )
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/llms/ollama/completion/transformation.py", line 344, in transform_request
bytebot-llm-proxy  |     modified_prompt = ollama_pt(model=model, messages=messages)
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py", line 237, in ollama_pt
bytebot-llm-proxy  |     tool_calls = messages[msg_i].get("tool_calls")
bytebot-llm-proxy  |                  ~~~~~~~~^^^^^^^
bytebot-llm-proxy  | IndexError: list index out of range
bytebot-llm-proxy  | . Received Model Group=ollama/deepseek-r1:8b
bytebot-llm-proxy  | Available Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2
bytebot-llm-proxy  | Traceback (most recent call last):
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/main.py", line 3162, in completion
bytebot-llm-proxy  |     response = base_llm_http_handler.completion(
bytebot-llm-proxy  |         model=model,
bytebot-llm-proxy  |     ...<13 lines>...
bytebot-llm-proxy  |         client=client,
bytebot-llm-proxy  |     )
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 329, in completion
bytebot-llm-proxy  |     data = provider_config.transform_request(
bytebot-llm-proxy  |         model=model,
bytebot-llm-proxy  |     ...<3 lines>...
bytebot-llm-proxy  |         headers=headers,
bytebot-llm-proxy  |     )
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/llms/ollama/completion/transformation.py", line 344, in transform_request
bytebot-llm-proxy  |     modified_prompt = ollama_pt(model=model, messages=messages)
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py", line 237, in ollama_pt
bytebot-llm-proxy  |     tool_calls = messages[msg_i].get("tool_calls")
bytebot-llm-proxy  |                  ~~~~~~~~^^^^^^^
bytebot-llm-proxy  | IndexError: list index out of range
bytebot-llm-proxy  |
bytebot-llm-proxy  | During handling of the above exception, another exception occurred:
bytebot-llm-proxy  |
bytebot-llm-proxy  | Traceback (most recent call last):
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/proxy/proxy_server.py", line 4055, in chat_completion
bytebot-llm-proxy  |     result = await base_llm_response_processor.base_process_llm_request(
bytebot-llm-proxy  |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
bytebot-llm-proxy  |     ...<16 lines>...
bytebot-llm-proxy  |     )
bytebot-llm-proxy  |     ^
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/proxy/common_request_processing.py", line 436, in base_process_llm_request
bytebot-llm-proxy  |     responses = await llm_responses
bytebot-llm-proxy  |                 ^^^^^^^^^^^^^^^^^^^
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1083, in acompletion
bytebot-llm-proxy  |     raise e
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1059, in acompletion
bytebot-llm-proxy  |     response = await self.async_function_with_fallbacks(**kwargs)
bytebot-llm-proxy  |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3902, in async_function_with_fallbacks
bytebot-llm-proxy  |     return await self.async_function_with_fallbacks_common_utils(
bytebot-llm-proxy  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
bytebot-llm-proxy  |     ...<8 lines>...
bytebot-llm-proxy  |     )
bytebot-llm-proxy  |     ^
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3860, in async_function_with_fallbacks_common_utils
bytebot-llm-proxy  |     raise original_exception
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3894, in async_function_with_fallbacks
bytebot-llm-proxy  |     response = await self.async_function_with_retries(*args, **kwargs)
bytebot-llm-proxy  |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/router.py", line 4099, in async_function_with_retries
bytebot-llm-proxy  |     raise original_exception
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/router.py", line 3990, in async_function_with_retries
bytebot-llm-proxy  |     response = await self.make_call(original_function, *args, **kwargs)
bytebot-llm-proxy  |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/router.py", line 4108, in make_call
bytebot-llm-proxy  |     response = await response
bytebot-llm-proxy  |                ^^^^^^^^^^^^^^
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1364, in _acompletion
bytebot-llm-proxy  |     raise e
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/router.py", line 1316, in _acompletion
bytebot-llm-proxy  |     response = await _response
bytebot-llm-proxy  |                ^^^^^^^^^^^^^^^
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1586, in wrapper_async
bytebot-llm-proxy  |     raise e
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1437, in wrapper_async
bytebot-llm-proxy  |     result = await original_function(*args, **kwargs)
bytebot-llm-proxy  |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/main.py", line 563, in acompletion
bytebot-llm-proxy  |     raise exception_type(
bytebot-llm-proxy  |     ...<5 lines>...
bytebot-llm-proxy  |     )
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/main.py", line 536, in acompletion
bytebot-llm-proxy  |     init_response = await loop.run_in_executor(None, func_with_context)
bytebot-llm-proxy  |                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
bytebot-llm-proxy  |   File "/usr/lib/python3.13/concurrent/futures/thread.py", line 59, in run
bytebot-llm-proxy  |     result = self.fn(*self.args, **self.kwargs)
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1060, in wrapper
bytebot-llm-proxy  |     result = original_function(*args, **kwargs)
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/main.py", line 3525, in completion
bytebot-llm-proxy  |     raise exception_type(
bytebot-llm-proxy  |           ~~~~~~~~~~~~~~^
bytebot-llm-proxy  |         model=model,
bytebot-llm-proxy  |         ^^^^^^^^^^^^
bytebot-llm-proxy  |     ...<3 lines>...
bytebot-llm-proxy  |         extra_kwargs=kwargs,
bytebot-llm-proxy  |         ^^^^^^^^^^^^^^^^^^^^
bytebot-llm-proxy  |     )
bytebot-llm-proxy  |     ^
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2301, in exception_type
bytebot-llm-proxy  |     raise e
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2277, in exception_type
bytebot-llm-proxy  |     raise APIConnectionError(
bytebot-llm-proxy  |     ...<8 lines>...
bytebot-llm-proxy  |     )
bytebot-llm-proxy  | litellm.exceptions.APIConnectionError: litellm.APIConnectionError: list index out of range
bytebot-llm-proxy  | Traceback (most recent call last):
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/main.py", line 3162, in completion
bytebot-llm-proxy  |     response = base_llm_http_handler.completion(
bytebot-llm-proxy  |         model=model,
bytebot-llm-proxy  |     ...<13 lines>...
bytebot-llm-proxy  |         client=client,
bytebot-llm-proxy  |     )
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 329, in completion
bytebot-llm-proxy  |     data = provider_config.transform_request(
bytebot-llm-proxy  |         model=model,
bytebot-llm-proxy  |     ...<3 lines>...
bytebot-llm-proxy  |         headers=headers,
bytebot-llm-proxy  |     )
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/llms/ollama/completion/transformation.py", line 344, in transform_request
bytebot-llm-proxy  |     modified_prompt = ollama_pt(model=model, messages=messages)
bytebot-llm-proxy  |   File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/prompt_templates/factory.py", line 237, in ollama_pt
bytebot-llm-proxy  |     tool_calls = messages[msg_i].get("tool_calls")
bytebot-llm-proxy  |                  ~~~~~~~~^^^^^^^
bytebot-llm-proxy  | IndexError: list index out of range
bytebot-llm-proxy  | . Received Model Group=ollama/deepseek-r1:8b
bytebot-llm-proxy  | Available Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2
bytebot-llm-proxy  | INFO:     172.18.0.5:51514 - "POST /chat/completions HTTP/1.1" 500 Internal Server Error
bytebot-llm-proxy  | 19:31:36 - LiteLLM Proxy:ERROR: common_request_processing.py:642 - litellm.proxy.proxy_server._handle_llm_api_exception(): Exception occured - litellm.APIConnectionError: list index out of range
bytebot-llm-proxy  | [identical traceback to the one above; trimmed]
bytebot-llm-proxy  | . Received Model Group=ollama/deepseek-r1:8b
bytebot-llm-proxy  | Available Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2
bytebot-llm-proxy  | INFO:     172.18.0.5:51520 - "POST /chat/completions HTTP/1.1" 500 Internal Server Error
bytebot-llm-proxy  | 19:31:40 - LiteLLM Proxy:ERROR: common_request_processing.py:642 - litellm.proxy.proxy_server._handle_llm_api_exception(): Exception occured - litellm.APIConnectionError: list index out of range
bytebot-llm-proxy  | [same traceback as above, repeated; trimmed, the two container logs interleave here]
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:40 PM     LOG [TasksService] Updating task with ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:40 PM     LOG [TasksService] Retrieving task by ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:40 PM   ERROR [ProxyService] Error sending message to proxy: 500 litellm.APIConnectionError: list index out of range
bytebot-agent      | [LiteLLM traceback echoed back in the error body; trimmed]
bytebot-agent      | . Received Model Group=ollama/deepseek-r1:8b
bytebot-agent      | Available Model Group Fallbacks=None
bytebot-agent      | Error: 500 litellm.APIConnectionError: list index out of range
bytebot-agent      |     at APIError.generate (/app/bytebot-agent/node_modules/openai/core/error.js:66:20)
bytebot-agent      |     at OpenAI.makeStatusError (/app/bytebot-agent/node_modules/openai/client.js:158:32)
bytebot-agent      |     at OpenAI.makeRequest (/app/bytebot-agent/node_modules/openai/client.js:301:30)
bytebot-agent      |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
bytebot-agent      |     at async ProxyService.generateMessage (/app/bytebot-agent/dist/proxy/proxy.service.js:44:32)
bytebot-agent      |     at async AgentProcessor.runIteration (/app/bytebot-agent/dist/agent/agent.processor.js:138:29)
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:40 PM   ERROR [AgentProcessor] Error during task processing iteration for task ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2 - 500 litellm.APIConnectionError: list index out of range
bytebot-agent      | [same 500 error and stack trace repeated; trimmed]
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:40 PM   DEBUG [TasksService] Retrieved task with ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:40 PM     LOG [TasksService] Successfully updated task ID: 41d988d4-4bf4-4624-a3a7-7781bbea7ca2
bytebot-agent      | [Nest] 18  - 09/05/2025, 7:31:40 PM   DEBUG [TasksService] Updated task: {"id":"41d988d4-4bf4-4624-a3a7-7781bbea7ca2","description":"Hi","type":"IMMEDIATE","status":"FAILED","priority":"MEDIUM","control":"ASSISTANT","createdAt":"2025-09-05T19:31:26.966Z","createdBy":"USER","scheduledFor":null,"updatedAt":"2025-09-05T19:31:40.747Z","executedAt":"2025-09-05T19:31:30.005Z","completedAt":null,"queuedAt":null,"error":null,"result":null,"model":{"name":"ollama/deepseek-r1:8b","title":"llama-deepseek-r1","provider":"proxy","contextWindow":128000}}
bytebot-llm-proxy  | INFO:     172.18.0.5:51528 - "POST /chat/completions HTTP/1.1" 500 Internal Server Error
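
For what it's worth, the trace pins down where this dies: the first call (1 message) succeeds, but as soon as the conversation contains an assistant turn (2 messages), the proxy crashes inside LiteLLM's ollama_pt prompt template at factory.py line 237 (tool_calls = messages[msg_i].get("tool_calls")), i.e., while transforming the request, before anything is ever sent to Ollama. That is consistent with the template walking the message list by index and running past the end when the conversation ends on an assistant turn. An illustrative stand-in for that failure mode (a TypeScript sketch, not LiteLLM's actual code):

    // Illustrative stand-in for the ollama_pt failure mode; not LiteLLM's actual code.
    type Msg = { role: "user" | "assistant" | "tool"; content: string; tool_calls?: unknown[] };

    function buildPrompt(messages: Msg[]): string {
      let prompt = "";
      let i = 0;
      while (i < messages.length) {
        // Merge consecutive user/tool messages.
        while (i < messages.length && messages[i].role !== "assistant") {
          prompt += `### User:\n${messages[i].content}\n`;
          i++;
        }
        if (i >= messages.length) break; // a conversation ending on a user turn is fine
        // Merge consecutive assistant messages.
        while (i < messages.length && messages[i].role === "assistant") {
          prompt += `### Assistant:\n${messages[i].content}\n`;
          i++;
        }
        // Then peek at tool_calls WITHOUT re-checking bounds. When the conversation
        // ends on an assistant turn, i === messages.length here, so messages[i] is
        // undefined: the analogue of Python's "IndexError: list index out of range".
        const toolCalls = messages[i].tool_calls; // throws here, on purpose
        if (toolCalls) prompt += `### Tool calls:\n${JSON.stringify(toolCalls)}\n`;
      }
      return prompt;
    }

In this sketch, buildPrompt([{ role: "user", content: "Hi" }]) returns fine, while adding an assistant reply makes it throw, mirroring why the first round-trip returns 200 and every subsequent one 500s. The proper fix would seem to belong in LiteLLM (a bounds check before the tool_calls read) or in what the agent sends.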

radiantone · Sep 05 '25

I have exactly the same problem. Also, in the UI the LLM response doesn't appear as plain text; it shows the raw JSON.

[screenshot]

serk7 · Sep 07 '25

I also had to add drop_params: true because without it there was a different exception (which I will post here later).

radiantone · Sep 07 '25

This is the problematic part:

root@bytebot-1:/opt/bytebot/packages/bytebot-agent/src/proxy# egrep -A 2 -B 8 "effort" proxy.service.ts
);
try {
  // Prepare the Chat Completion request
  const completionRequest: OpenAI.Chat.ChatCompletionCreateParams = {
    model,
    messages: chatMessages,
    max_tokens: 8192,
    ...(useTools && { tools: proxyTools }),
    reasoning_effort: 'high',
  };
root@bytebot-1:/opt/bytebot/packages/bytebot-agent/src/proxy#

If you remove:

...(useTools && { tools: proxyTools }), reasoning_effort: 'high',

you don’t need drop_params: true
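
In other words, the request construction shrinks to something like this (a sketch of the edit described above, not an official patch; identifiers as in the quoted snippet):

    // proxy.service.ts after removing the two lines above (sketch).
    const completionRequest: OpenAI.Chat.ChatCompletionCreateParams = {
      model,
      messages: chatMessages,
      max_tokens: 8192,
      // tools spread and reasoning_effort removed: those are the OpenAI-only
      // params that drop_params: true was stripping for the Ollama route.
    };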

The Ollama connection still isn’t working (yet another error)...

plitc · Sep 08 '25

Without drop_params: true it throws a different exception, though. I will share that.

> This is the problematic part: [proxy.service.ts snippet quoted above]
>
> If you remove:
>
> ...(useTools && { tools: proxyTools }), reasoning_effort: 'high',
>
> you don’t need drop_params: true
>
> The Ollama connection still isn’t working (yet another error)...

radiantone · Sep 08 '25

> This is the problematic part: [proxy.service.ts snippet quoted above]
>
> If you remove:
>
> ...(useTools && { tools: proxyTools }), reasoning_effort: 'high',
>
> you don’t need drop_params: true
>
> The Ollama connection still isn’t working (yet another error)...

I modified it according to your answer, but I still get this error.

[screenshots]

nenelin · Sep 09 '25

@radiantone @serk7 @nenelin The Ollama setup isn’t working at the moment and I haven’t heard back from anyone on the ByteBot team yet.

plitc · Sep 09 '25

@plitc Thank you for the response. But it's from the documentation: https://docs.bytebot.ai/deployment/litellm. Maybe it should be edited?

[screenshot]

serk7 · Sep 09 '25

It doesn't work at all. It's not a documentation issue.

> @plitc Thank you for the response. But it's from the documentation: https://docs.bytebot.ai/deployment/litellm. Maybe it should be edited?
>
> [screenshot]

radiantone · Sep 09 '25

The problem is that Ollama doesn't support tools with the open-source models. You need to use vLLM or LM Studio; both have better OpenAI API support, including tools/function calling. I have local models working with LM Studio:

  - model_name: local-lmstudio-qwen2.5vl:32b
    litellm_params:
      model: openai/qwen2.5-vl-32b-instruct
      api_base: http://192.168.4.250:1234/v1
      api_key: lm-studio
      supports_function_calling: true
      drop_params: true
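
Before wiring it into the proxy, you can sanity-check that the endpoint actually negotiates tool calls; a minimal probe (a sketch: uses the api_base/api_key from the entry above, Node 18+ for global fetch, and an illustrative get_weather tool):

    // Minimal tool-call probe against LM Studio's OpenAI-compatible endpoint (a sketch).
    async function probeToolCalls(): Promise<void> {
      const res = await fetch("http://192.168.4.250:1234/v1/chat/completions", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          // LM Studio's local server does not validate the key by default (assumption).
          Authorization: "Bearer lm-studio",
        },
        body: JSON.stringify({
          model: "qwen2.5-vl-32b-instruct",
          messages: [{ role: "user", content: "What's the weather in Paris right now?" }],
          tools: [
            {
              type: "function",
              function: {
                name: "get_weather", // illustrative tool, not part of bytebot
                description: "Get the current weather for a city",
                parameters: {
                  type: "object",
                  properties: { city: { type: "string" } },
                  required: ["city"],
                },
              },
            },
          ],
        }),
      });
      const data = (await res.json()) as any;
      // A tool-capable model responds with message.tool_calls instead of plain text.
      console.log(JSON.stringify(data.choices?.[0]?.message, null, 2));
    }

    probeToolCalls().catch(console.error);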

zhound420 · Sep 11 '25

Good catch, some of the examples were generated. We'll update the documentation to specify which models actually work.

atupem · Sep 11 '25

> Good catch, some of the examples were generated. We'll update the documentation to specify which models actually work.

Thank you so much!

serk7 · Sep 12 '25