
Connecting to MCP tools results in an internal server error

Open rchaganti opened this issue 7 months ago • 3 comments

System Info

GPU Type: NVIDIA A100
OS: Ubuntu 24.04
CUDA: 12.8

Information

  • [ ] The official example scripts
  • [x] My own modified scripts

🐛 Describe the bug

Here is the example MCP integration script I am trying to run:

import pprint

from llama_stack_client import LlamaStackClient
from llama_stack_client.types.shared_params.url import URL

HOST='localhost'
PORT=8321

client = LlamaStackClient(
    base_url=f"http://{HOST}:{PORT}",
)
client.toolgroups.register(
    toolgroup_id="mcp::filesystem",
    provider_id="model-context-protocol",
    mcp_endpoint=URL(uri="http://localhost:8888/sse"),
)
pprint.pprint(client.tools.list(toolgroup_id="mcp::filesystem"))

The MCP filesystem server is exposed over SSE via supergateway:

$ npx -y supergateway --port 8888 --stdio 'npx -y @modelcontextprotocol/server-filesystem content/'
[supergateway] Starting...
[supergateway] Supergateway is supported by Superinterface - https://superinterface.ai
[supergateway]   - port: 8888
[supergateway]   - stdio: npx -y @modelcontextprotocol/server-filesystem content/
[supergateway]   - ssePath: /sse
[supergateway]   - messagePath: /message
[supergateway]   - CORS enabled: false
[supergateway]   - Health endpoints: (none)
[supergateway] Listening on port 8888
[supergateway] SSE endpoint: http://localhost:8888/sse
[supergateway] POST messages: http://localhost:8888/message
[supergateway] Child stderr: npm WARN
[supergateway] Child stderr:  EBADENGINE Unsupported engine {
npm WARN EBADENGINE   package: '[email protected]',
npm WARN EBADENGINE   required: { node: '20 || >=22' },
npm WARN EBADENGINE   current: { node: 'v18.19.1', npm: '9.2.0' }
npm WARN EBADENGINE }

[supergateway] Child stderr: Secure MCP Filesystem Server running on stdio

[supergateway] Child stderr: Allowed directories: [ '/home/rchaganti/llama-stack/content' ]

When I run the above program, the toolgroup registration fails with a 500 Internal Server Error.

Error logs

10:23:57.253 [START] /v1/toolgroups
ERROR    2025-03-21 10:23:57,252 __main__:195 server: Error executing endpoint route='/v1/toolgroups' method='post'     
         ╭───────────────────────────────────── Traceback (most recent call last) ─────────────────────────────────────╮
         │ /usr/local/lib/python3.10/site-packages/llama_stack/distribution/server/server.py:193 in endpoint           │
         │                                                                                                             │
         │   190 │   │   │   │   │   return StreamingResponse(gen, media_type="text/event-stream")                     │
         │   191 │   │   │   │   else:                                                                                 │
         │   192 │   │   │   │   │   value = func(**kwargs)                                                            │
         │ ❱ 193 │   │   │   │   │   return await maybe_await(value)                                                   │
         │   194 │   │   │   except Exception as e:                                                                    │
         │   195 │   │   │   │   logger.exception(f"Error executing endpoint {route=} {method=}")                      │
         │   196 │   │   │   │   raise translate_exception(e) from e                                                   │
         │                                                                                                             │
         │ /usr/local/lib/python3.10/site-packages/llama_stack/distribution/server/server.py:156 in maybe_await        │
         │                                                                                                             │
         │   153                                                                                                       │
         │   154 async def maybe_await(value):                                                                         │
         │   155 │   if inspect.iscoroutine(value):                                                                    │
         │ ❱ 156 │   │   return await value                                                                            │
         │   157 │   return value                                                                                      │
         │   158                                                                                                       │
         │   159                                                                                                       │
         │                                                                                                             │
         │ /usr/local/lib/python3.10/site-packages/llama_stack/providers/utils/telemetry/trace_protocol.py:102 in      │
         │ async_wrapper                                                                                               │
         │                                                                                                             │
         │    99 │   │   │                                                                                             │
         │   100 │   │   │   with tracing.span(f"{class_name}.{method_name}", span_attributes) as span:                │
         │   101 │   │   │   │   try:                                                                                  │
         │ ❱ 102 │   │   │   │   │   result = await method(self, *args, **kwargs)                                      │
         │   103 │   │   │   │   │   span.set_attribute("output", serialize_value(result))                             │
         │   104 │   │   │   │   │   return result                                                                     │
         │   105 │   │   │   │   except Exception as e:                                                                │
         │                                                                                                             │
         │ /usr/local/lib/python3.10/site-packages/llama_stack/distribution/routers/routing_tables.py:490 in           │
         │ register_tool_group                                                                                         │
         │                                                                                                             │
         │   487 │   │   args: Optional[Dict[str, Any]] = None,                                                        │
         │   488 │   ) -> None:                                                                                        │
         │   489 │   │   tools = []                                                                                    │
         │ ❱ 490 │   │   tool_defs = await                                                                             │
         │       self.impls_by_provider_id[provider_id].list_runtime_tools(toolgroup_id, mcp_endpoint)                 │
         │   491 │   │   tool_host = ToolHost.model_context_protocol if mcp_endpoint else                              │
         │       ToolHost.distribution                                                                                 │
         │   492 │   │                                                                                                 │
         │   493 │   │   for tool_def in tool_defs:                                                                    │
         ╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
         KeyError: 'model-context-protocol'                                                                             
INFO:     172.17.0.1:40886 - "POST /v1/toolgroups HTTP/1.1" 500 Internal Server Error
10:23:57.318 [END] /v1/toolgroups [StatusCode.OK] (64.95ms)
 10:23:57.317 [ERROR] Error executing endpoint route='/v1/toolgroups' method='post'
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/llama_stack/distribution/server/server.py", line 193, in endpoint
    return await maybe_await(value)
  File "/usr/local/lib/python3.10/site-packages/llama_stack/distribution/server/server.py", line 156, in maybe_await
    return await value
  File "/usr/local/lib/python3.10/site-packages/llama_stack/providers/utils/telemetry/trace_protocol.py", line 102, in async_wrapper
    result = await method(self, *args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/llama_stack/distribution/routers/routing_tables.py", line 490, in register_tool_group
    tool_defs = await self.impls_by_provider_id[provider_id].list_runtime_tools(toolgroup_id, mcp_endpoint)
KeyError: 'model-context-protocol'
[The same KeyError: 'model-context-protocol' traceback repeats for the subsequent POST /v1/toolgroups attempts at 10:59:21.644, 10:59:22.126, and 10:59:22.988, each ending in a 500 Internal Server Error; the duplicate traces are omitted here.]
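The KeyError: 'model-context-protocol' is raised by the impls_by_provider_id lookup in register_tool_group, which suggests the model-context-protocol tool_runtime provider was never loaded by the running distribution, so the request fails before the MCP endpoint is ever contacted. If that is the cause, enabling the provider in the distribution's run config should resolve it. A hypothetical run.yaml fragment (the exact keys and provider_type string should be verified against your distribution template):

```yaml
# Hypothetical fragment -- verify against your distribution's run.yaml.
# The provider_id must match the one passed to client.toolgroups.register().
providers:
  tool_runtime:
    - provider_id: model-context-protocol
      provider_type: remote::model-context-protocol
      config: {}
```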

Expected behavior

A list of tools exposed by the MCP server should be printed.
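Until the provider is actually loaded, every register call will keep returning a 500. One way to fail fast on the client side is to check the provider list before registering. The helper below is a sketch; the "provider_id" key and the idea of feeding it from client.providers.list() are assumptions, so adjust to whatever shape your client version actually returns:

```python
def mcp_provider_available(providers, provider_id="model-context-protocol"):
    """Return True if a provider entry with the given id is present.

    `providers` is a list of dicts, as might be derived from
    client.providers.list(); the "provider_id" key is an assumption.
    """
    return any(p.get("provider_id") == provider_id for p in providers)


# Pure-data example; with a live server you would pass the real provider list:
providers = [
    {"provider_id": "tavily-search", "api": "tool_runtime"},
    {"provider_id": "model-context-protocol", "api": "tool_runtime"},
]
print(mcp_provider_available(providers))                           # True
print(mcp_provider_available([{"provider_id": "tavily-search"}]))  # False
```

With a guard like this, the script can raise a clear "provider not enabled in this distribution" error instead of surfacing as an opaque 500 from /v1/toolgroups.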

rchaganti avatar Mar 21 '25 11:03 rchaganti