
🐛 Bug: Custom LLM Service Adapter Integration Fails with "serviceAdapter.process is not a function" Error

Open ChanMeng666 opened this issue 7 months ago • 3 comments

♻️ Reproduction Steps

  1. Create a custom service adapter for IBM watsonx AI following the pattern of OpenAIAdapter
  2. Implement the following methods in the custom adapter:
    class WatsonxServiceAdapter {
      async createChatCompletion(options: ChatCompletionOptions): Promise<ChatCompletionResponse>
      async chatCompletion(options: ChatCompletionOptions): Promise<any>
      async streamChatCompletion(options: ChatCompletionOptions): Promise<ReadableStream>
      async listModels(): Promise<any>
    }
    
  3. Configure the CopilotKit runtime in /api/copilotkit/route.ts:
    import { WatsonxServiceAdapter } from "@/lib/watsonx-copilot-adapter";
    
    const serviceAdapter = new WatsonxServiceAdapter({
      apiKey: process.env.WATSONX_API_KEY!,
      deploymentId: process.env.WATSONX_DEPLOYMENT_ID!,
      projectId: process.env.WATSONX_PROJECT_ID!,
      modelId: process.env.WATSONX_MODEL_ID!,
    });
    
    const runtime = new CopilotRuntime();
    const handler = runtime.streamHttpServerResponse(req, serviceAdapter);
    
  4. Start the application and try to interact with the Copilot chat interface
  5. Attempt to send a message through the Copilot chat
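As a pre-flight check before wiring the adapter into the app, I verified at startup that the adapter object exposes every method listed in step 2. A minimal, self-contained sketch of that check (the stub adapter below is a placeholder, not the real watsonx implementation — and, notably, this list omits `process`, which turned out to be the method the runtime actually calls):

```typescript
// Illustrative pre-flight check: confirm the adapter exposes the methods
// listed in step 2, so a missing method fails fast instead of surfacing
// mid-request. The stub adapter is a placeholder, not the real watsonx code.

const adapter: Record<string, unknown> = {
  createChatCompletion: async () => ({}),
  chatCompletion: async () => ({}),
  streamChatCompletion: async () => ({}),
  listModels: async () => [],
};

const required = [
  "createChatCompletion",
  "chatCompletion",
  "streamChatCompletion",
  "listModels",
];

// Collect any method names that are absent or not callable.
const missing = required.filter((m) => typeof adapter[m] !== "function");
console.log(
  missing.length === 0 ? "adapter shape ok" : `missing: ${missing.join(", ")}`
); // prints "adapter shape ok"
```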

✅ Expected Behavior

When a custom service adapter implements the core LLM methods (createChatCompletion, chatCompletion, streamChatCompletion, listModels), CopilotKit should integrate with the custom LLM provider (IBM watsonx AI) and process chat interactions normally, just as it does with the built-in OpenAIAdapter.

❌ Actual Behavior

The application crashes at runtime with the following error:

TypeError: serviceAdapter.process is not a function
    at CopilotRuntime.streamHttpServerResponse

The chat interface fails to initialize, and users cannot interact with the Copilot. The error indicates that CopilotKit expects a `process` method on the service adapter, but this requirement is not stated anywhere in the official documentation, nor is it evident from examining the OpenAIAdapter implementation.
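The failure mode can be reproduced independently of CopilotKit with a stub that duck-types the adapter the same way the runtime appears to. Everything below is a simplified stand-in for illustration, not the real @copilotkit/runtime internals:

```typescript
// Simplified stand-in for the runtime's behavior: it calls
// `serviceAdapter.process(...)` rather than any of the methods I implemented.
// None of these types are the real @copilotkit types.

interface ChatCompletionOptionsSketch {
  messages: { role: string; content: string }[];
}

// Adapter shaped the way the OpenAIAdapter source *appeared* to work,
// i.e. without a `process` method.
class WatsonxServiceAdapterSketch {
  async chatCompletion(options: ChatCompletionOptionsSketch): Promise<string> {
    return "stubbed watsonx response";
  }
  // createChatCompletion, streamChatCompletion, listModels omitted for brevity
}

// Stand-in for CopilotRuntime.streamHttpServerResponse's internal call.
function streamHttpServerResponseStub(serviceAdapter: any): void {
  serviceAdapter.process({ messages: [] }); // throws if `process` is missing
}

let caught = "";
try {
  streamHttpServerResponseStub(new WatsonxServiceAdapterSketch());
} catch (e) {
  caught = (e as Error).message;
}
console.log(caught); // prints "serviceAdapter.process is not a function"
```

This matches the runtime error exactly: the adapter is never inspected for the methods I implemented; only `process` is invoked.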

📦 CopilotKit Version

npm ls | grep "@copilotkit"
├── @copilotkit/[email protected]
├── @copilotkit/[email protected]  
├── @copilotkit/[email protected]
└── @copilotkit/[email protected]

📄 Logs (Optional)

**Terminal Error (Copilot Runtime):**

TypeError: serviceAdapter.process is not a function
    at CopilotRuntime.streamHttpServerResponse (/node_modules/@copilotkit/runtime/dist/index.js:234:42)
    at handler (/app/api/copilotkit/route.ts:15:37)
    at process.processTicksAndRejections (node:internal/process/task_queues.js:95:5)


**Browser Console Errors:**

Failed to fetch chat response from /api/copilotkit
NetworkError: Internal Server Error (500)


**Additional Context:**

This issue highlights a critical gap in CopilotKit's documentation for custom LLM provider integration. The service adapter interface requirements are not clearly documented, making it extremely difficult for developers to integrate enterprise LLM providers like IBM watsonx AI, Azure OpenAI Service, or AWS Bedrock.

**Impact:** This prevents enterprise adoption where organizations need to use specific LLM providers for compliance, security, or cost reasons.

**Workaround Attempted:** Tried examining the OpenAIAdapter source code and implementing similar methods, but the missing `process` method requirement was not apparent from the available documentation or examples.
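One direction that might bridge the gap is a facade that exposes the `process` method the runtime expects and delegates to the methods already implemented. The request/response shapes below are guesses for illustration, not the documented @copilotkit interface:

```typescript
// Hypothetical facade: expose a `process` method that delegates to the
// custom adapter's existing chatCompletion. All field names here are
// assumptions, not the real @copilotkit/runtime contract.

interface SketchRequest {
  messages: { role: string; content: string }[];
}

interface SketchResponse {
  text: string;
}

class WatsonxCoreSketch {
  async chatCompletion(options: SketchRequest): Promise<string> {
    return "watsonx says hi"; // a real call to watsonx would go here
  }
}

class WatsonxProcessAdapterSketch {
  constructor(private core: WatsonxCoreSketch) {}

  // The method the runtime seems to invoke; its shape is an assumption.
  async process(request: SketchRequest): Promise<SketchResponse> {
    const text = await this.core.chatCompletion(request);
    return { text };
  }
}

async function demo(): Promise<string> {
  const adapter = new WatsonxProcessAdapterSketch(new WatsonxCoreSketch());
  const res = await adapter.process({ messages: [] });
  return res.text;
}

demo().then((t) => console.log(t)); // prints "watsonx says hi"
```

Whether the real `process` contract takes this shape (and whether it must stream) is exactly what the documentation should clarify.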

ChanMeng666 avatar Jun 08 '25 13:06 ChanMeng666

@ChanMeng666 Our team will look into this issue. For urgent production concerns, please book a quick consultation here: https://cal.com/nathan-tarbert-copilotkit/15min

copilotkit-support avatar Jun 09 '25 05:06 copilotkit-support

I had the same issue when I forgot to specify the agent name parameter on my `<CopilotKit>` React component.

slumbi avatar Jun 11 '25 19:06 slumbi

Hi, I wonder if you succeeded. I am also defining my own watsonx-adaptor.tx but cannot get it working. Thanks!

dannymcy avatar Jun 25 '25 20:06 dannymcy