
Add support for OpenAI's Responses API

Open · sjrl opened this issue 7 months ago

We should look into supporting the new Responses API from OpenAI. Here is an article from OpenAI explaining how it differs from Chat Completions: https://platform.openai.com/docs/guides/responses-vs-chat-completions

It sounds like this will become their preferred API for interacting with their models, and it comes with new features such as the ability to store chat messages server-side and access to some of their built-in tools, like web search.
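
For reference, a minimal comparison of the two calling styles with the OpenAI Python SDK could look like the sketch below. The model name and the exact built-in tool identifier are assumptions here and may differ from what we end up supporting:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Existing style: Chat Completions (what OpenAIChatGenerator uses today)
chat = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Summarize the Responses API in one sentence."}],
)
print(chat.choices[0].message.content)

# New style: Responses API, with server-side storage and a built-in tool
resp = client.responses.create(
    model="gpt-4o",
    input="Summarize the Responses API in one sentence.",
    store=True,  # keep the response server-side so it can be referenced later
    tools=[{"type": "web_search_preview"}],  # one of OpenAI's built-in tools
)
print(resp.output_text)
```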

One approach could be to create a separate Chat Generator (e.g. OpenAIResponsesGenerator?) that would run on this new API; see the sketch below. We don't want to overwrite or change the existing component, since it looks like the Chat Completions endpoint is also here to stay.
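
For illustration, a bare-bones sketch of such a component, wrapping `client.responses.create()` in a Haystack `@component`. The class name is the one proposed above; everything else (the `run()` signature, parameter handling, `ChatMessage` support, streaming, secrets management) is a placeholder and would need to mirror `OpenAIChatGenerator`:

```python
from typing import List

from haystack import component
from openai import OpenAI


@component
class OpenAIResponsesGenerator:
    """Minimal generator backed by OpenAI's Responses API (sketch only)."""

    def __init__(self, model: str = "gpt-4o-mini"):
        self.client = OpenAI()  # reads OPENAI_API_KEY from the environment
        self.model = model

    @component.output_types(replies=List[str])
    def run(self, prompt: str):
        response = self.client.responses.create(model=self.model, input=prompt)
        # output_text concatenates the text output items of the response
        return {"replies": [response.output_text]}
```

In practice the component would likely accept `List[ChatMessage]` like the existing generator, but the shape above shows how little glue the Responses endpoint itself needs.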

sjrl avatar May 22 '25 06:05 sjrl

I'd be happy to help integrate the Responses API. I’ve started reviewing the existing OpenAIChatGenerator implementation and would propose adding a separate OpenAIResponsesGenerator. Let me know if that sounds aligned with your plans.

therealladiesman217 avatar May 25 '25 07:05 therealladiesman217

We also received a request to support the Responses API in AzureOpenAIChatGenerator.

julian-risch avatar Sep 03 '25 08:09 julian-risch

I currently have the same need, using self-hosted gpt-oss over vLLM with multiple MCP servers. Merging the PR would be great! I am ready to help.

Hansehart avatar Oct 14 '25 09:10 Hansehart