Add support for OpenAI's Responses API
We should look into supporting the new Responses API from OpenAI. Here is an article from OpenAI explaining the differences: https://platform.openai.com/docs/guides/responses-vs-chat-completions
It sounds like this will become their preferred API for interacting with their models, and it comes with new features such as the ability to store chat messages server-side and access to some of their built-in tools like web search.
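For reference, here is a minimal sketch of how the two calls differ with the official `openai` Python SDK. This is only illustrative; the `store` flag and the `web_search_preview` tool type are taken from OpenAI's Responses docs at the time of writing and may change:

```python
from openai import OpenAI

client = OpenAI()

# Existing Chat Completions call (what OpenAIChatGenerator uses today)
chat = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What is Haystack?"}],
)
print(chat.choices[0].message.content)

# Roughly equivalent Responses API call; `store=True` keeps the response
# server-side, and built-in tools such as web search can be enabled per request.
resp = client.responses.create(
    model="gpt-4o-mini",
    input="What is Haystack?",
    store=True,
    tools=[{"type": "web_search_preview"}],  # built-in tool name per current docs
)
print(resp.output_text)
```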
One approach could be to create a separate Chat Generator (e.g. OpenAIResponsesGenerator?) that runs on this new API. We don't want to overwrite or change the existing component, since it looks like the Chat Completions endpoint is here to stay as well. A rough sketch of what such a component could look like is below.
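This is only a sketch to illustrate the idea; the component name, the `ChatMessage`-to-`input` mapping, and the constructor parameters are assumptions, and serialization, streaming, and tool handling are left out:

```python
from typing import Any, Dict, List, Optional

from haystack import component
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret
from openai import OpenAI


@component
class OpenAIResponsesGenerator:
    """Hypothetical component that calls the Responses API instead of Chat Completions."""

    def __init__(
        self,
        api_key: Secret = Secret.from_env_var("OPENAI_API_KEY"),
        model: str = "gpt-4o-mini",
        generation_kwargs: Optional[Dict[str, Any]] = None,
    ):
        self.model = model
        self.generation_kwargs = generation_kwargs or {}
        self.client = OpenAI(api_key=api_key.resolve_value())

    @component.output_types(replies=List[ChatMessage])
    def run(self, messages: List[ChatMessage]):
        # The Responses API accepts either a plain string or a list of
        # role/content items as `input`; map Haystack ChatMessages onto the latter.
        response = self.client.responses.create(
            model=self.model,
            input=[{"role": m.role.value, "content": m.text} for m in messages],
            **self.generation_kwargs,
        )
        return {"replies": [ChatMessage.from_assistant(response.output_text)]}
```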
I'd be happy to help integrate the Responses API. I’ve started reviewing the existing OpenAIChatGenerator implementation and would propose adding a separate OpenAIResponsesGenerator. Let me know if that sounds aligned with your plans.
We've also received a request to support the Responses API in AzureOpenAIChatGenerator.
I currently have the same issue using self-hosted gpt-oss over vLLM with multiple MCP servers. Merging the PR would be great! I'm ready to help.