feat(ai,provider,anthropic,openai,azure): Allow clients to download CodeInterpreter output files
Background
The Code Interpreter can create files through Python execution.
OpenAI provides an API that allows downloading files from the Code Interpreter container.
To use this API, both the container_id and file_id are required.
Therefore, the AI SDK needs to support returning the container-file-citation annotation to provide these IDs to the client.
Reference:
This PR enables handling of the container_file_citation annotation so that the container_id and file_id can be retrieved and returned to the client.
Summary
This PR enables returning the parameters required by the OpenAI Container File API for files generated by the Code Interpreter.
Developers can continue using useChat and streamText as before, but the output may now include the container-file-citation annotation and a new source-execution-file entry.
This makes it possible to replace text references with actual download URLs, or to provide link-based UIs powered by source-execution-file.
Users can simply click the link to retrieve the generated file, allowing the AI SDK to seamlessly handle various file formats (e.g., Excel).
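For orientation, here is a minimal sketch of the server side of this flow, based on the example files referenced under Manual Verification. The model id, the `code_interpreter` tool key, and the `openai.tools.codeInterpreter` helper are assumptions about the current provider API rather than part of this PR.

```ts
// Minimal sketch (not the PR's exact code): a chat route that runs the
// Code Interpreter and streams source parts back to useChat.
import { openai } from '@ai-sdk/openai';
import { convertToModelMessages, streamText, type UIMessage } from 'ai';

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  const result = streamText({
    // Model id and tool helper are illustrative assumptions.
    model: openai.responses('gpt-4.1-mini'),
    messages: convertToModelMessages(messages),
    tools: {
      code_interpreter: openai.tools.codeInterpreter({}),
    },
  });

  // sendSources forwards source parts (including the new
  // source-execution-file entries) to the client.
  return result.toUIMessageStreamResponse({ sendSources: true });
}
```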
DEMO - GIF animation
- examples/next-openai/src/app/test-openai-code-interpreter-download-files
- http://localhost:3000/test-openai-code-interpreter-download-files
Receiving container_file_citation in TextUIPart
The client can receive the OpenAI annotation container_file_citation using the TextUIPart's providerMetadata.
This PR retrieves the container_file_citation annotation from providerMetadata and replaces the corresponding text message according to a defined rule.
By further rendering it through markdown (e.g., Streamdown), the output enables users to directly click and download the files generated by the Code Interpreter.
This approach improves usability — instead of copying raw sandbox paths or API URLs, users can now access the generated files with a single click, resulting in a much better UX.
```js
{
  type: 'text',
  text:
    'I have created an Excel file containing the names of 10 historical figures. You can download it here:\n' +
    '\n' +
    '[Download the Excel file](sandbox:/mnt/data/historical_figures.xlsx)',
  providerMetadata: {
    openai: {
      itemId: 'msg_09fa2bb0f5d25ded0068e7ce98cdfc8195853725f2bf424486',
      annotations: [
        {
          type: 'container_file_citation',
          container_id: 'cntr_68e7ce8aef948190a3447822efbd111c0745ba41b6f16e8a',
          file_id: 'cfile_68e7ce9ba01c8191a81ff3ce590dfc44',
          filename: 'historical_figures.xlsx',
          start_index: 129,
          end_index: 170
        }
      ]
    }
  }
},
```
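As an illustration of the replacement rule, here is a minimal client-side sketch. It assumes that `start_index`/`end_index` delimit the `sandbox:` URL inside the markdown link (as in the example above) and that the app exposes the `[container]/[file]` download route from the example app; both are assumptions rather than guarantees of this PR.

```ts
// Sketch: rewrite the cited sandbox: URL in a text part into a URL served
// by the app's download route (path mirrors the example app's
// api/chat-openai-code-interpreter-download-files/[container]/[file] route).
type ContainerFileCitation = {
  type: 'container_file_citation';
  container_id: string;
  file_id: string;
  filename: string;
  start_index: number;
  end_index: number;
};

function replaceCitationsWithDownloadLinks(
  text: string,
  annotations: ContainerFileCitation[],
): string {
  // Apply replacements from the end of the string so that the indices of
  // earlier citations remain valid.
  return [...annotations]
    .sort((a, b) => b.start_index - a.start_index)
    .reduce((acc, a) => {
      const url = `/api/chat-openai-code-interpreter-download-files/${a.container_id}/${a.file_id}`;
      return acc.slice(0, a.start_index) + url + acc.slice(a.end_index);
    }, text);
}
```

Rendering the rewritten text through markdown (e.g., Streamdown) then produces the clickable download link described above.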
Visualization Example (Comparison Table)
Below is a comparison of plain text vs. markdown rendering (Streamdown).
When the container_file_citation annotation is applied, users can click to download the file directly — improving both functionality and UX.
> [!CAUTION]
> This PR adds the `container_file_citation` annotation to providerMetadata. Similarly, `file_citation` and `url_citation` annotations are attached in the same way. These annotations are currently included even when `sendSources` is set to `false`. If reviewers believe this behavior should be adjusted before merging, I'll be happy to update the implementation accordingly.
source-execution-file: Independent Download UI
A third type, source-execution-file, can be created in SourceUIPart to build a download UI that is independent of the text. ProviderMetadata for this purpose has also been defined.
When using streamText, a new source part with type: "execution-file" is generated.
This part enables creating a clickable download UI that is independent from the text message itself.
For example, the Excel file generated by the Code Interpreter can be shown as a blue download button, allowing users to easily obtain the file without interacting with the text content.
Example Output
```js
{
  type: 'source',
  sourceType: 'execution-file',
  id: 'pgDscAT78wrTK3dG',
  providerMetadata: {
    openai: {
      containerId: 'cntr_68e7e2fb0e408190b9a122b2b2d6ea2d0e937059125f30cc',
      fileId: 'cfile_68e7e312edbc8191a08252ebc0791a06',
      filename: 'historical_figures.xlsx'
    }
  }
}
```
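For reference, a minimal React sketch of such a button; the component name is hypothetical and the route path mirrors the example app's `[container]/[file]` handler.

```tsx
// Sketch: a standalone download button rendered from a
// { type: 'source', sourceType: 'execution-file' } part.
// Field names follow the example output above.
type ExecutionFileMetadata = {
  containerId: string;
  fileId: string;
  filename: string;
};

export function ExecutionFileDownloadButton({
  containerId,
  fileId,
  filename,
}: ExecutionFileMetadata) {
  const href = `/api/chat-openai-code-interpreter-download-files/${containerId}/${fileId}`;
  return (
    <a
      href={href}
      download={filename}
      className="inline-block rounded bg-blue-600 px-3 py-2 text-white"
    >
      Download {filename}
    </a>
  );
}
```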
Visualization Example
For example, a button like this:
The blue button appears at the end of the text message.
Parsing providerMetadata with Zod
A Zod schema has been introduced to safely parse the providerMetadata of each UI part and extract the required information.
This ensures type safety and prevents runtime errors when handling provider-specific metadata.
For example, the TextUIPart uses the container_file_citation annotation metadata, while the SourceExecutionFileUIPart extracts file-related parameters such as containerId, fileId, and filename.
https://github.com/vercel/ai/blob/1c7953a936fff88169ac74fc02820336e89538c8/packages/openai/src/responses/openai-responses-api.ts#L706-L719
An example Zod parse for the container-file-citation download button:
https://github.com/vercel/ai/blob/1c7953a936fff88169ac74fc02820336e89538c8/examples/next-openai/app/test-openai-code-interpreter-download-files/container-file-citation-download-button.tsx#L11-L27
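For readers who do not follow the links, here is a simplified sketch of this kind of parse (not the exact schema from the PR):

```ts
// Sketch: extracting container_file_citation annotations from a text
// part's providerMetadata with Zod. Simplified relative to the PR's schemas.
import { z } from 'zod';

const containerFileCitationSchema = z.object({
  type: z.literal('container_file_citation'),
  container_id: z.string(),
  file_id: z.string(),
  filename: z.string(),
  start_index: z.number(),
  end_index: z.number(),
});

const textPartMetadataSchema = z.object({
  openai: z.object({
    annotations: z.array(z.unknown()).default([]),
  }),
});

export function getContainerFileCitations(providerMetadata: unknown) {
  const parsed = textPartMetadataSchema.safeParse(providerMetadata);
  if (!parsed.success) return [];
  // Keep only container_file_citation entries; file_citation and
  // url_citation annotations are ignored here.
  return parsed.data.openai.annotations.flatMap(annotation => {
    const citation = containerFileCitationSchema.safeParse(annotation);
    return citation.success ? [citation.data] : [];
  });
}
```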
Available in both @ai-sdk/openai and @ai-sdk/azure.
Similar to the OpenAI implementation, a Zod schema is also provided for Azure OpenAI.
z.union: Unified Zod Schema for Multiple Providers
This approach helps handle provider-specific metadata safely and consistently, enabling appropriate logic for each provider.
https://github.com/vercel/ai/blob/ee8d88b25f71cdd724b549c94e5efb181ebbc6eb/examples/next-openai/app/test-azure-code-interpreter-download-files/message-text-with-download-link.tsx#L10-L26
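The idea, as a minimal sketch: whether Azure metadata really nests under an `azure` key, and its exact field names, are assumptions here; the linked component contains the actual schemas.

```ts
// Sketch of the z.union approach: accept execution-file metadata from
// either provider key and normalize it. The 'azure' shape is an assumption.
import { z } from 'zod';

const executionFileFields = z.object({
  containerId: z.string(),
  fileId: z.string(),
  filename: z.string(),
});

const executionFileMetadataSchema = z.union([
  z
    .object({ openai: executionFileFields })
    .transform(m => ({ provider: 'openai' as const, ...m.openai })),
  z
    .object({ azure: executionFileFields })
    .transform(m => ({ provider: 'azure' as const, ...m.azure })),
]);

// Returns { provider, containerId, fileId, filename } or undefined.
export function parseExecutionFileMetadata(providerMetadata: unknown) {
  const result = executionFileMetadataSchema.safeParse(providerMetadata);
  return result.success ? result.data : undefined;
}
```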
Example: File Download Route for Next.js
Since the actual file download is handled by fetching directly from OpenAI's official API, an example route.ts file has been added for Next.js to demonstrate how the download endpoint can be implemented.
You can see the example implementation here:
https://github.com/vercel/ai/blob/1c7953a936fff88169ac74fc02820336e89538c8/examples/next-openai/app/api/chat-openai-code-interpreter-download-files/%5Bcontainer%5D/%5Bfile%5D/route.ts#L32-L38
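In outline, such a route proxies OpenAI's container file content endpoint (`GET /v1/containers/{container_id}/files/{file_id}/content`). The sketch below assumes the Next.js App Router with async `params` and keeps error handling minimal; see the linked file for the actual implementation.

```ts
// Sketch of app/api/chat-openai-code-interpreter-download-files/[container]/[file]/route.ts
// It streams the Code Interpreter output file from OpenAI back to the browser.
export async function GET(
  _req: Request,
  { params }: { params: Promise<{ container: string; file: string }> },
) {
  const { container, file } = await params;

  const upstream = await fetch(
    `https://api.openai.com/v1/containers/${container}/files/${file}/content`,
    { headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` } },
  );

  if (!upstream.ok) {
    return new Response('Failed to fetch file from OpenAI', {
      status: upstream.status,
    });
  }

  // Pass the body through with a download-friendly disposition.
  return new Response(upstream.body, {
    headers: {
      'Content-Type':
        upstream.headers.get('content-type') ?? 'application/octet-stream',
      'Content-Disposition': 'attachment',
    },
  });
}
```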
Manual Verification
ai-core
- [x] src\generate-text\openai-responses-code-interpreter-download-files.ts
- [x] src\stream-text\openai-responses-code-interpreter-download-files.ts
- [x] src\generate-text\azure-responses-code-interpreter-download-files.ts
- [x] src\stream-text\azure-responses-code-interpreter-download-files.ts
example/next-openai
- [x] OpenAI http://localhost:3000/test-openai-code-interpreter-download-files
- [x] Azure OpenAI http://localhost:3000/test-azure-code-interpreter-download-files
- When testing, run `pnpm add clsx tailwind-merge streamdown` and fix the comments in the `additional-dependencies.tsx` file.
- Input example:
  Create an Excel file with the names of 10 historical figures. Run it immediately. No questions allowed.
Checklist
- [ ] Tests have been added / updated (for bug fixes / features)
- [ ] Documentation has been added / updated (for bug fixes / features)
- [ ] A patch changeset for relevant packages has been added (for bug fixes / features - run `pnpm changeset` in the project root)
- [ ] Formatting issues have been fixed (run `pnpm prettier-fix` in the project root)
- [ ] I have reviewed this pull request (self-review)
> [!NOTE]
> This PR depends on `SharedV3ProviderMetadata`.
Future Work
> [!CAUTION]
> (Repost) This PR adds the `container_file_citation` annotation to providerMetadata. Similarly, `file_citation` and `url_citation` annotations are attached in the same way. These annotations are currently included even when `sendSources` is set to `false`. While this behavior may change before the PR is merged, it is noted as Future Work for now.
Related Issues
@tsuzaki430 Hello, I tried to test it through the page and it works well with no errors; I can test the specific test files tomorrow.
Though I did catch a typo in `openaiResponseAnnotationSchema`.
Both stream-text and generate-text tests work correctly.
@jephal Thank you for trying this PR, leaving two messages, and letting me know about the spelling mistake! I made some fixes, and now you can try the Azure OpenAI cases.
- ai-core
  - openai
    - generate-text/openai-responses-code-interpreter-download-files.ts
    - stream-text/openai-responses-code-interpreter-download-files.ts
  - azure (New)
    - generate-text/azure-responses-code-interpreter-download-files.ts
    - stream-text/azure-responses-code-interpreter-download-files.ts
- next-openai
  - openai http://localhost:3000/test-openai-code-interpreter-download-files
  - azure (New) http://localhost:3000/test-azure-code-interpreter-download-files
Hi @tsuzaki430, sorry, I've been busy lately, but I managed to check out your new changes now!
I see that you are in the process of refactoring the components and routes to support the Anthropic provider (I don't have access to Anthropic, so I can't test this at the moment), so I don't know if you have already solved these issues locally... but:
For OpenAI/Azure, the direct tests themselves worked well, but both the in-text button and the separate button component pointed at the wrong routes.
The standalone button pointed at the execution-files route, not the code-execution-files route, for both provider paths,
whereas the inline openai-message-text-with-annotations still pointed at the old routes.
I also saw that you combined the separate buttons for downloading files from Azure and OpenAI, but the old files still exist under the page folder.
When I updated these, it worked great!
@jephal Thank you for your kind message and for pointing out the wrong download link paths! I just fixed the URL paths 😊
Hey @tsuzaki430, happy to look into the PR. Are you still working on it? Could you please look into the conflicts? No rush
@gr2m
Thank you for your message on this pull request. Most of the implementation attempted here has already been introduced, and I believe the remaining issue is the processing for Azure. I will resolve the conflicts later and plan to rework the content to focus on Azure.
@tsuzaki430 this PR came up during our Azure community call yesterday: https://github.com/vercel/ai/issues/9863#issuecomment-3533809136. Let us know if you need any help to move it forward
@gr2m
Thank you for your message. I'm going to redefine this task list.
| id | status | task |
|---|---|---|
| #9431 | closed | Azure tests CI/CD |
| #10161 | closed | example tests with Next.js |
| #10252 | soon (PR) | identifier issue |
| #10252 | not yet | future work: fix the documentation |
| | soon | I want annotations in doStream, the same as in doGenerate. |
| | soon | For annotations in TextUIPart, should we export the Zod schema? |
| | not yet | The source-document has various roles, and we need to clarify each endpoint. |
| | not yet | Fix the next-openai example after those PRs are closed. I think the best practice will be moved. |
I created issue #10255 for Code Interpreter output files.
Hey @tsuzaki430, I know you and @gr2m have been working on this, but I just want to reiterate that we already support accessing Code Interpreter files.
This PR can then be closed, and any additional changes can be added via subsequent PRs. Doing it in this one will become too messy.
@aayush-kapoor
I'm sorry, and thank you for your message. I just closed this PR.
No need to apologise, it's all good! Good work here though; we might need to extract some parts of it for future work on making it standardized.