
[Bug]: Streaming from edge function works, but chunks arrive together at once at the end

Open webpro opened this issue 2 years ago • 0 comments

Summary

  • An edge function in pages/api/*.ts technically works, but does not really stream
  • The Next.js application has no Netlify-specific configuration
  • On Vercel, everything works as expected (headers are set properly, etc.)
  • On Netlify, it works, but a bug or missing piece prevents streaming from working end-to-end

This issue template is killing me; I just wanted to ask whether this is a known limitation/issue. I'm trying to fill out the fields as requested. Happy to provide more details!

I'm pointing to the reproduction repository I'm working with, but it's not really a repro, since you would need API keys and so on; it's just there to give an idea of what I'm working on.

(By the way, I can't debug this properly locally, since netlify dev eventually fails with self is not defined coming from the streaming edge function. This does not happen with next dev, but it also happens with netlify dev --live, and I'm not sure how to go about it. I expected Next.js no longer used Webpack, but it does, and somehow netlify behaves differently from next here.)
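As a side note, a generic workaround people use for "self is not defined" in browser-targeted bundles running on a server is to shim the global before the offending code loads. This is only a sketch of that pattern, not a confirmed fix for the netlify dev behaviour described above:

```typescript
// Shim `self` onto globalThis if it is missing (a common generic
// workaround sketch; NOT a verified fix for this netlify dev issue).
const g: any = globalThis;
if (typeof g.self === "undefined") {
  g.self = globalThis; // browser bundles expect `self` to exist
}
console.log(g.self === globalThis); // → true
```

Whether this helps depends on the shim running before the bundled edge function code, which may not be possible in this setup.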

A link to a reproduction repository

https://github.com/kadena-community/7-docs-demo/blob/main/src/pages/api/completion.ts

Expected Result

See a streaming response, with the first chunk arriving early and subsequent chunks arriving over time

Actual Result

Technically a streaming response, but the first and last chunks arrive one millisecond apart

Steps to reproduce

To see the issue right away:

  1. Visit https://thunderous-medovik-731432.netlify.app
  2. Select "Need suggestions?" and pick one
  3. See that the response takes some time and is printed at once
  4. See in the network tab that the response was indeed streaming, but all chunks arrive at the same time (see screenshots)

To get something going locally:

  1. Create a Next.js application
  2. Add an API function like pages/api/my-function.js
  3. Call OpenAI's /chat/completions endpoint with stream: true
  4. See that the response takes some time and is printed at once
  5. See in the network tab that the response was indeed streaming, but all chunks arrive at the same time (see screenshots)
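For reference, a minimal sketch of the setup the steps above describe (illustrative only: the model name, prompt, and env var are my assumptions, not the actual repo code):

```typescript
// pages/api/my-function.ts — minimal edge-route sketch (assumptions:
// model name, prompt, and OPENAI_API_KEY env var; not the repo's code).
export const config = { runtime: "edge" };

export default async function handler(_req: Request): Promise<Response> {
  const upstream = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo", // assumed model
      stream: true,           // ask OpenAI to stream SSE chunks
      messages: [{ role: "user", content: "Say hello" }],
    }),
  });

  // Pass the upstream body straight through: if nothing in between
  // buffers, the browser receives chunks as OpenAI emits them.
  return new Response(upstream.body, {
    headers: { "Content-Type": "text/event-stream" },
  });
}
```

On Vercel this pattern streams to the browser as expected; on Netlify, per the above, the same code produces a response whose chunks all arrive together at the end.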
(Screenshots, 2023-05-19 12:47 PM: network tab showing all chunks arriving at the same time)

Next Runtime version

4.36.1

Is your issue related to the app directory?

  • [ ] Yes, I am using the app directory

More information about your build

  • [ ] I am building using the CLI
  • [ ] I am building using file-based configuration (netlify.toml)

What OS are you using?

None

Your netlify.toml file

`netlify.toml`
# Paste content of your `netlify.toml` file here

Your public/_redirects file

`_redirects`
# Paste content of your `_redirects` file here

Your next.config.js file

`next.config.js`
# Paste content of your `next.config.js` file here. Check there is no private info in there.

Builds logs (or link to your logs)

Build logs
# Paste logs here

Function logs

Function logs
# Paste logs here

.next JSON files

generated .next JSON files
# Paste file contents here. Please check there isn't any private info in them
# You can either build locally, or download the deploy from Netlify by clicking the arrow next to the deploy time.

webpro · May 19 '23 10:05