
fix: #1884 - Fixes infinite loading issue for text/event-stream requests

Open davirxavier opened this issue 8 months ago • 4 comments

Description

This pull request fixes the infinite loading issue when handling requests with the text/event-stream content type by adding support for these streamed request types.

I decided to implement this feature after noticing a few issues raised by users requesting it. Some mentioned that the lack of support for this content type was the only reason they hadn’t fully migrated from Postman (that's my case as well).

If I didn't follow any of the project's standards or missed a needed change somewhere else, please let me know and I will be glad to push any fixes.

Relevant issues: #1884, #2489, #498

(I changed the title from feat to fix because fix seems better suited.)

Changes

The code changes are fairly simple. In summary, I set the responseType to stream on the axios request if the accept header is set to text/event-stream, forward the incoming streamed data to the front-end by listening to the stream callbacks, and handle the streamed data in the response body panel.

With these changes, if the user sends a request with an accept header of text/event-stream, the right arrow icon in the URL bar changes to an X icon and the streamed data is appended to the response body panel. If the user clicks the X icon, the front-end sends an event to the back-end signaling it to close the stream, and the data stops being streamed.

  • Added a boolean, item.response.hasStreamRunning, which is true while a stream is active; this is the variable used to check whether a stream is currently running.
  • packages/bruno-app/src/components/RequestPane/QueryUrl/index.js: Changed the icon from the arrow to an X while a stream is active.
  • packages/bruno-app/src/components/RequestTabPanel/index.js: Changed the handleRun function to cancel the current request if there is a stream running (instead of starting a new request).
  • packages/bruno-app/src/providers/App/useIpcEvents.js - Added two new events:
  1. main:http-stream-new-data - Will be emitted from the back-end to the front-end when there is new data on the stream.
  2. main:http-stream-end - Will be emitted from the back-end to the front-end when the stream has been cancelled by the host.
  • packages/bruno-app/src/providers/ReduxStore/slices/collections/index.js - Added and changed a few collection actions:
  1. requestCancelled - If a stream is active, don't set the response to null; instead, set the item.response.hasStreamRunning variable to null so the front-end resets to the no-stream-running state.
  2. Don't set the cancelTokenUid to null when the request is created, so it can be cancelled by the user.
  3. Added the streamDataReceived action; it runs when the main:http-stream-new-data event is received and concatenates the data received on the stream with the currently displayed data.
  • packages/bruno-app/src/utils/network/index.js - Copy the hasStreamRunning variable from the back-end response to item.
  • packages/bruno-electron/src/ipc/network/index.js:
  1. Added a check for the text/event-stream accept header in the request send handler; if the header is found, the axios responseType is set to "stream" and the connection header to "keep-alive" so the connection isn't dropped (see the sketch after this list).
  2. If the header is found, the first response data will be an empty string.
  3. Before returning the response from the handler, listeners are added for the data-received and stream-ended callbacks on the axios stream object.
  4. When the data callback fires, a main:http-stream-new-data event is emitted to the front-end with the new data.
  5. When the stream-ended callback fires, check whether a delete token is still saved for that request; if there is, the end was not initiated by the user, so emit main:http-stream-end to the front-end and remove the saved delete token for that request.
  6. Skip the request when running a whole collection (because the event stream would never end).
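
The following is not the PR's actual code, just a minimal sketch of the back-end flow described above, assuming an Electron main process that has access to the renderer's webContents and issues the request with axios. The event names main:http-stream-new-data and main:http-stream-end match the ones listed above; helper names such as sendStreamedRequest and responseMeta are made up for illustration.

```js
const axios = require('axios');

// Hypothetical helper: `webContents` is the renderer's WebContents,
// `request` is the prepared axios config, `requestUid` identifies the request.
async function sendStreamedRequest(webContents, request, requestUid) {
  const headers = request.headers || {};
  const accept = String(headers.accept || headers.Accept || '').toLowerCase();
  const isEventStream = accept.includes('text/event-stream');

  if (isEventStream) {
    // Stream the body instead of buffering it, and keep the connection open.
    request.responseType = 'stream';
    request.headers = { ...headers, connection: 'keep-alive' };
  }

  const response = await axios(request);

  if (!isEventStream) {
    return { ...responseMeta(response), data: response.data };
  }

  // Forward each chunk to the front-end as it arrives.
  response.data.on('data', (chunk) => {
    webContents.send('main:http-stream-new-data', {
      requestUid,
      data: chunk.toString('utf8')
    });
  });

  // When the server closes the stream, notify the front-end so it can
  // reset its "stream running" state.
  response.data.on('end', () => {
    webContents.send('main:http-stream-end', { requestUid });
  });

  // The first response shown in the UI is an empty string; chunks are appended later.
  return { ...responseMeta(response), data: '', hasStreamRunning: true };
}

// Hypothetical helper that copies status and headers off the axios response.
function responseMeta(response) {
  return { status: response.status, statusText: response.statusText, headers: response.headers };
}
```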

Contribution Checklist:

  • [ X ] The pull request only addresses one issue or adds one feature.
  • [ - ] The pull request does not introduce any breaking changes - It could have breaking changes. I only tested the basic functionality of the desktop app, since I don't have that much familiarity with the app as a whole or with the application's other interfaces (it would be good if someone could help by testing the other functionality).
  • [ - ] I have added screenshots or gifs to help explain the change if applicable. - Not applicable.
  • [ X ] I have read the contribution guidelines.
  • [ X ] Create an issue and link to the pull request.

Note: Keeping the PR small and focused helps make it easier to review and merge. If you have multiple changes you want to make, please consider submitting them as separate pull requests.


Resolves #1884 Resolves #2489 Resolves #498

davirxavier · Apr 10 '25

@davirxavier what are your thoughts on the following?

  • How can we handle Accept: */*? Currently we need to add Accept: text/event-stream as a request header to make it work.

  • Also, while making a request, the elapsed time currently shown is that of the first stream response, not the total time elapsed to get all the responses. Other API testing clients seem to show the total elapsed time; what are your thoughts on that?

  • The send button gets disabled only after we receive the first streamed response, so multiple clicks after the first one currently result in multiple responses being received at the same time. For example, if we are supposed to get 7 responses back per request and we click 3 times in quick succession, we end up populating the response pane with 21 responses, and all the responses disappear once everything has arrived. I have added a screen recording of this below.

  • The post-response script seems to run right after the first response instead of waiting for all the streamed responses. How do you think we can tackle this?

@helloanoop @ramki-bruno, I would also love to hear your thoughts on these and how we should approach these problems!

https://github.com/user-attachments/assets/d242ea2e-f2f9-46f3-be35-39a916669a2e

sanish-bruno · May 05 '25

@sanish-bruno Hello :)

How can we handle Accept: */*? Currently we need to add Accept: text/event-stream as a request header to make it work.

  • The accept: text/event-stream header is important only so we can explicitly set the responseType property to "stream" on the axios request before it is executed. I could probably make an automatic "upgrade" to the stream responseType by making one request normally, checking the response headers for the Content-Type with an interceptor, cancelling the request and making another one with the appropriate responseType value (I don't think the responseType can be changed on the fly with axios), but that would increase complexity a bit. Do you think it would be worth including something like this?

Also, while making a request, the elapsed time currently shown is that of the first stream response, not the total time elapsed to get all the responses. Other API testing clients seem to show the total elapsed time; what are your thoughts on that?

The send button gets disabled only after we receive the first streamed response, so multiple clicks after the first one currently result in multiple responses being received at the same time. For example, if we are supposed to get 7 responses back per request and we click 3 times in quick succession, we end up populating the response pane with 21 responses, and all the responses disappear once everything has arrived. I have added a screen recording of this below.

  • These seem easy to fix; I can work on them next week.

The post-response script seems to run right after the first response instead of waiting for all the streamed responses. How do you think we can tackle this?

  • Would it be better for the script to run after the connection is closed or after every event is received from the server?

davirxavier · May 09 '25

  • Would it be better for the script to run after the connection is closed or after every event is received from the server?

I think we should run the post-response scripts after all the required responses have been received and the connection is closed; that has been the behaviour in other API clients, and I believe we should follow the same.

  • The accept: text/event-stream header is important only so we can explicitly set the responseType property to "stream" on the axios request before it is executed. I could probably make an automatic "upgrade" to the stream responseType by making one request normally, checking the response headers for the Content-Type with an interceptor, cancelling the request and making another one with the appropriate responseType value (I don't think the responseType can be changed on the fly with axios), but that would increase complexity a bit. Do you think it would be worth including something like this?

The current behaviour forces users to explicitly add Accept: text/event-stream to make these requests work. In other API testing clients, users are able to make such requests without adding Accept: text/event-stream to the request headers. I think we should rely on response headers, instead of request headers, to decide how the response should be handled.

@davirxavier What are your thoughts on these?

sanish-bruno · May 14 '25

I agree with these points; I'm currently working on adjustments for them.

davirxavier · May 20 '25

@sanish-bruno I have made new commits fixing the discussed problems. Below is a list of the changes I had to make in the code. Apologies for the delay.

How can we handle Accept: */*? Currently we need to add Accept: text/event-stream as a request header to make it work.

I've managed to make the "upgrade" to streamed responses fully automatic, though I had to change all requests to the "stream" responseType. With this setting, requests always return a stream object immediately instead of the received data, even for responses that are not text/event-stream. To avoid making too many changes to the current code, I've wrapped the stream in a promise that resolves with the full accumulated data when the stream closes and rejects if any error occurs. So, if the response has the text/event-stream content-type, I mark the response as a stream and return immediately (as it worked before); otherwise, I wrap the stream in the aforementioned promise and await its completion to get the full response data.
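
As a rough illustration of the wrapping described above (not the actual PR code, and the names are made up), assuming the axios responseType is always set to 'stream':

```js
// Accumulate a Node readable stream into a single Buffer: resolve with the
// full body when the stream ends, reject if the stream errors.
function collectStream(stream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    stream.on('data', (chunk) => chunks.push(chunk));
    stream.on('end', () => resolve(Buffer.concat(chunks)));
    stream.on('error', (err) => reject(err));
  });
}

// Illustrative usage inside the request handler:
//   const response = await axios({ ...requestConfig, responseType: 'stream' });
//   const contentType = String(response.headers['content-type'] || '');
//   if (contentType.includes('text/event-stream')) {
//     // mark as a stream and return immediately, forwarding chunks to the front-end
//   } else {
//     response.data = await collectStream(response.data); // full body as a Buffer
//   }
```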

For the full folder/collection runner I made the same changes, but requests with text/event-stream responses wait for one data event and then stop automatically, so the request does not hang indefinitely.

I have tested these changes with text/event-stream, JSON, image, video and HTML responses and everything seemed to work fine, though this warrants more testing, as the changes I have made affect all requests (even if the code changes are simple).

Also, while making a request, the elapsed time currently shown is that of the first stream response, not the total time elapsed to get all the responses. Other API testing clients seem to show the total elapsed time; what are your thoughts on that?

I have added a stopwatch to the response panel that shows the elapsed time in real time. Once the stream is closed, it shows the total time the stream was active.
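
A minimal sketch of such a ticking elapsed-time display in React (not the PR's actual component; the props startTime and isStreaming are assumptions for illustration):

```jsx
import { useEffect, useState } from 'react';

// `startTime` is a ms timestamp for when the request started,
// `isStreaming` is true while the stream is still open.
function ElapsedTime({ startTime, isStreaming }) {
  const [now, setNow] = useState(Date.now());

  useEffect(() => {
    if (!isStreaming) return;
    // Tick every 100ms while the stream is active; the interval is cleared
    // when the stream closes, freezing `now` at the last tick.
    const timer = setInterval(() => setNow(Date.now()), 100);
    return () => clearInterval(timer);
  }, [isStreaming]);

  const elapsedMs = Math.max(0, now - startTime);
  return <span>{(elapsedMs / 1000).toFixed(1)}s</span>;
}
```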

The send button gets disabled only after we receive the first streamed response, so multiple clicks after the first one currently result in multiple responses being received at the same time.

I have changed the button so that it does not fire more requests while the current request is loading. I think the same was happening with other request types too, so this should fix it for them as well.

The post-response script seems to run right after the first response instead of waiting for all the streamed responses. How do you think we can tackle this?

The script now runs only after the stream has closed completely.

davirxavier · May 31 '25

@davirxavier Thank you so much for actively working on this; we really appreciate it. I have tested it myself, and most of it seems to work as expected. We will review it, and will most probably also have to test the behaviour on the CLI. This needs to be tested thoroughly, as there are many changes in major parts of the code.

We will be proactively following this. Thanks again! We will let you know if there is anything we need to take care of.

sanish-bruno · Jun 03 '25

The mentioned issues have been resolved successfully.

davirxavier · Jun 21 '25

Hi, just chiming in on the discussion here. I'm currently testing this PR locally and found that if the stream is in JSON format, the result is displayed immediately after the first response part, but it just shows [object Object] throughout and after the stream ends (screenshot attached).
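
One plausible cause (not verified against the PR code, purely an assumption for illustration) is that the chunks reach the response pane as already-parsed objects rather than strings, and string concatenation of a plain object produces "[object Object]":

```js
// String concatenation calls toString() on the object:
const chunk = { message: 'hello' };
console.log('' + chunk);                 // "[object Object]"

// Serializing before concatenation keeps the JSON readable:
console.log('' + JSON.stringify(chunk)); // {"message":"hello"}
```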

motz0815 · Jun 23 '25

@davirxavier Thanks a lot for spending time refining the PR. We have been looking at other API testing clients and how they handle SSE; many of them support SSE as a separate entity, with a specific UI to go with it. We are thinking of a holistic approach. Even though the PR addresses many things, we will still need to put in a lot more effort to make it on par with other tools.

We will be allocating time for the team to work on that and will keep you in the loop on the progress we make. We really appreciate the time you spent on this PR over the last few weeks and the effort you put into solving each of the concerns. Thank you!

sanish-bruno · Jun 24 '25

Hi, any news about SSE support? Is it still on the roadmap?

I work with systems that use SSE and would love to see even a not-perfect solution that would let me stick with Bruno for all my tasks. A significant amount of work has gone into this contribution; maybe it could be merged for the time being, while the "holistic solution" is prepared?

shwarcu · Oct 22 '25

This is on our roadmap for November. @sid-bruno will be getting this merged this week.

helloanoop · Nov 11 '25

Merging it into an internal branch for fixes and improvements over the original implementation

sid-bruno · Nov 12 '25

Progress can be tracked on https://github.com/usebruno/bruno/pull/6074

sid-bruno · Nov 12 '25