swagger-ui
Large response bodies cause hanging
| Q | A |
|---|---|
| Bug or feature request? | Bug |
| Which Swagger/OpenAPI version? | 2.0 |
| Which Swagger-UI version? | 3.4.0 |
| How did you install Swagger-UI? | local dev server |
| Which browser & version? | Chrome 61 |
| Which operating system? | macOS High Sierra |
Demonstration API definition
Swagger 2 Petstore is sufficient.
Configuration (browser query string, constructor, config.yaml)
Default.
Current Behavior
As an example, GET /pet/findByStatus in the Petstore currently returns 1.8MB of JSON when all statuses are selected. This is a pretty big chunk of data, but Swagger-UI insists on rendering the entire body, which is neither performant nor useful. The main thread of the application is locked for upwards of 20 seconds on my machine.
Possible Solution
Truncate or refuse to display large textual response bodies.
Context
GET /pet/findByStatus is my favorite operation to use when testing UI enums, so this is something that impedes me pretty regularly.
cc @webron, would appreciate a priority assignment
This is strange! I swear I've used larger responses without any issues, let me see if I can make one up...
Here is my test: http://swashbuckletest.azurewebsites.net/swagger/ui/index?filter=Huge#/HugeResponse/HugeResponse_Get Yes, we have a problem: even with 100000 it generates a 378K response that takes way too long to render.
A quick fix could be not to display such large responses, and instead show something like GitHub does on diffs:
> Large diffs are not rendered by default.
or truncate the body to 20000:
<ResponseBody content={ body.substring(0, 20000) }
              contentType={ contentType }
              url={ url }
              headers={ headers }
              getComponent={ getComponent }/>
If we truncate, the response view will show: can't parse JSON. Raw result:
Profile-20171028T214335.zip: attached is my performance profile; I hope that can help with troubleshooting. Something in React is taking too long: webpack:///./~/react-dom/lib/DOMChildrenOperations.js?568f
function removeChild(parentNode, childNode) {
  if (Array.isArray(childNode)) {
    var closingComment = childNode[1];
    childNode = childNode[0];
    removeDelimitedText(parentNode, childNode, closingComment);
    parentNode.removeChild(closingComment);
  }
  parentNode.removeChild(childNode);
}
As @heldersepu suggested, I believe it's fair to limit the size that's being rendered, and if it's larger, make a link to download the result separately. This is not perfect though, as it means running the call again, and for some APIs it won't necessarily return the same result.
@webron, we'd be able to store the response string we already received, and allow the user to download it as a file - no second request needed.
As a proof of concept, see the Save as JSON/Save as YAML options in Swagger-Editor: we take the current editor content as a string and download it as a file, all from the client side.
That's perfect then.
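As a rough illustration of the client-side download idea (not swagger-ui's actual implementation; the function name and parameters here are made up), the already-received response string can be turned into a file download with the standard Blob and object-URL APIs:

```js
// Hypothetical helper: download a response body we already hold in memory,
// without issuing a second request.
function downloadResponseAsFile(body, contentType, filename) {
  const blob = new Blob([body], { type: contentType })
  const url = URL.createObjectURL(blob)
  const link = document.createElement("a")
  link.href = url
  link.download = filename
  document.body.appendChild(link)
  link.click()
  document.body.removeChild(link)
  URL.revokeObjectURL(url)
}

// e.g. downloadResponseAsFile(responseText, "application/json", "response.json")
```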
@shockey & @webron Love the solution! the download link will be awesome!
...but someone should try looking into why the slowdown happens. Don't you guys know any React guru who can truly get to the bottom of the issue?
In the past, we had issues with the syntax highlighter and large responses, but that doesn't seem to be the issue. I've tested the same API call in both Nightly and Chrome, and they both render fairly quickly (~3 seconds) when the format is JSON. When the format is XML though... that's when it explodes.
Okay, I dug into this a bit more, and @webron's note about XML is spot on.
When rendering XML, React spends a ton of time in the ResponseBody render function:
Everything else looks performant, so the good news is that our problem is contained to one component.
Doing a traditional profile on call stack times between clicking Execute and seeing the body revealed the true issue:
More than 95% of the main thread's time is spent parsing the whitespace-stripped XML that comes from the server into a pretty™ XML string. Even more specifically, this line appears to be the bottleneck: https://github.com/swagger-api/swagger-ui/blob/master/src/core/utils.js#L222
TL;DR: it's the formatXml(xmlStr) prettifier function in utils.js
I was looking at the formatXml function and that code smells funny... Inside that function there is a section where we:
- create an array `types`
- loop over `types`, populating `results`
- finally get the first one of `results`
It might be late and I'm missing something... or that can be simplified. I'm sending you guys a PR.
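Just to illustrate the shape being described (the names `types`, `re`, and the sample data below are invented; this is not the real formatXml source):

```js
// Hypothetical illustration of the pattern, not the actual formatXml internals.
const line = "<pet><id>1</id></pet>"
const types = [
  { name: "opening", re: /^<\w[^>]*[^/]>.*$/ },
  { name: "closing", re: /^<\/\w/ },
  { name: "self-closing", re: /^<\w[^>]*\/>/ },
]

// Pattern being questioned: evaluate every matcher, collect all results,
// then keep only the first one.
const results = types.filter(t => t.re.test(line)).map(t => t.name)
const kind = results[0]

// Simpler equivalent: stop at the first matching type.
const first = types.find(t => t.re.test(line))
const simplerKind = first ? first.name : undefined

console.log(kind, simplerKind) // both log "opening"
```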
@heldersepu, I agree - note that the code was originally taken from an SO answer.
I'd be in favor of using a third-party module to handle the formatting. In the meantime, I'll take a look at your PR 😄
I think we still have a problem with large responses. My test: http://swashbuckletest.azurewebsites.net/swagger/ui/index?filter=Huge#/HugeResponse/HugeResponse_Get That is a JSON response, and it gets uncomfortably slow to render...
I think we should still set a hard MAX for the response size and provide a download link for those large ones.
@heldersepu, yeah, I see it with your example - the request comes back in under a second, but the rendering takes about ten seconds. I agree with the hard max, but I'd also like to continue looking at the underlying causes and try to address as we go.
Reopening.
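To make the hard-max-plus-download idea concrete, here is a minimal sketch of a guard around the existing response rendering. The threshold, the component name passed to getComponent, and the downloadResponseAsFile helper (sketched earlier in this thread) are assumptions, not swagger-ui's actual code:

```jsx
import React from "react"

// Hypothetical cutoff; anything larger is offered as a download instead of being rendered.
const MAX_RENDERED_CHARS = 100000

function GuardedResponseBody({ body, contentType, url, headers, getComponent }) {
  if (body.length > MAX_RENDERED_CHARS) {
    return (
      <div>
        Response is too large to render ({body.length} characters).{" "}
        {/* downloadResponseAsFile: the client-side helper sketched earlier in this thread */}
        <button onClick={() => downloadResponseAsFile(body, contentType, "response.txt")}>
          Download
        </button>
      </div>
    )
  }
  const ResponseBody = getComponent("responseBody")
  return (
    <ResponseBody
      content={body}
      contentType={contentType}
      url={url}
      headers={headers}
      getComponent={getComponent}
    />
  )
}
```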
Looks like the lion's share of that time (about 8 seconds) is React fiddling with the DOM. The response rendering causes 370(!!) DOM operations to be queued, and >90% of them appear to be caused by JumpToPath affecting ModelCollapse, which is unrelated to the response coming in as far as I can see.
This is a performance issue, but I'm inclined to keep this separated from our big performance ticket, since it's related to Try-It-Out, not initial rendering.
Below is a snapshot of React's DOM operation queue (the HighlightCode update with the response body isn't even on the first page):
Why not render asynchronously in stages, like e.g. Postman does? It works well and there's barely any need for switches or limits.
BTW, this issue got much worse in recent versions, presumably due to the pretty-print / color enhancements, which make rendering even more expensive.
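For what it's worth, "render asynchronously in stages" can be as simple as splitting the work into chunks and yielding to the browser between them; a generic sketch of the idea (not how Postman or swagger-ui actually implement it):

```js
// Hypothetical sketch: append a large response body to the DOM in chunks,
// yielding between chunks so the main thread is never blocked for long.
function renderInStages(container, text, chunkSize = 50000) {
  let offset = 0
  function step() {
    container.appendChild(
      document.createTextNode(text.slice(offset, offset + chunkSize))
    )
    offset += chunkSize
    if (offset < text.length) {
      requestAnimationFrame(step) // continue on the next frame
    }
  }
  step()
}
```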
> Why not render asynchronously in stages, like e.g. Postman does? It works well and there's barely any need for switches or limits.
What's the purpose of showing/rendering a response larger than 2MB? That is certainly not human-readable. My sample above takes a few seconds on Postman's "formatting" step too... To me, what GitHub does on diffs is the way to go: let users know the response is large and give them the option to download the content or render it at their own risk.
> What's the purpose of showing/rendering a response larger than 2MB?
your argument makes no sense to me. if there's no purpose (in showing the content), why would you download it to eventually do exactly that.
download is okay, it would resolve the most important aspect of this issue. but it's still less convenient.
here are a couple of ideas why someone would download:
- to send it to someone else
- to process it with a tool
Hopefully, that makes the argument make sense to you
If it makes sense to download it, it (very often) makes even more sense to view the content right in place.
Just for quickly checking a subset of a larger result, having to download it, open it in another editor, and clean up afterwards is an awful lot of unnecessary and tiring additional steps.
> If it makes sense to download it, it (very often) makes even more sense to view the content right in place.
YES, and that is exactly what Swagger-UI is doing right now; it works perfectly for 99.9% of cases... but this issue is about the remaining rare cases, the ones with large response bodies.
Tools are purpose-built, and Swagger-UI is no exception... From my days creating ETL jobs: I would not use a browser or a regular text editor to look at large data; instead, I would get the schema definition and load the data into a database for further processing/analysis.
This issue is more about bad practices in API design: responses should not be large, and if there is a need, the architect should use pagination. That is why I personally never attempted a code fix.
> This issue is more about bad practices in API design: responses should not be large,
That highly depends on the use case.
Anyway, it's not implemented that way in Postman for no reason; expensive operations should not block the UI, speaking of bad practices.
In the latest builds it doesn't even take a really "large" result for swagger-ui to begin stalling.
> Anyway, it's not implemented that way in Postman for no reason; expensive operations should not block the UI, speaking of bad practices.
Remember, the project is open source; if this is a big issue for you, the code is there for you to fix it.
> Remember, the project is open source; if this is a big issue for you, the code is there for you to fix it.
it's not an issue at all for me. just some thoughts for a good solution, as it wasn't mentioned as an option before.
Configuration
- OS: Windows 10.0.18363 Build 18363
- Browser: Chrome 85.0.4183.102
- Method of installation: Swashbuckle.AspNetCore 5.6.0 NuGet
- Swagger-UI version: 3.32.5
I'd like to tag on to this issue. I've been using Swashbuckle.AspNetCore 5.5.1 (Swagger-UI v3.26.0, I believe) and have a somewhat large response body (20k lines of JSON, give or take). With 5.5.1, the request duration is 45ms and the body displays almost instantly. Using the exact same configuration with 5.6.0 (Swagger-UI v3.32.5), the request duration is still 45ms, but the interface locks up for 30 seconds before it finally displays. Once that is rendered, if I minimize or expand anything else, the interface is locked again for 30 seconds each time. Once I clear the results, the interface speed returns to normal.
The added coloring is fantastic, but I am currently stuck on the older version, as the speed is simply too much of an issue.
In order for the results to be displayed, I have to .Take(200); if I let more than that through, it will not render and times out. When I revert to 5.5.1, the UI renders all the records almost instantly. I am now stuck with Swashbuckle.AspNetCore 5.5.1.
Ran into this recently too. Looks like you can turn it off via config. For me, this worked:
SwaggerUI({
  syntaxHighlight: {
    activated: false,
    theme: "agate"
  },
  // url: path,
  // ...
});
I am using Swagger with NodeJS, and one of my API responses is 1.54MB in size; it is not working for me either and also hangs.
I have a fix.
Do something like this in your web service method:
// https://github.com/swagger-api/swagger-ui/blob/master/src/core/components/response-body.jsx#L61
// If you set Content-Disposition to something containing "attachment", the body will not be
// rendered, but made downloadable in Swagger UI.
// "inline" means the content is rendered directly in the browser when the endpoint is invoked
// outside of Swagger UI.
httpServletResponse.setHeader("Content-Disposition", "inline; swaggerDownload=\"attachment\"; filename=\"filename.txt\"");
I.e., set the Content-Disposition header to something containing "attachment".
If you look at https://github.com/swagger-api/swagger-ui/blob/master/src/core/components/response-body.jsx#L61 you will see that swagger UI checks if the HTTP header "Content-Disposition" contains the string "attachment"
If it does, the content is NOT shown directly, but hidden behind a download link.
Note that response-body.jsx does NOT check anything about the well-formedness of the content disposition header. It just checks if it contains the word "attachment".
For this reason I just added an extra field, swaggerDownload="attachment";, in order to have the word "attachment" there without confusing other systems.
If this API method is invoked normally from a browser, rather than from Swagger UI, it will just render the content inline, as specified; that is also the default behaviour if Content-Disposition is not set.
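For reference, the check being described boils down to a simple pattern test on the header value; a rough paraphrase (not the exact response-body.jsx code):

```js
// The body is treated as downloadable whenever Content-Disposition merely
// contains the word "attachment", regardless of the header's overall form.
const disposition = 'inline; swaggerDownload="attachment"; filename="filename.txt"'
const downloadable = /attachment/i.test(disposition)
console.log(downloadable) // true, so Swagger UI hides the body behind a download link
```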