graphql-over-http
Add Request Batching Appendix
This documents what I understand to be current server support for request batching. I suggest discussing whether this feature should be added as an appendix, or whether any changes should be made. If nothing else, perhaps it should be documented as an obsolete protocol. In any case, this PR provides a starting point for such discussions.
If this appendix does not describe how existing servers currently support batching, please let me know. This should be essentially how GraphQL.NET does it, anyway, and seems to match the optional client query batching feature provided by Apollo Router.
This is not intended to replace the Variable Batching proposal.
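For concreteness, here is a minimal sketch of the array-style batching the appendix describes (the operations and data below are made-up examples; the point is simply one result per request, returned in request order):

```
POST /graphql HTTP/1.1
Content-Type: application/json

[
  {"query": "{ me { name } }"},
  {"query": "query Hero($id: ID!) { hero(id: $id) { name } }", "variables": {"id": "1"}}
]
```

The server responds with a JSON array of results in the same order as the requests:

```
HTTP/1.1 200 OK
Content-Type: application/json

[
  {"data": {"me": {"name": "Alice"}}},
  {"data": {"hero": {"name": "Luke"}}}
]
```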
Awesome, I will chime in once I have the basics done for variable batching.
What we need to get rid of is the JSON array, as it does not allow for streaming, out-of-order responses, or incremental delivery.
Right, but that's also true of the vanilla GraphQL-over-HTTP protocol. In other words, I think of this as a straightforward extension of the vanilla protocol. That being said, it may be past its usefulness now with streaming, incremental delivery, and other requirements expected of GraphQL servers.
We discussed many batching variants early on and decided to skip them for version one, as there is no standard around them. There is the Apollo way, there is the original Facebook way, which batches with a single GraphQL document and passes the invoked operation in the URL, and there are a ton of other variants depending on which vendor you look at. Because of this there is no vanilla batching.
If the payloads add a "requestIndex" property, it could fit quite well with #307 and fix these incompatibilities with newer features. We could then state that where legacy servers don't supply a requestIndex property, it should be assumed to match the index in the response array, and note that this means legacy servers do not support batching on multi-payload responses such as subscription, stream, and defer.
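A rough sketch of how that could look (the requestIndex property name follows the suggestion above; the newline-delimited streamed response is purely illustrative and not tied to any existing spec text):

```
POST /graphql HTTP/1.1
Content-Type: application/json

[
  {"requestIndex": 0, "query": "subscription { messageAdded { text } }"},
  {"requestIndex": 1, "query": "{ me { name } }"}
]
```

Because each payload carries the index of the request it answers, results can arrive out of order, and a single request can yield multiple payloads:

```
{"requestIndex": 1, "data": {"me": {"name": "Alice"}}}
{"requestIndex": 0, "data": {"messageAdded": {"text": "hi"}}}
{"requestIndex": 0, "data": {"messageAdded": {"text": "hi again"}}}
```

A legacy server that returns a plain response array without requestIndex would then be read as if each result's index were its position in that array.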