graphql-ruby
Build a smaller, faster runtime with fewer features
To me, the "interpreter" feels brand new, but in fact, it's almost 4 years old. Many people have run into performance problems that strike at the heart of how the interpreter works (TODO: list them here, e.g. #3936).
Personally, I think it's time to consider a revamp of the GraphQL-Ruby runtime with (yet another) specific focus on performance, perhaps by compromising feature-completeness.
In my dreams, you could build a system like this:
- A minimal runtime module which "just" calls methods on objects based on the incoming AST, building the results into a valid response Hash (using the schema's type system). Theoretically, you could use this, along with the schema configuration API, for a very minimal GraphQL runtime.
- Mixins or plugins, or whatever, for adding existing graphql-ruby features back into the system (for example, initializing `GraphQL::Schema::Object`s, `authorized?`, sophisticated field resolution, error handling, tracing, instrumentation, field extensions, lazy resolution, Dataloader, `context[:current_...]` values, and so on), so that any application can grab the features it wants without slowdowns from features it isn't using.
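To illustrate the first bullet, here's a minimal sketch of what such a bare-bones runtime could look like (all names here are hypothetical, not graphql-ruby's actual internals; the real thing would walk a GraphQL AST rather than nested Hashes):

```ruby
# Hypothetical minimal runtime: given a selection set (simplified here to
# nested Hashes instead of a real AST), resolve each field by calling the
# matching method on a plain Ruby object, building the response Hash.
# No authorization, tracing, lazies, or extensions -- those would be
# layered on as opt-in plugins.
module MinimalRuntime
  module_function

  # selections: { field_name => nil (leaf) or nested selections Hash }
  def resolve(object, selections)
    selections.each_with_object({}) do |(field, subselections), result|
      value = object.public_send(field)
      result[field.to_s] =
        if subselections.nil?
          value # leaf field: emit the raw value
        elsif value.is_a?(Array)
          value.map { |item| resolve(item, subselections) }
        else
          resolve(value, subselections)
        end
    end
  end
end

# Example usage with plain Ruby objects:
Author = Struct.new(:name)
Post = Struct.new(:title, :author)

post = Post.new("Hello", Author.new("Rhiannon"))
MinimalRuntime.resolve(post, { title: nil, author: { name: nil } })
# => { "title" => "Hello", "author" => { "name" => "Rhiannon" } }
```

The point of keeping the core this small is that each plugin could wrap or replace `resolve` without every query paying for features it doesn't use.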
This would be a pretty big chunk of work, but I think it's the only way to improve performance in critical cases while still supporting all the various features that have cropped up over the years.
Fixes #3998
If anyone has suggestions, questions, or comments while I ponder the idea, I'd welcome some discussion on this issue!
I've been looking into this as well for our company, since we've been running into these performance problems too. If there is anything I can do to dig in, I'd be happy to help where and when I can. The lazy resolve and extension phases seem to be where we lose performance. On top of that, we actually hit the worst-case scenario in field execution, where it checks whether the field responds to a method, is a hash, or has an inner object that responds to the method. I was playing around with being able to define the primary strategy a field uses to resolve its data, and that looks like it might help a little.
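The fallback chain being described, and the "define a primary strategy per field" idea, can be sketched roughly like this (a simplified, hypothetical version, not graphql-ruby's actual resolution code):

```ruby
# Simplified sketch of the multi-branch field resolution being described:
# each lookup probes several strategies in order, so the worst case pays
# for every failed probe on every field of every object.
def resolve_dynamically(object, field)
  if object.respond_to?(field)
    object.public_send(field)           # plain method call
  elsif object.is_a?(Hash)
    object[field] || object[field.to_s] # hash key lookup
  elsif object.respond_to?(:object) && object.object.respond_to?(field)
    object.object.public_send(field)    # wrapped inner object
  end
end

# Hypothetical optimization: decide the strategy once per field and cache
# it as a callable, so later resolutions skip the probing entirely.
def compile_strategy(sample_object, field)
  if sample_object.respond_to?(field)
    ->(obj) { obj.public_send(field) }
  elsif sample_object.is_a?(Hash)
    ->(obj) { obj[field] || obj[field.to_s] }
  else
    ->(obj) { obj.object.public_send(field) }
  end
end

user = { name: "Ada" }
strategy = compile_strategy(user, :name)
strategy.call(user) # => "Ada"
```

Compiling the strategy once assumes all objects behind a field resolve the same way, which is usually true for a given schema field.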
Thanks for chiming in, @benfalk -- that's definitely the kind of stuff I'd like to improve.
Something I have done recently that may help highlight this problem: instead of letting the GQL system drill into the fields, for connections that return the same objects every time, I have set up a system to produce a RawValue for the node. This has ended up shaving about 200ms of render time off our requests. I've been wondering how I might set up a system to dynamically compile known queries into a set of strategies that produce these RawValue fields where it makes sense.
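For anyone unfamiliar with the pattern: the idea is to pre-render a node's response Hash once and hand it back wrapped in a marker so the runtime injects it verbatim instead of resolving every sub-field again. Here's a gem-free sketch (the `RawResult` class and `NodeRenderer` are hypothetical stand-ins; in graphql-ruby the marker is the RawValue class mentioned above):

```ruby
# Sketch of the pre-rendering pattern: build a node's response Hash once,
# cache it, and wrap it in a marker class so a (hypothetical) runtime can
# pass it through untouched instead of drilling into each field.
class RawResult
  attr_reader :value

  def initialize(value)
    @value = value
  end
end

class NodeRenderer
  def initialize
    @cache = {} # node id => pre-rendered response Hash
  end

  # First call renders and caches; later calls serve the same Hash,
  # skipping per-field resolution entirely.
  def render(node)
    rendered = @cache[node[:id]] ||= { "id" => node[:id], "title" => node[:title] }
    RawResult.new(rendered)
  end
end

renderer = NodeRenderer.new
node = { id: 1, title: "Hello" }
first = renderer.render(node)
second = renderer.render(node)
first.value.equal?(second.value) # => true: same Hash object, no re-render
```

The savings come from paying the per-field resolution cost once per node instead of once per request.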
@benfalk I added RawValue to be able to do this https://github.com/DmitryTsepelev/graphql-ruby-fragment_cache 😅
@DmitryTsepelev I noticed; I have ended up using it as part of a "query analyzer" process that builds a smarter resolve strategy and wraps its output in a RawValue. It could use a lot more work, but it's saving us about 2 hours of CPU spin every 10 minutes or so when we are under load 😄
I've been working on extracting features and making them opt-out-able lately, and besides that, continuing to optimize the current runtime. There's nothing else on this issue for now, besides that ongoing work :+1: