graphql-ruby
[pro] add batch request support for ably client
https://www.ably.io/documentation/rest-api/beta
This would be awesome to add. It didn't seem straightforward to me because of all the nested block calls, but right now our subscription workers are our slowest workers because they're sending these calls serially.
Thanks for the link. I think it's possible, but I doubt I'll have the bandwidth for it soon. So I'll share my thoughts in case you want to give it a go yourself.
I think the ticket is to expand on this method, `Subscriptions#execute_all`:
https://github.com/rmosolgo/graphql-ruby/blob/f5cd6823b2e03d810966f0bc06ed432c1b4ea8ea/lib/graphql/subscriptions.rb#L118-L127
By default, it gets subscription IDs one-by-one, and given an ID, it fetches query data, executes the query, and delivers the result.
I think it would be possible to do something like:
- Gather a batch of subscription IDs
- For each one, fetch query data and execute the query, gathering the results
- Then, post the results as a batch to Ably
```ruby
def execute_all(event, object)
  batch = []
  each_subscription_id(event) do |subscription_id|
    batch << subscription_id
    if batch.size == MAX_BATCH_SIZE # No idea what would be best here ... 10? 100?
      execute_batch(batch, event, object)
      batch.clear # start a fresh batch
    end
  end
  # Also deliver the leftover, partial batch:
  execute_batch(batch, event, object) if batch.any?
end

def execute_batch(batch, event, object)
  # Use the batch presence check API to see who's still listening;
  # keep any channels that are still active, `delete_subscription` for any channels that _aren't_ listening
  active_batch_members = check_presence(batch)
  (batch - active_batch_members).each { |sub_id| delete_subscription(sub_id) }
  # Gather the results as [sub_id, result] pairs
  results = active_batch_members.map { |sub_id| [sub_id, execute(sub_id, event, object)] }
  deliver_batch(results)
end
```
One problem is that `execute(...)` calls `deliver(...)`, so the code above won't work exactly as written. You'd have to either re-implement `execute(...)` so that it doesn't call `deliver(...)`, or stub out `def deliver` so it's a no-op. (Note to self: separate those methods so that they can be customized in isolation, as seen here!)
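To illustrate the stub-`deliver` route, here's a rough, self-contained sketch. Everything in it is a stand-in: `BaseSubscriptions` plays the role of the real `GraphQL::Subscriptions` class, and the `collecting` helper and `@collecting` flag are made up for this example, not part of the library's API:

```ruby
# Stand-in for GraphQL::Subscriptions: `execute` runs the query and
# pushes the result via `deliver`.
class BaseSubscriptions
  def execute(subscription_id, event, object)
    result = { "data" => { "id" => subscription_id } } # placeholder for real query execution
    deliver(subscription_id, result)                   # normally pushes to the transport
    result
  end

  def deliver(subscription_id, result)
    raise "network call!" # would hit Ably in the real class
  end
end

class BatchingSubscriptions < BaseSubscriptions
  # Run the block with `deliver` turned into a no-op, so `execute`
  # returns results without pushing them one at a time.
  def collecting
    @collecting = true
    yield
  ensure
    @collecting = false
  end

  def deliver(subscription_id, result)
    return if @collecting # skip per-result delivery while batching
    super
  end
end
```

Inside `execute_all`, the batching loop would run within `collecting { ... }`, gather the return values of `execute`, and hand them to a batch-delivery method afterward.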
Also, `deliver_batch` would have to be implemented. Although Ably's batch API supports arrays for channels and messages, I think the outgoing batch here would be a list of single messages for single channels, something like:

```ruby
[
  { channels: "subscription-1", messages: result_1 },
  { channels: "subscription-2", messages: result_2 },
  # ...
]
```
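As a hedged sketch of what `deliver_batch` might look like, assuming the ably-ruby client's generic `request` method and the beta batch-publish endpoint (`POST /messages`) from the docs linked above. The `build_batch_payload` helper and the message `name` are illustrative, not an established API:

```ruby
# Build one batch entry per subscription: a single message for a single channel.
# `results` is the list of [subscription_id, result] pairs gathered above.
def build_batch_payload(results)
  results.map do |subscription_id, result|
    {
      channels: subscription_id,                       # one channel per entry
      messages: { name: "update", data: result.to_h }, # one message per entry
    }
  end
end

def deliver_batch(ably_client, results)
  payload = build_batch_payload(results)
  # `Ably::Rest::Client#request` issues an arbitrary REST call,
  # which is how the beta batch endpoint would be reached:
  ably_client.request(:post, "/messages", {}, payload)
end
```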
Also, handling Ably's response might be tricky -- on the other hand, I don't think the current implementation handles the response at all 🤔
Finally, `check_presence(batch)` would also need to be implemented. It looks like the API expects a comma-joined list of channel names.
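A rough sketch of `check_presence`, again going through the client's generic `request` method against the beta batch-presence endpoint. The response shape assumed here (items with a `"channel"` name and a `"presence"` member list) is a guess from the linked beta docs, not a confirmed contract:

```ruby
# Returns the subset of subscription IDs whose channels still have
# someone present; the caller can `delete_subscription` for the rest.
def check_presence(ably_client, subscription_ids)
  response = ably_client.request(
    :get, "/presence", { channels: subscription_ids.join(",") }
  )
  # Assumed response shape: one item per channel, each listing present members.
  active = response.items.select { |item| item["presence"]&.any? }
  active.map { |item| item["channel"] }
end
```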
Anyways, I hope that helps! I hope I can take a look in the next month or two, so I'll keep this open til then.
i think this might be a generally good option for all subscriptions; i'd guess other services offer a batch api too. i'm going to take a look when i get a moment.
I recently merged #2844, which will make this possible in general. Then I asked about Ably's batch support and got directed to the `#request` method. I hope to take a closer look soon.
Adding a note since I've been exploring Ably batch support. In my case, I'm using broadcastable subscriptions and would want to post a single message to multiple channels. Alternatively, I'd be interested in broadcastable queries sharing a channel (same subscription_id) which would eliminate my need for ably batches support.
Hey @rmosolgo, any progress on this one? We've realized that the more subscribers we have, the slower the responses get. It seems tricky to fix, so we're even thinking about moving to Pusher since you already support that, but we'd avoid that if possible.