Solnet

Add *Async overloads in batching mechanism

Open tiago18c opened this issue 2 years ago • 4 comments

The lack of async methods in the batching requests makes them unusable with Avalonia and Blazor WASM.

tiago18c avatar Feb 08 '22 17:02 tiago18c

Is Avalonia the UI framework used for Citadel? This issue cropped up during the original development of batching. A minimal skeleton project that demonstrates the problem for Avalonia and Blazor WASM would be very helpful.

liquizard avatar Feb 08 '22 18:02 liquizard

There's an easy workaround, but the gist of it is this: when binding actions (methods) to the UI, if the bound method doesn't return quickly, the UI dispatcher locks up. However, if the bound method is async (along with all its inner IO-bound calls), the dispatcher manages to switch between drawing and execution and the app doesn't freeze.

The quick solution is to wrap anything long-running in an async lambda, e.g. `await Task.Run(() => /* long running task */);`.
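For example, in a view model the bound handler could look roughly like this (the class and delegate below are purely illustrative, not Solnet types):

```csharp
using System;
using System.Threading.Tasks;

// Illustrative view model; the injected delegate stands in for any long-running synchronous Solnet call.
public class SendViewModel
{
    private readonly Action _flushBatch;

    public SendViewModel(Action flushBatch) => _flushBatch = flushBatch;

    // Bind this async handler to the UI command instead of a synchronous one.
    public async Task OnSendClickedAsync()
    {
        // Offloading the blocking work to the thread pool keeps the dispatcher free to draw.
        await Task.Run(_flushBatch);
    }
}
```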

However, for the Blazor WASM case, any `.Result` call on a Task will throw an exception, because the single-threaded runtime cannot block on Monitor internals. See this.
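For illustration, this is roughly the failing pattern versus the working one (it assumes Solnet's `IRpcClient.GetBalanceAsync`; the exact return shape is an assumption):

```csharp
using Solnet.Rpc;
using System.Threading.Tasks;

// Sketch only; the return shape of GetBalanceAsync is assumed for illustration.
public static class BalanceExample
{
    public static async Task ShowBalanceAsync(IRpcClient rpc, string address)
    {
        // var res = rpc.GetBalanceAsync(address).Result;  // throws under Blazor WASM:
        // .Result blocks the single thread, and Monitor-based waits are unsupported there.

        var res = await rpc.GetBalanceAsync(address);      // awaiting yields instead of blocking
        System.Console.WriteLine(res.WasSuccessful);       // property name assumed for illustration
    }
}
```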

tiago18c avatar Feb 08 '22 18:02 tiago18c

I will implement an ExecuteAsync method, but there is a concurrency hazard we need to defend against, and that may bring some internal changes with it.

At the moment, the internal Composer is reused between calls. When Execute happens synchronously, we are certain that new commands are not going to be added immediately to the batch that is currently executing. Once execution has completed and the callbacks have been invoked, the batch is reset, ready to receive more commands. If ExecuteAsync returns immediately, callers may start throwing more requests into the batch. With auto-execute, this should not be allowed to continue indefinitely; at some point it would be a good idea to block the caller, otherwise you could potentially have hundreds of large batches "in flight".
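To make the hazard concrete, the sequence to defend against looks roughly like this (the IBatcher interface below is purely illustrative, not the real Solnet types):

```csharp
using System.Threading.Tasks;

// Hypothetical batcher shape, only to make the race visible; not the real Solnet API.
public interface IBatcher
{
    Task ExecuteAsync();
    void AddRequest(object request);
}

public static class HazardExample
{
    public static async Task Demo(IBatcher batcher)
    {
        var inFlight = batcher.ExecuteAsync();  // batch N starts its HTTP round trip

        batcher.AddRequest("getBalance");       // new commands arrive while N is still executing;
        batcher.AddRequest("getBlockHeight");   // with auto-execute these can spawn N+1, N+2, ...

        await inFlight;                         // batch N's callbacks only fire here
    }
}
```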

There are a couple of options on how we can implement this:

  1. make all this the caller's problem, e.g. an attempt to add to a batch that is currently executing would throw a GTFO exception
  2. handle the complexity internally, e.g. Execute would store the executing batch internally and create a new one to receive new commands from the caller. A call to Flush would execute if there is no batch currently executing; otherwise it would be queued (see the sketch after this list).
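
A rough sketch of option 2, assuming a simplified internal shape (none of these names come from the current Composer):

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical internals illustrating option 2; field and type names are illustrative only.
public class BatchScheduler
{
    private readonly object _gate = new();
    private readonly Queue<List<object>> _pending = new();
    private List<object> _current = new();
    private bool _executing;

    // Callers always add to the batch that is not currently executing.
    public void Add(object request)
    {
        lock (_gate) _current.Add(request);
    }

    public async Task FlushAsync()
    {
        List<object> toRun;
        lock (_gate)
        {
            toRun = _current;
            _current = new List<object>();                        // fresh batch for new commands
            if (_executing) { _pending.Enqueue(toRun); return; }  // queue behind the in-flight batch
            _executing = true;
        }

        while (toRun != null)
        {
            await SendAsync(toRun);                               // stand-in for the real HTTP round trip
            lock (_gate)
            {
                toRun = _pending.Count > 0 ? _pending.Dequeue() : null;
                _executing = toRun != null;
            }
        }
    }

    private static Task SendAsync(List<object> batch) => Task.Delay(10);
}
```

One open question with this shape is back-pressure: a queued flush returns immediately, so something still has to cap how many batches pile up behind the one in flight.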

Any thoughts or preferences?

liquizard avatar Feb 09 '22 10:02 liquizard

There is already an ExecuteAsync; the problem is that everything above it has no *Async version. Wouldn't the following be enough?

  1. FlushAsync that calls ExecuteAsync and awaits it
  2. AddAsync that conditionally calls FlushAsync and awaits it
  3. AddRequestAsync that calls AddAsync and awaits it
  4. An *Async overload for every RPC method
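
A sketch of what that chain could look like (signatures and the auto-execute check are assumptions, not the current Solnet code):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Sketch only; names mirror the proposal above, but every signature here is an assumption.
public class BatchComposerSketch
{
    private readonly List<object> _reqs = new();
    private readonly int _autoExecuteThreshold = 10;

    // 4. Each batched RPC method gains an *Async overload that funnels into AddRequestAsync.
    public Task GetBalanceAsync(string address, Action<object> callback) =>
        AddRequestAsync(new { Method = "getBalance", Params = address, Callback = callback });

    // 3. AddRequestAsync calls AddAsync and awaits it.
    private Task AddRequestAsync(object request) => AddAsync(request);

    // 2. AddAsync conditionally flushes (here: when an auto-execute threshold is hit) and awaits it.
    private async Task AddAsync(object request)
    {
        _reqs.Add(request);
        if (_reqs.Count >= _autoExecuteThreshold)
            await FlushAsync();
    }

    // 1. FlushAsync awaits the existing ExecuteAsync instead of blocking on it.
    private async Task FlushAsync()
    {
        await ExecuteAsync();
        _reqs.Clear();
    }

    private Task ExecuteAsync() => Task.Delay(10); // stand-in for the real batched HTTP call
}
```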

As for the concurrency issues, I think we could solve them with a semaphore that protects the `_reqs` collection (in `Add`, `Clear` and `.Count`). I think this should be enough, and it would make the batcher thread safe (it might require some other small changes).
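Something like this, as a sketch of the locking pattern only (the surrounding class is simplified):

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Sketch of guarding the shared _reqs collection with a SemaphoreSlim; the class shape is illustrative.
public class ThreadSafeBatchSketch
{
    private readonly SemaphoreSlim _lock = new(1, 1);
    private readonly List<object> _reqs = new();

    public async Task AddAsync(object request)
    {
        await _lock.WaitAsync();           // async wait, so nothing blocks the single Blazor WASM thread
        try { _reqs.Add(request); }
        finally { _lock.Release(); }
    }

    public async Task ClearAsync()
    {
        await _lock.WaitAsync();
        try { _reqs.Clear(); }
        finally { _lock.Release(); }
    }

    public async Task<int> CountAsync()
    {
        await _lock.WaitAsync();
        try { return _reqs.Count; }
        finally { _lock.Release(); }
    }
}
```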

I can work on a prototype of this later today.

tiago18c avatar Feb 09 '22 13:02 tiago18c