
Appending more than 100 events at once on a fresh connection fails. (Batch append?)

Open FeepingCreature opened this issue 2 years ago • 1 comment

Consider this test program with @eventstore/db-client 3.4.0 and eventstore:21.10.7-bionic:

const {
    EventStoreDBClient,
    jsonEvent,
} = require("@eventstore/db-client");

const client = EventStoreDBClient.connectionString('esdb://localhost:2113?tls=false')
const streamName = "PerformanceTestNodeJsGrpc";
const numberOfEvents = 1000;

function oneKiloByteEvent() {
    return jsonEvent({
        type: "AnyEventType",
        data: {
            value: 'A'.repeat(904)
        },
    });
}

(async () => {
    try {
        var requests = [];
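        // Queue every append without awaiting it, so all requests are initiated in the same event loop tick.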
        for (var i = 0; i < numberOfEvents; i++) {
            requests.push(client.appendToStream(streamName, [oneKiloByteEvent()]));
        }
        await Promise.all(requests);
        console.log('Done.');
    } catch (e) {
        console.log(`append failed: ${e}`);
    }
})();

This successfully appends 100 events and then hangs forever.

If I kill the program, EventStoreDB complains about lots of lost HTTP/2 connections.

If I add this code before the loop:

await client.appendToStream(streamName, [oneKiloByteEvent()]);

Then the loop completes.

I suspect there's a timing issue where the client tries to start up a batch connection for every event.

FeepingCreature · Aug 10 '22 05:08

Hi @FeepingCreature, thanks for the issue.

> I suspect there's a timing issue where the client tries to start up a batch connection for every event.

Yep, that's exactly what is happening: the cache of batch streams is only populated after the connection has been made (which is async), so because you were initiating multiple requests in the same event loop tick, a new batch stream was being created for each one.

I've changed the caching logic to cache a promise of the batch append stream, so it only gets created once.

This uncovered another issue: the backpressure logic assumed that only one call was using the stream at a time, so it was flooding the connection after the first flush. I've also fixed that.

You can find the PR here: https://github.com/EventStore/EventStore-Client-NodeJS/pull/305
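
For illustration, here is a minimal sketch of that promise-caching pattern in plain JavaScript. The class and member names are made up for the example and are not the actual client internals:

class BatchAppendStreamCache {
  #streamPromise = null;

  getStream() {
    // Cache the promise itself rather than the resolved stream, so callers
    // arriving in the same event loop tick all share one in-flight creation
    // instead of each opening its own connection.
    if (!this.#streamPromise) {
      this.#streamPromise = this.#createStream();
    }
    return this.#streamPromise;
  }

  async #createStream() {
    // Stand-in for the async work of opening the gRPC batch-append stream.
    return { write: (events) => { /* ... */ } };
  }
}

const cache = new BatchAppendStreamCache();
// Both calls resolve to the same stream instance.
Promise.all([cache.getStream(), cache.getStream()]).then(([a, b]) => {
  console.log(a === b); // true
});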



If you want to eke out some more performance, batching the appends will give you the best results:

For 100,000 events (after #305):

Code in issue
const {
    EventStoreDBClient,
    jsonEvent,
} = require("@eventstore/db-client");

const client = EventStoreDBClient.connectionString('esdb://localhost:2113?tls=false')
const streamName = "PerformanceTestNodeJsGrpc";
const numberOfEvents = 100_000;

function oneKiloByteEvent() {
    return jsonEvent({
        type: "AnyEventType",
        data: {
            value: 'A'.repeat(904)
        },
    });
}

(async () => {
    try {
        var requests = [];
        for (var i = 0; i < numberOfEvents; i++) {
            requests.push(client.appendToStream(streamName, [oneKiloByteEvent()]));
        }
        await Promise.all(requests);
        console.log('Done.');
    } catch (e) {
        console.log(`append failed: ${e}`);
    }
})();

Takes about 38s

Batches of 200, awaiting each call
const { EventStoreDBClient, jsonEvent } = require("@eventstore/db-client");

const client = EventStoreDBClient.connectionString(
  "esdb://localhost:2113?tls=false"
);
const streamName = "PerformanceTestNodeJsGrpc";
const numberOfEvents = 100_000;
const batchSize = 200;

function oneKiloByteEvent() {
  return jsonEvent({
    type: `event`,
    data: {
      value: "A".repeat(904),
    },
  });
}

(async () => {
  try {
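    // Append sequentially in batches of 200, awaiting each call so only one append is in flight at a time.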
    for (let i = 0; i < numberOfEvents / batchSize; i++) {
      await client.appendToStream(
        streamName,
        Array.from({ length: batchSize }, oneKiloByteEvent),
        {
          deadline: 10_000_000,
        }
      );
    }
    console.log('Done.');
  } catch (e) {
    console.log(`append failed: ${e}`, e);
  }

  await client.dispose();
})();

Takes about 8s

George-Payne · Aug 10 '22 15:08