
Very slow

Open marcj opened this issue 4 years ago • 6 comments

Describe the bug

I ran a small Request/Reply benchmark and found it to be very slow, at only 7k messages/s.

Reproducing

// server
const zmq = require("zeromq");

async function run() {
    const sock = new zmq.Reply();

    await sock.bind("ipc:///tmp/zero.sock");

    // Echo every request back to the client.
    for await (const [msg] of sock) {
        await sock.send(msg);
    }
}

run();
// client
import {performance} from 'perf_hooks';

async function bench(times: number, title: string, exec: () => void | Promise<void>) {
    const start = performance.now();
    for (let i = 0; i < times; i++) {
        await exec();
    }
    const took = performance.now() - start;

    process.stdout.write([
        (1000 / took) * times, 'ops/s',
        title,
        took.toLocaleString(undefined, {maximumFractionDigits: 17}), 'ms,',
        process.memoryUsage().rss / 1024 / 1024, 'MB memory'
    ].join(' ') + '\n');
}

const zmq = require('zeromq');

async function run() {
    const client = new zmq.Request;
    client.connect('ipc:///tmp/zero.sock');

    await bench(10_000, 'zeromq', async () => {
        await client.send(Buffer.alloc(2));
        const [result] = await client.receive();
    });
}

run();

Run both, and see the output:

7186.703214043314 ops/s zeromq 1,391.4586009979248 ms, 27.9375 MB memory

Expected behavior

Expected behavior is a much higher rate, in the hundreds of thousands, considering that zeromq advertises millions of messages/s. At the very least it should match Node's net module, which reaches 44k messages/s on the same machine (single-threaded, whereas the benchmark above uses two processes, so zeromq should be even faster than that).

Tested on

  • OS: macOS 10.15.5
  • ZeroMQ.js version: 6.0.0-beta.2
  • Node 14.5.0

Am I doing anything incorrectly here? Or are my assumptions wrong?

marcj avatar Aug 14 '20 12:08 marcj

How many sockets are you using? The request socket is sequential in its sends (await client.send actually waits for the response, not just the send). So to me, your example looks to be trying to send and receive every message before sending the next (which is naturally a lot slower than just sending without awaiting a response).
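For what it's worth, the gap between one-in-flight and many-in-flight round trips can be illustrated without zeromq at all. Below is a minimal, dependency-free sketch; the 1 ms `echo` latency is a made-up stand-in for a transport round trip (with zeromq.js you would typically use Dealer/Router sockets to keep multiple requests in flight, since Request/Reply enforces strict send/receive alternation):

```javascript
// Dependency-free sketch: why awaiting every round trip caps throughput.
// `echo` simulates a transport with ~1 ms round-trip latency (hypothetical).
const echo = (msg) => new Promise((resolve) => setTimeout(() => resolve(msg), 1));

async function sequential(n) {
    const start = Date.now();
    for (let i = 0; i < n; i++) {
        await echo(i); // only one request in flight at a time
    }
    return Date.now() - start;
}

async function pipelined(n) {
    const start = Date.now();
    // All n requests in flight at once; total time is roughly one round trip.
    await Promise.all(Array.from({ length: n }, (_, i) => echo(i)));
    return Date.now() - start;
}

async function main() {
    const n = 100;
    const seq = await sequential(n);  // roughly n x RTT
    const pipe = await pipelined(n);  // roughly 1 x RTT
    console.log(`sequential: ${seq} ms, pipelined: ${pipe} ms`);
}

main();
```

Awaiting each reply pays the full round-trip time per message, so throughput is capped at 1/RTT no matter how fast the transport is; pipelining amortizes that cost across many messages.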

OliverNChalk avatar Aug 17 '20 10:08 OliverNChalk

How many sockets are you using?

One. I posted the full code to reproduce.

which is naturally a lot slower than just sending without awaiting a response

Yes, I know. That's on purpose for this benchmark: to see the realistic round-trip latency a single client gets when doing sequential work (like requesting a value from the server, which is essentially what the code above does: req/rep). I'm not interested in benchmarking thousands of connections, as that would be misleading. I'm interested in what a single user/connection can expect as total latency (which won't decrease by adding more connections). The benchmark tells me how much latency zeromq adds to every request a user makes to my HTTP endpoint when handling that request involves additional zeromq requests. Currently one zeromq request (req/rep) adds an overhead of ~0.1 ms, which is not acceptable for me (I'm in the context of websocket streams, where every tenth of a millisecond adds substantial overhead), especially since each user request requires several req/rep round trips to the server.
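For reference, that ~0.1 ms figure follows directly from the measured rate; the cost of one round trip in milliseconds is just the inverse of ops/s (the `msPerOp` helper below is illustrative, not part of the benchmark):

```javascript
// Per-request round-trip time implied by the measured throughput.
const msPerOp = (opsPerSec) => 1000 / opsPerSec;

console.log(msPerOp(7186.7));   // zeromq.js: ~0.14 ms per round trip
console.log(msPerOp(35032.7));  // net module: ~0.03 ms per round trip
```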

Doing the same thing with the net module yields:

35032.66048690418 ops/s net module 285.4479180574417 ms, 32.46875 MB memory

As you can see, that is a lot faster (5x). A reference implementation in C++ that does exactly the same reaches up to 50k pps (UDP up to 60k), so only 1.7x faster than the net module. Of course zeromq's req/rep protocol has some overhead for parsing the message as a whole, but it shouldn't be this slow (especially since that happens in C/C++, AFAIK).

It seems certain things are not implemented correctly and create a substantial overhead that slows simple use cases down.

Reproduction code for the `net` module:
// client
import {createConnection} from 'net';
import {performance} from 'perf_hooks';

async function bench(times: number, title: string, exec: () => void | Promise<void>) {
    const start = performance.now();
    for (let i = 0; i < times; i++) {
        await exec();
    }
    const took = performance.now() - start;

    process.stdout.write([
        (1000 / took) * times, 'ops/s',
        title,
        took.toLocaleString(undefined, {maximumFractionDigits: 17}), 'ms,',
        process.memoryUsage().rss / 1024 / 1024, 'MB memory'
    ].join(' ') + '\n');
}

async function main() {
    const client = createConnection('/tmp/net.sock');
    await bench(10_000, 'net module', async () => {
        await new Promise<void>((resolve) => {
            client.once('data', () => {
                resolve();
            });
            client.write(Buffer.alloc(2));
        });
    });
}

main()
// server
import {createServer} from 'net';

const net = createServer();
net.on('connection', (socket) => {
    socket.on('data', (message: Buffer) => {
        socket.write(message);
    });
    socket.on('error', (error) => {
        console.log('error jo', error);
    });
});

net.listen('/tmp/net.sock', async () => {
    console.log('running');
});

marcj avatar Aug 17 '20 11:08 marcj

Using your examples I get the following results:

20160.827192609595 ops/s zeromq 496.0113940000301 ms, 42.36328125 MB memory
41917.7431192103 ops/s net module 238.56246199994348 ms, 41.078125 MB memory

OliverNChalk avatar Aug 17 '20 14:08 OliverNChalk

@OliverNChalk That's better, thanks for trying it out. Unfortunately it's still way slower than the net module. I just tried again and it still hovers around 7k ops/s for me. Which zeromq.js version are you using? Node version? OS?

marcj avatar Aug 17 '20 14:08 marcj

  • OS: Ubuntu 20.04
  • ZeroMQ.js version: 6.0.0-beta.6
  • Node 14.4.0

OliverNChalk avatar Aug 18 '20 14:08 OliverNChalk

v6 has been released. Please upgrade to the latest version and report back if the issue persists.

aminya avatar Sep 16 '24 05:09 aminya