
segfault in sails app on Node 18

Open stealthrabbi opened this issue 2 years ago • 4 comments

Environment Information

  • OS: Alpine 3.17
  • Node version: 18
  • NPM version: (not provided)
  • C++ toolchain: (not provided)
  • node-rdkafka version: 2.15.0

I have a Sails app that creates a producer in api/service, but I immediately get a segfault when the producer tries to connect to an accessible host. I do not get this when running the same app on a dev system that is not using Alpine.

I installed the system dependencies outlined in the Docker README of this repo for Alpine.

stealthrabbi avatar May 02 '23 13:05 stealthrabbi

This may be the same issue I reported here a while back: https://github.com/Blizzard/node-rdkafka/issues/891

stealthrabbi avatar May 02 '23 13:05 stealthrabbi

I ran into a very similar problem when security.protocol is specified (SSL or SASL_SSL). I tried the segfault-handler library to debug what causes the segmentation fault, without success (no logs).

With Node 16, after running npm rebuild, the problem goes away. With Node 18, after running npm rebuild, the segmentation fault occurs as soon as producer.connect() is called.

I believe it has something to do with the OpenSSL version, as Node 18 uses OpenSSL 3.0 whereas Node 16 uses OpenSSL 1.1.1.

I have checked librdkafka (v2.1.1) and it seems to use OpenSSL 3.0, so I'm very confused.

hin-fan-alt avatar May 05 '23 21:05 hin-fan-alt

My team has been seeing this issue too, and has honestly just resorted to moving to kafkajs, but it definitely doesn't have the performance characteristics of node-rdkafka.

ebachle avatar May 15 '23 16:05 ebachle

I tested with the Alpine image without issues whatsoever:

FROM node:18-alpine

RUN apk --no-cache add \
      bash \
      g++ \
      ca-certificates \
      lz4-dev \
      musl-dev \
      cyrus-sasl-dev \
      openssl-dev \
      make \
      python3
...
const Kafka = require('node-rdkafka');

const HOST_PORT = 'REPLACE_WITH_YOUR_KAFKA_BROKER_LIST';
const producer = new Kafka.Producer({
    'metadata.broker.list': HOST_PORT,
    'security.protocol': 'ssl',
    'ssl.key.location': 'ssl/service.key',
    'ssl.certificate.location': 'ssl/service.cert',
    'ssl.ca.location': 'ssl/ca.pem'
});

producer.connect(undefined, (err, data) => {
    if (err) {
        console.error(err);
    } else {
        console.log(JSON.stringify(data, null, 2));
    }
    process.exit(0);
});

You might need to force your Docker build to clean the cache (docker build --no-cache ...) to ensure the latest versions of the build libraries are installed.

iradul avatar May 31 '23 03:05 iradul