
No provider for SASL mechanism GSSAPI

Open · milosb793 opened this issue on Jun 25, 2021

Description

Hey guys! I've seen that this issue has been opened a couple of times before, but none of the suggested fixes worked in my scenario. When I try to create a consumer using SASL, I get the following error:

cimpl.KafkaException: KafkaError{code=_INVALID_ARG,val=-186,str="Failed to create consumer: No provider for SASL mechanism GSSAPI: recompile librdkafka with libsasl2 or openssl support. Current build options: PLAIN SASL_SCRAM OAUTHBEARER"}
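One quick way to confirm which librdkafka build the client is actually loading (the prebuilt wheels bundle their own librdkafka, which is typically built without GSSAPI/Kerberos support, so a system build can be shadowed) is a check like this, assuming the package imports at all:

import confluent_kafka

print(confluent_kafka.version())     # Python client version
print(confluent_kafka.libversion())  # version of the librdkafka that is actually linked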

As already mentioned, I installed librdkafka from source (after first uninstalling the version that was installed via apt), using:

sudo ./configure --install-deps --source-deps-only    (also tried: sudo ./configure --prefix=/usr --install-deps)
sudo make
sudo make install

Everything there looks fine except this warning:

 The following libraries were not available as static libraries and need to be linked dynamically: -lm -lsasl2 -lssl -lcrypto -lz -ldl -lpthread -lrt -lpthread -lrt
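For GSSAPI to appear in librdkafka's build options, the Cyrus SASL development headers have to be present before ./configure runs. A minimal sketch for a Debian/Ubuntu host follows; the package names are assumptions and may differ on other distributions:

sudo apt-get install -y libsasl2-dev libsasl2-modules-gssapi-mit libssl-dev zlib1g-dev
./configure          # re-run after installing the headers; the summary should now list GSSAPI
make
sudo make install
sudo ldconfig        # refresh the dynamic linker cache so the new build is picked up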

Then, when I try installing confluent-kafka with pip install --no-binary :all: confluent-kafka --upgrade --ignore-installed, I get this error:

Wild error text
Collecting confluent-kafka
  Using cached confluent-kafka-1.7.0.tar.gz (103 kB)
Skipping wheel build for confluent-kafka, due to binaries being disabled for it.
Installing collected packages: confluent-kafka
    Running setup.py install for confluent-kafka ... error
    ERROR: Command errored out with exit status 1:
     command: /.env/bin/python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/setup.py'"'"'; __file__='"'"'/tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-z2x3h92m/install-record.txt --single-version-externally-managed --compile --install-headers /.env/include/site/python3.6/confluent-kafka
         cwd: /tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/
    Complete output (50 lines):
    running install
    running build
    running build_py
    creating build
    creating build/lib.linux-x86_64-3.6
    creating build/lib.linux-x86_64-3.6/confluent_kafka
    copying src/confluent_kafka/deserializing_consumer.py -> build/lib.linux-x86_64-3.6/confluent_kafka
    copying src/confluent_kafka/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka
    copying src/confluent_kafka/error.py -> build/lib.linux-x86_64-3.6/confluent_kafka
    copying src/confluent_kafka/serializing_producer.py -> build/lib.linux-x86_64-3.6/confluent_kafka
    creating build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
    copying src/confluent_kafka/schema_registry/schema_registry_client.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
    copying src/confluent_kafka/schema_registry/protobuf.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
    copying src/confluent_kafka/schema_registry/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
    copying src/confluent_kafka/schema_registry/error.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
    copying src/confluent_kafka/schema_registry/json_schema.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
    copying src/confluent_kafka/schema_registry/avro.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
    creating build/lib.linux-x86_64-3.6/confluent_kafka/serialization
    copying src/confluent_kafka/serialization/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/serialization
    creating build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
    copying src/confluent_kafka/kafkatest/verifiable_consumer.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
    copying src/confluent_kafka/kafkatest/verifiable_producer.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
    copying src/confluent_kafka/kafkatest/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
    copying src/confluent_kafka/kafkatest/verifiable_client.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
    creating build/lib.linux-x86_64-3.6/confluent_kafka/avro
    copying src/confluent_kafka/avro/cached_schema_registry_client.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
    copying src/confluent_kafka/avro/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
    copying src/confluent_kafka/avro/error.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
    copying src/confluent_kafka/avro/load.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
    creating build/lib.linux-x86_64-3.6/confluent_kafka/admin
    copying src/confluent_kafka/admin/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/admin
    creating build/lib.linux-x86_64-3.6/confluent_kafka/avro/serializer
    copying src/confluent_kafka/avro/serializer/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro/serializer
    copying src/confluent_kafka/avro/serializer/message_serializer.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro/serializer
    running build_ext
    building 'confluent_kafka.cimpl' extension
    creating build/temp.linux-x86_64-3.6
    creating build/temp.linux-x86_64-3.6/tmp
    creating build/temp.linux-x86_64-3.6/tmp/pip-install-he10etey
    creating build/temp.linux-x86_64-3.6/tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5
    creating build/temp.linux-x86_64-3.6/tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/src
    creating build/temp.linux-x86_64-3.6/tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/src/confluent_kafka
    creating build/temp.linux-x86_64-3.6/tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/src/confluent_kafka/src
    x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fdebug-prefix-map=/build/python3.6-CZCBJL/python3.6-3.6.13=. -specs=/usr/share/dpkg/no-pie-compile.specs -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/.env/include -I/usr/include/python3.6m -c /tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/src/confluent_kafka/src/confluent_kafka.c -o build/temp.linux-x86_64-3.6/tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/src/confluent_kafka/src/confluent_kafka.o
    In file included from /tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/src/confluent_kafka/src/confluent_kafka.c:17:
    /tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/src/confluent_kafka/src/confluent_kafka.h:18:10: fatal error: Python.h: No such file or directory
       18 | #include <Python.h>
          |          ^~~~~~~~~~
    compilation terminated.
    error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
    ----------------------------------------
ERROR: Command errored out with exit status 1: /.env/bin/python -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/setup.py'"'"'; __file__='"'"'/tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-z2x3h92m/install-record.txt --single-version-externally-managed --compile --install-headers /.env/include/site/python3.6/confluent-kafka Check the logs for full command output.

Please help!

Checklist

Please provide the following information:

  • [x] confluent-kafka-python and librdkafka version (confluent_kafka.version() and confluent_kafka.libversion()): ('1.7.0', 17235968) and ('1.7.0', 17236223)
  • [x] Apache Kafka broker version: 2.8.0 (I think)
  • [x] Client configuration: see "Client Conf" below
  • [x] Operating system: Ubuntu 20.10
  • [ ] Provide client logs (with 'debug': '..' as necessary)
  • [ ] Provide broker log excerpts
  • [x] Critical issue

Client Conf:

"bootstrap.servers": [
    ...
],
"group.id": "group1", 
"auto.offset.reset": "earliest",
"enable.auto.commit": False,
"max.poll.interval.ms": "300000" #ms
"compression.codec": "snappy",
# Auth
"security.protocol": "sasl_ssl",
"ssl.ca.location": "../var/certs/ca_public.crt",
"ssl.certificate.location": "../var/certs/client.crt",
"ssl.key.location": "../var/certs/client.key",
"enable.ssl.certificate.verification": True,

milosb793 · Jun 25, 2021

Thanks for asking.

From the log, this is the actual error:

/tmp/pip-install-he10etey/confluent-kafka_8b8390e6947748c89e1ebb24128366b5/src/confluent_kafka/src/confluent_kafka.h:18:10: fatal error: Python.h: No such file or directory
       18 | #include <Python.h>
          |          ^~~~~~~~~~
    compilation terminated.

It looks like the header files and static libraries for Python development are not properly installed. You can try the following commands:

sudo yum install python-devel    # for python2.x installs
sudo yum install python3-devel   # for python3.x installs

Refer to https://stackoverflow.com/questions/21530577/fatal-error-python-h-no-such-file-or-directory
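Since the reporter is on Ubuntu 20.10 rather than a yum-based distribution, the apt equivalents would be roughly as follows (a sketch, assuming the system Python 3 is the interpreter building the extension):

sudo apt-get install -y python3-dev build-essential
# then retry the from-source install of the client:
pip install --no-binary :all: confluent-kafka --upgrade --ignore-installed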

jliunyu · Mar 24, 2022

Closing as a solution was provided. Please open another issue if this persists.

nhaq-confluent · Mar 6, 2024