fluent-plugin-kafka
fluent-plugin-kafka is not able to connect to Kafka brokers. OpenSSL error: dh key too small
Describe the bug
After upgrading the Kafka broker's Java version from 1.8 to OpenJDK 9, I get the following error when connecting to the broker to send logs:
2022-06-24 12:48:02 +0000 [warn]: failed to flush the buffer. retry_times=25 next_retry_time=2022-06-24 12:49:29 +0000 chunk="5e230939a9cdaf21474dca6a50554978" error_class=OpenSSL::SSL::SSLError error="SSL_connect returned=1 errno=0 state=error: dh key too small"
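(Note: "dh key too small" is raised by the client's OpenSSL during the TLS handshake, not by ruby-kafka itself. On RHEL 8 the DEFAULT system-wide crypto policy rejects ephemeral Diffie-Hellman keys shorter than 2048 bits, so the connection fails as soon as the broker offers a smaller DHE key; presumably the JDK upgrade changed which cipher suite or DH key size the broker negotiates.)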
To Reproduce
Upgrade the Kafka broker's Java version from 1.8 to OpenJDK 9, then have fluent-plugin-kafka connect to the broker over SSL.
Expected behavior
fluent-plugin-kafka should connect to the Kafka brokers without any problems.
Your Environment
- Fluentd version: 1.14.6
- fluent-plugin-kafka version: 0.17.5
- ruby-kafka version: 1.5.0
- Operating system: Red Hat Enterprise Linux 8.6 (Ootpa)
- Kernel version: 4.15.0-180-generic
- Ruby version: ruby-3.0.2-140.module+el8.5.0+12856+0c654ebc.x86_64
- Kafka version on the cluster: 2.8.1
Your Configuration
<match **>
@type kafka2
brokers
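(The block above is truncated in the report. For reference, a minimal kafka2 TLS configuration typically looks like the sketch below; the broker addresses, topic, and certificate paths are placeholders, not values from this issue.)

<match **>
  @type kafka2
  brokers broker1.example.com:9093,broker2.example.com:9093  # placeholder addresses
  default_topic logs                                         # placeholder topic
  ssl_ca_cert /path/to/ca.crt                                # CA used to verify the broker
  ssl_client_cert /path/to/client.crt                        # only if brokers require client auth
  ssl_client_cert_key /path/to/client.key
  <format>
    @type json
  </format>
</match>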
Your Error Log
2022-06-28 18:36:39 +0000 [warn]: #5 Send exception occurred: SSL_connect returned=1 errno=0 state=error: dh key too small
2022-06-28 18:36:39 +0000 [warn]: #5 Exception Backtrace : /usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/ssl_socket_with_timeout.rb:69:in `connect_nonblock'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/ssl_socket_with_timeout.rb:69:in `initialize'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/connection.rb:130:in `new'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/connection.rb:130:in `open'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/connection.rb:101:in `block in send_request'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/instrumenter.rb:23:in `instrument'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/connection.rb:100:in `send_request'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/sasl_authenticator.rb:65:in `authenticate!'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/connection_builder.rb:27:in `build_connection'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/broker.rb:214:in `connection'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/broker.rb:200:in `send_request'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/broker.rb:44:in `fetch_metadata'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/cluster.rb:432:in `block (2 levels) in fetch_cluster_info'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/cluster.rb:425:in `each'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/cluster.rb:425:in `block in fetch_cluster_info'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/cluster.rb:424:in `each'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/cluster.rb:424:in `fetch_cluster_info'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/cluster.rb:405:in `cluster_info'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/cluster.rb:105:in `refresh_metadata!'
/usr/local/share/gems/gems/ruby-kafka-1.5.0/lib/kafka/cluster.rb:59:in `add_target_topics'
/usr/local/share/gems/gems/fluent-plugin-kafka-0.17.5/lib/fluent/plugin/kafka_producer_ext.rb:93:in `initialize'
/usr/local/share/gems/gems/fluent-plugin-kafka-0.17.5/lib/fluent/plugin/kafka_producer_ext.rb:60:in `new'
/usr/local/share/gems/gems/fluent-plugin-kafka-0.17.5/lib/fluent/plugin/kafka_producer_ext.rb:60:in `topic_producer'
/usr/local/share/gems/gems/fluent-plugin-kafka-0.17.5/lib/fluent/plugin/out_kafka2.rb:246:in `write'
/usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/plugin/output.rb:1179:in `try_flush'
/usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/plugin/output.rb:1500:in `flush_thread_run'
/usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/plugin/output.rb:499:in `block (2 levels) in start'
/usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/plugin_helper/thread.rb:78:in `block in thread_create'
Additional context
Previously we were using only ssl_ca_cert, and that worked fine with Java 1.8, but after the upgrade to OpenJDK 9 it started throwing this error. We also tried using signed certs, but that did not solve the issue.
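A hedged note for others who hit this: the usual fixes for "dh key too small" are broker-side. Something like the sketch below (not taken from this issue; the cipher list and property are illustrative and should be checked against the JDK in use) either avoids DHE key exchange entirely or forces 2048-bit ephemeral DH keys:

# Broker-side option 1 (server.properties): pin ECDHE cipher suites so no
# DHE key is negotiated at all; standard JSSE suite names shown.
ssl.cipher.suites=TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256

# Broker-side option 2 (JVM system property, e.g. set via KAFKA_OPTS before
# starting the broker): keep DHE but use 2048-bit ephemeral keys, which
# RHEL 8's DEFAULT crypto policy accepts.
# export KAFKA_OPTS="-Djdk.tls.ephemeralDHKeySize=2048"

As a client-side stopgap, RHEL 8 also allows relaxing the OpenSSL requirement with update-crypto-policies --set LEGACY, though loosening the crypto policy system-wide is usually less desirable than fixing the broker.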
This issue has been automatically marked as stale because it has been open 90 days with no activity. Remove the stale label or comment, or this issue will be closed in 30 days.
This issue was automatically closed because it remained stale for 30 days.