help request: kafka-logger cannot send messages to Kafka. It creates the topic but cannot send logs.
Description
kafka-logger does not send the logs to Kafka. I have tried two different environments.
Environment
- APISIX version (run apisix version): 3.14.1-0
- Operating system (run uname -a): Kubernetes and Ubuntu
- OpenResty / Nginx version (run openresty -V or nginx -V): 1.27.1.2
- etcd version, if relevant (run curl http://127.0.0.1:9090/v1/server_info): etcd Version: 3.6.6
- APISIX Dashboard version, if relevant:
- Plugin runner version, for issues related to plugin runners: embedded
- LuaRocks version, for installation issues (run luarocks --version):
2025/12/09 22:30:25 [error] 307575#307575: *2556984 [lua] batch-processor.lua:96: Batch Processor[kafka logger] failed to process entries: failed to send data to Kafka topic: closed, brokers: null, context: ngx.timer, client: 127.0.0.1, server: 0.0.0.0:9080
2025/12/09 22:30:25 [error] 307575#307575: *2556984 [lua] batch-processor.lua:106: Batch Processor[kafka logger] exceeded the max_retry_count[1] dropping the entries, context: ngx.timer, client: 127.0.0.1, server: 0.0.0.0:9080
Hi @edib, please share the steps and configurations here so we can try to reproduce it.
In Kubernetes, I have used the following Helm chart: https://apache.github.io/apisix-helm-chart/docs/en/latest/apisix.html. Routes etc. are clearly working. I have created an httpbin service, and it works too. Then I added the following configuration for the kafka-logger plugin.
{
"kafka_topic": "test",
"name": "kafka logger",
"brokers": [
{
"port": 9092,
"host": "kafka-kafka-node-pool-0.kafka-kafka-brokers.apisix.svc"
}
],
"producer_type": "async",
"key": "key1",
"batch_max_size": 5
}
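For completeness, the plugin is attached to the route via the Admin API, roughly like this (the route id, URI, upstream, Admin API port 9180 and admin key below are placeholders, not my exact values):
# Attach the kafka-logger object above to a route; everything except the
# kafka-logger object itself is a placeholder
curl -X PUT http://127.0.0.1:9180/apisix/admin/routes/1 \
  -H "X-API-KEY: ${ADMIN_API_KEY}" \
  -d '{
    "uri": "/anything/*",
    "upstream": {
      "type": "roundrobin",
      "nodes": { "httpbin.default.svc.cluster.local:80": 1 }
    },
    "plugins": {
      "kafka-logger": {
        "kafka_topic": "test",
        "name": "kafka logger",
        "brokers": [
          {
            "host": "kafka-kafka-node-pool-0.kafka-kafka-brokers.apisix.svc",
            "port": 9092
          }
        ],
        "producer_type": "async",
        "key": "key1",
        "batch_max_size": 5
      }
    }
  }'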
2025/12/10 09:32:56 [error] 56#56: *1128725 [lua] client.lua:210: _fetch_metadata(): all brokers failed in fetch topic metadata, context: ngx.timer, client: 10.233.102.128, server: 0.0.0.0:9080
2025/12/10 09:26:53 [error] 70#70: *1085195 [lua] batch-processor.lua:96: Batch Processor[kafkalogger] failed to process entries: failed to send data to Kafka topic: closed, brokers: null, context: ngx.timer, client: 10.233.102.128, server: 0.0.0.0:9080
2025/12/10 09:26:53 [error] 70#70: *1085195 [lua] batch-processor.lua:106: Batch Processor[kafkalogger] exceeded the max_retry_count[1] dropping the entries, context: ngx.timer, client: 10.233.102.128, server: 0.0.0.0:9080
These are the logs. This is a Strimzi Kafka installation: a single node with no auth, for test purposes, in the same namespace as APISIX. I have also tested Kafka via a Kafka UI and kafka-console-producer.sh; Kafka itself is working. I have also tried the Kafka pod and bootstrap IP addresses as broker hosts. I additionally tried it on a Linux machine with a systemd installation, where everything runs on localhost, and I am facing the same error.
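The console-producer check was along these lines (the Strimzi image tag and the bootstrap service name below are approximations, not copied from my setup):
# Run a throwaway producer pod in the same namespace and type a few test messages;
# adjust the image tag to the installed Strimzi/Kafka version
kubectl -n apisix run kafka-producer-test --rm -it --restart=Never \
  --image=quay.io/strimzi/kafka:latest-kafka-3.8.0 -- \
  bin/kafka-console-producer.sh \
  --bootstrap-server kafka-kafka-bootstrap.apisix.svc:9092 \
  --topic test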
Hi @edib, can you test the connectivity of the Kafka service from the APISIX pod? For example, using tools like telnet or nc.
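Something along these lines, assuming the APISIX deployment is named apisix in the apisix namespace and the image has nc available:
# Check raw TCP reachability of the broker from inside an APISIX pod
kubectl -n apisix exec -it deploy/apisix -- \
  nc -zv kafka-kafka-node-pool-0.kafka-kafka-brokers.apisix.svc 9092
# If nc is not present in the image, an ephemeral debug container
# (kubectl debug) with a busybox/netshoot image can be used instead.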