opensearch-connector-for-apache-kafka
NoClassDefFoundError
Hi,
I'm using the Strimzi Kafka operator to deploy a Kafka cluster.
I built the connector JAR file and also built a Docker image with this library:
FROM quay.io/strimzi/kafka:0.29.0-kafka-3.2.0
USER root:root
COPY opensearch-connector-for-apache-kafka-2.0.2.jar /opt/kafka/libs/
USER 1001
In case it is important, here is the connector config:
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnector
metadata:
  name: opensearch-sink
  labels:
    strimzi.io/cluster: kafka-operator
spec:
  class: io.aiven.kafka.connect.opensearch.OpensearchSinkConnector
  tasksMax: 1
  config:
    connection.url: http://opensearch-cluster-master.opensearch:9200
    connection.username: "admin"
    connection.password: "xxx"
    max.retries: 5
    topics: "nginx"
When I add the connector, I see a log like this:
2022-08-01 12:30:42,934 INFO [Worker clientId=connect-1, groupId=connect-cluster] Connector opensearch-sink config updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder) [KafkaBasedLog Work Thread - connect-cluster-configs]
2022-08-01 12:30:42,935 INFO [Worker clientId=connect-1, groupId=connect-cluster] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,935 INFO [Worker clientId=connect-1, groupId=connect-cluster] (Re-)joining group (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,940 INFO [Worker clientId=connect-1, groupId=connect-cluster] Successfully joined group with generation Generation{generationId=66, memberId='connect-1-cb44525c-c1f7-490a-a8fd-2e549fa0f56c', protocol='sessioned'} (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,947 INFO [Worker clientId=connect-1, groupId=connect-cluster] Successfully synced group in generation Generation{generationId=66, memberId='connect-1-cb44525c-c1f7-490a-a8fd-2e549fa0f56c', protocol='sessioned'} (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,947 INFO [Worker clientId=connect-1, groupId=connect-cluster] Joined group at generation 66 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-cb44525c-c1f7-490a-a8fd-2e549fa0f56c', leaderUrl='http://10.233.41.150:8083/', offset=22, connectorIds=[opensearch-sink], taskIds=[], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,947 INFO [Worker clientId=connect-1, groupId=connect-cluster] Starting connectors and tasks using config offset 22 (org.apache.kafka.connect.runtime.distributed.DistributedHerder) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,948 INFO [Worker clientId=connect-1, groupId=connect-cluster] Starting connector opensearch-sink (org.apache.kafka.connect.runtime.distributed.DistributedHerder) [StartAndStopExecutor-connect-1-1]
2022-08-01 12:30:42,948 INFO [opensearch-sink|worker] Creating connector opensearch-sink of type io.aiven.kafka.connect.opensearch.OpensearchSinkConnector (org.apache.kafka.connect.runtime.Worker) [StartAndStopExecutor-connect-1-1]
2022-08-01 12:30:42,949 INFO [opensearch-sink|worker] SinkConnectorConfig values:
config.action.reload = restart
connector.class = io.aiven.kafka.connect.opensearch.OpensearchSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = opensearch-sink
predicates = []
tasks.max = 1
topics = [nginx]
topics.regex =
transforms = []
value.converter = null
(org.apache.kafka.connect.runtime.SinkConnectorConfig) [StartAndStopExecutor-connect-1-1]
2022-08-01 12:30:42,949 INFO [opensearch-sink|worker] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = io.aiven.kafka.connect.opensearch.OpensearchSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = opensearch-sink
predicates = []
tasks.max = 1
topics = [nginx]
topics.regex =
transforms = []
value.converter = null
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig) [StartAndStopExecutor-connect-1-1]
2022-08-01 12:30:42,950 INFO [opensearch-sink|worker] Instantiated connector opensearch-sink with version 2.0.2-SNAPSHOT of type class io.aiven.kafka.connect.opensearch.OpensearchSinkConnector (org.apache.kafka.connect.runtime.Worker) [StartAndStopExecutor-connect-1-1]
2022-08-01 12:30:42,950 INFO [opensearch-sink|worker] Finished creating connector opensearch-sink (org.apache.kafka.connect.runtime.Worker) [StartAndStopExecutor-connect-1-1]
2022-08-01 12:30:42,952 INFO [opensearch-sink|worker] OpensearchSinkConnectorConfig values:
batch.size = 2000
behavior.on.malformed.documents = fail
behavior.on.null.values = ignore
behavior.on.version.conflict = fail
compact.map.entries = true
connection.password = [hidden]
connection.timeout.ms = 1000
connection.url = [http://opensearch-cluster-master.opensearch:9200]
connection.username = admin
drop.invalid.message = false
flush.timeout.ms = 10000
key.ignore = false
linger.ms = 1
max.buffered.records = 20000
max.in.flight.requests = 5
max.retries = 5
read.timeout.ms = 3000
retry.backoff.ms = 100
schema.ignore = false
topic.index.map = []
topic.key.ignore = []
topic.schema.ignore = []
(io.aiven.kafka.connect.opensearch.OpensearchSinkConnectorConfig) [connector-thread-opensearch-sink]
2022-08-01 12:30:42,954 INFO [Worker clientId=connect-1, groupId=connect-cluster] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,956 INFO SinkConnectorConfig values:
config.action.reload = restart
connector.class = io.aiven.kafka.connect.opensearch.OpensearchSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = opensearch-sink
predicates = []
tasks.max = 1
topics = [nginx]
topics.regex =
transforms = []
value.converter = null
(org.apache.kafka.connect.runtime.SinkConnectorConfig) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,956 INFO EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = io.aiven.kafka.connect.opensearch.OpensearchSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = opensearch-sink
predicates = []
tasks.max = 1
topics = [nginx]
topics.regex =
transforms = []
value.converter = null
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,972 INFO [Worker clientId=connect-1, groupId=connect-cluster] Tasks [opensearch-sink-0] configs updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder) [KafkaBasedLog Work Thread - connect-cluster-configs]
2022-08-01 12:30:42,974 INFO [Worker clientId=connect-1, groupId=connect-cluster] Handling task config update by restarting tasks [] (org.apache.kafka.connect.runtime.distributed.DistributedHerder) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,975 INFO [Worker clientId=connect-1, groupId=connect-cluster] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,975 INFO [Worker clientId=connect-1, groupId=connect-cluster] (Re-)joining group (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,977 INFO [Worker clientId=connect-1, groupId=connect-cluster] Successfully joined group with generation Generation{generationId=67, memberId='connect-1-cb44525c-c1f7-490a-a8fd-2e549fa0f56c', protocol='sessioned'} (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,985 INFO [Worker clientId=connect-1, groupId=connect-cluster] Successfully synced group in generation Generation{generationId=67, memberId='connect-1-cb44525c-c1f7-490a-a8fd-2e549fa0f56c', protocol='sessioned'} (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,985 INFO [Worker clientId=connect-1, groupId=connect-cluster] Joined group at generation 67 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-cb44525c-c1f7-490a-a8fd-2e549fa0f56c', leaderUrl='http://10.233.41.150:8083/', offset=24, connectorIds=[opensearch-sink], taskIds=[opensearch-sink-0], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,985 INFO [Worker clientId=connect-1, groupId=connect-cluster] Starting connectors and tasks using config offset 24 (org.apache.kafka.connect.runtime.distributed.DistributedHerder) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,985 INFO [Worker clientId=connect-1, groupId=connect-cluster] Starting task opensearch-sink-0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,986 INFO [opensearch-sink|task-0] Creating task opensearch-sink-0 (org.apache.kafka.connect.runtime.Worker) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,986 INFO [opensearch-sink|task-0] ConnectorConfig values:
config.action.reload = restart
connector.class = io.aiven.kafka.connect.opensearch.OpensearchSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = opensearch-sink
predicates = []
tasks.max = 1
transforms = []
value.converter = null
(org.apache.kafka.connect.runtime.ConnectorConfig) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,986 INFO [opensearch-sink|task-0] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = io.aiven.kafka.connect.opensearch.OpensearchSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = opensearch-sink
predicates = []
tasks.max = 1
transforms = []
value.converter = null
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,986 INFO [opensearch-sink|task-0] TaskConfig values:
task.class = class io.aiven.kafka.connect.opensearch.OpensearchSinkTask
(org.apache.kafka.connect.runtime.TaskConfig) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,987 INFO [opensearch-sink|task-0] Instantiated task opensearch-sink-0 with version 2.0.2-SNAPSHOT of type io.aiven.kafka.connect.opensearch.OpensearchSinkTask (org.apache.kafka.connect.runtime.Worker) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,987 INFO [opensearch-sink|task-0] JsonConverterConfig values:
converter.type = key
decimal.format = BASE64
schemas.cache.size = 1000
schemas.enable = true
(org.apache.kafka.connect.json.JsonConverterConfig) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,987 INFO [opensearch-sink|task-0] Set up the key converter class org.apache.kafka.connect.json.JsonConverter for task opensearch-sink-0 using the worker config (org.apache.kafka.connect.runtime.Worker) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,987 INFO [opensearch-sink|task-0] JsonConverterConfig values:
converter.type = value
decimal.format = BASE64
schemas.cache.size = 1000
schemas.enable = true
(org.apache.kafka.connect.json.JsonConverterConfig) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,987 INFO [opensearch-sink|task-0] Set up the value converter class org.apache.kafka.connect.json.JsonConverter for task opensearch-sink-0 using the worker config (org.apache.kafka.connect.runtime.Worker) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,987 INFO [opensearch-sink|task-0] Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task opensearch-sink-0 using the worker config (org.apache.kafka.connect.runtime.Worker) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,988 INFO [opensearch-sink|task-0] Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,988 INFO [opensearch-sink|task-0] SinkConnectorConfig values:
config.action.reload = restart
connector.class = io.aiven.kafka.connect.opensearch.OpensearchSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = opensearch-sink
predicates = []
tasks.max = 1
topics = [nginx]
topics.regex =
transforms = []
value.converter = null
(org.apache.kafka.connect.runtime.SinkConnectorConfig) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,988 INFO [opensearch-sink|task-0] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = io.aiven.kafka.connect.opensearch.OpensearchSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = null
name = opensearch-sink
predicates = []
tasks.max = 1
topics = [nginx]
topics.regex =
transforms = []
value.converter = null
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,988 INFO [opensearch-sink|task-0] ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [home-lab-kafka-bootstrap:9092]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = connector-consumer-opensearch-sink-0
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = connect-opensearch-sink
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.connect.timeout.ms = null
sasl.login.read.timeout.ms = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.login.retry.backoff.max.ms = 10000
sasl.login.retry.backoff.ms = 100
sasl.mechanism = GSSAPI
sasl.oauthbearer.clock.skew.seconds = 30
sasl.oauthbearer.expected.audience = null
sasl.oauthbearer.expected.issuer = null
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000
sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000
sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100
sasl.oauthbearer.jwks.endpoint.url = null
sasl.oauthbearer.scope.claim.name = scope
sasl.oauthbearer.sub.claim.name = sub
sasl.oauthbearer.token.endpoint.url = null
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 45000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.3]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.3
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
(org.apache.kafka.clients.consumer.ConsumerConfig) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,992 WARN [opensearch-sink|task-0] The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,992 WARN [opensearch-sink|task-0] The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,992 INFO [opensearch-sink|task-0] Kafka version: 3.2.0 (org.apache.kafka.common.utils.AppInfoParser) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,992 INFO [opensearch-sink|task-0] Kafka commitId: 38103ffaa962ef50 (org.apache.kafka.common.utils.AppInfoParser) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,992 INFO [opensearch-sink|task-0] Kafka startTimeMs: 1659357042992 (org.apache.kafka.common.utils.AppInfoParser) [StartAndStopExecutor-connect-1-2]
2022-08-01 12:30:42,993 INFO [Worker clientId=connect-1, groupId=connect-cluster] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder) [DistributedHerder-connect-1-1]
2022-08-01 12:30:42,994 INFO [opensearch-sink|task-0] [Consumer clientId=connector-consumer-opensearch-sink-0, groupId=connect-opensearch-sink] Subscribed to topic(s): nginx (org.apache.kafka.clients.consumer.KafkaConsumer) [task-thread-opensearch-sink-0]
2022-08-01 12:30:42,994 INFO [opensearch-sink|task-0] Starting OpensearchSinkTask. (io.aiven.kafka.connect.opensearch.OpensearchSinkTask) [task-thread-opensearch-sink-0]
2022-08-01 12:30:42,995 INFO [opensearch-sink|task-0] OpensearchSinkConnectorConfig values:
batch.size = 2000
behavior.on.malformed.documents = fail
behavior.on.null.values = ignore
behavior.on.version.conflict = fail
compact.map.entries = true
connection.password = [hidden]
connection.timeout.ms = 1000
connection.url = [http://opensearch-cluster-master.opensearch:9200]
connection.username = admin
drop.invalid.message = false
flush.timeout.ms = 10000
key.ignore = false
linger.ms = 1
max.buffered.records = 20000
max.in.flight.requests = 5
max.retries = 5
read.timeout.ms = 3000
retry.backoff.ms = 100
schema.ignore = false
topic.index.map = []
topic.key.ignore = []
topic.schema.ignore = []
(io.aiven.kafka.connect.opensearch.OpensearchSinkConnectorConfig) [task-thread-opensearch-sink-0]
2022-08-01 12:30:42,995 ERROR [opensearch-sink|task-0] WorkerSinkTask{id=opensearch-sink-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask) [task-thread-opensearch-sink-0]
java.lang.NoClassDefFoundError: org/opensearch/client/RestClientBuilder$HttpClientConfigCallback
at io.aiven.kafka.connect.opensearch.OpensearchSinkTask.start(OpensearchSinkTask.java:77)
at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:312)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:186)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.ClassNotFoundException: org.opensearch.client.RestClientBuilder$HttpClientConfigCallback
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
... 9 more
2022-08-01 12:30:42,996 INFO [opensearch-sink|task-0] Stopping OpensearchSinkTask. (io.aiven.kafka.connect.opensearch.OpensearchSinkTask) [task-thread-opensearch-sink-0]
2022-08-01 12:30:42,996 INFO [opensearch-sink|task-0] [Consumer clientId=connector-consumer-opensearch-sink-0, groupId=connect-opensearch-sink] Resetting generation and member id due to: consumer pro-actively leaving the group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) [task-thread-opensearch-sink-0]
2022-08-01 12:30:42,996 INFO [opensearch-sink|task-0] [Consumer clientId=connector-consumer-opensearch-sink-0, groupId=connect-opensearch-sink] Request joining group due to: consumer pro-actively leaving the group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator) [task-thread-opensearch-sink-0]
2022-08-01 12:30:42,996 INFO [opensearch-sink|task-0] Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics) [task-thread-opensearch-sink-0]
2022-08-01 12:30:42,996 INFO [opensearch-sink|task-0] Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics) [task-thread-opensearch-sink-0]
2022-08-01 12:30:42,996 INFO [opensearch-sink|task-0] Metrics reporters closed (org.apache.kafka.common.metrics.Metrics) [task-thread-opensearch-sink-0]
2022-08-01 12:30:42,997 INFO [opensearch-sink|task-0] App info kafka.consumer for connector-consumer-opensearch-sink-0 unregistered (org.apache.kafka.common.utils.AppInfoParser) [task-thread-opensearch-sink-0]
Unfortunately, I'm not a Java developer, so I don't know what the NoClassDefFoundError exception means or how to fix it. What does it mean, and how can this kind of error be fixed?
More information:
curl -s http://kafka-operator-connect-api:8083/
{
  "version": "3.2.0",
  "commit": "38103ffaa962ef50",
  "kafka_cluster_id": "T69paG1BQXueuusb7O_iDw"
}
curl -s http://kafka-operator-connect-api:8083/connector-plugins
[
  {
    "class": "io.aiven.kafka.connect.opensearch.OpensearchSinkConnector",
    "type": "sink",
    "version": "2.0.1"
  },
  {
    "class": "org.apache.kafka.connect.mirror.MirrorCheckpointConnector",
    "type": "source",
    "version": "3.2.0"
  },
  {
    "class": "org.apache.kafka.connect.mirror.MirrorHeartbeatConnector",
    "type": "source",
    "version": "3.2.0"
  },
  {
    "class": "org.apache.kafka.connect.mirror.MirrorSourceConnector",
    "type": "source",
    "version": "3.2.0"
  }
]
curl -s http://kafka-operator-connect-api:8083/connectors
["opensearch-sink"]
curl -s http://kafka-operator-connect-api:8083/connectors/opensearch-sink/config
{
  "connector.class": "io.aiven.kafka.connect.opensearch.OpensearchSinkConnector",
  "connection.password": "admin",
  "topics": "nginx",
  "tasks.max": "1",
  "connection.username": "admin",
  "name": "opensearch-sink",
  "connection.url": "http://opensearch-cluster-master.opensearch:9200"
}
curl -s http://kafka-operator-connect-api:8083/connectors/opensearch-sink/status
{
  "name": "opensearch-sink",
  "connector": {
    "state": "RUNNING",
    "worker_id": "10.233.41.145:8083"
  },
  "tasks": [
    {
      "id": 0,
      "state": "FAILED",
      "worker_id": "10.233.41.145:8083",
      "trace": "java.lang.NoClassDefFoundError: org/opensearch/client/RestClientBuilder$HttpClientConfigCallback\n\tat io.aiven.kafka.connect.opensearch.OpensearchSinkTask.start(OpensearchSinkTask.java:77)\n\tat org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:312)\n\tat org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:186)\n\tat org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)\n\tat java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)\n\tat java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)\n\tat java.base/java.lang.Thread.run(Thread.java:829)\nCaused by: java.lang.ClassNotFoundException: org.opensearch.client.RestClientBuilder$HttpClientConfigCallback\n\tat java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)\n\tat java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)\n\tat java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)\n\t... 9 more\n"
    }
  ],
  "type": "sink"
}
I found #28 and #76 and made the following changes:
FROM quay.io/strimzi/kafka:0.29.0-kafka-3.2.0
USER root:root
RUN mkdir -p /usr/share/auth0/lib
COPY opensearch-connector-for-apache-kafka-2.0.1.jar /usr/share/auth0/lib
COPY opensearch-connector-for-apache-kafka-2.0.1.jar /opt/kafka/libs/
RUN chown 1001:1001 /usr/share/auth0/lib*
ENV CONNECT_PLUGIN_PATH="/opt/kafka/libs,/usr/share/auth0/lib"
USER 1001
but I still receive the same error:
2022-08-02 08:18:33,637 ERROR [opensearch-sink|task-0] WorkerSinkTask{id=opensearch-sink-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask) [task-thread-opensearch-sink-0]
java.lang.NoClassDefFoundError: org/opensearch/client/RestClientBuilder$HttpClientConfigCallback
at io.aiven.kafka.connect.opensearch.OpensearchSinkTask.start(OpensearchSinkTask.java:77)
at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:312)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:186)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.ClassNotFoundException: org.opensearch.client.RestClientBuilder$HttpClientConfigCallback
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
@mirisu2 You need to include all of the dependency JARs from the release artifact in the same directory as the connector JAR on the plugin.path.
For example:
RUN mkdir -p /opt/plugins
COPY build/distributions/opensearch-connector-for-apache-kafka-3.0.0.zip /opt/plugins
RUN unzip /opt/plugins/opensearch-connector-for-apache-kafka-3.0.0.zip -d /opt/plugins
ENV CONNECT_PLUGIN_PATH="/opt/plugins"
This should create /opt/plugins/opensearch-connector-for-apache-kafka-3.0.0/opensearch-connector-for-apache-kafka-3.0.0.jar with all of its dependency JARs in the same directory.
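For a Strimzi-based worker image like the one in this issue, a minimal sketch of how that advice could be applied might look as follows. This is only an illustration under a few assumptions: the full release zip is available in the Docker build context, unzip is available in the base image (otherwise unpack locally and COPY the resulting directory instead), and the Connect worker's plugin.path includes /opt/kafka/plugins.
FROM quay.io/strimzi/kafka:0.29.0-kafka-3.2.0
USER root:root
# Unpack the full release zip (connector JAR plus all dependency JARs)
# into its own sub-directory under a directory listed in plugin.path.
COPY opensearch-connector-for-apache-kafka-3.0.0.zip /tmp/
RUN mkdir -p /opt/kafka/plugins \
    && unzip /tmp/opensearch-connector-for-apache-kafka-3.0.0.zip -d /opt/kafka/plugins \
    && rm /tmp/opensearch-connector-for-apache-kafka-3.0.0.zip
USER 1001
Copying only the single connector JAR into /opt/kafka/libs is what leads to the NoClassDefFoundError, because the OpenSearch client classes the connector depends on never end up on the classpath.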