
java.lang.NoSuchMethodError: org.apache.avro.Schema.addProp in version 5.2.4

Open · josevi96 opened this issue 5 years ago · 1 comment

We are running the connect-standalone script with these two configurations:

Worker.properties

    bootstrap.servers=127.0.0.1:9092
    key.converter=org.apache.kafka.connect.storage.StringConverter
    value.converter=io.confluent.connect.avro.AvroConverter
    key.converter.schemas.enable=false
    value.converter.schemas.enable=false
    key.converter.schema.registry.url=http://127.0.0.1:8081
    value.converter.schema.registry.url=http://127.0.0.1:8081
    internal.key.converter.schemas.enable=false
    internal.value.converter.schemas.enable=false
    offset.storage.file.filename=/tmp/connect.offsets
    offset.flush.interval.ms=10000
    offset.flush.timeout.ms=50000
    producer.max.request.size=15728640
    rest.port=9002
    plugin.path=<my path to connectors>

connector.properties

    name=hdfs-sink
    connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
    tasks.max=1
    topics=development.raw.bigdecimal
    format.class=io.confluent.connect.hdfs.parquet.ParquetFormat
    hadoop.home=/
    hdfs.url=hdfs://localhost:9000
    flush.size=5
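For reference, we start the worker roughly like this (the file names below are placeholders for our actual paths):

    # start a standalone Connect worker with the two files above
    ./bin/connect-standalone worker.properties connector.properties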

We are getting this exception:

 ERROR WorkerSinkTask{id=hdfs-sink-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted. Error: org.apache.avro.Schema.addProp(Ljava/lang/String;Lorg/codehaus/jackson/JsonNode;)V (org.apache.kafka.connect.runtime.WorkerSinkTask:565)
java.lang.NoSuchMethodError: org.apache.avro.Schema.addProp(Ljava/lang/String;Lorg/codehaus/jackson/JsonNode;)V
	at io.confluent.connect.avro.AvroData.fromConnectSchema(AvroData.java:825)
	at io.confluent.connect.avro.AvroData.addAvroRecordField(AvroData.java:1082)
	at io.confluent.connect.avro.AvroData.fromConnectSchema(AvroData.java:904)
	at io.confluent.connect.avro.AvroData.addAvroRecordField(AvroData.java:1082)
	at io.confluent.connect.avro.AvroData.fromConnectSchema(AvroData.java:904)
	at io.confluent.connect.avro.AvroData.fromConnectSchema(AvroData.java:736)
	at io.confluent.connect.avro.AvroData.fromConnectSchema(AvroData.java:730)
	at io.confluent.connect.hdfs.parquet.ParquetRecordWriterProvider$1.write(ParquetRecordWriterProvider.java:80)
	at io.confluent.connect.hdfs.TopicPartitionWriter.writeRecord(TopicPartitionWriter.java:690)
	at io.confluent.connect.hdfs.TopicPartitionWriter.write(TopicPartitionWriter.java:385)
	at io.confluent.connect.hdfs.DataWriter.write(DataWriter.java:374)
	at io.confluent.connect.hdfs.HdfsSinkTask.put(HdfsSinkTask.java:143)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:545)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:325)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:228)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:200)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:184)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:234)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

If I list and grep the classes in my fat jar:

jar -tf kafka-connect-hdfs-5.2.4-jar-with-dependencies.jar | grep org/apache/avro/Schema.class
org/apache/avro/Schema.class
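To double-check which addProp overloads are actually packaged, something like this should work (a sketch, we have not run it yet):

    # print the addProp signatures of the Schema class bundled in the fat jar
    javap -cp kafka-connect-hdfs-5.2.4-jar-with-dependencies.jar org.apache.avro.Schema | grep addProp

If the org.codehaus.jackson.JsonNode overload from the error message does not show up in the output, the bundled Schema class comes from a different Avro version than the one io.confluent.connect.avro.AvroData 5.2.4 was compiled against.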

We added Avro (version 1.9.0) to pom.xml, just in case:

        <dependency>
            <groupId>org.apache.avro</groupId>
            <artifactId>avro</artifactId>
            <version>1.9.0</version>
        </dependency>

and the error still continues. We are working on this commit.
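Our suspicion at this point is a version mismatch: the missing overload takes an org.codehaus.jackson.JsonNode (the Jackson 1.x API that Avro 1.8.x still uses), while Avro 1.9.x moved to com.fasterxml.jackson and dropped that overload, so pinning 1.9.0 would not bring it back. To confirm which Avro version the build actually resolves, a sketch of what we plan to run next:

    # show which org.apache.avro artifacts end up on the classpath
    mvn dependency:tree -Dincludes=org.apache.avro

Thank you very much in advance!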

josevi96 · Aug 20 '20

Have you found a solution for this?

alonisser · Oct 18 '21