kafka-connect-hdfs

Kafka Connect HDFS connector

Results: 131 kafka-connect-hdfs issues

This change enables the use of `CompressionCodec`s installed in Hadoop with the `StringFormat` of the HDFS connector. It introduces a new config parameter, `format.string.compression`, with a default value of "none",...
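For illustration, a sink configuration using the proposed parameter might look like the sketch below; the codec value `gzip` and all surrounding settings are assumptions for this example, not part of the change itself.

```properties
name=hdfs-string-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
topics=logs
hdfs.url=hdfs://namenode:8020
flush.size=1000
format.class=io.confluent.connect.hdfs.string.StringFormat
# Proposed parameter; defaults to "none". Any CompressionCodec available
# on the Hadoop classpath (e.g. gzip) would be selectable here.
format.string.compression=gzip
```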

Hi there, I'm trying to connect my Confluent Kafka Connect Docker container to an HDFS Docker container ([sequenceiq/hadoop:2.7.1](https://hub.docker.com/r/sequenceiq/hadoop-docker/)) using the bundled Confluent HDFS connector. All containers are run from...
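A sketch of the sink configuration for that setup might look like this; the hostname `hadoop` and port 9000 are assumptions based on the sequenceiq image's usual defaults, so adjust them to the container's actual name node address.

```properties
name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
topics=test_hdfs
# Assumed: the HDFS container is reachable from the Connect container
# under the hostname "hadoop", with the name node on port 9000
# (check fs.defaultFS in the container's core-site.xml).
hdfs.url=hdfs://hadoop:9000
flush.size=3
```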

With `format.class=io.confluent.connect.hdfs.json.JsonFormat`:

```
java.lang.UnsupportedOperationException: Hive integration is not currently supported with JSON format
	at io.confluent.connect.hdfs.json.JsonFormat.getHiveFactory(JsonFormat.java:68)
	at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:292)
	at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:101)
	at io.confluent.connect.hdfs.HdfsSinkTask.start(HdfsSinkTask.java:82)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:301)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:190)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at ...
```
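The task dies in `start()` the moment `DataWriter` asks `JsonFormat` for a Hive factory, so any configuration that combines the two cannot start. A minimal sketch of the workaround, assuming Hive integration can simply be left off for this sink:

```properties
format.class=io.confluent.connect.hdfs.json.JsonFormat
# Hive integration is not supported with JsonFormat; enabling it
# triggers the UnsupportedOperationException above at task start.
hive.integration=false
```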

I am getting the following error while trying to import data from Kafka. The data has been produced by a Debezium connector and is in Avro format. Funnily enough, the data is...
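The error itself is truncated above, but for Debezium-produced Avro data the converter settings below are the usual starting point; the Schema Registry URL is a placeholder for this sketch, not taken from the report.

```properties
# Typical converter settings for Avro data registered in Schema Registry;
# the URL is an assumption for this sketch.
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://schema-registry:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081
```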

Hi, I have been studying this project, specifically how directories and filenames are laid out in HDFS. I have two points; can any contributor confirm them? 1. directory - a directory can...
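For reference, the committed-file layout the connector produces with the default partitioner is sketched below; the concrete topic, partition, and offsets are illustrative.

```properties
# Defaults that control the on-HDFS layout:
topics.dir=topics
logs.dir=logs
# With the default partitioner, a committed file lands at:
#   /<topics.dir>/<topic>/partition=<kafkaPartition>/
#       <topic>+<kafkaPartition>+<startOffset>+<endOffset>.<extension>
# e.g. /topics/test_hdfs/partition=0/test_hdfs+0+0000000000+0000000002.avro
```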

Hi all, I ran into a problem where the HDFS connector is not consuming my Kafka topic, although connect-file-sink does. After I started the job, I waited about 3 minutes but none...
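One common cause worth ruling out first: the connector only commits files to HDFS once `flush.size` records have accumulated for a topic partition, so a large `flush.size` on a slow topic looks exactly like nothing being consumed. A sketch, with assumed values:

```properties
# Files appear in HDFS only after flush.size records per topic partition
# (or on rotation). A small value makes output visible quickly in tests.
flush.size=3
# Optional: also rotate on time so low-volume partitions still commit;
# 10 minutes here is an assumption for the sketch.
rotate.interval.ms=600000
```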

question

I would like to handle multiple schema types while keeping compatibility and Hive integration turned on. Kafka Connect supports this via the `value.subject.name.strategy` property. The HDFS sink should be able to write...
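For illustration, the strategy is configured on the converter; a sketch assuming the Avro converter and `RecordNameStrategy`:

```properties
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081
# Derive the Schema Registry subject from the record name instead of the
# topic name, so one topic can carry multiple schema types.
value.converter.value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy
```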

Upon running this command, `./bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties etc/kafka-connect-hdfs/quickstart-hdfs.properties`, I got an error:

```
[2018-12-19 01:11:50,998] ERROR WorkerSinkTask{id=hdfs-sink-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:177)
java.lang.NullPointerException
	at io.confluent.connect.hdfs.HdfsSinkTask.open(HdfsSinkTask.java:133)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.openPartitions(WorkerSinkTask.java:613)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.access$1100(WorkerSinkTask.java:70)
	...
```
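An NPE in `HdfsSinkTask.open` typically means `start()` never completed (for example, HDFS was unreachable or a required setting was missing), leaving the task's writer null when partitions are opened. For comparison, a minimal quickstart-style configuration; the HDFS URL is an assumption:

```properties
name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
topics=test_hdfs
# Both hdfs.url and flush.size must be set; an unreachable or missing
# hdfs.url makes start() fail, which then surfaces as an NPE in open().
hdfs.url=hdfs://localhost:9000
flush.size=3
```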

We are experiencing an issue where the HDFS connector uses incorrect permissions when creating log directories, given configurations with a heterogeneous set of keytabs. This happens when we run a script to...
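For context, the Kerberos-related settings involved look roughly like this; the principals and paths below are placeholders, not the reporter's actual values.

```properties
# Per-connector Kerberos credentials; with different keytabs across
# connectors, the log directories mentioned in the report are created
# under whichever principal's permissions apply at creation time.
hdfs.authentication.kerberos=true
connect.hdfs.principal=connect-user@EXAMPLE.COM
connect.hdfs.keytab=/etc/security/keytabs/connect-user.keytab
hdfs.namenode.principal=hdfs/_HOST@EXAMPLE.COM
```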

enhancement