
camel-hdfs-kafka-connector sink configuration with compression

Open vitofico opened this issue 3 years ago • 1 comment

I have the following JSON for creating the HDFS Kafka sink connector:

`{ "name":"CamelHdfsSinkConnector",

"config":{
    
    "connector.class":"org.apache.camel.kafkaconnector.hdfs.CamelHdfsSinkConnector",
    "task.max":2,

    "key.converter":"org.apache.kafka.connect.storage.StringConverter",
    "value.converter":"org.apache.kafka.connect.storage.StringConverter",

    "transforms": "CamelTypeConverterTransform",
    "transforms.CamelTypeConverterTransform.type": "org.apache.camel.kafkaconnector.transforms.CamelTypeConverterTransform$Value",
    "transforms.CamelTypeConverterTransform.target.type":"java.lang.String",

    "topics":"xxxx",
    "camel.sink.path.hostName": "xxxx",
    "camel.sink.endpoint.namedNodes": "namenode-1:8020,namenode-2:8020",
    "camel.sink.endpoint.splitStrategy": "IDLE:100000,BYTES:132120576",
    "camel.sink.endpoint.compressionCodec": "BZIP2",
    "camel.sink.endpoint.compressionType": "RECORD",
    "camel.sink.path.path": "stream_compression"


}

}`
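For context, my understanding is that the connector assembles the `camel.sink.path.*` and `camel.sink.endpoint.*` options above into a Camel HDFS endpoint URI roughly like this (the host and path are the placeholders from my config, so this is just a sketch of what I expect it to build):

```
hdfs://xxxx/stream_compression?namedNodes=namenode-1:8020,namenode-2:8020&splitStrategy=IDLE:100000,BYTES:132120576&compressionCodec=BZIP2&compressionType=RECORD
```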

I don't get any errors, but no stream is produced on Hadoop. Without the compressionCodec and compressionType lines, it works just fine.

Any hint?

vitofico · Jun 21 '22

Hello! I'm also having this issue when trying to compress records sent from Kafka to HDFS. In my case the stream is created, but the files at the endpoint are not compressed. Any help would be welcome.
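For reference, this is roughly how I'm checking the output, assuming the files land under /stream_compression as in the config above (the filename is just whatever the connector wrote):

```
# list what the connector wrote
hdfs dfs -ls /stream_compression

# dump a file as text; -text decompresses the data when a known codec is detected
hdfs dfs -text /stream_compression/<file-written-by-the-connector>
```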

DanielGarcia117 · Jul 21 '22