streamx
Not a valid partitioner class: io.confluent.connect.hdfs.partitioner.DefaultPartitioner
Hi
I am getting the issue below while submitting a job. Could you please help me with this?
{"error_code":400,"message":"Connector configuration is invalid and contains the following 5 error(s):
Not a valid partitioner class: io.confluent.connect.hdfs.partitioner.DefaultPartitioner
Not a valid partitioner class: io.confluent.connect.hdfs.partitioner.DefaultPartitioner
Not a valid partitioner class: io.confluent.connect.hdfs.partitioner.DefaultPartitioner
Not a valid partitioner class: io.confluent.connect.hdfs.partitioner.DefaultPartitioner
Not a valid partitioner class: io.confluent.connect.hdfs.partitioner.DefaultPartitioner
You can also find the above list of errors at the endpoint /{connectorType}/config/validate"}
My properties look like this:
{
  "name": "s3-streamx-test",
  "config": {
    "connector.class": "com.qubole.streamx.s3.S3SinkConnector",
    "tasks.max": "4",
    "topics": "event",
    "s3.url": "s3://bucket",
    "flush.size": "2",
    "format.class": "io.confluent.connect.hdfs.parquet.ParquetFormat",
    "hadoop.conf.dir": "/opt/streamx/hadoop-conf/",
    "name": "s3-streamx-test"
  }
}
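As a side note, it can help to sanity-check the payload locally before POSTing it to the Connect REST API. A minimal sketch (the values mirror the config above; the required-key list is my own assumption, not streamx's authoritative schema):

```python
import json

# Connector config as posted above (values copied from the issue)
raw = """
{
  "name": "s3-streamx-test",
  "config": {
    "connector.class": "com.qubole.streamx.s3.S3SinkConnector",
    "tasks.max": "4",
    "topics": "event",
    "s3.url": "s3://bucket",
    "flush.size": "2",
    "format.class": "io.confluent.connect.hdfs.parquet.ParquetFormat",
    "hadoop.conf.dir": "/opt/streamx/hadoop-conf/",
    "name": "s3-streamx-test"
  }
}
"""

payload = json.loads(raw)  # raises ValueError if the JSON is malformed

# Keys the sink needs at minimum (assumed list, not the official schema)
required = {"connector.class", "topics", "s3.url", "format.class"}
missing = required - payload["config"].keys()
print("missing keys:", missing or "none")
```

This catches malformed JSON and obvious missing keys, but it will not catch the partitioner validation error, which happens server-side.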
and my hdfs-site.xml:
<configuration>
  <property>
    <name>fs.s3.impl</name>
    <value>org.apache.hadoop.fs.s3native.NativeS3FileSystem</value>
  </property>
  <property>
    <name>fs.s3.awsAccessKeyId</name>
    <value>mykey</value>
  </property>
  <property>
    <name>fs.s3.awsSecretAccessKey</name>
    <value>mysecretkey</value>
  </property>
</configuration>
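Hadoop's Configuration loader fails hard on malformed XML (e.g. a missing `</property>` close tag or a stray `value>`), so it is worth running a quick well-formedness check on the conf file before debugging the connector itself. A minimal sketch using Python's stdlib parser with the property names from above (inline string here; in practice you would point it at `hadoop.conf.dir`):

```python
import xml.etree.ElementTree as ET

# hdfs-site.xml content as intended above
xml_text = """
<configuration>
  <property>
    <name>fs.s3.impl</name>
    <value>org.apache.hadoop.fs.s3native.NativeS3FileSystem</value>
  </property>
  <property>
    <name>fs.s3.awsAccessKeyId</name>
    <value>mykey</value>
  </property>
  <property>
    <name>fs.s3.awsSecretAccessKey</name>
    <value>mysecretkey</value>
  </property>
</configuration>
"""

root = ET.fromstring(xml_text)  # raises ParseError if not well-formed
names = [p.findtext("name") for p in root.findall("property")]
print(names)
```

If this raises a `ParseError`, Hadoop would reject the file too, and the connector never gets as far as partitioner validation.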
Regards,
Anand
Hello @anandreddyakidi
I was facing the same issue. The Confluent team has already fixed it in kafka-connect-hdfs: https://github.com/confluentinc/kafka-connect-hdfs/pull/150
You can rebuild the package with the above fix to make it work.
Hope it helps you :)
I am using confluent-5.0.1 and am still having this issue. Any idea how to resolve it?