Bin
I noticed I have to edit the Hadoop config files (core-site.xml, hdfs-site.xml) to configure S3, but I could not find the mentioned config/hadoop-conf directory in my installation (Kafka 0.10.2.0). So...
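For reference, S3 access for the s3a filesystem is usually configured through `fs.s3a.*` properties in core-site.xml. A minimal sketch (the key values and endpoint below are placeholders, not from the original message):

```xml
<!-- core-site.xml fragment: standard s3a credential/endpoint properties.
     Replace the placeholder values with your own; prefer an IAM role or
     credentials provider over hard-coding keys where possible. -->
<configuration>
  <property>
    <name>fs.s3a.access.key</name>
    <value>YOUR_ACCESS_KEY</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>YOUR_SECRET_KEY</value>
  </property>
  <property>
    <name>fs.s3a.endpoint</name>
    <value>s3.amazonaws.com</value>
  </property>
</configuration>
```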
Redshift now supports COPY from the Parquet format, which might be much faster than CSV: https://aws.amazon.com/about-aws/whats-new/2018/06/amazon-redshift-can-now-copy-from-parquet-and-orc-file-formats/
Here is the error message; can anyone help? `java.lang.SecurityException: class "com.amazonaws.auth.DefaultAWSCredentialsProviderChain"'s signer information does not match signer information of other classes in the same package` My Scala code is...