Jeremy Custenborder
haha no please use the schema. Try setting `"csv.first.row.as.header":"true"` in your config.
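A minimal sketch of a CSV source connector config with that setting in place; the connector name, topic, and paths below are placeholders, not values from the thread:

```json
{
  "name": "csv-spooldir-source",
  "config": {
    "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
    "topic": "csv-data",
    "input.path": "/var/spooldir/input",
    "finished.path": "/var/spooldir/finished",
    "error.path": "/var/spooldir/error",
    "input.file.pattern": "^.*\\.csv$",
    "csv.first.row.as.header": "true"
  }
}
```

With `csv.first.row.as.header` set to `true`, the connector reads field names from the first row of each file instead of requiring them to be declared up front.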
Have either of you tried the [2.0](https://github.com/jcustenborder/kafka-connect-spooldir/tree/2.0) version?

> Moreover I see the offsets being stored in _kafka-connect-offsets topic, but do not find the corresponding source file name, it shows...
Unfortunately that happens after the task has converted the data to records, so I wouldn't be able to catch the exception anyway. You could potentially look at implementing a transformation...
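As one illustration of the transformation route, a built-in Single Message Transform can be wired into the connector config to drop a problematic field before it reaches the converter. The field name here is a placeholder, and note that older Kafka versions spell the config key `blacklist` rather than `exclude`:

```properties
# hypothetical example: drop a field named "problem_field" from each record value
transforms=dropBadField
transforms.dropBadField.type=org.apache.kafka.connect.transforms.ReplaceField$Value
transforms.dropBadField.exclude=problem_field
```

A custom SMT implementing `org.apache.kafka.connect.transforms.Transformation` would work the same way if none of the built-ins fit.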
Can you run the connector with trace logging and this file? If it's having trouble with the conversion, it might output why.
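For reference, trace logging for just this connector can be enabled in the Connect worker's log4j config (the file path varies by install; the logger name is the plugin's package):

```properties
# e.g. in etc/kafka/connect-log4j.properties, then restart the worker
log4j.logger.com.github.jcustenborder.kafka.connect.spooldir=TRACE
```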
This advice worked pretty well for me. I did the following.

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-resources-plugin</artifactId>
  <version>2.5</version>
  <executions>
    <execution>
      <id>process-helm-resources</id>
      <phase>process-resources</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <!-- element names restored from a flattened snippet; the original also
             referenced ${project.build.directory}, whose placement was lost -->
        <outputDirectory>${basedir}/target/helm-filtered</outputDirectory>
        <resources>
          <resource>
            <directory>${project.basedir}/src/helm</directory>
            <filtering>true</filtering>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>io.kokuwa.maven</groupId>
  <artifactId>helm-maven-plugin</artifactId>
  <version>6.3.0</version>
  <configuration>
    <chartDirectory>${basedir}/target/helm-filtered</chartDirectory>
    <chartVersion>${project.version}</chartVersion>
    <appVersion>${project.version}</appVersion>
    <!-- remaining flags (true, false, ...) truncated in the original -->
  </configuration>
</plugin>
```
@renukaradhya Just an FYI: we are not currently testing the Confluent Platform on Windows. The class that is not found is in core Kafka. In general, the server processes for Kafka are...
@sachin-rajwade I would look at altering the logger level on `com.github.jcustenborder.kafka.connect.spooldir.InputFileDequeue` to `warn`. This will eliminate the log message.
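In log4j-style config that would look like the following (file location depends on how the worker is run):

```properties
# e.g. in the Connect worker's log4j properties file
log4j.logger.com.github.jcustenborder.kafka.connect.spooldir.InputFileDequeue=WARN
```

Setting this logger to `WARN` suppresses the repeated info-level message without quieting the rest of the connector.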
@peacecwz Happy to merge this! Would you mind adding some unit tests?
Hi @okayhooni! Thanks for submitting this. Can you add a unit test for the new functionality?
@moreiravictor What version did you use? Do you see a guava-*.jar in that directory?