kafka-connect-spooldir
Kafka Connect connector for reading CSV files into Kafka.
Need a DLQ implemented for the SpoolDir source connector, like the one available for sink connectors. Currently, if there is a failure the task fails; it would be better if there were an...
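For reference, a sketch of the sink-connector-side dead letter queue settings (Kafka Connect's existing error-handling options) whose equivalent is being requested for this source connector; the topic name and replication factor below are only illustrative:

```
errors.tolerance=all
# Illustrative topic name; a replication factor of 1 only suits a single-broker dev cluster
errors.deadletterqueue.topic.name=spooldir-dlq
errors.deadletterqueue.topic.replication.factor=1
errors.deadletterqueue.context.headers.enable=true
```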
Hi, I have a file whose lines contain 500 columns. The last line (which is corrupted) has more than 130,000,000 columns. When the connector processes the file,...
Trying to leverage some of the nice features related to file management by customizing this connector. The capability to override the file selection process via the InputFileDequeue is one way...
I get the following error (at the bottom) with dates (times and timestamps as well). Am I missing something here? I'd appreciate your help, thanks.

### Configuration

```
name=testWithDates1
connector.class=com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector
csv.first.row.as.header=true
...
```
What would be the best way to handle a timestamp in the "20180530143000.167" format using kafka-connect-spooldir // Kafka // Tranquility? I tried INT64 in kafka-connect-spooldir but had no luck. I am...
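A minimal Java sketch of how a value in that layout could be turned into epoch milliseconds (the representation Connect's Timestamp logical type ultimately carries); the `yyyyMMddHHmmss.SSS` pattern and the UTC zone are assumptions here, not something the connector does out of the box:

```java
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class TimestampParseExample {
    public static void main(String[] args) {
        // Raw value in the layout quoted in the issue.
        String raw = "20180530143000.167";

        // Assumed layout: yyyyMMddHHmmss followed by a millisecond fraction.
        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyyMMddHHmmss.SSS");
        LocalDateTime parsed = LocalDateTime.parse(raw, fmt);

        // Epoch milliseconds (UTC assumed), i.e. the kind of INT64 value a
        // Connect Timestamp field carries downstream.
        long epochMillis = parsed.toInstant(ZoneOffset.UTC).toEpochMilli();
        System.out.println(parsed + " -> " + epochMillis);
    }
}
```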
I am planning to implement column skipping. I will add a config that allows the skipped column numbers to be specified as a comma-separated list. In the CSVSourceTask I’ll skip...
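A rough sketch of what that could look like, assuming a hypothetical `csv.skip.columns` property holding 0-based column indexes; this is not existing connector code, just the parsing and filtering a task might do before mapping a row to its schema:

```java
import java.util.Arrays;
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ColumnSkipSketch {
    // Parse a hypothetical config value such as "csv.skip.columns" = "1,3" (0-based indexes assumed).
    static Set<Integer> parseSkippedColumns(String configValue) {
        return Arrays.stream(configValue.split(","))
                .map(String::trim)
                .filter(s -> !s.isEmpty())
                .map(Integer::valueOf)
                .collect(Collectors.toSet());
    }

    // Drop the skipped positions from one parsed CSV row.
    static String[] dropColumns(String[] row, Set<Integer> skipped) {
        return IntStream.range(0, row.length)
                .filter(i -> !skipped.contains(i))
                .mapToObj(i -> row[i])
                .toArray(String[]::new);
    }

    public static void main(String[] args) {
        Set<Integer> skipped = parseSkippedColumns("1,3");
        String[] row = {"a", "b", "c", "d"};
        System.out.println(Arrays.toString(dropColumns(row, skipped))); // [a, c]
    }
}
```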
https://github.com/jcustenborder/kafka-connect-spooldir/issues/213

Found two misleading logs while [testing SpoolDirLineDelimitedSourceConnector](https://github.com/vdesabou/kafka-docker-playground/blob/master/connect/connect-spool-dir-source/line-delimited.sh#L14):

1. `Finished processing 0 record(s) in 0 second(s)` is logged even though the connector is processing records.
2. `Failed to delete input.path sub-directory:...
Found two misleading logs while [testing SpoolDirLineDelimitedSourceConnector](https://github.com/vdesabou/kafka-docker-playground/blob/master/connect/connect-spool-dir-source/line-delimited.sh#L14):

- `Finished processing 0 record(s) in 0 second(s)` is logged even though the connector is processing records.
- `Failed to delete input.path sub-directory: /tmp/data/input/fix.json`...
When the connector moves a file to the finished.path, it creates a subfolder named after the file and places the file inside it.

Config:

```
"input.file.pattern": "^.+\\.txt$",
"input.path": "/app/vol/testconnect/source",
"error.path": "/app/vol/testconnect/error",
...
```