logstash-filter-csv
CSV filter convert option doesn't work as documented
migrated from https://github.com/elastic/logstash/issues/7191
Hello, I'm configuring Logstash to parse some CSV files. I wanted to use the convert option of the CSV filter as documented at https://www.elastic.co/guide/en/logstash/current/plugins-filters-csv.html#plugins-filters-csv-convert
- Version: 5.4.0
- Operating System: RHEL
- Config File:
filter {
  csv {
    skip_empty_columns => "true"
    separator => ","
    autogenerate_column_names => "true"
    convert => [ "timeStamp" => "date_time", "elapsed" => "integer", "responseCode" => "integer", "success" => "boolean", "bytes" => "integer", "Latency" => "integer", "IdleTime" => "integer" ]
    add_tag => [ "JMETER" ]
  }
}
Result of the config test:
Sending Logstash's logs to /exploit/logstash/logs/ which is now configured via log4j2.properties
[2017-05-24T10:55:12,964][FATAL][logstash.runner ] The given configuration is invalid. Reason: Expected one of #, {, } at line 22, column 45 (byte 582) after filter {
csv {
skip_empty_columns => "true"
separator => ","
autogenerate_column_names => "true"
#columns => [ "timeStamp","elapsed","label","responseCode","responseMessage","threadName","dataType","success","failureMessage","bytes","grpThreads","allThreads","Latency","IdleTime"]
convert => { "timeStamp" => "date_time"
- Steps to Reproduce: the config was accepted when convert was given a single column, in the same style as the mutate filter's convert option (https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html#plugins-filters-mutate-convert):
convert => { "timeStamp" => "date" }
Thanks for clearing up this case! :)
David