logstash-input-mongodb
Array not stored to Elasticsearch
Hi guys, I have a problem storing arrays from MongoDB to Elasticsearch (see the picture above). Fields with an array type are not stored in Elasticsearch. Why? Is there a config option I need to set?
Any update on this? It seems that arrays are not supported at all. Flattening seems to work only for objects (arrays are ignored).
Any update on this? I have many arrays with two float elements in my MongoDB. When they come into Kibana, all my arrays become [0, 0].
Hey guys, I'm back with a similar problem. I tried to import some array-type data from MongoDB to Elasticsearch. The arrays look like [{a:"bla",b:"blabla"},{a:"bla",b:"blabla"},{a:"bla",b:"blabla"}], but they are not stored in any field in Elasticsearch.
Hi, I have the same problem. The other issue is that I want to handle the array myself, but it seems the plugin does not pass the array on to the next Logstash pipeline stage (the filter stage after the mongodb input). A workaround would be highly appreciated.
I worked my way through this issue. I had to use a custom Ruby filter to parse the log_entry field that this plugin stores in Elasticsearch, extract the array from it, and save each array item as a new field. In my case the array is a root-level field called "messages"; here is my code in case someone finds this issue again:
ruby {
  code => "
    # log_entry is a stringified Ruby hash, so eval() turns it back into a hash.
    # Note: eval on untrusted input is risky; it works here only because the
    # plugin stores the document as a Ruby-hash-style string, not valid JSON.
    @log_entry_hash = eval(event.get('log_entry'))
    counter = 0
    # Save each array item as its own field: message0, message1, ...
    @log_entry_hash['messages'].to_a.each { |log|
      event.set('message' + counter.to_s, log)
      counter += 1
    }
  "
}
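If you would rather keep the whole array in a single field instead of numbered message0, message1, ... fields, an untested sketch of the same idea (assuming the same log_entry format and a root-level "messages" array) would be:

ruby {
  code => "
    # Same eval() trick as above to recover the original document.
    entry = eval(event.get('log_entry'))
    # Keep the array intact in one event field instead of splitting it.
    event.set('messages', entry['messages'].to_a)
  "
}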
Hi, can someone provide a demo Logstash config file for sending data from MongoDB through Logstash into Elasticsearch? I have downloaded the logstash-mongodb connector.
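For reference, a minimal pipeline in the spirit of the plugin's README looks roughly like this. The URI, paths, collection name, and index name below are placeholders you need to adapt to your setup, and the options should be verified against your plugin version:

input {
  mongodb {
    uri => 'mongodb://localhost:27017/mydb'
    # Local SQLite file the plugin uses to track its position in the collection.
    placeholder_db_dir => '/opt/logstash-mongodb/'
    placeholder_db_name => 'logstash_sqlite.db'
    collection => 'mycollection'
    batch_size => 5000
    # parse_method controls the flattening behavior discussed above;
    # check the plugin README for the values your version supports.
    # parse_method => 'flatten'
  }
}
output {
  elasticsearch {
    hosts => ['localhost:9200']
    index => 'mongodb-%{+YYYY.MM.dd}'
  }
}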
@phutchins Do you think this issue would be resolved by https://github.com/phutchins/logstash-input-mongodb/pull/64?
@phutchins I am also facing the same challenge when trying to port array-type data from MongoDB to Elasticsearch. Does anybody know the solution for this?
Does anybody know the solution to this issue?
The problem still exists, but the data is persisted in the "log_entry" field.
I also have this problem with objects and array fields while pushing data from MongoDB to Elasticsearch with Logstash. It has not been solved for me yet, even with the Ruby filter. Can someone help me?
The Ruby filter above worked perfectly for my arrays! Change 'messages' to the array in question and it splits them.
Thank you for this, it worked!