logstash-codec-netflow
Current status of support for Cisco High-Speed Logging (HSL)
I've created a new "cisco-hsl" branch to add support for Cisco HSL.
However, I'm running into a structural issue:
- In template 284, both l4_src_port and l4_dest_port occur twice. This triggers a duplicate field definition error in the BinData library we use to parse Netflow, which causes the rest of the packet, including its other templates, to be dropped, leaving about 50% of the netflow data undecodable. This may very well be a bug on Cisco's side; unfortunately we currently can't handle it gracefully at all.
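One conceivable way for a collector to tolerate such templates is to rename repeated fields before building the BinData::Struct definition. This is a hypothetical sketch, not the codec's actual code; `dedup_field_names` is an illustrative helper name:

```ruby
# Hypothetical helper: make template field names unique by suffixing
# repeated occurrences, so a BinData::Struct definition stays valid.
def dedup_field_names(fields)
  seen = Hash.new(0)
  fields.map do |name|
    seen[name] += 1
    seen[name] == 1 ? name : "#{name}_#{seen[name]}"
  end
end

# Example resembling template 284, where two fields occur twice:
template = %w[ipv4_src_addr l4_src_port l4_dest_port l4_src_port l4_dest_port]
puts dedup_field_names(template).inspect
# => ["ipv4_src_addr", "l4_src_port", "l4_dest_port", "l4_src_port_2", "l4_dest_port_2"]
```

A fix along these lines would let the rest of the packet decode instead of being dropped wholesale.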
Currently only these flowset_id's are properly decoded:
- 258
- 261
- 262
It appears this duplicate-field issue also occurs with IPFIX from a Cisco ASR1k, with IE 12235 for PEN 9.
I am also seeing a similar issue with Netflow from Forcepoint NGFW. Error log and PCAP below.
[2018-04-23T16:48:10,483][ERROR][logstash.inputs.udp ] Exception in inputworker {
"exception"=>#<NameError: field 'icmp_type' in BinData::Struct, is defined multiple times.>,
"backtrace"=>[
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/struct.rb:409:in `block in ensure_field_names_are_valid'",
"org/jruby/RubyArray.java:1734:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/struct.rb:399:in `ensure_field_names_are_valid'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/struct.rb:375:in `block in sanitize_fields'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/sanitize.rb:266:in `block in sanitize_fields'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/sanitize.rb:283:in `sanitize'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/sanitize.rb:264:in `sanitize_fields'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/struct.rb:369:in `sanitize_fields'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/struct.rb:345:in `sanitize_parameters!'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/sanitize.rb:302:in `sanitize!'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/sanitize.rb:210:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/sanitize.rb:192:in `sanitize'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/base.rb:302:in `extract_args'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/base.rb:249:in `extract_args'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/base.rb:81:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/warnings.rb:21:in `initialize_with_warning'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-netflow-3.12.0/lib/logstash/codecs/netflow.rb:224:in `block in decode_netflow9'",
"org/jruby/ext/thread/Mutex.java:148:in `synchronize'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-netflow-3.12.0/lib/logstash/codecs/netflow.rb:223:in `block in decode_netflow9'",
"org/jruby/RubyKernel.java:1114:in `catch'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-netflow-3.12.0/lib/logstash/codecs/netflow.rb:186:in `block in decode_netflow9'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/array.rb:208:in `block in each'",
"org/jruby/RubyArray.java:1734:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/array.rb:208:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-netflow-3.12.0/lib/logstash/codecs/netflow.rb:185:in `decode_netflow9'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-netflow-3.12.0/lib/logstash/codecs/netflow.rb:124:in `block in decode'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/array.rb:208:in `block in each'",
"org/jruby/RubyArray.java:1734:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/bindata-2.4.3/lib/bindata/array.rb:208:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-netflow-3.12.0/lib/logstash/codecs/netflow.rb:120:in `decode'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-udp-3.2.1/lib/logstash/inputs/udp.rb:133:in `inputworker'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-udp-3.2.1/lib/logstash/inputs/udp.rb:102:in `block in udp_listener'"
]
}
I just found this in RFC 7011, section 8. It's one of the MUST requirements we currently don't implement; see also issue #83.
Collecting Processes MUST properly handle Templates with multiple identical Information Elements.
Here is another sample from another Forcepoint NGFW. forcepoint_netflow.pcap.zip
Confirming the observations robcowart made about Forcepoint NGFW; I am seeing exactly the same issue here.
As a workaround, I am using CEF format log forwarding and the Logstash CEF input codec with '\n' as the delimiter.
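The workaround above amounts to a pipeline along these lines. This is an untested sketch; the port number is an arbitrary example, and it assumes the firewall is reconfigured to forward logs as CEF over TCP instead of Netflow:

```
input {
  tcp {
    port => 5140
    codec => cef { delimiter => "\n" }
  }
}
```

The `delimiter` option of the CEF codec splits the TCP stream into one event per line, sidestepping the Netflow codec entirely.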
The duplicate field from Forcepoint is fixed by #171.
I am seeing the same errors for HSL on Logstash 6.8.4 and ElastiFlow.
[2019-10-27T17:53:13,708][ERROR][logstash.inputs.udp ] Exception in inputworker {
"exception"=>#<NameError: field 'l4_src_port' in BinData::Struct, is defined multiple times.>,
"backtrace"=>[
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/struct.rb:409:in `block in ensure_field_names_are_valid'",
"org/jruby/RubyArray.java:1792:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/struct.rb:399:in `ensure_field_names_are_valid'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/struct.rb:375:in `block in sanitize_fields'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/sanitize.rb:266:in `block in sanitize_fields'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/sanitize.rb:283:in `sanitize'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/sanitize.rb:264:in `sanitize_fields'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/struct.rb:369:in `sanitize_fields'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/struct.rb:345:in `sanitize_parameters!'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/sanitize.rb:302:in `sanitize!'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/sanitize.rb:210:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/sanitize.rb:192:in `sanitize'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/base.rb:302:in `extract_args'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/base.rb:249:in `extract_args'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/base.rb:81:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/warnings.rb:21:in `initialize_with_warning'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-netflow-4.2.1/lib/logstash/codecs/netflow.rb:603:in `do_register'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-netflow-4.2.1/lib/logstash/codecs/netflow.rb:569:in `block in register'",
"org/jruby/ext/thread/Mutex.java:165:in `synchronize'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-netflow-4.2.1/lib/logstash/codecs/netflow.rb:568:in `register'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-netflow-4.2.1/lib/logstash/codecs/netflow.rb:203:in `block in decode_netflow9'",
"org/jruby/RubyKernel.java:1193:in `catch'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-netflow-4.2.1/lib/logstash/codecs/netflow.rb:167:in `block in decode_netflow9'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/array.rb:208:in `block in each'",
"org/jruby/RubyArray.java:1792:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/array.rb:208:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-netflow-4.2.1/lib/logstash/codecs/netflow.rb:166:in `decode_netflow9'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-netflow-4.2.1/lib/logstash/codecs/netflow.rb:97:in `block in decode'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/array.rb:208:in `block in each'",
"org/jruby/RubyArray.java:1792:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/bindata-2.4.4/lib/bindata/array.rb:208:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-netflow-4.2.1/lib/logstash/codecs/netflow.rb:93:in `decode'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-udp-3.3.4/lib/logstash/inputs/udp.rb:151:in `inputworker'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-udp-3.3.4/lib/logstash/inputs/udp.rb:63:in `block in run'"
]
}
Hello, do we believe this will be resolved? Cisco uses HSL in their SD-WAN firewall logging, and support in this plugin would be very valuable.
No, as the Logstash netflow codec is deprecated. I can confirm that Filebeat 7.6.x and its netflow module work; there is just an open enhancement request to populate a human-readable field in the ECS output for the firewall action, which is currently numeric only.
@abraxxa the netflow CODEC is not deprecated. The netflow MODULE (which was essentially ElastiFlow 1.0.0 repackaged as a Logstash module) has been deprecated.
I had a support ticket open for that, and Elastic said it won't develop or support the Logstash netflow codec any more and that we have to use the Filebeat one. A Logstash module is just an input configured with a codec, which doesn't add any value to the product anyway.
Elastic has never really actively developed the Netflow codec. The community always has.
BTW, a Logstash module is more than an input and a codec. Modules also include a full pipeline to further process the data, the necessary index templates, and Kibana content.
Whether modules offer significant value is a different question, and it is about more than just Logstash. I also find most of the Beats modules to be mediocre at best. While the stack itself allows some compelling solutions to be built on top of it, Elastic isn't really great at doing so, IMO.