Potential null pointer exception when a span without `process.tags` is stored in ES
Describe the bug
A potential null pointer exception can occur in the downstream storage data pipeline when a span without `process.tags` is sent and Elasticsearch is the storage type.
To Reproduce
Steps to reproduce the behavior:
- Send spans that don't have any `process.tags` values
- Set Elasticsearch as the storage type at the collector end
Expected behavior
Processing spans with no `process.tags` shouldn't cause any NPE in the downstream pipeline.
Version (please complete the following information):
- OS: Linux
- Jaeger version: all latest versions
- Deployment: K8s
What troubleshooting steps did you try?
Upon further checking: in this method, when there are no incoming `process.tags`, the resulting `tagsMap` value is nil. In the same method, the `kvs` tag array is instantiated as an empty array when there are no tags. If the same behaviour were applied to `tagsMap`, the potential NPE could be avoided.
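To illustrate why this surfaces as a null downstream, here is a minimal, self-contained Go sketch (not the actual Jaeger code): a map that is only declared stays nil and marshals to JSON `null`, while a map created with `make` marshals to `{}`.

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Declared but never instantiated: the map is nil and marshals to JSON null,
	// which is what downstream consumers of process.tags end up receiving.
	var nilMap map[string]string
	out, _ := json.Marshal(nilMap)
	fmt.Println(string(out)) // prints: null

	// Instantiated with make: the map is empty and marshals to {},
	// which downstream code can safely iterate over.
	emptyMap := make(map[string]string)
	out, _ = json.Marshal(emptyMap)
	fmt.Println(string(out)) // prints: {}
}
```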
Additional context
We use our own data pipeline (Kinesis + Logstash) before indexing to our storage. A few transformations happen in Logstash for use cases where `process.tags` is expected to be empty rather than nil when the original span event has none.
IMO, `process.tags` should be an empty map instead of null. If this is considered acceptable behaviour, I can raise a PR to fix it by instantiating the map using `make` instead of just declaring it, roughly as sketched below.
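A rough sketch of the proposed change (hypothetical function and type names; the actual Jaeger ES converter differs):

```go
// Hypothetical sketch only; names do not match Jaeger's actual ES dbmodel converter.
package dbmodel

// KeyValue is a simplified stand-in for a span/process tag.
type KeyValue struct {
	Key   string      `json:"key"`
	Value interface{} `json:"value"`
}

// convertTagsToMap flattens process tags into a lookup map.
// Instantiating the map with make guarantees an empty map ({} in JSON)
// instead of nil (null in JSON) when the process has no tags.
func convertTagsToMap(tags []KeyValue) map[string]interface{} {
	tagsMap := make(map[string]interface{}) // make instead of a bare declaration
	for _, kv := range tags {
		tagsMap[kv.Key] = kv.Value
	}
	return tagsMap
}
```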
I'm raising this as a bug since I don't see a category for enhancement-type requests.
Potential null pointer exception at storage data pipeline
Is it "potential", or do you have a stack trace?
I have a stack trace of the NPE happening at the Logstash end (not at the Jaeger collector):
could not process event: undefined method `each' for nil:NilClass {:script_path=>"/etc/logstash/conf.d/tags_flatten.rb", :class=>"NoMethodError", :backtrace=>["/etc/logstash/conf.d/tags_flatten.rb:36:in `filter'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-ruby-3.1.7/lib/logstash/filters/ruby/script/context.rb:55:in `execute_filter'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-ruby-3.1.7/lib/logstash/filters/ruby/script.rb:30:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-ruby-3.1.7/lib/logstash/filters/ruby.rb:107:in `file_script'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-ruby-3.1.7/lib/logstash/filters/ruby.rb:88:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:159:in `do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:178:in `block in multi_filter'", "org/jruby/RubyArray.java:1821:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:175:in `multi_filter'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:134:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:299:in `block in start_workers'"]}