
seg fault errors during span extraction


Hi,

We are seeing the segfaults below. I've included the GDB backtrace, `info args`, `info register`, and `disas` output from the core dump. Could someone take a look and offer some insight into why the error occurs? We suspect it's due to missing span headers, but it could be something else.

We are using the Datadog tracer.

```
Jul 18 18:32:45 kong-enterprise-data-i-0d8579c86c063a722 kernel: nginx[14525]: segfault at 4 ip 00007f9342372a1b sp 00007fff545d4e20 error 4 in libdd_opentracing_plugin.so[7f9342320000+f0000]
```

GDB output:

```
[dh-api-admin@kong-enterprise-data-i-0d8579c86c063a722 cores]$ sudo gdb /usr/local/openresty/nginx/sbin/nginx ./core.14525
GNU gdb (GDB) Red Hat Enterprise Linux 8.0.1-36.amzn2.0.1
Copyright (C) 2017 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later http://gnu.org/licenses/gpl.html
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.  Type "show copying" and "show warranty" for details.
This GDB was configured as "x86_64-redhat-linux-gnu".
Type "show configuration" for configuration details.
For bug reporting instructions, please see:
http://www.gnu.org/software/gdb/bugs/.
Find the GDB manual and other documentation resources online at:
http://www.gnu.org/software/gdb/documentation/.
For help, type "help".
Type "apropos word" to search for commands related to "word"...
Reading symbols from /usr/local/openresty/nginx/sbin/nginx...done.
[New LWP 14525]
[New LWP 14526]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
Core was generated by `nginx: worker process '.
Program terminated with signal SIGSEGV, Segmentation fault.
#0  std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&) (__functor=..., __args#0=..., __args#1=...) at /usr/include/c++/7/bits/std_function.h:302

warning: Source file is more recent than executable.
302             std::forward<_ArgTypes>(__args)...);
[Current thread is 1 (Thread 0x7f9344d4f780 (LWP 14525))]
Missing separate debuginfos, use: debuginfo-install kong-enterprise-edition-2.8.1.1-1.noarch
```



```
(gdb) bt
#0  std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&) (__functor=..., __args#0=..., __args#1=...) at /usr/include/c++/7/bits/std_function.h:302
#1  0x0000000000552950 in std::function<opentracing::v3::expected<void, std::error_code> (opentracing::v3::string_view, opentracing::v3::string_view)>::operator()(opentracing::v3::string_view, opentracing::v3::string_view) const (__args#1=..., __args#0=..., this=0x7fff545d4f80) at /usr/include/c++/4.8.2/functional:2471
#2  ngx_opentracing::(anonymous namespace)::NgxHeaderCarrierReader::__lambda0::operator() (header=..., __closure=) at /tmp/nginx-opentracing/opentracing/src/extract_span_context.cpp:36
#3  ngx_opentracing::for_each<ngx_table_elt_t, ngx_opentracing::(anonymous namespace)::NgxHeaderCarrierReader::ForeachKey(std::function<opentracing::v3::expected(opentracing::v3::string_view, opentracing::v3::string_view)>) const::__lambda0> (f=..., list=...) at /tmp/nginx-opentracing/opentracing/src/utility.h:74
#4  ngx_opentracing::(anonymous namespace)::NgxHeaderCarrierReader::ForeachKey(std::function<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view)>) const (this=this@entry=0x7fff545d5120, f=...) at /tmp/nginx-opentracing/opentracing/src/extract_span_context.cpp:37
#5  0x00007f9342370326 in datadog::opentracing::SpanContext::deserialize (logger=std::shared_ptr (count 4, weak 0) 0x3e65310, reader=..., headers_impl=...) at /root/dd-opentracing-cpp/src/propagation.cpp:469
#6  0x00007f9342371b6e in datadog::opentracing::SpanContext::deserialize (logger=std::shared_ptr (count 4, weak 0) 0x3e65310, reader=..., styles=std::set with 2 elements = {...}) at /root/dd-opentracing-cpp/src/propagation.cpp:438
#7  0x00007f934239c1da in datadog::opentracing::Tracer::Extract (this=0x3e53ca0, reader=...) at /root/dd-opentracing-cpp/src/tracer.cpp:368
#8  0x0000000000552a3a in ngx_opentracing::extract_span_context (tracer=..., request=0x3cf4ab0) at /tmp/nginx-opentracing/opentracing/src/extract_span_context.cpp:52
#9  0x00000000005581ff in ngx_opentracing::RequestTracing::RequestTracing (this=0x443ae30, request=0x3cf4ab0, core_loc_conf=, loc_conf=, parent_span_context=0x0) at /tmp/nginx-opentracing/opentracing/src/request_tracing.cpp:101
#10 0x00000000005563da in __gnu_cxx::new_allocator<ngx_opentracing::RequestTracing>::construct<ngx_opentracing::RequestTracing, ngx_http_request_s*&, ngx_http_core_loc_conf_s*&, ngx_opentracing::opentracing_loc_conf_t*&> (__p=, this=0x3df76f0) at /usr/include/c++/4.8.2/ext/new_allocator.h:120
#11 std::allocator_traits<std::allocator<ngx_opentracing::RequestTracing> >::_S_construct<ngx_opentracing::RequestTracing, ngx_http_request_s*&, ngx_http_core_loc_conf_s*&, ngx_opentracing::opentracing_loc_conf_t*&> (__p=, __a=...) at /usr/include/c++/4.8.2/bits/alloc_traits.h:254
#12 std::allocator_traits<std::allocator<ngx_opentracing::RequestTracing> >::construct<ngx_opentracing::RequestTracing, ngx_http_request_s*&, ngx_http_core_loc_conf_s*&, ngx_opentracing::opentracing_loc_conf_t*&> (__p=, __a=...) at /usr/include/c++/4.8.2/bits/alloc_traits.h:393
#13 std::vector<ngx_opentracing::RequestTracing, std::allocator<ngx_opentracing::RequestTracing> >::_M_emplace_back_aux<ngx_http_request_s*&, ngx_http_core_loc_conf_s*&, ngx_opentracing::opentracing_loc_conf_t*&> (this=this@entry=0x3df76f0) at /usr/include/c++/4.8.2/bits/vector.tcc:408
#14 0x0000000000555f73 in std::vector<ngx_opentracing::RequestTracing, std::allocator<ngx_opentracing::RequestTracing> >::emplace_back<ngx_http_request_s*&, ngx_http_core_loc_conf_s*&, ngx_opentracing::opentracing_loc_conf_t*&> (this=0x3df76f0) at /usr/include/c++/4.8.2/bits/vector.tcc:101
#15 ngx_opentracing::OpenTracingContext::OpenTracingContext (this=0x3df76f0, request=0x3cf4ab0, core_loc_conf=0x1fc4cb8, loc_conf=0x1fc63e0) at /tmp/nginx-opentracing/opentracing/src/opentracing_context.cpp:21
#16 0x00000000005552ce in ngx_opentracing::on_enter_block (request=0x3cf4ab0) at /tmp/nginx-opentracing/opentracing/src/opentracing_handler.cpp:39
#17 0x000000000046508c in ngx_http_core_rewrite_phase (r=0x3cf4ab0, ph=) at src/http/ngx_http_core_module.c:932
#18 0x0000000000460a85 in ngx_http_core_run_phases (r=r@entry=0x3cf4ab0) at src/http/ngx_http_core_module.c:878
#19 0x0000000000460b6c in ngx_http_handler (r=r@entry=0x3cf4ab0) at src/http/ngx_http_core_module.c:861
#20 0x000000000046b583 in ngx_http_process_request (r=r@entry=0x3cf4ab0) at src/http/ngx_http_request.c:2106
#21 0x000000000046bac1 in ngx_http_process_request_headers (rev=rev@entry=0x7f931f99d6d0) at src/http/ngx_http_request.c:1508
#22 0x000000000046be14 in ngx_http_process_request_line (rev=0x7f931f99d6d0) at src/http/ngx_http_request.c:1175
#23 0x00000000004543a7 in ngx_epoll_process_events (cycle=, timer=, flags=) at src/event/modules/ngx_epoll_module.c:901
#24 0x000000000044b673 in ngx_process_events_and_timers (cycle=cycle@entry=0x1f4ca60) at src/event/ngx_event.c:257
#25 0x0000000000452722 in ngx_worker_process_cycle (cycle=cycle@entry=0x1f4ca60, data=data@entry=0x0) at src/os/unix/ngx_process_cycle.c:782
#26 0x00000000004510c0 in ngx_spawn_process (cycle=cycle@entry=0x1f4ca60, proc=proc@entry=0x4526b0 <ngx_worker_process_cycle>, data=data@entry=0x0, name=name@entry=0x5a21dd "worker process", respawn=respawn@entry=-3) at src/os/unix/ngx_process.c:199
#27 0x0000000000452b7c in ngx_start_worker_processes (cycle=cycle@entry=0x1f4ca60, n=1, type=type@entry=-3) at src/os/unix/ngx_process_cycle.c:382
#28 0x0000000000453328 in ngx_master_process_cycle (cycle=cycle@entry=0x1f4ca60) at src/os/unix/ngx_process_cycle.c:135
#29 0x000000000042afa6 in main (argc=, argv=) at src/core/nginx.c:386
```



```
(gdb) info args
__functor = @0x7fff545d4f80: {_M_unused = {_M_object = 0x4443080, _M_const_object = 0x4443080, _M_function_pointer = 0x4443080, _M_member_pointer = (void (std::_Undefined_class::*)(std::_Undefined_class * const)) 0x4443080}, _M_pod_data = "\200\060D\004", '\000' <repeats 11 times>}
__args#0 = @0x3cf5718: {data = 0x7265737574736f68 <error: Cannot access memory at address 0x7265737574736f68>, length = 7161132899992297773}
__args#1 =
```


```
(gdb) info register
rax            0x4                 4
rbx            0x7fff545d4e70      140734608789104
rcx            0x4                 4
rdx            0x3cf5718           63919896
rsi            0x7fff545d4f80      140734608789376
rdi            0x7fff545d4e70      140734608789104
rbp            0x3cf4b20           0x3cf4b20
rsp            0x7fff545d4e20      0x7fff545d4e20
r8             0x3d6974c           64395084
r9             0xe                 14
r10            0x70                112
r11            0x7f934399a5ca      140270471062986
r12            0x7fff545d4f00      140734608789248
r13            0x3d88d10           64523536
r14            0x7fff545d4f80      140734608789376
r15            0x5528d0            5581008
rip            0x7f9342372a1b      0x7f9342372a1b <std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&)+11>
eflags         0x10202             [ IF RF ]
cs             0x33                51
ss             0x2b                43
ds             0x0                 0
es             0x0                 0
fs             0x0                 0
gs             0x0                 0
k0             0x0                 0
k1             0x0                 0
k2             0x0                 0
k3             0x0                 0
k4             0x0                 0
k5             0x0                 0
k6             0x0                 0
k7             0x0                 0
```



```
(gdb) disas
Dump of assembler code for function std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&):
   0x00007f9342372a10 <+0>:     push   %rbx
   0x00007f9342372a11 <+1>:     mov    %rcx,%rax
   0x00007f9342372a14 <+4>:     mov    %rdi,%rbx
   0x00007f9342372a17 <+7>:     sub    $0x20,%rsp
=> 0x00007f9342372a1b <+11>:    mov    (%rax),%r8
   0x00007f9342372a1e <+14>:    mov    0x8(%rax),%r9
   0x00007f9342372a22 <+18>:    mov    %fs:0x28,%rcx
   0x00007f9342372a2b <+27>:    mov    %rcx,0x18(%rsp)
   0x00007f9342372a30 <+32>:    xor    %ecx,%ecx
   0x00007f9342372a32 <+34>:    mov    (%rsi),%rsi
   0x00007f9342372a35 <+37>:    mov    0x8(%rdx),%rcx
   0x00007f9342372a39 <+41>:    mov    %rsp,%rdi
   0x00007f9342372a3c <+44>:    mov    (%rdx),%rdx
   0x00007f9342372a3f <+47>:    callq  0x7f93423721d0 <datadog::opentracing::SpanContext::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)>::operator()(opentracing::v3::string_view, opentracing::v3::string_view) const>
   0x00007f9342372a44 <+52>:    movzbl (%rsp),%eax
   0x00007f9342372a48 <+56>:    test   %al,%al
   0x00007f9342372a4a <+58>:    mov    %al,(%rbx)
   0x00007f9342372a4c <+60>:    jne    0x7f9342372a58 <std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&)+72>
   0x00007f9342372a4e <+62>:    movdqu 0x8(%rsp),%xmm0
   0x00007f9342372a54 <+68>:    movups %xmm0,0x8(%rbx)
   0x00007f9342372a58 <+72>:    mov    0x18(%rsp),%rcx
   0x00007f9342372a5d <+77>:    xor    %fs:0x28,%rcx
   0x00007f9342372a66 <+86>:    mov    %rbx,%rax
   0x00007f9342372a69 <+89>:    jne    0x7f9342372a71 <std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&)+97>
   0x00007f9342372a6b <+91>:    add    $0x20,%rsp
   0x00007f9342372a6f <+95>:    pop    %rbx
   0x00007f9342372a70 <+96>:    retq
   0x00007f9342372a71 <+97>:    callq  0x7f934234f1f0 __stack_chk_fail@plt
End of assembler dump.
```


Thanks in advance

Dhivyakumaresan, Jul 21 '22 18:07

It looks like you're using the dd-opentracing-cpp OpenTracing plugin with Kong.

The crash appears to be caused by memory corruption: the `std::function` object that is invoked for each HTTP header of the incoming request refers to an invalid address. The underlying issue most likely lies elsewhere, and span extraction is just where the damage becomes visible.
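For context, frames #0 through #4 are the plugin's header-iteration path: the nginx module's carrier (`NgxHeaderCarrierReader::ForeachKey`) walks the request headers and invokes a `std::function` callback supplied by `SpanContext::deserialize`. Schematically it looks something like the snippet below; this is a simplified sketch with made-up names for illustration, not the actual plugin source:

```cpp
// Simplified illustration of the call chain in frames #0-#4 (made-up names,
// not the actual nginx-opentracing / dd-opentracing-cpp code).
#include <opentracing/propagation.h>  // opentracing::expected, string_view

#include <functional>
#include <string>
#include <utility>
#include <vector>

using opentracing::expected;
using opentracing::string_view;

// The tracer builds a callback of this shape in SpanContext::deserialize
// (frame #5) and hands it to the carrier.
using HeaderCallback = std::function<expected<void>(string_view, string_view)>;

// Stand-in for the carrier's ForeachKey (frames #2-#4): walk the request's
// header list and invoke the callback once per header.
expected<void> for_each_header(
    const std::vector<std::pair<std::string, std::string>>& headers,
    const HeaderCallback& callback) {
  for (const auto& header : headers) {
    // Frame #0 of the backtrace is this invocation. The fault means that
    // something reachable from this call (the callback's stored state or the
    // key/value memory it is handed) is no longer a valid address.
    auto result = callback(header.first, header.second);
    if (!result) return result;
  }
  return {};
}

int main() {
  // A trivial callback; the real one parses Datadog/B3 propagation headers.
  HeaderCallback callback = [](string_view, string_view) -> expected<void> {
    return {};
  };
  for_each_header({{"host", "example.com"}, {"x-datadog-trace-id", "123"}},
                  callback);
}
```

Frame #0 in your backtrace corresponds to the `callback(...)` call in a loop like that one; frames #2 through #4 are the nginx module's side of the same loop.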

In order to debug this, I'd need to be able to reproduce the crash.

Is the crash happening with any regularity? If so, have you found a way to trigger the crash?
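In the meantime, if you can capture the exact set of request headers from a crashing request, one thing worth trying is feeding them straight into `Tracer::Extract` from a small standalone program, outside of nginx/Kong. Below is a rough, untested sketch: the header values, service name, and carrier type are placeholders, and it assumes default `TracerOptions` pointing at a local agent. It may well not reproduce the problem, since the corruption could depend on how nginx manages request memory, but a clean run would itself be a useful data point:

```cpp
// Hypothetical standalone repro attempt (a sketch, not tested): feed a captured
// set of request headers straight into span-context extraction, bypassing
// nginx/Kong entirely. Header values below are placeholders.
#include <datadog/opentracing.h>
#include <opentracing/propagation.h>

#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>

using opentracing::expected;
using opentracing::string_view;

// Minimal in-memory carrier standing in for the nginx header carrier.
struct MapReader : opentracing::TextMapReader {
  explicit MapReader(const std::unordered_map<std::string, std::string>& m)
      : map(m) {}

  expected<void> ForeachKey(
      std::function<expected<void>(string_view, string_view)> f) const override {
    for (const auto& entry : map) {
      auto result = f(entry.first, entry.second);
      if (!result) return result;
    }
    return {};
  }

  const std::unordered_map<std::string, std::string>& map;
};

int main() {
  datadog::opentracing::TracerOptions options;  // defaults: localhost:8126
  options.service = "extract-repro";            // placeholder service name

  auto tracer = datadog::opentracing::makeTracer(options);

  // Replace these with the exact headers of a request that crashed a worker.
  std::unordered_map<std::string, std::string> headers{
      {"x-datadog-trace-id", "1234567890"},
      {"x-datadog-parent-id", "9876543210"},
      {"x-datadog-sampling-priority", "1"},
  };

  auto span_context = tracer->Extract(MapReader{headers});
  std::cout << (span_context ? "extraction succeeded" : "extraction failed")
            << std::endl;
}
```

If that snippet extracts the captured headers without issue, it would suggest the corruption originates on the nginx/Kong side rather than in the header values themselves.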

dgoffredo, Aug 19 '22 14:08