dd-opentracing-cpp
seg fault errors during span extraction
Hi,
We are seeing the segfaults below. I have included the GDB backtrace, info args, info registers, and disas output from the core dump. Could someone take a look and offer some insight into why the error occurs? We suspect it is caused by missing span headers, but it could be something else.
We are using the Datadog tracer.
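For reference, this is roughly how span extraction over request headers is driven (a minimal sketch using the public opentracing-cpp and dd-opentracing-cpp APIs; the `SimpleCarrier` class, the service name, and the header values are illustrative placeholders, not our actual Kong/nginx configuration):

```cpp
// Minimal sketch, not production code: extract a span context from a set of
// request headers. Leaving the x-datadog-* keys out of the map simulates the
// "missing span headers" case.
#include <datadog/opentracing.h>
#include <opentracing/propagation.h>

#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>

// Toy TextMapReader over an in-memory header map (placeholder for the real
// nginx header carrier).
class SimpleCarrier : public opentracing::TextMapReader {
 public:
  explicit SimpleCarrier(
      const std::unordered_map<std::string, std::string>& headers)
      : headers_(headers) {}

  opentracing::expected<void> ForeachKey(
      std::function<opentracing::expected<void>(opentracing::string_view,
                                                opentracing::string_view)>
          f) const override {
    for (const auto& header : headers_) {
      auto was_successful = f(header.first, header.second);
      if (!was_successful) return was_successful;
    }
    return {};
  }

 private:
  const std::unordered_map<std::string, std::string>& headers_;
};

int main() {
  datadog::opentracing::TracerOptions options;
  options.service = "example-service";  // illustrative service name
  auto tracer = datadog::opentracing::makeTracer(options);

  // Propagation headers as they might arrive on an incoming request.
  std::unordered_map<std::string, std::string> headers{
      {"x-datadog-trace-id", "123"},
      {"x-datadog-parent-id", "456"},
  };

  SimpleCarrier carrier{headers};
  auto span_context = tracer->Extract(carrier);
  if (!span_context) {
    std::cerr << "extract failed: " << span_context.error().message() << "\n";
  } else if (*span_context == nullptr) {
    std::cerr << "no span context found in headers\n";
  } else {
    std::cerr << "extracted a span context\n";
  }
  tracer->Close();
}
```

The kernel log entry for the crash: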
Jul 18 18:32:45 kong-enterprise-data-i-0d8579c86c063a722 kernel: nginx[14525]: segfault at 4 ip 00007f9342372a1b sp 00007fff545d4e20 error 4 in libdd_opentracing_plugin.so[7f9342320000+f0000]
GDB output:
```
[dh-api-admin@kong-enterprise-data-i-0d8579c86c063a722 cores]$ sudo gdb /usr/local/openresty/nginx/sbin/nginx ./core.14525
GNU gdb (GDB) Red Hat Enterprise Linux 8.0.1-36.amzn2.0.1
Copyright (C) 2017 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later http://gnu.org/licenses/gpl.html
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.  Type "show copying" and "show warranty" for details.
This GDB was configured as "x86_64-redhat-linux-gnu".
Type "show configuration" for configuration details.
For bug reporting instructions, please see:
http://www.gnu.org/software/gdb/bugs/.
Find the GDB manual and other documentation resources online at:
http://www.gnu.org/software/gdb/documentation/.
For help, type "help".
Type "apropos word" to search for commands related to "word"...
Reading symbols from /usr/local/openresty/nginx/sbin/nginx...done.
[New LWP 14525]
[New LWP 14526]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
Core was generated by `nginx: worker process '.
Program terminated with signal SIGSEGV, Segmentation fault.
#0  std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&) (__functor=..., __args#0=..., __args#1=...) at /usr/include/c++/7/bits/std_function.h:302
warning: Source file is more recent than executable.
302                 std::forward<_ArgTypes>(__args)...);
[Current thread is 1 (Thread 0x7f9344d4f780 (LWP 14525))]
Missing separate debuginfos, use: debuginfo-install kong-enterprise-edition-2.8.1.1-1.noarch

(gdb) bt
#0  std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&) (__functor=...,
    __args#0=..., __args#1=...) at /usr/include/c++/7/bits/std_function.h:302
#1  0x0000000000552950 in std::function<opentracing::v3::expected<void, std::error_code> (opentracing::v3::string_view, opentracing::v3::string_view)>::operator()(opentracing::v3::string_view, opentracing::v3::string_view) const (__args#1=...,
    __args#0=..., this=0x7fff545d4f80) at /usr/include/c++/4.8.2/functional:2471
#2  ngx_opentracing::(anonymous namespace)::NgxHeaderCarrierReader::__lambda0::operator() (header=..., __closure=

(gdb) info args
__functor = @0x7fff545d4f80: {_M_unused = {_M_object = 0x4443080, _M_const_object = 0x4443080, _M_function_pointer = 0x4443080, _M_member_pointer = (void (std::_Undefined_class::*)(std::_Undefined_class * const)) 0x4443080},
  _M_pod_data = "\200\060D\004", '\000' <repeats 11 times>}
__args#0 = @0x3cf5718: {data = 0x7265737574736f68 <error: Cannot access memory at address 0x7265737574736f68>, length = 7161132899992297773}
__args#1 =

(gdb) info register
rax            0x4                 4
rbx            0x7fff545d4e70      140734608789104
rcx            0x4                 4
rdx            0x3cf5718           63919896
rsi            0x7fff545d4f80      140734608789376
rdi            0x7fff545d4e70      140734608789104
rbp            0x3cf4b20           0x3cf4b20
rsp            0x7fff545d4e20      0x7fff545d4e20
r8             0x3d6974c           64395084
r9             0xe                 14
r10            0x70                112
r11            0x7f934399a5ca      140270471062986
r12            0x7fff545d4f00      140734608789248
r13            0x3d88d10           64523536
r14            0x7fff545d4f80      140734608789376
r15            0x5528d0            5581008
rip            0x7f9342372a1b      0x7f9342372a1b <std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&)+11>
eflags         0x10202             [ IF RF ]
cs             0x33                51
ss             0x2b                43
ds             0x0                 0
es             0x0                 0
fs             0x0                 0
gs             0x0                 0
k0             0x0                 0
k1             0x0                 0
k2             0x0                 0
k3             0x0                 0
k4             0x0                 0
k5             0x0                 0
k6             0x0                 0
k7             0x0                 0

(gdb) disas
Dump of assembler code for function std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&):
   0x00007f9342372a10 <+0>:     push   %rbx
   0x00007f9342372a11 <+1>:     mov    %rcx,%rax
   0x00007f9342372a14 <+4>:     mov    %rdi,%rbx
   0x00007f9342372a17 <+7>:     sub    $0x20,%rsp
=> 0x00007f9342372a1b <+11>:    mov    (%rax),%r8
   0x00007f9342372a1e <+14>:    mov    0x8(%rax),%r9
   0x00007f9342372a22 <+18>:    mov    %fs:0x28,%rcx
   0x00007f9342372a2b <+27>:    mov    %rcx,0x18(%rsp)
   0x00007f9342372a30 <+32>:    xor    %ecx,%ecx
   0x00007f9342372a32 <+34>:    mov    (%rsi),%rsi
   0x00007f9342372a35 <+37>:    mov    0x8(%rdx),%rcx
   0x00007f9342372a39 <+41>:    mov    %rsp,%rdi
   0x00007f9342372a3c <+44>:    mov    (%rdx),%rdx
   0x00007f9342372a3f <+47>:    callq  0x7f93423721d0 <datadog::opentracing::SpanContext::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)>::operator()(opentracing::v3::string_view, opentracing::v3::string_view) const>
   0x00007f9342372a44 <+52>:    movzbl (%rsp),%eax
   0x00007f9342372a48 <+56>:    test   %al,%al
   0x00007f9342372a4a <+58>:    mov    %al,(%rbx)
   0x00007f9342372a4c <+60>:    jne    0x7f9342372a58 <std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&)+72>
   0x00007f9342372a4e <+62>:    movdqu 0x8(%rsp),%xmm0
   0x00007f9342372a54 <+68>:    movups %xmm0,0x8(%rbx)
   0x00007f9342372a58 <+72>:    mov    0x18(%rsp),%rcx
   0x00007f9342372a5d <+77>:    xor    %fs:0x28,%rcx
   0x00007f9342372a66 <+86>:    mov    %rbx,%rax
   0x00007f9342372a69 <+89>:    jne    0x7f9342372a71 <std::_Function_handler<opentracing::v3::expected<void, std::error_code>(opentracing::v3::string_view, opentracing::v3::string_view), datadog::opentracing::SpanContext::deserialize(std::shared_ptr<const datadog::opentracing::Logger>, const opentracing::v3::TextMapReader&, const datadog::opentracing::HeadersImpl&)::<lambda(opentracing::v3::string_view, opentracing::v3::string_view)> >::_M_invoke(const std::_Any_data &, opentracing::v3::string_view &&, opentracing::v3::string_view &&)+97>
   0x00007f9342372a6b <+91>:    add    $0x20,%rsp
   0x00007f9342372a6f <+95>:    pop    %rbx
   0x00007f9342372a70 <+96>:    retq
   0x00007f9342372a71 <+97>:    callq  0x7f934234f1f0 <__stack_chk_fail@plt>
End of assembler dump.
```
Thanks in advance
It looks like you're using the dd-opentracing-cpp OpenTracing plugin with Kong.
The crash looks like memory corruption: the std::function object that is invoked for each HTTP header of an incoming request refers to an invalid address. The corruption itself most likely originates somewhere else, not in the header-iteration code where the crash surfaces.
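To make the backtrace easier to map onto the code, here is a rough sketch of the call chain it shows. The names below are simplified stand-ins, not the actual plugin or tracer source:

```cpp
// Simplified stand-ins for the frames in the backtrace, not the real code.
#include <opentracing/propagation.h>

#include <functional>
#include <string>
#include <utility>
#include <vector>

namespace ot = opentracing;

// Frame #2: the nginx plugin's header carrier walks the request headers and
// hands each (name, value) pair to the callback supplied by the tracer.
struct HeaderCarrierReader : ot::TextMapReader {
  std::vector<std::pair<std::string, std::string>> headers;

  ot::expected<void> ForeachKey(
      std::function<ot::expected<void>(ot::string_view, ot::string_view)>
          callback) const override {
    for (const auto& header : headers) {
      // Frames #1 and #0: calling the std::function goes through its stored
      // target, which in the real code is the lambda created inside
      // SpanContext::deserialize. If that function object, or the string_view
      // arguments it receives, point at freed or corrupted memory, this call
      // faults the way the core dump shows.
      auto result = callback(header.first, header.second);
      if (!result) return result;
    }
    return {};
  }
};

int main() {
  HeaderCarrierReader reader;
  reader.headers = {{"x-datadog-trace-id", "123"}};

  // Stand-in for the deserialize lambda: it just copies out what it sees.
  std::vector<std::string> seen;
  auto status = reader.ForeachKey(
      [&](ot::string_view key, ot::string_view value) -> ot::expected<void> {
        seen.emplace_back(std::string{key.data(), key.size()} + ": " +
                          std::string{value.data(), value.size()});
        return {};
      });
  return status ? 0 : 1;
}
```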
In order to debug this, I'd need to be able to reproduce the crash.
Is the crash happening with any regularity? If so, have you found a way to trigger the crash?