
dht: Fix double free issue in the cbk function dht_common_mark_mdsxattr_cbk

Open · mohit84 opened this issue 3 years ago • 1 comment

During a fresh lookup on a directory, DHT sets and heals the mds xattr on the directory. The callback dht_common_mark_mdsxattr_cbk destroys frame->root even though the same frame has been passed to dht_selfheal_dir_setattr to heal the xattr, so the stack ends up being destroyed twice. The bug has existed since the day the feature was implemented, but it was not caught until now: after moving to tcmalloc it became easy to catch, because a client process crashed. Whether the double free crashes is allocator-specific behavior; we cannot expect every allocator to abort. As the free(3) man page says, the behavior is undefined if free() has already been called on the pointer.
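To make the ownership problem concrete, here is a minimal self-contained sketch of the bug class, assuming a simplified frame type. This is not the actual glusterfs code; `struct frame`, `heal_continue()` and `fixed_cbk()` are hypothetical stand-ins for the real frame and callback types:

```c
/* Sketch of the ownership rule the fix has to enforce.
 * NOT glusterfs code; all names here are hypothetical. */
#include <stdio.h>
#include <stdlib.h>
#include <stdbool.h>

struct frame {
    int op_ret;
};

/* Continuation that takes ownership of the frame (playing the role of
 * dht_selfheal_dir_setattr in the report) and frees it when done. */
static void
heal_continue(struct frame *frame)
{
    printf("healing, op_ret=%d\n", frame->op_ret);
    free(frame);                /* the continuation destroys the frame */
}

static void
fixed_cbk(struct frame *frame, bool heal_needed)
{
    if (heal_needed) {
        heal_continue(frame);   /* ownership is transferred here... */
        return;                 /* ...so the callback must NOT free the
                                 * frame again. The reported bug freed it
                                 * here anyway: a double free, which is
                                 * undefined behavior per free(3). */
    }
    free(frame);                /* only destroy when not handed off */
}

int
main(void)
{
    struct frame *frame = calloc(1, sizeof(*frame));
    if (!frame)
        return 1;
    frame->op_ret = -1;
    fixed_cbk(frame, true);
    return 0;
}
```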

It produces the following crash:

#0  0x00007f864a7332f5 in STACK_DESTROY (stack=0x7f86deadc0de00) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/libglusterfs/src/glusterfs/stack.h:183
#1  0x00007f864a736c65 in dht_common_mark_mdsxattr_cbk (frame=0x15f15b0, cookie=0x108a850, this=0x111c050, op_ret=-1, op_errno=107, xdata=0x0) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/xlators/cluster/dht/src/dht-common.c:780
#2  0x00007f864aa4546e in client4_0_setxattr_cbk (req=0x7f864bc7a070, iov=0x0, count=0, myframe=0x1579c70) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/xlators/protocol/client/src/client-rpc-fops_v2.c:856
#3  0x00007f864a9e6bdb in client_submit_request (this=0x108a850, req=0x7f864bc7a340, frame=0x1579c70, prog=0x7f864ac7ad80 <clnt4_0_fop_prog>, procnum=17, cbkfn=0x7f864aa44f5c <client4_0_setxattr_cbk>, cp=0x0, xdrproc=0x7f86600c540d <xdr_gfx_setxattr_req>) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/xlators/protocol/client/src/client.c:252
#4  0x00007f864aa59d47 in client4_0_setxattr (frame=0x1579c70, this=0x108a850, data=0x7f864bc7a430) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/xlators/protocol/client/src/client-rpc-fops_v2.c:4308
#5  0x00007f864a9ed4b4 in client_setxattr (frame=0x1579c70, this=0x108a850, loc=0x1561258, dict=0x156ae50, flags=0, xdata=0x0) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/xlators/protocol/client/src/client.c:1353
#6  0x00007f864a737b2d in dht_common_mark_mdsxattr (frame=0x15f15b0, errst=0x0, mark_during_fresh_lookup=0) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/xlators/cluster/dht/src/dht-common.c:920
#7  0x00007f864a720473 in dht_selfheal_dir_mkdir (frame=0x15f15b0, loc=0x1561258, layout=0x13b89f0, force=0) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/xlators/cluster/dht/src/dht-selfheal.c:1509
#8  0x00007f864a72230c in dht_selfheal_directory (frame=0x15f15b0, dir_cbk=0x7f864a734ea9 <dht_lookup_selfheal_cbk>, loc=0x1561258, layout=0x13b89f0) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/xlators/cluster/dht/src/dht-selfheal.c:2182
#9  0x00007f864a73bc2e in dht_revalidate_cbk (frame=0x15f15b0, cookie=0x1088450, this=0x111c050, op_ret=0, op_errno=0, inode=0x1093c50, stbuf=0x7f864bc7a9e0, xattr=0x1584350, postparent=0x7f864bc7a940) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/xlators/cluster/dht/src/dht-common.c:1787
#10 0x00007f864aa50a63 in client4_0_lookup_cbk (req=0x1359390, iov=0x13593c8, count=1, myframe=0x15ddc70) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/xlators/protocol/client/src/client-rpc-fops_v2.c:2664
#11 0x00007f86602e89d5 in rpc_clnt_handle_reply (clnt=0x1122a10, pollin=0x1082f50) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/rpc/rpc-lib/src/rpc-clnt.c:764
#12 0x00007f86602e8f04 in rpc_clnt_notify (trans=0x106bbd0, mydata=0x1122a40, event=RPC_TRANSPORT_MSG_RECEIVED, data=0x1082f50) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/rpc/rpc-lib/src/rpc-clnt.c:931
#13 0x00007f86602e4f37 in rpc_transport_notify (this=0x106bbd0, event=RPC_TRANSPORT_MSG_RECEIVED, data=0x1082f50) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/rpc/rpc-lib/src/rpc-transport.c:547
#14 0x00007f864c33bcfa in socket_event_poll_in (this=0x106bbd0, notify_handled=true) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/rpc/rpc-transport/socket/src/socket.c:2631
#15 0x00007f864c33cd12 in socket_event_handler (fd=16, idx=3, gen=1, data=0x106bbd0, poll_in=1, poll_out=0, poll_err=0, event_thread_died=0 '\000') at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/rpc/rpc-transport/socket/src/socket.c:3040
#16 0x00007f86605c0834 in event_dispatch_epoll_handler (event_pool=0x1056050, event=0x7f864bc7afec) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/libglusterfs/src/event-epoll.c:656
#17 0x00007f86605c0d24 in event_dispatch_epoll_worker (data=0x1068ec0) at /home/jenkins/root/workspace/periodic-regression-RHGS-3.5.7-on-RHEL8/libglusterfs/src/event-epoll.c:769
#18 0x00007f865ece214a in start_thread () from ./lib64/libpthread.so.0
#19 0x00007f865e52bdc3 in clone () from ./lib64/libc.so.6
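Since whether the double free crashes depends on the allocator (tcmalloc happened to abort; glibc malloc may silently corrupt the heap instead), a deterministic way to catch this class of bug is AddressSanitizer. A hypothetical standalone demo, not taken from the report:

```c
/* double_free_demo.c -- hypothetical demo, not glusterfs code.
 * Build: gcc -g -fsanitize=address double_free_demo.c -o demo
 * Run:   ./demo
 * ASan aborts with an "attempting double-free" report showing both free
 * call stacks, instead of relying on the allocator to notice. */
#include <stdlib.h>

int
main(void)
{
    void *p = malloc(32);
    free(p);
    free(p);    /* second free of the same pointer: ASan flags it here */
    return 0;
}
```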

For more details, please refer to https://gluster-downstream-jenkins-csb-storage.apps.ocp4.prod.psi.redhat.com/job/periodic-regression-RHGS-3.5.7-on-RHEL8/18/consoleFull

mohit84 · Feb 01 '22 04:02

Thank you for your contributions. We noticed that this issue has not had any activity in the last ~6 months, so we are marking it as stale. It will be closed in 2 weeks if no one responds with a comment here.

stale[bot] · Sep 21 '22 00:09

Closing this issue, as there has been no update since my last comment. If this issue is still valid, feel free to reopen it.

stale[bot] · Oct 22 '22 18:10