glusterfs
Fix the md-cache bug that returns erroneous attributes
md-cache: xlators/performance/md-cache/src/md-cache.c
Problem: In a distributed-replicate volume, when a single host node goes down or a brick goes offline, md-cache returns an erroneous attribute (stale file handle) when an NFS client creates a directory and then immediately looks it up.
Solution: If the attributes are a return from AFR consistent-metadata, do not cache them (see the illustrative sketch below).
Fixes: #4117
Signed-off-by: Hu Ping [email protected]
Change-Id: I00b72fb9b0dc73eebefdd8110947883774cc14c9
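To make the caching rule concrete, here is a small, self-contained C illustration. The struct fake_iatt and the helper should_cache_postparent are hypothetical names invented for this sketch only; the ia_nlink check mirrors the condition the patch applies to the postparent iatt in mdc_mkdir_cbk, but this is not the GlusterFS code itself.

/* Illustrative only: a stand-alone model of the caching rule in this patch.
 * fake_iatt and should_cache_postparent are hypothetical; in the patch the
 * check is done on the postparent iatt inside mdc_mkdir_cbk (see diff below). */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

struct fake_iatt {
    uint32_t ia_nlink; /* link count; per the patch description, AFR
                          consistent-metadata returns 0 here when a
                          brick is down */
};

/* Cache the parent attributes only when they look fully populated. */
static bool
should_cache_postparent(const struct fake_iatt *postparent)
{
    if (postparent == NULL)
        return false;
    /* ia_nlink == 0 marks attributes returned without consistent metadata;
     * caching them leads to stale file handle errors on the next lookup,
     * so skip the cache update. */
    if (postparent->ia_nlink == 0)
        return false;
    return true;
}

int
main(void)
{
    struct fake_iatt healthy = { .ia_nlink = 2 };
    struct fake_iatt degraded = { .ia_nlink = 0 };

    printf("healthy:  cache? %s\n", should_cache_postparent(&healthy) ? "yes" : "no");
    printf("degraded: cache? %s\n", should_cache_postparent(&degraded) ? "yes" : "no");
    return 0;
}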
We sincerely hope that you two experts can help us pre-evaluate whether this patch is effective in fixing the erroneous attributes returned by the cache layer. @amarts @avati
Can one of the admins verify this patch?
CLANG-FORMAT FAILURE: Before merging the patch, this diff needs to be considered for passing clang-format
index 090a1b1b3..325b9af22 100644
--- a/xlators/performance/md-cache/src/md-cache.c
+++ b/xlators/performance/md-cache/src/md-cache.c
@@ -1617,7 +1617,7 @@ mdc_mkdir_cbk(call_frame_t *frame, void *cookie, xlator_t *this, int32_t op_ret,
     if (local->loc.parent) {
         if (postparent && postparent->ia_nlink == 0) {
-            //return from afr consistent-metadata, do not cache
+            // return from afr consistent-metadata, do not cache
             goto out;
         }
         mdc_inode_iatt_set(this, local->loc.parent, postparent,
/run regression
1 test(s) failed ./tests/00-geo-rep/00-georep-verify-non-root-setup.t
0 test(s) generated core
https://build.gluster.org/job/gh_centos7-regression/3291/
@huping-chinamobile a patch to the release-6 branch won't be accepted (there won't be any new release based on that branch). You should send the patch to the devel branch.
Thank you very much for the suggestion; the patch has been submitted to the devel branch.
Thank you for your contributions. We noticed that this issue has had no activity in the last ~6 months, so we are marking it as stale. It will be closed in 2 weeks if no one responds with a comment here.
Closing this issue as there has been no update since my last comment. If the issue is still valid, feel free to reopen it.