
Gluster 11 upgrade failure

Open xmarcux opened this issue 2 years ago • 3 comments

Upgrading from gluster 10.3 to 11.

Description of problem: Upgraded the arbiter node (brick3) first; after the upgrade only glusterd started, not glusterfsd.

Running: gluster volume status

Status of volume: gv0
Gluster process                             TCP Port  RDMA Port  Online  Pid
------------------------------------------------------------------------------
Brick host3:/brick3                         N/A       N/A        N       N/A
Self-heal Daemon on localhost               N/A       N/A        N       N/A

Task Status of Volume gds-common
------------------------------------------------------------------------------
There are no active volume tasks 

Looking at the logs, the volume checksums do not match between the nodes.
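
To see the mismatch directly, the volume definition that glusterd checksums can be compared between nodes (a minimal sketch, assuming the default glusterd working directory and SSH access from the arbiter to host1; not part of the original report):

# compare the stored volume checksum and definition with another node
cat /var/lib/glusterd/vols/gv0/cksum
diff <(ssh host1 cat /var/lib/glusterd/vols/gv0/info) /var/lib/glusterd/vols/gv0/info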

Discovered the following on the arbiter node in the file /var/lib/glusterd/vols/gv0/info: the parameter "nfs.disable=on" was added by the upgrade and made the checksum fail. I removed "nfs.disable=on" on host3 and all three nodes connected fine.
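
For reference, the workaround described above boils down to the following on the affected node (a hedged sketch, not an official procedure; edit the file with glusterd stopped and keep a backup):

# stop glusterd, drop the option line the upgrade added, then restart
systemctl stop glusterd
cp /var/lib/glusterd/vols/gv0/info /var/lib/glusterd/vols/gv0/info.bak
sed -i '/^nfs\.disable=on$/d' /var/lib/glusterd/vols/gv0/info
systemctl start glusterd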

Upgraded one of the other nodes; no changes were made to the /var/lib/glusterd/vols/gv0/info file there, so the arbiter node and the recently upgraded node kept contact.

I upgraded the last node, and on this node the parameter "nfs.disable=on" was again added to /var/lib/glusterd/vols/gv0/info. I removed "nfs.disable=on", restarted glusterd, and the entire cluster was up and running the way it should.
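
Afterwards the cluster state can be verified with the standard commands; all three bricks should show Y in the Online column and no pending heals:

gluster peer status
gluster volume status gv0
gluster volume heal gv0 info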

The operating system / glusterfs version: Gluster 10.3 -> 11.0

Gluster info:

Volume Name: gv0
Status: Started
Snapshot Count: 0
Number of Bricks: 1 x (2 + 1) = 3
Transport-type: tcp
Bricks:
Brick1: host1:/brick1
Brick2: host2:/brick2
Brick3: host3:/brick3 (arbiter)
Options Reconfigured:
cluster.granular-entry-heal: on
storage.fips-mode-rchecksum: on
transport.address-family: inet
nfs.disable: on
performance.client-io-threads: off

OS on all nodes: Debian Bullseye (11)

xmarcux · Feb 28 '23 10:02

Thank you for your contributions. Noticed that this issue has not had any activity in the last ~6 months! We are marking this issue as stale because it has not had recent activity. It will be closed in 2 weeks if no one responds with a comment here.

stale[bot] · Oct 15 '23 13:10