Adding a new master node to the existing cluster fails
Description: Tried to add a new Contiv master node to an existing cluster with 3 master nodes and 7 worker nodes. The master node addition failed; however, adding a new node to the existing cluster as a worker succeeded.
Requirement: This is a test case for scaling the master nodes: grow from a 3-node HA setup (currently 3 master nodes in high-availability mode) to a 5-node HA setup so that the cluster can sustain two master node failures.
Steps to reproduce the issue:
- Copy the ssh key to the new node to be added to the existing cluster.
- Add the node details to the existing cfg.yml, with the master role called out explicitly, for example:

  ```yaml
  10.65.122.71:
    role: master
    control: eno1
    data: eno2
  ```
- Run the installer. The installer fails at the v2 plugin installation task.
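For context, the installer's cfg.yml maps each node address to its role and interfaces. A minimal sketch of the file after the new entry is added might look like the following; the top-level `CONNECTION_INFO` key follows the Contiv installer's config layout, and the first address is a hypothetical existing master (only `10.65.122.71` comes from this report):

```yaml
CONNECTION_INFO:
  10.65.122.64:      # hypothetical address of an existing master
    role: master
    control: eno1
    data: eno2
  10.65.122.71:      # new node, master role called out explicitly
    role: master
    control: eno1
    data: eno2
```

With this in place, rerunning the installer is expected to skip the already-provisioned nodes and act only on the new entry.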
Expected behavior: The installer skips the install steps on all nodes except the new one, and the new node is added to the existing 3 masters in the cluster.
Please find the logs attached. contiv_install_11-13-2017.07-46-17.UTC.log host_inventory.log
@sisudhir we don't have automation to scale additional master nodes right now. This will be a feature enhancement.