davidtclin58
Trying to validate this issue on `v1.3.0-rc3`. This time, after powering off the node machine, I waited around 1 hour before powering it back on, like the previous test result in https://github.com/harvester/harvester/issues/4033#issuecomment-1963336920...
As discussed, closing this issue; we will track it in [#5109](https://github.com/harvester/harvester/issues/5109) since they are all caused by the same root cause.
We encountered a similar issue on Rancher v2.7.9 with Harvester v1.1.3-rc1. While scaling down the RKE2 cluster from 3 nodes to 2 nodes, the first node was stuck in the `Reconciling` state ...
Verified **fixed** on `v1.3-df661762-head` (02/06).

### Result

$\color{green}{\textsf{PASS}}$ UI and webhook validation on the `auto-rotate-rke2-certs` setting $~~$

1. In K9s, we can find `auto-rotate-rke2-certs` in the settings:

   ```
   # Please...
   ```
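For reference, the same setting can also be checked from a node shell with kubectl instead of K9s; this is a minimal sketch, assuming the setting is exposed through Harvester's `settings.harvesterhci.io` CRD:

```
# List the setting and print its current value (assumes kubeconfig is available on the node).
kubectl get settings.harvesterhci.io auto-rotate-rke2-certs
kubectl get settings.harvesterhci.io auto-rotate-rke2-certs -o jsonpath='{.value}'
```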
Thanks @FrankYang0529 for the information. After discussion, we should add the value to the Harvester setting in the following format:

```
value: '{"enable":true,"expiringInHours":8759}'
```

We will continue to retest...
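For anyone retesting this, a minimal sketch of applying that value with `kubectl patch` (assuming the setting is served by the `settings.harvesterhci.io` CRD; the inner JSON has to be escaped because the setting's `value` field is a plain string):

```
# Merge-patch the setting value; the escaped string matches the format shown above.
kubectl patch settings.harvesterhci.io auto-rotate-rke2-certs \
  --type merge \
  -p '{"value": "{\"enable\":true,\"expiringInHours\":8759}"}'
```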
Verified **fixed** according to the test result in https://github.com/harvester/harvester/issues/3863#issuecomment-1930450378. @bk201, @FrankYang0529, before closing this issue, I have a question about whether we have a plan to provide the on Harvester...
Verified **fixed** on `v1.3-50a00ad1-head` (24/02/20). Closing this issue.

### Result

$\color{green}{\textsf{PASS}}$ Clean install: bump Rancher to 2.8.2 and RKE2 to 1.27.10 $~~$

```
harv1:~ # kubectl get settings.management.cattle.io server-version
NAME...
```
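As a companion check (a hedged sketch, not part of the original output above), the bumped RKE2 version can also be confirmed from the node list:

```
# The VERSION column should report the expected RKE2 release, e.g. v1.27.10+rke2r1
# (the exact +rke2rX revision suffix may vary).
kubectl get nodes -o wide
```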
Verifying on `v1.2.2-rc1` and `v1.3-head` (24/04/23).

### Result

$\color{green}{\textsf{PASS}}$ 3 nodes: can upgrade from v1.2.1 to v1.2.2-rc1 when all the VMs are shut down from the OS $~~$

1. When we log in...
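Before triggering the upgrade, a quick hedged sketch of confirming from the CLI that every guest is really powered off (these are KubeVirt's VirtualMachine resources; nothing here is taken from this run):

```
# All VirtualMachines should show a Stopped status before the upgrade starts.
kubectl get vm -A
```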
Verified failed on a 4-node Harvester cluster while upgrading from v1.2.1 to v1.2.2-rc with the VMs shut down from the OS. The upgrade process is stuck on the last node in `Pre-draining`. Check the pre-drain...
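A hedged sketch of the checks that help when a node hangs in `Pre-draining`; the resource names below are the usual Harvester/Rancher provisioning objects and are assumptions rather than output from this run:

```
# Upgrade CR status maintained by Harvester's upgrade controller.
kubectl -n harvester-system get upgrades.harvesterhci.io -o yaml
# Machine plan secrets carry the rke.cattle.io pre-drain/drain annotations.
kubectl -n fleet-local get secrets | grep machine-plan
```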
The 4-node environment that encountered the issue is the dolphin cluster. I will also keep it for further investigation.