rocky-tools
Migrate EC2 Instance
Not sure if this is a dumb question or not, but would migrate2rocky theoretically work for an AWS EC2 instance that was started from a CentOS 8 image, or is there something EC2 does behind the scenes that would definitely cause issues?
It should work. We migrated a large chunk of our CentOS 8 infra to Rocky 8 in AWS and haven't had issues. Wouldn't hurt to take backups/snapshots, though.
I definitely will take backups and snapshots. Thanks for the info.
I made an image first, but the migration seems to have worked great. I just had to manually remount one EFS drive afterwards; other than that, all my tests pass.
That's great to hear! As we're always looking to improve migrate2rocky, can you elaborate on the issue with the EFS drive?
I am migrating another instance and will take better notes. But after the restart, the EFS drive did not mount, even though it was in the fstab.
Before trying anything else, I reinstalled this tool (https://github.com/aws/efs-utils) and then performed an `umount` followed by a `mount`, and it mounted properly. On the next migration (same exact config) I will check the logs and try remounting right away if I hit this issue again, to see if that alone works.
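The remount itself was just something like this (a sketch, assuming /user_content as the EFS mount point, matching the log below):

```sh
# Unmount the EFS share that failed at boot (if it's half-mounted)
sudo umount /user_content

# Remount everything from /etc/fstab, which picks the EFS entry back up
sudo mount -a
```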
Error message after rebooting:
```
Dec 31 02:11:00 cioapp103 systemd[1]: Started amazon-efs-mount-watchdog.
Dec 31 02:11:00 cioapp103 kernel: FS-Cache: Loaded
Dec 31 02:11:00 cioapp103 kernel: FS-Cache: Netfs 'nfs' registered for caching
Dec 31 02:11:00 cioapp103 kernel: Key type dns_resolver registered
Dec 31 02:11:00 cioapp103 systemd[1]: user_content.mount: Mount process exited, code=exited status=1
Dec 31 02:11:00 cioapp103 systemd[1]: user_content.mount: Failed with result 'exit-code'.
Dec 31 02:11:00 cioapp103 systemd[1]: Failed to mount /user_content.
Dec 31 02:11:00 cioapp103 systemd[1]: Dependency failed for Remote File Systems.
Dec 31 02:11:00 cioapp103 systemd[1]: remote-fs.target: Job remote-fs.target/start failed with result 'dependency'.
```
I confirmed that the EFS drive won't mount until I manually reinstall the efs-utils package. My installation steps are:
```sh
git clone https://github.com/aws/efs-utils
sudo dnf install git make rpm-build
cd efs-utils
sudo make rpm
sudo dnf install ./build/amazon-efs-utils*rpm
sudo mount -a
```
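For reference, the fstab entry uses the efs mount helper that this package provides; it looks something like the following (the filesystem ID here is a placeholder):

```sh
# /etc/fstab entry for the EFS share, using the efs mount helper
# fs-0123456789abcdef0 stands in for the real filesystem ID
fs-0123456789abcdef0:/ /user_content efs _netdev,tls 0 0
```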
I can see why this happened. Ten days ago, Rocky Linux support was officially merged into amazon-efs-utils. I wonder whether updating amazon-efs-utils before you migrate would avoid the issue altogether.
If you have another one and can try what Luis suggested, I'll wait on those results and then add a note to the known issues section of the README for others who come across this.
Yes, that solves the issue. I had only been updating efs-utils after the migration, once I ran into the issue; updating efs-utils before the migration works without issue.
Edit: Pretty convenient timing for me!
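So the working order of operations is roughly this sketch (same build steps as above, just run before the migration, and assuming migrate2rocky.sh has already been downloaded from this repo):

```sh
# 1. Update amazon-efs-utils first (now that it includes Rocky Linux support)
git clone https://github.com/aws/efs-utils
cd efs-utils
sudo make rpm
sudo dnf install ./build/amazon-efs-utils*rpm
cd ..

# 2. Then run the migration (-r performs the conversion to Rocky) and reboot
sudo ./migrate2rocky.sh -r
sudo reboot
```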
Thanks. I'll make a note of it under known issues: both the recommendation to update efs-utils before migrating and the fix of reinstalling it afterwards (updating it afterwards will probably work as well).