Sunday, July 20, 2014

A Lightly Technical Post on Gaining Access to Lost AWS EC2 Cloud Compute Machines

Data Loss Background 

Last Sunday a friend of mine contacted me and let me know I had to move all my crap off the ancient machine he had been hosting for me.  It was a machine I had given him in the 90's... a Sun Ultra 10.  Ancient.  Unfortunately I had several websites and the content for this site hosted on it.

I quickly made an account with Amazon Web Services (AWS) and spun up a "free" machine, got the ssh keys, and then tar-gz'd everything and copied it all over to the new EC2 instance.  Nearly 2GB of data, much of which was old family photos.  I also pulled down a copy to my local desktop, and he then nuked the machine.  All was well, and I had a decent amount of work ahead of me to recreate some web servers to host it.


Then within four hours my desktop failed.  Hard.  It turned out to be the motherboard.  I had backed up almost all the data in two places, but with one stroke I had lost access to everything: the local copy was on the dead desktop, and so were the ssh keys needed to get into my AWS instance.


I will most likely buy a replacement desktop and slot those drives in to get the data back, but that isn't happening this month.


Necessity being the mother of invention, I figured out how to recover access to the cloud instance.  It has been a couple of decades since I did any heavy unix sysadmin work, and I'm no AWS expert, so it took me a few hours to piece together.  Luckily this was pretty light stuff.  Here's how.



Gaining Access to Your EBS EC2 Instance Without the SSH Keys

1. Make a new EC2 host in the same AWS account.  Ensure it is in the same availability zone as the host with the lost keys.  (When spinning up, the availability zone is determined by the subnet you pick in the configuration step.)  Make sure you use keys that you have available. ;-)
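
I did all of this through the web console, but if you'd rather script it, the AWS CLI should do the same job.  The AMI id, key name, and subnet id below are placeholders, not what I actually used:

aws ec2 run-instances --image-id ami-xxxxxxxx --instance-type t1.micro \
    --key-name my-replacement-key --subnet-id subnet-xxxxxxxx --count 1

Picking the subnet is what pins the new instance to a particular availability zone.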

2. Stop the original EC2 host with lost keys.  Wait 'til it's stopped.  And do not accidentally TERMINATE it.

3. Go to Volumes in the console.  Note the attachment information for both the new host and the lost-keys host.  Copy-paste it somewhere.

4. Detach the EBS volume from the host with the lost keys.  Wait until its state changes to "available".
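
For the CLI-inclined, steps 2 through 4 look roughly like this (instance and volume ids are placeholders):

aws ec2 stop-instances --instance-ids i-xxxxxxxx
aws ec2 describe-volumes --volume-ids vol-xxxxxxxx   # note the attachment info
aws ec2 detach-volume --volume-id vol-xxxxxxxx
aws ec2 describe-volumes --volume-ids vol-xxxxxxxx   # repeat until the state shows "available"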

5. ssh in to the replacement host and make a mount point:
sudo -u root mkdir /mnt/lost-key-volume
6. Attach the volume to the replacement host by selecting it in the console and attaching it (right click or menu pull-down) to the new host's instance id.
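
The CLI equivalent is something like this (placeholder ids again; /dev/sdf is the device name the console suggests):

aws ec2 attach-volume --volume-id vol-xxxxxxxx --instance-id i-xxxxxxxx --device /dev/sdf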

7. Once it's attached, the console will show the attachment information.  Note where it is attached, e.g. /dev/sdf.

7a. Note: Newer Linux kernels may rename your devices to /dev/xvdf through /dev/xvdp internally, even when the device name entered here (and shown in the details) is /dev/sdf through /dev/sdp.

7b. If you're using Ubuntu, the kernel won't necessarily expose the block device as /dev/sdf.  Type 'lsblk' to see what's there and 'df' or 'mount' to see what is already mounted.  You can also 'ls -lt /dev/' to spot recently created device nodes.  Use what lsblk gives you.

ubuntu@ip-172-31-45-8:/dev$ lsblk
NAME    MAJ:MIN RM SIZE RO TYPE MOUNTPOINT
xvda    202:0    0   8G  0 disk
+-xvda1 202:1    0   8G  0 part /
xvdf    202:80   0   8G  0 disk
+-xvdf1 202:81   0   8G  0 part


7c. So in my case /dev/sdf means I want to mount /dev/xvdf1:
sudo -u root mount /dev/xvdf1 /mnt/lost-key-volume

8. Now go in there and add your known public ssh key to the authorized_keys file.  Don't change the perms of anything in the .ssh directory.  If you're feeling adventurous, you can do something like cd'ing into the mounted FS, navigating to the ubuntu user's .ssh directory, and running: 'cat ~/.ssh/authorized_keys >> authorized_keys'
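
Spelled out with full paths, and assuming the lost-keys host was a stock Ubuntu AMI (so its home directory shows up under home/ubuntu on the mounted volume), it's something like this; add sudo if the file ownership on the old volume doesn't line up with your user:

cd /mnt/lost-key-volume/home/ubuntu/.ssh
cat ~/.ssh/authorized_keys >> authorized_keys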

9. Verify the authorized_keys looks good, or you'll have to repeat a lot of work!
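
Since this is the step that saves you from repeating everything, a quick eyeball check is cheap (same hypothetical Ubuntu paths as above; the last line of the first file should match your new public key):

cat /mnt/lost-key-volume/home/ubuntu/.ssh/authorized_keys
cat ~/.ssh/authorized_keys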


10. Unmount it cleanly: 'sudo -u root umount -d /dev/xvdf1'

11. Detach it using the console.


12. Attach it to the original host at the original device name.  In my case, my original device was /dev/sda1.  If you don't use the right device name your instance will not boot.
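
Same attach call as before if you're on the CLI, just pointed at the original host and the original device name (ids are placeholders):

aws ec2 attach-volume --volume-id vol-xxxxxxxx --instance-id i-yyyyyyyy --device /dev/sda1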

13. Once it's attached, boot your original host.  Note: for whatever reason (maybe just because I was typing out steps and this was #13) my original host didn't start the first time around.  I decided to stop the temporary host I had used to mount the EBS volume, and then I was able to start it up.  There might be something fishy in the EBS Volumes manager.
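
The CLI version, if you're going that route (placeholder id):

aws ec2 start-instances --instance-ids i-yyyyyyyy
aws ec2 describe-instances --instance-ids i-yyyyyyyy   # repeat until the state shows "running"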

14. ssh in using the key you just added to the authorized_keys file and enjoy access to your old host!
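
Assuming an Ubuntu AMI, the login user is ubuntu; the key file name and hostname here are placeholders:

ssh -i ~/.ssh/my-replacement-key.pem ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com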


That shows how reasonably easy it is to move an EBS volume around from host to host.  It also shows why you should have two factor authentication (2FA) on your AWS Web Console login: anyone who gets into the console can do exactly what I just did, so ssh keys only get you so far.


I'll get the title fixed when I get around to setting up a webserver.

