
ubuntu - Locked myself out of SSH with UFW in EC2 AWS

I have an EC2 instance running Ubuntu. I ran sudo ufw enable and then allowed only the MongoDB port:

sudo ufw allow 27017

When the SSH connection dropped, I couldn't reconnect.



1 Reply


# Update

The easiest way is to update the instance's user data:

  • Stop your instance

  • Right-click (Windows) or Ctrl+click (Mac) on the instance to open the context menu, then go to Instance Settings -> Edit User Data, or select the instance and go to Actions -> Instance Settings -> Edit User Data

    If you're still on the old AWS console, select the instance, go to Actions -> Instance Settings -> View/Change User Data

Then paste this:

Content-Type: multipart/mixed; boundary="//"
MIME-Version: 1.0

--//
Content-Type: text/cloud-config; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Disposition: attachment; filename="cloud-config.txt"

#cloud-config
cloud_final_modules:
- [scripts-user, always]

--//
Content-Type: text/x-shellscript; charset="us-ascii"
MIME-Version: 1.0
Content-Transfer-Encoding: 7bit
Content-Disposition: attachment; filename="userdata.txt"

#!/bin/bash
ufw disable
iptables -L
iptables -F
--//
  • Once added, restart the instance, and SSH should work again. The user data disables ufw if it is enabled and also flushes any iptables rules that are blocking SSH access.
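
If you prefer the AWS CLI over the console, a roughly equivalent sequence is sketched below. The instance ID is a placeholder, user-data.txt is assumed to contain the multipart content above, and depending on your CLI version and configuration the file may need to be base64-encoded before being passed to modify-instance-attribute.

      $ aws ec2 stop-instances --instance-ids i-0123456789abcdef0
      $ aws ec2 wait instance-stopped --instance-ids i-0123456789abcdef0
      $ aws ec2 modify-instance-attribute --instance-id i-0123456789abcdef0 \
            --attribute userData --value file://user-data.txt
      $ aws ec2 start-instances --instance-ids i-0123456789abcdef0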

Source here

# Old Answer

Detach the problem instance's volume and fix it using another instance:

  • Launch a new instance (recovery instance).

  • Stop the original instance (DO NOT TERMINATE)

  • Detach the volume (problem volume) from the original instance

  • Attach it to the recovery instance as /dev/sdf.

  • Login to the recovery instance via ssh/putty

  • Run sudo lsblk to display the attached volumes and confirm the name of the problem volume. It usually begins with /dev/xvdf; mine is /dev/xvdf1.

  • Mount problem volume.

      $ sudo mount /dev/xvdf1 /mnt
      $ cd /mnt/etc/ufw
    
  • Open ufw configuration file

      $ sudo vim ufw.conf
    
  • Press i to edit the file.

  • Change ENABLED=yes to ENABLED=no

  • Press Esc, then type :wq and press Enter to save the file and quit. (A non-interactive alternative is sketched after this list.)

  • Display the contents of the ufw config file using the command below and ensure that ENABLED=yes has been changed to ENABLED=no:

      $ sudo cat ufw.conf 
    
  • Unmount volume

      $ cd ~
      $ sudo umount /mnt
    
  • Detach the problem volume from the recovery instance and re-attach it to the original instance as /dev/sda1.

  • Start the original instance and you should be able to log back in.
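
If you would rather make the ufw.conf change non-interactively instead of editing it in vim, a one-liner like the following should do the same edit while the volume is still mounted at /mnt (a sketch of the same change):

      $ sudo sed -i 's/^ENABLED=yes/ENABLED=no/' /mnt/etc/ufw/ufw.conf
      $ grep ENABLED /mnt/etc/ufw/ufw.conf    # should now show ENABLED=no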

Source: here
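
Whichever method you use, once you can SSH in again, allow SSH before re-enabling ufw so the lockout doesn't happen a second time. A minimal sketch (the OpenSSH application profile is assumed to exist; otherwise allow 22/tcp directly):

      $ sudo ufw allow OpenSSH      # or: sudo ufw allow 22/tcp
      $ sudo ufw allow 27017
      $ sudo ufw enable
      $ sudo ufw status verbose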

