Yet another SYN flood to contend with, and yet again, it's the same extortionists that took this company for quite a bit of money last year (and I have to wonder—how do they note this in their financials? “Unexpected hiring of Russian Security Consultants”?) and “promised” protection against such attacks for (I think) a year or so. Once it was brought to their attention, the attack subsided.
But in the meantime, I was tasked with moving several of the larger sites still on the Boca Raton servers to the ones down in Miami. The intent of the Miami servers is for each to act as a backup of the other (and the Boca Raton servers will eventually act as a backup for the Miami servers), so in the process of creating the accounts needed on the two servers, I made a slight mistake. Nothing bad, like an errant rm -rf * or attempting to restart the network remotely. Nope. Just a simple overwriting of /etc/passwd with the wrong file.
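In hindsight, a cheap guard would have caught the mistake before it landed. Here's a minimal sketch of such a guard (my own, not anything from the post): keep a backup of the target and refuse the swap unless the replacement file still looks like a sane /etc/passwd. The function name and checks are illustrative.

```shell
#!/bin/sh
# Hypothetical helper: replace a critical file only after backing it up
# and sanity-checking the replacement first.
safe_replace() {
    target=$1
    newfile=$2

    # Keep a backup right next to the target before touching anything.
    cp -p "$target" "$target.bak" || return 1

    # Refuse the swap unless the replacement still has a root entry,
    # a cheap check that would have caught copying the wrong file.
    if ! grep -q '^root:' "$newfile"; then
        echo "refusing: no root entry in $newfile" >&2
        return 1
    fi

    cp "$newfile" "$target"
}
```

Even just the `.bak` copy would have turned a drive to Miami into a one-line fix.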
Nothing major. It just meant that no one could log into the system. I didn't notice until I attempted to copy over some more files and they failed. Or rather, I think scp or rsync started asking for passwords even though I had explicitly set up a trust mechanism between the two servers on their private network interfaces. I started poking around on the server with the munged /etc/passwd file and it became quite apparent what happened.
Fortunately, I was still logged into the server with the munged /etc/passwd file.
Unfortunately, I was not root. Nor could I become root. This was not good.
Can't ssh since the authentication was blown. Which meant that scp wouldn't work either. I thought maybe rsync would work, but then I realized I had set up rsync to use ssh, and since authentication didn't work … (not that I realized this until after trying rsync).
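For reference, the sort of key-based trust being described usually looks something like the following; the addresses, user, and paths here are illustrative, not the actual setup. The catch is that key authentication still needs a valid account on the remote side, so a munged /etc/passwd breaks all three tools at once:

```shell
# Illustrative only: key-based trust over the private interfaces,
# so scp/rsync between the two servers never prompt for a password.
ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa     # generate a passwordless key pair
ssh-copy-id backup@10.0.0.2                  # install the public key on the peer
rsync -av -e ssh /www/sites/ backup@10.0.0.2:/www/sites/
```

Since ssh, scp, and rsync-over-ssh all funnel through the same authentication step, losing the password file on the peer takes down all of them together.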
That's when I realized that a trip down to Miami might be required. Several hours' worth of driving for less than a minute's worth of work to restore /etc/passwd. It was then I had a brainstorm … why not hack my way back to root? It wasn't illegal—I was, after all, the administrator for the system, and I had local access, which would make it easier than a remote exploit.
One Google [1] search later, and I'm perusing 0day-exploits [2]. Downloaded a few, got the code to the borked server, compile, run, and nothing. Download another one, get it on the server, compile, run, and nothing.
Damn you, Gentoo [3], I thought, and your custom compilation installs! I can't even hack my way back into the system!
I suppose I could have kept at it, but there comes a point of diminishing returns, which would be the time it would take me to drive to Miami, reboot the server into single-user mode, restore the file, reboot, and drive back home. The drive and reboot was the simpler solution in this case (if a bit tedious); had the server been on the other side of the country, then yes, maybe I would have stuck with the hacking attempt a bit longer …