Well, it had to happen…
Yesterday evening two of our Linux boxes were exploited.
I had to try it out for myself, and yes, it really does work. 😐
Booted up my Ubuntu in Parallels, installed build-essential, compiled the exploit and ran it:
sudo apt-get install build-essential
gcc what-ever-the-file-name-is.c
./a.out
This is what it looks like:
I’m pretty sure this doesn’t require any more explanations 😉
While checking the logs of one of my websites I noticed something rather weird.
Some person (18.104.22.168) with User-Agent/browser “Mozilla/4.0 (compatible; MSIE 4.0; Windows NT; ……/1.0 )” was downloading _all_ files from my website, totally ignoring robots.txt and requesting pages without providing a referrer.
This seemed quite odd and didn’t look like a decent/real search robot. It kept requesting files every 3-4 seconds for about an hour. Decent search bots try to spread their load out over time, and wouldn’t request about 1000 files (1.6 GB) at once.
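A quick way to put numbers on that from the access log (the log path and the combined log format are assumptions here, adjust to your setup):

# count its requests
grep -c "18.104.22.168" /var/log/apache2/access.log
# rough total it downloaded (field 10 is the response size in the combined log format)
grep "18.104.22.168" /var/log/apache2/access.log | awk '{ bytes += $10 } END { print bytes/1024/1024 " MB" }'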
So, well, I banned it.
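For what it’s worth, a ban like that can go straight into the firewall; a minimal sketch, assuming iptables is available (the IP is the one from the logs above):

# drop every packet coming from the offending IP
sudo iptables -A INPUT -s 18.104.22.168 -j DROP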
Whilst reading over “Chapter I: The Internet” in my Hypermedia course, I couldn’t help noticing a few things:
- All IPs of local PCs in a local network begin with “192.168”
- DNS is a service on the internet that links hard-to-remember IPs to easy-to-remember domain names
- A PC with a Pentium processor and at least 64 MB of RAM is needed to be able to connect to the internet.
This is when I start to wonder why I even bother to go to school…