top | item 11637602

My first DDoS attack for a $200 ransom

96 points| LaurentGh | 9 years ago |ghirardotti.fr

66 comments

[+] tyingq|9 years ago|reply
Roughly, a somewhat lackluster response to a somewhat lackluster DDoS attempt.

They tried blocking specific IP addresses, which didn't work because the attack was somewhat distributed. They then just turned on some caching, which allowed the site to function, albeit with an unknown excess bandwidth charge pending.

And the DDoS itself can't have been terribly impressive, as all it took to mitigate it was a bit of caching. He mentions 10 requests/sec as the scale of the attack.

[+] tyingq|9 years ago|reply
Thinking on this some more, this story makes even less sense.

He first mentions having to change Apache to recognize X-Forwarded-For, because there is Amazon Elastic Load Balancing between his site and the internet.

This means, of course, that the "attacking ips" aren't making direct connections to his EC2 instance. They are proxied connections, all from the internal ELB service.

So later, when he mentions trying to use iptables to block traffic...that just doesn't make sense. There are no connections from those ips to the EC2 instance. You could use .htaccess rules, since Apache is aware of X-Forwarded-For.

Lastly...why would you put an elastic load balancer in front of a single web server?
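The .htaccess route mentioned above could be sketched like this (hedged example, Apache 2.4 syntax assumed; the attacker address is a placeholder):

    # flag requests whose X-Forwarded-For header contains the offending IP
    SetEnvIf X-Forwarded-For "203\.0\.113\.7" blocked
    # allow everyone except requests carrying that flag
    <RequireAll>
        Require all granted
        Require not env blocked
    </RequireAll>

This works where iptables can't, because the match happens at the HTTP layer where the ELB-forwarded header is visible.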

[+] SomeCallMeTim|9 years ago|reply
I was shocked that 12 requests/second could take down any site.

I use async logic (previously OpenResty, more recently NodeJS and Go) and largely pregenerated sites, so 2500 requests/second is a minimum baseline -- on a much lower end instance than an m4.xlarge.

There's a reason I don't use PHP (or any primarily synchronous language like Ruby) any more.

[+] ultramancool|9 years ago|reply
This is an amazingly weak DDoS; put your site behind CloudFlare or a similar free service and go take a nap. They'll tank this without raising an eyebrow.
[+] ninjakeyboard|9 years ago|reply
Probably because I've been playing an MMO, but I like the use of the word 'tank' here.
[+] LaurentGh|9 years ago|reply
Yep, true, it's planned. But sometimes their captcha page tends to block some legitimate traffic...

It's not that impressive because every day we read articles about the crazy DDoS attacks big companies are able to mitigate. But when it's the website you're responsible for, whatever the number of requests/sec, you just need to find a way to manage it, and CloudFlare can have some weird side effects.

[+] adrianpike|9 years ago|reply
> 40 cores [m4.10xlarge], but still unable to process 10 requests/sec

my goodness.

[+] cpncrunch|9 years ago|reply
That's PHP for you. Although I use PHP myself quite often, it can be a resource hog if you're lazy about optimization. A customer I was working with was using WordPress, and their homepage took about 5 seconds to load due to a hideously inefficient WordPress module that was running the exact same SQL query thousands of times! With a little bit of optimization I managed to get it down to about 1 or 2 seconds.

For my own sites, I mostly use static html or server-parsed html.

[+] otto_ortega|9 years ago|reply
Ummmm... A cache layer for any web application is a must-have; perhaps he could have avoided the attack altogether if it had been present on the system since day one?

That holds for this kind of attack, at least; a more serious DDoS won't be tamed by "just adding cache".
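For reference, a basic cache layer can live in Apache itself. A hedged sketch, assuming mod_cache and mod_cache_disk are loaded (the path and TTL are illustrative):

    # serve cached copies of responses straight from disk,
    # before most of the request-processing pipeline runs
    CacheQuickHandler on
    CacheEnable disk /
    CacheRoot "/var/cache/apache2/mod_cache_disk"
    # cache responses for 5 minutes unless headers say otherwise
    CacheDefaultExpire 300

With something like this in place, repeated GETs of the same page never reach PHP at all.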

[+] cortesoft|9 years ago|reply
Well, when your DDOS is '10 requests a second', your site is probably not the most sophisticated.
[+] woud420|9 years ago|reply
For next time, so you don't have to copy and paste. No need for sed.

    cat <file> | cut -d ' ' -f1 | sort | uniq -c | sort -nr

[+] jnpatel|9 years ago|reply
I'd suggest replacing the use of cat with:

    tail -n 10000 apache.log
[+] xlucas|9 years ago|reply
no need for cat
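Indeed, `cut` can read the file directly. A minimal sketch against a hypothetical log (combined-log format assumed, with the client IP in the first field; the IPs are placeholders):

```shell
# build a tiny sample access log (placeholder addresses, abbreviated lines)
printf '%s\n' \
  '198.51.100.1 - - "GET / HTTP/1.1" 200' \
  '203.0.113.9 - - "GET / HTTP/1.1" 200' \
  '198.51.100.1 - - "GET / HTTP/1.1" 200' > access.log

# no cat needed: cut takes the filename itself;
# sort | uniq -c counts hits per IP, sort -nr puts the busiest first
cut -d ' ' -f1 access.log | sort | uniq -c | sort -nr
```

The top line of the output is the most frequent client IP with its request count.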
[+] jasonlfunk|9 years ago|reply
Apparently, it didn't work. :)

"Site not installed. The site ghirardotti.fr is not yet installed"

[Edit: it's up now.]

[+] jasonlfunk|9 years ago|reply
Though, it is interesting how an article with a dead link made it onto the frontpage with 3 points.
[+] st78|9 years ago|reply
Well, a typical SLA for the server side is 500 ms; then you have a chance of loading the whole page in under 3 seconds, which is what Google's usability findings recommend.

villa-bali is not even close to this; my bet is that you (or your ORM) are making too many requests to the database. Try recording ALL requests to the database during page rendering and I bet you'll find about a hundred. Check out the following test results:

8 test agents: http://loadme.socialtalents.com/Result/ViewById/57341f645b5f... - 5% of users have to wait more than 2 seconds

16 test agents: http://loadme.socialtalents.com/Result/ViewById/57341f1a5b5f... - 5% of users have to wait more than 4 seconds

Definitely, any bot can nuke your website easily.

[+] cft|9 years ago|reply
How come the original post has 55 upvotes, but the karma of the original poster is only 18 (6:33 PM GMT)?
[+] kornish|9 years ago|reply
Users receive one karma for one comment upvote, but one submission upvote yields less than one karma for the user.
[+] raverbashing|9 years ago|reply
I wonder what would happen if GET / only returned a redirect to somewhere else (either an HTTP redirect code, or an HTML page with window.location='http://yoursite.com/new_page').
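The cheap-redirect idea could be sketched in Apache config like this (mod_alias assumed; the target URL is a placeholder):

    # answer GET / with a lightweight 302 instead of rendering the page,
    # so each attacking request costs almost nothing server-side
    RedirectMatch 302 ^/$ http://yoursite.com/new_page

A naive bot hammering / would just collect redirects it never follows, while real browsers land on the actual page.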
[+] placeybordeaux|9 years ago|reply
> 40 cores, but still unable to process 10 requests/sec

Stopped reading after that.