It really annoys me when links on the HN front page get so much traffic that they overload the site's server. What about a website/service that caches all the pages on the HN front page and discards them X amount of time after they drop off the front page? If you think this is a good idea, my co-worker and I might build something like this.
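The idea above can be sketched roughly as a TTL cache keyed by URL: pages stay cached while they are on the front page, and are evicted some fixed time after they drop off. A minimal sketch (the class name, the one-hour default TTL, and the injectable `fetch` hook are all assumptions, not an existing implementation):

```python
import time
import urllib.request

class FrontPageCache:
    def __init__(self, ttl=3600, fetch=None):
        self.ttl = ttl
        # fetch is injectable so the cache can be exercised without a network
        self.fetch = fetch or (lambda url: urllib.request.urlopen(url, timeout=10).read())
        self.store = {}  # url -> (body, expires_at); expires_at is None while on the front page

    def refresh(self, frontpage_urls):
        """Call periodically with the current list of front-page URLs."""
        now = time.time()
        for url in frontpage_urls:
            body = self.store[url][0] if url in self.store else self.fetch(url)
            self.store[url] = (body, None)  # on the front page: no expiry
        for url, (body, expires) in list(self.store.items()):
            if url not in frontpage_urls:
                if expires is None:
                    self.store[url] = (body, now + self.ttl)  # start the eviction clock
                elif now >= expires:
                    del self.store[url]  # "throw away" once the TTL has elapsed

    def get(self, url):
        entry = self.store.get(url)
        return entry[0] if entry else None
```

A real service would also need conditional re-fetching and size limits, but the eviction policy described in the comment is just the two-phase check in `refresh`.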
edent|12 years ago|reply
I've been hit several times by being on the front page. By my estimate, that's worth around 700 requests per hour (http://shkspr.mobi/blog/2012/11/whats-the-front-page-of-hack...). I don't think that's excessive, and with the above in place, my bog-standard WordPress install has never fallen over.
*I'm aware not all stories are submitted by their writer - but I consider the above to be best practice for any competent website owner.
petercooper|12 years ago|reply
On the other hand, it would be awesome if HN could periodically check whether a page is still responsive and, if not, show a cached version until it's back up. But given that the underlying software doesn't get many updates anyway, I doubt we'd see anything like this soon.
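The fallback described here is a simple try-live-then-serve-cached pattern. A rough sketch (the function name and the dict-shaped cache are assumptions for illustration):

```python
import urllib.request
import urllib.error

def fetch_with_fallback(url, cache, fetch=None, timeout=5):
    """Return (body, source) where source is "live" or "cached"."""
    if fetch is None:
        def fetch(u):
            with urllib.request.urlopen(u, timeout=timeout) as resp:
                return resp.read()
    try:
        body = fetch(url)
        cache[url] = body  # refresh the stored copy on every successful hit
        return body, "live"
    except (urllib.error.URLError, OSError):
        # origin is down or unreachable: fall back to whatever we have
        return cache.get(url), "cached"
```

The design choice worth noting is that the cache is refreshed on every successful fetch, so the fallback copy is always the most recent version that actually loaded.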
dutchbrit|12 years ago|reply
People would be better off optimizing their site/server. It's a good lesson to see what happens when your site gets a boost in traffic.
gkoberger|12 years ago|reply
As for that, I think most of those sites are probably cached already -- a service could easily just link to the Google cache.
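Linking to the Google cache amounts to building one URL per story. A sketch using the `webcache.googleusercontent.com` endpoint that existed when this thread was written (Google has since retired page caching, so treat this as illustrative):

```python
from urllib.parse import quote

def google_cache_url(url):
    # The target URL must be percent-encoded so it survives as a query value
    return ("http://webcache.googleusercontent.com/search?q=cache:"
            + quote(url, safe=""))
```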
moreentropy|12 years ago|reply
It's better to struggle with the hacker news treatment early on and learn to cope with traffic spikes than have the same problems when your site gets mainstream coverage.