top | item 33680852

10 KB Club: Curated list of websites whose home pages do not exceed 10 KB size

180 points | susam | 3 years ago | 10kbclub.com | reply

82 comments

[+] londons_explore|3 years ago|reply
I don't care about how big a page is. I care how fast it loads.

And some of these pages fail that test badly. For example, sdf.org takes a whopping 1.60 seconds for the home page GET request to return a single byte.

I'd like to see a new leaderboard of 'instant' pages. An instant page is one where a typical user click has the result fully rendered inside 100 milliseconds (almost imperceptible latency).
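A "first byte" check like the one above can be sketched with Python's standard library. This spins up a throwaway localhost server rather than hitting a real site such as sdf.org, so the numbers only illustrate the measurement technique, not any real page's latency:

```python
import socket
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class TinyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "2")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # keep the demo quiet

# Throwaway local server on an ephemeral port (stand-in for a real site).
server = HTTPServer(("127.0.0.1", 0), TinyHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Time-to-first-byte: clock from sending the request until the first
# response byte arrives on the wire.
with socket.create_connection(("127.0.0.1", port)) as sock:
    start = time.perf_counter()
    sock.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n")
    sock.recv(1)
    ttfb_ms = (time.perf_counter() - start) * 1000

server.shutdown()
print(f"TTFB: {ttfb_ms:.2f} ms")
```

Against a remote server, the same raw-socket timing captures DNS-free connect plus server think time, which is the number the comment is complaining about.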

[+] onion2k|3 years ago|reply
An instant page is one where a typical user click has the result fully rendered inside 100 milliseconds (almost imperceptible latency).

That's an incredibly hard target to reach. Establishing a secure connection from the browser to a server requires several round trips: 1 for the DNS lookup, 1 for the TCP handshake, 2 to establish the TLS connection, and 1 for the HTTP request. If you divide your 100 ms perf budget equally between them, your servers can be no more than about 1,800 miles away from the user: (100 ms × c) / 2 (for the round trip) / 5 round trips. Using TLS 1.3 removes one round trip, which is definitely a good optimization. That's before you've even started any server-side rendering, session handling, DB connections, etc.
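The distance arithmetic works out like this (a back-of-envelope sketch; it uses the vacuum speed of light, and since light in fiber is roughly 30% slower, the real budget is even tighter):

```python
# Back-of-envelope check of the ~1,800-mile figure (assumed numbers).
C_MILES_PER_MS = 186.3   # speed of light: ~186,300 miles per second
budget_ms = 100
round_trips = 5          # DNS + TCP + 2x TLS + HTTP request

per_trip_ms = budget_ms / round_trips   # 20 ms per round trip
one_way_ms = per_trip_ms / 2            # 10 ms each way
max_distance_miles = one_way_ms * C_MILES_PER_MS
print(f"max server distance: {max_distance_miles:.0f} miles")
```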

Even with a really good edge network that's going to be incredibly hard to achieve.

I'd love it if every website served a page in 100ms, but it's never going to happen. I got https://ooer.com down to below 500ms for myself (495ms last time I checked) after a fair bit of effort. Getting any lower would be a waste of time. Users can wait an extra 400ms.

[+] speed_spread|3 years ago|reply
I'd rather have a page load in 2 seconds and be done with it than one that shows up in 500 millis and keeps loading advertising shit in the background for 10 seconds. Page size is merely a hint about the level of counter-user sophistication: the lower, the better.
[+] kingofpandora|3 years ago|reply
Maybe I can't afford a powerful server. But I can always afford not to use 6 MB of JavaScript.
[+] neurostimulant|3 years ago|reply
This seems really hard to do unless you're using a global CDN and forgoing HTTPS. Ping between the US and SG is almost 300ms in the best conditions. That alone already blows the 100ms budget for visitors far from your server location. And then we have HTTPS, where the initial handshake alone can take several hundred ms (depending on various factors).
[+] serf|3 years ago|reply
> I don't care about how big a page is. I care how fast it loads.

you should care about both, not everyone has unlimited data capacity.

[+] donohoe|3 years ago|reply
On one hand I love this, but most of these sites are terrible and would have benefited from changes that would push them above 10kb.

Small sizes at extreme cost to UX are not worth it.

[+] andai|3 years ago|reply
Most of them would have benefited from a few lines of CSS that wouldn't have pushed them over 10KB!
[+] bugfix-66|3 years ago|reply
The user experience is excellent.

Can you point to a specific problem?

[+] simonsarris|3 years ago|reply
It seems like every dev I know has the idea of making a minimalist website. I guess I would much rather see a curated list of beautiful (in some unique way) websites than another 1000 minimalist websites.
[+] danuker|3 years ago|reply
To me, search.marginalia.nu returns the best kind of website: hardware-resource-light and information-dense.
[+] bArray|3 years ago|reply
> The website must either be very noteworthy or some content from the website must have received at least 100 points on Reddit or Hacker News on at least one occasion.

It's great we now have an elite (10kb club) of the elite (100+ points on a link curator). I don't think this is particularly useful for discovery.

My new club, the 5kb club, requires at least 200 internet points, membership in the 10kb club, having met the Queen before she passed, and being under 5kb.

[+] antirez|3 years ago|reply
My website, http://invece.org, is 5k not compressed. Assuming the linked images don't count.
[+] endofreach|3 years ago|reply
Well in that case… if you don't count my CSS & JS, I beat you!
[+] remram|3 years ago|reply
Mine has 6kB of raw text, so getting it under 10kB would be difficult, unless I split it into multiple pages artificially: one per project rather than the collapsible sections I have now.

Still, those 6kB of text turn into 44kB of HTML with the markup, I should probably do something about that.
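One mitigating factor for markup overhead: repetitive HTML compresses extremely well, and what goes over the wire is usually the gzipped size, assuming the server compresses. A small stdlib sketch with a made-up page in the shape described above (short text entries wrapped in repeated, invented markup):

```python
import gzip

# Hypothetical page: short text entries inside repetitive markup,
# roughly mimicking text-light, tag-heavy HTML. Class names invented.
entry = ('<section class="project collapsible">'
         '<h2 class="title">Project</h2><p class="desc">%s</p></section>')
text = "Some project description text here. " * 4
html = "<!doctype html><html><body>" + (entry % text) * 40 + "</body></html>"

raw_kb = len(html.encode()) / 1024
gz_kb = len(gzip.compress(html.encode())) / 1024
print(f"raw: {raw_kb:.1f} kB, gzipped: {gz_kb:.1f} kB")
```

The repeated tags mostly deflate away, which is why a 44 kB page can still transfer as a small fraction of that.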

[+] codazoda|3 years ago|reply
I thought, for sure, I was squarely in this club with my MD files hosted on GitHub. Then, I looked at point #1. 25kB transferred. Wut?

It looks like GitHub includes normalize.css, which is 12kB all by itself. Damn it.

Next I looked at the css framework I built and use for all my sites. I know it's only about 2.5k, but I have a screenshot. 114kB transferred. Damn it.

I'm not in this club even though I try to be tiny.

[+] giancarlostoro|3 years ago|reply
What if you just link to the raw files? Wouldn't look like nice HTML, but could be insanely smaller.
[+] tasuki|3 years ago|reply
> It looks like GitHub includes normalize.css

What?

> I know it's only about 2.5k, but I have a screenshot. 114kB transferred.

What?

[+] taxman22|3 years ago|reply
None of those websites are worth visiting. Guess you need more than 10KB to display any value.
[+] CynicusRex|3 years ago|reply
But all of them need to have at least one article with 100 upvotes on Hacker News or Reddit, so it's probable at least some of them are worth visiting.

PS I am biased because mine is on there.

[+] tyingq|3 years ago|reply
Berkshirehathaway.com would have made the list if they dropped Google Analytics.
[+] ad404b8a372f2b9|3 years ago|reply
As soon as you include images and custom fonts it becomes really hard to keep things small.

I just measured my landing page which looks like every other SaaS landing page out there and it's 12KB of HTML/CSS and 202KB of illustrations and fonts (4*35KB illustrations, 2*25KB fonts).

[+] londons_explore|3 years ago|reply
If you compile your webpages, there are tools to remove from fonts any Unicode codepoints, weights, and ligatures you don't use. Many fonts get 90% smaller. For fonts only displayed at small sizes, you can also simplify curves to shrink them further.
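Subsetting tools such as fonttools' pyftsubset take a list of Unicode ranges to keep (its --unicodes option). A sketch of how one might derive that list from the text a page actually renders; the helper and the font filename are hypothetical, and real pages would also need anything injected by CSS or JS:

```python
# Hypothetical helper: collect the codepoints a page's text actually
# uses and format them as ranges for a subsetter's --unicodes flag.
def unicode_ranges(text):
    points = sorted({ord(ch) for ch in text})
    ranges, start = [], None
    for i, cp in enumerate(points):
        if start is None:
            start = cp
        # Close a range when the next codepoint is not contiguous.
        if i + 1 == len(points) or points[i + 1] != cp + 1:
            ranges.append((start, cp))
            start = None
    return ",".join(
        f"U+{a:04X}" if a == b else f"U+{a:04X}-{b:04X}" for a, b in ranges
    )

page_text = "Hello, world"
print(unicode_ranges(page_text))
# The result would then feed something like:
#   pyftsubset MyFont.ttf --unicodes="<ranges above>"
```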
[+] tasuki|3 years ago|reply
> 12KB of HTML/CSS and 202KB of illustrations and fonts (4*35KB illustrations, 2*25KB fonts)

That's reasonable. I don't care whether a website is 10kb or 200kb; I mind if it's 20mb of things that aren't even necessary.

[+] Ferret7446|3 years ago|reply
I think having custom fonts disqualifies you from the intent of this (avoiding extra bloat that has near-zero impact on the content/value of the page).
[+] JJMcJ|3 years ago|reply
Maybe not quite in the 10KB club, but Drudge Report and Craigslist both still look like 1998 high school computer club projects and load almost instantly.

Actually Hacker News is quick as well.

[+] uallo|3 years ago|reply
That website currently weighs 9.50 kB. Hence, it cannot add many more sites (without removing some other bits) before hitting its own arbitrary size limit.
[+] rbonvall|3 years ago|reply
Not as bad as the website listing websites that do not list themselves.
[+] rnestler|3 years ago|reply
Couldn't they just add paging?
[+] ghoward|3 years ago|reply
My home page could get on this site if I reduce the amount of CSS, but I already have it remove unused items from the CSS. Eh, fun project for another day.
[+] woofyman|3 years ago|reply
What’s the difference between a list and a curated list?
[+] galangalalgol|3 years ago|reply
My assumption is that the list was automatically generated and contained many pages that met the size criteria but were error messages, placeholders, or other uninteresting things, like a curse word repeating down the page or, worse, tiny PoCs for CVEs.
[+] yoz-y|3 years ago|reply
Most of these seem to be an introduction + a list of blog posts. I guess, why not. But it’s not really hard to do <10k with this kind of content.
[+] rozenmd|3 years ago|reply
These clubs remind me of this tweet (as someone who has always been the perf guy wherever I worked):

---

Also, hot take: users care way less about performance than you think. They want "fast enough", but we're over-indexing on "as fast as possible" instead of caring about other things that matter more to users.

Source: https://mobile.twitter.com/DavidKPiano/status/15787403709971...

[+] saagarjha|3 years ago|reply
I have seen very little code that is written to be "as fast as possible" and a lot of code that aims to be "fast enough" and fails to meet that bar. Perhaps still worth being mindful about it?
[+] 2pEXgD0fZ5cF|3 years ago|reply
> we're over-indexing on "as fast as possible"

Solid research on this exists, so there is not much to deny, at least when your goal is selling to the average user.

But: where is this magical place where people put heavy emphasis on performance? It certainly isn't here, and I would like to visit it for once. Companies caring about performance, or giving their programmers the time to care, are absolutely a rare sight. Unless they work in areas where they are forced to care about it, and even then many stick to the minimum.

Snappy, low latency software is such a delight to encounter because it is just so damn rare. Especially on the web, I constantly feel like I'm wading through molasses, and the only reason I am able to endure this constant, agonizing pain is that I've gotten so used to it...

[+] dspillett|3 years ago|reply
A lot of people worry about super-scale prematurely, thinking that "fast enough" for each individual user means needing to be as fast as possible because you have a great many users making concurrent requests.

Somewhat counter-intuitively to some: this way of thinking can actually make things needlessly less responsive for individual users because of the "over-architecting" it can cause.

[+] luckylion|3 years ago|reply
I agree completely regarding "users want fast enough".

Also: for regular sites, html/css/js optimization is less important than server location. If your server is in Europe and your user is in the US, that's the big one, not your HTML, CSS or JS.

And if you're fetishizing over Lighthouse scores, stop. It's only a very rough measure and shouldn't be treated as a goal in itself.

[+] tjoff|3 years ago|reply
That "fast enough" is good enough shouldn't be a hot take.

But if you are implying that the web today is fast enough, that for sure is a hot take. Our industry most certainly doesn't care the slightest about performance or user experience.

Popups alone are proof of that.

[+] abruzzi|3 years ago|reply
I'm sad... I tested my page and it's 14.1k. I suppose I should remove that header image.
[+] agumonkey|3 years ago|reply
the 'content ratio' column made my brain fly

this is something we should reward, less chrome, more ideas and expression (unless said chrome is also partly that)