jasonkester | 15 years ago
There's a reason we do image, css and javascript includes in HTML. This idea undoes all that.
I can't imagine why the author, clearly a bright guy, went out of his way to build this thing.
sramov | 15 years ago
Do a test proving that this way is more inefficient.
BTW, everything is separated before publishing: the CSS is in one file, the reset CSS in another, there is an HTML template, etc. Maintenance is easy; offline, online, or published makes no difference. It is similar with dynamic publishing.
As for the inefficiency: we all know HTTP requests are the biggest enemy. All my pages are exactly one request. With everything minimized, gzipped, and unused CSS selectors stripped, I do not see how this can be more 'inefficient'.
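The single-request build described above can be sketched roughly as follows. This is a toy illustration, not the author's actual scripts: the `{{CSS}}` placeholder, the function name, and the crude regex minification are all assumptions.

```python
import gzip
import re

def build_page(template: str, css: str) -> bytes:
    """Inline the stylesheet into the template, collapse whitespace,
    and gzip the result so the page can be served as a single request."""
    css_min = re.sub(r"\s+", " ", css).strip()       # crude CSS minification
    html = template.replace("{{CSS}}", css_min)      # hypothetical placeholder
    html_min = re.sub(r">\s+<", "><", html).strip()  # drop inter-tag whitespace
    return gzip.compress(html_min.encode("utf-8"), 9)

template = "<title>Hi</title> <style>{{CSS}}</style> <p>One request.</p>"
css = "p {\n  color: #333;\n}\n"
page = build_page(template, css)
print(len(page), "bytes, one file, one request")
```

Keeping the source files separate and only merging them in a build step like this is what makes the result maintainable despite being served as one blob.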
MarkPilgrim | 15 years ago
My build scripts are such an "evolved" mish-mash of Python 3, Python 2, Java, makefiles, and shell scripts that I'm amazed they work for me, much less for anyone else. (This point was brought home to me when I moved to a new laptop and spent too long trying to remember and install all the dependencies.)
Anyway, nice job, and hooray for unintended uses of open source code.
sramov | 15 years ago
You are right. It takes a certain structure, quite a bit of adjusting, etc.
But the result is what makes it all worthwhile.
BTW, I was doing all the optimization you were doing on some of my sites, but I always missed some nice ways to automate it. Imagine the happiness when I stumbled on your hg.diveinto* repositories. A candy shop.
So, thank you.
efsavage | 15 years ago
I like it, it's clever, but I was disappointed that the headline was misleading in the interest of being catchy. "How I made my website a one-hit-wonder (intentionally)" might have done the same thing without me thinking "oh, he just stripped out newlines".
sramov | 15 years ago
I can see it being received like that. But the issue is deeper. Take a regular thousand-line site and strip the newlines; it is not the same thing. The fact that the build depends on so many utilities, little scripts, and glue makes it clear the result is something most people would not bother messing with. I bother because I really do care about the little differences. There are also no body, html, or head tags, no divs, no optional end tags, etc.
On the website, it is clearly stated that I am a minimalist and a perfectionist. The URL slug of this article also reveals something :)
As for the title, it might be perceived as catchy. But I like nice titles on my sites, and the fact that I self-publish entries (because of the HN blog comments experiment -- which I am happy with) makes it even harder.
Is it not a one-line website? :) A short and accurate title, I would say.
sramov | 15 years ago
Added a note about nginx's http_gzip_static_module, which doesn't get used as much as it should.
One tip on the gzip parameters: I use -9cn. The 'n' flag is important; without it, your gzipped files will appear corrupted to many online gzip checking tools. E.g. everything is green: http://redbot.org/?uri=http%3A%2F%2Fsimeramov.com%2F2010-07-.... Took me a while to figure that out. I believe it also benefits rsync.
There is one serious downside to all this reckless optimization -- I don't even notice the bandwidth RRD graphs changing if some page hits the HN front page or gets featured elsewhere. The pages are so small the graphs almost always look the same, i.e. basically flat :) All the bandwidth is from me backing up my home dir to the server.
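The `-n` flag works because it keeps the original filename and timestamp out of the gzip header, so the output depends only on the content. A rough Python equivalent of `gzip -9n` (the deterministic part of `-9cn`; `-c` merely writes to stdout) might look like this -- a sketch, with an illustrative function name:

```python
import gzip
import io

def gzip_deterministic(data: bytes, level: int = 9) -> bytes:
    """Compress like `gzip -9n`: no filename, fixed mtime, so the same
    input always yields byte-identical output."""
    buf = io.BytesIO()
    # filename="" and mtime=0 keep the name and timestamp out of the header
    with gzip.GzipFile(filename="", mode="wb", fileobj=buf,
                       compresslevel=level, mtime=0) as f:
        f.write(data)
    return buf.getvalue()

one = gzip_deterministic(b"<p>hello</p>")
two = gzip_deterministic(b"<p>hello</p>")
print(one == two)  # True: safe to compare or sync by checksum
```

Pre-built `.gz` files like these are exactly what nginx's gzip_static module serves directly, instead of recompressing on every request.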
jws | 15 years ago
50% of my load time was spent trying to pull the favicon and getting the 404. I wonder if there is a way to use the <link rel=icon ...> tag to tell a browser not to look for a favicon. It is eluding my googlability, if there is one.
sramov | 15 years ago
I've just touched a favicon.ico. Zero bytes, but at least it's not a 404. I might do the same for robots.txt. Thanks.
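A zero-byte favicon stops the 404s. For the earlier question about telling the browser not to look at all, one commonly cited trick is an empty data URI icon -- support varies by browser, so treat this fragment as a hint rather than a guarantee:

```html
<!-- An inline empty icon; browsers that honor it skip the
     /favicon.ico request entirely (support varies). -->
<link rel="icon" href="data:,">
```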
famfam | 15 years ago
Looking at the source, there's no html, head or body tag in the document (but there are CSS styles referencing the body tag, hmm).
slater | 15 years ago
body {border-left:18px solid #ebf7fb;}
:)
Oh, and an RSS feed would be great, too!
sramov | 15 years ago
Got tired of it and put a background image there instead, simplifying the CSS further :)
As for the RSS feed -- there won't be one.
jmcnevin | 15 years ago
This could be some minimization trickery that assumes most browsers can deal with malformed HTML documents, but [slams fists on table] IT'S NOT RIGHT, I TELL YOU!
Thought: do data URLs work in .css files? E.g. background: url(data:....); ? Because then you could have one cacheable file (everything.css) that sits alongside your pages (use background images instead of plain old imgs), so you would only have to download them once, not on every page load.
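To the data-URL question: yes, CSS `url()` accepts `data:` URIs. A small sketch of generating such a rule -- the function name and the placeholder bytes are illustrative, not from the site's actual build:

```python
import base64

def css_background_data_uri(image_bytes: bytes, mime: str = "image/png") -> str:
    """Embed an image directly in a CSS rule via a data: URI, so the
    stylesheet carries its images and needs no extra requests for them."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return f"background: url(data:{mime};base64,{b64});"

# Placeholder bytes standing in for a real image file's contents.
rule = css_background_data_uri(b"GIF89a", "image/gif")
print(rule)  # background: url(data:image/gif;base64,R0lGODlh);
```

The trade-off versus the one-request approach: the first visit costs an extra request for everything.css, but that file is then cached across every page.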
sramov | 15 years ago
Includes are valuable for re-usability. If a website has a relatively low number of page hits per user, or doesn't implement a standard across many pages, the benefits of includes are reduced.
famfam | 15 years ago
daverodecker | 15 years ago
This concept could be progressed into a proxy server!
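A sketch of what such a proxy's core rewrite could look like: replace each external stylesheet link with an inline `<style>` block. The regex, function names, and stub fetcher are illustrative assumptions; a real proxy would fetch over HTTP and would also need to handle attribute order, scripts, and images.

```python
import re
from typing import Callable

# Assumes rel= appears before href=, which is enough for a sketch.
LINK_RE = re.compile(
    r"""<link[^>]*rel=["']?stylesheet["']?[^>]*href=["']([^"']+)["'][^>]*>""",
    re.IGNORECASE,
)

def inline_stylesheets(html: str, fetch: Callable[[str], str]) -> str:
    """Replace each external stylesheet link with an inline <style> block,
    turning an n-request page into fewer requests -- the proxy idea."""
    def repl(m: re.Match) -> str:
        return "<style>" + fetch(m.group(1)) + "</style>"
    return LINK_RE.sub(repl, html)

# Stub fetcher standing in for the proxy's upstream HTTP request.
styles = {"/main.css": "p{color:#333}"}
page = '<link rel="stylesheet" href="/main.css"><p>hi</p>'
print(inline_stylesheets(page, styles.get))
# <style>p{color:#333}</style><p>hi</p>
```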
fishercs | 15 years ago
marknutter | 15 years ago
kabanossen | 15 years ago
sramov | 15 years ago
It may have some usability issues, but it was a minimal and elegant solution to a problem.
suckitlolol | 15 years ago
[deleted]