
The CSS at w3.org is gone

111 points | internetter | 3 years ago | w3.org

65 comments

[+] nneonneo|3 years ago|reply
The main CSS file they're using is https://www.w3.org/2008/site/css/minimum. If you download this with `curl 'https://www.w3.org/2008/site/css/minimum' -H 'Accept-Encoding: gzip, deflate, br'`, you see that it starts with 03 D7 86 1F 8B 08 08 and ends with 25 10 91 4D 7B 30 00 00 03. This is a brotli metablock header (03 D7 86: length = 3503 bytes, uncompressed), followed by a gzip header (1F 8B 08 08: signature, compression=deflate, filename present), and ends with a gzip trailer (25 10 91 4D 7B 30 00 00: crc32, size of file) and finally an empty brotli metablock header (03).

So, what's happening is that they're serving gzip files from their server (which is hinted by the "content-location: minimum.css.gz" response header), which are being compressed again using brotli somewhere else (e.g. at a reverse proxy).
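Given that diagnosis, recovering the stylesheet means peeling the two layers in the order they were applied: the proxy's brotli first, then the origin's gzip. A sketch, assuming the `brotli` CLI is installed for the network half; the local reproduction needs only `gzip`:

```shell
# Peel the proxy's brotli layer, then the origin's gzip layer
# (needs the brotli CLI; shown as a comment so the rest runs standalone):
#   curl -s 'https://www.w3.org/2008/site/css/minimum' \
#       -H 'Accept-Encoding: br' | brotli -dc | gunzip -c

# Local reproduction of the origin's half: a pre-gzipped CSS file
printf 'body{margin:0}' | gzip -c > minimum.css.gz
gunzip -c minimum.css.gz    # prints: body{margin:0}
```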

[+] infogulch|3 years ago|reply
My guess is they've been serving their css file direct from disk with no content encoding header, but browsers rolled their eyes and decoded it anyway (do they do that?). But now it's changed to serve behind a reverse proxy which isn't so forgiving and recompresses it with brotli (does it even reduce the size?) and that's too much for the browser to infer implicitly. Fin.
[+] nneonneo|3 years ago|reply
Worth noting: if you're not seeing the Brotli header, it's probably because something (e.g. your browser) is transparently decoding the declared Content-Encoding (which is `br` for Brotli). That'll yield raw gzip data. In this case, your browser or other user agent has already undone the declared Content-Encoding, and since nothing declares the inner gzip layer, it won't decode that too.
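Concretely: a client that honors `Content-Encoding: br` leaves you holding a bare gzip file with no header saying so. A quick way to check and finish the job (filenames here are just examples):

```shell
# Stand-in for what a br-decoding client leaves behind:
printf 'body{margin:0}' | gzip -c > minimum

# gzip streams start with the magic bytes 1f 8b:
head -c2 minimum | od -An -tx1

# Decode the layer the server never declared:
mv minimum minimum.gz && gunzip -f minimum.gz
cat minimum    # prints: body{margin:0}
```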
[+] zekica|3 years ago|reply
The problem is that they are sending gzipped css without specifying Content-Encoding in the response headers.
[+] justinclift|3 years ago|reply
Tested it just now (Firefox on Linux), and the response (for me) has 'Content-Encoding: br'.

So, it seems more like it's indicating Brotli compression, but the actual file (https://www.w3.org/2008/site/js/main) is gz encoded.

[+] brabel|3 years ago|reply
HTML:

    <link rel="stylesheet" href="/2008/site/css/minimum" type="text/css" media="all" />
    <style type="text/css" media="print, screen and (min-width: 481px)">
    /*<![CDATA[*/
    @import url("/2008/site/css/advanced");
    /*]]>*/
    </style>
    <link href="/2008/site/css/minimum" rel="stylesheet" type="text/css" media="only screen and (max-width: 480px)" />
    <meta name="viewport" content="width=device-width" />
    <link rel="stylesheet" href="/2008/site/css/print" type="text/css" media="print" />
/2008/site/css/minimum headers:

    Content-Type: text/css;charset=utf-8
    Transfer-Encoding: chunked
    Connection: keep-alive
    content-location: minimum.css.gz
    vary: negotiate,Accept-Encoding
Downloading this file, I can see it's valid "gzip compressed data".

Seems to be missing the Content-Encoding header?!
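The `vary: negotiate` / `content-location` pair smells like Apache MultiViews selecting the pre-compressed variant. If that guess is right, the usual fix (a sketch, not W3C's actual config) is to register `.gz` as an encoding rather than a type:

```apache
# Map the .gz suffix to an encoding so Apache emits
# "Content-Encoding: gzip" when MultiViews selects minimum.css.gz.
AddEncoding gzip .gz
# If .gz were registered with AddType instead, the variant would be
# served as bare gzip bytes labelled text/css, with no encoding header.
```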

[+] bl4ckm0r3|3 years ago|reply
This no-js-no-css trend is gaining traction
[+] x98asfd|3 years ago|reply
What's not to like! 0K CSS, responsive, works on desktop, mobile and tablets, and blazing-fast load speed to boot.
[+] ghusto|3 years ago|reply
NoJS I was aware of (and support where it makes sense, which is most of the time), but NoCSS? When did that start, and why?
[+] icepat|3 years ago|reply
Now I can read it in Emacs
[+] zoobab|3 years ago|reply
Send one HTML file with blue links like in the old days!
[+] crispyambulance|3 years ago|reply
As always when styling comes up, there are a few people on HN who claim to prefer "no CSS" because they "always" use Lynx, or deeply customize all content on their own from scratch, or are using a 1G flip phone, or care about milliseconds of loading time, or they're RMS (oh, wait, stallman.css exists!).

Sure, some idiosyncratic blogs can get away with that css-free "look" (eg https://danluu.com/). The W3? No. But now I guess so!

[+] iLoveOncall|3 years ago|reply
> some idiosyncratic blogs can get away with that css-free "look" (eg https://danluu.com/)

Not even. The articles there are incredibly compact blocks of text that are way too wide to be comfortable for reading.

Minimal CSS is an absolute requirement if you want a website that is pleasant to read.
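For what it's worth, the "minimal CSS" being asked for here is a handful of declarations. A sketch of the usual readable-measure styles (selector and values are illustrative, not taken from any site mentioned):

```css
/* Cap line length around the usual 60-75 character readability target */
body {
  max-width: 70ch;
  margin: 0 auto;
  padding: 0 1em;
  line-height: 1.5;
}
```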

[+] jefftk|3 years ago|reply
Dan does have a bit of CSS, though:

    .pd{width:4em;flex-shrink:0;padding-bottom:.9em}.par div{display:flex}
[+] KyeRussell|3 years ago|reply
As someone with a visual impairment, when I open that website I see someone that wants to put weird internet idealism above not giving me eye strain. This attitude is a bit of oldschool internet culture that I’m glad to see die off.
[+] eu|3 years ago|reply
The site actually looks decent. Welcome back to 1995.
[+] shadowgovt|3 years ago|reply
It's a testament to good structure that the site is legible without its styling.

... legible, not good. If I had to read documentation that looked like that all day I'd consider a career change (or, perhaps, building an infrastructure to improve the legibility of web pages...).

[+] kaptainscarlet|3 years ago|reply
It surprisingly reads better without the css. It somehow feels like my room when I throw out things I don't need.
[+] quink|3 years ago|reply
That's a WCAG paddlin'.

    * Focus indicators
    * Text block width
    * Target Sizes
Apart from that, even if it were there there's some trouble with the markup:

    * <abbr> tags missing
    * No landmark elements
    * bloody tabindex
    * repeated same div
    * search input has no visible label
    * no context-sensitive help or whatnot
[+] artemonster|3 years ago|reply
Someone got very tired of adjusting margins and just said: "screw it, I'm done. It's even better without CSS."
[+] tannhaeuser|3 years ago|reply
If only the CSS WG had gone away ...

Would've been the last thing WWW Inc. has influence over on the web.

And even that has been a joke (of the consider-your-career-choices variety) from the beginning, while today it's just an annoyance people hold on to for job security.

At least their complaining lately about being financially dependent on Google, and about being part of the web-standards circus that prevents real independent standardization, makes for an entertaining read [1], in case you had any doubt that Google is the one calling the shots.

[1]: https://mastodon.social/@robin/109524929231432913

[+] mirekrusin|3 years ago|reply
When will people learn to use web standards correctly?
[+] tinsmith|3 years ago|reply
Am I the only one who likes it better this way?
[+] ergonaught|3 years ago|reply
I like it "better this way" conceptually, in that I like browsing with w3m/lynx/links/eww/etc but the content structure is bad "this way" (ex: scrolling past site nav links too much before I reach actual content; most of the vertical nav should and would be horizontal if this weren't a bug; etc).
[+] pictur|3 years ago|reply
a comment you can only see on HN. I really like the pointless hatred towards CSS and javascript on this site lol
[+] x98asfd|3 years ago|reply
No, I think most real old-school developers prefer that, and it reads a lot better in a Lynx/Links2-type browser. Browser != chrome || firefox.
[+] dredmorbius|3 years ago|reply
The bug's been fixed.

Firefox users (desktop) can see the unstyled site by selecting View -> Page Style -> Unstyled.

And yes, it is pleasantly readable even without CSS. Not ideal, but good.

[+] itaibo|3 years ago|reply
Disabling Brotli from Cloudflare would stop this error until they fix the Content-Encoding
[+] makach|3 years ago|reply
hm, the css seems to be there, but looks very obfuscated? could it be encrypted or attacked?