Personally, I think the web and browsers have churned far too much, and if that stopped happening, perhaps we would get more accessible sites and browser diversity as people "stop looking for new dogs and start teaching new tricks to the old ones." Of course, Google would try its hardest to never let that happen, since change is its weapon of monopoly.
May be risking a lot of downvotes, but I really want to work somewhere that cares about a11y, for very selfish reasons: I want to use semantic HTML and not have to do stupid shit like using a package that re-implements the select element with divs.
I can't think of many new browser features I've used in the past few years. There's the Permissions API and the FileReader API, which are part of apps I've built (and they're very useful), and I've played with a few things like SharedArrayBuffer and WebGL2, but not in anything that's been deployed. Browsers do move fast, but not that fast, and most of the new features are super niche things that the majority of developers don't really need.
If you're prioritizing looking at new things over accessibility and cross-browser support then you are making that choice; browser vendors are not forcing it upon you.
This reminds me of Spolsky's blog post about Fire and Motion. [0] He gives the example of Microsoft creating an endless stream of new technologies which kept their competition busy (e.g. ODBC, RDO, DAO, ADO, OLEDB, ADO.NET, ...). Get the competition to spend their resources keeping up rather than competing.

[0]: https://www.joelonsoftware.com/2002/01/06/fire-and-motion/
I agree with you, and I think it can be done today, by taking matters into your own hands.
I've done it by testing with historic browsers alongside modern, and challenging myself to make it work with all of them, JS and noJS, without errors, reliably.
In the process, I found a set of "lowest common denominator" tags which work across almost anything. I use these to build the "base" HTML. Then, I add JS with feature-checks to allow older browsers with JS enabled to still run it, though it doesn't do much at the moment.
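A minimal sketch of that feature-check approach (all names here are illustrative, not from the comment; the host object is injected so the sketch runs anywhere, browser or not):

```javascript
// Probe for an API before using it; fall back to baseline behavior
// otherwise. The host object is passed in so the same function works in
// old browsers, new browsers, or outside a browser entirely.
function withFeature(host, name, enhanced, baseline) {
  if (host && typeof host[name] === "function") {
    return enhanced(host[name]);
  }
  return baseline();
}

// Gate an enhancement on fetch; everything else gets the base behavior.
const mode = withFeature(
  typeof window !== "undefined" ? window : null,
  "fetch",
  () => "enhanced",
  () => "baseline"
);
```

Older browsers with JS enabled simply take the baseline branch, which is the graceful degradation being described.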
I think it's a worthwhile exercise to learn where the roots of the web are and how it evolved; it allows you to write much more resilient code in general. The reason for this is that the longer a technique or syntax has been in use, the longer it's likely to continue to be used. The Lindy effect is the name of this trend.
As long as it doesn’t break or give up backwards compatibility, I don’t want the web to stop improving. It’s one of the most important platforms in history. I thought we can’t teach the old web new tricks because of backwards compatibility?
"Start teaching new tricks to the old ones" on the web usually means shipping a bunch of extra JavaScript. Which is fine when a new pattern or paradigm is being felt out, but once everybody is using this one library to do this thing the exact same way, you reach a point where it makes more sense to enshrine it in a native API.
See jQuery (both Ajax and selection), Moment/Temporal, etc
The recurring calls for a "faster web" or a "safer web" are responses to problems for users that web developers, "tech" companies and their advertiser customers have themselves created. Users did not create these problems and yet, unless I am mistaken, these marketing campaigns for "pushing the web forward" are directed at users. As a user, I want those problems fixed but I am under no illusions about where they come from. It stands to reason that a user-controlled web would be much faster and much safer.
I think it's OK for Google to make Chrome into an OS and cram whatever they want into it as fast as they can.
Regular web browsers don't necessarily need to be on the same feature treadmill/death march (assuming they could keep up.)
We seem to be trying really hard to reinvent the applet systems of the 1990s, but in a way that brings systems with 10-100x the memory and CPU resources to their knees and requires huge armies of programmers to implement. I guess JavaScript/webasm is better than Java in some ways.
This is in reference to my annoyance with browser makers leaving incomplete implementations stagnant for years.
As an example, the <dialog> element: a browser-native, standardized, promising replacement for alert(). Except that it doesn't work; it's inaccessible at its core.
And nobody fixes it, it's just left in this broken state forever.
<section>, the element that was supposed to cut up an HTML document into multiple outlines, hence making componentized, SEO-optimized headings easy and better, does nothing at all.
CSS columns, a simple and easy way to distribute text, were unusable for 8 years because Mozilla refused to fix their own small bug. Which pales in comparison to the extraordinary number of WebKit bugs that are absolutely never, ever fixed.
Form controls, since the very invention of them, are terrible. It has cost the world billions, as worldwide every single developer and project has to reinvent some of them, often breaking basic accessibility in the process.
I could go on, but I'll sum it up as the failure to address extremely common real world problems, and to just let broken buggy solutions linger. When you accept a standard and implement it, bloody finish the implementation. Fix bugs. Otherwise, what is the point?
I'd call this hardening the web. There's no reason I can see why you can't harden it whilst also making progress on new features.
MDN/caniuse cover quite a lot of this already.

> <section>, the element that was supposed to cut up an HTML document into multiple outlines, hence making componentized SEO-optimized headings easy and better, do nothing at all
They do a lot for Reader view and I’m fairly certain they help with screen readers where heading hierarchies might otherwise be ambiguous.
- - -
Overall, I share your concerns but feel they’re overstated. Mostly because I remember the bad old days of NS4, IE4-6, etc. The web as a set of reliable standards isn’t perfect but it’s worlds better than I ever anticipated.
Just as a matter of perspective: when I started web dev, I learned and used by rote dozens upon dozens of hacky incompatibility workarounds. I moved more backend in recent years, but have had more web work over the last year. I can count on one hand the number of browser compatibility issues I’ve had to address (yes, all Safari). And not because build tools are helping. If anything build tools are the biggest source of frustration for me now.
> Form controls, since the very invention of them, are terrible. It has cost the world billions, as worldwide every single developer and project has to reinvent some of them, often breaking basic accessibility in the process.
We're close to being down to two browser engines that matter, one of them with well over half total market share, and they still don't bother to fix forms. You're right about the costs, they're immense. Billions isn't an overestimate for ball-parking the figure, I'd say.
Shit, Firefox, you want to do something to stay relevant, be the first to do that right. Go nuts with some (well-considered) non-standard extensions, behaviors, and tags, worst case no-one uses them and no other browsers adopt them, best case you achieve arguably the greatest thing your project ever has (which is saying something—early FF, especially, was awesome). It may be too late (like every other idea I can come up with to save FF, they should have started at least a decade ago) but it's worth a try.
I was just talking with someone today about how Fetch API is now 10 years old and still isn't a complete replacement for XmlHTTPRequest (or is it XMLHttpRequest? I can never remember) because it can't do file upload progress tracking. It's been a decade of "the spec doesn't support that yet". Yet? Will it ever? At what point do you admit you're just not going to do it?
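For context, the missing capability: XMLHttpRequest exposes upload progress events via `xhr.upload` that fetch still has no counterpart for. A sketch, with the XHR constructor injected so it isn't tied to a browser environment (names illustrative):

```javascript
// Sketch of the capability fetch still lacks: upload progress reporting.
// With XHR you subscribe to progress events on xhr.upload. The constructor
// is injected so the sketch can run (and be tested) outside a browser.
function uploadWithProgress(XHR, url, body, onProgress) {
  const xhr = new XHR();
  xhr.open("POST", url);
  xhr.upload.addEventListener("progress", (e) => {
    // Report a 0..1 fraction whenever the browser knows the total size.
    if (e.lengthComputable) onProgress(e.loaded / e.total);
  });
  xhr.send(body);
  return xhr;
}
```

In a real page you would pass the browser's `XMLHttpRequest` as the first argument.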
Can you elaborate on <dialog>? It is currently implemented behind a feature flag in Firefox, and not implemented at all in Safari. So the answer with caniuse is maybe with a polyfill. How exactly is it inaccessible at the core? As far as I know a <dialog> is required to have at least one focusable element inside it (I usually have a close button in the dialog’s <header>), and then the user agent is supposed to trap focus inside it.
Is it not usable for users with assistive technology? Is it a bad UX for them? Is it broken or buggy? Does the polyfill not work? etc.
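The pattern described above — a <dialog> with a focusable close control, opened via showModal() — can be sketched like this. The document object is injected purely so the sketch is environment-neutral; all names are illustrative:

```javascript
// Sketch of the accessible-dialog pattern: a <dialog> containing at least
// one focusable element (a close button), opened with showModal() so the
// user agent traps focus inside it while it is open.
function openModalDialog(doc, message) {
  const dialog = doc.createElement("dialog");
  const text = doc.createElement("p");
  text.textContent = message;
  const close = doc.createElement("button"); // the focusable element
  close.textContent = "Close";
  close.addEventListener("click", () => dialog.close());
  dialog.appendChild(text);
  dialog.appendChild(close);
  doc.body.appendChild(dialog);
  dialog.showModal(); // modal: focus is trapped until close() is called
  return dialog;
}
```

In a browser you would call `openModalDialog(document, "…")`, with the polyfill loaded first where showModal() is unsupported.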
Fixing bugs, as opposed to adding new features, doesn't bring in new users, so it doesn't get prioritized. And if, as you say, these bugs are in features that nobody uses, why bother? Never mind that nobody might be using them because of the bugs; the PR buzz over those features has already been had, so on to the next.
Here it is in the standard: https://html.spec.whatwg.org/multipage/workers.html#dom-shar...

Safari/WebKit removed it in 2015 (edit: maybe even earlier!) and never reintroduced it: https://caniuse.com/sharedworkers

Here is the apparent reason for cherry-picking features:

> The implementation of Shared Web Workers was imposing undesirable constraints on the engine. It never gained any adoption.

https://stackoverflow.com/a/48193804

Edit: Here is the ticket for its reinclusion: https://bugs.webkit.org/show_bug.cgi?id=149850

Turns out it was removed temporarily because of internal architecture changes, and never reintroduced because of lack of adoption. Of course, the biggest barrier to adoption is Safari not supporting them, so talk about a self-fulfilling prophecy.
> This feature was originally removed temporarily during multiprocess bring-up, and actual usage on the web has been pretty low. We're willing to reconsider if there is significant demand.
IIRC, it was only even available through WebKit1 (i.e., the single-process WebKit API) and never through WebKit2, so it will have gone from Safari when Safari moved to WebKit2, even though the implementation lived on in WebKit for longer.
It was only 4+ years after it was dropped from Safari that anyone started to ask about Safari support for it again, so it had kinda fallen to the wayside due to lack of interest.
On a long enough timeline, the survival rate of all APIs goes to zero. (Except Lisp, which is Eternal.)

> Why do people think that the universality of decay can be stopped?
Theory: A lot of those people tend to survive by jumping from metaphorical ship to ship (API or software or whatever). So to them, the reality is that the ship you're currently on pretty much always feels like it's sinking, and you're always looking out for signs. You write articles lamenting sinking ships, because that's your reality.
These people also may feel like the universality of decay can be stopped in the current context, by moving away from a decaying ship/system. It's more of a question of where the decay isn't as bad, or as seemingly needless.
Example Pro: After a while you can get really good at evaluating ships. Con: It feels useless to build your own ship; you're afraid you'd have to jump from your own ship and wouldn't that feel awful.
Other people survive by building ships. They're cool with decay, because building new stuff that works better is interesting. Their job is to support their stuff, and to a lesser degree to patch others' stuff, like maybe their supply ship, or a friend's ship. To people like this, the reality is that holes just happen. So you learn to deal. Maybe you even learn to love patching holes, and you get so good at building ships that your ship's holes are downright fascinating anyway.
These people aren't usually as worried about decay. But they may have a problem of eventually going down with their ship, or finding that their ship is no longer just a ship but also a lot like a baroque form of floating junk pile.
Example Pro: Obvs, you can build ships. Con: People will try to jump on your ship, and they'll probably tell you they think it's sinking, and expect you to do something about it.
> Why do people think that the universality of decay can be stopped?
Because information can be perfectly copied? If you replace the components as they wear out, you can still have a computer from the 1980's running fine. The OS and software still work exactly the same. Whatever data will still exist as a perfect copy.
(In fact, rumors are that George RR Martin does exactly that, and sends whatever he writes to his publisher on 3.5" floppies)
Absolutely. However, all these endless discussions have an implicit time frame, say 50-100 years backwards and forwards, during which old, well-used stuff like HTML is expected not to suddenly remove features that are being used in the wild.
Because web pages and simple web scripting is often done by non-professionals, who cannot be expected to follow standards processes in perpetuity to keep their pages/scripts working. They often author a few pages and move on with other things in life, which is why browsers being ultra-conservative about breaking stuff is important for the robustness of the Web.
One of the great things about the web is the fact that I can expect anything I create today, to be relatively forwards compatible, and I can build things in a way so they're either backwards compatible or degrade gracefully.
The philosophies around entropy aren't really relevant here, IMO. The web should continue to be evergreen.
If you only use a super-basic subset of HTML tags, design carefully, and ignore "standards", you can write a website which works across 25 years of browsers, mainstream and obscure, with gated enhancements for browsers which support them.
I think that's pretty impressive as far as API age.
Browser vendors have long opposed making backwards incompatible changes. The problem is if any existing websites start breaking some users will switch browsers because of it. Browsers that don't implement the backwards incompatible change will in turn gain users. No browser wants to lose users, all browsers want to gain users, so no browser is willing to make any changes that cause old websites to break. Once it's a browser feature, it's always a browser feature (with very limited exceptions).
> Why do people think that the universality of decay can be stopped?
I'd rather say that the "universality of decay" is a U-shaped curve. Just have a look at old arcade and console games... they went out of fashion, the hardware (sometimes literally) rotted, but emulator technology is getting better and better all the time - the result is you can use any modern computer or many smartphones (!) to run all that stuff that is sometimes many decades old.
All that any kind of technology needs to survive into modern ages is one (single or group of) person that engineers an appropriate abstraction layer. Polyfills, virtual machines and other emulators, FPGA-based hybrids... the list is endless.
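On the web, the most common form such an abstraction layer takes is the polyfill. A hedged sketch (a simplified backfill of Array.prototype.flat, not the canonical polyfill):

```javascript
// Sketch of a polyfill, the web's version of the abstraction layers
// described above: backfill a missing API so older engines can run code
// written against newer ones. This is a simplified flat(), not spec-exact.
function flatten(arr, depth = 1) {
  return depth < 1
    ? arr.slice()
    : arr.reduce(
        (acc, v) => acc.concat(Array.isArray(v) ? flatten(v, depth - 1) : v),
        []
      );
}

// Only install it where the engine lacks the real thing.
if (!Array.prototype.flat) {
  Object.defineProperty(Array.prototype, "flat", {
    configurable: true,
    writable: true,
    value: function flat(depth = 1) {
      return flatten(this, depth);
    },
  });
}
```

Emulators and virtual machines do the same job one layer down; the polyfill just does it in-language.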
Since it looks like Chromium is set on removing alert() altogether[0], I don't see why this can't be handled in the same way browsers handle popup windows. If a website tries to open a pop-up, it gets blocked; but, on Firefox at least, I get a small notification in the toolbar, where I can choose to copy the popup window's URL, open the popup, or allow the website to open popups as much as it wants. Why not do the same exact thing for alert()s?

[0]: https://news.ycombinator.com/item?id=28310716
The problem is that alert() blocks JavaScript from executing. If a web page fires an alert, it expects to be blocked in that moment. If the browser were to delay the alert, the page would be blocked unexpectedly at some later time. This would probably cause bugs.
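To illustrate the blocking contract (the alert function is a parameter here purely for the sketch; the names are made up):

```javascript
// Code written against today's alert() assumes everything after the call
// runs only once the user has dismissed the dialog.
function deleteWithWarning(alertFn, doDelete) {
  alertFn("About to delete!"); // today: execution pauses here
  // If the browser deferred the alert instead of blocking, this line would
  // run "too early", before the user had actually seen the warning.
  return doDelete();
}
```

A popup-style "allow later" flow breaks exactly that assumption, which is why deferring alert() is harder than deferring window.open().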
From my humble point of view, HTML 4 was just fine for the purposes of the Web, and everything else should be done via native applications and network protocols.
Google, alongside all of those that push Chrome wrapped in whatever kind of package, have managed to turn the Web into ChromeOS.
I expect job adverts for HTML 6 to be about years of experience developing ChromeOS applications.
I author a CSS 2.1 rasterizer, and a partial implementation of it is used in a game engine my company's open source organization publishes.
There's such a small number of people who have written their own pieces of visual web client technology, and an uncountable number who consume it.
I've entertained the idea of writing a partially compliant web browser and releasing that for fun. It's still totally possible to write your own web browser today.
You will of course need to put in effort that only an exceptionally small number of people have, and even after you do that, you'll only have something partially compliant. But it will be valid!
Hell, you could build an HTML5 valid web browser that didn't even use CSS. Invent your own thing. Make JSON style sheets or something.
Anyway. We don't have enough people toying around with web tech. For years, I've only ever seen people toy around with the easy stuff: things like making little web clients to help you with API requests instead of turning to curl, or rehashing a CSS framework for the nth time.
And frankly it's sad to see, because it's so uninspired and boring.
Where are the people creating basic web browsers that use C# as a programming language instead of JavaScript? Or people inventing a new hypertext markup language as an alternative to HTML that still uses HTTP as a transport protocol?
It's <plaintext>, which basically means "stop parsing for the rest of the page". There's no way to close the tag. It's super easy to implement, which is probably why it's still around.

Deprecated 28 years ago in HTML 1.1, yet still supported in all major browsers. Test page over here: http://9ol.es/TopLevel/example.html reference rendering: http://9ol.es/tl.png

There's some modern timing issue in Chrome, I think; it's intermittent. Looks like there's a bug.

My original post on the hack, blowing off the cyber dust from 2014: https://news.ycombinator.com/item?id=7850301
Here's a non-exhaustive list of breaking changes to the web platform: https://github.com/styfle/breaking-changes-web

> Forms with passwords marked Not Secure over HTTP
It requires a rather curious definition of “breaking change” to consider this one.
> A̶r̶r̶a̶y̶.̶p̶r̶o̶t̶o̶t̶y̶p̶e̶.̶f̶l̶a̶t̶t̶e̶n̶ ̶b̶r̶e̶a̶k̶s̶ ̶M̶o̶o̶T̶o̶o̶l̶s̶ renamed to Array.prototype.flat
That doesn’t belong in the list at all; it’s a prime example of the platform bending over backwards to avoid a breaking change, for better or for worse (it means that future users are stuck with an inferior name, see also contains which got renamed to includes because of, if I recall correctly, MooTools again).
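The outcomes of those collisions are runnable today: the method shipped as `flat` rather than `flatten`, and strings got `includes` rather than `contains`:

```javascript
// The renamed APIs in question, as they actually landed in the platform.
const nested = [1, [2, [3]]];
const oneLevel = nested.flat();      // default depth is 1
const fully = nested.flat(Infinity); // flatten all the way down
const hasEx = "text".includes("ex"); // né "contains"
```

Both renames exist purely so that pages shipping old MooTools builds would keep working.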
Caniuse, rather infuriatingly, used to deliberately not track support for obsolete/deprecated web stuff (it would just say something like "Feature X is obsolete, don't use it" instead of showing the browser support graph), but it seems to have changed course.
Web developers should be angry that their craft has so much churn and leads to works that end up being ephemeral in nature. That people in the field exalt this state is upsetting because this state is bad for both users and developers.
unless i haven’t read deep enough into that group's thread, this seems overblown. alert, prompt, etc. aren’t being deprecated on the entire web, only in the context of cross-origin iframes.
that being said, this post definitely strikes a chord with me. instead of adding a new index of deprecated features, i think it speaks more to the fact that caniuse needs to rethink how they approach deprecated features.
Back in 2014, I built some very flashy interfaces using Polymer components just for fun. Fast forward to 2021, and these pages are completely broken. They don't even display the text that I wrote.
I'm guessing that the depreciation of HTML imports is why they don't work anymore.
It was depending on Web Components v0 stuff that was enabled in Chrome before it was ready.
You may find that the component does still work in Firefox (which never implemented the stuff that Chrome eventually removed, and for which Polymer included polyfills).
Also, my hobby horse: HTML imports were removed, not just deprecated (or depreciated). Two (three!) very different things.
If the pages worked in Safari and Firefox then they would still work in Chrome now. The only pages that would have broken are those that didn't load the polyfills at all.
It was a bad situation which I hope is never repeated.
About Chromium's cross-origin removal of alert() and co. (link in article):
"We haven’t engaged with other browser vendors regarding this change yet, but plan to submit a spec change proposal once the change is approved for Chrome."
So Chrome just changes it, and then officially applies for the spec change, so other browsers might follow - or not.
I mean, why pretend at all that you care about the spec when you are a fat monopolist?
The bigger problem is that the web itself is rotting away as we speak. Yes, browsers no longer supporting old API calls is a huge problem, for instance for archived content that at some point will simply stop working, a bit like a 78 RPM record. Good luck finding a player. All the Flash content and so much other work that is part of our digital record is no longer working (and I absolutely loathe Flash).
So whether or not you can still use it today, the bigger question is will you be able to use that website 10 years or longer into the future? Because any book ever printed can still be read today (assuming you know the script and the language it was written in), I think the longevity of the web will top out at a couple of decades at best before the digital termites and worms will consume the devices that could have rendered the content you are interested in.
Plain ascii text will likely live the longest, with markdown as a good second. Anything that executes will likely simply die.
userbinator | 4 years ago:
Related: Stop Pushing the Web Forward (2015) https://news.ycombinator.com/item?id=9961613
[+] [-] pjmlp|4 years ago|reply
Google, alongside all of those that push Chrome wrapped in whatever kind of package, have managed to turn the Web into ChromeOS.
I expect job adverts for HTML 6 to be about years of experience developing ChromeOS applications.
[+] [-] andrewmcwatters|4 years ago|reply
There's such a small number of people who have written their own pieces of visual web client technology, and an uncountable number who consume it.
I've entertained the idea of writing a partially compliant web browser and releasing that for fun. It's still totally possible to write your own web browser today.
You will of course need to put in effort than an exceptionally small number of people have, and even after you do that, you'll only have something partially compliant. But it will be valid!
Hell, you could build an HTML5 valid web browser that didn't even use CSS. Invent your own thing. Make JSON style sheets or something.
Anyway. We don't have enough people toying around with web tech. For years, I've only ever seen people toy around with the easy stuff: making little web clients to help with API requests instead of turning to curl, or rehashing a CSS framework for the nth time.
And frankly it's sad to see, because it's so uninspired and boring.
Where are the people creating basic web browsers that use C# as a programming language instead of JavaScript? Or people inventing a new hypertext markup language as an alternative to HTML that still uses HTTP as a transport protocol?
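To make the "JSON style sheets" idea concrete, a purely hypothetical format might look like this. Nothing here is real; every key and the format itself are invented for illustration, loosely mirroring CSS selectors and properties:

```json
{
  "h1": { "font-size": "2em", "color": "#222222" },
  "p.intro": { "line-height": 1.6 },
  "@media (max-width: 600px)": {
    "h1": { "font-size": "1.5em" }
  }
}
```

A toy browser engine could parse this with any off-the-shelf JSON parser, which is exactly the kind of shortcut that makes the experiment tractable for one person.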
styfle | 4 years ago
Here’s a non-exhaustive list of breaking changes to the web platform:
https://github.com/styfle/breaking-changes-web
kristopolous | 4 years ago
It's <plaintext>, which basically means "stop parsing for the rest of the page". There's no way to close the tag. It's super easy to implement, which is probably why it's still around.
Deprecated 28 years ago in HTML 1.1, yet still supported in all major browsers. Test page over here: http://9ol.es/TopLevel/example.html; reference rendering: http://9ol.es/tl.png
There's some modern timing issue in Chrome, I think; it's intermittent. Looks like there's a bug.
My original post on the hack, blowing off the cyber dust from 2014: https://news.ycombinator.com/item?id=7850301
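For anyone who wants to reproduce the behavior without the test page, a minimal document looks something like this. Note that even a literal closing tag in the source is just rendered as text, since the parser has already stopped:

```html
<!DOCTYPE html>
<html>
<body>
<p>Normal HTML up to here.</p>
<plaintext>
Everything from here on is rendered verbatim:
<b>not bold</b>, &amp; not decoded, </plaintext> not honored.
```

The tokenizer switches into a mode where every remaining character is emitted as-is, which is why the tag is trivial to implement and impossible to close.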
chrismorgan | 4 years ago
It requires a rather curious definition of “breaking change” to consider this one.
> ~~Array.prototype.flatten breaks MooTools~~ renamed to Array.prototype.flat
That doesn’t belong in the list at all; it’s a prime example of the platform bending over backwards to avoid a breaking change, for better or for worse (it means that future users are stuck with an inferior name, see also contains which got renamed to includes because of, if I recall correctly, MooTools again).
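For reference, the names that actually shipped behave exactly the way `flatten` and `contains` would have:

```javascript
// The names the platform shipped after the web-compat renames:
// flatten -> flat, contains -> includes (both to avoid breaking
// sites that had MooTools-patched prototypes).
const nested = [1, [2, [3, 4]]];
console.log(nested.flat());         // [1, 2, [3, 4]] (depth 1 by default)
console.log(nested.flat(Infinity)); // [1, 2, 3, 4]
console.log('MooTools'.includes('Moo')); // true
```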
jan_Inkepa | 4 years ago
https://caniuse.com/?search=blink
thanks, caniuse!
one_comment | 4 years ago
The author could have used the date-relative view. 8 years (implemented in all major browsers by 2012, deprecated in 2020) is not short-lived in my book.
tmwed | 4 years ago
That being said, this post definitely strikes a chord with me. Instead of adding a new index of deprecated features, I think it speaks more to the fact that caniuse needs to rethink how it approaches deprecated features.
qnxub | 4 years ago
I'm guessing that the depreciation of HTML imports is why they don't work anymore.
chrismorgan | 4 years ago
You may find that the component does still work in Firefox (which never implemented the stuff that Chrome eventually removed, and for which Polymer included polyfills).
Also, my hobby horse: HTML imports were removed, not just deprecated (or depreciated). Two (three!) very different things.
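If you need to know at runtime which situation you're in, the usual feature test is the presence of the `import` property on a `<link>` element (guarded here so the snippet also runs outside a browser):

```javascript
// Feature-detect HTML Imports: browsers that removed the API (or,
// like Firefox, never shipped it) have no `import` property on
// <link> elements. Outside a browser there is no document at all.
const supportsHtmlImports =
  typeof document !== 'undefined' &&
  'import' in document.createElement('link');

console.log('HTML Imports supported:', supportsHtmlImports);
```

In any current browser this reports false, which is the practical difference between "deprecated" and "removed": a deprecated feature would still pass the check.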
spankalee | 4 years ago
It was a bad situation which I hope is never repeated.
hutzlibu | 4 years ago
"We haven’t engaged with other browser vendors regarding this change yet, but plan to submit a spec change proposal once the change is approved for Chrome."
So Chrome just changes it. And then officially applies for the spec change, so other browsers might follow - or not.
I mean, why pretend at all that you care about the spec, when you are a fat monopolist?
jacquesm | 4 years ago
So whether or not you can still use it today, the bigger question is whether you will be able to use that website 10 years or longer into the future. Any book ever printed can still be read today (assuming you know the script and the language it was written in), but I think the longevity of the web will top out at a couple of decades at best before the digital termites and worms consume the devices that could have rendered the content you are interested in.
Plain ascii text will likely live the longest, with markdown as a good second. Anything that executes will likely simply die.
ianlevesque | 4 years ago