When is ECMA/JS going to be versioned in <script> tags so we can avoid this the next time around?
Catering to an implementation that does bad things is an awful way to build standards, but it's only going to happen again and again and again unless the developer is forced to pick a version and build / maintain compatibility with exactly that version.
This is unlikely to happen. Browser developers don't want to maintain multiple forks of their JavaScript engines. They want new features introduced in a backward compatible manner so they can maintain a single engine. Imagine in 10 years if you had to maintain 7 slightly incompatible engines in parallel. A new browser written from scratch would have to implement all 7 engines since the older versions would never go away.
Because if you don't define your version in the script tag, what happens? I guess it would default to some LTS version, which wouldn't have fixed this issue, right?
>When is ECMA/JS going to be versioned in <script> tags so we can avoid this the next time around?
Unless there is a will to purge the ranks of committee sitters and remove inadequate people, you will never see that until the moment the web platform kicks the bucket.
I admit I was an avid MooTools user back in the day. I remember that users of that other js library often pointed out that MooTools' messing with prototypes was going to be a problem someday. They were right. Also, looking back on it, MooTools really wasn't as cool as I thought it was.
Everyone should be using MooTools now. The ECMAScript committee has basically made it an unofficial extension of the language by vowing never to break that library. Guaranteeing its stability for years to come.
You can be certain the ECMAScript committee is not going to break any of the more popular libraries (like jQuery, Angular, etc.) either. It's just that these libraries were better designed in the first place, so it hasn't been an issue.
Why do they make such a fuss about MooTools .flatten? Nobody cared about prototype.js breaking dozens of ECMAScript array methods (e.g. map, reduce, contains,...) some years ago. Site owners were simply supposed to update or remove it.
The answer is in the article. MooTools and Prototype both replaced Array methods that were later standardized, but they still work, because their provided copies overwrite the standard versions.
MooTools went further by copying “all methods from Array” onto another MooTools object, “Elements.” THAT broke, because the standard Array.prototype.flatten would be non-enumerable, and so it wouldn’t get copied to Elements, and so Elements.flatten would fail.
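A minimal sketch (assumed names, not MooTools' actual source) shows why an enumerable-only copy loop misses native methods:

```javascript
// Hypothetical sketch of the copy step, not MooTools' real code:
// copy every enumerable own property of Array.prototype onto a target.
function copyArrayMethods(target) {
  for (const name of Object.keys(Array.prototype)) {
    target[name] = Array.prototype[name];
  }
}

// A method added by a library via plain assignment is enumerable...
Array.prototype.flatten = function () { /* library's flatten */ };

// ...so it gets copied, while native methods like map (non-enumerable) do not.
const Elements = {};
copyArrayMethods(Elements);
console.log('flatten' in Elements); // true
console.log('map' in Elements);     // false

// If the browser had shipped a native, non-enumerable flatten first and the
// library skipped redefining it, the loop above would miss it too, and
// Elements.flatten would silently become undefined.
delete Array.prototype.flatten; // clean up the global patch
```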
The article addresses your question directly under the heading "Why don’t we just keep the existing name and break the Web?"
> Why don’t we just keep the existing name and break the Web?
> In 1996, before CSS became widespread, and long before “HTML5” became a thing, the Space Jam website went live. Today, the website still works the same way it did 22 years ago.
> How did that happen? Did someone maintain that website for all these years, updating it every time browser vendors shipped a new feature?
> As it turns out, “don’t break the Web” is the number one design principle for HTML, CSS, JavaScript, and any other standard that’s widely used on the Web. If shipping a new browser feature causes existing websites to stop working, that’s bad for everyone:
> - visitors of the affected websites suddenly get a broken user experience;
> - the website owners went from having a perfectly-working website to a non-functional one without them changing anything;
> - browser vendors shipping the new feature lose market share, due to users switching browsers after noticing “it works in browser X”;
> - once the compatibility issue is known, other browser vendors refuse to ship it. The feature specification does not match reality (“nothing but a work of fiction”), which is bad for the standardization process.
> Sure, in retrospect MooTools did the wrong thing — but breaking the web doesn’t punish them, it punishes users. These users do not know what a moo tool is. Alternatively, we can find another solution, and users can continue to use the web. The choice is easy to make.
The "don't break the web" excuse is the same old bad-faith nonsense:
1. The number of websites using the old version of Mootools is vanishingly small, relative to the number of existing websites not using it, and the number of as-yet-unmade websites that will also not use it (but that will be harmed by the committee's dedication to retarding the development of the language).
2. Of those websites using this old version of Mootools, the number that use flatten, and do so in a way that would definitely be broken by the introduction of standardised "flatten", and wouldn't be fixed thereafter is, statistically speaking, zero.
3. Most of the same people who are affecting concern about backwards compatibility in the case of the web work for tech giants who regularly retire products and APIs that are relied upon by thousands of people, and make backwards incompatible changes to their operating systems that break masses of unmaintained apps.
But the fact is, you can't make arguments like this to groups like the TC39 and expect to even get a hearing. Despite their occasional spiel about seeking outsider feedback they are, like any institution, happy in their long-settled groupthink positions, and reflexively frightened and dismissive of any challenge or criticism by outsiders.
You saw this when the whole "smoosh" affair first broke. There was a long, passionate, but civilised debate about it on GitHub (https://github.com/tc39/proposal-flatMap/pull/56 ) which was abruptly cut off by a TC39 committee member, who locked out non-contributors entirely, with the dishonest claim that things had become "too heated". What he really couldn't stand was the sight of the committee's dogma being challenged.
Remember that next time the representative of a standards body whines about lack of participation in the process by regular web developers. It's a lie. These groups are only interested in the opinion of outsiders if they align with a set of fixed and unchallengeable precepts, and they will revert to depressingly familiar suppression of speech the moment it looks like things aren't going their way.
Frankly, as an outsider, your post doesn't convince me. Where's your source for (1) and (2)? I read the whole GH thread and didn't find anyone presenting credible data. (3) is even weaker; it's perfectly reasonable to distinguish between an external API and the runtime of the web (it's also fine to disagree with this distinction, of course).
Also, at least two, maybe three comments were moderated for violating the code of conduct, and the commenter admitted his language was inappropriate. So there's an argument to be made that the conversation was too heated. Again, you can disagree, but your opinion doesn't make anyone else's invalid or bad-faith.
I also saw multiple committee members discussing the counterarguments, and by the time the thread was locked it was essentially just rehashing the same points. So I find it hard to agree with your claim that they refused to hear dissenters.
I've made most of these points before https://certsimple.com/blog/break-the-web, but will add: we now have a bunch of verbs you can do with an array - like splice, join, shift, reverse - and suddenly we have a noun method called flat.
Inconsistency aside, naming non-accessor functions and methods nouns is a bad practice anyway.
The discussion comes down to whether a significant amount of websites (in absolute numbers) will break. The article hints to "extensive telemetry" to figure out the number of affected sites but doesn't cite any numbers. On the other hand you don't cite any numbers or empirical evidence either.
Wouldn’t it be possible to introduce a versioning mechanism, similar to strict mode, where you could opt in to a newer version of JavaScript? This issue is not going to go away, almost any sensible method names you might want to use in the future are likely to cause problems with some legacy code.
It is possible, but I think the committees would prefer to avoid too many "modes" which makes specification and testing much more complex. Just think about the complexity caused by having "quirks" and "standards" rendering modes. In this case the issue could be resolved with a simple (if slightly awkward) rename which is certainly preferable to forking all the JavaScript engines.
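For comparison, the strict-mode opt-in mentioned above works because the directive is just an expression statement that pre-ES5 engines ignore, so no forked engine was ever needed:

```javascript
'use strict';

// In sloppy mode, assigning to an undeclared name silently creates a global.
// Under "use strict", the same assignment throws a ReferenceError.
let threw = false;
try {
  undeclaredVariable = 1; // deliberately undeclared, for illustration
} catch (e) {
  threw = e instanceof ReferenceError;
}
console.log(threw); // true
```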
I think it's just about how much legacy code, and in which sites.
+1 though. I'd really like to see a version of JavaScript not only with new features, but also without some (all?) of the weird semantics it was once dreaded for. That can certainly only happen if it's versioned.
I see so much API breakage in modern software ecosystems that I consciously try to minimise my dependencies so I don't have to spend all of my time constantly updating my code to work with the latest versions of everything. One only has to look at disasters like the transition from Python 2 to Python 3 to see how much pain breaking changes cause for both developers and users.
If TC39 took the attitude of "hey, here's a cool new feature, who cares if it breaks a bunch of existing sites, everyone can update their code, right?" then web developers would never get to make actual improvements their sites or do anything cool, because they'd be stuck in a constant cycle of maintenance hell. Sure, I find it silly that typeof(null) === 'object' because of a 20 year-old bug, but I consider the stability that comes from the "don't break the web" principle to be more than worth a few small annoyances like this.
If more of the industry followed this approach the world would be a better place.
While I agree with your general point ("don't break existing APIs"), I disagree with the current approach (I made a specific comment down there).
This would NOT have been a breaking language change; the problem was introduced by a library that overrides a BUILTIN PROTOTYPE. Let's hope no library (or userspace code) overrides the "flat" method as well.
Overriding other people's code, especially core language parts, is inherently risky from an API stability POV. I don't think all users appreciate that risk.
What would actually happen is people would stop upgrading their browsers if newer versions made some previously working sites inaccessible. The age of automatic browser upgrades would come to an end and it would again take a decade before new features could be used.
I have seen a couple people suggesting the solution that reassigning a non-enumerable property should make it enumerable. Does anyone here have any insight on why they didn't go with that solution?
Possibly it's considered too difficult to move the standard to this behaviour since it's not a new feature per-se? Or I guess a change as subtle as this could also have unforeseen consequences?
It does seem to me like a more elegant solution, and perhaps what the behaviour should have been in the first place.
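For context, here is today's behavior that the suggestion would change: plain assignment over an existing non-enumerable property only replaces the value, leaving the enumerable attribute false:

```javascript
// Array.prototype.map is native and non-enumerable.
const wasEnumerable =
  Object.getOwnPropertyDescriptor(Array.prototype, 'map').enumerable;

// Reassign it the way prototype-patching libraries do.
const nativeMap = Array.prototype.map;
Array.prototype.map = function (...args) {
  return nativeMap.apply(this, args);
};

// The descriptor's enumerable flag is unchanged by the assignment.
const stillEnumerable =
  Object.getOwnPropertyDescriptor(Array.prototype, 'map').enumerable;
console.log(wasEnumerable, stillEnumerable); // false false

Array.prototype.map = nativeMap; // restore the native method
```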
I wish they would've gone with Array.flatten() instead of Array.prototype.flat. We have `Array.from` and `Array.of` in the language already why not continue that trend? Am I missing some reason why that couldn't have happened? Also strange to see that infinite flattening is not the default behavior.
The static methods take probably-not-arrays and return arrays. An instance method takes an existing array (`this`) and returns a modified array.
Of course, you could implement a static flatten, but it goes against the grain of all the other instance methods (map, reduce, filter, etc.). Considering how often they are chained together, it would be especially weird.
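Chaining is the concrete reason: an instance method keeps the pipeline reading left to right, while a hypothetical static Array.flatten (illustration only, not a real API) would sit outside the chain:

```javascript
// Instance method chains with the rest of the array methods:
const result = [[1, 2], [3, 4]]
  .flat()
  .map((x) => x * 2)
  .filter((x) => x > 2);
console.log(result); // [4, 6, 8]

// A static form would have to wrap the front of every such pipeline:
//   Array.flatten([[1, 2], [3, 4]]).map(...).filter(...)
// (Array.flatten does not exist; shown only for comparison.)
```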
Why isn't versioning being put into the spec now? If we can have Array.prototype.flatten, why can't we have Array.prototype.v1.flatten, so that we have an easy out to this in the future?
APIs do this all the time; why can't JavaScript, if backwards compatibility is this important?
At some point, javascript is going to have to realize that they are the language and these other little tools are just that: little tools that other people have written.
The programming language will be around for a really long time. Little tools come and go. Never rename things in your language just to accommodate some little tool. There are millions of little tools. There are simply not enough synonyms available to accommodate them all.
The only sensible choice is to stand up for yourself and name things the right way in the language itself. The library will come out with a patch next week.
Here's a link to the last time they caved on the exact same type of issue, to the same silly library, a couple weeks ago:
https://news.ycombinator.com/item?id=16753851
> Sure, in retrospect MooTools did the wrong thing...
Are you kidding me? Overwriting prototypes is a widespread design pattern. Either ECMA ships a next version that makes it impossible (and breaks significantly larger parts of the web), or they shut up and accept that this mess is fully theirs. They designed themselves into a corner where evolving the language without breaking user space is hard; that was their doing. And then they failed anyway; that was their doing too.
This is what happens when you use monkey patching (i.e. overwriting the builtins of a language or library). Unfortunately, it's a very common technique in the JS world. It's sad that so much of modern JavaScript development is based on such bad practices. Just because it's easy doesn't mean people should do it. Touching globals is almost never a good idea.
Monkey patching is also what allows polyfills, which are one of the things which allow the web to move forward. So I would say it is more of a double-edged sword than an all-around bad practice.
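A polyfill can take the safe edge of that sword: install only when the method is missing, and use Object.defineProperty so the addition is non-enumerable like a native method. A rough sketch for Array.prototype.flat (not production-grade; no argument coercion):

```javascript
if (!Array.prototype.flat) {
  Object.defineProperty(Array.prototype, 'flat', {
    value: function flat(depth = 1) {
      // Recursively concat sub-arrays until the requested depth is reached.
      return depth < 1
        ? this.slice()
        : this.reduce(
            (acc, v) =>
              acc.concat(Array.isArray(v) ? flat.call(v, depth - 1) : v),
            []
          );
    },
    // Match native method attributes; enumerable defaults to false.
    writable: true,
    configurable: true,
  });
}

console.log([1, [2, [3]]].flat());  // [1, 2, [3]]
console.log([1, [2, [3]]].flat(2)); // [1, 2, 3]
```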
The fact that this can happen at all should make us think about both a) the language and b) the commonly accepted use of language features.
How come that user code of any kind gets to override a builtin prototype, and that people are happy with that?
Would it be THAT bad to create a "mootools.Arrays" object with a "flatten" function? I think that either you've got a way to do such language extensions in a narrow scope (I think Ruby modules can do that, but I should check) so that they don't influence the whole app, or you simply shouldn't do that, period. Crazy interactions are hidden everywhere, otherwise.
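Sketching that idea (the "mootools.Arrays" name is hypothetical, just for illustration), the helper lives on the library's own namespace and never touches Array.prototype:

```javascript
// Hypothetical namespaced alternative to patching Array.prototype:
const mootools = {
  Arrays: {
    // Fully flatten nested arrays without touching any builtin.
    flatten(arr) {
      return arr.reduce(
        (acc, v) =>
          acc.concat(Array.isArray(v) ? mootools.Arrays.flatten(v) : v),
        []
      );
    },
  },
};

console.log(mootools.Arrays.flatten([1, [2, [3, 4]]])); // [1, 2, 3, 4]
```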
I think the article does a pretty good job at convincing the most perfectionist of web developers that "Well... okay, sure, we'll call it flat instead of flatten."
Zarel | 7 years ago
TC39 can't introduce breaking changes because browsers won't implement breaking changes.
Browsers won't implement breaking changes because users would switch to other browsers which don't implement breaking changes.
It doesn't matter whose "fault" it is, if a website works in one browser but not in another, users are going to use the browser it works in.
baybal2 | 7 years ago
Some people in the web standards community and at the dotcoms will protest. There are big believers in the "live version" of the web.
It is a very sensible thing to do, but you would need to kick those people off the committees. Despite numerous attempts, no one has succeeded at that yet.
groby_b | 7 years ago
"Might makes right" is rarely a good choice, not even for programming languages.
csomar | 7 years ago
Last release was more than two years ago. It's time to move on.
alanfranz | 7 years ago
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Inhe...
"One misfeature that is often used is to extend Object.prototype or one of the other built-in prototypes."
fiiv | 7 years ago
So what is to be done here?
How about leaving it up to MooTools to release a new, compatible version, and letting website owners adjust their code accordingly?