I come from a Python background. However, thanks to work and school, I now program mostly in JavaScript, PHP, and Java (though I still use Python when I get a chance).
Now, I could use underscore_names in Java and JavaScript, but I don't. Even though I personally prefer underscore_names to camelCaseNames, I also realize that those languages are designed with camelCaseNames in mind, that the community conventions are for camelCaseNames, and that it is better to write code that looks nice and idiomatic in that language than it is to write code that looks nice and idiomatic in Python.
Not to mention that there are some cases where camelCaseNames are required - for example, when overriding inherited methods in Java - and if I used camelCase where required and underscore_names everywhere else, my code would be inconsistent, which to me is worse than using a style I don't like. So just because I could use underscore_names when I wanted to, there are a lot of reasons that I shouldn't.
A lot of those points also apply to JavaScript:
* JavaScript was designed with the use of semicolons in mind. Brendan Eich himself has said that ASI was only intended as an extra check for sloppy programmers.
* Outside the Ruby on Rails crowd, all the JavaScript I have ever seen uses semicolons. Even within the Rails crowd, this "no semicolons" thing is fairly recent.
* Since the majority of JavaScript syntax is intended to mimic Java syntax, which does require semicolons to separate statements, semicolons blend well with the language, and are therefore nice and idiomatic.
* There are situations where you have to use semicolons due to ambiguity to write straightforward code - there are workarounds, like tricks involving !, but they confuse the intent of the code.
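One of those ambiguity cases can be sketched concretely (the names and values below are illustrative, not from any real codebase):

```javascript
// Illustrative ASI hazard: a line starting with `[` (or `(`) attaches
// to the previous statement unless a semicolon separates them.
function demo() {
  var x = 1;
  var y = 2;
  var results = [];
  // Without the semicolon after `x + y`, these two lines:
  //   var total = x + y
  //   [x, y].forEach(...)
  // parse as one statement: (x + y)[x, y].forEach(...) - a runtime TypeError.
  var total = x + y;
  [x, y].forEach(function (n) { results.push(n * 2); });
  return { total: total, results: results };
}

demo(); // with the semicolon, this works as intended
```

The semicolon-free workaround is to start such lines with `;[` or a `!` prefix, which is exactly the sort of trick that confuses the intent of the code.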
One thing that I noticed is that most of the notable semicolon-haters - fat, mislav, and the GitHub guys - come from Ruby on Rails. Conveniently enough, Ruby does not require semicolons at the end of statements. I suspect this anti-semicolon fervor may come from a desire to use Ruby's conventions with JavaScript.
I think your camelCase example really nails it. Code should be written in a way that is both non-ambiguous and idiomatic to the existing codebase.
If you're in a position where you can define that idiom, then by all means do so. But stay consistent so that when others join a project, or you leave a project, the intent of your codebase is well understood without needing to wade through pages of documentation.
So, IMO, while semicolons are important, they're relatively trivial and an easy "bug" to fix. If I'm looking for an authority, I typically check out the Google Style Guides.
A similar, though more pressing, issue that I've faced recently is the proper parenthesization of conditions for if. It's generally nice when you don't have to spend a minute or two remembering/looking up the nuances of C++ operator precedence. Knowing all of the intricacies and tricks of a language yourself doesn't mean every member of your team knows them that well.
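The same parenthesization point can be made in JavaScript terms (these functions are illustrative, not from any style guide):

```javascript
// Illustrative only: `&&` binds tighter than `||`, so both functions
// return the same value - but only one says so without a precedence table.
function needsTableLookup(a, b, c) {
  return a && b || c;       // parsed as (a && b) || c
}

function obviousAtAGlance(a, b, c) {
  return (a && b) || c;     // identical result, explicit grouping
}
```

Either form is valid; the parenthesized one just spares the next reader the lookup.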
Your comment makes me wonder if this is localized issue, or a more general anti-pattern in polyglot programming. The pattern being: "push my favorite language into the other ones I use". It may or may not be covered by "you can write FORTRAN in any language", or some sort of corollary to Greenspun's 10th law.
As a counterpoint, we decided to go with underscore_names for our JS since we work with a lot of serialized Python data structures via JSON, and using the same naming convention on the server and the client means our JS doesn't have a mix of camelCase and under_scores for things received via JSON and things defined in the JS. We're not writing reusable libraries for other projects to use, and we don't use a heavy amount of 3rd party code, so it's no big deal.
Incidentally, we use CamelCaseMethodNames in Python at Google, presumably to be consistent between languages. If you're one company that uses multiple languages, it may be beneficial to use the same style in each, even if that style is non-idiomatic.
I've been shocked at the level of disrespect for language standards here. Yes: adding semicolons is probably good practice because it avoids the chance of stumbling over bugs like this. And yes: the ASI feature in Javascript is in hindsight a terrible mistake.
That said: you go to war with the language you have, not the one you might want or wish to have.
ECMAScript is ECMAScript. Arguing that your transformation tool doesn't need to handle a feature specified in the language and supported by all known implementations is just ridiculous. Arguing that people making use of a feature that the language standard says they can (and that works) are "sloppy" is equally dumb. Even weirder are the people who jumped on the use of the && operator to effect an "if" as "abuse" -- this is an idiom pervasive in lots of areas and (again) well-supported by Javascript.
Not everyone is going to have the same aesthetics, everyone has a different idea about what features are "fun" and which are "sloppy". And decades of experience in the "code convention hell" world of enterprise programming has taught us nothing if not that this sort of pedantry helps no one. If you want to use Javascript, you have to take the whole language -- warts and all.
This is not really about whether semicolons are required or not.
The point is that in order to ensure maintainability of code you should try to use language (and framework) in a way which ensures better maintainability, supportability, and portability of your code.
Look, the statement like "a && b" is 100% valid in many languages but in order to increase maintainability, supportability, and portability of your code it should be written like "if (a) { b; }".
The easiest way to understand the point of this rule is to get a job maintaining some old crappy code-base :) - I learned it that way.
I agree with you to a certain extent, but I do think it is important to distinguish between those that think that they should just use a semi-colon and those that seek to disrespect a standard.
In my case, and I've seen this sentiment expressed repeatedly by others on HN, I think it is odd that a minification script would actively choose not to support a syntax that is standards compliant. On the other hand, I think it is crazy for a major project to use a valid syntax that not only breaks said [popular] minifier but also offers no noticeable benefit.
I do not feel as if I'm disrespecting any standard just because I think simply adhering to that standard is not a sufficient justification for doing something.
That said: you go to war with the language you have, not the one you might want or wish to have.
That's taking the quote out of context, which is saying a lot because it's an oft-quoted example of Donald Rumsfeld's tap-dancing. I doubt Mr. Rumsfeld was advocating driving around in Humvees as if they had armor. I'm sure he would laud the attempts of soldiers to improvise and mitigate the risks inherent in their equipment as much as possible.
And decades of experience in the "code convention hell" world of enterprise programming has taught us nothing if not that this sort of pedantry helps no one.
Over a decade of experience consulting in that world has shown me that adherence to code conventions has tremendous benefits. In shops where there was strict adherence to code conventions, I could be 10X or 100X more productive when refactoring using automated tools.
If you want to use Javascript, you have to take the whole language -- warts and all.
You need to justify this. This strikes me as a silly and counterproductive notion. Even in a tiny language like Smalltalk, you don't want to use, "the whole language -- warts and all," every chance you get. Hacky tricks have a cost. Just because you can implement an entire parsing system using doesNotUnderstand handlers, doesn't mean you really want to. (And yes, I've seen this happen in real life -- you really Do Not Want!)
Are you kidding? JSMin is open source. If you care so much, send a pull request. JS has plenty of weird corner cases. It's up to the author to decide whether he wants to spend time dealing with every single one.
Haven't yet seen anyone point out what unpleasantly quirky code this was in the first place:
!isActive && $parent.toggleClass('open')
It should have been written like this:
if (!isActive) {
    $parent.toggleClass('open');
}
What if somebody needs to add a second bit of code to be executed if isActive is false? In the first case, they'd have to refactor the code into an if statement before adding it. It should have been an if statement in the first place.
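A self-contained sketch of that extension problem (the `$parent` here is a plain stub standing in for the jQuery object in the snippet above, and `trigger('shown')` is a hypothetical second action):

```javascript
// `$parent` is a stub recording calls, so the example is runnable
// without jQuery; the names are illustrative.
var calls = [];
var $parent = {
  toggleClass: function (name) { calls.push('toggleClass:' + name); },
  trigger: function (name) { calls.push('trigger:' + name); }
};
var isActive = false;

// The && idiom permits exactly one consequent:
//   !isActive && $parent.toggleClass('open')
// The if statement extends without any refactoring:
if (!isActive) {
  $parent.toggleClass('open');
  $parent.trigger('shown');  // the later addition is a one-line change
}
```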
If you want something to happen if something is true, then you should use an if statement. This is not controversial stuff. Don't look for "clever" ways to misappropriate other parts of the syntax in order to appeal to your own personal minimalist aesthetic taste. Be cooperative.
Edit: On re-reading this it comes off as preachy. In fact I've very recently taken a closer look at some of my own "little quirks" and realised how unhelpful they were for other developers. I guess I'm embarrassed about that and want to spread the embarrassment around.
Because all you see it as is an argument about syntax. It's actually an argument between the old guard and the up-and-coming hotshots.
Nobody really cares one way or another, and while it would take Twitter more time to append changes to their code than it would Crockford, my guess is the trouble for either would be negligible.
That said, this is very obviously a pride war between those who stick to convention and those who undermine it. The question isn't "should we use semicolons", it's "who is going to start dictating the direction Javascript goes from here on out?"
Clearly, both of these individuals want that spot, but if there's anything I've learned in my short time of coding, convention always wins out.
I started programming and hacking because I wanted to make something cool and interesting which others would appreciate, not to bicker about something as silly as semicolons. If I wanted to do that, I'd work in retail. When an argument over something so trivial gets to this level, I cannot help but be bored by it.
I think it's so boring because it's a one-sided debate.
One side chooses a style that they find aesthetically-pleasing even if it causes issues for some small subset of potential users.
The other side is flabbergasted that someone would be so reckless and argues for the sensible, safe option, which requires simply terminating your lines of code with an extra character, making those few issues for a small subset of potential users vanish instantly.
Actually, my understanding is that there is nothing technical about this debate: the debate is about writing maintainable, supportable, and portable code versus "cool" code.
But I also agree this is "by far the most boring debate ever to hit HN" (without technical :).
I think this piece is a great place to end it, as it builds a solid pragmatic case.
It was interesting for me to learn about the possibility of using that kind of '!' notation in JS, even if it's impenetrable to most other developers. Maybe I'll be able to parse some other hipster's code thanks to this.
Of the sustained debates, I feel it is tied with the "my nosql is better than yours" debate. Both of them seem to boil down to "understand your tools, your use case may not map to the thing someone else is advocating for their use case".
Maybe. But I barely spent any time reading all the arguments and more time reading through and learning about the syntax of JavaScript, so it was probably more interesting for me than most people simply because I tuned out most of the boring bits. Also, I learned that Closure linter's fixjsstyle can add semicolons for you, which is nice to know.
But that's what I always do: scan for the interesting bits or move on.
There's so much FOR it, it's unbelievable. Yes, it's good to 'break the rules' now and then. Add the semicolon.
This is not about if JS allows it or not. It's important, but mostly irrelevant.
You can also not add 'var' to JS variables unless needed. That's going to be a lot of fun when it goes wrong.
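A sketch of how a missing `var` goes wrong (all names here are illustrative): without the declaration, an inner function's assignment resolves up the scope chain instead of creating a local.

```javascript
// `inner` forgot its own `var i`, so its loop reuses the enclosing
// function's `i` - and the outer loop terminates early.
function outer() {
  var log = [];
  var i;
  function inner() {
    for (i = 0; i < 3; i++) {} // no `var i` here: clobbers outer's i
  }
  for (i = 0; i < 3; i++) {
    inner();                   // leaves the shared i at 3
    log.push(i);
  }
  return log;                  // the outer loop body ran only once
}
```

Had `inner` declared its own `var i`, the result would be [0, 1, 2] instead of [3].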
But here are the million dollar questions:
1 - How much time is spent making the code work without a semicolon, as opposed to just typing one? The time to type ';' is irrelevant; the mental effort is not
2 - How much time will be wasted fixing minifiers (that is, adding to them the intricacies of no-semicolon parsing and probably having to write a whole different parser)
3 - How much mental effort is needed to comprehend and correctly fix non semicolon code
Number 3 is the biggest issue, and if you don't believe me, it goes by another name: coding standard.
Yes, JS works without semicolons. And yes, C works without indenting, without meaningful names to variables, etc.
It's not about "to write JS you should know all the nitty-gritty rules of the language and you're stupid if you don't know them so you just add semicolons". It's about teamwork, and facilitating code comprehension (and maintenance).
And language designers make mistakes. They don't know if a feature is going to become a trap, irrelevant to 99% of developers (and with an easy workaround) or just a pain in the behind.
1. How many times are we going to have to debug and rewrite code to work around this defect? If it were just this once, sure, add the ';' - unfortunately, it is never just this once.
2. How much time will be saved by using a minifier that actually supports the JavaScript language? What other language features will break because someone didn't feel like supporting JavaScript in their JavaScript minifier?
3. How much mental effort is needed to support half a dozen different minifiers that all support a different subset of the language as well as the cross browser differences we already have to take into account.
It is a matter of trust. If your tools skip supporting language features because someone decided they didn't like language feature X, what other corner cases did they skip because they were unliked?
Now, this could be easily solved – by adding the friggin semicolon.
What this furore misses is that the original issue (JSMin failing to minify bootstrap-dropdown.js) was already fixed when the bug was raised [1]. Fixed without adding semicolons. Everyone should be happy with that. Developers of bootstrap got to stick to their "no semicolons" schtick, and the person with the original problem got it fixed. Everyone seems to forget this salient point when they rush in to this debate.
For reference, I sometimes use semicolons in my javascript, sometimes I don't. It depends on context and whether it makes the code more readable. It's not an issue I care enough about to get involved in a holy war, one only marginally more relevant than tabs vs spaces.
That isn't really the original issue at all; Crockford's comment is just what brought it to a boil. The lack of semi-colons has been brought up in issue after issue on the bootstrap project. Each time it's been rejected for the same arguably poor reasoning.
Christian's post is completely spot on here. Javascript was designed to be tolerant of errors and inconsistencies as much as it could. That fact however, shouldn't be used as an excuse for advocating inconsistent coding. Not that I'm saying semi-colons are the epitome of consistent coding (I prefer Ruby myself) but that Javascript was not designed with significant whitespace in mind, rather it just has a tolerance for inconsistent and arguably erroneous syntax.
The argument here is that we shouldn't let Javascript's tolerance excuse laxness on our parts. We should know better.
To all the people arguing that the code as presented is good and right, I present this:
Everyone knows that debugging is twice as hard as writing a program in the first place. So if you are as clever as you can be when you write it, how will you ever debug it? ~Brian Kernighan
I couldn't tell that the second line in the code in question was an if statement at first without actually thinking about it. How is that helpful?
The author is right on the money. I've always written code as specified - just because it felt 'right' - but I could never pin down my arguments - the author captures my thoughts beautifully.
Particularly his points about reading other people's code & extending functionality - there are very few use-cases where it makes sense to omit semi-colons, end-tags, etc - and if you aren't sure - yours isn't one of those use-cases.
I have grown extremely weary at the level of discourse that this whole situation has provoked - the linked post is one ad hominem after another! What is this supposed to accomplish? Hopefully I can get the people I'm criticizing to change their ways by making them feel really bad about themselves? By telling them they aren't visionaries, they are semi-colons, they are arrogant, sloppy, lazy? It's destructive, self-indulgent, and completely unnecessary.
> The main issue with these parser-fetish arguments is that they assume that you write code for a parser – sometimes even a certain browser – and not for other developers.
That's an excellent and important point beyond the petty incident. You write code solely for humans, not for the compiler.
I think the real issue in the whole debate is that 95%+ of people programming do not understand how programming languages are implemented and haven't been exposed to basic theoretical concepts - e.g. that languages can be ambiguous, that there are cases where it might be impossible to interpret a part of the code, that handling syntax errors is actually hard, etc. This post is a good example; it completely misses the point, as the only reason semicolons are present in some languages is to make parsing possible. I think many people would not argue about this if they had a clue why programming-language syntax is the way it is.
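One concrete case where the grammar, not the programmer, decides: `return` is a "restricted production", so ASI terminates it at a line break. A minimal illustration:

```javascript
// A line terminator after `return` triggers automatic semicolon
// insertion, so the object literal below is never returned.
function brokenValue() {
  return         // ASI inserts a semicolon right here
  { answer: 42 } // parsed as an unreachable block, not a return value
}

function workingValue() {
  return { answer: 42 };
}
```

Here adding semicolons doesn't help; you have to know the parsing rule and keep the value on the same line as `return`.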
This whole debate seems like two folks stuck in their way blowing a whole thing out of proportion.
How about Bootstrap adds the damn semi-colon, and JSLint accepts that, for the most part, the lack of a semi-colon is working and "valid" JS and allows for the edge case. Now everyone gets to go home happy.
The author is picking and choosing who he wants in this argument of his. He has Douglas Crockford and Brendan Eich on the pro-trailing-semicolon side. On the other side he has @fat. If he wanted to be fair he could have included someone like @izs or Thomas Fuchs. But if he referenced their viewpoints on it, it would make it harder to pretend that all code that doesn't include trailing semicolons after every statement is brittle.
I think this whole semicolon story is a natural step of a language becoming more wide-spread. The more people spend time appropriating the language, the more they'll want to push the envelope, exploit the quirks and get the best out of the language's syntax.
It's a natural cycle. C programmers went through the very same phase at some point and to this day different coding styles persist.
It's only when those choices cause incompatibilities that friction emerges, but we should see it as a natural step towards a more unified grasp of what Javascript means for people who program with it. That the general tone of the conversation is antagonistic is just a symptom of the fact that people care about their opinions and their choices, which by all means should be seen as a very healthy questioning on the part of the community. Just my 2 cents.
TL;DR: Sure the image conveyed is bad, but the reasons why such a debate emerges are natural and are part of the evolution of a language, it'll get better.
To me this argument boils down to style over maintainability. The arguments for the latter seem so clear in my mind that I don't really want to dignify the other side with a response.
Every programmer should have had a professor who was really, inordinately fond of Ada, so much so that at least one assignment required coding in, or basic knowledge of, the language.
Ada is strict as balls, but unlike its wannabe-successor, C++, its strictness is for the sake of clarity, and the compiler usually actively helps you bring your code into compliance. (Rather than, say, complaining randomly because the syntax for template instantiations changed this week, or punishing you for pronouncing the word "const" with the improper intonation.)
With a bit of exposure to Ada, programmers might understand better why languages are so finicky about syntax details, and that just because a language is lenient, doesn't mean you should take advantage of that leniency. And that, in cases like JavaScript with its bloody semicolons, perhaps leniency is a disadvantage.
I had a professor like that; he also was a signatory to the original ECMAScript standard. He also claims he came up with that name. Since they were spending too much time debating the name, he figured they'd call it that for now, and since it was such an awful name, they would be sure to go back and change it.
I think the argument that we shouldn't rely on the parser for certain language features is a bit silly (including interpreting end-of-statements). The language is precisely what the parser says it is, and nothing more or less. JSMin is free to not do what the parser does of course, but that won't be Javascript.
So, if your code needs syntactic changes when it gets extended, to me you've optimized prematurely. Code will always change, and making sure our maintainers have a hard time breaking it is a very simple and good idea.
In other words, a good programmer writes code for project maintainability, not to signal dominance by showing off knowledge of the parser. [+]
(Corollary: One's cleverness is always a finite resource. I'd rather work with someone who devotes that resource to what's good for the project, not his own fame and ego.)
[+] - Unfortunately, the way teaching CS sometimes works, students are rewarded for showing off clever and elite code every chance they get, to show the prof they're a "real" coder.
This actually reminds me of some cool hacks with structure pointers and functions in the Linux kernel.
"In any dispute the intensity of feeling is inversely proportional to the value of the issues at stake."
http://en.wikipedia.org/wiki/Sayre%27s_law
It amazes me how much time people have spent arguing over this.
When I see an open source library that's not coded in the way I like, I either customize it to my liking or look for alternatives.
The only time I would really care about someone's programming style is when I work with them, otherwise live and let live.
Maybe the next time a language designer contemplates including such a feature as ASI, he/she will remember this furore and reconsider.
It boggles the mind that so many otherwise intelligent people have decided to take the time to form an opinion about semicolons.
[+] [-] Natsu|14 years ago|reply
But that's what I always do: scan for the interesting bits or move on.
[+] [-] raverbashing|14 years ago|reply
"but but but" add the semicolon
There's so much FOR it, it's unbelievable. Yes, it's good to 'break the rules' now and then. Add the semicolon
This is not about if JS allows it or not. It's important, but mostly irrelevant.
You can also not add 'var' to JS variables unless needed. That's going to be a lot of fun when it goes wrong.
But here are the million dollar questions:
1 - How much time is spent making code work without a semicolon, as opposed to just typing it? (Not the time to type ';', which is irrelevant, but the mental effort of doing so.)
2 - How much time will be wasted fixing minifiers (that is, adding the intricacies of no-semicolon parsing to them, probably having to write a whole different parser)?
3 - How much mental effort is needed to comprehend and correctly fix semicolon-less code?
Number 3 is the biggest issue, and if you don't believe me, it goes by another name: coding standards.
Yes, JS works without semicolons. And yes, C works without indentation, without meaningful variable names, etc.
It's not about "to write JS you should know all the nitty-gritty rules of the language and you're stupid if you don't know them so you just add semicolons". It's about teamwork, and facilitating code comprehension (and maintenance).
And language designers make mistakes. They don't know if a feature is going to become a trap, irrelevant to 99% of developers (and with an easy workaround) or just a pain in the behind.
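A minimal sketch of the best-known ASI trap raverbashing is talking about (the function names here are illustrative):

```javascript
// ASI inserts a semicolon immediately after a bare "return", so the
// object literal below is parsed as an unreachable block statement
// and the function silently returns undefined.
function brokenConfig() {
  return
  {
    minify: true
  }
}

// Keeping the opening brace on the same line avoids the trap.
function workingConfig() {
  return {
    minify: true
  }
}
```

This is exactly the kind of case where the parser is doing something well-defined but the code's intent and its behavior no longer match.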
[+] [-] stonemetal|14 years ago|reply
1. How many times are we going to have to debug and rewrite code to work around this defect? If it were just this once, sure, add the ';'; unfortunately, it is never just this once.
2. How much time will be saved by using a minifier that actually supports the JavaScript language? What other language features will break because someone didn't feel like supporting JavaScript in their JavaScript minifier?
3. How much mental effort is needed to support half a dozen different minifiers that all support a different subset of the language, on top of the cross-browser differences we already have to take into account?
It is a matter of trust. If your tools skip supporting language features because someone decided they didn't like language feature X, what other corner cases did they skip because they were unpopular?
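The concatenation hazard behind this whole argument can be sketched like this (the "file" contents are hypothetical, and `eval` stands in for a naive build step that joins files with a newline):

```javascript
var sideEffects = []

function makeWidget() {
  sideEffects.push('widget made')
  return { name: 'widget' }
}

// Two hypothetical source files; the first ends without a semicolon.
var fileA = 'var a = makeWidget()'
var fileB = '(function () { sideEffects.push("init ran") })()'

// Joined with only a newline, ASI does NOT kick in: "(" can continue the
// previous expression, so this parses as makeWidget()(function () {...})()
// and throws because makeWidget's return value is not callable.
try {
  eval(fileA + '\n' + fileB)
} catch (e) {
  sideEffects.push('broken build: ' + e.name)
}
```

The second file's initialization code never runs, which is why build tools either insert semicolons between inputs or refuse to guess.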
[+] [-] LoonyPandora|14 years ago|reply
For reference, I sometimes use semicolons in my javascript, sometimes I don't. It depends on context and whether it makes the code more readable. It's not an issue I care enough about to get involved in a holy war, one only marginally more relevant than tabs vs spaces.
[1] https://github.com/twitter/bootstrap/issues/3057#issuecommen...
[+] [-] lucisferre|14 years ago|reply
Christian's post is completely spot on here. JavaScript was designed to be as tolerant of errors and inconsistencies as it could be. That fact, however, shouldn't be used as an excuse for advocating inconsistent coding. Not that I'm saying semicolons are the epitome of consistent coding (I prefer Ruby myself), but JavaScript was not designed with significant whitespace in mind; rather, it just has a tolerance for inconsistent and arguably erroneous syntax.
The argument here is that we shouldn't let Javascript's tolerance excuse laxness on our parts. We should know better.
[+] [-] ehutch79|14 years ago|reply
Everyone knows that debugging is twice as hard as writing a program in the first place. So if you are as clever as you can be when you write it, how will you ever debug it? ~Brian Kernighan
I couldn't tell that the second line in the code in question was an if statement at first without actually thinking about it. How is that helpful?
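The shape of line ehutch79 means looks roughly like this (reconstructed for illustration, not the exact source in question):

```javascript
var log = []
var isActive = false

// Statement used purely for its short-circuit side effect; the leading
// "!" reads like a negation at a glance, not like a conditional.
!isActive && log.push('opened')

// The conventional form states the same intent directly.
if (!isActive) {
  log.push('opened again')
}
```

Both statements do the same thing, but only the second announces itself as a conditional to a reader skimming the file.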
[+] [-] bmj1|14 years ago|reply
Particularly his points about reading other people's code and extending functionality: there are very few use cases where it makes sense to omit semicolons, end tags, etc., and if you aren't sure, yours isn't one of those use cases.
[+] [-] ExpiredLink|14 years ago|reply
That's an excellent and important point beyond the petty incident. You write code solely for humans, not for the compiler.
[+] [-] mikeryan|14 years ago|reply
How about Bootstrap adds the damn semicolon, and JSLint accepts that for the most part the lack of a semicolon is working and "valid" JS and allows for the edge case. Now everyone gets to go home happy.
[+] [-] VeejayRampay|14 years ago|reply
It's a natural cycle. C programmers went through the very same phase at some point and to this day different coding styles persist.
It's only when those choices cause incompatibilities that friction emerges, but we should see it as a natural step towards a more unified grasp of what Javascript means for people who program with it. That the general tone of the conversation is antagonistic is just a symptom of the fact that people care about their opinions and their choices, which by all means should be seen as a very healthy questioning on the part of the community. Just my 2 cents.
TL;DR: Sure the image conveyed is bad, but the reasons why such a debate emerges are natural and are part of the evolution of a language, it'll get better.
[+] [-] bitwize|14 years ago|reply
Ada is strict as balls, but unlike its wannabe-successor, C++, its strictness is for the sake of clarity, and the compiler usually actively helps you bring your code into compliance. (Rather than, say, complaining randomly because the syntax for template instantiations changed this week, or punishing you for pronouncing the word "const" with the improper intonation.)
With a bit of exposure to Ada, programmers might understand better why languages are so finicky about syntax details, and that just because a language is lenient, doesn't mean you should take advantage of that leniency. And that, in cases like JavaScript with its bloody semicolons, perhaps leniency is a disadvantage.
[+] [-] EvilTerran|14 years ago|reply
"The parser is the spec" is a nasty design smell, IMO.
[+] [-] mike_esspe|14 years ago|reply
At least for me, mental switch between semicolon and no semicolon is hard for some reason, if it happens a lot during the day :)
[+] [-] stcredzero|14 years ago|reply
So, if your code needs syntactical changing when it gets extended, to me you’ve optimized prematurely. Code will always change and making sure our maintainers have a hard time breaking it is a very simple and good idea.
In other words, a good programmer writes code for project maintainability, not to signal dominance by showing off knowledge of the parser. [+]
(Corollary: One's cleverness is always a finite resource. I'd rather work with someone who devotes that resource to what's good for the project, not his own fame and ego.)
[+] - Unfortunately, the way teaching CS sometimes works, students are rewarded for showing off clever and elite code every chance they get, to show the prof they're a "real" coder.
[+] [-] Tichy|14 years ago|reply