I never tire of comparing Lua with Javascript, and this is another one of those cases. A semicolon in Lua is simply a character which is illegal to place anywhere but the end of a statement.
There is no insertion stage, the language parses just fine on a single line, or any whitespace you choose. Semicolons are vanishingly uncommon in Lua code, but if you want to compress a couple statements onto a line, they signal your intention.
Lua is like what Javascript would be if it were: based on Pascal, written by resource-constrained Brazilians, and capable of making backwards-incompatible changes. Though with the de-facto fork between Lua and LuaJIT, backwards-incompatible changes have become much more difficult to make, as they should at some point in the maturing of a language.
I'm not sure Lua is that different than Javascript here. What is a semicolon in Javascript if not a character you can only place at the end of a statement? There are some similar ambiguities in the Lua parser if you insert a newline into the middle of a statement.
The difference is that Lua is quite a bit fussier about statements versus expressions, and won't allow things like "x() or y()" as statements. So the Javascript example, translated to Lua, wouldn't compile with or without a semicolon.
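To make the contrast concrete, here is a minimal sketch (my own illustration, not code from the thread) of the kind of bare expression statement JavaScript accepts and Lua rejects:

```javascript
// JavaScript happily treats a bare "x() || y()" as an expression statement;
// the Lua equivalent "x() or y()" is a syntax error outside an assignment.
const called = []
function x() { called.push("x"); return false }
function y() { called.push("y"); return true }

x() || y() // short-circuits: x() is falsy, so y() also runs

console.log(called) // ["x", "y"]
```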
The real surprise here is that people still use JSMin, written by an opinionated grump who apparently does not understand the concept of using a real parser. Several superior alternatives exist. UglifyJS is better in every respect.
You can get by perfectly fine without semicolons. Going further, if you're using a linter like eslint, I think you can even configure it to give you a warning when the missing semicolon could cause problems.
The biggest problem I have is the "leaders" forgetting the beginners. The whole semicolon thing took JS from "end a statement with a ;" to:
THE RULES:
In general, \n ends a statement unless:
1. The statement has an unclosed paren, array literal, or object literal, or otherwise ends in a way that is not a valid way to end a statement (for instance, ending with . or ,).
2. The line starts with -- or ++ (in which case it will decrement/increment the next token).
3. It is a for(), while(), do, if(), or else, and there is no {.
4. The next line starts with [, (, +, *, /, -, ,, ., or some other binary operator that can only be found between two tokens of a single expression.
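Rule 4 is the one that bites in practice. A small sketch (my own example; eval is used only so the parse-time surprise can be caught at runtime):

```javascript
const a = 1
const b = 2

let result
try {
  // The "[" at the start of the second line continues the first statement,
  // so this parses as: const c = a + b[a, b].forEach(...)
  // b[a, b] uses the comma operator, i.e. b[2], which is undefined,
  // so calling .forEach on it throws a TypeError.
  eval("const c = a + b\n[a, b].forEach(v => v)")
  result = "no error"
} catch (e) {
  result = e.constructor.name
}
console.log(result) // "TypeError"
```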
You have a junior dev now that's 2 steps behind because he's worrying about self-imposed "beautifying" edge cases instead of getting his code to run.
Not using semicolons has never been a problem for me.
Just like everyone else I put them where they are needed. I don't put them where they aren't needed, because they aren't needed and adding pointless syntax noise is dumb. If I forget to put them in somewhere they are needed, as anyone might do, I add them. No problems.
Except that there actually was a problem in this case, as evidenced by the surrounding discussion. If the code had been written with a semicolon (or an if statement!) from the beginning, there never would have been a problem. You can argue that it wouldn't have been a problem if JSLint had been written "correctly" from the beginning, but every tool has bugs.
Code defensively, and don't get fancy unless you need to. You're not just complying with the language spec, you are communicating your intent to the other developers who will read and maintain your code.
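The Bootstrap line itself isn't quoted in this thread; as a hypothetical illustration of the trade-off, compare a "clever" expression statement with the plain if statement the comment recommends:

```javascript
function clearMenus() { return "cleared" }
const opened = true

// Fancy: a bare "&&" expression statement. Legal, but it reads as a trick,
// and this is the style of statement that interacted badly with the minifier.
let r1 = null
opened && (r1 = clearMenus())

// Defensive: an if statement says exactly what it means and minifies safely.
let r2 = null
if (opened) {
  r2 = clearMenus()
}

console.log(r1, r2) // "cleared" "cleared"
```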
Which programming language first used a semicolon as a statement delimiter? Did they choose the semicolon because it was easy to type on a QWERTY keyboard? Why not end a statement with a period like a sentence?
A while back I think there was a github issue where the authors of some popular library refused to use semi-colons and it was causing all sorts of problems for people. I can't remember what library, but it was funny watching hundreds of people comment on the issue to say "Just use the damn semi-colons!"
This has never really struck me as a good argument for using or not using semi-colons.
1.) The non-use of semi-colons was perfectly valid JavaScript in that case.
2.) The parser portion of the minifier from Douglas Crockford did not account for this particular case.
3.) It became a flamewar because Douglas Crockford called it out as an error in a very ham-fisted way, and the developer responded in kind. Compare C++: if a compiler chokes on valid input, a bug is filed with the compiler maintainer, and the developer works around it in their code until the bug is fixed.
4.) And that's exactly what happened. The minifier parser was fixed, and the Bootstrap code was updated to work around the parser error.
5.) The reason for not using semi-colons was legitimate. This is a library with a large audience, and not everyone has a minifier/compressor available. Dropping syntax that wasn't needed was an easy win. That's exactly the kind of judgment I want when I lean on 3rd parties.
So for me this could have gone a completely different way had the two parties been more civil:
BS: Hey DC, can you fix your minifier, it looks like it's busted on this valid input.
DC: Hrm, looks like you're right. I'll fix it up; why don't you add a semi-colon in there until I can get it fixed?
BS: Hey, no problemo. Done. I look forward to your fix!
So this is really more of a story about being professional to your fellow developers than anything that has to do with semi-colons or not.
I see these warnings a lot, but I work on a team that does not ever use semicolons, and hasn't for years. It is simply not a problem, I don't know what else to say. And I have come to really dislike the syntactic noise the semicolons add.
outworlder | 11 years ago
Think Blizzard. WoW uses Lua heavily. Though I suspect that niche will eventually be taken away from Lua too, which is unfortunate.
ScottBurson | 11 years ago
(For those who don't know the reference: [0], #3.)
[0] http://www.cs.yale.edu/homes/perlis-alan/quotes.html
implicit | 11 years ago
It would be nice if it were deleted from the language entirely.
BrendanEich | 11 years ago
Do you use ; at end of var declarations? If not, beware; if so, do you use comma-first? Thanks for answers, just curious.
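For readers unfamiliar with the style being asked about, here is my own sketch of a comma-first var declaration without a trailing semicolon; the leading commas are continuation tokens, so ASI ends the declaration only when a non-continuation token follows:

```javascript
// Comma-first, semicolon-free: each leading "," continues the declaration,
// and ASI terminates the whole statement before the console.log line,
// because an identifier cannot continue the expression "a + b".
var a = 1
  , b = 2
  , c = a + b

console.log(a, b, c) // 1 2 3
```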