top | item 42510616

kvark | 1 year ago

I disagree with the first point. Say, the compiler figured out your missing semicolon. Doesn't mean it's easy for another human to clearly see it. The compiler can spend enormous compute to guess that, and that guess doesn't even have to be right! Ever been in a situation where following the compiler recommendation produces code that doesn't work or even build? We are optimizing syntax for humans here, so pointing out some redundancies is totally fine.

lisper|1 year ago

> Doesn't mean it's easy for another human to clearly see it.

Why do you think that matters? If it's not needed, then it should never have been there in the first place. If it helps to make the program readable by humans then it can be shown as part of the rendering of the program on a screen, but again, that should be part of the work the computer does, not the human. Unnecessary cognitive load is still unnecessary cognitive load regardless of the goal in whose name it is imposed.

Calavar|1 year ago

In languages (both natural and machine languages) a certain amount of syntax redundancy is a feature. The point of syntax "boilerplate" is to turn typos into syntax errors. When you have a language without any redundant syntactical features, you run the risk that your typo is also valid syntax, just with different semantics than what you intended. IMHO, that's much worse than dealing with a missing semicolon error.
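To sketch that failure mode in Python (the variable names here are just illustrative):

```python
# A typo that is still perfectly valid syntax, just with different
# semantics: a stray trailing comma silently turns an int into a
# one-element tuple.
x = 1    # what was intended
y = 1,   # the typo: y is now the tuple (1,), not the int 1

print(type(x).__name__)  # int
print(type(y).__name__)  # tuple
```

No syntax error fires; the mistake only surfaces later, when `y` behaves like a tuple.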

layer8|1 year ago

The full stop at the end of your last sentence isn’t strictly needed. It isn’t even strictly needed in the preceding sentences (except for the first one with the question mark), because the capitalization already indicates the beginning of the next sentence. We still use full stops because redundancy and consistency help prevent errors and ambiguities. Reducing the tolerances to zero increases the risk of mistakes and misinterpretations.

Ideally, adding/removing/changing a single character in valid source code would always render the code invalid instead of “silently” changing its meaning.
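For contrast, here is a minimal Python sketch of the opposite situation: deleting a single character leaves the code valid but changes what it does (the names are illustrative):

```python
x, y = 1, 2

x == y    # a comparison, evaluated and discarded (a legal no-op statement)
x = y     # the same line with one '=' removed: now an assignment

print(x)  # 2: a one-character deletion silently altered the program
```

An ideal language by the standard above would reject one of these two forms outright instead of accepting both.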

dwattttt|1 year ago

Aside from the other good points, this thread is about cognitive load. If a language lets you leave off lots of syntactic elements & let the compiler infer them from context, that forces anyone else reading the code to do the same cognitive work of inferring from context.

The only overhead it adds for the code author is the mechanical effort of typing the syntax; they already knew the context well enough to know there should be two statements, because they wrote them, so there's no increase in "cognitive" load.

naasking|1 year ago

> If it helps to make the program readable by humans then it can be shown as part of the rendering of the program on a screen, but again, that should be part of the work the computer does, not the human.

That means you and the compiler front-end are looking at different representations. That doesn't sound like a good idea. Keep it stupid simple: all of our source control tools should work on the same representation, and it should be simple and portable. Well-defined syntax is a good choice here, even if there's some redundancy.

nullstyle|1 year ago

> then it can be shown as part of the rendering of the program on a screen

I disagree with this, and can most easily express my disagreement by pointing out that people look at code with a diversity of programs: from simple text editors like Notepad and pico, with few affordances to convey a program's meaning beyond the plain text, all the way up to full IDEs that can do automatic refactoring and structured editing, like the JetBrains suite, Emacs+Paredit, or the clearly ever-superior Visual Interdev 6.

If people view code through a diversity of programs, then code's on-disk form matters, IMO.

glitchc|1 year ago

No, as Python and other languages amply demonstrate, the semicolon is for the compiler, not the developer. If the compiler is sophisticated enough to figure out that a semicolon is needed, it has become optional. That's the OP's point.

taormina|1 year ago

But the language spec for Python is what allows for this, not the compiler. \n is just the magic character now, except we also need a \ to write multiline expressions. It's all trade-offs; compilers are not magic.
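A small sketch of those Python rules (assuming ordinary CPython 3 semantics):

```python
# A newline normally ends a statement, so semicolons are optional;
# they remain legal as separators on a single line.
a = 1; b = 2

# A trailing backslash explicitly continues the statement onto the next line...
total = a + \
        b

# ...and open brackets/parentheses give implicit continuation,
# no backslash needed.
also = (a +
        b)

print(total, also)  # 3 3
```

So the terminator didn't disappear; the spec just moved the role from `;` to the newline, with escape hatches for the cases where a newline shouldn't terminate.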

nazgul17|1 year ago

It's not that the semicolon is somehow a special character and that's why it's required/optional. It's the context that makes it necessary or not. Python proves that it's possible to design a language that doesn't need semicolons; it does not mean that e.g. Java or C are well defined if you make semicolons optional.

sokoloff|1 year ago

If it’s in the language spec as required there and I’m using a compiler that claims to implement that language spec, I want the compiler to raise the error.

Additionally offering help on how to fix it is welcome, but silently accepting not-language-X code as if it were valid language-X code is not what I want in a language-X compiler.