There's a certain irony that the presenter quickly dismisses exception-based error handling, and then the first example handles an error by printing a message and exiting -- exactly what an unhandled exception does.
This is more than a small piece of irony with a somewhat artificial example. Often, there simply isn't very much you can do with an error. In a SaaS world, you might not be able to tell the user what went wrong, so as not to disclose private information. You can log the error, but only if the network is available, and you can only log it to disk if the disk isn't full.
And then there are cases where you could try to explain the error to the user, but the reasons are very complicated, require intimate knowledge of the architecture or of the subject matter to make sense of, or are even withheld for obscure legal reasons.
I made this argument in the talk: the only problem with exception-based handling is the lack of explicitness. It's too easy to call functions without being aware of the set of possible errors. Many C++ projects disable exceptions entirely. Writing "exception safe" code is tricky and non-obvious. Functions which should be guaranteed never to fail can often throw std::bad_alloc. Try-catch syntax forces incorrect nesting.
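For contrast, Zig's explicit error sets can be sketched like this (a minimal sketch in recent Zig syntax; `parsePort` is a made-up helper, and `std.fmt.parseInt` does fail with `Overflow` or `InvalidCharacter`):

```zig
const std = @import("std");

// The error set is part of the function's return type, so every
// caller can see exactly which errors are possible.
fn parsePort(s: []const u8) error{ Overflow, InvalidCharacter }!u16 {
    return std.fmt.parseInt(u16, s, 10);
}

pub fn main() void {
    const port = parsePort("8080") catch |err| {
        std.debug.print("bad port: {s}\n", .{@errorName(err)});
        return;
    };
    std.debug.print("listening on {d}\n", .{port});
}
```

Forgetting to handle (or `try`-forward) the error is a compile error, which is the explicitness the exception model lacks.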
Related: only Zig has fast errors with traces. Other languages have no traces, or traces at a high performance cost. Look up error return traces in the docs.
>There's a certain irony that the presenter quickly dismisses exception-based error handling, and then the first example handles an error by printing a message and exiting -- exactly what an unhandled exception does.
That would indeed be ironic if that case was the only thing exception-based error handling entails. That is, if every program just let all exceptions go uncaught. Which is nowhere near why exception handling was invented, or how it's used in practice.
I think it is safe to assume Zig fits into low-level applications, staying closer to the hardware, so its competition should be Go instead. SaaS could be out of the circle, since it is often web-based and has lower memory utilization.
I honestly think Zig has the potential to be the C/C++ replacement. I haven’t checked out Jai yet but will now that you mention, thanks!
This is somewhat subjective of course, but from what I’ve seen, Zig has just the right set of features to modernize systems programming, without making the language too complex or difficult to write (which arguably Rust’s “borrow checker” system does), and (like Rust) gets rid of some huge legacy language design mistakes most people agree on today (e.g. nullable-by-default pointer types, or no way to know at compile time or at a glance what range of exceptions a function may throw).
And of course, the “automatic” interoperability with C is an essential part of any C/C++ replacement contender.
My hypothesis is that Zig's compiler passes the equivalent of -march=native to the backend, which is why it should also be given to the C compiler to give a fair comparison (and a speedup of 30% or so).
The definition of 'perfect software' used in the talk is 'it gives you the correct output for every input in the input domain'. To that end no funny business like hidden allocations or hidden control flow should happen behind your back, because an out of memory error or some exception your code does not deal with explicitly would not be a correct output according to that definition. Of course you do not need that level of control for most projects.
While I agree with the content of your post literally, I think we often underestimate the importance of software reliability and performance, and end up giving it less attention than it deserves.
I understand the place for rapid prototyping etc., and that not every software application deals with life-and-death situations — but even for those that don't, I think our industry suffers a bit here. For example, to this day the Windows 10 start menu sometimes (randomly) refuses to open when I click on it, even multiple times. You could argue that this isn't a huge deal, because within 30 seconds it usually "fixes itself" (or something like that), but it doesn't shake the overall feeling that we're tolerating far more shoddy software in 2018 than we should.
Or in terms of performance: I know not every application needs bare-to-the-metal speed, but something feels wrong with the world when my “supercomputer” (compared to a 1990s PC, for example) literally lags when I’m typing or scrolling in some apps, when a 1990s era PC could respond to essentially the same content interaction with almost zero latency.
A few decades ago we had far clunkier programming languages and far slower hardware, yet somehow got better tangible, functional results in some cases. So I'm very much in favor of anything that moves us toward higher-quality software, and Zig (and Rust, and others) are all exciting examples of that.
Zig has a bunch of runtime safety checking. It applies to divide by zero as well as integer overflow, using the wrong union field, and many more. The runtime safety checks are enabled in Debug and ReleaseSafe mode and disabled in ReleaseFast and ReleaseSmall mode. (Actually they are used as assertions to the optimizer so that it can rely on extra stuff being undefined behavior.)
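Sketch of what that looks like in practice (recent Zig syntax):

```zig
const std = @import("std");

pub fn main() void {
    var x: u8 = 255;
    // `x += 1;` here would panic with "integer overflow" in Debug and
    // ReleaseSafe, and be undefined behavior in ReleaseFast/ReleaseSmall.
    // Wrapping arithmetic must be requested explicitly:
    x +%= 1;
    std.debug.print("x = {d}\n", .{x}); // x == 0, defined in every build mode
}
```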
Pointers cannot be null. However, you can have optional pointers, which are guaranteed to use the 0x0 value as null and have the same size as normal pointers.
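A small sketch of optional pointers (`firstPositive` is a made-up example function):

```zig
const std = @import("std");

// A plain `*i32` can never be null; "maybe no pointer" is spelled `?*i32`.
fn firstPositive(xs: []i32) ?*i32 {
    for (xs) |*x| {
        if (x.* > 0) return x;
    }
    return null;
}

pub fn main() void {
    // The optional wrapper is free for pointers: null is encoded as address 0.
    comptime std.debug.assert(@sizeOf(?*i32) == @sizeOf(*i32));

    var xs = [_]i32{ -3, 0, 7 };
    if (firstPositive(&xs)) |p| {
        std.debug.print("first positive: {d}\n", .{p.*});
    }
}
```

The `if (...) |p|` unwrap is the only way to get at the inner pointer, so a null dereference cannot be written by accident.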
Many very bright people in the major sects of ML and Scheme tried to achieve perfection, and they concluded, many times, that perfection implies a mostly-functional, strongly typed (perhaps even statically typed, with optional annotations only, as in Haskell) language, possibly with uniform pattern-matching, annotated laziness, higher-order channels, and select and receive in the language itself.
Such a language could be visualized as strict-by-default Haskell (with type-classes, uniform pattern-matching, minimalist syntax - everything, except monads) plus ideas from Erlang, Go and Ocaml.
Perfection and imperativeness, it seems, does not match.
Also, perfection and practicality do not match either, since imperative languages are the most practical for many applications. Pushing software toward perfection gives diminishing returns, and after some threshold, a company will have negative profit due to expensive development costs.
It is actually a nice, informative video, but the title is as dumb as dumbness itself. Software should not be perfect; it should be useful. In places where perfection increases usefulness (such as an autopilot), go ahead, make it perfect. In most cases, striving for "perfection" is a profound misallocation of resources.
However, I interpret the message from the video as "Let's make it really easy to achieve perfection".
In other words if we keep improving development tools and technologies, we may eventually be able to achieve perfection in each individual project nearly for free.
Upvoted since this is a useful comment and worth mentioning.
I'd expect some downvotes are based on negative reactions to this part of the comment: "the title is as dumb as dumbness itself." (Dear avip: if your comment said "the title is off-base" you would have made your point just as effectively and without the downvotes.)
I'll put it this way: "perfection" is too overloaded of a word to be particularly useful in this context.
I prefer to say it this way: I want software to adhere to a contract. That implies that we want people that use the software to understand that contract. To be more precise, I'd say that:
(1) a good contract defines the scope of correct behavior.
(2) a contract may (or may not) give some bounds (or constraints) about what happens outside of the scope of correct behavior
Yes, it gives the impression that the author thinks memory allocation errors are the only type of bug. Obviously there are thousands more, so it's kind of odd.
So... the perfect programming language has error-prone manual memory management and rampant undefined behaviour (well, at least it can build the code to crash instead of "nasal demons")? Yeah, right.
No one said the language was perfect. The language is to help you write perfect code, which was defined at the start as code that does not produce errors on any valid input.
Oh come on, we already know that logical systems cannot even reconcile their own premises, as Gödel proved, and now someone claims software should be perfect?!
Edit: the original title does not mention perfect.
Come on, a title like that isn’t meant to be taken too seriously.
And you’re reaching a bit far. You can write a perfect function that adds two 32-bit integers. There is a subset of code that can be written perfectly, especially if you don’t need Turing completeness to write it.
Zig just tries its best to make it easy to write as much low-level code as possible in a perfect way.
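The 32-bit add example can be made concrete in Zig (a sketch; `+%` is Zig's wrapping-add operator, so the function has a defined result for every input pair):

```zig
const std = @import("std");

/// "Perfect" in the talk's sense: a defined, correct output for
/// every pair of inputs in the domain.
fn add(a: u32, b: u32) u32 {
    return a +% b; // wrapping add: total over the whole input domain
}

test "defined even at the edge of the domain" {
    try std.testing.expectEqual(@as(u32, 0), add(0xFFFF_FFFF, 1));
}
```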
Yeah. Original title is “Zig: A programming language designed for robustness, optimality, and clarity – Andrew Kelley” and “Software should be perfect” is much more sensational.
JavaScript (the programming language) and the browser (the runtime environment) are a perfect combination.
With JS, you have many choices of implementation.
With the browser, a runtime error doesn't crash the user's device.
Ace17|7 years ago|reply
Does it still hold if you assume automatic resource (memory/locks/handles/etc.) freeing? (e.g. RAII)
ajennings|7 years ago|reply
What about other run-time exceptions, like divide by zero? Are they checked?
What about Hoare's billion-dollar mistake (null pointer exceptions)? Does Zig have non-nullable references?
Tempest1981|7 years ago|reply
Of course many projects don’t use them, for various reasons.
I wonder: does turning on full compiler warnings and then fixing them make my software better, or just satisfy some type of neuroticism?
clarry|7 years ago|reply
One warning I recently disabled on $workproject is -Wtrigraphs.
gbuk2013|7 years ago|reply
> Documentation is about 80% done
And yet there is no standard library documentation. :(
https://github.com/ziglang/zig/issues/965
_cs2017_|7 years ago|reply
Whether this is realistic or not, I do not know.
viraptor|7 years ago|reply
There's a large number of JS bugs and sandbox escapes in all browsers. They certainly can crash the user's device.