top | item 26302344

Giving Ada a Chance

177 points | jayp1418 | 5 years ago | ajxs.me

254 comments

[+] iamevn|5 years ago|reply
> I can’t help but think that complicated programming paradigms would seem more intuitive to beginners if taught through Ada instead of C and its derivative languages, as is common in computer science and engineering faculties worldwide.

At my university, the first courses you took in CS used Ada. I think it was a really good choice but I was in the minority I guess because after my year they switched to either using Java or Python depending on who taught which of the courses in that first series.

People found it frustrating how much work it'd take to get their programs to even compile but that's a good thing in my view. If it wasn't compiling, that was normally because the compiler found an error that would still be there at runtime in another language.

[+] MaxBarraclough|5 years ago|reply
> People found it frustrating how much work it'd take to get their programs to even compile

Makes sense if they're students working on small projects. Ada is explicitly designed to make large programs readable, and willingly trades off on writeability when the two come into conflict. It isn't going to shine if you're writing small 'single shot' applications, that isn't what Ada is for.

(Ada also commits to using many English language words where languages like C use symbols. SQL does this too. I'm not sold on the idea that this improves readability. Of course, there's far more to Ada than the skin-deep matter of its wordy syntax.)

This is similar to the readability/writeability tradeoff of moving from JavaScript (more writeable) to TypeScript (more readable and more refactorable). See the current discussion thread at [0].

Interesting reading (for some of us at least): the original Ada design rationale document, from 1986 [1].

[0] https://news.ycombinator.com/item?id=26314756

[1] https://apps.dtic.mil/dtic/tr/fulltext/u2/a187106.pdf

[+] bee_rider|5 years ago|reply
I help beginner Python students. Python might be a nice, easy scripting language for bashing out NumPy scripts, but I'm beginning to suspect it is terrible for teaching. The "what type is this variable, and will this function automatically convert it for me" game is not very fun at all for beginners.
[+] ksec|5 years ago|reply
I am not sure if this is still the case today, but Computer Engineering in the EE department used to teach Pascal / Ada / C as the first programming language, with the expectation that getting a program to compile correctly is hard, before you move on to something like Perl / Java / Python.

In CS, meanwhile, they tend to start with Python or Java, and you learn the OO, procedural, or whatever paradigm before going into Web or other areas of development with PHP, where you get some results earlier.

I used to think the EE way of teaching sucked. Old-fashioned, not following the industry trend. Boring. Teaching you about OSI models while CS students were already having some fun with higher-level Web development.

Now I tend to think EE's way is far better.

[+] 7thaccount|5 years ago|reply
There's a lot of Python dislike on here, so I thought I'd add my anecdotal story.

I learned Basic, C, Assembly, and Matlab in college (in that order). However, I was never a very good programmer. After graduating, I bought an intro to Python book and read it cover to cover and did the examples. I then started writing scripts and it all kind of clicked. I found it really simple to build stuff. There's lists, tuples, dictionaries, functions, iterating, easy branching...etc. There wasn't a whole lot to remember. I did find it confusing at first why some things used function syntax min(a) and other things used object oriented notation like mylist.sort(), but it wasn't too hard to remember all the stuff I actually needed. I was able to start using classes when I felt I was ready.

Since then I've read books on and played around with Clojure, Common Lisp, Haskell, APL, Forth, Prolog, Ada, Powershell, Perl, Bash, Awk, Julia, C#, SQL...etc. During that time, I've found that Python holds its ground in being very expressive, while easy for others to understand and performant enough for most uses with libraries like NumPy. Each time I try to move to a "grown-up" language like Java, I'm shocked at how verbose and clunky everything is. Sure, it's great for production systems, but it's awful at just getting stuff done.

Python is used almost exclusively by all engineers at my company (some C#) in hundreds of automation scripts. Some programs are 10,000+ lines of code and are still maintained and easily read by others. We're mostly electrical engineers too, so everyone picked up coding on their own and we find most codebases are still pretty consistent.

I think Ada is a neat language for high reliability systems, but I don't think I'd choose it as a teaching language due to a lot of the friction with getting simple programs to work and wrestling with types. I've had very few issues with types in Python. I generally just use string, int, and float conversions when needed.

[+] jandrese|5 years ago|reply
When I went to school the instructor talked a bit about Ada, but as far as I know there weren't any compilers available that mere students could afford. This was in the mid-90s.

I remember doing a bit of comparison of the syntax between languages, and Ada's lack of anything like a switch/case statement stood out, but the instructor did talk up the idea that if you could get it to compile it would run in Ada, assuming you didn't hit a compiler bug. Apparently getting the compilers working properly was a problem at the time.

[+] throwawayboise|5 years ago|reply
> after my year they switched to either using Java or Python

They do this because of intense negative feedback from students who don't like having to learn languages for which there is no job market.

[+] agumonkey|5 years ago|reply
I think there's a pedagogic void here. Everybody enjoys quick returns, be it a Lisp REPL, a PHP F5 refresh, or Bash. The desire to have a clear mental model of your program state comes after (unless you have the brain power, talent, and/or passion for that).
[+] ardit33|5 years ago|reply
What school did you go to? My university (Radford) taught Ada in the first year as well, way back in 1999, and switched to Java after it.

I liked Ada a lot, but Java gave me employment after school.

[+] d3ntb3ev1l|5 years ago|reply
Ada was also taught at my Comp Sci program back in 1987
[+] Lucretia9|5 years ago|reply
A lot of universities and companies were bribed by Sun to start using Java.
[+] kibwen|5 years ago|reply
In answer to what appears to be a misunderstanding about Rust:

> Its foreign function interface seems particularly poorly implemented. The official Rust documentation suggests the use of the external third-party libc library (called a 'crate' in Rust parlance) to provide the type definitions necessary to interface with C programs. As of the time of writing, this crate has had 95 releases. Contrast this with Ada’s Interfaces.C package, which was added [to] the language in Ada 95 and hasn’t needed to change in any fundamental way since.

Rust's libc crate isn't third-party, it's first-party, developed by the Rust project itself: https://github.com/rust-lang/libc/ . It's also not just for type definitions necessary to interface with C programs; here's the first heading and first paragraph of its README:

"libc - Raw FFI bindings to platforms' system libraries"

"libc provides all of the definitions necessary to easily interoperate with C code (or "C-like" code) on each of the platforms that Rust supports. This includes type definitions (e.g. c_int), constants (e.g. EINVAL) as well as function headers (e.g. malloc)."

The fact that this library contains low-level type definitions for every platform that Rust supports explains why it's had more than one release: new platforms get added, platforms add new interfaces, and platforms change the definitions of existing interfaces (possibly incompatibly, which explains why this isn't in the standard library).

> It lacks basic features necessary for the task, like bitfields, and data structure packing.

The latter is achieved via the built-in `repr(packed)` attribute (https://doc.rust-lang.org/nomicon/other-reprs.html#reprpacke...) and the former is provided by the bitflags crate: https://crates.io/crates/bitflags (while unlike libc this does not live under the rust-lang org on Github, it does live under its own org which appears to be populated exclusively by Rust project team members).
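For illustration, here is a minimal sketch of what `repr(packed)` does; the struct and field names are made up for the example:

```rust
// A minimal sketch of Rust's built-in struct packing via `repr(packed)`.
// `packed` removes inter-field padding, so the struct occupies exactly
// the sum of its field sizes.

#[repr(C)]
struct Unpacked {
    flag: u8,   // typically followed by 3 bytes of padding
    value: u32,
}

#[repr(C, packed)]
struct Packed {
    flag: u8,   // no padding inserted
    value: u32,
}

fn main() {
    // Packed layout is exactly 1 + 4 = 5 bytes.
    assert_eq!(std::mem::size_of::<Packed>(), 5);
    // The default layout is at least as large (8 bytes on typical targets).
    assert!(std::mem::size_of::<Unpacked>() >= std::mem::size_of::<Packed>());
    println!("packed size = {}", std::mem::size_of::<Packed>());
}
```

Note that reading packed fields by value is fine; it's taking references to the now-misaligned fields that runs into the undefined-behaviour caveats.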

[+] ajxs|5 years ago|reply
Regarding `repr(packed)`: Thank you for posting this. I really like Rust's official documentation on the subject. I stand corrected regarding Rust's support of structure packing. The following statement is a little troubling however: "As of Rust 2018, this still can cause undefined behavior." This greatly affects Rust's suitability for bare-metal programming, where you very often require control over a structure's layout in memory at bit-level granularity.

Regarding bitfields: At the risk of sounding a little old-fashioned, I don't like the idea of having to import external packages to provide these kinds of fundamental features. The article hints as much. It might be a bit of a culture clash, but I feel that learning the different styles and interfaces of a bunch of external packages is an extra, undesirable cognitive burden imposed on the developer. Plus, "A macro to generate structures which behave like bitflags" (the crate's official description) doesn't sound very robust to me. It sounds like precisely the kind of hack that a future release could break.

In fairness, it should be mentioned that the C standard does not guarantee the layout and order of individual bitfields either (refer to section 6.7.2.1 paragraph 11 of the C1X standard). Even though the usage of bitfields is common in C, it's not without its issues.

[+] aw1621107|5 years ago|reply
> the former is provided by the bitflags crate: https://crates.io/crates/bitflags

The linked crate appears to implement an unrelated feature. c2rust-bitfields [0] or rust-bitfield [1] might be better examples.

The Rust project is working on adding native support for bitfields, but it seems that might be a while out at this time [2, 3]

[0]: https://crates.io/crates/c2rust-bitfields

[1]: https://github.com/dzamlo/rust-bitfield

[2]: https://github.com/rust-lang/rfcs/issues/314

[3]: https://github.com/rust-lang/rfcs/pull/3064

[+] ajdude|5 years ago|reply
This was a good read. I instantly recognized the title of the textbook mentioned in the blog post: I own it!

Having a background in C, I went back and forth with Ada for years, without really jumping all in. In the last couple years in particular, with the growing popularity of Rust, I started to renew my interest.

I'm reminded of a popular Reddit thread on r/Ada: someone called Rust a "toy language", which prompted the valid response that Rust is being used in a lot of commercial products lately. The response [0] kind of brings home the caliber that Ada is capable of, starting with:

> Rust being used in commercial products isn’t really the same ballpark as what I’m talking about. It’s not even the same game.

It seems like the easiest way to trend on HN is to make a post such as "<old software> rewritten in Rust", but each time I see more and more people advertising Rust, I just wonder why Ada didn't get the credit it deserved as being absolutely bulletproof. Eventually, I came across an "Ada Manifesto" of sorts [1] that finally pushed me to "put my money where my mouth is" and start going all in with the language. (the same author of that "Manifesto" maintained a "Should have Used Ada"[2] series for a while that points out just how using Ada could have stopped certain security vulnerabilities from ever being a problem in the first place)

Ada is anything but dead, and there are a lot of interesting things coming out for the 202x specification. I hope to see community enthusiasm grow as people begin to shift their interest more and more to safe languages.

[0]: https://old.reddit.com/r/ada/comments/js6edd/regarding_this_...

[1]: https://old.reddit.com/r/ada/comments/7p12n3/going_allin_wit...

[2]: https://annexi-strayline.com/blog/

[+] trott|5 years ago|reply
> I just wonder why Ada didn't get the credit it deserved as being absolutely bulletproof

It's not. All languages make trade-offs in the performance-convenience-safety-etc space, and Ada's choice is not "100% safety". It lacks memory safety and has holes in its type system: https://www.enyo.de/fw/notes/ada-type-safety.html

[+] mcculley|5 years ago|reply
Is Unchecked_Deallocation still necessary? If so, then it is hard for me to take seriously any claims of Ada's safety.
[+] UncleOxidant|5 years ago|reply
I really like Ada's ranged types (VHDL has them also - it inherited them from Ada). You can say:

    type OperatingTemp is range 33 .. 90;
And then declare variables of that type and they will be range-checked: an exception will be raised if the variable goes out of that range. Wish more languages had this feature.
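Rust (discussed elsewhere in this thread) has no built-in ranged types, but the idea can be loosely approximated with a newtype whose constructor checks the range. Unlike Ada, nothing here checks later arithmetic on the wrapped value; the type and bounds are made-up names for the sketch:

```rust
// A rough approximation of Ada's ranged types: a newtype whose constructor
// enforces the range at runtime. Only construction is checked, not every
// assignment, which is a much weaker guarantee than Ada gives.

#[derive(Debug, Clone, Copy, PartialEq)]
struct OperatingTemp(i32);

impl OperatingTemp {
    const MIN: i32 = 33;
    const MAX: i32 = 90;

    fn new(value: i32) -> Result<Self, String> {
        if (Self::MIN..=Self::MAX).contains(&value) {
            Ok(OperatingTemp(value))
        } else {
            Err(format!("{} out of range {}..{}", value, Self::MIN, Self::MAX))
        }
    }
}

fn main() {
    assert!(OperatingTemp::new(50).is_ok());
    assert!(OperatingTemp::new(32).is_err());
    assert!(OperatingTemp::new(91).is_err());
}
```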
[+] throwaway894345|5 years ago|reply
I agree. I learned Ada in university (my professor was on the Ada committee) and used it for an embedded development course. The language was really good (many fewer foot guns than C or C++; clearly Ada was holistically designed), but I recall having issues with the tooling and the ecosystem, and the community was super defensive and hostile. If something wasn't working for your use case and the community didn't have a solution, then your use case was invalid. If you politely asked for help, you were incompetent or too lazy to figure it out on your own. This was a decade ago, so maybe things have changed since, but I can honestly understand why the language is niche. The type system and other intra-language features are very interesting but ultimately not the most important aspects of a programming language.
[+] drfuchs|5 years ago|reply
Ranged types are directly from Pascal. You can also use them to specify the valid indexes for an array (so a given array might be zero-based, or one-based, or 1900-based, or even -32768-based). A very positive feature, I agree.
[+] thesuperbigfrog|5 years ago|reply
I like the "clean feel" of Ada's syntax: it combines the elegance of Python with a bit more structure and does not suffer from Python's significant whitespace issues.

The so-called "Ada comb" structure that is used for packages, subprograms, and even declare blocks makes it easy to find what you are looking for because it makes the source code more regular.

The "Ada comb" is formed by the shape of the source code with the subprogram header / declare, begin, exception, and end forming the "teeth" of the comb and the rest of the source code indented between the "teeth":

    function Square_Root (Arg : Float) return Float is
        -- local variables declared here
    begin
        -- program work here
    exception
        -- exception handling here
    end Square_Root;
[+] dmh2000|5 years ago|reply
I took my first university course in Ada in 1981. I also worked for a company that provided an Ada runtime in the mid 80's.

Ada was sabotaged early on because it was 'mandated' by the DOD for new programs. That meant that all the usual suspects, like Lockheed, GD, TI (I don't remember exactly which ones), came up with Ada compilers and runtimes that cost on the order of $10K per seat. The typical military contractor ripoff. So it was impossible for individuals or small companies to use Ada on their own dime. It was only feasible if the cost was rolled into a larger (bloated) defense contract. So it couldn't get a following. Of course much later on free versions became available, but it was too late.

That said, Ada was absolutely no fun to program with. It was awkward and verbose. I hated it from the get-go, compared to the alternatives. If it is so super, why is it rarely used?

[+] roland35|5 years ago|reply
I went to a summer program at the US Air Force Academy and took a short class using Ada with Lego mindstorm robots. Having never done any programming at the time beyond BASIC (thanks to QBASIC and my TI-83+ calculator), I really enjoyed learning Ada. The instructor talked about the safety aspect of Ada which went over my head at the time :)

I have done a lot of embedded programming in C over the years, and while the reality is most embedded programmers know C best, I am starting to think we would be better served to try a new language with better features such as Ada or Rust. C++ is nice as well, but has its own set of problems when used for embedded programming.

[+] kevin_thibedeau|5 years ago|reply
Nim has some influence from Ada in its type system and has much of the same feeling that the language is a power tool at your disposal. They have been making changes to make it useful without GC for embedded targets.
[+] the_only_law|5 years ago|reply
I love Ada; unfortunately its real-world use seems to be relegated to old legacy code. I'd like to use it a little more on the side, but I also need to keep my priorities focused on realism, which sadly means ignoring Ada and learning something like C++, which seems unapproachable from any angle.

Ada also seems to have a weirdly negative rep in many circles. I recall looking around for an Ada compiler for a moderately popular platform and came across an old thread where people didn't give any options but instead just joked about how the OP was interested in such a terrible language. Maybe it's the Pascal/Algol type syntax?

[+] Bostonian|5 years ago|reply
His discussion of Ada (not all upper case -- can the HN title be changed?) is interesting, but his criticism of Fortran, although colorful ("eldritch"?), is vague and likely uninformed. I program in Fortran 95 a lot and can easily understand my code years later.

"Admittedly, I had pictured Ada’s syntax resembling the uncompromising verbosity and rigid construction of COBOL, or perhaps the Lovecraftian hieroglyphics of Fortran’s various eldritch incarnations."

[+] Sniffnoy|5 years ago|reply
> The writing is on the wall: Ada is here to stay.

OK, this has no relation to the actual content of the article, but I have to point out that that is not what the phrase "the writing is on the wall" means.

[+] UncleOxidant|5 years ago|reply
In the original biblical usage, a Babylonian king held a feast and suddenly a disembodied hand wrote MENE, MENE, TEKEL, PARSIN on the wall, which translated to:

"Mene: God has numbered the days of your reign and brought it to an end."

"Tekel: You have been weighed on the scales and found wanting."

"Peres: Your kingdom is divided and given to the Medes and Persians."

So you're right, it doesn't connote that something will endure, but that something will end.

[+] ajxs|5 years ago|reply
Author here. Thank you for pointing this out! I actually did not know that this phrase had such a specific meaning. I had incorrectly inferred from other usage that the phrase meant that the fact in question was supported by evidence to a degree that warranted no further debate. I feel a little embarrassed for the misuse. However I'm all the better off for having been educated on its proper use! Thank you.
[+] shannongreen|5 years ago|reply
> It is possible to define a struct type in C with bit-fields for the individual elements, however the C standard does not guarantee the layout and order of the individual fields.

As a professional embedded developer who uses bitfields to access registers every day, this doesn't really make a practical difference. On any bare-metal or embedded project you will rely on the behaviour of your compiler, and portability is largely irrelevant if you're accessing memory-mapped registers. Probably, the manufacturer has already provided register maps using bitfields anyway.
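Because the C standard leaves bitfield layout implementation-defined, one portable alternative (in any language) is explicit shifts and masks. A sketch in Rust, with a hypothetical register layout (bit 0 = enable flag, bits 1..=3 = a 3-bit mode field):

```rust
// Accessing "bitfields" with explicit shifts and masks instead of relying
// on compiler-defined layout. The register layout here is hypothetical.

const ENABLE_BIT: u32 = 1 << 0;
const MODE_SHIFT: u32 = 1;
const MODE_MASK: u32 = 0b111; // 3-bit field

fn set_mode(reg: u32, mode: u32) -> u32 {
    // Clear the mode bits, then OR in the (masked) new value.
    (reg & !(MODE_MASK << MODE_SHIFT)) | ((mode & MODE_MASK) << MODE_SHIFT)
}

fn get_mode(reg: u32) -> u32 {
    (reg >> MODE_SHIFT) & MODE_MASK
}

fn main() {
    let reg = set_mode(0, 5) | ENABLE_BIT;
    assert_eq!(get_mode(reg), 5);
    assert_eq!(reg, 0b1011); // mode=5 in bits 1..=3, enable in bit 0
    println!("reg = {:#06b}", reg);
}
```

The layout is then fixed by the constants rather than by whatever the compiler chooses, at the cost of more boilerplate.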

[+] hashmash|5 years ago|reply
Having direct control over this type of thing is important when updating the fields of a persistent data structure. I've had to deal with this mistake before, where the original developer thought the layout matched what they specified, but the actual layout that got persisted didn't match. For compatibility, the broken layout stuck around forever, and special rules were required to detect it.
[+] butterisgood|5 years ago|reply
Ada shouldn't be all caps I don't think.

(I always expand that to the American Dental Association)

[+] stsmwg|5 years ago|reply
Or the Americans with Disabilities Act.
[+] Tomte|5 years ago|reply
I suppose the HN software changed it and the submitter didn't notice (or didn't care). It happens silently after submitting, unlike the "13 too long" message that blocks submission.

Many of those rewrites are benign, even good, many others are stupid and infuriating.

Unfortunately there is no way to have yourself declared to be a "well-known submitter with a history of not editorializing or outrage-optimizing submission titles" and get this thing switched off.

In this case there have been so many discussions and submissions about the American with Disabilities Act that it sounds plausible to be such an automatic rewrite.

[+] jkw|5 years ago|reply
Or Cardano
[+] Kototama|5 years ago|reply
Except if you access the website through the TOR network.
[+] JulianMorrison|5 years ago|reply
I think the thing that's been under-sold in Ada is the way you can get so much at compile time as properties of things, like, you can say "this type counts between 1 and 30" and then later refer to "the largest value of this type" as a loop bound, say, in a way that won't break if you later set the maximum to 40. And hides the fact it's implemented as an unboxed primitive integer.
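A loose Rust analogue of that idiom is writing the loop against a type's constants, so widening the range later doesn't break the loop; the type and bounds below are made-up names for illustration:

```rust
// Looping over a type's declared bounds rather than hard-coded literals:
// change `LAST` to 40 and the loop stays correct without edits.

struct Temp;

impl Temp {
    const FIRST: u32 = 1;
    const LAST: u32 = 30;
}

fn main() {
    let mut count = 0;
    for _ in Temp::FIRST..=Temp::LAST {
        count += 1;
    }
    assert_eq!(count, 30);
    println!("iterated {} values", count);
}
```

Ada goes further, of course: the bounds are part of the type itself (`'First`/`'Last`), so every variable of the type is checked against them, not just this one loop.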
[+] protomyth|5 years ago|reply
I tend to like Ada, but it is a tiring language to read with the all caps. Also, it 'feels' like it has a gatekeeper group and really doesn't come up in any mobile conversation. I still believe someone will do something akin to a syntax substitution and come up with a well liked language.

Also, modern Fortran is not that bad of a language, much like the modern parts of C++.

[+] Jtsummers|5 years ago|reply
The community is definitely not helping, or has not helped. It seems to be improving but I've seen a lot of negative responses to novices that read like some of the things I saw in the 00s when trying to get into Common Lisp. Some individuals are willing to push past this, for others it's a deal breaker.

The community is becoming more open (AdaCore has been very helpful here), I think, but you still have a fair amount of vocal gatekeepers that are going to continue to keep people out (deliberately or not).

[+] jhbadger|5 years ago|reply
The tradition of all caps keywords (just a convention, not required as others have noted) was common in most languages in the era before most editors had syntax highlighting. It wasn't meant to be "tiring" but rather the opposite, to make reading code easier. Of course modern syntax highlighting makes the convention obsolete.
[+] henrikeh|5 years ago|reply
> [...] tiring with the all caps.

I guess there is more to like then, since the keywords are case insensitive. The OP doesn’t even use all caps.

[+] Lucretia9|5 years ago|reply
All caps haven't been a part of the language since Ada 83; go check the follow-up spec, Ada 95. Maybe you're thinking of Niklaus Wirth's languages, the Modulas and Oberon?
[+] drannex|5 years ago|reply
> the Lovecraftian hieroglyphics of Fortran’s various eldritch incarnations.

This is one of the best lines I've read in awhile, gave me a good chuckle. Thanks for that.

[+] TomMasz|5 years ago|reply
I took an Ada programming course when I worked at a defense contractor in the late 80s. Coming from C and Pascal it seemed familiar enough to learn quickly but was overkill for what we were doing at the time. I left soon after and I have no idea if they ever actually adopted it.
[+] cartoonfoxes|5 years ago|reply
No discussion of Ada is complete without referencing the cost of commercial development licenses. They're expensive. Very expensive.

Any enthusiasm for this language is inevitably quashed upon encountering the $$,$$$ per-seat price of the compilers for the absolute bare-bones x86 version. It's more if you want to target non x86. I pester AdaCore for info every few years and while they've dropped a little bit, they're still out of reach for companies not in the Fortune 500. I'd love to use SPARK, but I don't see that happening any time soon.

[+] coliveira|5 years ago|reply
Ada syntax is closely related to Pascal and Algol. If you like these languages, then you will enjoy Ada. Unfortunately many people prefer the terseness of C++.
[+] nevster|5 years ago|reply
We did Ada at UNSW in the early 90's but for only one subject - parallel programming. I think many dining philosophers starved during the assignments...
[+] walshemj|5 years ago|reply
" Lovecraftian hieroglyphics of Fortran’s various eldritch incarnations."

I take it the Author hasn't seen any APL code.