
Why Ada isn't Popular (1998)

118 points | Jtsummers | 11 years ago | adapower.com

112 comments

[+] kibwen|11 years ago|reply
As a Rust contributor, I'm perpetually trying to find people who are familiar with Ada to comment on our design. I know nothing of the language except for that it's allegedly designed for both safety and efficiency, which is exactly the niche that we're targeting. If there's anything that Rust can learn from Ada, we'd love to know while we still have the leeway to make breaking changes.

While we're at it, I'd love to see some Erlang programmers take a look at Rust as well and tell us where we're falling down on concurrency. Sadly, both of these languages seem underrepresented in the circles that I frequent.

[+] vince_refiti|11 years ago|reply
Inviting others to critique your work. Great to see. I will have to have a serious look at Rust sometime then.
[+] brson|11 years ago|reply
We've talked to Tucker Taft about Ada and Rust before, though it was a while ago and I mostly recall that we discussed aspects of Ada's standardization process (I think Graydon was a fan of Ada's spec). Niko would probably remember more.
[+] airplane|11 years ago|reply
I don't know anything about Rust, but things I disliked about Ada (besides horrendous tooling support, free or paid) are:

- Complex language that isn't matched with an equally complex language standard (a lot of undefined behaviors or gaps where things are not explained as fully as they could be, but I don’t remember any specific instances, so maybe I'm crazy on this one)

- Too many situations where the syntax lets you do something only in certain contexts, or forbids it only in certain contexts. From what I can see, the only possible reason for these bizarre restrictions is to lower the complexity of the compiler. For example, why can't I put a package instance in an array, when the generic package features are so strong that they can pretty much be (ab)used as object-oriented structs? Try searching Barnes' Ada book for terms like "only when", "except when", "however".

- Reuse of keywords that mean very different or slightly different things in one context versus another. An example is delay: two lines of code might both say delay 10, but one means sleep the task for 10 seconds, while the other (a delay alternative inside a select statement) means give up and continue if none of the entries above it is called within 10 seconds.

- The ability to tiptoe very easily around the safety mechanics of the language and start using it as if it were C, which is very tempting for two reasons: 1. Most programmers programming Ada were/are from a C background, and the first things they pick up are the ways to do things in Ada like you could in C, while never moving on past that. It sort of makes sense not to move past that if you aren't given time to fully digest the Ada way of doing things. 2. It takes fewer lines of code to program something up if you ignore all the safety features the language can provide. I would not have allowed the ability to ignore all the things that make Ada stand out. At that point it makes sense not to use Ada at all, because programming C in Ada is going to be slower than programming C in C any day of the week.

- Hacked-on OO. You can hardly find any Ada code online using the OO language features added in Ada 95, but that makes sense, because they're a bunch of keyword boilerplate on top of the existing records that's very unwieldy and ugly to use.
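The two meanings of delay mentioned above can be sketched in a few lines. This is a minimal, hypothetical example (task and entry names are invented):

```ada
with Ada.Text_IO;

procedure Delay_Demo is
   task Worker is
      entry Start;
   end Worker;

   task body Worker is
   begin
      delay 1.0;                  -- meaning 1: sleep this task for 1 second
      select
         accept Start;            -- wait for a rendezvous on Start ...
      or
         delay 1.0;               -- meaning 2: ... but give up after 1 second
         Ada.Text_IO.Put_Line ("timed out waiting for Start");
      end select;
   end Worker;
begin
   Worker.Start;                  -- caller side; remove this to see the timeout
end Delay_Demo;
```

Same keyword, same spelling, but the first occurrence is a plain statement and the second is a timeout alternative whose behavior depends entirely on the enclosing select.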

On the topic of Ada not taking off: I think one big reason Ada can't get off the ground right now is that you can't find anything about it online. Try searching for some question you have and you'll find no answers; search the same question for the Java or C version and you'll get a bunch of answers, blogs, even unanswered questions. Ada greatly lacks that web presence. Where are people going to talk about Ada, develop modern tools for it, and build frameworks in it if such places don't exist? (comp.lang.ada is the only location on the internet with an active Ada community.) It seems like that kind of community can only appear out of thin air when a language is new, when a sudden influx of individuals feel they can contribute and their energies combine. If you only get an individual here and there who wants to help out, with their efforts separated by a week or a month, then nothing for Ada will ever come about.

On the topic of concurrency, you may want to look into Ada for that too: concurrent structures are baked into the syntax of the language, and they can be quite complex/powerful/concise.
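For the curious, here is a small sketch of what "baked into the syntax" means: tasks and protected objects are language constructs, not library calls. Names are hypothetical; this is Ada 95 and later:

```ada
with Ada.Text_IO;

procedure Counter_Demo is
   protected Counter is                  -- mutual exclusion enforced by the language
      procedure Increment;
      function Value return Natural;
   private
      Count : Natural := 0;
   end Counter;

   protected body Counter is
      procedure Increment is
      begin
         Count := Count + 1;
      end Increment;

      function Value return Natural is
      begin
         return Count;
      end Value;
   end Counter;
begin
   declare
      task type Bumper;                   -- a task type, declared like any other type
      task body Bumper is
      begin
         for I in 1 .. 1_000 loop
            Counter.Increment;
         end loop;
      end Bumper;

      Workers : array (1 .. 4) of Bumper; -- four tasks start running here
   begin
      null;  -- leaving this block implicitly waits for all Bumper tasks to finish
   end;
   Ada.Text_IO.Put_Line (Natural'Image (Counter.Value));  -- 4000, no data race
end Counter_Demo;
```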

[+] lostcolony|11 years ago|reply
What are some good resources for Rust? I've been coding Erlang professionally for one and a half years now (spearheading a lot of our development), and will be continuing to work primarily in it for some time to come, and Rust is on my list of languages to look at. I don't suppose there's any equivalent to Erlang's Learn You Some Erlang For Great Good (I'd assume not, given that the language is still in flux)?
[+] zem|11 years ago|reply
ATS is also doing some very interesting stuff around adding algebraic datatypes and linear types to a C-level language.

http://www.ats-lang.org/

[+] pronoiac|11 years ago|reply
Did you know that Truman State University used Ada in its CS curriculum?
[+] jacquesm|11 years ago|reply
I think the biggest factor why Ada never 'made it' was that the compilers that were available were priced ridiculously.

One of the big drivers in language popularity in the long run is how many people can become familiar with a language by using it, and by having all the implementations be terribly expensive you're almost automatically limiting the number of people exposed.

I think companies that did this (much less so today than in the past) saw the languages as their profit centers, not the ecosystems. Microsoft, Borland, Zortech and a whole pile of others were in a never-ending battle to get the programmers on board. Every computer came with at least one programming language pre-installed (usually some form of BASIC). What you did after learning BASIC (and some of its limitations) was dictated by need and funds.

For most people that meant that their second language was some variety of assembler. For others it meant re-implementing Forth (since that was doable). A C compiler was something you could still buy on a regular salary.

To get exposure to Lisp, Ada, Modula-2 or any one of a whole bunch of other high-level languages required shelling out significant bucks. So during what we might refer to as my 'formative years' as a programmer, none of these really nice languages made it to my computers.

[+] RogerL|11 years ago|reply
It really wasn't ridiculous. I know, from the point of view of your budget, it really was, but...

The compiler writers were serving a tiny sliver of a market. At the time it took a lot of resources to build such a compiler, and then you had to support a bunch of not-off-the-shelf, militarized and embedded chips. You had to pass a huge suite of tests, and if a customer submitted a bug, well, you just had to fix it in a timely manner. I'm sure we paid a ton for our compiler, but they also ended up regularly flying people to us from NYC to DC. The compiler writers were all but on speed dial. I'm sure we got our money's worth out of it.

Now, what we could have gotten from the same amount of money applied to a C compiler plus extensive tooling, testing, and so on is an interesting question...

[+] colechristensen|11 years ago|reply
Something I see missing from computers these days (and lots of days previous) is a programming language designed for non-programmers.

The likes of Ruby and Python are _approachable_, but everything is a mess of libraries, RVM and lookalikes, dependencies, and a whole lot of complication. This is great for the target audience but less so for your average user.

Excel and its Microsoft ilk are currently filling the role, and doing so poorly.

The scientific community has things like R and MATLAB which do a good job for scientists.

Everybody else _can_ learn a language, but it can be daunting and it's definitely geared towards hackers and professionals.

In an ideal world, all computers would ship with a native programming language, a big fat manual for users, and a set of software whose core design allows for, and is meant for, being programmed as part of everyday usage.

This language would have to be relatively simple, stable over time, and more monolithic than most. It would have to sacrifice speed and functionality for approachability, but if it were done well it could be world-changing.

Sadly, the market is moving in the opposite direction (but in some sense the same direction too). Everybody these days is using an app with simple buttons which doesn't do much at all. It's successful because it's so approachable, but scary because it's so controlled by the vendor.

[+] atombender|11 years ago|reply
What about GNAT? I remember sniffing at Ada back in the mid-90s, and at that time GNAT had implemented most of Ada 95 (which had a lot of new OO features that seemed on par with Borland Pascal). I did not choose Ada because Borland was so much better -- IDE, big community, many libraries -- and then Delphi came along; Ada looked more advanced, but not fun.
[+] WalterBright|11 years ago|reply
It was also initially considered to be unimplementable by a number of compiler people (including me). We were wrong, but it was a while before a working Ada compiler was produced.

C had the great advantage that a C compiler would fit in MS-DOS's address space with some room left over for the symbol table.

[+] jacquesm|11 years ago|reply
What were your main reasons for thinking Ada could not be implemented?
[+] krazydad|11 years ago|reply
I started programming in the early 80s, and never used Ada. At the time, Ada had a reputation as "that over-bloated language that people in the defense department have to use." This perhaps undeserved reputation is addressed in section 7 of the linked article.

From my point of view at the time, any language that the US Government would mandate as a required technology MUST be flawed. A language embraced by a large bureaucracy must be full of large bureaucratic nonsense. That was the thinking, at any rate. Ada, along with Cobol, was a frequent target of snarky jokes. In the mid-80s, the cool kids were programming in C (while being paid to program in Basic or Cobol or Fortran).

I freely admit these notions were likely borne out of ignorance, but I imagine this reputation stifled its adoption outside of government circles. Would you want to use a language mandated by the DMV?

[+] waveman2|11 years ago|reply
One other factor I would mention is the massive uncoolness of Ada. This seems to come from three directions:

1. Invented by and for a bureaucracy known for its extravagance and wastefulness.

2. Verbosity of a degree only before seen in COBOL.

3. A focus on avoiding errors above all else. For people who have a positive/benefits focus like most hackers, this negative focus is very unappealing and provokes references to anal-retentive pedants and so forth.

I would agree with others who pointed out the other problem "won't run on a computer I can buy - and anyway I can't afford the compiler". This is not to criticise the compiler developers but nonetheless it was a problem.

http://en.wikipedia.org/wiki/Worse_is_better

[+] matthewjheaney|11 years ago|reply
Oh please, the comparison of Ada to COBOL is invidious. Ada simply requires you to be explicit about your intent. This is a feature, not a flaw. You mean to say that you've never been burned by an implicit conversion in C, or a misplaced semi-colon? Ada is no less verbose than Java.
[+] cafard|11 years ago|reply
I would mention here Richard Gabriel's essay "The End of History and the Last Programming Language", which you can find at http://www.dreamsongs.com/Files/PatternsOfSoftware.pdf

His argument that "Languages are accepted and evolve by a social process, not a technical or technological one" seems to apply here, as does (in the context of the time) "Successful languages must have modest or minimal computer resource requirements."

[+] humpt|11 years ago|reply
I am a 25-year-old student at one of the top CS engineering schools in France. People who enter there usually have a heavy math background, but are fairly new to programming.

In the first semester, all students are introduced to basic programming concepts, and the language used for that is Ada. Ada is then used to teach the algorithms and compilation classes. In fact, in the second year we had a full-time project where we wrote a compiler, in Ada, for a language close to Java (in terms of syntax and features).

I remember being in my Python phase at the time, and bitching about Ada's verbosity. But I realized how comfortable it is to let the compiler do most of the debugging for you, especially on very large projects that are structured as a pipeline (e.g. a compiler, where every part depends heavily on the others). When it compiles, it works 99% of the time. Static typing/subtyping, generic packages, and all the time spent in the console with GNAT yelling at you really teach you how to structure, secure, and bulletproof your code. Ada really is a great teacher!

[+] epsylon|11 years ago|reply
(Fellow ENSIMAG-er, I presume?)
[+] mraison|11 years ago|reply
Even though Ada is now completely forgotten by most people, it's worth mentioning that the GNAT compiler is being actively maintained by the company AdaCore (whose core business, as the name suggests, is Ada). http://www.adacore.com/
[+] matthewjheaney|11 years ago|reply
Right. Also note that GNAT is just GCC. The most recent language standard is Ada 2012. (I was involved in the design of the Ada standard container library, which originally appeared in Ada 2005.)
[+] matthewjheaney|11 years ago|reply
Glad to see that this old post of mine (originally posted on comp.lang.ada) has been revived on Hacker News!
[+] cicero|11 years ago|reply
I first learned Ada in a university programming languages class in 1984. It was chosen by our professor as the culmination of classic programming language ideas. We did not have access to a full Ada compiler, but we used "Janus Ada", which implemented a subset. I really liked it because it was very consistent and readable and you could create sophisticated data types, and the compiler strictly enforced their use.

A few years later (1988), I was working on a large military aircraft project that used Ada. We did a huge up-front design with reams of design documentation. The first code that was written was a package of data types to ensure consistency. Unfortunately, there was very little Ada experience among the team and we sometimes designed ourselves into corners. Such problems were often solved by using "unchecked conversions" to get around the type enforcement, which defeated the purpose. The problems we had were similar to problems huge C++ projects would have a few years later.

The other problem we had was with the compiler. The government had placed high performance demands on the compiler, so it was highly optimized. Unfortunately, it would sometimes optimize away code that you needed. Fortunately, we could generate a code listing that interspersed the assembly output of the compiler with the corresponding Ada source code and thus find the problem. Usually adding an intermediate local variable to break up a complex calculation solved the problem.

In 1990, I got hired by a start up that was doing military contract work simulating radars for flight simulators. They had done all of their work in C, but now they had a contract that required Ada, so I became their "Ada expert." All of the C guys hated Ada because of its strictness. Ada wouldn't let you mix numeric data types without an explicit conversion, and that really chafed the C folks. It reminds me of today's dynamic/static typing debate.
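The strictness that chafed the C folks can be sketched in a few lines; this is a minimal, hypothetical example:

```ada
procedure Mix_Demo is
   F : Float   := 2.5;
   I : Integer := 3;
   R : Float;
begin
   --  R := F * I;          -- rejected by the compiler: no implicit mixing
   R := F * Float (I);      -- an explicit conversion is required
   pragma Assert (R = 7.5); -- the intent is now unambiguous
end Mix_Demo;
```

In C, `f * i` silently promotes the integer; Ada makes you write the conversion down, which is exactly the kind of explicitness the C programmers found tedious.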

The Navy ended up canceling the aircraft so they no longer needed our radar simulator. The next project I was on decided to use the new C++ language everyone was excited about. Ada was starting to fall out of favor, so waivers for the Ada mandate were getting easy to obtain.

At the time, I was thrilled to use C++. It seemed edgy compared to the bureaucratic nature of Ada. Now, however, I don't know if C++ was an improvement overall. It has some advantages, but also some disadvantages. I'm happy to see that the Rust guys are trying to learn from Ada. I was an inexperienced kid when I used Ada, so I am by no means an expert, but from what I do know, I think it still has something to teach us.

[+] bleair|11 years ago|reply
This is horribly pragmatic, but a reason that can contribute to a language's success and popularity is how easy it is for new users to "pick it up", "find libraries that they find useful", and build working prototypes for "itches they want to scratch".

I'm making broad generalizations, but as examples: if you wanted to build a program to perform some numerical analysis, or maybe an analytical simulation, you could pick Fortran and you'd likely find libraries and examples to help you out. If you wanted to build a simple web page backed by a database, you could grab PHP and some of its libraries and you'd quickly be constructing web pages. With Java you could find examples of tools and libraries for churning through databases and plugging into various web application frameworks. If you wanted to write a PhD about computer languages you could use Lisp :P. I've found great value in Python personally, because the examples were decent and, more importantly, it was easy to see how to build the programmatic bridges into the other languages and libraries I wanted to use. I'd strongly argue that the popularity of Ruby and node.js can be partially traced to the "fun" and/or neat examples shown off in tutorials. They show how to leverage the constructs provided by each language's common libraries, and the resulting programs are interesting to some number of potential adopters.

In the 90s when I looked at ada there wasn't much in terms of libraries that I could leverage to explore problems that interested me at the time.

Again, not every language has to be ideally suited to writing video games or web pages to be useful. Ada the language might have outstanding academically interesting aspects. If it doesn't help me solve real world problems I care about though it's less likely I'll invest the time to learn about it.

[+] wting|11 years ago|reply
I like to categorize many features that lower initial difficulty as "deferred technical debt." There are type system features that ensure a higher degree of correctness, but it's not fun debugging compiler errors when you're trying to get something working.

For example, being forced to handle errors immediately via return codes or option types is not "fun". By comparison, exceptions act as a giant GOTO and no one blinks an eye.

Dynamic typing is another form of deferred technical debt: you end up handling type errors at runtime or through testing instead of catching them at compile time. People who claim very few bugs are a result of type errors typically do not have experience with ADTs or with type systems stronger than Java / C++ / C.[0]

> Most programmers think that getting run-time errors, and then using a debugger to find and fix those errors, is the normal way to program. They aren't aware that many of those errors can be detected by the compiler. And those that are aware, don't necessarily like that, because repairing bugs is challenging, and, well, sorta fun.

I don't think this attitude has changed in the past 16 years. People prefer to debug a run time stack rather than deal with compile errors, perhaps even more so with the rise of dynamic languages.

[0] The Clojure community likes to argue that bugs arise from mutability more than from type safety, but I don't have enough experience to comment either way.

[+] waveman2|11 years ago|reply
To put it another way

"It's the ecosystem, stupid."

Apologies to Bill "It's the economy, stupid" Clinton.

[+] ellyagg|11 years ago|reply
My dad is a retired DoD engineer who worked extensively with Ada. Sent him this link, and this was his response:

Yeah, I liked Ada because it was very readable, and decomposition into smaller units was easy. I always liked the package concept.

Our compilations weren't all that slow. Maybe an hour or so for 500,000 lines of code. It was always best to compile units individually, not to build everything and compile all at one time. It could take days to get through compile-time errors that way. A lot were due to mismatching data types.

Once we tried to upgrade to a new version of the compiler, but couldn't get it to work with our legacy code. We had the vendor send us an engineer from Phoenix, and he could never get it to work, so we gave up and kept using the old compiler.

I hear Ada is still popular in Europe. There were a lot of useful improvements in Ada95 that eliminated some of the clunky features.

On large systems like ours we had subdivided the code into many separate functional units. Each unit consisted of a directory of multiple packages that all worked to perform a certain function. Each FCI directory would usually contain an interface spec and code for interfacing to any other FCI that needed to share data with it. There were a couple of ways to share data. One way was to send an FCI a message that we were putting data into shared memory, and then the other would grab the data. Another way was to just send a message containing the actual data. This worked ok for small amounts of data. I think to communicate with external devices at the lowest level we were calling c code.

Software engineering on a large project is actually more fun than in a small group, I think. There's more activity and hustle and bustle. Also, it does force a certain amount of discipline that one would rarely have on an individual basis. Some of the code reviews could be brutal. Or at least so it seemed.

Testing was always a challenge on the real hardware because it always involved multiple computer systems networked together sometimes with wireless devices, and it could be a challenge just to get them all stable and talking together, let alone getting your own code working. When we went to a test facility for a week or two to test we always kept our fingers crossed. Usually nothing would work for the first two or three days and then on Thursday and Friday the problems started to go away and were magically solved, and we could go home successful. It was always very nerve wracking because of course there was a whole management schedule that depended on it.

Of course the boring part was writing software requirements documents, software design documents, test procedures and the like. We weren't supposed to design while coding, and weren't supposed to write any code until our design was complete. But a lot of times we didn't know how to design it, so we would write code in advance and call it prototyping. But if we reused our prototype code we could get huge code counts during the code phase, because we had already written that during the design phase and charged it to design.

Since we got assessed partly by the amount of code we could write in a given time, sometimes you would find people who, instead of writing a subroutine, would just duplicate the same code multiple times. It increased their code count. It really didn't pay to write really tight, highly efficient code because of the development time constraints. Then, during testing, you might have to go back and fix it to perform better.

[+] acomjean|11 years ago|reply
> I think to communicate with external devices at the lowest level we were calling c code.

As someone who spent a few years maintaining the Ada-to-C code wrappers our team used, I would say he's correct.

We used Ada wrappers to make the system calls to C for networking and Unix IPC. While Ada can call C directly, we had to support HP-UX and Sun, so sometimes we had to write C code and call that (#ifdef HPUX). This was a pain.

The package system was very good, especially at dividing work among a large team (30+ programmers).

We had a lot of fake external-hardware simulators we used to test our code against. This worked well as long as our simulators behaved like the actual external hardware. It did, with some frustrating exceptions discovered during the final test, of course.

Dealing with time is a pain in all languages (UTC vs GPS, with sidereal time thrown in for fun).

I remember some of those procedures (design -> code -> integrate -> formal test -> ship) that DoD projects had. We counted SLOC (Source Lines of Code), but thankfully our reviews were never based on them. Code reviews could be useful or just frustrating. I suspected some people were just difficult so that they wouldn't be invited to as many of them.

Interesting to hear others' perspectives on it.

[+] davidgerard|11 years ago|reply
>Most programmers think that getting run-time errors, and then using a debugger to find and fix those errors, is the normal way to program. They aren't aware that many of those errors can be detected by the compiler. And those that are aware, don't necessarily like that, because repairing bugs is challenging, and, well, sorta fun. You are not giving a programmer good news when you tell him that he'll get fewer bugs, and that he'll have to do less debugging.

What. the.

[+] kd0amg|11 years ago|reply
Sure, it might sound crazy, but I've met plenty of programmers who were proud of using tools that make them do lots of extra work. It is a skill, even if it's not the most cost-effective way to get a working product.
[+] Tomte|11 years ago|reply
In university I actually tried to get into Ada, especially since I studied compilers with professor Ploedereder (great guy!), who was working on (and at some point chairing) the ISO standardization. I loved the story about how the committee got into a serious fight about whether Sunday or Monday should be the first member of the week enum.

It just wasn't my thing, although I deeply admire the language. There has been fantastic new stuff in the 2005 and 2012 versions. Like the Ravenscar profile for embedded and safety-critical systems.

And just like the "Young Lispers" did with Lisp, Ada had some sort of renaissance, where a lot of Ada stuff was going into Debian and young Open Source people dreamt of re-writing the Internet. :-)

I think it was partly the tool chain that repulsed me. There is really only one compiler in existence if you're a hobbyist: GNAT.

Nowadays AdaCore is pretty open, having a big open-source compiler download page and all, but back then you could get GNAT, as shipped with GCC, or "big GNAT".

GNAT, as shipped in GCC, was always at least one version behind, and since AdaCore always hired everyone who got familiar with the GCC frontend, it was highly dependent on AdaCore. So there was no "second source" where GCC maintainers could step up if AdaCore ever left. The Ada frontend was just a second-class citizen.

"Big GNAT" was something everyone would have liked to use, but it was expensive. So that was out of the question.

Well, I got it, as part of a special university cooperation program, but only after signing that I would only use it for the assigned coursework, that I would never distribute it to anyone, and so on.

Oh, and how did AdaCore manage to keep it closed? I mean, wasn't the predecessor on which GNAT was based GPL'ed?

Yes, and "big GNAT" also was. Kind-of.

Rumor has it that AdaCore basically told all of their customers unofficially: "Sure, you're legally free to distribute it, it's GPL after all. Just don't be surprised if we don't pick up the phone for a few hours when you need support."

As far as I know it never really leaked. For some time I was actively looking for some "big GNAT" archive on the net, but never found one.

[+] jv2|11 years ago|reply
Ada's main popularity problem appears to me to be due to its advantages not being immediately obvious when writing small programs. For example, comparing a small program written in C and Ada the Ada version would appear extremely verbose to most C programmers.

Ada's advantages start to really show when you deal with much larger pieces of software; what appears verbose or overly-strict at a smaller scale provides valuable assistance when dealing with large codebases.

Unfortunately, many seem to dismiss Ada after doing little more than looking at small code examples and complaining about verbosity...

[+] nl|11 years ago|reply
We learnt Ada at University in the late 90s. It was OK, but nothing amazing.

Then I went into the real world, and my first job was with Delphi (Object Pascal).

It's surprising no one discusses Delphi alongside Ada, because it had many of the same features and the same Pascal ancestry, but with better tooling, and it was aimed at commercial as opposed to defense work.

For a while it was very successful. It lost because of commercial reasons, but for a long time it was a better Visual Basic.

[+] acomjean|11 years ago|reply
I used Ada(95) a lot. All the radars built by the company I worked for are using it today.

Ada has a lot of things going for it that I miss. It was robust. Custom types (this is an int from 0 to 100; if it goes out of range, throw an exception). Records (structs) were nicely implemented. The package system really worked well for large software.
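The custom-type feature described above looks roughly like this; a minimal sketch with an invented type name:

```ada
with Ada.Text_IO;

procedure Range_Demo is
   type Percent is range 0 .. 100;   -- a distinct integer type with built-in bounds
   P : Percent := 100;
begin
   P := P + 1;                       -- out of range: raises Constraint_Error at run time
   Ada.Text_IO.Put_Line ("never reached");
exception
   when Constraint_Error =>
      Ada.Text_IO.Put_Line ("out of range");
end Range_Demo;
```

Because Percent is its own type (not just an annotated Integer), the compiler also rejects accidental mixing with other integer types at compile time.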

It was interesting. It suffered a lot from running on systems that are written in C. Although you can make system calls through a wrapper, it was odd, so we wrote packages to interface with the underlying C libraries we needed to call (Java did this as part of its core). We had a lot of libraries to make the networking and Unix IPC work the same across our HP-UX and Sun systems. The displays were written in C, as they just weren't really feasible in Ada.

It never reached the critical mass to get great tools, so debugging could be a pain. It didn't have a lot of useful third party libraries that make a language powerful and fun.

When your Ada compiled, there was a good chance it would work.

I do miss parts of it. When I see Go code sometimes I get flashbacks.

[+] AceJohnny2|11 years ago|reply
I know this is a tangent, but as someone who hasn't programmed Ada, I'm struck by how similar the surface syntax of VHDL is to Ada's.

See: http://en.wikipedia.org/wiki/VHDL#Design_examples vs: http://en.wikipedia.org/wiki/Ada_programming_language#Langua...

Same "use packagename". Same "entity foo is". Same block definition with "begin [...] end entity".

Coincidence, or is there a historical reason for this?

Edit: The answer was in the VHDL article itself: "Due to the Department of Defense requiring as much of the syntax as possible to be based on Ada, in order to avoid re-inventing concepts that had already been thoroughly tested in the development of Ada,[citation needed] VHDL borrows heavily from the Ada programming language in both concepts and syntax."

[+] sitkack|11 years ago|reply
VHDL was modeled after Ada.

And yes, the languages look almost identical.

[+] javert|11 years ago|reply
So who here (at HN) is using Ada, what for, and do you expect it to become more or less widely used in the future?
[+] achille|11 years ago|reply
Until leaving Oracle about a year ago, I'd been doing a lot of development in PL/SQL, which is extremely similar to Ada and compiles to the same intermediate representation (called DIANA).

Most of Oracle's E-Business Suite is written in PL/SQL, although some components are being re-written in Java (ADF) in the new Fusion Middleware stack.

If your company is using Oracle for payroll or financials (this includes lots of companies, including Apple, Google, etc.), it's likely that some of the customizations were developed recently, and they were developed in PL/SQL. Many of the components in EBS have not changed in decades.

So to answer your question, developers at Google right now are writing in Ada (well almost):

> https://www.google.com/about/careers/search/?#!t=jo&jid=1144...

It's not easy to rewrite either. Writing ERP software is extremely hard, because it can't be cleanly modeled in a few abstractions. There are a lot of edge cases. I would suggest reading Ward Cunningham's http://c2.com/cgi/wiki?WhyIsPayrollHard

[+] jmilkbal|11 years ago|reply
I use Ada every day, developing a predictive dialer and contact management system for call centers and its web-based front-end. Interest has been steadily growing since I started with the language in 2006. One, possibly useless, indication is that the #ada channel on freenode used to hover in the single digits and now regularly hovers around 70. People regularly enter the channel asking how to get started.
[+] tjr|11 years ago|reply
We still have some Ada code lingering in a suite of avionics simulation tools; I am actually right now in the process of migrating our code to GNAT from a proprietary Ada vendor.

But we don't tend to write new code in Ada. My observation in the avionics / defense world is that Ada is being used less and less, in favor of C, C++, or sometimes Java. Even some old Ada code is being rewritten in other languages.

[+] tormeh|11 years ago|reply
I hear Kongsberg writes their missile software in Ada.
[+] Jtsummers|11 years ago|reply
Technically not the linked page's title (the delightfully uninformative: Ada - AdaPower.com - The Home of Ada), but the title of the post in it.