I would guess that Ada is simply better known. Keep in mind that tech exploded in the past ~3.5 decades, whereas those languages are much older and lost the popularity contest. Ask most people about older languages and, beyond the obvious C and (not quite accurately) C++, the replies thin out really quickly. COBOL, Ada, Fortran, and Lisp are probably the ones people are most aware of, but beyond that?
Turbo Pascal could check ranges on assignment with the {$R+} directive, and Delphi could check arithmetic overflow with {$Q+}. Of course, nobody wanted to waste the cycles to turn those on :)
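For anyone who never used those directives, here is roughly what `{$R+}`-style range checking amounts to, sketched in Rust with a hypothetical `Percent` subrange type (the name and API are mine for illustration, not Pascal's):

```rust
// A sketch of what Turbo Pascal's {$R+} range checking does. `Percent`
// plays the role of a subrange type (0..100); every assignment goes
// through a constructor that aborts on out-of-range values, much like
// a Pascal runtime range-check error.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Percent(u8);

impl Percent {
    fn new(v: i32) -> Percent {
        // This is the {$R+} behavior: crash on out-of-range assignment.
        assert!((0..=100).contains(&v), "range check error: {v}");
        Percent(v as u8)
    }
}

fn main() {
    let p = Percent::new(42);
    println!("{:?}", p); // Percent(42)
    // Percent::new(120); // would panic, like {$R+} at runtime
}
```

The cost the comment alludes to is exactly that extra compare-and-branch on every checked assignment, which is why people turned it off in release builds.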
I would argue that was one of the reasons why those languages lost.
I distinctly remember arguments about functions working on an array of 10. Oh, you want an array of 12? Copy-paste the function so it takes an array of 12. What a load of BS.
It took Pascal years to drop that constraint, but by then C had already won.
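The complaint above, a function signature baking in one specific array length, is exactly what parameterizing functions over the length later fixed. A minimal sketch using Rust's const generics (the `sum` helper is mine for illustration):

```rust
// Early Pascal fixed the array length in the parameter's type, so a
// procedure taking `array[1..10] of Integer` could not accept an array
// of 12. With the length as a type-level parameter, one function
// handles every length while staying fully statically typed.
fn sum<const N: usize>(xs: &[i32; N]) -> i32 {
    xs.iter().sum()
}

fn main() {
    let a = [1; 10];
    let b = [1; 12];
    // Same function, two different lengths, no copy-paste.
    println!("{} {}", sum(&a), sum(&b)); // 10 12
}
```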
I never ever wanted the compiler or runtime to check a subrange of ints. Ever. Overflow causing a program crash would be better (that I do find useful), but arbitrary ranges chosen by the programmer? No thanks. To make matters worse, those ranges are checked even on intermediate results.
I realize this opinion is based only on my own experience, so I would appreciate a counterexample where it is a benefit (and yes, I worked on production code written in Pascal, the French variant even, and migrating it to C made it hilariously more readable and maintainable).
weinzierl|4 months ago
Can't tell you what the current state is, but this should give you the keywords to find out.
Also, here is a talk Oli gave in the Ada track at FOSDEM this year: https://hachyderm.io/@oli/113970047617836816
afdbcreid|4 months ago
There have been some talks about a general pattern type, but it's not even approved as an experiment, let alone at the RFC or stabilization stage.
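For context, the closest thing stable Rust offers today is the `NonZero` family in `std::num`, which gives a taste of what a general pattern/range type would buy:

```rust
use std::num::NonZeroU32;

// NonZeroU32 enforces a single range-like invariant (value != 0),
// checked once at construction. The excluded value also serves as a
// niche, so Option<NonZeroU32> is the same size as u32.
fn main() {
    let n = NonZeroU32::new(7).expect("zero is rejected");
    assert_eq!(n.get(), 7);
    assert!(NonZeroU32::new(0).is_none());
    assert_eq!(std::mem::size_of::<Option<NonZeroU32>>(), 4);
    println!("ok");
}
```

A general pattern type would extend this idea from "not zero" to arbitrary ranges, which is what the talks mentioned above are about.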
pjmlp|4 months ago
For some strange reason people always point to Ada for it.
Veliladon|4 months ago
18-year-old me couldn't appreciate what a beautiful language it is, but in my 40s I finally do.