Well, it did eliminate some kinds of coding (the part where a human produces machine code from block diagrams) and debugging (the part where you look for errors in that translation).
Just a day ago I mentioned that the printf equivalent that existed in FORTRAN as early as 1956 was able to do type checking of the parameters and compile-time code generation, versus the run-time interpretation in C's printf.

http://news.ycombinator.com/item?id=3964475

http://groups.engin.umd.umich.edu/CIS/course.des/cis400/fort...
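For illustration, a minimal C sketch of the run-time half of that comparison (my own example, not from the linked thread): the format string is just data to the compiled program, so a mismatched specifier still compiles and only fails when printf interprets it at run time.

    #include <stdio.h>

    int main(void) {
        double x = 3.14;

        /* The format string is parsed when printf runs, not when the
           program is compiled; %d expects an int, so this compiles
           (perhaps with a warning) but has undefined behavior. */
        printf("x = %d\n", x);

        /* Nothing in the language ties the specifier to x's type;
           the programmer has to keep them in sync by hand. */
        printf("x = %f\n", x);
        return 0;
    }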
Yes, and the paper distinguishes between programming and coding. Step one is "Analysis and Programming", step two is "Coding." It's step two that FORTRAN virtually eliminates.
Being currently involved in GPGPU research, I find these descriptions reminiscent of the state GPGPU computing is in. CUDA, OpenCL, and now OpenACC have been steps toward a higher-level abstraction of stream computing, and every time a new framework or language bubbles up, its inventors praise it as the end of coding close to the machine.
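As a rough sketch of what that "higher-level abstraction" looks like in practice (an OpenACC example of my own, not from any of those announcements): a single directive stands in for the hand-written kernel, memory transfers, and launch configuration that CUDA or OpenCL would require.

    #include <stddef.h>

    /* SAXPY offloaded via one OpenACC directive; the compiler
       generates the GPU kernel and the host<->device data movement
       that would otherwise be written out by hand. */
    void saxpy(size_t n, float a, const float *restrict x, float *restrict y)
    {
        #pragma acc parallel loop copyin(x[0:n]) copy(y[0:n])
        for (size_t i = 0; i < n; ++i)
            y[i] = a * x[i] + y[i];
    }

Whether the generated kernel matches the speed of a hand-tuned one is, of course, exactly where the marketing tends to get ahead of reality.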
Half a century later, people are still overly optimistic about software development. According to recent studies this is true of just about everything we do (and society at large actually favors such optimism). Yet it is particularly visible in computing. Why is that?

Side note: bullshit in marketing brochures is here to stay.
Half a century later, though, the change being sold is no longer the jump from assembly to an actually useful level of abstraction. And the bullshit-o-meter reading is damped further when half the document shows code examples that would be much more difficult to write in assembly.
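To make that concrete (a sketch of my own, in C rather than the report's FORTRAN): one formula-level statement stands in for a long, error-prone sequence of hand-written machine instructions.

    #include <math.h>

    /* One line at formula level... */
    double root(double a, double b, double c)
    {
        return (-b + sqrt(b * b - 4.0 * a * c)) / (2.0 * a);
    }

    /* ...versus the hand-coded equivalent: load b, multiply by b,
       load a, multiply by c, multiply by 4, subtract, call the
       square-root routine, negate b, add, double a, divide --
       every register and address chosen and checked by hand. */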
What's overly optimistic about saying that interpreted languages will virtually eliminate the need to hand-debug machine code? Isn't that what happened?
If you limit yourself to the types of programs that were created before FORTRAN existed, then this might be true. But of course, as capability increased, demand for more complicated programs increased just as fast (or faster?).
Thankfully, they were right: it did. I've known a lot of C programmers, C++ programmers, Python programmers, and Java programmers; only a tiny handful of them actually knew how to "code" (that is, write machine code). FORTRAN, and the interpreted languages that came after it, really did "virtually eliminate coding and debugging."
Please bear in mind that the report was written in 1954. The real complexity of programming, and of computing in general, was not fully understood at that time. (I'm confident we still do not understand it in its entirety, however.)
However, people will still need to know how to program (assuming AI doesn't become a factor).
I think that by coding, you mean typing? In your example, we won't be eliminating coding; we'll just be entering code using a different language.