matthewdgreen | 5 years ago:
A lot of reminiscing and feeling old, no actual content:
I did a lot of my early programming on BASIC6, since I grew up nearby and my school gave us DTSS accounts. It was basically the same language as the original BASIC: variable names could only be two characters long, and the result of an IF/THEN could only be a GOTO. I still loved the language, even though it was objectively garbage compared to BASIC7 and BASIC8, which had already been developed. They had cool things like proper subroutines, and using them would probably have been a lot more fun. The interesting thing was that the language was actually compiled, not interpreted like crummy PC BASIC. This made it (relatively) screaming fast.
I wrote my first big program in BASIC6. It was a chat application called ECHO, a clone of something called Xcalibur that the college kids used. It was awful spaghetti code. I remember using up line number 10,000 and panicking because I could no longer find free intermediate line numbers in big chunks of the code. (I started counting by 10s and then gradually filled in a lot of the intervening lines; there was no renumbering procedure I was aware of.) All that speed came in handy, since there was no concept of threading in BASIC6 and my program had to sequentially handle commands from all the users. People hated it with a burning passion.
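The missing renumbering utility is the painful part of that story: a renumber pass has to rewrite every jump target, not just the line labels. Here is a toy sketch in Python (entirely hypothetical; nothing like this existed on the Dartmouth system, and it assumes one statement per line with purely numeric GOTO/GOSUB/THEN targets):

```python
import re

def renumber(program, start=10, step=10):
    """Renumber a BASIC listing and patch numeric jump targets.

    Toy sketch: assumes every line is "<number> <statement>" and that
    GOTO/GOSUB/THEN are always followed by a literal line number.
    """
    lines = [line.split(None, 1) for line in program]
    # Map each old line number to its new, evenly spaced number.
    mapping = {old: str(start + i * step) for i, (old, _) in enumerate(lines)}
    fixed = []
    for old, stmt in lines:
        # Rewrite any jump target that names an old line number.
        stmt = re.sub(
            r"\b(GOTO|GOSUB|THEN)\s+(\d+)",
            lambda m: f"{m.group(1)} {mapping.get(m.group(2), m.group(2))}",
            stmt,
        )
        fixed.append(f"{mapping[old]} {stmt}")
    return fixed

print(renumber(["5 LET X=1", "7 IF X=1 THEN 30", "30 PRINT X", "35 GOTO 7"]))
# → ['10 LET X=1', '20 IF X=1 THEN 30', '30 PRINT X', '40 GOTO 20']
```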
One time I found John Kemeny’s email and showed him my amazing BASIC6 monstrosity. He wasn’t very happy about that.
It's amazing how much you can do with just a few simple statements.
musicale | 5 years ago:
Creating a computer language simple enough for novices to learn in an afternoon but powerful enough to write numerical code, games, and other software was an extraordinary accomplishment, especially in 1964.
The utopian vision of empowering non-experts to unlock the power of computing by writing their own interactive programs for their own purposes is also groundbreaking. (They also expanded access to the BASIC system by developing a time-sharing OS and deploying terminals in multiple locations, which meant the system could be used for synchronous (chat) and asynchronous (email, bulletin board) text messaging and collaboration.)
Every modern computing device with a web browser includes JavaScript, but I can't help imagining what the computing landscape might look like if JavaScript in the browser were as visible and easy to use as BASIC systems were from the 1960s to the 1990s. Why doesn't Firefox come with a button you can press to open a user-friendly programming tab, e.g. with a REPL, a canvas/textarea, a code editor, and the ability to easily share programs and games over the internet?
I think you can (still) learn the basics (haha) of Python/Smalltalk/Logo/Scratch/Lisp/Matlab/JavaScript/etc. in an afternoon, but it's hard to beat BASIC's straightforward simplicity.
I also miss the days when
print "Hello, world!"
was a valid program in both Python and BASIC.
(Note John McCarthy started the work that led to LISP when he was at Dartmouth, but then promptly moved to MIT. Imagine if Dartmouth had gone with LISP instead of BASIC...)
alexjm | 5 years ago:
Did anyone notice how, in the code listings, the O's have slashes through them to distinguish them from zeroes, instead of the other way around? I wonder when it became standard to slash the zero.
I think I've seen some older content where the convention was reversed like this. Wikipedia's article on the slashed zero claims that IBM and some other mainframe makers put their slash on the letter O instead of the digit 0. However, in a bit of circularity, its citation (1) is this same BASIC manual. Checking an IBM manual (2) on card punching, neither 0 nor O has a slash. That manual is undated, but it documents the model 26 keypunch from 1949 rather than the model 29 keypunch from 1964, so it's probably older than this BASIC manual.
1: https://en.wikipedia.org/wiki/Slashed_zero#cite_ref-9
2: https://archive.org/details/bitsavers_ibmpunchedAnIntroducti... (pages 4, 15)
todd8 | 5 years ago:
In the '60s there was no consistent standard for this. I remember this being written about in CACM in 1967; those with access to old copies of the Communications of the ACM can take a look at [1] (https://dl.acm.org/doi/10.1145/363534.363563). Here are the title and abstract:
Towards standards for handwritten zero and oh: much ado about nothing (and a letter), or a partial dossier on distinguishing between handwritten zero and oh, by R.W. Bemer
The Chairman of the ACM Standards Committee, Julien Green, has charged me with making “more effective use of CACM for communication … to get grass-roots opinions from the ACM membership.” This paper is the first attempt.
A partial dossier on distinguishing between handwritten zero and the letter oh is assembled here. This presentation was triggered by a request for guidance in this matter presented by the United Kingdom Delegation to ISO/TC97/SC2, Character Sets and Coding, at the meeting in Paris on 1967 March 13-16. The matter is just now in the province of USASI X3.6, to which comments might be directed.
Comments will be expected within sixty days [by approximately October 1st].
8bitsrule | 5 years ago:
That's a tough question. The slashed 0 (for decimal zero) had been in use long before BASIC and the ASR-33 came along. (In ham radio, for one.) Dartmouth or Teletype may have had a reason for trying this ... but it didn't last!
[Edit] OK, think I found the culprit.
"The rule which has the letter O with a slash and the zero without was used at IBM and a few other early mainframe makers; this is even more of a problem for Scandinavians, because it looks like two of their letters at the same time." -- https://simple.wikipedia.org/wiki/Zero#Telling_zero_and_the_...
I remember in 1976 we were using mark sense cards to write programs, and we definitely had to slash the letter O. I think we also had to slash the Z to distinguish it from a 7.
lsllc | 5 years ago:
At least as early as 1977, the Commodore PET and the Apple ][ had slashed zeros, followed by the Atari 400/800 in 1979, the BBC Micro in 1981, and the Sinclair ZX Spectrum and Commodore 64 in 1982:
https://damieng.com/blog/2011/02/20/typography-in-8-bits-sys...
todd8 | 5 years ago:
Many years ago I adopted the personal convention of using a slash for the number zero and a little curly cue at the top of the capital letter O [1]. By using both in my handwriting where they could be confused, I was free to use just an unadorned letter or digit in situations where context made it clear.
[1] Like the capital O found in the Google font: https://fonts.google.com/specimen/Cedarville+Cursive?categor...
phendrenad2 | 5 years ago:
That seems like it must be a mistake. How could things have flipped 180 degrees like that? I'll need to investigate other programming books from this era...
EDIT: I do appreciate the Game of Thrones look it gives some of the pages. Especially the error messages like "T00 MANY L00PS"
Vextasy | 5 years ago:
It’s a really nicely written manual, clearly targeted at the novice.
Appendix B (Limitations on BASIC) gives a rule-of-thumb for the maximum length of a program as "in general about 2 feet".
It was evidently possible to create a complex program in 2 feet as they make the suggestion (on page 46) that a GOTO or IF might jump to a REM statement.
However, some unusual advice is given on page 4 which might have caused confusion: Because “line numbers also serve to specify the order in which the statements are performed by the computer … you could type your program in any order”.
I have this weird problem where bitsavers.org never works for me. I think they've blocked my ISP (Aussie Broadband), or there's some routing issue. It isn't a new thing; it has consistently been like that for many months. The trailing-edge.com mirror works fine.
remisharrock | 5 years ago:
This reminds me of the first time I visited the Dartmouth campus, not so long ago, and went to the Baker Library: I saw the real one! I then met Petra Bonfert-Taylor at the Thayer School of Engineering, and together we produced an award-winning "C Programming with Linux" MOOC (on edx.org). Dartmouth is so inspiring!
I read that many non-computer departments used this BASIC system to provide "online" training: traditional departments like History, English, and so on. It was incredibly liberating when folks got the computer involved in their non-computer specialties.
whitten | 5 years ago:
The tidbit I didn't know was that (like FORTRAN) this BASIC system didn't require spaces on a line. I expect it recognized keywords directly rather than splitting on spaces and then comparing each word against a keyword list.
Is the source code for the original BASIC available? What kinds of techniques did they use? I think I read it was an actual compiler. Does anyone know what limitations were imposed on BASIC by that compile-only implementation?
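On the no-spaces point: with a fixed set of reserved keywords and variable names limited to a letter plus an optional digit, a statement stays unambiguous even with every space removed. A toy scanner sketch in Python (an assumption about the general technique, not the actual Dartmouth compiler, whose source I haven't seen):

```python
# A deliberately small keyword list for illustration.
KEYWORDS = ["PRINT", "GOTO", "GOSUB", "THEN", "LET", "IF", "FOR", "NEXT", "READ", "END"]

def tokenize(stmt):
    """Toy space-insensitive scanner in the spirit of early BASIC.

    Strips spaces first, then greedily matches keywords, numbers,
    variables (one letter plus an optional digit), and operators.
    """
    stmt = stmt.replace(" ", "").upper()
    tokens, i = [], 0
    while i < len(stmt):
        for kw in KEYWORDS:
            if stmt.startswith(kw, i):
                tokens.append(kw)
                i += len(kw)
                break
        else:
            if stmt[i].isdigit():
                # Run of digits: a line number or numeric constant.
                j = i
                while j < len(stmt) and stmt[j].isdigit():
                    j += 1
                tokens.append(stmt[i:j])
                i = j
            elif stmt[i].isalpha():
                # Variable name: a letter plus an optional digit.
                j = i + 1
                if j < len(stmt) and stmt[j].isdigit():
                    j += 1
                tokens.append(stmt[i:j])
                i = j
            else:
                # Single-character operator or punctuation.
                tokens.append(stmt[i])
                i += 1
    return tokens

print(tokenize("20IFX1=5THEN40"))
# → ['20', 'IF', 'X1', '=', '5', 'THEN', '40']
```

Because keywords are reserved and variable names are so constrained, the scanner never needs a space to know where one token ends and the next begins.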
My first foray into programming was QBasic, and my "manual" was reverse-engineering GORILLA.BAS, because as a curious 11-year-old I just had to make that banana more explosive.
herodoturtle | 5 years ago:
This article, as others have said, was more of a good trip down memory lane :)
My first programming "job" (i.e. something I did for someone else, no money was exchanged) was removing all sound code from GORILLA.BAS so that it could be used in the back row of an otherwise boring class.
Funny how everywhere in the manual, the letter 'O' has a strike-through (slash), while the digit '0' does not. I have seen lots of slashed zeros in my experience, but I've never seen the reverse.
tempfs | 5 years ago:
Actual LOL. Kemeny seems like he was the kind of guy who had a good sense of humor, and every good comp-sci professor needs counterexamples.
8bitsrule | 5 years ago:
"TYPING IS NO SUBSTITUTE FOR THINKING"
Uh, OK. Turns out it was in the 12th and 13th centuries that a slashed zero was first known to have been used!
https://en.wikipedia.org/wiki/Slashed_zero#Origins
sellyme | 5 years ago:
http://bitsavers.trailing-edge.com/pdf/dartmouth/BASIC_Oct64...
skissane | 5 years ago:
Maybe you are facing a similar issue?