Perhaps the most interesting part is the last question:
Gazette: Despite this, the machines sometimes get rapped as being old-fashioned, obsolete.
McIntyre: Today, if we positioned the 64 in the marketplace – forget RAM, bytes, bits – if we went out and functionally described the Commodore 64, it would be heralded as a fantastic advancement in personal micro-computing.
The problem is that if you start to talk to people who have been in the business since its inception, they start to get jaded: "it's only eight-bit."
Who cares? You are buying this machine for a specific reason. If it satisfies that need, it is never obsolete. Only your requirements become obsolete. If you no longer require it, then you obviously no longer need hardware to satisfy the need. The need ceases to exist – not the hardware. If the need continues to exist until the year 2000, then that machine is still satisfactory.
There is no such thing as hardware obsolescence. That is a phrase that was coined by the naysayers in this industry. That's baloney.
---
In a way, he's not entirely wrong, but I think it's also quite myopic. Sure, a C64 can be "satisfactory" even today for many tasks in the sense "it'll work", but that's not the same as "good". The C64 is 320×200; even just a doubling of that to 640×400 (never mind fancy things like 800×600 or 1024×768) can be a huge productivity boost in many scenarios simply because you can display more text, never mind all the "RAM, bytes, bits" that do actually matter since you can do more faster.
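To make the screen-real-estate point concrete, here is a quick back-of-envelope sketch in Python. It assumes 8×8-pixel character cells (which is what the C64's text mode uses); the function name is just for illustration:

```python
# Text capacity at various resolutions, assuming classic 8x8-pixel
# character cells like the C64's text mode. Nothing here is C64-specific
# beyond the cell size; it just shows how fast text capacity grows.
def text_grid(width, height, cell=8):
    """Return (columns, rows) of text cells on a width x height screen."""
    return width // cell, height // cell

for w, h in [(320, 200), (640, 400), (800, 600), (1024, 768)]:
    cols, rows = text_grid(w, h)
    print(f"{w}x{h}: {cols}x{rows} = {cols * rows} characters")
# 320x200 gives the C64's familiar 40x25 = 1000 characters;
# 640x400 already quadruples that to 80x50 = 4000.
```

Doubling each dimension quadruples the character count, which is why even a modest resolution bump was such a productivity jump.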
There are markets where I think he'd be broadly correct--the TI-83 (well, TI-84 now, I guess) is the one that comes to mind. Its target market caps feature development (you don't want it to be too powerful a computer, lest teachers feel uncomfortable letting students use it on tests), and this means that updating models is largely about minor hardware changes (e.g., switching out a bank of AAA batteries for a rechargeable battery).
However, given the context of the interview in general ("you're losing market share, sales are slowing, you're losing customers, etc., what are you going to do about it?"), arguing basically "nothing's wrong" bespeaks a certain arrogance that would, even without the benefit of hindsight, give me tremendous pause.
Many people here seem to miss what 95% of users did with the C64 (and other 8-bit machines): video games.
The trick was to sell to parents the idea that they were buying a powerful but still cheap machine for their kids so they could learn how to use computers[0].
But once it was in their bedrooms, 95% of kids spent all of their time just playing games, ignoring what "RAM, bytes, bits" or even BASIC were.
Once more sophisticated consoles (Nintendo, Sega) and 16/32-bit PC games hit the market, the C64 lost all of its appeal.
[0]: https://substackcdn.com/image/fetch/f_auto,q_auto:good,fl_pr...
The article is right if you look at computers as standalone devices. The problem is, they aren't standalone devices for most people. Even prior to the popularity of the Internet, people exchanged documents. With the rise of the Internet, people started consuming more, and from more sources. Interoperability became more important, and other people started defining standards that were well beyond the capabilities of 8-bit micros. The C64, among others, was doomed.
(For what it's worth, I was encountering people who used 8-bit micros well into the late 90's. They were happy with what they had, but they were also very much using their computers in isolation.)
However, old computing does have a place. As has been mentioned before here on HN, Ben Eater has done an awesome job at getting folks involved with hardware and low-level software design:
YouTube: https://www.youtube.com/c/beneater Reddit: https://www.reddit.com/r/beneater/
Note that I don't know the guy (though I really wish I did; I'd love to have a couple beers with him). I stumbled across his stuff just this year and loved it, thanks to my background along with my family's background.
In the absence of elementary silicon design classes, we find YouTubers stepping up. I am rather critical of such folks, as most tend to be super pushy with sponsorships, ads, and whatnot; however, his videos (thus far) do not do that. Note that I did (eventually) buy a couple of his kits.
> There is no such thing as hardware obsolescence.
That's a bit naive. Older hardware, such as Android 4 devices, cannot be upgraded to TLS 1.3. Hardware also becomes obsolete when wear items are no longer available - and that could include 3.5" floppies, or compatible batteries, or even RAM modules in some applications. Not to mention Wi-Fi devices that support only 802.11a, or G1 cellular phones, or even analogue television in the United States.
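The TLS point is easy to see on the software side. A minimal Python sketch (this inspects the local OpenSSL build the interpreter links against, not any particular device):

```python
import ssl

# True only if the linked OpenSSL supports TLS 1.3; on a stack old
# enough, no configuration flag can turn the protocol on after the fact.
print(ssl.HAS_TLSv1_3)

# A modern client context can refuse legacy protocols outright, which is
# exactly what strands devices whose stacks top out at TLS 1.0/1.1.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
```

When servers raise their minimum version like this, a device whose TLS stack can't be updated is effectively obsolete regardless of its hardware condition.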
Often devices are still needed, but the ecosystem surrounding their usage has changed. That is what often brings around obsolescence in this industry.
"If [this machine] satisfies the need, it is never obsolete. (...) If the need continues to exist until the year 2000, then this machine is still satisfactory. There is no such thing as hardware obsolescence. That is a phrase that was coined by the naysayers in this industry. That's baloney."
It is so far from reality for the general public, but so true for my 1980s machines, which are still great fun to play with and program!
As a counterpoint: I don't find using my old phones and tablets fun. They are all worse than I remember and extremely laggy. Imagine trying to play Apex on a computer from 1980. There's no way you are going to have fun playing that.
That article came out just around the time 286 clones became affordable, which was the beginning of the end of the 8-bit age. A 286 machine was still a little over $1000, but that's inclusive of the monitor and a hard drive. The performance, though, was 'on another level' compared to 8-bitters, and people like John Carmack and Michael Abrash were just about to figure out how to make games like Commander Keen.
286 was a very shitty system: it had only a beeper, practically no DMA, no graphics hardware acceleration of any kind (not even hardware scrolling; sprites were science fiction). Almost everything in that shitty PC bucket was driven by the processor, and that processor was dog slow.
That shows how bad PC clones were at some types of video games. It took a massive programming effort to do a side scroller, requiring machines that arrived six years after the Commodore 64 or Famicom.
Oh wow, that made me feel old. I actually read that article just before making the switch from the Radio Shack TRS-80 ecosystem to Commodore. See the Tussey ad for a 64C with an FSD-2 floppy? I bought that package from them with a repackaged c.Itoh thermal printer. It actually had the mail-in redemption offer the Commodore guy referred to in the article.
It felt like I had that C64 forever. I learned CBM BASIC, 6502 assembler, and even K
The gorgeous thing about these machines was that what sold a game was clever playability. Even the games that tried to create an immersive world had to span 4-6 floppy disks, and mostly had to lean on imagination, and your ability to see a fantastic world in colored pixels. I learned to code on a VIC-20, then a C128, though the C64 had basically all of the fun software. When the Amiga came out, you had to pay $500 to get a C compiler to build software on it, but it was a dream. When Commodore finally tanked, I looked at the 80286 and Win 3.11, and nearly gave up on computing. Then: Linux. But for all of them, the beauty came because you could "touch the bottom" of the virtual world they offered.
1988 was definitely the pinnacle of the C64 in Britain as the games developers there had stretched the HW to the absolute limits then. I don’t think USA C64 owners had as rich an experience with those machines as we did in Britain. From 1989 onwards was the rise of the Amiga but the C64 still kept Commodore afloat until 1993.
The pages of code beginning on page 82 bring back some memories. There was a certain childhood thrill that came from changing some of those values despite not really knowing what was going on.
Instead of focusing on the obvious progress story, let's try to see what the C64 still has going for it, plus some unmentioned drawbacks:
- It was a somewhat open system that you could build something for without asking for permission!
- You could copy all software.
- The longevity of the hardware surpasses most devices built after for a fraction of the cost.
- It was repairable.
- The most-sold single compatible architecture you could be productive on in the history of mankind, probably forever.
- 15W is pretty low power considering the lithography at the time (micrometers).
- Keyboard built in that still works 40 years later!
- S-Video built in before it was even called that; not even the Amiga had that!
- The SID is unparalleled still in some regards.
- The VIC was incredible, sharing the 2 MHz RAM with the 1 MHz CPU, interleaved, plus hardware sprites!
The only flaws I can find are:
- The IEC bug that slowed I/O from the VIC until the 128.
- One-button joysticks?! The device has POTX and POTY, why not use those?!
- No built-in assembly monitor in the default configuration.
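The interleaved bus access praised above can be sketched as a toy model in Python. This is purely illustrative (the "CPU"/"video" labels and the strict alternation are simplifications; the real VIC-II also steals extra CPU cycles on badlines to fetch character data):

```python
# Toy model of the shared-bus trick: RAM fast enough for 2 million
# accesses per second is time-sliced between a 1 MHz CPU and the video
# chip, each taking alternate slots, so neither ever waits for the other.
def bus_schedule(slots):
    """Yield (slot, owner) for each access slot on a 2 MHz memory bus."""
    for slot in range(slots):
        yield slot, "CPU" if slot % 2 else "video"

owners = [owner for _, owner in bus_schedule(8)]
print(owners)  # strict alternation: each chip sees its own 1 MHz bus
```

The elegance is that, in the common case, video refresh costs the CPU nothing at all.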
To me human progress has ONLY been the C64 -> Raspberry Pi 4 arc: from micrometers to nanometers. And the Raspberry Pi uses Linux and has all the compilers built in, and still very few kids use it for its initial intended purpose: to build software!
With electricity prices rising, I think the Raspberry Pi 4 will have a resurgence, so I always develop my software to work on that device too.
But most importantly: with computers, as with protocols, there is no competition; you need to follow the standard, and in the 80s that standard was the C64. I think ARM+Linux will be that final standard, unless RISC-V can pull the open-hardware trick off!
Seeing kids who weren't even born when the C64 died buy one and try to develop for it is a pretty good sign that the C64 will survive!
The video chip was called the VIC-II. The VIC was in the VIC-20. They are similar in operation but not compatible.
Digital joysticks were best for most games. It's too bad no one brought out a gamepad, as they are better for some games. A second fire button could have been added. There were analog joysticks available, but typically for special software such as flight simulators. It's too bad Commodore didn't put out an analog stick to encourage their use in driving and flight games. The joystick industry would have run with it and brought out lots of them.
The VIC-20 didn't have a built-in assembler either. Jack wasn't a computer user so he didn't see them as being important and was more focused on keeping the prices low, especially for the home models. Commodore did release ML monitors and even the Super Expanders on cartridge for the VIC-20 and C64.
The IBM PC only won because it was built from off-the-shelf parts, meaning the hardware was effectively open. Once the BIOS was cloned, there was no way anyone else could keep up. The only survivor is Apple, and they almost didn't make it. If the CP/M machines had had time to become mainstream on 16-bit processors before the IBM PC was cloned, CP/M might have won, and Digital Research might have become what Microsoft is today.
I wonder what the equivalent headline would be today. Which hardware or software technologies are currently blasting away, but if you look at the cutting edge, are actually obsolescent?
I'm going to get buried for this, but even though I have been a JavaScript programmer for many years, on the software side I think it's JavaScript, and a few years after that, web browsers.
I see WebAssembly as a compelling target for many types of applications on the backend, in blockchains, etc.
I think that _could_ also eventually translate into momentum for standards for I/O devices for WebAssembly.
After years of trying many languages, I started using JS many years ago, and it's so fast and familiar (with a big library of routines built up) and versatile (with HTML & CSS) that I won't be leaving it behind for personal projects.
"Sometimes I listen to software developers, and I get a little bit angry. I want to ask them 'Why are you trying to kill this product? Is there not enough installed base to support your efforts?'"
Oh boy Rich... posterity has given you an answer!
Edit: There is a review later on, however, that was fun. A game called Trap! could be sped up by poking memory.
And then I got to the file listings... some of the programs you literally entered in machine code! How did people ever find the time?!
Typing in those hex bytes didn't actually take that long. The worst part was making a data entry mistake. Later "versions" of the listings had a checksum for every line, but the earlier ones didn't. Finding where you typed "6E" instead of "6F" could be a real chore.
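The per-line checksum idea can be sketched like this (a simplified additive variant for illustration only; the magazines' actual "MLX"-style algorithms differed and changed over time):

```python
def line_checksum(hex_bytes):
    """Sum a listing line's byte values, modulo 256."""
    return sum(int(b, 16) for b in hex_bytes) % 256

line = ["A9", "00", "8D", "20", "D0"]   # LDA #$00 / STA $D020
good = line_checksum(line)

# The classic typo: one wrong hex digit changes the sum, so the mismatch
# is caught at the line you just typed instead of hours of debugging later.
typo = ["A9", "00", "8D", "20", "D1"]
print(f"{good:02X}", line_checksum(typo) == good)
```

A purely additive sum like this would miss transposed bytes, which is one reason the real algorithms grew more elaborate over the years.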
8-11 year old me made the time because after the sticker shock of two video games, my parents learned about and got me a subscription to Compute! I also spent the summer gathering up bottles and doing odd tasks so that I could afford to get a disk drive because my siblings kept stealing my tapes to record songs off the radio.
> some of the programs you literally entered in machine code! How did people ever find the time?!
IIRC there was a program in BASIC you had to run to start the machine language prompt. Entering lines basically involved typing a series of numbers and their checksum. If you entered the line without errors, a bell sound would play. Otherwise you would hear a buzzing sound.
I knew how to touch type and would just treat the number row on the keyboard as the home row. It was actually faster to transcribe the machine code due to the checksum feature compared to BASIC code.
Looking back on it, I wish they would have just had the program listings in assembly language instead of machine code. That would have made it easier to follow along.
It's worth noting that the just-in-time supply chain so prevalent now is probably what allows the fast pace of change. In the 80s, you made a product and needed to lock in supply contracts, warehousing, sales, etc. There was so much inertia in those that you ended up wanting to keep your working product going without changing it.
Considering the millions of man-years that have been poured into 16-bit CPUs and architectures, you've gotta suspect that the ROI of sticking with 8 bits would have been a better choice.
More generally, I'd suggest that ten thousand ants could convey a ton of leaves with much better overall energy efficiency than a single pickup truck. ('Relative greenness') Might even be a law of nature!
Double the pixel count would be 448x280.
With that machine my nostalgia is absolutely maxed out :)
Funnily enough, Melbourne Metro used C64 systems well into the 2000s for displaying timetables.
The C64 wasn't even a good 8-bit machine to start with.
And even if it were, it is so removed from today's computers as to be useless. Really.
I'd say a 386 is not obsolete from that point of view (of the discussion). Or even the Amiga, as others mentioned. But the C64 is.
I guess if you assume that computation requires no energy…
This is the result of pushing cross-platform development to the extreme: https://www.c64demo.com/big-parallax-logo-mover/
Maybe once the 386 came along, then the PC stepped into the picture.
There were SO many Commodore ads and articles in that magazine. I'm guessing this was a version of Compute! just geared toward Commodore, correct?
For recent C64 SW, see: https://csdb.dk/latestreleases.php
https://www.the8bitguy.com/wp-content/uploads/2021/01/Petsci...
https://www.youtube.com/watch?v=ZZfM1lkLuMI
Looking back now, it seems there were way too many options. It would have been confusing as a consumer to choose.