top | item 37796292

Why did the Motorola 68000 processor family fall out of use in PCs?

204 points| SeenNotHeard | 2 years ago |retrocomputing.stackexchange.com | reply

248 comments

[+] smackeyacky|2 years ago|reply
I started working professionally in the late 1980s and it seemed like the big competition at the time was between x86 and 680x0 variants.

From the Mac right up to multiprocessing minis (Bull DPX/2 for example could be had with 4 68030), plus a variety of high end workstation vendors (Sun, Apollo) the 680x0 seemed like a much higher end processor than anything Intel based.

Then Sun dropped it in favour of Sparc.

It took a while, but the 386 and its successors gradually ate everything in that space, thanks to Compaq leading the way. Even Sun ended up selling Intel-based machines.

[+] jameshart|2 years ago|reply
Sure, the x86 family took a commanding lead over that time... But over in the corner where nobody was looking, in 1987 Acorn shipped the Archimedes with this RISC based processor they called the 'Acorn RISC Machine', or ARM... and it's looking increasingly like that whole x86/68x00 fight was a sideshow.
[+] jjav|2 years ago|reply
I always enjoyed assembly programming the best on Motorola CPUs. Spent a lot of my youth on the 6809 and later 68020. Still have assembly programming books for both on the bookshelf behind me. Never really did enjoy assembly on the 8086 or 286, very ugly. Did some SPARC assembly programming later and it was nicer than x86 but by then assembly was fading from my life.
[+] tedunangst|2 years ago|reply
HP also had a 68k workstation line before PA-RISC.
[+] panick21_|2 years ago|reply
I mean, this is how it might have appeared, but internally everybody knew it was over. By about 1985 Sun already knew that Motorola wasn't pushing ahead fast enough; that's why they developed SPARC. They did try to get Motorola to be more aggressive and so on, but Motorola wasn't interested.

Even Acorn knew it in 1984, when they evaluated processors.

Motorola was going nowhere quickly in the mid to late 80s.

Apple as usual was late to move and their internal RISC project went nowhere fast.

The surprising part of the story is Intel managed to keep up. They were able to do so because they had absurd volume and could pay far larger teams.

[+] ThomasBHickey|2 years ago|reply
I remember the Apollo reps moaning that the speed of Motorola's 68k chips wasn't keeping up with the x86. I still have an Apollo motherboard with two 68k chips on it -- a workaround to enable virtual memory.
[+] tivert|2 years ago|reply
This is the most interesting answer: https://retrocomputing.stackexchange.com/a/27727/21496. It talks in detail about design choices that made 68k hard to scale.
[+] ajross|2 years ago|reply
I think that's mostly wrong though, because as the P6 demonstrated, complicated CISC addressing modes can be trivially decomposed and issued to a superscalar RISC core.

What really killed 68k was the thing no one here is qualified to talk about: Motorola simply fell off the cutting edge as a semiconductor manufacturer. The 68k was groundbreaking and way ahead of its time (introduced in 1979!), the 68020 was market leading, and the '030 was still very competitive but starting to fall behind the newer RISC designs, leading its target market to switch. The 68040 was late and slow. The 68060 pretty much never shipped in computers at all (it eventually had some success as an embedded device).

It's just that posters here are software people and so we want to talk about ISA all the time as if that's the most important thing. But it's not and never has been. Apple is winning now not because of "ARMness" but because TSMC pulled ahead of Intel on density and power/performance.
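To make the P6 point concrete: a decoder can turn one memory-operand CISC instruction into a couple of simple micro-ops, after which the addressing-mode complexity is gone and the out-of-order core only sees RISC-like operations. A toy sketch (opcode and register names invented for illustration, not any real microarchitecture):

```python
# Illustrative sketch: decomposing a CISC-style instruction with a
# memory addressing mode into simple RISC-like micro-ops, in the
# spirit of the P6 decoder.  Names here are invented for clarity.

def decode(instr):
    """Split one CISC instruction into a list of simple micro-ops."""
    op, dst, src = instr
    if isinstance(src, tuple) and src[0] == "mem":
        # e.g. ADD d0, (a0) -> a load into a temporary register,
        # then a pure register-register op.  The addressing-mode
        # complexity disappears at decode time.
        base = src[1]
        return [("LOAD", "tmp0", base),
                (op, dst, "tmp0")]
    return [(op, dst, src)]

# A two-operand memory ADD becomes two simple micro-ops:
print(decode(("ADD", "d0", ("mem", "a0"))))
# A register-register op passes through unchanged:
print(decode(("ADD", "d0", "d1")))
```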

[+] npunt|2 years ago|reply
Though I don't think it was the root cause of 68k's demise, offering poor backwards compatibility between generations was an unforced error by Motorola. Everything was changing so fast in computing in the 80s-90s, and Wintel's soft guarantee that you could just write/buy software and it'd work in the future was a big selling point.
[+] cmrdporcupine|2 years ago|reply
ColdFire made the compromises necessary to make it scale, and the author points that out. But it was too late.

ColdFire can be made entirely 68k compatible with simple low-overhead emulation software. It could have been a viable path forward for 68k, but they rolled it out almost 10 years too late, after they'd totally given up on 68k.

Motorola abandoned 68k, is what happened.
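The "simple low-overhead emulation" idea can be sketched in miniature: the core executes the instructions it kept and traps to a software handler for the ones it dropped. A toy model (the opcodes and their semantics are invented for illustration, not real ColdFire behavior):

```python
# Toy trap-and-emulate sketch: the core executes the opcodes it still
# implements and traps on the ones it dropped; a software handler
# emulates those.  Opcodes and semantics are invented for illustration.

def hw_add(regs):                    # kept in hardware
    regs["d0"] += regs["d1"]

def sw_movep(regs):                  # dropped, emulated in software
    regs["d0"] *= 2                  # stand-in semantics

NATIVE = {"add": hw_add}
EMULATED = {"movep": sw_movep}

def run(program, regs):
    for op in program:
        if op in NATIVE:
            NATIVE[op](regs)         # direct hardware execution
        elif op in EMULATED:
            EMULATED[op](regs)       # illegal-instruction trap path
        else:
            raise RuntimeError(f"unknown opcode: {op}")
    return regs

print(run(["add", "movep"], {"d0": 1, "d1": 2}))
```

Old binaries keep working; only the rare dropped instruction pays the trap overhead.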

[+] bell-cot|2 years ago|reply
From both vague memory, and a skim of Wikipedia -

https://en.wikipedia.org/wiki/Motorola_68020#Launch,_fabrica...

- I'd say that Motorola management's motto was "Meh, whatever", while Intel management's motto was "Only the paranoid survive".

[+] pfdietz|2 years ago|reply
Motorola was an example of the adage "never invest in a company that has a museum to itself."

(The Galvin Center has since been demolished, replaced by a Top Golf. There's drone footage on Youtube of various stages of the demolition.)

[+] simne|2 years ago|reply
The 68k was good, but it had serious flaws, and it was late. That was enough for Intel to come out first.

The most annoying flaws: a complicated memory interface, very limited old companion chips (from the 6800 family), and the fact that the original 68k needed a second CPU to implement virtual memory.

So what happened? When the first 8086/88 and 68k appeared, RAM was small in all machines, but the 8086 was simpler, an 8086 system was cheaper, and the speed was about the same.

Then came the 286, an improved 8086 with much better speed and support for more RAM. Motorola answered with the 68010, which merely fixed flaws; that was good, but not enough in the race.

Fortunately for Motorola, the 286 was not fully compatible with the 8086 (some programs had to be modified), but many old 68k programs also didn't work on the 68020, so it was a good time to ask why they should bother so much about compatibility if it wasn't achievable anyway.

To be honest, the PC was where I first saw real compatibility: most programs ran without any modification on the 8086/8088/80286/80386/80486, just faster with each iteration.

The first really significant compatibility issues on x86 appeared with the Pentium; as I remember, the Pentium Pro was the first Intel microarchitecture on which 16-bit programs ran slower than on previous Intel CPUs.

But all of that was still in the future during the first round of competition between the 8086 and the 68k, and nobody could be sure about Intel's future before the 80386; many people even considered the 80286 an unfortunate CPU.

So I think that if Motorola had made a simplified 68k with the flaws fixed (to be cheaper, and maybe to integrate more onto the CPU chip), it really could have gained momentum and overtaken Intel in the PC market.

Unfortunately, Motorola decided to make the 68020/68030, semi-compatible with the 68k but not cheap enough, and stayed second in the race. Maybe that was because Motorola had good military sales at the time, so they didn't worry about the future.

After the 80386, Motorola's chance was lost forever. PowerPC was another story entirely.

[+] api|2 years ago|reply
It was only used by closed, vertically integrated brands like Apple and Commodore (Amiga) at a time when open, component-based PCs were all the rage. I don't think M68K motherboards you could drop in a case and build with were even available; if they were, they were never a big thing.

The entire market dominance of the x86/x64 architecture came out of that era and came about because you could build PCs with it and run a variety of software on it including DOS, Windows, FreeBSD, commercial Unix, and later Linux.

[+] yetanotherloss|2 years ago|reply
Also a factor: while clones of the IBM/Intel systems (and earlier of Apple and other 6502 computers) existed, often out of Taiwan, the other vertical vendors managed to keep tighter control. Third-party 68k equipment for Macs and the like was much harder to come by. While that temporarily increased their margins, in the long term it meant PC component costs kept dropping until you could get a 586 desktop that did 75% of a SPARCstation or NeXT for 30% of the cost.
[+] pjmlp|2 years ago|reply
Only because IBM failed to prevent Compaq from starting the component-based PC market in the first place.
[+] kristopolous|2 years ago|reply
It's odd how arguably the most relevant of the lot was fairly obscure at the time: Acorn computers, creator of the ARM processor.

https://en.m.wikipedia.org/wiki/Acorn_Computers

You can run RISC OS on a modern ARM machine like the Raspberry Pi, and it gets weirder the longer you work with it: menu items with input boxes and other GUI widgets; a strange DOS/VMS/UNIX-hybrid CLI that appears at the bottom of the framebuffer, scrolling your graphics screen up as you type; and terminology that is excessively British.

Go to 10 minutes to see the shell https://youtu.be/oL4w3AK6Qpw?si=Vdu2ur1fM0N9Tl2X&t=10m and http://www.riscos.com/support/users/userguide3/book3b/book3_... for the documentation

Also see 8:50 for the British terminology and 12:34 for an example of the weird menus

The thing I like about it is the creators clearly knew what the dominant paradigm was and made a decision to be different. It's nice.

[+] PeterStuer|2 years ago|reply
IBM 'legitimized' the personal computer for business use. Their PC was built around Intel's processor. They accidentally made the design clonable, and the OS was not exclusively licensed.

Compaq was the first to jump on this, but soon the opportunity created a massive ecosystem of competing clones, all able to run the same binary-distributed software packages, which fueled both a software industry and a commoditization of the platform that made it much more accessible and affordable.

This enormous ecosystem success forced Intel to remain backward compatible with old instruction sets to run older binaries, with Microsoft forced the same way on OS APIs and services. Even IBM itself tried, but failed, to counter this runaway train on both the hardware and the software front.

At the processor level I loved the much saner instruction set of the 68000 family (much as I preferred the Z80 over the 6502 before it), but the dynamics of the whole PC ecosystem just steamrolled over the alternative 68000 platforms, even though Atari, Commodore and Apple produced compelling designs.

[+] usrusr|2 years ago|reply
Yeah, it was the vertically unbundled platform that made the decision. Relative technical qualities of x86 vs 68k had very little to do with the outcome I think.
[+] karmakaze|2 years ago|reply
There are many contributing factors, as others have mentioned. I would say the most dominant one is the success of the x86 PC, and that success depended on its continued cost/performance value. M68k systems were on the whole considered workstation machines, whose users looked down on less capable hardware. The exceptions were the Amiga and Atari ST, which unfortunately competed each other out of existence rather than fighting the PC market that was squeezing them out. The Macintosh was more capable thanks to its software rather than its CPU.

Once you have the success of DOS PCs, plus the growing exposure of Windows 2.x (e.g. Windows/386) filling in capabilities, it's hard to compete with technically better but much more expensive systems except in smaller or niche markets. Over time even the server market switched from the likes of Sun SPARC to x86 systems.

The theme is that the cheaper, minimally viable option has the larger market potential and wins if it can find a way to survive. An earlier, smaller example is how the 6502 ate the lunches of other, better 6800- or Z80-based systems. That success later failed due to stagnation of the hardware and operating systems, and, again, competition among themselves. The x86 PC-compatible market allowed competition between vendors while still sharing the same ISA and the DOS & Windows ecosystems.

I grew up in this era having multiple Atari 8-bit & ST systems, using Apple ][, Macintosh, and rare access to Amigas. I was extremely disappointed with DOS+Windows prevailing over the more exciting systems from a graphics/sound gaming perspective. Market size won.

[+] Aloha|2 years ago|reply
In the end I think specialized hardware always loses to software - that's why those things failed.
[+] leptons|2 years ago|reply
> An earlier/smaller example of this is how the 6502 ate the lunches of other/better 6800 or Z80 based systems. That success later failed due to stagnation of the hardware and operating systems, and again in-competition.

6800 and Z80 weren't much if at all better than a 6502 in terms of performance. They're all 8-bit CPUs and limited in various ways. They all also had 16-bit iterations in the 68000, 65816, and Z800.

Intel x86 and the PC and the software being written for the platform is what killed off demand for those other chips. If there were a 68090 (or whatever 68k variant it would evolve into) today inside a modern Amiga platform with a modern GPU, that people still wrote software for, I'd be using it as my daily driver.

[+] SanjayMehta|2 years ago|reply
Speaking from a very narrow perspective: microprocessor based instrumentation in India in the early 80s, it was documentation and support.

We’d write to Intel for information and they would send back a stack of manuals gratis.

Motorola wouldn’t even reply.

[+] jagrsw|2 years ago|reply
I vaguely remember those days, late '80s and early '90s, and a lot of it probably had to do with IBM's reputation. Amiga had its pro uses - like smaller TV stations using it with VideoToaster or Scala for broadcast management. But PCs had this vibe of being "serious" or "professional" that Amiga, Atari, and Mac just didn't have.

On the technical side, Amiga had some downsides too. Its OS was in ROM, and while it was way ahead of DOS and early Windows back in '87, it got outdated by the early '90s. Smaller Amiga models didn't support hard drives without buying a pricey add-on, making them more like game machines where you were stuck swapping floppies.

But if you look at OS and CPU architecture, it was almost like comparing a well-designed system to a mess. DOS was clunky, and x86 had its weird quirks: limited registers, awkward 8-bit compatibility, segmentation instead of paging, an unnecessary separate I/O address space instead of MMIO, a messy assembly language (prefixes, segments, ad-hoc instructions), you name it.
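One of those quirks is easy to demonstrate. In 8086 real mode a physical address is segment * 16 + offset, so many different segment:offset pairs alias the same byte; a quick sketch:

```python
# 8086 real-mode address translation: physical = segment * 16 + offset,
# truncated to the 20-bit address bus.  Many different segment:offset
# pairs alias the same physical byte, one of the quirks that made the
# x86 memory model awkward to program against.

def real_mode_addr(segment, offset):
    return ((segment << 4) + offset) & 0xFFFFF   # 20-bit wraparound

# Two different pairs, same physical address:
a = real_mode_addr(0x1234, 0x0010)
b = real_mode_addr(0x1235, 0x0000)
print(hex(a), hex(b))   # 0x12350 0x12350
```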

[+] roywashere|2 years ago|reply
The Motorola 68xxx wasn't just the Amiga and fun and games. I used to own a Sun 3/60, which is a 68020 machine and most definitely business, much more so than an IBM PC.
[+] MarcusE1W|2 years ago|reply
For professional PC users one additional thing was important: the display refresh rate. In those days you put up with a lot of crap if you had to sit in front of the computer all day. That's why black screens with green or amber text were popular: the monitors were cheaper, but you also didn't notice the screen flicker as much. The popular Hercules card had a higher refresh rate than the usual color graphics cards and, despite its simple look, was considered the professional choice. For a while lower rates were accepted, but by the beginning of the '90s the display refresh rate really mattered.

The Amiga had great graphics, but especially in Europe with the 50 Hz PAL system it flickered too much. Even at the time, 50 Hz was seen as unergonomic. The US, and the Atari ST at 60 Hz, were a bit better, but in general 70 Hz was seen as necessary.

The Atari ST's mono mode ran at 70 Hz, though on a small monitor, and was still quite successful in professional settings. What you could just get away with at the end of the '80s no longer passed in the early '90s: the refresh rate had to be 70 Hz.

At that point the graphics of the Atari ST and Amiga were not that impressive anymore compared to the PC (the ET4000 graphics chip was out), and if the ergonomics didn't fit either, it was hard to argue they were a good fit for professional use, where you use the computer for longer stretches of the day, like word processing.

[+] snvzz|2 years ago|reply
>Its OS was in ROM

This does not mean it wasn't upgradable by using RAM, as both SetPatch and soft kickers (mkick, skick and so on) demonstrate.

Replacing the ROMs is also possible, although not ideal, and kits with Workbench disks and ROM chips were sold cheaply.

Later, it could have been replaced by an EEPROM, too, but Commodore died first.

>Amiga had some downsides too

The main issue was the way pointers were passed around as if handing out candy, without providing IPC mechanisms that didn't rely on shared memory. This hindered attempts to properly leverage memory protection in later models, as well as any reasonable implementation of SMP later on.

In no small part, Commodore was to blame. The first Amiga was released in 1985; MMU-capable m68k CPUs had been available for years. Yet they chose the plain 68000, which had no MMU support. That corner-cutting would later bite them: e.g. the MOVE-from-SR/CCR discrepancy, the vector base forced into low chip RAM, and so on.

[+] cmrdporcupine|2 years ago|reply
Motorola jerked its customers around by announcing the deprecation and death of the 68k not once but twice: first for the 88k (a total flop), then for PowerPC.

Intel played around with making fancy new alternative architectures, too (i960, i860). But it never gave any of its PC customers the impression that x86 was going to be murdered. (Well, until Itanium, but that's much later and was almost serious trouble for them.)

The switch to PowerPC almost killed Apple, in my opinion. They couldn't afford the chaos and instability. System 7.6 on PowerPC was terribly unstable, and offered little to no advantages.

Meanwhile Intel iterated on the (basically inferior) x86 architecture, and with the Pentium and Pentium Pro proved you didn't have to go RISC.

A few years later Motorola/Freescale rolled out ColdFire, which did for 68k what Intel did for x86. But probably about 3-4 years too late, and targeted really only for embedded devices.

68k was great. But these days I don't think I'd want a big-endian machine.
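For anyone who never dealt with it: big-endian just means the most significant byte is stored first in memory. The same 32-bit value in both byte orders, shown with Python's struct module:

```python
# Byte order: the 68k was big-endian (most significant byte first),
# x86 is little-endian.  The same 32-bit value laid out both ways:
import struct

value = 0x12345678
big = struct.pack(">I", value)      # 68k-style layout
little = struct.pack("<I", value)   # x86-style layout
print(big.hex())     # 12345678
print(little.hex())  # 78563412
```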

[+] anthk|2 years ago|reply
OS X on PPC captured a good chunk of PC sales in multimedia and publishing.
[+] orochimaaru|2 years ago|reply
Oh man. I had to study this instruction set for a systems programming course for my undergrad. The project was to make a linker/loader that could run assembly code written in this.

Boy does it have a complicated instruction set. Anyway, we early on negotiated with the instructor on what instructions we would support. Early education in defining scope for success I guess.

[+] readthenotes1|2 years ago|reply
I am surprised that no one has commented on the difficulty of learning/using the 68000 Assembly language versus Intel.

IIRC, there were no books available to me for Motorola assembly-language programming, nor do I remember having easy access to any environment for it.

[+] hshxushx|2 years ago|reply
Intel won because of Windows. None of those answers adequately gives credit to Gates.
[+] vondur|2 years ago|reply
Intel won because of DOS. Once Compaq opened the market to clones, you could purchase an Intel PC from any number of manufacturers that could run DOS, and later Windows. IBM tried to stop this with OS/2, but thanks to its DOS compatibility layer, Microsoft convinced everyone to keep developing for DOS, since DOS software wasn't going away under OS/2.
[+] JdeBP|2 years ago|reply
I think that seven different answers there and at least that many, also all different, posted here indicates that there isn't in fact a demonstrably correct answer explaining why the global market did what it did.

Of course, any economist would say "Welcome to economics!" at this point. (-:

[+] tedunangst|2 years ago|reply
Everyone knew CISC was dead and 88k was the future.
[+] KerrAvon|2 years ago|reply
Yes. That's literally the explanation. Motorola wanted to move people to their 88k RISC CPU. Apple was planning to move the Mac to 88k; the 68k emulator on the first PowerMacs was originally for the 88k.
[+] dsm9000|2 years ago|reply
I liked programming the m68k cpus. They were also the CPU used in my computer science department curricula for assembly language programming classes.

At school we had lots of Sun {2,3,4}, Apollo, HP, Mac, and NeXT computers we could practice on. I kinda saw the writing on the wall when we got a 6-CPU i386 Sequent Symmetry system, and then SPARC, MIPS, and PowerPC machines, while nothing new really came from Motorola. I never enjoyed programming x86 CPUs after being self-taught on 6502 and then m68k systems :-)

I still have an Atari Mega ST and a Sun 2 at home, for sentimental reasons only.

[+] TacticalCoder|2 years ago|reply
Something has to be said about the clock race too... Even Sun workstations with the 68020 would only clock at 16 to 20 MHz, I think. Meanwhile the 386 came out in 1985 (announced in 1984, according to Wikipedia) and eventually reached 40 MHz.

One answer on the Stack Exchange question (with only six upvotes) says that Motorola couldn't keep up in the clock race.

When technology is advancing at an insane pace and you see a CPU line that is already not keeping up with the clock race, it probably doesn't make much sense to bet on that CPU line.

I remember that going from my beloved Commodore Amiga (68000) to a PC felt like going back in time, but the PC's 386 clocked at 40 MHz while the Amiga clocked at... 7 MHz.

So there's that.

[+] mixmastamyk|2 years ago|reply
Because of the vast amounts of money being poured into Wintel (DOStel?) at the time, no one else could compete after a decade-plus of being outspent 10x on R&D by Intel. Not SPARC, not MIPS, and no, not Motorola either. I was just reading about DEC's Alpha in another thread.

Software lock-in, first from IBM and then from Microsoft, contributed to massive consolidation in the industry until the old players tapped out.

Presumably, with Intel's budget Motorola could have paved over the 68k's flaws just like was done with x86.

[+] wk_end|2 years ago|reply
I too suspect they could have - but even if they could have, it wasn’t at all clear that it would’ve been the right thing to do. Even for Intel I don’t think it was clear for a while that their gambit was going to pay off and bury RISC (or Itanium), and there was much less incentive for Motorola to maintain backwards compatibility. A clean design that was faster - and easier to make faster - must’ve seemed like a sure bet at the time.
[+] vondur|2 years ago|reply
I assume that's one of the reasons why they teamed up with IBM and Apple for the PowerPC. More resources that can be used for CPU development compared to them going it alone.
[+] agumonkey|2 years ago|reply
It seems software was a massive influence, because Motorola CPUs were successful in consoles.
[+] cmrdporcupine|2 years ago|reply
Coming back to this thread a day later. I think the real story is that: 68k could not keep up because Motorola just stopped investing in it. Dunno if this is a chicken vs egg thing due to its already declining popularity, or just bad leadership at Motorola, or both. But Intel shoveled money into the R&D furnace for x86 while Motorola spent their energies elsewhere. And so they stopped competing, and then stopped existing.
[+] creer|2 years ago|reply
There were many microprocessor designs out or coming out from everybody. Plenty of them got design-ins (chosen to have a product built around them). For PC, nobody could compete with the wave of the PC-compatibles running Windows on x86.

Many of the new microprocessors did come out in non-PC products like workstations - where it didn't matter as much.