top | item 21456855

1984, the Year of the 32-bit Microprocessor (1984)

78 points | indigodaddy | 6 years ago | archive.org | reply

48 comments

[+] elwell|6 years ago|reply
Paged through some and found an article on Speech Recognition [0]:

Most interesting sentence: "Currently, it is not likely that such techniques would be used for domestic surveillance."

    Philosophical Issues
    The specter of Big Brother may not be of concern in
    Western society today, but the evolution of distributed 
    intelligence among machines with speech-recognition
    capability certainly provides the technical base for
    monitoring our activities. In fact, the U.S. National
    Security Agency has developed what may be the world's most
    advanced speech-recognition algorithms. This system spots
    keywords in intercepted verbal transmissions from
    "unfriendly" nations. Currently, it is not likely that
    such techniques would be used for domestic surveillance.
    But speech technologists as well as the public must be
    aware of the potential loss of privacy.
[0] - https://archive.org/details/byte-magazine-1984-01/page/n213
[+] stevenwoo|6 years ago|reply
This was five years before a $10,000 dedicated speech-recognition card would be available for home PCs. I remember when we got one for testing while I was working at IBM during college: it was huge, like the double-slot graphics cards you can buy today. Now we can do the equivalent on our phones. It's hard to predict sometimes how fast, or where, technology is going to advance.
[+] wscott|6 years ago|reply
Well, they understood the significance of the year.
[+] romwell|6 years ago|reply
Emphasis on "currently".
[+] snvzz|6 years ago|reply
By the requisites they list, which focus on the ISA, the original 68000 was already a 32-bit architecture in 1979. But it would indeed take until the 68020 in 1984 for the CPU to get a 32-bit bus, and for 32-bit operations to no longer be internally split into 16-bit ones.

I admire Motorola's foresight in making the ISA itself 32-bit from the get-go.

[+] hermitdev|6 years ago|reply
When I was in college studying electrical engineering, we used a 68K (I forget the exact version) in our microcomputers course. In lab, the "computers" we worked on were hand-assembled, wire-wrapped from individual chips. Quite the sight to look at, especially considering that it all fit in the footprint of a then-standard desktop case.

What I do remember is that while the ISA was 32-bit, the data bus was 16-bit. We had both hardware and software debuggers. The hardware debugger was basically a switch for going from the regular clock to single-stepping, plus a set of hex displays on the front of the computer that let you see what was on the data bus; no internal CPU state. The software debugger let you see CPU registers and display memory. No stack trace.

One of our projects was to write a resident monitor, maybe more analogous to a super-minimal microkernel. Because the monitor ran in supervisor mode (normal code ran in user mode), and our software debugger also ran in supervisor mode, we couldn't use the software debugger on our monitor; we could only use the hardware debugger. I got really good at decoding 68K instructions from reading the hex display while single-stepping.

A silly aside from that last comment: we built our monitor on then-modern PCs; specifically, I had a laptop running FreeBSD, because it had a 68K emulator available. To get our assembled code onto the microcomputer, we had to transfer it over a 1200 baud serial connection, and it took around 30 minutes to transfer our monitor before we could even test. So we heavily utilized the emulator, which was nearly instantaneous. We had run all of our tests against the emulator and everything looked good. We were still in lab at around 2 AM when our professor walked in and asked how we were doing. I responded that we were doing great and had just uploaded our latest code to test. Our test on the actual hardware failed spectacularly. I started using the hardware debugger to find out why, and it turned out to be an addressing issue: the instruction was using the default 16-bit addressing instead of the intended 32-bit addressing. It was literally a 1-bit bug. I manually edited the memory, reran it, and it was fine. There might have also been a jokingly light backhand across the face of the team member who wrote the routine that failed.

[+] djmips|6 years ago|reply
According to Sophie Wilson, one of the reasons Acorn decided to go ahead with their own CPU design was the fact that in 1983 the 68000 and other contemporary CPUs were bottle-necked on memory access.
[+] mschaef|6 years ago|reply
> By the requisites they list, which focus on the ISA, the original 68000 was already a 32bit architecture in 1979,

Huh? The second paragraph states: "First, let's define our terms. A 32-bit microprocessor has a full 32-bit architecture, a full 32-bit implementation, and a 32-bit data path (bus) to memory." (And then the author spells out what each of those three things means individually.)

IIRC, the 68000 missed both the 32-bit implementation and the data path.

Interesting to note, though, that Intel's later 80386SX product would NOT qualify for the article, even though it's a compatible reduction of a chip that does. (The SX version of the product had a 16-bit bus and was physically closer to an 80286 in terms of external hardware interface.)

[+] klodolph|6 years ago|reply
For some history—the early Macs in 1984 used 24-bit addressing. The 68000 itself had a 24-bit address bus, but the address registers were 32 bits. So you could use the top 8 bits for some extra data, and programs actually did this (including ROMs/system software for early Macs). This meant that your memory would max out at 8 MiB, with the other half of the address space used for something else (ROM?).

When System 7 came out, 32-bit addressing became available but not all software was “32-bit clean”. So you could enable or disable 32-bit addressing in the memory control panel. This is similar to the A20 gate on IBM PCs.

By 1993 or so the hardware was no longer built to support 24-bit addressing. PowerMacs followed soon after.

[+] zackmorris|6 years ago|reply
Ya, I miss the 68000. It was the first assembly language I learned. It was much closer to RISC than x86 is, so there were plenty of registers, and the instructions were generic enough that you could map C code almost directly to assembly. The "32-bit clean" stuff caused a lot of bugs for years after 32-bit memory arrived, though.

What really killed it, though, was perceived slowness. Apple insisted on shipping generations of Macs with crippled buses running at half the bit width of the CPU and/or at reduced frequencies.

I remember the very fastest Macs in the early 90s not even being able to run a scrolling 2D game at 640x480 in 8-bit color, because it just wasn't a priority. Meanwhile a 486 could run DOOM at 320x240. We had to wait until the 60 MHz PPC 601 arrived, and even it struggled with 640x480 graphics (the lowest resolution available). On top of that, most software was still emulated 68000 code, so the perceived slowness of Macs continued until Steve Jobs returned and introduced the colored iMacs.

I eventually came to love x86 assembly for the brief time I used it though. It has so many easter egg instructions for moving bytes around wider registers and getting free side effects with memory access that it felt like there was always another way to gain a bit more performance out of hand-rolled loops.

Of course that's all gone today, because instruction sets are so byzantine that a good compiler will usually beat hand-rolled assembly. The real processing happens as RISC micro-ops beneath the microcode, so a human can't really know the optimal way to string instructions together to keep the pipeline full or avoid cache misses. Also, SIMD expanded the solution space to such a degree that it's almost pointless to do anything directly. Better to use a vector language like MATLAB and compile to SIMD-accelerated C, or go to GPU processing instead, IMHO. Otherwise you're perpetually struggling with premature optimization and can't work at a productive level of abstraction.

[+] saagarjha|6 years ago|reply
I wonder if we’ll ever have to make today’s software “64-bit clean”.
[+] aidenn0|6 years ago|reply
This was an interesting year for S-100, and it's reflected in the ads. The writing was on the wall for the S-100 bus on microcomputers (the PC platform was displacing it), so they were pivoting to the high-end (look at the S-100 SBCs designed to be used in larger installations, the way that VME would eventually be used). There's a good mix of ads for both applications in this issue. The standard was eventually retired 10 years later.
[+] Davesjoshin|6 years ago|reply
Page 147 shows an advertisement for computers and parts. One of the companies present is Pied Piper.
[+] ggambetta|6 years ago|reply
You mean... you mean they figured out time travel tech???
[+] acqq|6 years ago|reply
From the same BYTE, something even today can be recognized as an absolutely amazing achievement:

https://ia600609.us.archive.org/BookReader/BookReaderImages....

                                Turbo Pascal     | IBM Pascal    | Pascal MT+
    Price:                      $49.95           | $300.00       | $595.00
    Compile and link speed:     1 sec            | 97 sec        | 90 sec
    Execution speed:            2.2 sec          | 9 sec         | 3 sec
    Disk space (16-bit):        33K with editor! | 300K + editor | 225K + editor
    Disk space (8-bit):         28K with editor! | Not Available | 158K + editor
    ...
    Locates run-time errors
      directly in source code:  YES              | NO            | NO

----

Extended Pascal for your IBM PC, APPLE CP/M, MSDOS, CP/M 86, CCP/M 86 or CP/M 80 computer features:

- Full screen interactive editor providing a complete menu driven program development environment.

- 11 significant digits in floating point arithmetic.

- Built-in transcendental functions.

- Dynamic strings with full set of string handling features

- Program chaining with common variables.

- Random access data files.

- Full support of operating system facilities.

- And much more.

[+] michaelhoffman|6 years ago|reply
The ads in this issue for long-forgotten products are amazing.
[+] indigodaddy|6 years ago|reply
Yes, quite a treasure trove, I noticed as well. A lot of disk-copying services, it seemed?
[+] AnimalMuppet|6 years ago|reply
Ah, what days those were! Expecting the 68020 to be twice as fast as the 68000 at the same clock speed? Wow. Today we get table scraps of improvement. (On the other hand, the baseline is much higher now...)
[+] snvzz|6 years ago|reply
>(On the other hand, the baseline is much higher now...)

Which unfortunately hasn't resulted in a more responsive user experience.

Faster CPUs just meant that optimising code got less important. Now, we're stuck with a bloated stack, top to bottom.

[+] mhd|6 years ago|reply
Interesting to read about the NS32k. The only place I ever heard of it being used was the Ceres workstation at ETH Zurich, the native platform of Wirth's Oberon OS.
[+] cmrdporcupine|6 years ago|reply
Jack Tramiel's Atari Corp. evaluated the NS32k series for use in their Atari ST. They even built a prototype. But the processor had bugs and quality issues, so they went with the 68k instead. I understand NS did fix their issues, but by then it was too late, and it never caught on in anything but embedded applications (apparently used in laser printers...).
[+] wbhart|6 years ago|reply
The Zilog Z80000 wasn't introduced until 1986, but it was pipelined and in some respects about 6 years ahead of Intel. Apparently they are still used today in embedded applications, though there was only one moderately successful machine based on them back in the day.
[+] dfawcus|6 years ago|reply
I recall learning of the z80000 (via a datasheet) in '88, but never managed to find a machine using it.

I know of machines designed and built with the z8000, but I've only heard of the z80000 being used for embedded devices (possibly one main customer?).

Are you able to provide a reference to an actual general purpose computer built around the z80000?

[+] ThomasBHickey|6 years ago|reply
I still have one of the original Apollo/Domain (DN300?) motherboards. It has two Motorola MC68000L8 chips on it. It was explained to me that Apollo found they needed both of them to make virtual memory work. 32-bit data width, 16-bit data bus.
[+] that_jojo|6 years ago|reply
It's worth going into why the two CPUs.

Basically, the original 68000 isn't capable of cleanly restarting an instruction aborted partway through a memory access (a bus fault), and so there's no way to implement a standard demand-paged MMU.

So Apollo just chucked in a slave CPU that would detect the interruption of the master CPU, halt it, deal with any remapping or what have you, and then just completely reset the master CPU.

[+] erwan577|6 years ago|reply
This Byte article does not seem concerned with the choice of endianness (big-endian or little-endian).
[+] magoghm|6 years ago|reply
Endianness matters when you share binary data between computers with different architectures. That didn't happen that often in 1984.
[+] kabdib|6 years ago|reply
Honestly, byte order didn't matter much. There were so many other dimensions of incompatibility that byte order didn't enter the picture when choosing a processor for a product. Most data was exchanged via files written to floppy disks, and you just wrote conversion utilities (or programs dealt with save formats on the fly).