
Instruction decoding in the Intel 8087 floating-point chip

72 points | pwg | 15 days ago | righto.com

16 comments


Dr_Jefyll|11 days ago

> when an ESCAPE instruction is encountered, the 8086 processor starts executing the instruction, even though it is intended for the 8087. The 8086 computes the memory address that the instruction references and reads that memory address, but ignores the result.

Intriguingly, the 65C02 also shows this behaviour. Several of the C02's undefined opcodes cause an address to be computed and a fetch to occur (whose data is then discarded). The spurious address computation wasn't intended as a feature, but an opportunity nevertheless exists. My 1988 KK Computer uses it as part of a co-processor scheme in which microcoded external logic gives the 65C02 six new registers and 44 new instructions, including ITC NEXT (as used by Forth).
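For reference, the 8086 side of the quoted handshake is easy to spot in the opcode map: ESC instructions occupy opcodes 0xD8 through 0xDF, i.e. the top five bits are 11011. A minimal sketch (the field layout is from Intel's documentation; the function name is just illustrative):

```python
def is_esc(opcode):
    """Return True if this 8086 opcode byte is an ESC (coprocessor) opcode.

    ESC opcodes are 0xD8-0xDF: the top five bits are 11011. The low three
    bits, together with the reg field of the following ModR/M byte, form
    the opcode the 8087 actually executes; the 8086 itself only computes
    the ModR/M memory address and performs a dummy read.
    """
    return (opcode & 0xF8) == 0xD8

print(is_esc(0xD9))  # FLD and friends -> True
print(is_esc(0x90))  # NOP -> False
```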

Repeatedly posted to HN. :-) If anyone's interested and hasn't yet seen it... https://news.ycombinator.com/item?id=26432154

burnt-resistor|11 days ago

An FPU coprocessor doesn't make an 8088 run any faster in general; it only accelerates floating-point calculations, which matter for a subset of software.

The original DOOM used fixed-point math so it could run on machines lacking an FPU, such as the 386 and 486SX. MSFS 1.x through 5.x didn't require a coprocessor either. Falcon 3.0 and related sims (MiG-29, F/A-18) likewise required only a 286 and used an FPU only optionally.
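DOOM's fixed-point format was 16.16: 16 integer bits, 16 fraction bits, with multiplication done in a wider intermediate and shifted back down. A minimal sketch of the idea (DOOM's own FixedMul in m_fixed.c does essentially this in C; Python's unbounded ints sidestep the overflow handling):

```python
FRACBITS = 16
FRACUNIT = 1 << FRACBITS  # 1.0 in 16.16 fixed point

def fixed_mul(a, b):
    # Multiply two 16.16 fixed-point numbers: the raw product has
    # 32 fraction bits, so shift 16 of them back off.
    return (a * b) >> FRACBITS

# 3.0 * 2.0 in fixed point:
print(fixed_mul(3 * FRACUNIT, 2 * FRACUNIT) // FRACUNIT)  # 6
```

All of this runs on the plain integer ALU, which is why it works on an FPU-less 386SX.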

Here's an approximate list of x87-usable (required or optional) software: https://ctrl-alt-rees.com/2019-06-06-list-of-software-that-u...

The R:Base System V port to OS/2 (FPU optional) was essentially a rewrite from FORTRAN to C, and it ran into problems because Microsoft's floating-point emulation library was substandard compared to the one(s) Microrim had used previously.

kens|15 days ago

Author here for all your 8087 questions...

pwg|15 days ago

Ken,

Way back (circa 1988) I remember a digital logic professor giving a little aside on the 8087, remarking that it used some three-value logic circuits (or maybe four-value): instead of being all binary, some parts used base 3 (or 4) to squeeze more onto the chip.

From your microscopic investigations, have you seen any evidence that any part of the chip uses anything other than base 2 logic?

rogerbinns|15 days ago

Do you know what other prior systems did for co-processor instructions? The 8086 and 8087 must have been designed together for this approach to work, so presumably there is a reason they didn't choose what other systems did.

It is notable that ARM designed explicit co-processor instructions, allowing for 16 co-processors. They must have taken the 8086/8087 approach into account when doing that.
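Unlike the 8086's implicit ESC scheme, classic ARM reserved an explicit 4-bit coprocessor number field, bits [11:8], in its CDP/MCR/MRC/LDC/STC encodings, which is where the 16-coprocessor limit comes from. A small sketch of pulling that field out of an instruction word (the example encoding is the well-known CP15 ID read, MRC p15, 0, r0, c0, c0, 0):

```python
def coproc_num(insn):
    # Bits [11:8] of classic ARM coprocessor instruction encodings
    # (CDP/MCR/MRC/LDC/STC) hold the coprocessor number, 0-15.
    return (insn >> 8) & 0xF

# MRC p15, 0, r0, c0, c0, 0 -- read the CP15 main ID register.
print(coproc_num(0xEE100F10))  # 15
```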

iberator|11 days ago

What's the point of all this? It's 30 or 40 years late to the game.

I mean: ZILOG probably already did it all back in 1982.

drob518|11 days ago

Fascinating. The old designers were amazingly creative with their limited resources. The decoding logic for the 8087 is… ahem… interesting in how it interacts with the 8086. Amazing that it works at all. It would be interesting to see the actual microcode source and how some of the algorithms are implemented.

I always enjoy your write ups, @kens. I was doing hardware design in the late 1980s/early 1990s, and so this takes me back.