Teongot | 2 years ago
> When it started testing simulations of early Pentium prototypes, Intel discovered that a lot of game designers had found that they could shave one instruction off a hot loop by relying on a bug in the flag-setting behavior of Intel's 486 microprocessor. This bug had to be made part of the architecture: If the Pentium didn't run popular 486 games, customers would blame Intel, not the game authors.
Does anybody know the details of this?
loup-vaillant|2 years ago
> Sadly, my source for this was a former Intel chief architect, and I don’t think he ever said it anywhere that was recorded. […]
EdwardCoffin|2 years ago
If you're more interested in just concrete examples of this kind of thing, apparently IBM's System/360 team ran into heaps of this kind of issue when emulating the IBM 1401. It's mentioned in Frederick P. Brooks, Jr.'s book The Mythical Man-Month in the Formal Definitions section of Chapter 6, and I think he probably discussed it in more detail in his (and Blaauw's) book Computer Architecture.
Edit: pasting in a cleaned-up version of the relevant part of the transcript:
1:04:56 It's required to run every one of those. Well, how do I know it does? I can't test them all. So you say, well, as long as you design to the architecture spec, that should be enough, right? Ha, no. For example, inside the architecture spec there are places where it says this condition flag is undefined as a result of this operation. So you'll do – I don't know what it was anymore, an add operation or something; no, it can't be add, pick a different [thing] – there was some instruction that would say: I do not guarantee what the carry flag will look like when I'm finished. And you go, as an architect, hey, that's cool, it means I can do it either way, whatever way is easiest. No it doesn't.
Yeah, if you think that, you're going to get in big trouble. Because what will happen is – and this literally happened, which is why I know about this – you put the chip out, and then you discover: oh, it was easiest for my team to set the bit to a 1. Didn't matter, because it was undefined, right? I get to pick. But all the previous chips were setting it to a zero, although they were calling it undefined. Now you're in trouble, because what you're going to discover is that some goofy app out there required that the bit be a zero after that operation, even though the book said it was undefined. And they didn't notice, because up until now it always was a zero. But your chip comes out, and the software doesn't work anymore. Guess who's at fault? You are. Can you go "hey, look at what the book says, can't you read?" They'll say "I don't care what you say, your chip doesn't work with my software, you're a loser, your chip's busted."
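The hazard Colwell describes can be sketched with a toy example (all names and behaviors here are hypothetical, not any real instruction): a spec leaves a flag undefined, two chip generations legally pick different values, and a program silently depends on the old chip's choice.

```python
# Toy sketch of the undefined-flag compatibility trap. "MYSTERY" is a
# hypothetical instruction whose spec says the carry flag is undefined
# afterwards; both implementations below are spec-compliant.

def old_chip_mystery():
    # The old chip's team happened to clear carry (spec-legal).
    return 0

def new_chip_mystery():
    # The new chip's team found it easier to set carry (also spec-legal).
    return 1

def run_program(mystery_op):
    """A 'goofy app' that branches on carry after MYSTERY, even though
    the book says the flag is undefined at that point."""
    carry = mystery_op()   # spec: carry is undefined here
    if carry == 0:         # accidental dependence on the old behavior
        return "works"
    return "crashes"

print(run_program(old_chip_mystery))  # -> works
print(run_program(new_chip_mystery))  # -> crashes; the new chip gets blamed
```

The app's authors never noticed the dependency, because on every chip they ever ran it on, the flag "happened" to be zero.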
[1] https://youtu.be/jwzpk__O7uI?si=iy23ZM5tQX-hI87C&t=3903
[2] https://www.sigmicro.org/media/oralhistories/colwell.pdf