
Don Estridge: A misfit who built the IBM PC

259 points | dshipper | 1 year ago | every.to

126 comments


tromp|1 year ago

It's rare that a tech story brings me to tears, but I couldn't help feeling one well up while reading the final paragraphs.

> Eventually, it was Wilkie who made the first move. Overwhelmed with emotion, his eyes red and swollen with grief, he stepped forward and detached the red rosette from the lapel of his suit jacket. It was the same one Don had given him years before. Leaning down, he gently placed the rosette on the casket.

It feels like there should be a movie made about this story...

tracker1|1 year ago

I feel the same way. The end of the story is just sad. I wish that more companies could break their own structures to offer rewards, bonuses and more freedom to teams like this. The kinds of people that thrive with these kinds of opportunities tend not to do as well with general corporate culture.

So many times a relatively small upstart team with enough freedom will accomplish greatness, only for corporate culture to completely destroy what was built.

illys|1 year ago

I love the story... But don't forget this story is a careful selection of events, with textual glue and interpretation, to make it feel like a novel.

Some statements belong more to the glue than to History, and they should remind us that this is a real-life-based *novel*. I especially noted this one: "nobody at IBM had any real experience with [microcomputers]".

IBM senior management was certainly reluctant, but "nobody"... They even had microcomputer products that hit the market:

- IBM 5100 (1975), IBM's first personal computer

- IBM 5110 (1978), an updated 5100 aimed at a larger market

- IBM System/23, developed in parallel with the IBM PC and released one month earlier, in July 1981: many IBM PC features are shared with or taken from it (an 8-bit processor from Intel's 8080 family vs. the PC's 8088, the very same expansion connector, reuse of expansion cards such as the serial card, and essentially the same keyboard, just in a different case and with different function keycaps...)

phkahler|1 year ago

>> His divisional heads always had the same answer. Microcomputers—home computing—were a fad. They were low-cost and low-profit. Let others scrabble around in the metaphorical dirt of home computing. The real money was in the markets that IBM’s divisions already dominated—selling vast mainframes and minicomputer systems to large businesses. Cary was even told to buy Atari, which by then had established itself as America’s home video game system of choice. That’s all home computers were good for: gaming.

This attitude was so short-sighted. A friend of mine's dad was using their Apple II for work-related spreadsheets and thought it was the greatest thing ever. Not sure how IBM folks could not see this opportunity just because it was smaller scale than "what they did". 20 years later, Intel seemed to miss the mobile market due to a similar attitude.

dkarl|1 year ago

None of those division heads were trying to honestly assess the microcomputer market. They were trying to stay in harmony with opinion at their level and higher in IBM.

That's what you get at that level in a company that big. Anyone who is two or more levels from the top of the org chart and also two or more levels from the bottom lives in a reality that consists entirely of the attitudes and opinions of other people, weighted by each person's ability to impact their career. If they saw that the building they were in was on fire, their thought process would go something like: "Bob isn't here today because he's at that sales meeting. When he hears about the fire he'll downplay it as something minor, so I shouldn't evacuate or he'll think less of me. But Bob's boss Don is here. If Don evacuates and I don't, that might make Don feel embarrassed and emasculated, and he'll take it out on Bob. So I need to evacuate if and only if Don evacuates. Bob won't mind me evacuating if Don does it. But Don's office is on the other side of that wall of approaching flames. Shit. My only chance is if he's in a meeting on this side of the building, so I can track him down and see what he's doing. Let me check his calendar real quick...."

LaundroMat|1 year ago

It's close to what Clayton Christensen describes as disruptive innovation (his examples were the steel industry and radios): incumbents are forced higher up the chain by low-quality competitors ("home computers are only good for gaming") that answer an unanswered need well enough. Once these competitors gain a foothold, quality improves and incumbents have less and less of a market.

akira2501|1 year ago

> Not sure how IBM folks could not see this opportunity just because it was smaller scale than "what they did".

I encourage everyone to get a copy of the Hercules emulator and a copy of the "Turn Key 5" MVS distribution and spend a little time using it. The mainframe idea of "computing" and "running jobs" is so comprehensively different it's really hard to map any previous consumer computing experience into it. It's also just a lot of fun because of that.

The whole experience is centered around efficient use of machine resources while providing a comprehensive batch execution and scheduling system for centralized job execution in this environment. The level of accounting, reporting, repeatability, and job language features is actually something worthwhile to dive into.
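For a flavor of how different that model is, here's a minimal sketch of a classic MVS batch job in JCL, roughly what you might submit on a Hercules/TK5-style system (the job name, account field, and dataset name are placeholders, not anything specific to TK5). It runs the standard IEBGENER utility to copy one dataset to spooled output:

    //MYJOB    JOB (ACCT),'COPY EXAMPLE',CLASS=A,MSGCLASS=X
    //* Run the IEBGENER utility: copy the SYSUT1 input dataset to SYSUT2.
    //COPYSTEP EXEC PGM=IEBGENER
    //SYSPRINT DD SYSOUT=*
    //SYSIN    DD DUMMY
    //* SYSUT1 names the (placeholder) input; SYSUT2 sends the copy to the spool.
    //SYSUT1   DD DSN=YOUR.INPUT.DATA,DISP=SHR
    //SYSUT2   DD SYSOUT=*

You submit the whole thing to the job entry subsystem, which queues it by class, schedules and runs it, and files the output and a per-step condition code in the spool, with accounting along the way. It's about as far from double-clicking an app as computing gets.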

In any case, I'm willing to bet that IBM's internal ideology was that end users wouldn't want to do the computing themselves, but would instead go to middlemen who would purchase computing either directly from IBM or as some form of "remote job entry" through a third-party provider. To that end, they were rapidly building out the infrastructure to do just that.

> Intel seemed to have missed the mobile market due to a similar attitude.

In both cases, they're still here, although Intel did a much better job of catching up to their past mistakes.

calrain|1 year ago

This IBM viewpoint of PCs is well entrenched in enterprise IBM mainframe support teams.

I remember in 2010, having to present to a team of Mainframe techs at a bank about how we would be integrating an Identity solution (that ran on Windows servers) into their Mainframes.

They couldn't stop making comments about how useless Windows is and how it's just a gaming platform. One guy ranted and raged so hard that he stood up and stormed out of the room.

I remember my Project Manager, who was in the meeting, looking at them and saying, 'Guys, we talked about this earlier'...

I can see where that mindset comes from; these guys have been drinking the IBM Kool-Aid for a long time.

ordu|1 year ago

> Not sure how IBM folks could not see this opportunity just because it was smaller scale than "what they did".

Bureaucracy can be like that. Big bosses who might be really interested in increased profits rely on their subordinates to see the market, but subordinates are risk averse and don't want to change anything. Add corporate politics, people fighting not for innovations or for a market share but for promotions, and you'll get the picture.

It seems to me that, besides all that, they were ideological: they believed that size matters and scorned those who made computers smaller than theirs. Ideology means people have trouble seeing anything that contradicts it. Peer pressure, social desirability, and all these things reinforce individual biases.

II2II|1 year ago

>> His divisional heads always had the same answer. Microcomputers—home computing—were a fad. They were low-cost and low-profit. Let others scrabble around in the metaphorical dirt of home computing.

Those views seemed to be relatively common. Just look at those home computer upstarts, many of which were scrambling to make business machines. Apple's follow-ups to the Apple II were the Apple III and Lisa, both intended for business. Commodore seemed to have business computers on their plate most of the time. Tandy also pursued the business market. TI was a bit of an outlier in that they were into minicomputers before personal computers, and quickly jumped ship when the latter turned out to be low profit. Maybe it was different in Europe, but certainly not in North America.

I'm not entirely sure they were wrong either. A lot of companies rose then fell in the home computer market. IBM themselves haven't pursued the home market in decades at this point. Many, if not most, segments consolidated to the point that there was just one company with any meaningful market share. It could even be argued that the real money these days isn't in the hardware or software for the home market, but in the services they enable access to.

tracker1|1 year ago

They hamstrung and killed OS/2 similarly.

ghaff|1 year ago

In the case of Intel, based on what I saw, they were just desperate/convinced to turn the x86 into a beachhead for mobile ("but Flash will work the same!"), but that ended up not making sense.

insane_dreamer|1 year ago

> This attitude was so short sighted.

Reminds me of Kodak.

contingencies|1 year ago

A friend of mine's father was the head of Digital[0] in Australia and was later sent to Boston after being promoted. I distinctly recall speaking to him around 1995 regarding Linux. He, along with I believe a large number of commercial Unix vendors, turned his nose up at Linux, suggesting it was a passing fad and would never challenge their "serious" Unix. This is interesting because Jon 'Maddog' Hall[1], then CTO of Digital (before it was acquired by Compaq in 1998, acquired in turn by HP in 2002), certainly did get it... I interviewed him once in Sydney circa '99 and had a good long chat once in Taiwan circa '01 after crossing paths by chance. He was traveling the world proselytizing Linux in shorts and flip-flops and had a firm belief in embedded Linux changing the world (Android[2] wasn't released until nearly a decade later, in 2008), but was yet to announce he was gay (that took another decade). Fast forward 30 years: practically nobody younger than 40 has even heard of the company, Linux is in every household, and the very idea of a commercial Unix is a joke.

Furthermore, in perfectly delicious irony, IBM's own modifications to Linux[3] to support the allocation of workloads to its giant server hardware have enabled the popularization of containers, further reducing demand for server equipment, increasing portability between desktop and server environments, and substantially driving down the cost of provisioning for cloud services - the arch-rival to the traditional mainframe mentality. Today, in a world awash with dirt-cheap and ever-present processing power and storage, as well as recently unimaginable levels of connectivity, we stand almost at the point where the term "server" itself has become an anachronism and consumption-oriented devices draw consumers toward "services" (often as paid-for subscriptions).

IMHO some industries which will look nothing like today's version in 30 years' time: food, oil, transport, construction, clothing, health, and education. Carpe diem.

[0] https://en.wikipedia.org/wiki/Digital_Equipment_Corporation [1] https://en.wikipedia.org/wiki/Jon_Hall_(programmer) [2] https://en.wikipedia.org/wiki/Android_(operating_system) [3] https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/lin...

phone8675309|1 year ago

> Not sure how IBM folks could not see this opportunity just because it was smaller scale than "what they did"

They thought it was a fad - that centralized systems (coincidentally, the machines they made) would be the computing platform that people would pay per minute/hour/month to access remotely. They wanted to be an information utility - a supplier to all - instead of selling a small, low-margin box for one-time revenue.

"It is difficult to get a man to understand something, when his salary depends on his not understanding it." - Upton Sinclair

dreamcompiler|1 year ago

I was living in Texas at the time of Flight 191. I was no fan of the IBM PC but it was still gut-wrenching to hear that the father of the machine had been killed.

https://en.m.wikipedia.org/wiki/Delta_Air_Lines_Flight_191

This was the crash that brought the term "microburst" into the national consciousness.

mixmastamyk|1 year ago

Wow, the wiki page is nearly as good a read as the professionally written article. I broke the IBM-related text under the Passengers section out into its own paragraph, as it changes from statistics to specifics. Hopefully it will remain that way.

nedrylandJP|1 year ago

There is a "high tech middle school" with his namesake in Boca Raton, FL, next to the former 1960s IBM R&D complex.

https://www.palmbeachschools.org/DonEstridgeMiddle

stevenwoo|1 year ago

Reading "Boca Raton" reminded me: they used the names of cities close to the offices as internal names for products. I co-oped at IBM in Austin in the late 1980s, where we tested the new models of the PC Jr/Portable/PS/2 and peripherals. The internal codenames were things like Boca Raton and Cedar Key (which none of us got until our boss told us), but I can no longer remember which was which.

yardie|1 year ago

Well, TIL. When I see a school named after someone who is not widely famous I assume they were an educator or politician.

AlbertCory|1 year ago

One consistent theme you get from business history is:

There is very little penalty for being wrong. There is often a huge penalty for being right, if the powers-that-be opposed you.

elzbardico|1 year ago

One of the biggest advantages of being young is still not knowing how real this can be.

doubloon|1 year ago

That's why you don't take credit for anything.

yardie|1 year ago

I grew up in South Florida in the 80s and 90s. I was familiar with the IBM office in Boca Raton, nicknamed T-Rex, and had a few school friends who worked at IBM on the IBM PC. From what I can remember, the Boca campus was like garden leave. IBM sent you there when they didn't want you but couldn't fire you. So it was full of IBM misfits who were thrown out of HQ. I never made the connection to Flight 191, and assumed it was because of Hurricane Andrew. But once the PC market took off, IBM wanted that team brought back into the fold. A lot of my friends moved to Cary, NC, more famously known as the Research Triangle.

Miami, and South Florida overall, is kind of a crazy place to be. Every couple of decades people out west or from up north rediscover we actually exist. There are good engineers here but the West and Northeast have loads of money. So once CS/SWE really took off as a career the companies down here couldn't/wouldn't compete. Trust me, if you were an Asian/Indian kid in Florida in the 90s and told your parents you wanted to work in software they were going to beat some sense into you.

I've watched money flood into the area and then get carried back out when the financial tides changed. I always imagined Miami could have been something like a Silicon Valley, but the politics, money, and geography work against it.

ghaff|1 year ago

The IBM PC in particular was probably never especially significant to IBM as a whole, whatever its effect on the computer industry generally.

As someone who had IBM as a client for a number of years, we observed that there seemed to be a lot of IBM folks who basically ended up in some Siberia in one form or another.

selimthegrim|1 year ago

I always console myself that one day New Orleans will be Miami with drinking water.

nabla9|1 year ago

Classic example of Worse is Better.

All of the competing architectures were better than the IBM PC's: the PC BIOS was bad, the chosen processor instruction set was the worst, and the MS-DOS operating system was bad. Only the keyboard was good.

What made it a winner was the open architecture, the 80-column screen, and the IBM name.

forinti|1 year ago

I did some historical research to understand why the PC caught on (it made no sense to my 1980s teenage mind).

A PC with an 80-column card, 64KB of RAM, and a floppy drive cost about the same as an Apple II Plus with the same specs (US$2,700).

A BBC Micro would set you back about US$1,500 (£900). It didn't offer slots, but did have 80 columns standard. It also had a lot of ports.

You couldn't even argue that the 8088 was much faster than the 6502. BASIC ran a lot faster on the 2MHz Beeb than on the PC.

The only thing that makes sense to me is that the people who bought it on launch were planning to use more than 64KB of RAM (which was rather expensive then).

mixmastamyk|1 year ago

Besides their name, they chose their market correctly, i.e. where the money/momentum was—small business. Third-party support was great.

Most people in the early 80s had no idea what to do with a computer at home; that's why they were mostly bought by enthusiasts, tinkerers, and gamers. I remember one of the main uses listed on the boxes was "keeping track of recipes." Haha, imagine spending thousands of dollars on a giant clunky thing to organize recipes when a box of index cards would do.

garius|1 year ago

The keyboard wasn't great either, to begin with!

deater|1 year ago

I'm a bit curious about this part of the article:

> Unlike all of its major rivals—including the Apple II—the IBM PC was built with an open architecture.

The Apple II, designed by Woz, is famously open, to the point the original model came with full schematics and ROM listings which made it trivially cloneable. I'm curious why this isn't considered an open architecture.

nabla9|1 year ago

The Apple II had a proprietary design. You could not clone it legally.

Just like posting source code does not make the code open, publishing schematics does not make the design open.

KerrAvon|1 year ago

This part is factually incorrect:

> The easiest way to set that standard wasn’t just to sell machines; it was to let other companies sell parts, software, and even whole computers that would be compatible with your machine. Unlike all of its major rivals—including the Apple II—the IBM PC was built with an open architecture.

The Apple II was effectively as open as the PC. And IBM didn't want clones any more than Apple did. Both the Apple II and the PC were eventually legally cloned, and neither company could do anything about it.

ghaff|1 year ago

It might make an interesting business book--maybe I'll write it--on what realistic business strategies companies that are widely viewed as failures could have followed through industry changes without boards/shareholders revolting.

I'm not even sure IBM is a great example. It had a really rough stretch but is still there as a very profitable dividend-paying large corporation even if it's not considered cool.

kennethrc|1 year ago

> "The system would do two things. It would draw an absolutely beautiful picture of a nude lady, ..."

Lena? (https://en.wikipedia.org/wiki/Lenna)

garius|1 year ago

I did try and find out if it was. Sadly couldn't find anything to confirm it!

mixmastamyk|1 year ago

Probably monochrome for the very first prototype.

mdavid626|1 year ago

But she is not nude in this pic, right?

jes5199|1 year ago

There's a sense in which IBM was right to fear the PC - it did, in fact, kill their main industry, and they were not able to compete well in the new space, despite defining the standard. Maybe they could have pursued it more enthusiastically and done better in the 1990s, but it still would have been fighting against the tide.

fortran77|1 year ago

It also killed their very successful Selectric business!

drumdance|1 year ago

In 1981 my sister was just out of college and worked on the PC as her first job. She said the loss of Estridge was devastating, and IBM changed some of their policies around executives traveling together because of it.

doubloon|1 year ago

Well, in the long run the naysayers were right. The personal computer business is strewn with dead companies scrounging for pennies. It is basically a loss leader. FAANG - which of those make PCs? Oh right, none of them except Apple, which has like a 1 percent PC market share that is used to make apps and videos for phones.

My favorite screwdriver shop (PC parts, cases, CPUs, fans, etc.) just closed. One of the last in the city. Decline in business.

zabzonk|1 year ago

I believe that Estridge was being head-hunted by Apple as CEO before they eventually hired Sculley. The sad thing is that if they had hired him, he might even be alive today, but he preferred to stay at Big Blue.

rvense|1 year ago

I'd never heard of this, but it's in the article, too. I think the tech world would have looked very different if Estridge had taken over instead of Sculley.

djmips|1 year ago

A lot of the time, I really am turned off by these articles in short story form but this one flows well. Good job author!

labrador|1 year ago

The end of this article is so beautifully written it made me tear up

mellosouls|1 year ago

For people who are interested in this era, Halt and Catch Fire is a terrific portrayal of the sorts of characters and battles that defined it, albeit from more of a startup perspective.

https://www.youtube.com/watch?v=pWrioRji60A

pan69|1 year ago

I really enjoyed the first season (especially the first couple of episodes) as the focus of the story is the release of the product and the struggles associated with it.

The second season seems to become the typical personal drama / relationship / betrayal / writers' kung-fu story arc / etc. that every series comprising more than one season seems to spiral into these days.

So, highly recommend the first season!

mistyvales|1 year ago

Came to say the same thing :D

The character of Cameron was highly inspired by Romero. In fact, the book Masters of Doom is kind of a blueprint for the show in some ways

endofreach|1 year ago

One of my favourite shows of all time. Wish there was more of it...

Thanks for the reminder to rewatch it. I really need that show now.

st3ve445678|1 year ago

Fantastic show, highly recommend.

aspenmayer|1 year ago

This is the first time I've heard someone refer to DEC as "Digital." Is that an Australian quirk? Not that it's wrong, as it is part of the name and could likely be a regional expression and/or historically accurate, and in any case it's before my time in the industry.

> but was yet to announce he was gay (took another decade)

I don't know why this detail was included; not that it's anything to be ashamed of. It just doesn't seem relevant at all to the other points you have raised, and seems a bit insensitive or judgemental imo.

p_l|1 year ago

It's a Digital thing.

IIRC, the company itself promoted the use of "Digital" rather than "DEC", with the latter being an accident of funding availability when it was founded.

spacechild1|1 year ago

Ha! I had no idea what parent meant with 'Digital', thanks for clearing things up :)

contingencies|1 year ago

PC shutdowns have no place on HN; we deal in facts. It was included because I personally felt that it was an interesting historical tidbit... especially since he'd flipped from suits to flip-flops and from commercial Unix to Linux, but still kept this secret. No doubt in those days executives with career plans probably had to keep such things unknown. Can't comment on Digital analogues.

Re. the classic PC-shutdown, emotive, mud-slinging response below, check again: I did not state an opinion; I did exactly the opposite. Unsure what you think you are "calling me out" on - all I expressed was a (very public) fact[0] (furthermore, one about which discussion has been intentionally invited), and when questioned, I explained that the purpose was an interest in sharing that fact with others, since it is historically notable with respect to its distinction from the current era. If my interest in sharing is now a matter subject to your offense, feel free to be offended, but don't post about it.

[0] https://www.linux-magazine.com/content/view/full/55727