"Later in the 1970s, Engelbart lost his key engineers to Xerox PARC lab, a lavish and well-funded research center a few miles away. At the head was Alan Kay, 15 years Engelbart’s junior—an upbeat, brilliant guy who knew how to inspire people. The laboratory chief was Engelbart’s former funder from ARPA, Robert Taylor. For Engelbart, networks had always been an inextricable part of his vision. But under Kay’s direction, the engineers created a personal computer, geared toward individual productivity rather than collaboration. Their software included more user-friendly versions of a few of Engelbart’s original ideas, including multiple windows, text with integrated graphics, and the mouse. A cruel joke of the time was that Engelbart’s Augmentation Research Center had been a training program for PARC." [1]
SRI did a boatload of work: USPS ZIP code automation, BART cards, ATMs, distributed enterprise computing in the 1980s, etc. Unlike its peers, it kept defense contracts below roughly 40% of its overall business. Even PARC was mostly awash in DARPA funds.

http://liveblog.co/users/davewiner/2015/05/06/iWouldHaveHire...
The final stage of the creative process is dissemination, and this is where Xerox PARC failed. It is also where Kodak failed, when it did not capitalise on the digital camera, which it had invented.
The core issue is twofold:
1. Entrenched interests within the company. When Microsoft tried to develop its mobile strategy, the head of the Excel division deliberately made the mobile Excel experience a crappy one, as he had no faith in mobile to begin with.
2. Framing the new product within old paradigms. Kodak had it all: a head start on digital image capture, a partnership with Apple (the QuickTake 100), and tons of money. But despite developing a web-based dissemination platform for digital photos, they could not foresee the rise of social sharing. This was a tragedy, as their Box Brownie (released in 1900) had the same effect on the photo industry of its time as social sharing would later have. But Kodak was fixated on the mistaken notion that consumers would want to print their 'Kodak moment'.
If PARC had failed at dissemination, we'd all still be using the command line.
The windows/icons/mouse/menus paradigm disseminated just fine. It wasn't necessarily completely original - see also the Mother of All Demos - but PARC showed it could be implemented in real systems with real usability benefits.
Laser printers and networking also disseminated just fine.
What didn't disseminate was the development environment. That's not necessarily a surprise, because an environment that delights creative people with PhDs isn't going to translate well to the general public.
Ultimately Smalltalk, Mesa, etc. were like concept cars. You could admire them and learn from them, but they were never going to be viable as mainstream products.
Windows and Mac filled the gap by producing dumbed-down and overcomplicated versions of the original vision, which - most importantly - happened to be much cheaper.
Xerox get a lot of criticism for missing the potential, but it's easy to forget that in the 70s word processors typically cost five figures, and minicomputers equivalent to the Dorado cost six figures.
Maybe the future would have been different if someone had said "We need to take these ideas and work out how to build them as cheaply as possible."
But realistically commodified computing in the 70s ran on Z80s and still cost five figures for a business system with a hard drive.
The technology to make a cheaper version didn't exist until the 80s.
The problem since PARC is more that there has been no equivalent concentration of smart, creative, playful people. Technology culture changed from playful exploration to hustle and "engagement"; there's less greenfield space to explore; and - IMO - there are very few people in the industry who have the raw IQ and the creativity to create a modern equivalent.
It's pretty much the opposite problem. Instead of exploring tech without obvious business goals, business goals are so tightly enforced it's very hard to imagine what would happen if completely open exploration was allowed again.
>But Kodak was fixated on the mistaken notion that consumers would want to print their 'Kodak moment'.
Yes and no. They also invented the PhotoCD.
Kodak doubtless did a lot of things wrong. But transitioning from largely a chemical and film consumables business--they actually owned a chemical company at one point--to digital was always going to be tough. Very little of their expertise and distribution channels was transferrable to digital.
Fujifilm did better by applying its emulsion know-how to some healthcare products, and it also came out with a pretty nice line of mirrorless cameras. But it had tough financial times too, and it was a much smaller company trying to pivot.
At the time (~1999), when digital photography was starting to take shape on the consumer side, the only option for viewing digital photos was on a CRT computer screen. Also at the time, the idea of anyone other than “nerds” having a computer and wanting to view photos on it was completely foreign to executives in the company (this was the overwhelmingly popular view of almost everyone at the time, not just Kodak executives). IMO, this was by far the primary issue that caused Kodak to miss out.

Source: I worked in the Kodak HQ at this time.
A demo of the Mesa/Cedar system, the lesser-known of the Xerox systems, written in a memory-safe systems programming language of the same name, with reference counting, a cycle collector, and unsafe code blocks: https://youtu.be/z_dt7NG38V4

It replicates the Lisp and Smalltalk development experience and introduced ideas that inspired Oberon Gadgets, OLE, and OpenDoc. It had networking and file servers; the only downside was being a single-user workstation.

We as an industry are still catching up with that experience and that overall level of system safety.
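For anyone unfamiliar with that combination: plain reference counting cannot reclaim objects that refer to each other, which is why a cycle collector is paired with it. Here's a toy Python sketch of the general technique - the Obj/Heap names and the mark-from-roots scan are invented for illustration, not Cedar's actual algorithm:

```python
# Toy illustration of reference counting plus a cycle collector.
# A from-scratch sketch of the general technique, not Cedar's actual
# collector; Obj and Heap are invented names for the example.

class Obj:
    def __init__(self, heap):
        self.refcount = 0
        self.children = []      # outgoing references
        heap.objects.append(self)

class Heap:
    def __init__(self):
        self.objects = []
        self.roots = []         # stack/global references

    def add_ref(self, obj):
        obj.refcount += 1

    def del_ref(self, obj):
        obj.refcount -= 1
        if obj.refcount == 0:   # plain refcounting reclaims acyclic garbage
            for child in obj.children:
                self.del_ref(child)
            self.objects.remove(obj)

    def collect_cycles(self):
        # Mark everything reachable from the roots...
        reachable = set()
        stack = list(self.roots)
        while stack:
            obj = stack.pop()
            if id(obj) not in reachable:
                reachable.add(id(obj))
                stack.extend(obj.children)
        # ...and reclaim the rest, even when mutual references keep
        # refcounts above zero (the cycles that refcounting misses).
        self.objects = [o for o in self.objects if id(o) in reachable]

heap = Heap()
a, b = Obj(heap), Obj(heap)
a.children.append(b); heap.add_ref(b)   # a -> b
b.children.append(a); heap.add_ref(a)   # b -> a: a cycle
assert len(heap.objects) == 2           # refcounts never reach zero
heap.collect_cycles()                   # no roots reference a or b
assert len(heap.objects) == 0           # the cycle collector frees them
```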
Yes, Cedar was impressive, like the adult version of the Oberon system. I wonder if the source code of the Cedar environment is available somewhere. There is an interesting report by Atkinson (and Jacobi, who is on the panel in the posted video) where they migrated the environment to Unix using C as an intermediate layer. This is apparently the version used for the demonstration. But I couldn't yet find this source code anywhere. Maybe someone has a link.
I'd like to think these ideas have been around and well known for decades, and we're starting to see the hardware rise up to support them. If you had told someone in 2001 that Linux would be rewritten in, say, Java, with a garbage collector, it would have been dismissed as starkly inefficient. Nowadays we have hardware that makes the performance cost of certain safety features negligible (memory tagging, bounds checking, etc.). Nowadays we can make an OS out of Java and it will stand just fine. Or Rust. Faster hardware means we run slower (and safer) software. Not everything gets isolated into an ASIC, of course.
I worked for Xerox in 1979-1980. I saw first-hand how the general Xerox organization was too copier-focused to appreciate the treasure they had at PARC. It's ironic that every new hire was given a copy of "The Billions Nobody Wanted" (by John H. Dessauer), a bio of Chester Carlson, the inventor of xerography.
At the top, Xerox saw the "paperless office" as a threat to their copier business. They denied my company access to the technical information we needed to make full use of the $500K ($1.7M today) laser printer we bought from them. A pox on Xerox!
To their credit, every branch had a terminal linked to an SDS Sigma running the Control Program V (CP-V) timesharing system. I used it to write a BASIC program to calculate optimal copier proposals for salesmen.
For more reading:

Fumbling the Future: not all that good; reads like a B-school case study
Dealers of Lightning: excellent history
Inventing the Future (by me!): what it was actually like to be there in the trenches. Fictionalized, but all facts are accurate. Foreword by David C. Smith, the guy who invented icons. That's his son on the cover, playing MazeWar on an Alto.
I recommended Dealers in a parallel comment; because of the narrative, I'd argue it's one of the best tech histories ever written. While The Soul of a New Machine was better reviewed, Dealers was more compelling.
I did enjoy your book by the way - thank you for writing it.

A note for folks: the eBook of "Inventing the Future" is US$3.95, or it can be read on Kindle Unlimited. It's great when people price eBooks cheaply like this.
I'll add The Innovators by Walter Isaacson to the list as a general take on the people involved in the history of computing. It was well written, and the stories within were exciting to read.
What a remarkable list of achievements among the PARC alumni. It's hard to imagine that amount of computer-related talent ever being concentrated to that level again, especially given the rapid ascent of our field, driven in no small part by the innovation that went on at PARC.

- Sent from my (not quite) Dynabook
This article seems to have been digitized with OCR. I've found several glitches: Smalltalk-8O, 'dis play', and other mistakes caused by OCR that should have been caught by a spell check. Issues with quotes are apparent throughout the article: quotes ending with `“` instead of `”` (see the first paragraph); spaces between single and double quotes (`“ ‘` and `’ ”`) that make lines starting with quotes look odd; and paragraphs that end with a lone `”` on its own line (the second paragraph ends with a `”` on its own line).
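The glitches described are mechanical enough that even a crude scanner would flag most of them: a letter 'O' sitting next to digits, or unbalanced and oddly spaced curly quotes. A hypothetical sketch of such a check (the regexes are illustrative, not an exhaustive checker; a split-word test like 'dis play' would need a dictionary and is omitted):

```python
# Hypothetical sketch of a scanner for the OCR glitch patterns named
# above; the regexes are illustrative, not an exhaustive checker.
import re

CHECKS = [
    # Letter O or l adjacent to digits, as in "Smalltalk-8O".
    (re.compile(r"\d[Ol]|[Ol]\d"), "digit/letter confusion"),
    # A curly open quote with no matching close quote on the line.
    (re.compile(r"“[^”]*$"), "unbalanced double quote"),
    # A space wedged between double and single quotes, as in “ ‘.
    (re.compile(r"“ ‘|’ ”"), "space inside nested quotes"),
]

def scan(text):
    """Print one report per suspicious pattern per line of text."""
    for lineno, line in enumerate(text.splitlines(), 1):
        for pattern, label in CHECKS:
            if pattern.search(line):
                print(f"line {lineno}: {label}: {line.strip()}")

scan("Smalltalk-8O ran on the Alto.\nHe wrote: “ ‘dis play’ is split.”")
```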
I also found this quite interesting:
> One innovative use for the network had nothing to do with people sending messages to one another; it involved communication solely between machines. Because the dynamic memory chips were so unreliable in those days, the Alto also ran a memory check when it wasn’t doing anything else. Its response to finding a bad chip was remarkable: “It would send a message telling which Alto was bad, which slot had the bad board, and which row and column had the bad chips,” Thornburg said. “The reason I found out about this was that one day the repairman showed up and said, ‘Any time you’re ready to power down, I need to fix your Alto,’ and I didn’t even know anything was wrong.”
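A note on how a test like that can name a specific chip: in bit-sliced memories of that era, each bit position of a word was typically stored in a separate DRAM chip, so a failing address plus a failing bit position pins down one chip on one board. Here's a hypothetical sketch of such an idle-time scan; the geometry constants and helper names are invented for illustration, not Alto specifics:

```python
# Hypothetical sketch of an idle-time memory test in the spirit of the
# Alto anecdote: write a known pattern, read it back, and translate any
# failing (address, bit) pair into a board/row/column report. All the
# geometry constants here are invented, not Alto specs.

WORD_BITS = 16          # one DRAM chip per bit position (bit-sliced memory)
WORDS_PER_BOARD = 4096  # assumed words of storage per memory board

def check_memory(read_word, write_word, n_words, pattern=0xAAAA):
    """Scan memory during idle time; yield a report per failing chip."""
    for addr in range(n_words):
        saved = read_word(addr)
        write_word(addr, pattern)
        got = read_word(addr)
        write_word(addr, saved)             # restore the original contents
        bad_bits = got ^ pattern
        for bit in range(WORD_BITS):
            if bad_bits & (1 << bit):
                yield {
                    "board": addr // WORDS_PER_BOARD,
                    "row": (addr % WORDS_PER_BOARD) // 64,  # chip-internal row
                    "column": addr % 64,                    # chip-internal column
                    "bit": bit,             # identifies the chip on the board
                }

# Fake memory with one stuck-low bit, to exercise the scanner.
mem = [0] * 8192
stuck_addr, stuck_bit = 5000, 3

def read_word(a):
    v = mem[a]
    return v & ~(1 << stuck_bit) if a == stuck_addr else v

def write_word(a, v):
    mem[a] = v

for report in check_memory(read_word, write_word, len(mem)):
    print(report)   # -> board 1, the chip handling bit 3
```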
Alan Kay, one of PARC's most prominent researchers, recommends Waldrop's book The Dream Machine. Actually I think he has an account here on Hacker News.
I first came across this book in a library (a while ago) and basically read the whole thing on the spot. From my entirely biased perspective as someone with a "vintage computer" collection, it's the best book ever. The staggering magnitude of PARC's accomplishments and inventions is almost as fascinating as fiction. The ineptitude of Xerox management in figuring out what to do with the output of PARC was similarly unreal.
Bell Labs was able to commercialize (or at least license IP from) some of its work with reasonable returns, but it eventually lost momentum through its series of spinoffs, layoffs, and mergers. How does an organization successfully manage or balance that kind of research productivity with commercialization in the long term? Is there any management literature on that in particular?
I know some places like Microsoft and Google have had research programs for quite some time, although of course Google mostly fails to adequately support products that come out of Google X before killing them off or scaling them back (such as Glass), and I'm not sure how much of what Microsoft Research does is actually commercialized. Notably, Butler Lampson, who was one of the key folks at PARC, has been with Microsoft Research since 1995.
You've got to appreciate the irony of how an article about historical revolutionary technologies is buried under what has to be one of the most obnoxious web interfaces I've ever seen. And on top of all that, you realize the website is affiliated with the IEEE.

This thing is not even properly printable, even with Print Friendly. I like to imagine Alan Kay stumbling upon this and what his reaction would be...

Privacy Browser doesn't enable JavaScript by default, and reduces frustration with many sites.
The politics in academia and research are horrid. It's mainly narcissism that drives people to publish rather than hide their stash. Every such group I've seen has the publishers taking everyone's work and putting their names on it. Smart people go find other smart people, and the smartest make a name for themselves by distilling the collective wisdom.
But the incentives are wrong. They'd rather be known for the idea than do the work to deliver it (and love to blame the bean counters for their own lack of diligence).
Apple today is right to sequester information -- not because openness divulges secrets, but because it's culturally better to say it's not real until it's a product people can use. They've tamped down the glory-mongering of their pre-Jobs '90s, with its bloated Copland and Newton egos. To me that discipline, not Jony's aesthetics or Cook's efficiency, is what drove Apple to deliver the innovations they have in the 2000s.
My recent study of history leads me to conclude that had Douglas Engelbart[1] been allowed to keep working on NLS[2], we'd have gotten a far better result. Instead his staff was poached, and PARC was the far less than optimal result. History shows time and time again that the best outcome rarely happens. We still don't have a Memex[3] as proposed by Bush in 1945.

[1] https://en.wikipedia.org/wiki/Douglas_Engelbart
[2] https://www.darpa.mil/about-us/timeline/nls
[3] https://www.darpa.mil/program/memex
I've studied HCI and Engelbart's work for quite a long time, and I'm not sure I would fully agree that things would have been better had Engelbart continued working on NLS, or if NLS had won out.
One simple example is copy and paste. NLS had a highly complex and error-prone copy-and-paste system. A lot of the commands in NLS were modal (vs. modeless), and thus error-prone. See this article for more info:
http://worrydream.com/refs/Tesler%20-%20A%20Personal%20Histo...
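To make the modal-versus-modeless point concrete: in a modal design, the same keystroke means different things depending on hidden state, so forgetting the mode silently does the wrong thing. Here is a hypothetical toy illustrating that failure, not NLS's actual command set:

```python
# Hypothetical toy showing why modal commands invite errors: the same
# key does different things depending on invisible editor state. An
# illustration of the general failure mode, not NLS's actual UI.

class ModalEditor:
    def __init__(self, text=""):
        self.text = text
        self.mode = "command"         # hidden state the user must track

    def key(self, ch):
        if self.mode == "command":
            if ch == "i":
                self.mode = "insert"  # 'i' switches modes...
            elif ch == "d":
                self.text = self.text[:-1]  # ...'d' deletes
        else:
            self.text += ch           # in insert mode, keys are literal

ed = ModalEditor("hello")
# The user believes they are in insert mode and types "di":
ed.key("d"); ed.key("i")
# Instead of inserting "di", the editor deleted a character and
# silently changed modes -- a classic mode error.
assert ed.text == "hell" and ed.mode == "insert"

class ModelessEditor:
    """Every action has a dedicated, unambiguous gesture."""
    def __init__(self, text=""):
        self.text = text
    def insert(self, s): self.text += s
    def delete_last(self): self.text = self.text[:-1]

ed2 = ModelessEditor("hello")
ed2.insert("di")                      # no hidden state to get wrong
assert ed2.text == "hellodi"
```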
Another issue was the basic usability of NLS. Engelbart's vision centered on continuous bootstrapping, continually aiming to create more powerful and expressive systems. This vision was pretty good for technical people, but it also comes at the cost of basic ease of use and entry for newcomers. For instance, think about how many problems non-technical people still have with today's PCs and smartphones, many of which strive for "walk up and use" kind of usage.
Having been in research for over two decades now, my position is that we researchers will often identify the right problem and get the right general direction, but not always the right form. For instance, in the late 90s at Berkeley, folks correctly identified cluster computing as the future, and that we should aim to create abstractions to host Internet services to make them more scalable and robust. The specifics went in a very different direction (e.g. cloud computing, Kubernetes, etc), but the general thrust was right. As another example, my colleagues and I correctly identified rapid prototyping and testing tools for user interfaces as a major need, and while we inspired a lot of ideas with our work (pen-based sketching), industry went in a very different direction (see Figma and Axure and InVision).
I think the same is true with Engelbart's work. He was definitely a pioneer and helped open up people's imaginations as to what was possible with computing, but was off on a lot of the details and specifics that really matter for adoption. And that's ok because to a large extent, that's one of the many reasons we as a community and society do research.
"optimal" results are a fallacious indicator: optimal for now can mean a disaster in the near future, think about the actual just-in-time industry crisis for instance.
I do not say that's was happen since I do not really know nor NLS (of course I've seen the Mother of all the demos, but not much more especially in technical terms) nor Smalltalk workstations (I've dig a bit more there, emulation included, but lacking comparison with NLS...) however is a generally valid principles: in nature it does not win the better but the more adaptive. In humans terms some very good stuff might simply fade into oblivion because they are too ahead of their time for most people or are against someone's interests.
To spot a more modern example maybe a bit more easy to know by most HN-ers just see Java history, it's original idea and the subsequent practical fail due to the fact nobody really care the original idea adopting instead just the boilerplate. Plan 9 was another example.
IMVHO the real points was:
- pioneers have seen desktop computers, not services, with the human at the center, not a vendor or a service, more specifically connected people with powerful tool some in politics is named social tissue needed for democracy, something essentially all élites from all countries fear most;
- pioneers have cared about their idea, vision, not much their present time short terms needs, as a result most people have no idea about the future they envisage and have tried to meld what they are told about the future to their present created ridiculous stuff and encouraging some to go that way (see General Magik virtual office for instance);
- pioneers have not cared at all about scaling their gear, witch means how to make them en masse, creating a supply chain, ... as a result profiting from them was essentially an artisanal game no one can really sustain and at government level no one is much interested either.
Another today examples: we have seen in the recent 50+ years past various efforts to make small, cheap VTOL aircrafts, i.e. "flying cars". Now we see even financial institutions embracing them in their sauce, well so far such experiments have proven feasible and to a certain extent VERY interested for all. So far most think they are impossible. Useless. Unsafe. .... As a result the general reactionary behaviors of essentially any large community from élites (who feel well in general and fear loosing their status if something change) to generic human beings (who generally think the change is bad, not needed, they simply fear it if is a real change) annihilate or at least postpone in a very far future any real revolution.
Culture is typically the mean to make things happen but so far we seems to be unable (beside the fact that most do NOT want that happen) to really spread a substantial culture though the masses. We know how to do that in small cohort but not in larger ones. We still miss a way to connect those small cohort to the humanity enough to make the magic happen...
An absolute storm of Creativity where each Genius exponentially builds off of another. The Energy, Enthusiasm and Drive of these Geniuses leaps out at you.
But as usual, “a bunch of horse’s asses who don’t know anything about technology were making the decision” (direct quote), and they squandered it all.

Would it be possible to recreate a similar environment in any company today? I highly doubt it. Nobody approaches research the way it was done at Xerox PARC, Bell Labs, and HP Labs.

Some relevant quotes:
Some researchers say PARC was a product of the 1960s and that decade’s philosophy of power to the people, of improving the quality of life. When the center opened in 1970, it was unlike other major industrial research laboratories; its work wasn’t tied, even loosely, to its corporate parent’s current product lines. And unlike university research laboratories, PARC had one unifying vision: it would develop “the architecture of information.”
Since projects were not assigned from above, the researchers formed their own groups; support for a project depended on how many people its instigator could get to work on it.
“Systems research requires building systems,” he said. “Otherwise you don’t know whether the ideas you have are any good, or how difficult they are to implement.”
Since MAXC, the center has built prototypes of dozens of hardware and software systems—prototypes that sometimes numbered in the thousands of units.
“There was a lab where the Altos were getting built, with circuit boards lying around, and anyone could go in and work on them,” recalled Daniel H.H. Ingalls.
“If you’re dealing with marketing or planning people, make them kick the tires. All the charts and all the slides aren’t worth a damn.”
One reason that Xerox had such trouble bringing PARC’s advances to market was that, until 1976, there was no development organization to take research prototypes from PARC and turn them into products.
“The amazing thing about the PARC environment in 1976-77 was the feeling of power; all of a sudden you could create things and make lots of them. Not just one sheet, but whole books,” said Conway.
But some of those who left PARC recalled that a disillusionment had set in. They hadn’t been frustrated with the progression of their careers; rather, they had been frustrated with the rate of progression of their products into the real world.
I think people forget that it takes two types of people to move tech, or for that matter a company, to greatness. The product is not the star. There's the dreamer who comes up with a product, and then there's the one who sells the dream to the rest of us. They are usually BSers to the nth degree. They will lie to your face and you will love them for it. They are soooooo loved that some men and women dream of having their baby. Think IBM, Apple, Ford, Tesla, and many more.

Xerox PARC just didn't have a great BSer in-house.

But what actually makes those places so special? Why them, not somebody else? What made the difference between them and the rest?
Dealers of Lightning is a book I recommend to anyone who has worked in technology. It's an important story to understand: how you can have all the right people, and all the right technology, and literally be a generation ahead of everyone else, and still fail.
Having technology that can easily be commercialized at a reasonable price is more important than having the best, newest, or fastest technology - you can always iterate later.
The Mac was successful not because it was the best on a technical level - it wasn't - but because it hit the right price point and the right level of functionality, at the right time.
My guess at what really caused the rapid growth in the early days of personal computing, i.e., ~1985: replacing the typewriters.
The labor of using typewriters was enormous. So, replacing a typewriter with just a PC, maybe with only a floppy disk for file storage, and just an early, crude dot matrix printer saved a lot of money.
Next, electronic spreadsheets saved a lot of money.
Next, with just dial-up connections, we got email, which in many cases was a big improvement in speed and cost over what the USPS, FedEx, etc. provided.
...
Now, for shoppers, on-line shopping can often yield better results, with huge savings in time and effort, and often money as well.
Apparently now some investors are looking at the Internet and remote work as a way for workers and families to get huge savings on costs of housing and transportation.
Then, nearly everything in the usual parts and pieces of the economy is open to huge cost savings via automation based on computing and the Internet, in simple terms, having some robot do the work instead of a human. E.g., it's spring and time to plow the field and put in a crop of corn. So, click on an icon, and the tractor fires itself up, plows the field, and then puts in the seed. And the tractor was made nearly entirely by robots, maybe including some 3D printing.
Then in the supply chain, the corn goes to feed hogs, chickens, and to fatten up cattle, and the work there also gets heavily automated. Sure, cornmeal can also be used for tasty breading for fried fish and chicken!
Uh, maybe surprisingly soon, and quite broadly, we won't really need more farm workers but fewer, and, then, more developers of the automation...!

Developing all the needed automation is not so easy now; in particular, while it can save big on opex, it can require a big initial capex. So we need better tools for developing the automation -- robots, artificial intelligence, whatever we call it.

Hmm .... With the automation, we will (finally) have good numerical data on nearly all parts of the supply chain, from a mine, farm, forest, etc., all the way to the dinner table. Then we will notice that, with the good new numerical data, we can get some cost savings from the now-old subject of optimization, i.e., mathematical programming: linear programming, nonlinear programming, ... stochastic optimal control, etc.
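As a concrete flavor of that optimization step, the classic "diet problem" form of linear programming applies directly to the feed part of the supply chain above. A minimal sketch using SciPy; all prices and nutrient numbers here are made up:

```python
# Minimal linear-programming sketch of the feed-blending idea above:
# choose pounds of corn and soymeal to minimize cost subject to
# nutrition floors. All prices and nutrient values are made up.
from scipy.optimize import linprog

# Cost per pound: corn, soymeal.
cost = [0.09, 0.20]

# linprog expects A_ub @ x <= b_ub, so ">= floor" constraints
# are written with negated coefficients.
A_ub = [
    [-8.0, -44.0],   # protein per pound: need at least 200 units total
    [-3.0, -1.5],    # energy per pound: need at least 60 units total
]
b_ub = [-200.0, -60.0]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)   # optimal pounds of each feed and the total cost
```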
And with all the good data on the supply chain, we should be able to do some analysis and do well investing in the commodities markets.
It used to be, "Machines should work. Humans should think." Maybe it will be "Machines do the work, and humans enjoy life", e.g., pursue family, art, science, understand the universe, etc.
Uh, for that future, we have a lot of software to write!
In the mid-term, the replacement of typewriters with computers led to the secretary's job description going away (phone answering machines helped too). Somewhere I read (warning: this may be apocryphal) that secretaries were exactly one of the reasons Xerox never cashed in on the personal computer (or the Xerox Star): Xerox managers had secretaries to do their typing, and couldn't imagine doing it themselves; nor could they imagine giving their secretary a piece of hardware that cost as much as the secretary's yearly salary. (The Xerox Star cost somewhere in the range of $15k, I'm just guessing what a secretary's salary was back then. And never mind that factory laborers used machinery that cost at least that much.)
>My guess at what really caused the rapid growth in the early days of personal computing, i.e., ~1985: replacing the typewriters.
In business, word processing really came in with the minicomputer-based word processors from the likes of Wang and the other minicomputer makers. Though they were by no means universal.
When the IBM PC came out in 1981, yes people used it for word processing but spreadsheets, especially Lotus 1-2-3, were really the killer app.