One of the things I love about Engelbart's work is that it does not obfuscate the underlying data. You have the data structure at your fingertips, and it's up to you how you display or interact with it. If you've got a graph, you interact with it as a graph. If you create a list, you can interact with it as a list.
I am trying to implement something in a similar vein[1].
On December 9, 1968, Douglas Engelbart did something that I believe was a huge leap into the future. He didn't just invent the mouse, but a whole way of experiencing and interacting with a computer that was unheard of.
Does that sort of thing still happen today? Of course new things get invented, and new ways to interact with media and a computer, such as Leap Motion, the Myo gesture-control armband, Siri, and I'm sure there are tons of other examples out there. But they seem to be what the mouse was in Engelbart's presentation: just a part of the larger picture.
This presentation was on par with the likes of Gutenberg's press, whoever first smelted metal, Einstein's relativity, and a handful of other simple concepts with singularity / black-swan level impacts. Great improvements & refinements occurred to be sure, but the base concepts were utterly new and staggeringly impactful.
Presumably he was picking the low-hanging fruit. By now, all the obvious or easy ideas have been done a hundred times over. It's always harder to make progress in an older field than a newer one.
He loses me a bit when he starts talking about Engelbart's intent. So Engelbart didn't demonstrate a precursor to Skype and screen sharing, just because there's only one mouse pointer to control in Skype? Ok, fine. Then he demonstrated the precursor to Screenhero, which has multiple mouse pointers and embodies Engelbart's intent exactly. Engelbart indeed demonstrated precursors to all these modern technologies, despite the author claiming otherwise.
Is there a better-quality video available anywhere? I've always thought that the quality of the one copy we have floating around the net is garbage. I wanna read the text!
That's an awesome chord keyboard - http://www.youtube.com/watch?v=yJDv-zdhzMY#t=2039 I've always wondered why more people don't use them. It seems significantly less prone to errors - instead of fat fingers or incorrect placement, you have to coordinate the timing between the fingers.
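The way chording trades finger placement for timing can be sketched roughly: a chord is the *set* of keys pressed together, looked up in a table once they're released. The table below is invented for illustration; it is not Engelbart's actual mapping.

```python
# Toy chord decoder: each chord is the set of keys held down together,
# so what matters is the combination, not precise placement.
# The CHORDS table is hypothetical, purely for illustration.
CHORDS = {
    frozenset("a"): "a",
    frozenset("ab"): "e",
    frozenset("abc"): "t",
    frozenset("bd"): "s",
}

def decode(pressed_keys):
    """Return the character for a chord, or None if unrecognized."""
    return CHORDS.get(frozenset(pressed_keys))
```

With five keys you get 31 possible chords, enough for the alphabet, which is why a one-handed device can cover full text entry.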
It's pretty amazing how much that video demonstrates. I wonder what the next version of that video will be. Hopefully it's not computer related.
I'd like to try one, but there appears to be basically no such thing as a Bluetooth-enabled one-hand chord keyboard that you are intended to hold in your hand (as opposed to having on a table), and that's the use case I'm interested in: an input device for my augmented-reality glasses while I'm walking around. On that note, if they're ever going to make a... well, they can't really make a comeback if they never made an appearance at all... an appearance, that's probably the scenario that will drive them. Voice may cover casual usage, but when you really need to go to town you're going to need something more, and no current input device can meet that need.
It wouldn't be very useful without NLS. It wasn't intended for text entry but to select commands to execute, while the mouse selected the target for the commands; i.e., you would enter DW for "delete word" with the chorded keyboard and use the mouse to select the word to delete.
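That verb-then-target interaction can be sketched roughly. The command table and helper names here are hypothetical, not the real NLS command set:

```python
# Sketch of the interaction the comment describes: the chord keyboard
# enters a command code ("DW" = delete word) and the mouse supplies
# the target. Names are illustrative, not actual NLS commands.
def delete_word(words, index):
    """Remove the word at the given position."""
    return words[:index] + words[index + 1:]

COMMANDS = {"DW": delete_word}

def execute(code, words, mouse_index):
    """Apply the chorded command verb to the mouse-selected target."""
    return COMMANDS[code](words, mouse_index)
```

The point is the division of labor: one hand names the operation while the other points at its operand, so neither hand ever leaves its device.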
It disappeared because the people at Xerox Labs decided it was too hard for normal people to learn to interact with a computer in this way, so they replaced the chorded keyboard with on-screen buttons plus a set of keys on the left of the keyboard for the most used operations (Undo, Open, Copy, Paste, etc.).
Then Apple, to make things even friendlier, removed the extra buttons from the keyboard entirely and relegated the most frequently used operations to key combinations (Cmd-C, Cmd-X, Cmd-V, etc.). You will notice that many of these shortcuts sit on the left side of the keyboard.
Lowercase-only text was pretty popular in the interwar period, too. You'd also be hard pressed to find a single capital in 1957's 12 Angry Men. Even the names of the actors in the credits are all lowercase.
If you are interested in how NLS was actually used I also suggest watching the 1969 demo which is less flashy but contains a better explanation of the interface.
It looks like an interactive document/hypertext backed by a schema-less store (much like JSON, with hierarchies). I also liked the idea of building "views" from the document(s).
Views are a very powerful idea, one that is currently underrepresented.
Today, if you want a different view of a document/data, you typically have to convert the document/data into a new one (in a different format). Then, as you change the original, the other remains outdated. Views are awesome because they remain up to date.
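A minimal sketch of the difference, using a made-up document and outline view: the exported copy freezes at conversion time, while the view recomputes from the source on every read, so it can never go stale.

```python
# A "view" as a function over the source document, versus a one-off
# export. The document and view shape are invented for illustration.
doc = {"title": "NLS", "items": ["mouse", "chord keyboard"]}

def outline_view(d):
    """Derive an outline on demand from the underlying document."""
    return [d["title"]] + ["  " + item for item in d["items"]]

exported = outline_view(doc)        # a converted copy: frozen from here on
doc["items"].append("hypertext")    # the source document changes...
live = outline_view(doc)            # ...and the view reflects it
```

After the change, `live` contains the new item while `exported` is already outdated, which is exactly the staleness problem the comment describes.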
It is shocking to me that it took so long for this technology to reach the public at large. Decades passed before it arrived in people's homes.
The Q&A session at the end is very interesting. People asking about gestures and using a pencil (stylus) instead of a keyboard (note this is back in 86).
Is everyone seeing a bunch of screwed up inline links, or is it just me? Stuff like this:
<a href="http://A young Stewart Brand — who would shortly launch The Whole Earth Catalog — operated one of the cameras in Menlo Park. Brand, along with others,">was fairly mind-blowing</a>
Incidentally, the way the kids were typing in the recent Ender's Game movie is interesting. Does anyone know how deeply they designed that system/interface? (i.e., was it just random keystrokes or did the chording actually matter?)
samsquire | 12 years ago:
I am trying to implement something in a similar vein[1].
[1] 98, 4, & 44: https://github.com/samsquire/ideas
davepeck | 12 years ago:
[1] Jan 9, 2007 video in this podcast series: https://itunes.apple.com/us/podcast/apple-keynotes/id2758346...
adamio | 12 years ago:
What about today's tools isn't aligned with Engelbart's vision? Everything? That seems a bit too broad.
km3k | 12 years ago:
Part 1 - https://archive.org/details/XD300-23_68HighlightsAResearchCn...
Part 2 - https://archive.org/details/XD300-24_68HighlightsAResearchCn...
Part 3 - https://archive.org/details/XD300-25_68HighlightsAResearchCn...
[Edit: formatting]
sp332 | 12 years ago:
Edit: I mean on the announcement at the bottom of the article.
EdiX | 12 years ago:
https://archive.org/details/XD300-23_68HighlightsAResearchCn... (first reel, follow the links for the other two)
If you are interested in how NLS was actually used I also suggest watching the 1969 demo which is less flashy but contains a better explanation of the interface.
https://archive.org/details/XD301_69ASISconfPres_Reel1
columbo | 12 years ago:
https://archive.org/details/XD302_86ACM_Prese_AugKnowledgeWo...
The Q&A session at the end is very interesting. People asking about gestures and using a pencil (stylus) instead of a keyboard (note this is back in 86).