It's an interesting pattern with academic HCI work like this. For instance, Shneiderman (who did legitimately innovative design work!) often claims not only Hyperties's hyperlink styling but also the iPhone keyboard as academia -> industry success stories: his group had done a series of usability studies on various tiny-keyboard approaches. And yet Ken Kocienda, who actually created the iPhone keyboard, has told me he was unaware of this work (and in fact never surveyed prior work). I've heard the same story for CSCW interface elements. Not encouraging!
gwern|4 years ago
I would merely roll my eyes at this (does it really hurt anyone to believe that 'Shneiderman designed the iPhone keyboard'? HCI people know almost all their work is useless anyway), but it leads people to greatly overvalue the role of theory and understanding in ML progress, and undervalue the role of compute. If you believe that resnets were invented by thinking hard to come up with a beautiful theory about identity transformations & gradient propagation, then you are going to make different forecasts and emphasize different things than if you knew the reality: resnets had been repeatedly invented before, but failed because there wasn't enough compute to show that they worked on anything but toy problems like the swiss roll. The final invention of resnets was grad students throwing random archs at the wall to see what stuck, who happened to have enough GPUs to show it worked amazingly well on ImageNet, and who then had to explain why mashing together multiple layers with additional connections did anything.
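(For anyone who hasn't looked at the mechanics: the "additional connections" are just skip connections that add a block's input back onto its output, so the stacked layers only have to learn a correction to the identity and the gradient gets a direct path back through the sum. A minimal PyTorch-style sketch, purely illustrative, with made-up layer sizes rather than the exact He et al. block:)

    # Illustrative residual block sketch -- not the exact ResNet architecture.
    import torch
    import torch.nn as nn

    class ResidualBlock(nn.Module):
        def __init__(self, channels):
            super().__init__()
            self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            self.bn1 = nn.BatchNorm2d(channels)
            self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            self.bn2 = nn.BatchNorm2d(channels)
            self.relu = nn.ReLU()

        def forward(self, x):
            out = self.relu(self.bn1(self.conv1(x)))
            out = self.bn2(self.conv2(out))
            # The skip connection: the block computes x + F(x), so it defaults to
            # (roughly) the identity and gradients flow straight through the addition.
            return self.relu(out + x)

    x = torch.randn(1, 16, 32, 32)     # (batch, channels, height, width) -- sizes arbitrary
    print(ResidualBlock(16)(x).shape)  # torch.Size([1, 16, 32, 32])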
DonHopkins|4 years ago
https://en.wikipedia.org/wiki/Extreme_learning_machine#Contr...
You gotta admit, it's an awesome new name for some good old concepts (kinda like "AJAX"). Many Millennial Brogrammers eat that edgy macho shit up. They should throw Mega Monster Extreme Learning Machine Hack-O-Thon Training Rallies every Sunday, Sunday, Sunday !!!!!!
https://www.youtube.com/watch?v=ohp_nmI_TFA
The same kind of macho, dick-oriented subversion happened when Mark Weiser's sissy Calm Technology / Ubiquitous Computing got rebranded by IBM's macho Pervasive Computing Division. (Because it's all about dividing and penetrating and grabbing and embedding trophies of desire and charging attacks and swearing about war metaphors in the trenches!)
https://www.cnet.com/tech/services-and-software/ibm-vows-to-...
>IBM vows to make computing pervasive: Big Blue voices its intention to grab a piece of the pervasive computing market, where computing power and Net access are embedded in everything from handhelds to cars.
https://www.eweek.com/mobile/ibm-takes-on-pervasive-computin...
>IBM Takes on Pervasive Computing: While everyone's still talking about the potential of wireless technologies, Rod Adkins, general manager of IBM's pervasive computing division, is in the trenches, helping the Armonk, N.Y., company develop wireless solutions for its customers. A major initiative at IBM, pervasive computing extends e-business to new devices. Adkins is charged with integrating IBM's technology, software, hardware and services into wireless and mobile solutions.
https://news.ycombinator.com/item?id=21765409
DonHopkins on Dec 11, 2019, on: Flutter: UI platform designed for ambient computin...
31 years late, Google attempts to re-brand "Ubiquitous Computing" (aka "Calm Technology") as "Ambient Computing". At least it sounds more mellow, less intrusive, unwelcome, penetrative, and phallic than the other attempt at rebranding UbiComp as "Pervasive Computing" in order to sell it to the military.
https://en.wikipedia.org/wiki/Ubiquitous_computing
https://en.wikipedia.org/wiki/Calm_technology
https://www.researchgate.net/post/What_is_differents_between...
https://internetofthingsagenda.techtarget.com/definition/per...
>The term pervasive computing followed in the late 1990s, largely popularized by the creation of IBM's pervasive computing division. Though synonymous today, Professor Friedemann Mattern of the Swiss Federal Institute of Technology in Zurich noted in a 2004 paper that:
>Weiser saw the term 'ubiquitous computing' in a more academic and idealistic sense as an unobtrusive, human-centric technology vision that will not be realized for many years, yet [the] industry has coined the term 'pervasive computing' with a slightly different slant. Though this also relates to pervasive and omnipresent information processing, its primary goal is to use this information processing in the near future in the fields of electronic commerce and web-based business processes. In this pragmatic variation -- where wireless communication plays an important role alongside various mobile devices such as smartphones and PDAs -- ubiquitous computing is already gaining a foothold in practice.
You say pervasive, I say perversive. Let's call the whole thing off.