If Maciej hadn't written this, I would still feel alone in how I see the technological world. I really can't express how grateful I am that this exists.
There is a vast, vast gulf between what the majority of software developers seem to think users want, and what users actually want. And this isn't a Henry Ford "they wanted faster horses" sort of thing; this is a "users don't just hate change, they resent it" sort of thing.
I work directly with end users. It's mostly over email now; my tech still works with them in person, face-to-face, in their business or home. We get so many complaints. So many questions: "do I really have to upgrade this?" "I liked this the way it was." "It worked just fine, why are they changing it again?"
Every time I try to argue on behalf of my customers, here or elsewhere, it gets ignored, or downvoted, or rebutted with, "but my users say they always want the latest and greatest..."
There are 100 million people in the United States alone over the age of 50. How much new software is designed for them? How much new software exists merely as a tool, comfortable with being put away days or weeks at a time, and doesn't try to suck you in to having to sign in to it on a regular basis to see what other people are doing with it? How much of our technology -- not just software, but hardware here too -- is designed to work with trembling hands, poor eyesight, or users who are easily confused?
There are over a hundred million people that don't understand why your site needs a "cookie" to render, that can't tell the difference between an actual operating system warning and an ad posing as one, that aren't sure what to do when the IRS sends them an email about last year's tax return with a .doc attached. (That one happened today.)
For these people, the technology most of us build really really sucks.
And that is a growing demographic, not a shrinking one...
I program for a living and I know why sites need "cookies", and still, for me most software really really sucks. Just because I can figure out how each and every new cell phone works doesn't mean I want to; similarly for other kinds of gratuitous changes and incompatibilities. And BTW, systems built "by hackers for hackers" are among the worst offenders (*nix clones in general, and Linux distributions in particular, are good examples).
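This is one of the many costs of having a web funded by advertising: https://news.ycombinator.com/item?id=8585237
Here's another Maciej post that is part of the solution, Don't Be A Free User: https://blog.pinboard.in/2011/12/don_t_be_a_free_user/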
Thank you for a very insightful, thoughtful, and considerate reply. I'm newly in the 50+ demographic, but I've also been a software engineer for my entire career (starting with a Commodore 64 in the 1980s) -- from that perspective I especially appreciate your asking "How much new software exists merely as a tool, comfortable with being put away days or weeks at a time, and doesn't try to suck you in to having to sign in to it on a regular basis to see what other people are doing with it?". Thank you!
You're not alone, but you're part of a minority. I don't know if others don't care or are really that oblivious to how different the world is outside of the computer screen. What I can see is that most software people are happy with the status quo; they get paid to do what they think is cool, instead of doing what the user truly needs. "Refactor to new framework XYZ? Great, no problem. User research? No need, we obviously know what they want."
Anyway, a few days ago there was a post here on HN about an anthropologist at Adobe, studying how designers use Photoshop. It's an exception (although in academia there are quite a few notable anthropologists studying human usage of IT), but at least it's a glimmer of hope...
"There is a vast, vast gulf between what the majority of software developers seem to think users want, and what users actually want."
s/want/need/2
Another truism: Users are generally resilient and will adapt to whatever they are forced to use.
There are some very strong ideas shared by developers about what users allegedly "want". But most of the time I think developers are just invoking the mythical "user" in defending their own choices in software design, not users'. Users have little if any choice.
Is it possible that those who purport to "know" what users want are simply observing how users have adapted to using what they were given? (the users having had no real choice of real alternatives)
As a user, in the cases where I cannot write the software myself, I try to find programmers who share my sensibilities in software design.
This is the best hope I have for finding software that I would "want".
Despite what any programmer proclaims about her users, what I the user end up choosing is not the software I want (I do not ask other programmers to write programs or add features, etc.).
I end up with the software the developer would herself want and is generous enough to share with others.
Fortunately we have some interests in common: no GUI, portable, small, fast, simple, etc.
One common counterargument people make, when you say users don't want to learn something new, is that users are lazy (let's ignore the physically challenged for now). But I think that argument is fallacious.
What users actually mean is: for the delta of value this new feature provides, it is not worth their time to learn to do things the new way. Everybody has to make a judicious choice about where to spend their time. And in that vein I don't think users are being lazy in saying that it is not really efficient if they have to spend a significant amount of time learning something new that they perceive to be of little added value.
EDIT: BTW I do think that people are happy to relearn something if they perceive it to provide a "significant" added value. Hypothetically speaking, if I had to relearn driving and this "new way of driving" took me from NY to DC in an hour for the same cost, I would try to learn that in a heartbeat. But I don't want to learn a new way of driving if it reduces my drive time by only 15 minutes.
I also think software/CS needs a school of thought that discourages significant change/variation in UX design. But unfortunately most people are incentivized to change things.
These days I build things that people have been anxiously waiting for. Like, every day it doesn't exist is kind of a problem, that sort of thing.
But I have been on the other side, too, building things users mostly hated. Even when the thing we made seemed clearly better than what it replaced, our users hated it. Or at least some portion would - probably good to keep in mind that people who are happy tend not to speak up as much.
It's quite a predicament for a company to be in, and I always felt bad for the designers. For me it was OK, it ended up allowing me to try lots of different ways of building things.
The form of the blog post reminded me very much of John Berger's Ways of Seeing, a life-changing political book on art history that changes the way you look at the world.
I'd be surprised if the author hadn't read it, or one of its descendants.
I often avoid software upgrades not because I know something will break but because I speculate something will break and don't want to find out.
Out of all the "features" that anger me the most are automatic, silent upgrades. One can generally disable them but I dont want to have to figure out how; if I want an upgrade I will download and install it manually and do so when I am good and ready.
That web applications - as opposed to those I install locally - are increasingly common I regard as the problem, not the solution. Quite commonly a website breaks for no apparent reason; eventually I clue in to the fact that they revised their JavaScript but did not test on the browser I actually use.
When considering whether to install software, first look for reviews. Many eCommerce sites make it easy to sort by most-critical first. Do take them with a grain of salt as bad reviews are sometimes posted by unethical competitors as well as cyberstalkers like Kuro5hin's modus.
If those bad reviews would affect you and are in a recent release, then maybe you want to give it a pass.
I know all about security patches; I once got to play on that same Sun workstation that Kevin Mitnick ransacked, but at least I had the sense to ask Tsutomu's permission first.
While actual code serves as a good example, the problem is far more general. Consider the Apollo 1 fire: the astronauts were unable to convince NASA to redesign their capsule's inward-opening door, so they had themselves photographed praying over a model of the capsule.
Everyone knows astronauts are brave; those men died for a door hinge.
The first half of this talk is excellent. The story of air travel--how the engineers of the 60s thought that we'd have supersonic jets and flying cars within a few years--that's a great cautionary tale. Evolutionary biologists call it Punctuated Equilibrium: the idea that progress comes in spurts. The warning--that the evolution of computer hardware and software might be radically slower over the next 50 years than it was over the past 50--seems timely and reasonable.
The second half is not excellent. Bad mental habits and weak arguments are on display.
* Imprecision. Specifically: choosing sentences for their "punch", like a bumper sticker, without caring whether they make sense:
"There is something quite colonial, too, about collecting data from users [...] I think of it as the White Nerd's Burden."
"It's a kind of software Manifest Destiny."
...cool story, bro
* Excessive negativity. If you haven't already, I highly recommend reading Everything is Problematic -- it really helped me understand where people like Maciej are coming from. http://www.mcgilldaily.com/2014/11/everything-problematic/
* Disrespect. "Google works on some loopy stuff in between plastering the Internet with ads." I guess 40 years ago he would have said "AT&T works on some loopy stuff in between sending you phone bills".
Organizations like Bell Labs--or Google X--are treasures, and as engineers we should be glad they exist.
In short: haters gonna hate
I was just about to write a similar sentiment. The first part is really great, as are the parallels and the lessons to be learned.
But the second half was not so great. While I agree with the overall sentiment, the concrete examples always slightly miss the point.
-----
The first such passage that caught my eye was the example about Windows XP:
> Rather than offer users persuasive reasons to upgrade software, vendors insist we look on upgrading as our moral duty. The idea that something might work fine the way it is has no place in tech culture.
The problem is that Windows XP is not working just fine, especially when connected to the Internet. It is riddled with malware, helps sustain botnets and DDoS attacks, and thus feeds mafia structures. So we, as a society, do have a moral obligation to retire Windows XP.
Also, nobody is forced by law to use Microsoft. There are plenty of alternatives, and most of those will run smoothly on your old hardware: Mint, Ubuntu or whatever. And for normal office and web stuff, this "jump" is surely less painful than switching to the latest Windows version.
Of course Microsoft is still to blame here, but for something entirely different: for ending Windows XP support, and for not applying serious security fixes to it. I bet there are plenty of companies and people that are more than willing to pay for ongoing maintenance of Windows XP, but the only company that could offer that service denies it.
-----
So the author missed the real point, even though this point is totally in line with the overall sentiment of the article. And it goes on and on like that, in the second half. That makes it a really annoying read.
I had a similar feeling, and thought it was probably because I know nothing about aviation and therefore read that stuff uncritically and feel entertained.
I wonder whether that's a general principle in writing -- tell outside-your-expertise stories to entertain the reader, and then with luck (depending on your perspective), the reader reads the rest and feels entertained and informed and good emotions out of a sort of inertia the rest of the way through.
The central mistake of the second half is this:
“Vision 1: CONNECT KNOWLEDGE, PEOPLE, AND CATS. This is the correct vision.”
He conflates computing with the Web, and argues that the Web is fine just the way it is; it just needs to be spread more widely. As if there aren’t plenty of people already trying to do exactly that.
What he ignores is that computing is an expression of human creativity. Like literature and music, you take it into a context that it wasn’t written for, and it rarely produces a desirable response. As long as computing is relevant, we will need new software.
The last 100 years of computing machines have proven that this still applies. A web based on ad revenue is no good in an area with poor electricity infrastructure and low bandwidth, so the famous M-Pesa has a completely different interface and different software than Wells Fargo Online® Banking.
The article/talk has some good points, but they aren’t connected together in a good way.
Around 2012 I worked with a team migrating some content from a very large static HTML site dating back to 1992. We scoffed at the awful ad-hoc nature of it all, just a pile of static hand-coded HTML pages.
But the 2002-2005 stuff had aged much worse. At some point there was a fancy site generator that had used JavaScript for everything, and the JavaScript apparently only worked properly in IE6. So most of the navigation was busted in a modern browser, and needed special scrapers to parse out what should have been plain old <a href...> tags.
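To give a flavour of the kind of thing those scrapers had to do - a minimal, illustrative sketch only; the real pages needed much messier, site-specific parsing:

    // Pull out the link targets that should have been plain <a href> tags.
    const hrefPattern = /href\s*=\s*["']([^"']+)["']/gi;

    function extractLinks(html: string): string[] {
      return Array.from(html.matchAll(hrefPattern), (m) => m[1]);
    }

    console.log(extractLinks('<a href="/about.html">About</a>'));  // [ "/about.html" ]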
Now, I regularly think back to that crusty old HTML3 static site that had sat there for 15-20 years and think: I wonder if my AngularJS/D3.js/jqGrid/etc. single-page app will even load in a browser 20 years from now, let alone perform as originally intended...
It's not just HTML. I think any proprietary standard runs this risk.
We've been burned at my work by old Word and WordPerfect documents from the early '90s refusing to render properly in new versions of Office. I think they tried about three recent versions of Office before they gave up and resorted to scanning in hard copies as PDFs to recover some documents.
Access databases are also notorious for this problem. A lot of business apps were written with Access in the mid-'90s, and it has now become too expensive/time-consuming to migrate them off of a dead technology. In the business world, stuff tends to run far longer than software developers expect. Hell, COBOL and PL/1 stuff still runs in some places.
> I wonder if my AngularJS/D3.js/jqGrid/etc. single-page app will even load in a browser 20 years from now, let alone perform as originally intended...
Random question for you: In your single-page app, do you support bookmarking "locations" that take multiple clicks to get to, so rather than redo those clicks at a later date, the user can get there via the bookmark? (Assuming your site has locations that require multiple clicks to get to of course.)
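For what it's worth, even a heavy single-page app can support that by mirroring its view state in the URL. A minimal sketch (the state shape and render function here are made up, not taken from any particular framework):

    // Keep the current "location" in the URL hash so it can be bookmarked
    // and restored later, instead of making the user redo the clicks.
    type ViewState = { section: string; itemId?: string };

    function saveState(state: ViewState): void {
      location.hash = encodeURIComponent(JSON.stringify(state));
    }

    function loadState(): ViewState {
      try {
        return JSON.parse(decodeURIComponent(location.hash.slice(1)));
      } catch {
        return { section: "home" };  // default view if the hash is empty or garbled
      }
    }

    // Re-render whenever the user arrives via a bookmark or uses back/forward.
    window.addEventListener("hashchange", () => render(loadState()));

    function render(state: ViewState): void {
      console.log("rendering", state);  // stand-in for the real view code
    }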
I just wonder: does it really matter whether it still runs in 20 years?
Most of the stuff we do on the web is momentary stuff that is only relevant right now. Stuff like Wikipedia will probably not change much, since its only real purpose is to display text. There is no need for an Angular app there, and in fact it would be a horrible idea to even make it that way.
The stuff that is meant to last long will, if we want it to.
The little bit about people believing the "AIs will take over the world" nonsense is gold.
I am still shocked that Elon Musk seriously believes the pseudo-scientific "well, Google has gotten better, so obviously we will build a self-learning, self-replicating AI that will also control our nukes and also be connected in a way and have the capabilities to actually really kill all humans."
Meanwhile researchers can't get an AI to tell the difference between birds and houses.
EDIT: I looked a bit more into the research that these people are funding. A huge amount of it does seem very silly, but there is an angle that is valid: dealing with things like HFT algorithms or routing algorithms causing chaos in finance or logistics.
The threat of a superintelligent AI taking over the world is certainly real -- assuming you have a superintelligent AI. If you accept that it is possible to build such an AI, then you should, at the very least, educate yourself on the existential risks it would pose to humanity. I recommend "Superintelligence: Paths, Dangers, Strategies," by Nick Bostrom (it's almost certainly where Musk got his ideas; he's quoted on the back cover).
The reason we ought to be cautious is that in a hard-takeoff scenario, we could be wiped from the earth with very little warning. A superintelligent AI is unlikely to respect human notions of morality, and will execute its goals in ways that we are unlikely to foresee. Furthermore, most of the obvious ways of containing such an AI are easily thwarted. For an eerie example, see http://www.yudkowsky.net/singularity/aibox
Essentially, AI poses a direct existential threat to humanity if it is not implemented with extreme care.
The more relevant question today is whether or not a true general AI with superintelligence potential is achievable in the near future. My guess is no, but it is difficult to predict how far off it is. In the worst-case scenario, it will be invented by a lone hacker and loosed on an unsuspecting world with no warning.
This comes across a little like "get off my lawn", but you've got to admit, the web has gotten pretty awful in the last few years.
I enjoy my time online less and less as more content is stuffed into single-page app walled gardens that load massive quantities of cruft, ads, and tracking code. I almost preferred the flash-era.
It seems that the goal of the web as being user-centric has taken the back seat to trying to convince the user that they're just a passive recipient of crafted experiences whose only purpose is to click ads or open their wallet.
>I enjoy my time online less and less as more content is stuffed into single-page app walled gardens that load massive quantities of cruft, ads, and tracking code. I almost preferred the flash-era.
Agreed. I have noticed that I avoid heavy sites and prefer light static HTML sites with server-side rendered content. For example, when I go to Reddit I always check the URL of a link before I decide to click it; if it is a "magazine" site or an "Engadget" site or the like, I won't click it.
I completely disagree. I think the web has improved in almost every way. Sure, there are shitty content sites whose main goal is to get ad clicks, but those have been around for a long time and the ads have actually become less obnoxious than before. There is so much better content nowadays presented in much cleaner and focused layouts. And the UX of most websites has improved dramatically. Look at sites like YouTube, Netflix, Facebook, Vine, reddit, Spotify, Soundcloud and compare them to the equivalent sites 5-10 years ago. It's not even a contest, the new sites are better in every way.
There's a slight difference in that we don't have supersonic animals on Earth, we don't see any interstellar spaceships, and we don't see anything even remotely like that - no animal metabolism that compares to the scale of the controlled energy release of a rocket engine or a nuclear power plant -
but we do see much much better information processing systems than our current computers, and they are very energy efficient.
They are built of meat and not exotic superconducting Buckywhatever, they are limited by constraints on heat dissipation and oxygen/glucose supply - problems which are easy to work around with electric pumps, heat pumps and industrial food supply - and by a historic need for sleep.
Maybe exponential progress has to slow, but can we dismiss intelligent non-humans in the same way we can dismiss space stations or interstellar travel?
Space stations are a cost. Free workers is a saving.
Shouldn't we expect human technology will approach animal intelligence in one substrate and design or another? And then surpass it a little by removing some of the obvious constraints?
This is a good question, and I think it will have to wait on a better understanding of what we mean by intelligence, which right now is a term that carries a lot of baggage.
It may be that the situation is similar to biochemistry. We observe exquisitely complex synthetic pathways in the natural world, and can to some extent harness and retool things to our benefit, but our capacity to design those reactions from scratch is almost nil compared to what goes on in the simplest living cell.
It may be that creating those fast rockets that don't exist in nature is orders of magnitude easier than writing a catbot. That's an open question and so far the evidence is on the side of it being far out of our grasp.
"The White Nerd's Burden" - must use that one. Kipling's words almost fit:
Take up the White Man’s burden—
Send forth the best ye breed—
Go send your sons to exile
To serve your captives' need
To wait in heavy harness
On fluttered folk and wild—
Your new-caught, sullen peoples,
Half devil and half child
Take up the White Man’s burden
In patience to abide
To veil the threat of terror
And check the show of pride;
By open speech and simple
An hundred times made plain
To seek another’s profit
And work another’s gain
Take up the White Man’s burden—
And reap his old reward:
The blame of those ye better
The hate of those ye guard—
The cry of hosts ye humour
(Ah slowly) to the light:
"Why brought ye us from bondage,
“Our loved Egyptian night?”
Take up the White Man’s burden-
Have done with childish days-
The lightly proffered laurel,
The easy, ungrudged praise.
Comes now, to search your manhood
Through all the thankless years,
Cold-edged with dear-bought wisdom,
The judgment of your peers!
Almost but not quite. I note you've skipped perhaps the most memorable line, "Take up the White Man's burden, The savage wars of peace". I'm hoping that the White Nerds may be able to bring peace through the likes of Zuckerberg connecting everyone, rather than the savage wars of Bush, Cheney, and the like. At least cellphones cause less collateral damage than cluster bombs.
It is hard to figure out when exponential growth will end while you are in the middle of it.
Linus said something relevant to this in a recent interview on Slashdot[1], answering a question on dangerous AI. To paraphrase him: people are crazy to think that exponential growth lasts forever. As impressive as it may be (at the time), it's only the beginning of an S-curve.
I've extracted a small part of his answer below. I think the whole interview is worth reading (it's a general interview, not AI-specific).
... I'd expect just more of (and much fancier) rather targeted AI, rather than anything human-like at all. Language recognition, pattern recognition, things like that. I just don't see the situation where you suddenly have some existential crisis because your dishwasher is starting to discuss Sartre with you.
The whole "Singularity" kind of event? Yeah, it's science fiction, and not very good SciFi at that, in my opinion. Unending exponential growth? What drugs are those people on? I mean, really..
It's like Moore's law - yeah, it's very impressive when something can (almost) be plotted on an exponential curve for a long time. Very impressive indeed when it's over many decades. But it's _still_ just the beginning of the "S curve". Anybody who thinks any different is just deluding themselves. There are no unending exponentials
1. http://linux.slashdot.org/story/15/06/30/0058243/interviews-...
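To make that concrete, here is a toy illustration: a logistic (S-shaped) curve with ceiling L is indistinguishable from a pure exponential while the value is still far below L, and only reveals the flattening near the top. The constants below are arbitrary:

    // Logistic (S-curve) vs. pure exponential with the same rate constant.
    const L = 1000, k = 0.5, t0 = 20;
    const logistic = (t: number) => L / (1 + Math.exp(-k * (t - t0)));
    const exponential = (t: number) => L * Math.exp(k * (t - t0));
    for (const t of [0, 10, 15, 20, 25, 30]) {
      console.log(t, logistic(t).toFixed(1), exponential(t).toFixed(1));
    }
    // Early on the two curves match almost exactly; by t = 30 the exponential
    // is ~150x larger, while the logistic has flattened out near its ceiling L.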
Well, the biggest current problem is that everything had to be redesigned for fat fingers and small screens. The worst expression of this was Windows 8, which made desktops look like tablets, and was rejected by the market.
Then there's cutesy web design; Flash ten years ago, "material design" now. Just because you can animate everything doesn't mean you should. (Annoyingly, the one browser thing that ought to be animated isn't. When you click on a link that exits the page, nothing happens until the new page loads, often leading to unnecessary double clicking. The browser should blank the old page, or grey it out, or dissolve to the new page, or do some kind of transition.)
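A site can at least fake such a transition itself today; a minimal sketch (the fade style and timing are arbitrary choices, not a standard browser mechanism):

    // Grey out the page as soon as the user starts navigating away, so there is
    // immediate feedback instead of a frozen-looking page while the next one loads.
    window.addEventListener("beforeunload", () => {
      document.body.style.transition = "opacity 0.2s";
      document.body.style.opacity = "0.4";
    });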
As for the big stuff, the AI-driven future is going to be interesting. The big threat, as I point out occasionally, is not Terminator robots. It's Goldman Sachs run by a machine learning system optimizing for maximum stockholder value.
Soviet engineers lacked the computers to calculate all the bending and wiggling the wings would do if you hung the engines under them, so they just strapped engines on the back.
This sounds like a myth to me; is there any evidence that missing computer power was the reason for tail-engine designs? There were a number of Western civilian jet planes from that era with tail engines too (for instance the Boeing 727). AFAIK the wing-mounted engines won in the end because they are easier to maintain, which might not have been such a strong factor in the '50s and '60s.
I find it more likely that Soviet design bureaus didn't pay much attention to operating efficiency and put more effort into building planes that can operate from 'rougher' air strips, etc. But I'm not an aeroplane expert.
Another nice parallel with the 747/2707 is that the Core line was derived from the Pentium M which was done by Intel's B-team in Israel while the big boys got to work on the future with Itanium and P4.
This was a great read. In my mind, it reminded me of the true underlying simplicity of everything on the web. He's right--we are riding the shockwaves of the computer revolution and it's fizzling out. The transistor is getting smaller (I think Intel's newest process is 14 nanometers), but soon we won't be able to physically make it any smaller. Does this mean the computing revolution is over? No. I think it just means the future is going to be about combining computing with other fields like medicine, the arts, and so on.
This guy's dig at AI conflicts a little with the exponential-hangover idea. Sure, we can only simulate a 300-neuron worm right now, but if we ever achieve a computationally bound solution and experience exponential growth at a rate similar to Moore's Law, in 50 years we could be watching real AI cat videos on YouTube, complete with awful UI controls that customise the feline personality in realtime. Oh, in JavaScript of course.
Ok this article is good but the tangent into the singularity is just stupid:
>In fact, forget about worms—we barely have computers powerful enough to emulate the hardware of a Super Nintendo.
He just disproved his own point: Emulation is hard, but it's possible to run much faster software directly optimized for your system or build it in hardware.
>If you talk to anyone who does serious work in artificial intelligence (and it's significant that the people most afraid of AI and nanotech have the least experience with it) they will tell you that progress is slow and linear, just like in other scientific fields.
What?! Has he actually talked to anyone doing "serious work in artificial intelligence"? I am certain they would not say it is "slow and linear".
The presentation is available for viewing here: https://youtu.be/nwhZ3KEqUlw
I must say I think the written one is better. It has additional information that I found interesting.
He criticizes his second vision, the Silicon Valley vision of software eating the world. But where I live (Wichita, Kansas), Uber is actually better than the conventional taxi services. As in, when Uber was temporarily shut down in Kansas and I tried to call a conventional taxi to my home, I waited over half an hour, it never showed up, and I had to cancel. So the second vision resonates with me, and I don't even live in Silicon Valley. What am I missing?
Edit: It's not just one anecdote. To quote the OP:
> We started with music and publishing. Then retailing. Now we're apparently doing taxis.
Yes, and software has made each of these better. The vision of software improving the world seems to actually be working.