Computers are perceptually slower because we've replaced CPU/memory slowness with remote latency. Every click, often every keypress (for autocomplete), sometimes even every mouse hover, generates requests to who knows how many services in how many different places around the world. No wonder it's slow. Speed of light. I see this even using my development tools at work, where actions I didn't really want or ask for are being taken "on my behalf" and slowing me down. I notice because I'm on a VPN from home. The developers don't notice - and don't care - because they're sitting right on a very high-speed network in the office. It's the modern equivalent of developers using much faster machines and much bigger monitors than most users are likely to have. Just as they need to think about users on tiny phone screens and virtual keyboards, developers need to think about users with poor network connectivity (or just low patience for frills and fluff).
When dealing with web applications at least, people don't realize how many non-essential requests are made peripheral to the action the user actually wants to accomplish. For instance, install NoScript and load cnn.com. There are 20+ external requests to all kinds of tracking, advertising, and analytics domains which have fuck-all to do with the user's request to see content hosted by cnn.com. The page loads almost instantly when all these non-essential requests are filtered out. It's a hilariously bad side effect of the web becoming as commercialized as it is.
Windows Explorer is seriously slow on Windows 10. Things like the right-click menu and creating a new folder are far too slow for the amount of work they actually do. The New Item menu is very slow, perhaps due to having Office 365 installed? Creating a new folder sometimes doesn't update the display of the containing folder at all.
Data point of 1, but mine is as fast as anything. On my 7-year-old desktop, upgraded from 7 to 10, it's near instant. But given that almost anything can hook into that menu (e.g. I have WinRAR, VS Code, TreeSize, etc.), it probably comes down to what you have running.
I concur. I handle video files on an SSD, and it can take 2-3 seconds just to view a folder with one file inside. All I want is the file name and an icon, but apparently in 2019 this is a Herculean task.
It all depends on what hooks into the context menu. The vanilla menu is fast. But once you accrue some unpackers, file sharing, VCS tools and garbage like Intel graphics drivers, the context menu gets unusably slow.
I use Total Commander on Windows, Midnight Commander and DoubleCmd on Linux. All nice and instant, with a plethora of operations. Not sure why the Explorer type of UI dominates.
The weird thing is that these are solved problems.
The most impressive, simple piece of software I've tried is a search tool called Everything.
I thought search was just hard and slow. Everything indexes every drive in seconds and searches instantly. I imagine it must be used by law enforcement to deal with security by obscurity.
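To make concrete why Everything feels instant: it pays the cost once, up front, by building an index, so each query never touches the disk. Here's a toy sketch of that idea in JavaScript (Everything itself reads the NTFS MFT directly; the paths below are made up for illustration):

```javascript
// Build the filename index once; afterwards every query is a scan of an
// in-memory array instead of a walk of the disk.
function buildIndex(paths) {
  return paths.map(p => ({ path: p, lower: p.toLowerCase() }));
}

// Case-insensitive substring search over the prebuilt index.
function search(index, query) {
  const q = query.toLowerCase();
  return index.filter(e => e.lower.includes(q)).map(e => e.path);
}

const index = buildIndex([
  "C:\\Users\\me\\Documents\\report-2019.docx",
  "C:\\Users\\me\\Videos\\holiday.mp4",
  "C:\\Windows\\System32\\notepad.exe",
]);

console.log(search(index, "report")); // matches by substring, instantly
```

The real trick is keeping the index up to date via the filesystem's change journal, but even this naive in-memory scan beats re-walking directories on every search.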
I agree, I think there are a few things in the Windows file explorer that conspire against its good performance (file preview is a big factor, but recycle bin content seems to affect it too), and it does seem to get worse over time. I think there's a market now for a third party 'back to basics' explorer.
Apparently I use my computer differently than a lot of commenters. Because when I dust off my 1983 Apple IIe it gets REALLY slow when I try to have 50 open browser tabs, edit video, and run a few virtual machines.
I think it comes down to the fact that GUIs _sell_. GUIs have visibility and appeal, they are something users can actually see, and have opinions about (right or wrong). GUIs are the ultimate bikeshed, and for many users, the lipstick IS the pig.
----
Anecdote: I can't count the number of times I have seen a team change a color, update a logo, or move an image a few pixels, and the result was happy clients/customers and managers sending a congratulatory company-wide email. Teams solving difficult engineering problems, meanwhile, garnered a quiet pat on the back, if they were lucky.
HTML was designed for static documents; it boggles my mind that things like Node.js were created. It's not a secret.
HTML tech can't even run efficiently on a cheap smartphone, which is the reason apps are needed for smartphones to be usable.
Every time I talk to someone about a job offer, I state that I want to avoid web tech: no JS, no web frameworks. I prefer industrial computing, building things that are useful. I don't want to make another interface that will get thrown away for whatever reason.
Today the computing industry has completely migrated towards making user interfaces, UX things, fancy shiny smooth-scrolly whatnots, just to employ people who can't write SQL. Companies only want to sell attention. This is exactly what the attention economy is about.
All I dream about is some OS, desktop or mobile, that lets the user write scripts directly. It's time we encouraged users to write code. It's not that hard.
And he's not even talking about software bloat. The word processor I have on my early-90s PowerBook is more responsive, and generally faster to use, than my current one running on a Core 2 Duo processor. Oh, and by the way, I was once complaining about this to a friend who's in IT, and he told me that the speed at which software runs doesn't mean anything regarding its quality. What I mean is, I was telling him how bad some new software was because it was quite a bit slower than one 10 years older which did the same thing, and he told me that, in software engineering, this (speed) is never a measure of a program's quality. Is this universally accepted? Are speed and responsiveness not taken into account? I always meant to ask other people in this field, but always forgot.
> Oh, and by the way, I was once complaining about this to a friend who's in IT, and he told me that the speed at which software runs doesn't mean anything regarding its quality.
Your friend is wrong. It's an imperfect proxy, but looking at programs that do work, speed is a good proxy for quality, because speed means someone gives a damn. There are good programs that are slow, but bad programs all tend to be bloated.
Of course "speed" is something to be evaluated in context. In a group of e.g. 3D editors, a more responsive UI suggests a better editor. A more responsive UI in general suggests a better program in general.
> this (speed) is never a measure of a program's quality. Is this universally accepted?
Universally? No. It all depends on who you ask. Companies tend to say speed isn't, but the truth is, a lot of companies today don't care about quality at all - it's not what sells software. If you ask users, you'll get mixed answers, depending on whether the software they use often is slow enough to anger them regularly.
To me (on the internet since '92), speed is 100% a measure of a program's quality. Intensive tasks get a pass (especially if they are pushed to a background queue), but UI responsiveness is definitely a measure of quality for me.
Jason Fried has also written about optimizing extensively for UI speed in Basecamp (a quick Google shows an article from 2008).
Speaking of: there has also been a lot written about Amazon’s discovery that every 100ms of latency cost them 1% of sales from people simply walking away from the “slow” site.
Especially when you’re doing the same task “template” on a day to day basis, even 1 second per input adds up quickly.
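A rough back-of-the-envelope version of that "adds up quickly" claim (the input counts here are illustrative assumptions, not numbers from the thread):

```javascript
// What "just one second per input" costs over a year of repetitive,
// template-style work. All three inputs are made-up but plausible.
const secondsPerInput = 1;
const inputsPerDay = 500;     // data-entry style workload
const workdaysPerYear = 230;

const hoursLostPerYear =
  (secondsPerInput * inputsPerDay * workdaysPerYear) / 3600;

console.log(hoursLostPerYear.toFixed(1)); // prints "31.9"
```

Roughly four full working days a year, lost to a delay nobody would bother filing a bug about.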
One reason why I'm happy not to be in IT is said bullshit. Maybe that means I am part of the problem, because if every programmer who has a problem with that leaves, you're left with just the programmers like your friend, who don't see anything wrong with this.
>how many times have i typed in a search box, seen what i wanted pop up as i was typing, go to click on it, then have it disappear
Regardless of anything else, this is 100% happening to me on a regular basis. And the ironic thing is that I think it is caused by the attempt to speed up getting some results onscreen. But it’s always 500ms behind, so it “catches up” while I’m trying to move the mouse to click on something.
Firefox is notoriously bad at this: type a bookmark name into the URL bar, move the mouse onto the bookmark, search results come in, and you click something you didn't want. Gets me all the time.
I think we can actually blame React and similar frameworks for the issues we see in many modern apps, including the ones mentioned in the article.
Part of the issue stems from the "strong data coupling" that's all the rage. Everything on the page should correlate at any given point in time. Add a character to a search box and the search results should be updated. The practical effect of this is that any single modification could (and often does) rewrite the contents of the entire page.
The other thing the article brings up is that developers and designers often disregard input flow. This may be partly driven by not having sufficiently dynamic tooling (Illustrator can hardly be used to design out flow patterns, for example.)
These two issues have a unifying quality: Websites must be "instagrammable", which is to say look good in single snapshots of time, and the dynamics take a serious back seat.
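One common mitigation, for what it's worth, is to decouple keystrokes from updates rather than re-rendering on every character. A minimal, framework-agnostic sketch (the search handler body is a placeholder, not a real API):

```javascript
// Delay calling `fn` until `ms` milliseconds have passed with no new
// calls, so the expensive update runs once the user stops typing.
function debounce(fn, ms) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), ms);
  };
}

// Instead of re-rendering on every keystroke:
const onKeystroke = debounce(query => {
  // ...fetch results and update only the results list, not the page
  console.log("searching for:", query);
}, 250);

onKeystroke("c");
onKeystroke("cn");
onKeystroke("cnn"); // only this final query triggers a search
```

This doesn't fix the deeper "rewrite the entire page" problem, but it at least stops the app from doing that work on every single character.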
If you spent what people paid for a PC in 1983 (literally, without any inflation) you probably wouldn't notice anything being perceptually slower.
Like the first Mac retailed for $2500 US. Go spend $2500 on a PC today, you'll have a great time.
Granted, economies of scale make this kind of a dumb argument. But it has a bit of truth to it. People are just less willing to spend as much on their machines, as well as push much more limited platforms like mobile to their limits. We should definitely deal with that as developers, don't get me wrong - but not having to deal with the optimizations they dealt with 40 years ago doesn't make me unhappy.
I have a top-of-the-line Intel processor that's less than 2 years old (launched, not bought). A 970 Evo Pro, one of the fastest drives around. 32 GB of RAM (don't remember the speed, but it was and is supposed to be super fast).
Explorer takes a second or two to launch. The ducking start menu takes a moment and sometimes causes the entire OS to lock up for a second.
The Twitter rant is spot on.
There's so much supposed value-add BS that the core usage scenarios go to shit.
And this is coming from a Product Manager. :-)
Anyway, the referencing problem is painful. I feel it often. Google Maps or Apple Maps: try to plan a vacation and mark interesting places on it to identify the best location to stay. Yup, gotta use that memory. Well, isn't that one of the rules of UX design, don't make me think?
Regarding OSes: storage has gotten so much faster, and CPUs haven't, that storage drivers and file systems are now the bottleneck. We need fewer layers of abstraction to compensate. The old model of "IO is super slow" is no longer accurate.
> ... you probably wouldn't notice anything being perceptually slower
I disagree. I have such a PC (64 GB of RAM, Quadro GPU, SSD, etc.) and I absolutely do notice things being slow, even things like Word, Excel, and VS Code, let alone resource-intensive professional software.
A more expensive PC does very little to address the latency issues at play here; the problems are very much not a lack of processor speed, GPU speed, or even SSD speed (most of the time).
I know from experience, the most godlike PC you can possibly build does virtually nothing to make common applications less laggy.
> "People are just less willing to spend as much on their machines"
And why should they? Today's smartphones are much more powerful than the most powerful supercomputer of 1983. Computers have been powerful enough for most practical purposes for years, which means most people select on price rather than power. And then a new OS or website comes along and decides you've got plenty of power to waste on unnecessary nonsense.
The first Mac was 3.5-inch-disk-based, IIRC. I remember test-driving it and being kind of shocked at that price, since it felt slower than my Commodore 64 with a hard drive (the tape drive was so slow, but cheap!) or my next computer, an Atari ST with a hard drive. Of course, the disk access/read/write speed was the dominating speed factor.
I seriously doubt there is a huge difference in how fast I can access files, scan memory, or iterate through a loop, which is what has a huge impact on perceptual latency.
> People are just less willing to spend as much on their machines,
Please stop blaming the consumers, they have very little freedom of choice.
> as well as push much more limited platforms like mobile to their limits.
I don't think anyone has really pushed any recent smartphone to its limits. I haven't checked whether any demoparty has had a smartphone compo, but if they haven't, then yeah, nobody has really tried.
The C64, Amiga and early x86 PCs have been pushed to their limits though, squeezing out every drop of performance. And there still exist C64 scene weirdos who work to make these machines perform the unimaginable.
Smartphones haven't been around long enough, and have been continuously replaced by slightly better versions, so nobody has really had time to find out what those machines are capable of.
> but not having to deal with the optimizations they dealt with 40 years ago doesn't make me unhappy.
I used to have to deal with such optimizations and I totally get that. It's freeing and I occasionally have to remind myself what it means that I don't have to worry about using a megabyte more memory because machines have gigabytes. Except that a megabyte is pretty huge if you know how to use it.
But not having to deal with the optimizations also means that new developers never learn these optimizations, and they will be forgotten. And that's bad, because there's still a place for them: like, 95% of the code doesn't matter, but for that 5% of performance-critical stuff... if you just learned the framework, then you're stuck and your app's gonna suck.
It's kinda weird to optimize code nowadays though. At least if you're writing JS. It's not like optimizing C or machine code at all. If you're not measuring performance, 99% sure you'll waste time optimizing the wrong thing. Sometimes it feels like I'm blindly trying variations on my inner loop because sometimes there is little rhyme or reason to what performs better (through the JIT). Tip for anyone in this situation: disable the anti-fingerprinting setting in your browser, which fuzzes the timing functions. It makes a huge difference for the accuracy and repeatability of your performance measurements. Install Chromium and only use it for that, if you worry about the security.
There are two problems with interfaces like google maps - and one exacerbates the other.
- it's not bloody obvious how they work - randomly clicking on meaningless icons, trying to uncover functionality.
- then just as you get used to it, they change it!
My biggest feature request would be a key stroke to hide all the floating crud that is obscuring my view of the map!
Selling a software release is a one-time payment. Selling a support subscription is recurring revenue. And if you make your software horrible enough to use without the support subscription, it is automatically immune to piracy.
As a practical example, I don't know anyone who uses the free open-source WildFly release. Instead, everyone purchases JBoss with support. It's widely known that you just need the paid support if you want your company to be online more than half of the day. And as if they knew what pain they would be causing, their microservice deployment approach was named "Thorntail".
The computers are faster, can do more stuff, and monitors have higher frame rates. But for many applications that aren't games, latency and non-responsive UIs are a growing problem.
Last night I downloaded an app update on my handheld computing device (a phone). It took around 30 seconds to download and install the 100 MB update, on an internet connection I can use pretty much anywhere in Europe for £10/mo.
15 years ago I would have been waiting 20 minutes for a single song to download on a hard wired PC.
I've been trying to explain for years that for the past 4 decades the hardware guys have been surfing Moore's law while the software guys have been pissing it away...
Well, Moore's law is falling by the wayside. If they want to start doing more with less, the software guys are going to have to stop using interpreted languages, GC, passing data as JSON rather than as binary: all the overhead that's de rigueur but doesn't directly go to getting the job done.
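For a sense of the JSON-vs-binary overhead being described, here's a small sketch in Node; the record layout and field names are invented for illustration:

```javascript
// Encode a {x, y, id} record as JSON vs. a fixed 12-byte binary layout:
// two little-endian float32s followed by a uint32.
function encodeJson(p) {
  return Buffer.from(JSON.stringify(p));
}

function encodeBinary(p) {
  const buf = Buffer.alloc(12);
  buf.writeFloatLE(p.x, 0);
  buf.writeFloatLE(p.y, 4);
  buf.writeUInt32LE(p.id, 8);
  return buf;
}

function decodeBinary(buf) {
  return { x: buf.readFloatLE(0), y: buf.readFloatLE(4), id: buf.readUInt32LE(8) };
}

const p = { x: 1.5, y: -2.25, id: 42 };
console.log(encodeJson(p).length);   // JSON size in bytes (field names, punctuation)
console.log(encodeBinary(p).length); // always 12
```

Beyond the size difference, the binary form needs no parsing: the decoder reads fields at fixed offsets, while JSON has to be tokenized character by character.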
In 1983 virtually everything was text-based. Since moving to graphical user interfaces, a great deal of effort has gone into more visually stimulating UI, such as animations and better fonts. Not all of this should be counted as progress/innovation. We have wasted much of the HW performance we gained over the years on baubles and trinkets :)
That's what you get for catering to people who don't care for your work one bit.
The same people who are telling me that their computers are slow are the same people who need a flashy animated button for every single action and the same people who refuse to understand that passwords are not just a formality.
Computers have gotten much faster in terms of raw speed and throughput, yet that hasn't translated into much of an improvement in basic UI interactions and general functioning.
That keyboard-centric design for GUIs (I clearly have never taken a design class) is what makes Reddit Enhancement Suite such an effective product, in my opinion. HN's interface is possibly just as effective in that it discourages me from taking too many actions; I can vote on basically every reddit comment I read but using a mouse to do it on HN represents such a massive barrier compared to keyboard navigation.
Finding a file via Explorer search takes 10 minutes. Via dir, it somehow takes 10 seconds or less.
The productivity enhancers:
1. A large hi-res screen so I can see lots of context
2. Lots of disk space
3. Online documentation available
4. Protected mode operating system
5. Github
6. Collaboration with people all over the world
The productivity destroyers:
1. social media
> The productivity destroyers:
> 1. social media
stares at HN page
> 1. social media
2. Project Managers
In many cases I'm happy with simple but slow, as long as it's fast enough.
But dude, DESIGN. The design. Look at those rounded corners.
Remember when software was stored on floppies? It took a while to load. Then every application came with different behavior and key bindings.
https://danluu.com/input-lag/
Still not very readable.
To each his own.