Disappointed I didn't see any reasonably thorough explanation involving the processor security bugs that impact broad system performance, such as Spectre and Meltdown.
This has been precisely my experience. Windows 7 was fast on an HDD and blazing fast on a SSD. Windows 10 is unusable on an HDD and usable on an SSD. Still kinda sluggish, even then, for what it's worth.
Not to mention the applications. A slow operating system that uses so many resources merely idling, plus Applications-That-Are-Actually-Web-Browsers, makes day-to-day usage almost unnavigable for someone with quick reflexes used to a Linux CLI.
Win10 is perfectly fine on HDDs. You just have to install Linux on the bare metal, and create a Win10 VM leaving at least 3GB of RAM to the Linux caches.
This is faster than W10 on bare metal with SSD and the same physical amount of RAM.
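As a rough sketch of the setup described above, assuming KVM/QEMU rather than any particular hypervisor, a hypothetical `win10.qcow2` image, and a 16 GB host (so the guest gets 12 GB and Linux keeps ~3-4 GB for its page cache, which absorbs the small random reads that hurt HDDs most):

```shell
# Hypothetical sketch: Win10 guest under KVM/QEMU, leaving a few GB of
# host RAM free for the Linux page cache in front of the HDD.
qemu-system-x86_64 \
  -enable-kvm -cpu host -smp 4 \
  -m 12G \
  -drive file=win10.qcow2,format=qcow2,if=virtio,cache=writeback \
  -nic user,model=virtio-net-pci
```

`cache=writeback` lets the host cache guest disk writes as well, which is a large part of why this arrangement can feel faster than bare metal on the same spindle.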
Back in 2020, after one update, Windows 10 was taking about 5 minutes to boot on a ThinkPad, if not more. Had to install an SSD, which changed everything.
Pretty curious to hear what kernel or filesystem developers have to say about this, because it doesn't really make sense to me.
I can understand why something would stop working, but not why it would just become slow. What kind of system change results in worse performance only in certain cases?
Maybe deep inside, W10 stopped using the optimizations that made it fast on HDDs, on the assumption that SSDs are now the norm; but I can't tell whether that's a choice that implies a difficult compromise, or just laziness and negligence.
Absolutely true. I've converted more people to GNU/Linux over the last two years than in the twenty years before that. The vast majority of people in my (third world) country do not have SSDs, so GNU/Linux is their only hope of having a usable and moderately secure machine.
Yep. Windows 7 would sometimes take a few minutes to finish whatever boot time disk i/o it wanted to do; for Windows 10 on a disk, it seemed to always be doing some i/o and never settled down.
People must have very different performance requirements than me. I've run 10 on nothing but HDD for years now with no issues. Of course, I also avoid electron apps, modify Firefox so it doesn't eat memory like an addict on a bender, and generally keep bloat off my system. But Windows, basic Multimedia programs and games all seem to run fine. It's probably not cutting edge, but I don't feel like it's dragging.
I have an older laptop with an HDD. It was always underpowered but still usable for things like streaming to a TV every once in a while. The latest Win10 updates have it taking minutes to run a search or switch between apps. Feels like we're getting into phone territory of forced hardware upgrades for everything now.
For what it's worth, macOS had exactly the same issue. My previous $WORK laptop was an HDD-based model, and all of a sudden after a macOS upgrade it was slow as heck. I can't recall the exact release, but it was certainly very noticeable right after I applied a major upgrade. I was so happy when $WORK finally upgraded it to an SSD-based model. I suspect OS developers pretty much target SSD machines and don't really bother to ensure there are no regressions on HDD machines. I also notice that Debian performs much better on SSD than HDD. So it's certainly not a Windows-only issue.
Maybe this is true. I installed Ubuntu on a hard drive not long ago and immediately bought an SSD. Apt-installing build-essential was a slog, but it takes like 20-30 seconds on an SSD. I guess you mean using the GUI, but I don't think I could go back to using any OS on an HDD.
I could be completely wrong, but my guess at the cause would be the registry. Every functionality and feature in Windows has a flag somewhere in the registry. Every query is probably a disk read, and it's a blocking operation of course. Sure, there's probably a cache, but that cache is only so large.
There's a program (Sysinternals Process Monitor, I believe) that can trace the registry queries of the process it's attached to. You can attach to basically any process in Windows and see a monstrous number of registry queries.
Not a perfect comparison, but I have a Windows 10 box I built with a small M.2 NVMe SSD boot drive, while a lot of apps -- including notably Steam games, OneDrive, and the browser Downloads folder -- run off a larger 7200rpm hard drive. I've never felt any slowness and everything is blazing fast. It's got 16GB RAM, though.
I know a lot of cheap machines have 5400rpm hard drives, I wonder if 7200rpm+ drives will offer a better experience. Also what role RAM plays in user experience.
I don't have experience with W10+HDD, but note that modern HDDs have started to adopt SMR (shingled magnetic recording), which can lead to much slower writes. It's possible the slowness comes from that direction and not just W10.
I won't disagree, or agree. If you are running spinning rust rather than an SSD, you are getting exactly what you chose. Just about any use case that justifies spinning rust can be solved with an external platter, thumbdrive, and/or SD card.
Why do people still expect modern software to run at incredible speeds on hard disk drives?
Run old technology with old software. It is ridiculous that consumer hardware is still being sold with crappy 5400 RPM disks in 2021. No, I have no interest in optimising my software for speeds that are 1/10th of my broadband.
Unless we're talking archival or huge storage necessities, stop complaining about modern OS or games running slow on a technology that hasn't realistically been updated in 20 years.
Do you expect Windows 10 to run on a Pentium II with 512MB RAM as well, because some version of Linux does?
On my current gaming PC (i7-7700), I installed W10 in 2017 and... no problem whatsoever? SSD, 10-second boot. I don't know how people end up with all these problems. I'm really curious, because there must be an underlying reason.
Not having a windows machine here to test this, I can't believe some of the results there. Specifically the "Win32 applications" ones. 7 seconds to open the file manager or text editor? Or MS Paint?! On my laptop here I can load gimp including plugins within ~1.5-2s, and I never even bothered to optimize anything about this. I wouldn't even be able to measure the opening time of e.g. gedit without some sort of scripting.
Are win32 apps really this slow to start up, or are the 7s "baseline" measurements in that experiment some cumulative value over all the applications?
The benchmarks are hard to compare. He's using a VM and a fairly small amount of RAM (I don't know of many PCs only shipping with 4GB RAM these days).
I can say that Word takes about 2 seconds to open on my system, but I'm not running in a VM and I have 48GB RAM, so Windows can cache a lot to optimize opening times.
"Starting" is slow in Windows and just keeps getting slower. That could be booting, logging in, waiting for an application to start, waiting some more for an application to start, and then waiting even more, not being sure if it ever is going to start, waiting some more, and then it starts.
How much of the slowdown has to do with the Spectre and Meltdown mitigations? There was a similar thread the other day about drastic performance hits on the Linux side.
The spike in there is really weird. I wonder if that's Spectre mitigations causing the bulk of the slowdown. If that's the case, I'd be curious to know whether disabling them helps and whether popular Linux distributions show a similar performance loss.
One time Windows got a "feature update" that made it not boot. Apparently it was an issue with Lenovo motherboards that is still not fixed to this day (afaik). In any case, that was the kick in the pants I needed to switch to Linux. Everything has gone swimmingly since!
How were these measurements obtained? Are they an average, and if so, what was the variance of the measurements? For some, the difference seems substantial, but without knowing the variance across measurements it is a little difficult to assess whether the differences are actually significant. (Basically, would there be a statistically significant difference between the conditions?)
Obviously this would be a lot more work, so I don’t want to detract from the work that’s already been done.
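The kind of check being asked for can be sketched with Python's standard library. The boot-time samples below are made up for illustration (the article's raw measurements aren't reproduced here); the point is only how one would test whether two sets of runs differ by more than run-to-run noise:

```python
import math
import statistics

# Hypothetical boot-time samples in seconds (illustrative only):
# five runs on an older build vs. five on a newer one.
old = [13.1, 12.8, 13.4, 13.0, 12.9]
new = [33.9, 34.2, 34.6, 33.8, 34.0]

def welch_t(a, b):
    """Welch's t statistic for two samples with unequal variances."""
    se = math.sqrt(statistics.variance(a) / len(a) +
                   statistics.variance(b) / len(b))
    return (statistics.mean(b) - statistics.mean(a)) / se

t = welch_t(old, new)
print(f"mean old={statistics.mean(old):.2f}s  "
      f"mean new={statistics.mean(new):.2f}s  t={t:.1f}")
# A |t| far above ~2 makes the difference hard to attribute to noise;
# with samples this tight and means this far apart, t is enormous.
```

With only one run per condition there is no variance to compute at all, which is exactly the objection being raised.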
I don't know about slowing down; I do know that I keep getting stuck on some update I can't move beyond, due to some cryptic and unfixable error.
This has happened for the second time in a year, and I end up having to download an up-to-date ISO to get past the dead-end update.
I can't remember the last time I used Windows search, because it's worse than useless - there have been multiple occasions in the past where it couldn't even find a file in the current folder, right in front of my eyes. Nowadays I just use Everything, which I think is one of the best pieces of software ever.
These speed issues are annoying but the thing that kills my experience is the lost clicks - and when one is missed, because it's quite common that Windows is just being its usual useless self, you then don't know for sure if it really missed the click or not. Invariably the time you think "it did miss it, try again", you'll click again only to find that now your slow Windows machine is stupidly struggling to do the damn task twice!
The other issue doesn't sit completely with Windows: in a corporate environment, there are numerous remote activities that get hooked up without much thought or care and it only takes an occasional slow response with one or two of them for Windows to become unusable.
This is purely speculation and observation, but I had to disable all the anti-telemetry hacks on my (aging) W10 gaming desktop, and I noticed a marked increase in latency in everything from opening a folder to launching simple applications. Once I re-enabled all the patches, the latency seemed to vanish.
I have no data or hard evidence to back any of this up so take it with a large grain of salt.
For me, every update seems to trigger some .NET compilation in the background (if I remember right). This process thrashes the disk while it runs and contributes a lot to the slowness in my experience.
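The background work being described is plausibly the .NET native-image generator draining its queue after an update. Assuming 64-bit .NET Framework 4.x, one known way to make it finish in one go (from an elevated prompt) rather than trickle along in the background is:

```shell
# Force queued .NET native-image compilation to run to completion now
# instead of competing with foreground disk I/O after an update.
# Path assumes 64-bit .NET Framework 4.x; run from an elevated prompt.
%windir%\Microsoft.NET\Framework64\v4.0.30319\ngen.exe executeQueuedItems
```

On an HDD this still hammers the disk, but at least it does so on your schedule.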
I built my mom a moderate Windows PC for her accounting work a few years ago. Pretty standard Intel build, no graphics card cause it's not like she needs that.
And for the most part it works fine, until every few months it'll slow down to an unusably slow crawl and I'll have to hop into task manager, see what rogue Windows service is bugging out this time, Google it, and find some forum post somewhere telling me what registry edit I have to do to disable some service that restores it to full speed.
I've killed as much telemetry as I can, but every time my PC loses a core or two to a random background service, it's that piece of shit.
No matter how much you purge it, it comes back. Removing execution permissions seems to work best, because Windows still realises that the file is there, but eventually it'll have its ACLs restored and the shitshow starts again.
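Rather than fighting the file's ACLs, a less fragile approach that has been reported to work is disabling the telemetry service itself through the service manager. Assuming the process in question belongs to DiagTrack (the "Connected User Experiences and Telemetry" service), from an elevated prompt:

```shell
:: Stop the telemetry service and prevent it from starting again.
:: Run from an elevated command prompt; the space after "start=" is
:: required by sc's argument syntax.
sc stop DiagTrack
sc config DiagTrack start= disabled
```

Feature updates are still known to re-enable services like this, so the fix may need repeating after major upgrades.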
This has a significant impact on Linux and Microsoft has even outlined that these fixes impact their performance (there have been many more security bugs identified since): https://www.microsoft.com/security/blog/2018/01/09/understan...
Any Linux devs care to chime in?
Insanely fast NVMe M.2 disks are cheap, let alone normal SSDs.
Why in the world would a modern operating system not optimize for SSDs?
I can’t think of a single reason to use a spinning drive on my computer. Almost every conceivable use case is better relegated to a separate NAS box.
You can get a 1TB PCIe NVMe SSD rated at 3,100 MB/s read speeds for $125. Why would I give up an order of magnitude of performance to save $40?
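Back-of-envelope arithmetic on that gap, taking the parent's NVMe figure as ~3,100 MB/s (PCIe 3.0 NVMe drives are rated in megabytes, not megabits) against a typical 7200 rpm HDD at ~150 MB/s sequential:

```python
# Sequential time to read a hypothetical 32 GB working set on each device.
# Real desktop workloads are mostly small random reads, where the gap is
# far larger still (~100 IOPS for an HDD vs. hundreds of thousands
# for NVMe).
working_set_mb = 32 * 1024
hdd_mb_s, nvme_mb_s = 150, 3100

hdd_s = working_set_mb / hdd_mb_s      # ~218 s
nvme_s = working_set_mb / nvme_mb_s    # ~11 s
print(f"HDD: {hdd_s:.0f}s  NVMe: {nvme_s:.0f}s  ratio: {hdd_s / nvme_s:.1f}x")
```

Even in the HDD's best case (pure sequential reads), the drive is roughly 20x slower; the random-I/O patterns of booting and app launches are where "order of magnitude" becomes an understatement.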
> I used Hyper-V as the hypervisor of choice
That is not how most end-user installations are configured (i.e., not as a virtual machine).
> 32GB fixed disk for each build.
That is much, much less than typical Windows 10 hardware.
> the fast boot feature has been disabled for the purposes of this measurement.
That is not the default and not reflective of most installations.
So I wonder how OP gets 34 seconds, and how he went from 13 to 34 seconds over a couple of updates. Mine definitely didn't get 21 seconds slower.
I do wonder whether they did only one run per test or multiple, since with n=1, noise can mess with your results.
There are differences, but I've read a lot of comments and a lot of people aren't specific. So it's difficult to judge.
Overlaying the Spectre etc. mitigations could also provide some insight.
The special effects slow it down; you can disable them: https://www.cnet.com/how-to/easy-ways-to-speed-up-windows-10...