> With their latest data measurements specific to the game, the developers have confirmed the small number of players (11% last week) using mechanical hard drives will witness mission load times increase by only a few seconds in worst cases. Additionally, the post reads, “the majority of the loading time in Helldivers 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time.”

It seems bizarre to me that they'd have accepted such a high cost (a 150GB+ installation size!) without thoroughly verifying that it was necessary!
I expect it's a story that'll never get told in enough detail to satisfy curiosity, but it certainly seems strange from the outside for this optimisation to be both possible and acceptable.
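The quoted claim about parallel level generation implies a simple model: the load screen ends when the slower of the two overlapping phases finishes, so an I/O slowdown is invisible until disk time overtakes generation time. A minimal sketch, with made-up timings (not Arrowhead's numbers):

```python
# Sketch: if level generation runs in parallel with asset streaming,
# wall-clock load time is governed by the slower of the two phases,
# not their sum.

def load_time(level_gen_s: float, asset_io_s: float) -> float:
    """Wall-clock load time when both phases overlap fully."""
    return max(level_gen_s, asset_io_s)

# Hypothetical numbers: 20 s of level generation, 4 s of SSD reads.
ssd = load_time(20.0, 4.0)         # 20.0 s: level gen dominates
hdd = load_time(20.0, 4.0 * 5)     # 20.0 s: a 5x I/O slowdown still hides
worst = load_time(20.0, 4.0 * 10)  # 40.0 s: only shows once I/O exceeds gen
```

This is why "only a few seconds in worst cases" is plausible: the extra disk time mostly disappears behind work that was happening anyway.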
afavour|2 months ago
They’re not the ones bearing the cost. Customers are. And I’d wager very few check the hard disk requirements for a game before buying it. So the effect on their bottom line is negligible while the dev effort to fix it has a cost… so it remains unfixed until someone with pride in their work finally carves out the time to do it.
If they were on the hook for 150GB of cloud storage per player this would have been solved immediately.
jeroenhd|2 months ago
That's why they did the performance analysis and referred to their telemetry before pushing the fix. The impact is minimal because their game is already spending an equivalent time doing other loading work, and the 5x I/O slowdown only affects 11% of players (perhaps less now that the game fits on a cheap consumer SSD).
If someone "takes pride in their work" and makes my game load five times longer, I'd rather they go find something else to take pride in.
WreckVenom|2 months ago
I've racked up 700 hours in the game and never cared about the storage requirements.
weavejester|2 months ago
I'm not sure that's necessarily true... Customers have limited space for games; it's a lot easier to justify keeping a 23GB game around for occasional play than it is for a 154GB game, so they likely lost some small fraction of their playerbase they could have retained.
runningRicky|2 months ago
I have to check. Your assumption is correct: I am one of very few.
I don't know the numbers and I'm going to check in a sec, but I'm wondering whether the suppliers (publishers or whoever is pinning the price) haven't screwed up big time by driving prices and requirements up without thinking about the potential customers they are going to scare away for good. Theoretically, I have to assume that their sales teams account for this, but I've seen so much dumb shit in practice over the past 10 years that I have serious doubts that most of these suits are worth anything at all. Grown-up working-class kids (with up to 400+ hours of overtime per year, 1.3 kids on average, and approximately -0.5 books and news read per any unit of time) can come up with the same big-tech, big-media, economic and political agendas that have been in practice in both parts of the world for the better part of our lives, if you just play "game master" for half a weekend and become best friends with all the kiosks in your proximity.
> the effect on their bottom line is negligible
Is it, though? My bold, exaggerated assumption is that they would have had 10% more sales AND players.
And the thing is, at any point in time when I, and a few people I know, had the time and desire to play, we would have had to either clean up our drives or invest the game price plus an SSD price for about 100 hours of fun over the course of months. We would gladly have gotten a taste for it, but no industry promises can compensate for demanding even more effort from us than enough of us already see and put up with at work. As a result, at least 5 buyers and players lost, and at work and elsewhere you hear, "yeah, I would, if I had some guys to play with" ...
oersted|2 months ago
And since this is primarily a live-service game drawing revenue from micro-transactions, especially a while after launch, and base console drives are still quite small in order to encourage an upgrade (does this change apply to consoles too?), there's probably quite an incentive to make it easy for users to keep the game installed.
scruple|2 months ago
wilg|2 months ago
zelphirkalt|2 months ago
clusterhacks|2 months ago
It was amazing how often people wanted to optimize stuff that wasn't a bottleneck in overall performance. Real bottlenecks were often easy to see when you measured and usually simple to fix.
But it was also tough work in the org. It was tedious, time-consuming, and involved a lot of experimental comp sci work. Plus, it was a cost center (teams had to give up some of their budget for perf engineering support) and even though we had racks and racks of gear for building and testing end-to-end systems, what most dev teams wanted from us was to give them all our scripts and measurement tools to "do it themselves" so they didn't have to give up the budget.
mikepurvis|2 months ago
PunchyHamster|2 months ago
> But it was also tough work in the org. It was tedious, time-consuming, and involved a lot of experimental comp sci work. Plus, it was a cost center (teams had to give up some of their budget for perf engineering support) and even though we had racks and racks of gear for building and testing end-to-end systems, what most dev teams wanted from us was to give them all our scripts and measurement tools to "do it themselves" so they didn't have to give up the budget.
Misaligned budgets and goals are the bane of good engineering. I've seen some absolutely stupid stuff, like outsourcing the hosting of a simple site to us, because the client would rather hire a 3rd party to buy a domain and put a simple site there (some advertising) than deal with their own security guys and host it on their own infrastructure.
"It's a cost center." "So is fucking HR, why don't you fire them?" "Uh, I'll ignore that, please just invoice anything you do to other teams." ... "Hey, they bought a cloud solution that doesn't work / that they can't figure out, can you help them?" "But we HAVE stuff that does that cheaper and easier, why didn't they come to us?" "Oh, they thought the cloud would be cheaper and just work after a 5-minute setup."
loeg|2 months ago
There will be huge mistakes occasionally, but mostly it is death by a thousand cuts -- it's easy to commit a 0.1% regression here or there, and there are hundreds of other engineers per performance engineer. Clawing back those 0.1% losses a couple times per week over a large deployed fleet is worthwhile.
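The death-by-a-thousand-cuts arithmetic is easy to sketch; the numbers below (two 0.1% regressions a week for a year) are illustrative, not from any real fleet:

```python
# Sketch: how 0.1% regressions compound. Committing n independent 0.1%
# slowdowns multiplies throughput by 0.999**n, so small losses add up
# across a large fleet even though no single change looks alarming.

def remaining_throughput(n_regressions: int, each: float = 0.001) -> float:
    """Fraction of original throughput left after n small regressions."""
    return (1.0 - each) ** n_regressions

# Two 0.1% regressions a week for a year:
loss = 1.0 - remaining_throughput(104)
print(f"{loss:.1%}")  # ~9.9% of fleet capacity gone
```

At fleet scale, clawing back that ~10% is the performance engineer's whole job.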
deng|2 months ago
amlib|2 months ago
Because it's a recent 20TB HDD, read speeds approach 250MB/s. I've also specifically partitioned the beginning of the disk just for games, so it can sustain full transfer speeds without files falling onto the slower tracks; the rest of the disk is partitioned for media files that won't care much about the speed loss. It's honestly fine for the vast majority of games.
superkuh|2 months ago
SSD sizes are still only equal to the HDD sizes that were available and common in 2010 (a couple of TB). SSD size increases (and availability and price improvements) in consumer form factors have entirely stopped. There is no more progress for SSDs because quad-level cells are as far as the charge-trap tech can be pushed, and most people no longer own computers. They have tablets or phones, or if they have a laptop it has 256GB of storage and everything is done in the cloud or with an octopus of (small) externals.
robin_reala|2 months ago
PoignardAzur|2 months ago
bombcar|2 months ago
JohnBooty|2 months ago
154GB was the product of massive asset duplication, as opposed to 23GB being the product of an optimization miracle. :)
How did it get so bad on PC?
Well, it wasn't always so crazy. I remember it being reasonable closer to launch (almost 2 years ago), more like ~40-60GB. Since then, the devs have been busy: there has been a LOT of reworking and a lot of new content, and the PC install size grew gradually rather than suddenly.
This was probably impacted to some extent by the discontinued game engine they're using. Bitsquid/Stingray was discontinued partway through HD2 development and they continued on with it rather than restarting production entirely.
https://en.wikipedia.org/wiki/Bitsquid
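For anyone curious how you'd even measure duplication like this, a rough sketch is to hash file contents and count the bytes spent on repeat copies. This is a generic illustration, not Arrowhead's tooling:

```python
# Sketch: measure install-size bloat from duplicated assets by hashing
# every file's contents and summing the bytes taken by copies beyond
# the first occurrence.

import hashlib
from pathlib import Path

def duplicated_bytes(root: str) -> int:
    """Bytes spent on files whose exact contents appear more than once."""
    seen: set[str] = set()
    wasted = 0
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest in seen:
            wasted += path.stat().st_size  # another copy of a known asset
        else:
            seen.add(digest)
    return wasted
```

Worth noting that the duplication was presumably deliberate in the first place: copying assets next to the levels that use them keeps HDD reads sequential, so deduplicating trades that locality for install size.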
aeve890|2 months ago
You should look at COD install sizes and its almost weekly, ridiculously huge "updates". 150GB for a first install is almost generous compared to most AAA games.
jeffwask|2 months ago
seivan|2 months ago
[deleted]
jsheard|2 months ago
code_for_monkey|2 months ago
Ekaros|2 months ago
LtWorf|2 months ago
bombcar|2 months ago
On phones, I bet you'd see some more effort.
jjk166|2 months ago
bee_rider|2 months ago
pdntspa|2 months ago
tehjoker|2 months ago
Very fast decompression often means low compression or multicore. I have seen libjpeg-turbo vastly outperform raw disk reads, though.
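The trade-off is easy to see with a general-purpose codec; using Python's zlib as a stand-in (not what any particular game ships):

```python
# Sketch: higher compression levels shrink the payload more but cost
# more CPU at compression time, while decompression stays cheap and
# produces identical bytes at every level.

import zlib

data = b"asset chunk " * 10_000       # highly compressible stand-in
fast = zlib.compress(data, level=1)   # fast to compress, larger output
small = zlib.compress(data, level=9)  # slow to compress, smaller output

assert zlib.decompress(fast) == zlib.decompress(small) == data
assert len(small) <= len(fast) < len(data)
```

Games care mostly about the decompress side, which is why a fast decoder reading a smaller file can beat raw disk reads.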
mrguyorama|2 months ago
Which is the primary problem: computers are so varied and so ever-changing that if you are optimizing without hard data from your target hardware, you aren't optimizing, you're just doing random shit.
Add to that, game devs are sometimes just dumb. Titanfall 1 shipped with tens of gigabytes of uncompressed audio, for "performance", which is horse shit. It also turns out they might have been lying entirely: Titanfall 1 was made on the Source engine, which does not support the OGG audio format their audio files were in, so they decompressed them at install time. They could have just used a different audio file format.
nerdjon|2 months ago
Sure, they may lose some sales, but I have never seen numbers on how much it really impacted sales.
Also, on the disk side, I can't say I have ever looked at how much space a game requires before buying it. If I need to clear out some stuff, I will. Especially with it not being uncommon for a game to be in the 100GB realm already.
That all being said, I am actually surprised by the 11% using mechanical hard drives. I figured that NVMe would be a lower percentage and many would be using SATA SSDs... but I figured the share of machines capable of running modern games in the first place that still have mechanical drives would be far lower.
I do wonder how long it will be until we see games just saying they are not compatible with mechanical drives.
onli|2 months ago
literallywho|2 months ago
XorNot|2 months ago
At any given point, if it wasn't vital to shipping and wasn't immediately breaking, it could be backburnered.
Messing with asset loading is probably a surefire way to risk bugs and crashes, so I suspect this was mostly waiting on proving the change didn't break everything (and Helldivers 2 has had a lot of seemingly small changes break other random things).
NBJack|2 months ago
The latest Ratchet and Clank, the poster child used in part to advertise the SSD speed advantage, suffers on traditional hard drives as well in the PC port. Returnal is in the same boat. Both were originally PS5 exclusives.
mikepurvis|2 months ago
By comparison a SATA III port caps out at 6Gbps (750 MB/s), and first generation NVMe drives ("gen 3") were limited to 3500 MB/s.
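As a sanity check on those figures: 6 Gbps is SATA's raw line rate, and its 8b/10b encoding means only 8 of every 10 bits carry data, so usable throughput is closer to 600 MB/s than the naive 750 MB/s.

```python
# Back-of-envelope arithmetic for the interface ceilings mentioned above.

SATA3_GBPS = 6
naive_mb_s = SATA3_GBPS * 1000 / 8           # 750.0: raw bits / 8
usable_mb_s = SATA3_GBPS * 1000 * 8 / 10 / 8  # 600.0: after 8b/10b encoding
nvme_gen3_mb_s = 3500                        # typical gen-3 x4 drive ceiling

ratio = nvme_gen3_mb_s / usable_mb_s         # NVMe gen 3 is ~5.8x SATA's
print(naive_mb_s, usable_mb_s, ratio)        # usable ceiling
```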
shantara|2 months ago
garaetjjte|2 months ago
root_axis|2 months ago
Cthulhu_|2 months ago
I believe this led to a huge wave of 'laziness' in game development, where framerate wasn't high up in the list of requirements. And it ended up with some games where neither graphics fidelity nor frame rate was a priority (one of the recent Pokemon games... which is really disappointing for one of the biggest multimedia franchises of all time).
ratelimitsteve|2 months ago
unknown|2 months ago
[deleted]
hinkley|2 months ago
I don’t think it’s always still the case but for more than ten years every release of OSX ran better on old hardware, not worse.
Some people think the problem was MS investing too eagerly into upgrading developer machines routinely, giving them a false sense of what “fast enough” looked like. But the public rhetoric was so dismissive that I find that pretty unlikely. They just didn’t care. Institutionally.
I’m not really into the idea of Helldivers in the first place but I’m not going to install a 150GB game this side of 2040. That’s just fucking stupid.
PunchyHamster|2 months ago
Also, if the goal was to improve things for a small minority, they could've just split it off into a free DLC, the way some games do with 4K texture packs.
oreally|2 months ago
kasabali|2 months ago
> These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns
Unfortunately it's not only game development; all of modern society seems to operate like this.
londons_explore|2 months ago
Twenty years on, and somehow that's still 'big'.
Computing progress disappoints me.
neuroelectron|2 months ago
whalesalad|2 months ago
Wait till you find out what engine this game is made in. https://80.lv/articles/helldivers-ii-was-built-on-an-archaic...
behringer|2 months ago