
How can Skyrim be so unoptimized? Modders do better job than Bethesda

105 points | jhack | 14 years ago | forums.bethsoft.com

56 comments

[+] newobj|14 years ago|reply
Wow. The forum is full of so many really smart people. And the Bethesda developers are so stupid. Maybe all the people in the forum should start a company and acquire the rights to develop Elder Scrolls VI. /sarcasm

Most likely explanation: the PC port was crashing, they disabled optimizations, it stopped crashing. Been there, done that. Drop-dead release date approaching, no time to find the root cause. Maybe the PC game already crashes enough that the people who pick up this patch don't notice that it's crashing more now.

Dunno. The forum thread just makes my head spin with naivete. As a former game developer, I'm reminded why I could only ever read forums with one squinted eye open, head turned to the side.

[+] davedx|14 years ago|reply
Inlining getters should not cause crazy stability bugs.

I'm also a former game developer and I see both sides of it.

To be honest, I think it's probably not some individual hacker's fault: the marketing and politics BS that goes on around platform choice and support when making AAA games is really horrible sometimes.

[+] anaisbetts|14 years ago|reply
+1, I came into this thread to say exactly this. Enabling global optimizations has a lot of far-reaching consequences, especially for games, where optimizations can change numerical results and cause glitches that are nightmarish to debug.
[+] floody-berry|14 years ago|reply
Optimizations off in the release build would be one thing, but it's still like this in the latest patch. Over a month of being out and 4+ patches, and still needing optimizations disabled, does point to a bit of incompetence.
[+] nickolai|14 years ago|reply
> Rewriting some x87 FPU code and inlining a whole ton of useless getter functions along the critical paths because the developers at Bethesda, for some reason, compiled the game without using any of the optimization flags for release build.

That sounds appalling. This is not some tricky algorithm-level optimization - they seem to have simply disabled compiler optimizations. Or forgot to re-enable them for the final release. Inlining a function should have zero impact on the QA process (some try to explain the lack of optimization by the need to 'fix' bugs). If it does, then there is some sort of memory corruption bug somewhere, and the code should fail QA anyway.

Ensuring compiler optimizations are active would be the first low-hanging-fruit thing to come to anyone's mind when considering performance. The fact that it was 'forgotten' means that no one even considered performance during the whole development process. Not even in the "let's leave it to the compiler" form.

[+] vilya|14 years ago|reply
I think you're making the same mistakes as people on that thread. It's a long stretch from "there's some x87 assembly instructions and some function calls that could be inlined" to "they compiled with optimisations off".

It's entirely possible, for example, that the relevant code came from a 3rd party library that the game was statically linked against; or that they had to disable optimisations in parts of the code because they were found to cause bugs elsewhere.

Creating a rich interactive world the size of Skyrim is a considerable technical achievement, so I certainly don't think you can accuse the developers of incompetence.

[+] JabavuAdams|14 years ago|reply
> Ensuring compiler optimizations are active would be the first low-hanging-fruit thing to come to anyone's mind when considering performance. The fact that it was 'forgotten' means that no one even considered performance during the whole development process. Not even in the "let's leave it to the compiler" form.

This is a silly conclusion. More likely it means that either it was a conscious decision, or at the last minute the ball was dropped and those who should have signed off on this decision didn't even know about it.

How can you go from "a serious performance error or tradeoff shipped" to "no one even considered performance during the *whole development process*" (my emphasis)?

[+] iso8859-1|14 years ago|reply
SSE is not automatically faster than x87. GCC compiles to x87 on x86-32 by default, even with -O3.
[+] pak|14 years ago|reply
Because Bethesda has deadlines and P&L statements, and modders don't. It's as simple as that.

E.g., the Macintosh launched with a hard-crash bug in the Clipboard code in ROM [1]. When you're struggling to meet a tough date for a huge project, things fall through the cracks. They fixed it later with on-disk software.

[1] http://www.folklore.org/StoryView.py?project=Macintosh&s...

[+] ajross|14 years ago|reply
Bingo. The questions one asks before launch are "Does it work?" and "Will it sell?". Performance is part of the former only inasmuch as it impacts the latter. Far more important in those final days is the coarse QA, not tuning.

Look down that forum post for all the people asking for help getting it working. Every one of those would have been a lost sale if this were in the shipping product.

And who knows: maybe this was off for a reason. Maybe they hit some voodoo late in the process which produced a crash bug on one of their 19 test systems that didn't occur with a debug build. So one of the engineers tries an unoptimized build and it works. Slightly slower is better than crashing, so they pulled the trigger and shipped it.

Shipping software only looks easy when you look only at code.

[+] bstar77|14 years ago|reply
I think what the people in that (painful to read) thread failed to realize is how the QA process works. I'm guessing that there would need to be significant regression testing for some of those optimizations. When you are killing yourselves to hit a date, that's the last thing you want to worry about considering things are already working well enough.
[+] lt|14 years ago|reply
Most likely they have been explicitly disabled to work around problems.
[+] alimbada|14 years ago|reply
Not much more to say than this: http://forums.bethsoft.com/topic/1321675-how-can-skyrim-be-s...

Bethesda are releasing patches, but right now they're fixing bugs rather than optimising performance and I'm presuming that it's a slow process because they need to test on all platforms before releasing patches.

[+] spacemanaki|14 years ago|reply
That kind of attitude from "fans" must be difficult to deal with. Skyrim is a really great game, and calling the developers lazy is missing the point entirely.

Having worked on products (much, much smaller than Skyrim of course) targeted at different platforms (browsers and mobile OSes) it seems pretty rational to me to make the PC version a console port especially given the breakdown of users on each platform.

[+] lloeki|14 years ago|reply
Indeed Skyrim is butter smooth at 1080p on my 360, even in Markarth and Whiterun. Maybe it's lacking a few bells and whistles compared to a PC powerhouse (whose GPU alone would cost more than my console) but I don't care. Given the performance of Oblivion and Fallout 3 (which was okay-ish) for an inconsistent visual quality (see checkerboard patterns in the hills) I would never have expected Skyrim to achieve such a level.

Secondly, the interface on the PC has been criticized, but on the 360 it's just fine. I bet on a PC it's a mere remap from buttons to keys and it just begs for a controller instead of keyboard/mouse.

I have no PC to compare with and honestly couldn't care less, but from what I hear it shows where (some) priorities lie.

[+] ugh|14 years ago|reply
I don’t know what it is, this attitude seems so common among gamers.
[+] mrhyperpenguin|14 years ago|reply
There are some great people at Bethesda; creating an engine capable of what Skyrim does is no small feat, so I'm sure they are perfectly aware of what they are doing.

It's hard to believe that they 'forgot' to turn on optimizations or to optimize critical code paths; more likely they simply didn't because they were busy fixing show-stopper bugs, which is much more important (I'm saying this as a game developer).

Part of it, I believe, is that we are spoiled by title updates: they allow developers to ship games earlier (before extensive QA testing or bug fixing), but at the same time that is what the fans want, and it makes sense financially for the company.

So I doubt these modders can do a better job than the professionals at Bethesda; if anything, they can do the trivial things that would take any novice programmer a day with a profiler.

[+] macspoofing|14 years ago|reply
>How can Skyrim be so unoptimized?

Time. Bugs and features coupled with (tight) deadlines will push aggressive optimizations "for later". This is especially true if the product is performing adequately and nobody really wants to mess with it lest they introduce unknown regression bugs.

[+] pohungc|14 years ago|reply
It's surprising that they didn't have SSE enabled to begin with.
[+] maximilianburke|14 years ago|reply
I doubt code optimizations were off; it's more likely that the offending functions were not declared inline and not visible across translation units.

I've seen this many times in games that definitely were optimized: some trivial constructor exists out of line because it was forgotten about, but is then called thousands of times per frame. Sometimes these just don't show up in the profiler.

[+] acgourley|14 years ago|reply
Another huge oversight, which they only fixed two days ago, was that the game only enabled 2 GB of virtual memory even on 64-bit OSes. This made most 64-bit users I've spoken to crash every few minutes. How did that get past QA?

Of course, it could be fixed with a patch someone released for the EXE. Until they encrypted the EXE (to prevent piracy) and broke the only fix people had for a month.

[+] unsigner|14 years ago|reply
A 32-bit executable can use 2 GB on both 32- and 64-bit OSes; there's a linker flag that lets you use 3 GB, as long as you're not playing dirty tricks with your pointer bits. More than that and you need to recompile the EXE as 64-bit, which is far from trivial and uncommon for games, especially games that also run on consoles with 512 MB of RAM.
[+] sliverstorm|14 years ago|reply
Yea, because tweaking software is so much harder than writing an entire massive-world RPG.
[+] sgoranson|14 years ago|reply
I don't know much about reverse engineering; could someone explain how you can recompile machine code with new compiler flags? And change getters and setters to be inline?
[+] Omni5cience|14 years ago|reply
Am I the only one who thought of that story about the programmer who occasionally checked in unnecessary loops, so that when performance bonus time came around he could just take them out?