How can Skyrim be so unoptimized? Modders do better job than Bethesda (bethsoft.com)
105 points by jhack on Dec 23, 2011 | hide | past | favorite | 56 comments


Wow. The forum is full of so many really smart people. And the Bethesda developers are so stupid. Maybe all the people in the forum should start a company and acquire the rights to develop Elder Scrolls VI. /sarcasm

Most likely explanation: PC port was crashing, they disabled optimizations, it stopped crashing. Been there, done that. Drop dead release date approaching, no time to find the root cause. Maybe the PC game already crashes enough that the people who pick up this patch don't notice that it's crashing more now.

Dunno. The forum thread just makes my head spin with naivete. As a former game developer, I'm reminded why I could only ever read forums with one squinted eye open, head turned to the side.


Inlining getters should not cause crazy stability bugs.

I'm also a former game developer and I see both sides of it.

To be honest, one thing I do think is true: it's probably not some hacker's fault - the marketing and politics BS that goes on around the choice and support of platforms when making AAA games is really horrible sometimes.


Even if you assume the compiler is perfect and only making semantically valid transformations, changing performance characteristics often changes your odds of hitting race conditions.
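
To make that concrete, here's a minimal sketch (hypothetical code, not from the game) of racy code that tends to "work" in an unoptimized build but can hang once the optimizer is allowed to assume there are no data races:

  #include <thread>

  // 'done' is a plain bool shared between threads: a data race, i.e. undefined
  // behaviour. Unoptimized builds typically reload it every iteration and
  // happen to work; an optimized build may hoist the load out of the loop and
  // spin forever. The fix is std::atomic<bool> (or proper synchronization).
  bool done = false;

  void worker() {
      while (!done) { /* spin */ }
  }

  int main() {
      std::thread t(worker);
      done = true;   // racy write, no synchronization
      t.join();
  }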


throw new BathwaterException(new Baby()) ?


No no no. Always throw by value, and catch by reference. Like this:

  throw BathwaterException(Baby());
Wait, we are talking about C++ programming, right? ;)
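
For the curious, a minimal sketch of that idiom in full (the types are made up, obviously):

  #include <stdexcept>
  #include <iostream>

  // Hypothetical types, purely to show throw-by-value / catch-by-reference.
  struct Baby {};

  struct BathwaterException : std::runtime_error {
      explicit BathwaterException(Baby)
          : std::runtime_error("baby went out with the bathwater") {}
  };

  void drain() {
      throw BathwaterException(Baby());           // throw by value
  }

  int main() {
      try {
          drain();
      } catch (const BathwaterException& e) {     // catch by (const) reference
          std::cerr << e.what() << '\n';
      }
  }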


One interesting thing is that in a previous Bethesda game, Morrowind, converting the FPU instructions to SSE instructions actually alleviated most of the game crashes...

http://timeslip.users.sourceforge.net/exeopt.html


+1, I came into this thread to say exactly this. Enabling global optimizations has a lot of far-reaching consequences, especially for games, where optimizations can change numerical results and lead to nightmarish-to-debug glitches.


And nowadays you can throw in multithreading problems into the mix.


Optimizations off in the release build would be one thing, but it's like this in the latest patch. Over a month after release and 4+ patches later, still needing optimizations disabled does point to a bit of incompetence.


We game developers look forward to seeing your magnum opus. Be sure to let us know how your discussions regarding optimizations and ship dates go with your publisher.


> Rewriting some x87 FPU code and inlining a whole ton of useless getter functions along the critical paths because the developers at Bethesda, for some reason, compiled the game without using any of the optimization flags for release build.

That sounds appalling. This is not some tricky algorithm-level optimization - they seem to have simply disabled compiler optimizations, or forgotten to re-enable them for the final release. Inlining a function should have zero impact on the QA process (some try to explain the lack of optimization by the need to 'fix' bugs). If it does, then there is some sort of memory corruption bug somewhere, and the code should fail QA anyway.

Ensuring compiler optimizations are active would be the first low-hanging-fruit thing to come to anyone's mind when considering performance. The fact that it was 'forgotten' means that no one even considered performance during the whole development process. Not even in the "let's leave it to the compiler" form.


I think you're making the same mistakes as people on that thread. It's a long stretch from "there's some x87 assembly instructions and some function calls that could be inlined" to "they compiled with optimisations off".

It's entirely possible, for example, that the relevant code came from a 3rd party library that the game was statically linked against; or that they had to disable optimisations in parts of the code because they were found to cause bugs elsewhere.

Creating a rich interactive world the size of Skyrim is a considerable technical achievement, so I certainly don't think you can accuse the developers of incompetence.


>disable optimisations in parts of the code because they were found to cause bugs elsewhere.

Well, that is exactly what I meant. Compiler optimizations by themselves do not cause bugs - they reveal bugs. They are specifically designed to be semantically equivalent[1]. Without compiler optimizations the bugs may not manifest themselves, or may only manifest in particular conditions that QA may not spot - but the buggy/unsafe code is still present. Any functional discrepancy between compiler-optimized and non-optimized code should be cause for concern.
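
For example (a sketch of a well-known case, not code from the game): code that quietly relies on signed integer overflow can pass QA in an unoptimized build and break once the optimizer exploits the fact that such overflow is undefined behaviour.

  #include <climits>
  #include <cstdio>

  // Buggy check that relies on signed overflow wrapping around.
  // Unoptimized builds often wrap and return false for x == INT_MAX;
  // with optimizations the compiler may fold 'x + 1 > x' to 'true',
  // because signed overflow is undefined behaviour.
  bool canIncrement(int x) {
      return x + 1 > x;
  }

  int main() {
      std::printf("%d\n", canIncrement(INT_MAX) ? 1 : 0);
  }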

>Creating a rich interactive world the size of Skyrim is a considerable technical achievement,

I wouldn't say that the size is a technical achievement - the credit there would mostly go to the artists, and they did an amazing job. Although it would certainly not be possible without a decent-quality engine.

Don't get me wrong. I really like Skyrim, and have spent a lot of time with it. But I simply get the feeling that performance receives less and less attention in modern products.

[1] http://en.wikipedia.org/wiki/Compiler_optimization


Compiler optimizations by themselves do not cause bugs - they reveal bugs.

This is incorrect. Optimizations do often reveal bugs rather than causing them, but turning on optimizations can also cause bugs in itself. See for example:

http://gcc.gnu.org/bugzilla/buglist.cgi?short_desc=optimizat...


These are due to bugs in the compiler, not an inherent property of optimizations. It's quite rare for anyone to hit an optimizer bug, and when it happens one can always selectively disable the buggy optimization pass. Disabling all optimizations kinda overdoes it.


Compiler optimisations aren't always designed to be semantically equivalent - for instance options which control floating point behaviour, like -ffast-math in gcc.
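
A tiny sketch of why such flags aren't semantically neutral: IEEE floating-point addition isn't associative, and -ffast-math permits the compiler to reassociate, so the same source can round differently depending on build flags.

  #include <cstdio>

  int main() {
      double a = 1e16, b = -1e16, c = 1.0;
      // With strict IEEE semantics these two lines print different values;
      // under -ffast-math the compiler may reassociate and change either one.
      std::printf("%g\n", (a + b) + c);   // 1
      std::printf("%g\n", a + (b + c));   // 0
  }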

That aside, sometimes compilers themselves have bugs, as others have pointed out (I found one once involving a combination of Python, Boost, exceptions and the Intel C++ compiler - which only happened at -O3, not at -O2). Or you might not be able to use certain settings because they would be incompatible with some third-party library that you don't have the source to.

Re: the size, the technical achievement is in managing all the content, how you interact with it and how it interacts with itself, etc. It's doing quite a lot of I/O to pull in the right assets at the right time - not to mention some computational geometry to figure out which are the right assets; running lots of AI for the creatures & NPCs; balancing memory usage between RAM & VRAM; simulating a night & day cycle, with weather too; rendering it all in (usually!) less than 30 milliseconds; and more besides. I certainly don't mean to play down the achievement of the artists involved, but don't underestimate the technical side either!


> Compiler optimizations by themselves do not cause bugs - they reveal bugs.

Incorrect, and naive in the extreme.

> They are specifically designed to be semantically equivalent

Unconvincing. Machines are specifically designed to not blow up, and yet they do.

> But I simply get the feeling that performance receives less and less attention in the modern products.

I agree, but often we gain something in terms of the scope of games that a team can create. If you want to make a movie, you probably shouldn't start by designing the camera.


>Compiler optimizations by themselves do not cause bugs - they reveal bugs. They are specifically designed to be semantically equivalent.

> Any functional discrepancy between compiler-optimized and non-optimized code should be cause for concern.

I agree that functional discrepancies are a cause for concern, but compilers have bugs too (and games programmers tend to push the limits).


> Well that is exactly what I meant. Compiler optimizations by themselves do not cause bugs - they reveal bugs.

Compilers are programs too. They have bugs. In an ideal world, they do not generate broken code. In the real world, they do.


So someone left the compiler optimizations off. Anyway, I reckon that during 3 years of development someone should have realized.

Or it could be raw assembly code or inline assembly. These would usually not be optimized.


> Ensuring compiler optimizations are active would be the first low-hanging-fruit thing to come to anyone's mind when considering performance. The fact that it was 'forgotten' means that no one even considered performance during the whole development process. Not even in the "let's leave it to the compiler" form.

This is a silly conclusion. More likely it means that either it was a conscious decision, or at the last minute the ball was dropped and those who should have signed off on this decision didn't even know about it.

How can you go from "a serious performance error/tradeoff shipped" to "no one even considered performance during the whole development process" (my emphasis)?


SSE is not automatically faster than x87. GCC compiles to x87 on x86-32 by default, even with -O3.


False.

For the type of vector arithmetic you do in games, SSE (not to mention later flavors like SSE2/3/etc.) is a big win. SIMD instructions for number crunching are huge.

Your example there is more GCC being shitty than anything else.


You're both wrong. iso-8859-1 is wrong in saying that it's not inherently faster: individual SSE instructions are not necessarily faster (in a latency sense) than the x87 equivalents, but the cleaner register architecture (no stack) means that they can be parallel-issued better by the CPU, and code generated to use them does less spilling and filling to memory. SSE is just plain better, though not overwhelmingly so.

And angersock is missing the point: you can't take scalar code and rebuild it into SIMD (except in the very limited, never-works-as-well-as-you-think-it-should auto vectorization features in modern compilers), the parallelism needs to be designed in. That's not possible here without a rewrite of the game engine.


So, I specifically said that SSE was better for the vector arithmetic that games do. Almost any game you pick will, in the source somewhere, have Vector3::Add(), Vector4::Dot(), etc. functions.

Scalar code is not trivially fixed by using SSE, true, but the majority of really obnoxious math being done (skinning, vector arithmetic, etc.) should be really easy to make really fast.
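
To make that concrete, here's a hedged sketch (hypothetical function names) of a Vector4 dot product in plain scalar code versus basic SSE intrinsics:

  #include <xmmintrin.h>   // SSE intrinsics

  // Scalar version: four multiplies and three dependent adds.
  float dotScalar(const float a[4], const float b[4]) {
      return a[0]*b[0] + a[1]*b[1] + a[2]*b[2] + a[3]*b[3];
  }

  // SSE version: one packed multiply, then a horizontal reduction.
  // (SSE3/SSE4 add nicer horizontal ops; this sticks to plain SSE.)
  float dotSSE(const float a[4], const float b[4]) {
      __m128 va = _mm_loadu_ps(a);
      __m128 vb = _mm_loadu_ps(b);
      __m128 m  = _mm_mul_ps(va, vb);                        // {a0b0, a1b1, a2b2, a3b3}
      __m128 h  = _mm_add_ps(m, _mm_movehl_ps(m, m));        // {a0b0+a2b2, a1b1+a3b3, ..}
      __m128 s  = _mm_add_ss(h, _mm_shuffle_ps(h, h, 1));    // total in lane 0
      return _mm_cvtss_f32(s);
  }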


Right, which is missing the point. The use case at hand is rebuilding some particular part of the engine (honestly it's unclear to me exactly what was done) with different flags, not reworking the vector librar{y,ies} to use SSE.

SSE wouldn't be a flags issue. Building a vectorized SSE library in "debug mode" would still produce vector instructions.


Because Bethesda has deadlines and P&L statements, and modders don't. It's as simple as that.

E.g., the Macintosh launched with a hard-crash bug in the Clipboard code in ROM [1]. When you're struggling to meet a tough date for a huge project, things fall through the cracks. They fixed it later with on-disk software.

[1] http://www.folklore.org/StoryView.py?project=Macintosh&s...


Bingo. The questions one asks before launch are "Does it work?" and "Will it sell?". Performance is part of the former only inasmuch as it impacts the latter. Far more important in those final days is the coarse QA, not tuning.

Look down that forum post for all the people asking for help getting it working. Every one of those would have been a lost sale if this were in the shipping product.

And who knows: maybe this was off for a reason. Maybe they hit some voodoo late in the process which produced a crash bug on one of their 19 test systems that didn't occur with a debug build. So one of the engineers tries an unoptimized build and it works. Slightly slower is better than crashing, so they pulled the trigger and shipped it.

Shipping software only looks easy when you look only at code.


I think what the people in that (painful to read) thread failed to realize is how the QA process works. I'm guessing that there would need to be significant regression testing for some of those optimizations. When you are killing yourselves to hit a date, that's the last thing you want to worry about considering things are already working well enough.


Most likely they were explicitly disabled to work around problems.


Not much more to say than this: http://forums.bethsoft.com/topic/1321675-how-can-skyrim-be-s...

Bethesda are releasing patches, but right now they're fixing bugs rather than optimising performance and I'm presuming that it's a slow process because they need to test on all platforms before releasing patches.


That kind of attitude from "fans" must be difficult to deal with. Skyrim is a really great game, and calling the developers lazy is missing the point entirely.

Having worked on products (much, much smaller than Skyrim of course) targeted at different platforms (browsers and mobile OSes) it seems pretty rational to me to make the PC version a console port especially given the breakdown of users on each platform.


Personally I do not think this attitude is hard to deal with. You just have to, despite their tone, not take it personally and calmly thank them for reporting the problems and say that you are working on them (if that is true, which it should be). This is constructive criticism delivered in an ugly way.

Complaining customers are much better than silent ones or apologizing fanboys, since the complainers help you build a better product for everyone.


I upvoted you but I felt strongly that I should reply too. What you say makes a ton of sense and is worth keeping in mind, especially "constructive criticism delivered in an ugly way". Thanks.


Indeed Skyrim is butter smooth at 1080p on my 360, even in Markarth and Whiterun. Maybe it's lacking a few bells and whistles compared to a PC powerhouse (whose GPU alone would cost more than my console) but I don't care. Given the performance of Oblivion and Fallout 3 (which was okay-ish) for an inconsistent visual quality (see checkerboard patterns in the hills) I would never have expected Skyrim to achieve such a level.

Secondly, the interface on the PC has been criticized, but on the 360 it's just fine. I bet on a PC it's a mere remap from buttons to keys and it just begs for a controller instead of keyboard/mouse.

I have no PC to compare with and honestly care less, but from what I hear it shows where (some) priorities lie.


1080p? Nope. It's 720p on consoles. Butter smooth? Many people would say that 25-30 FPS is a far cry from butter smooth, especially compared to a modern PC, which can achieve up to 120 FPS.

edit: About the cost of GPU; according to Tom's Hardware[1], even sub-$100 GPUs are capable of providing console-like performance (30 FPS, medium details, sub-HD resolution) with Skyrim.

[1] http://www.tomshardware.com/reviews/skyrim-performance-bench...


So what's the real deal? Either the game visibly skips frames in areas like Markarth and Whiterun or it does not. Either the game runs fine on mid-range hardware or it does not. If it runs fine on such cheap hardware, what's the big deal over this optimization stuff? What I've heard so far, though, is that:

- Skyrim has performance issues on PCs, even high-end ones

- Skyrim's UI is ridiculously contrived on the PC

I don't have the foggiest idea if it's true; I just seem to notice a trend that the PC does not get a first-class version but a port of sorts, a version built from a common denominator. As I said, I am very badly placed to compare anything since I have no PC to compare with. Still, I'd like to know what the reality is here.

For what it's worth, personally I'll take an RPG running at a constant 30fps any time over a game jumping between 15-60fps (which is what Oblivion did). (And you can pry Forza 4 and its constant 60fps from my cold, dead hands.) Besides, I could care less if it's upscaled from 720p to 1080p as long as it's not showing upscaling artifacts like aliasing (which it does not). I have a Panasonic 42" plasma TV and watch/play at a comfortable distance, so the difference between 720p (which is what the Apple TV outputs on HD movie rentals) and 1080p (which is what the Xbox outputs on Full-HD movie rentals) is arguably negligible.


I have a 2-year-old 'nice' whitebox with a nice-but-not-truly-awesome video card from the time (Radeon 5770? Catalyst software just lists '5700 series') and it runs fine. Perhaps not 60fps, but it's very rare that I'll notice frame skipping or slowdown, and I've got the quality settings maxed. So anecdotally, the game doesn't run horribly on all PCs in terms of performance. There are a number of other issues which suck from being a console port, though, like the menu system. Dragons being trivial to kill is also a bit sad, but that's not the console port's fault :)


>Skyrim is butter smooth at 1080p on my 360

It's rendered internally at 1280x720 and then upscaled (pretty much all Xbox/PS3 games do this), which is of course a legitimate approach and looks good. In contrast, on a PC it is actually rendered at 1080p without upscaling, so the PC ends up rendering more than twice as many pixels even though both are set to 1080p in the options. So direct PC-console performance comparisons are a bit tricky.


I run it at 1440x900 on a Core2Quad (Q6600), 8GB RAM and an AMD HD4850 GPU. For the most part it's pretty smooth at high settings with FXAA turned on, but there are times when there's a bit of stutter or freezing especially when I quicksave (something you need to do often since Skyrim is so crash-prone).

The reason I agree that Bethesda are lazy (probably more accurate to say apathetic) with fixing the PC version is that they've been doing this for years with previous Elder Scrolls games and Fallout 3 where they've just allowed the "safety net" of the modding community to do their work for them. Of course, that's only possible because of the modding tools they provide, which apparently are great since they get the job done. But, I find that sort of attitude rather insulting as a PC player. Yes, consoles have gone mainstream and are more popular than PC gaming probably by orders of magnitude, but that doesn't mean that PC gaming has declined and that the needs of PC gamers shouldn't be taken seriously.


Where do you have the technical details on that?

I have a considerably larger monitor and have been going past that, but would love to know more about detailed specs like this to better tune the game


DigitalFoundry does some technical comparisons of console releases, eg Skyrim triple-platform Face-off: http://www.eurogamer.net/articles/digitalfoundry-face-off-sk...


Somewhere in the xbox vs ps3 religious wars it was discovered that you can deduce the internal rendering resolution via a technique called "pixel-counting". This involves very bored people looking at the individual pixels with a magnifying glass & counting jagged edges.

The info is solid, but it's of no real use beyond the PS3-Xbox wars.

Lots of info over here including res for pretty much every game: http://forum.beyond3d.com/showthread.php?t=46241

Or a less tedious summary w/ basic math: http://www.gamerawr.com/2007/09/28/halo-3-only-runs-at-640p/


1920 × 1080 = 2,073,600 pixels

1280 × 720 = 921,600 pixels

So it renders a lot more than twice as many pixels.


It renders 12.5% more than twice as many pixels. Maybe that's a lot; my brain doesn't do relative quantities to better than 1 significant figure.


I don’t know what it is, this attitude seems so common among gamers.


There are some great people at Bethesda; creating an engine capable of what Skyrim does is no small feat, so I'm sure they are perfectly aware of what they are doing.

It's hard to believe that they 'forgot' to turn on optimizations or to optimize critical code paths; they simply didn't, because they are probably fixing show-stopper bugs, which is much more important (I'm saying this as a game developer).

Part of it, I believe, is that we are spoiled by title updates. They allow developers to ship games earlier (before extensive QA testing or bug fixes), but at the same time it's what the fans want and makes sense financially for the company.

So I doubt these modders can do a better job than the professionals at Bethesda, if anything they can do the trivial things that would take any novice programmer a day with a profiler.


>How can Skyrim be so unoptimized?

Time. Bugs and features coupled with (tight) deadlines will push aggressive optimizations "for later". This is especially true if the product is performing adequately and nobody really wants to mess with it lest they introduce unknown regression bugs.


It's surprising that they didn't have SSE enabled to begin with.


I doubt code optimizations were off, it's more likely that the offending functions were not declared inline and not visible across translation units.

I've seen this many times in games that definitely were optimized -- some trivial constructor exists out of line because it was forgotten about but then was called thousands of times per frame. Sometimes they just don't show up on the profiler.
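
A minimal sketch of what that looks like (hypothetical names): the constructor is declared in the header but defined in a .cpp, so callers in other translation units pay a real function call for something that should compile away entirely.

  // --- vec3.h (hypothetical) ---
  struct Vec3 {
      float x, y, z;
      Vec3();                                  // only declared here
  };

  // --- vec3.cpp ---
  Vec3::Vec3() : x(0.0f), y(0.0f), z(0.0f) {}
  // Defined out of line: without link-time code generation, every 'Vec3 v;'
  // in another translation unit is an actual call, even in optimized builds.

  // The fix is to define it in the header so any caller can inline it:
  //   struct Vec3 {
  //       float x, y, z;
  //       Vec3() : x(0.0f), y(0.0f), z(0.0f) {}
  //   };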


Another huge oversight, which they only fixed 2 days ago, was that the game only enabled 2GB of virtual address space on 64-bit OSes. This made most of the 64-bit OS people I've spoken to crash every few minutes. How did that get past QA?

Of course, it could be fixed with a patch to the EXE that someone released - until Bethesda encrypted the EXE (to prevent piracy) and broke the only fix people had had for a month.


A 32-bit executable can use 2 GB on both 32- and 64-bit OSes; there's a linker flag (/LARGEADDRESSAWARE) that lets it use 3 GB on a 32-bit OS booted with /3GB, or 4 GB on a 64-bit OS, provided you're not playing any dirty tricks with your pointer bits. More than that and you need to recompile the EXE for 64-bit, which is far from trivial, and uncommon for games - especially games that also run on consoles with 512 MB of RAM.


Maybe I misunderstood the reason, but this patch fixed it for me and everyone else: http://www.ntcore.com/4gb_patch.php

It seems to just flip a simple flag in the EXE.
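
That flag is the IMAGE_FILE_LARGE_ADDRESS_AWARE bit in the PE/COFF file header. A minimal sketch (assuming a well-formed 32-bit PE on a little-endian machine, no Windows headers needed) of checking whether it is set:

  #include <cstdint>
  #include <cstdio>

  // Reads the COFF header Characteristics field and tests the
  // IMAGE_FILE_LARGE_ADDRESS_AWARE bit (0x0020). Sketch only: it does no
  // validation of the DOS/PE signatures beyond locating the field.
  int main(int argc, char** argv) {
      if (argc < 2) { std::puts("usage: laacheck <exe>"); return 1; }
      std::FILE* f = std::fopen(argv[1], "rb");
      if (!f) { std::perror("fopen"); return 1; }

      std::uint32_t e_lfanew = 0;
      std::fseek(f, 0x3C, SEEK_SET);                  // offset of the PE header offset
      std::fread(&e_lfanew, sizeof e_lfanew, 1, f);

      std::uint16_t characteristics = 0;
      std::fseek(f, static_cast<long>(e_lfanew) + 4 + 18, SEEK_SET);  // past "PE\0\0" + 18 bytes of COFF header
      std::fread(&characteristics, sizeof characteristics, 1, f);
      std::fclose(f);

      std::printf("LARGE_ADDRESS_AWARE: %s\n",
                  (characteristics & 0x0020) ? "set" : "not set");
      return 0;
  }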


Yea, because tweaking software is so much harder than writing an entire massive-world RPG.


I don't know much about reverse engineering - could someone explain how you can recompile machine code with new compiler flags? And change getters and setters to be inline?


Am I the only one who thought of that story about the programmer who occasionally checked in unnecessary loops, so that when performance bonus time came around he could just take them out?



