The Elder Scrolls V: Skyrim Performance and IQ Review @ [H]

FrgMstr

The Elder Scrolls V: Skyrim Performance and IQ Review - The latest Elder Scrolls installment is here, taking us to the frozen province of Skyrim in search of Dragons! We've checked out performance and image quality in this wildly anticipated game with current generation graphics card solutions on the market today, and we're ready to show you our results. You may just be surprised at what you see!
 
Thanks for the write-up. I admit I do still have some surprise to spare on AMD seemingly dropping the ball once again. Skyrim should not be posing such a challenge to their current top cards, and these results are ugly, no question.
 
My Acer's 6550M GPU handles it fine at 768p. No anti-aliasing though, it kills the fluidity. AF at full x16, some minor tweaks. Honestly, I can barely see myself worrying about the graphics here. The game is just so immersive. Not to mention they included one of Morrowind's soundtrack pieces in here. Ahhh, the nostalgia. Oh, did anyone notice the frequent change of FOV? It's really fucking annoying.
 
how come no cpu scaling tests? you mention on page five that you may be cpu limited but there is no testing to know for sure.
 
You are cpu limited, otherwise SLI would scale almost 100%. It does here when I use 1080p (and higher) and 8xSGSSAA. I guess Kyle needs an Ivy Bridge@6 GHz when it is out :D
 
You are cpu limited, otherwise SLI would scale almost 100%. It does here when I use 1080p (and higher) and 8xSGSSAA. I guess Kyle needs an Ivy Bridge@6 GHz when it is out :D

... And what cpu are you using?


I do think it would be interesting to see the results of using RadeonPro w/ the Oblivion profile - that's what I've been doing and it did allow me to bump things up to Ultra @ 6048x1080. You could also simply rename the .exe if you don't want to install RadeonPro. Without multi-GPU, Eyefinity wasn't playable at those settings. Being the hardware enthusiasts that we are, we should use any tools available to improve performance. At minimum, it should be mentioned so that multi-GPU users are aware of this fix.
 
... And what cpu are you using?


I do think it would be interesting to see the results of using RadeonPro w/ the Oblivion profile - that's what I've been doing and it did allow me to bump things up to Ultra (FXAA/No MSAA) @ 6048x1080. Without multi-GPU eyefinity wasn't playable at those settings.

8xSGSSAA. I am using a 2600K @ 4.3 GHz and at times it still holds me back a little bit. But very rarely now.

Unfortunately, there is a problem with the SLI profile with 8xSGSSAA. Sometimes usage drops to 50% without reason - I think it will be fixed soon. Anyway, without advanced AA features you cannot get proper use out of a highend setup in this game unless you play in Eyefinity or 3D.
 
... And what magical future cpu do you have that isn't limiting you in SLI?

how about an overclocked 2500/2600k to 4.6+? or one of those new socket 2011 cpus overclocked to 4.6+? there are cpus out that are faster than an i7 920 at 3.6.
 
Great write up. Quite eager to see what you all figure out in the .ini files.
 
Now we just need that high texture pack like the one from oblivion.

I am also surprised you guys didn't mention the horses and how they can scale any mountain with little to no issue. :D
 
Thanks for the write-up. I admit I do still have some surprise to spare on AMD seemingly dropping the ball once again. Skyrim should not be posing such a challenge to their current top cards, and these results are ugly, no question.

I wouldn't say that the game is a challenge for AMD's lineup (i.e. would you be saying this if the 580 performed the same?). It looks more like there are optimizations being performed by nVidia that aren't being done by AMD. Whether this is a driver issue or a hardware limitation remains to be seen.

The CFX issues, however, really do seem like a driver issue, but I'm no CFX expert.
 
I7 920?

How dare you!

Upgrade that ancient cpu! Hrumph!

Actually, yeah I am going to take a look and see how my 2500K at 4.4 measures up. There was a difference in the BF series but with this older game I doubt it is as impactful.

Good writeup as usual.
 
Is it just me that thinks that FXAA appears to reduce texture clarity rather a lot?
 
Have you been able to prove or disprove that Skyrim is limited to using only two CPU cores?
 
Here is something I don't understand. [H] did some testing here and determined that the CPU, good as it was, held back some of the performance in a multi-gpu solution.

So why hasn't [H] updated your platform to do testing on when it comes to newer games? I would assume that once you identify a bottleneck in the system, you would want to eliminate it.

Some of us are running decent cards at our chosen resolution and are more in need of help when it comes to deciding whether a CPU upgrade will do much for us or not.

I would like to see CPU scaling addressed more often in your articles, particularly when it comes to games which people might assume to be more CPU dependent than others.
 
^ We saw CPU affecting game performance with multi-GPUs at 3 GPUs and upwards, in multi-display. Not at dual-GPU. Read the title of that article ;)
 
Have you been able to prove or disprove that Skyrim is limited to using only two CPU cores?

CPU usage has not generally been a part of our gameplay performance/iq articles. It used to be, but we didn't learn much and the info wasn't that useful, so we stopped it.

That said, I just checked this out for you. Here is what I found.

Here's my computer idling at the desktop with steam open:

cpu_idle.png


And here it is with skyrim running: (loaded the game, ran around for a minute, and alt-tabbed out to take the screenshot)
cpu_skyrim.png


Looks like all 4 cores are engaged to me.

As a side-note, this game doesn't alt-tab cleanly, and that annoys the crap out of me.
 
Finally, a game that offers a challenge for my 560 Ti. :D

I'm kind of surprised you guys rated the 560 Ti's highest playable as Ultra though. I've been running on High because, while Ultra ran well in general, it had some definitely noticeable framerate dips that drove me nuts. This usually happened when I was entering a new zone, and I guess it has to do with my card "only" having 1GB of VRAM, which really is kind of small for that caliber of card (but I would assume is a cost-cutting measure).
 
Performance woes: from what I can see, SLI scaling is horrible. Crossfire I can't say, because it's not implemented as of yet.
 
article quote : ["For them, the only reliable way to do it is to add a line containing the text "iPresentInterval=0" to the "..\My Documents\My Games\SkyrimPrefs.ini" file, at the end of the section labeled "[Display]". This is not the case with AMD video cards. They are still limited to 60 FPS, even with VSYNC disabled in the INI file AND the driver control panel."]

I've read that it's the Skyrim.ini not the SkyrimPrefs.ini that should receive the iPresentInterval=0 under [Display].

And I have an ATI HD 4770 with VSYNC off in the panel plus that line in the INI.
ATI Tray Tools shows me a nice 700+ FPS in the start menu, and in game, if I set things low enough, I get above 60.
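For anyone trying this, the uncapped-VSYNC tweak described above would look like the fragment below. Hedging per the posts above: the review says SkyrimPrefs.ini, while some users report it only works in Skyrim.ini (both sit in "..\My Documents\My Games\Skyrim\"), so you may need to try both.

```ini
; Skyrim.ini (or SkyrimPrefs.ini, per the review) -- add to the end of [Display]
[Display]
iPresentInterval=0
```

Remember to also disable VSYNC in the driver control panel, as the article notes; on AMD cards this combination reportedly still leaves the 60 FPS cap in place.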
 
My 6970 is working fine at Ultra settings with FXAA enabled. However, I have a newer CPU than the review rig, and am playing at 1920x1200 (the native resolution on my 24" LCD).

For everyone concerned about the ATI results, it's true that the Geforce beats it --but if you're not playing at 2560x1600, you should still get very good results from an ATI card.

ATI dropped the ball on Crossfire, though. They knew this game was coming, and should have anticipated the demand. To not have a Crossfire profile out at this point is inexcusable. I can understand the disappointment of everyone with a dual-GPU setup.
 
And a lot of sites are saying that the game offloads shadow rendering to the CPU, which is why it is so CPU dependent. Running my C2D at 3.2 GHz x,x. Nice article... I'm surprised you guys didn't get crashes to desktop because of sound issues like a lot of people out there.
 
^^ I had a few crashes before Catalyst 11.11 came out. None after upgrading to that driver.
 
^ We saw CPU affecting game performance with multi-GPUs at 3 GPUs and upwards, in multi-display. Not at dual-GPU. Read the title of that article ;)

I did read it, but that article was written before the current crop of interesting games were released (BF3, MW3, Skyrim). What I am questioning is why you would choose to team a 2011 GPU with a 2008 CPU; it just really doesn't make a lot of sense. I think you should update your test platform to eliminate any potential CPU bottleneck from the discussion.

All slander towards any other sites aside, there appears to be evidence that Skyrim in particular may be CPU limited in some situations. Articles at Techspot and Toms both show that even with a single-card solution, the CPU can have a pretty good impact on performance.

Once again, the point I'm trying to make is that I feel there is no reason you guys should still be doing your game testing on an old CPU platform. I don't have anything against including an older CPU, but if you are reviewing gameplay then you should use equipment that reflects what a person can expect to find currently. I think you would be well served by moving your testing platform to a SB-based system, and maybe test it against a comparable AMD-based system. You are testing both brands' modern video cards, so why not CPUs?
 
And a lot of sites are saying that the game offloads shadow rendering to the CPU, which is why it is so CPU dependent. Running my C2D at 3.2 GHz x,x. Nice article... I'm surprised you guys didn't get crashes to desktop because of sound issues like a lot of people out there.

There is no way they draw shadows on the cpu. The engine scales like horse shit on the cpu side but that doesn't even make sense to do.
 
Is it true that this game only uses a max of 2 GB of system RAM? I thought I read of a .exe fix that enables > 2 GB of system RAM. Would that make any difference in performance?
 
Ah so I should use the Control Panel and not Nvidia Inspector?

Here's what happened in my testing:
Forcing ("Overriding application settings")
8x Supersampling and 32x CSAA f**ks up the HUD.
And there's a 10fps performance drop.

Same thing with 4x SS and 32x CSAA. Only a 5fps performance drop though.

No Supersampling, 32x CSAA forced: whole screen is f**k'd up.

16xQ (16x CSAA: 8+8) only: HUD still all over the place.

16x (16x CSAA: 4+12) only: Screen is black.

8x CSAA: same story as above.

In all cases (where the screen isn't black), water (when not in falls) looks almost transparent. Also, when the screen isn't blank, I had to bring up the Task Manager first for the game to show up after loading completed. No "Antialiasing compatibility: Elder Scrolls 4 - Oblivion, Fallout 3" used.

-------------------------------------------
Using "Enhance Application Settings"
32x CSAA: Screen is black.

32x CSAA ("Antialiasing compatibility: Elder Scrolls 4 - Oblivion, Fallout 3" used): Same result.

16xQ (16x CSAA: 8+8) and 8x Supersampling ("Antialiasing compatibility: Elder Scrolls 4 - Oblivion, Fallout 3" used): Same result. And it even got worse, the drivers crashed.

16xQ (16x CSAA: 8+8) and 4x Supersampling ("Antialiasing compatibility: Elder Scrolls 4 - Oblivion, Fallout 3" and "Antialiasing - behavior flags: Fallout 3" used): rainbows on the screen, goes blank, I do the CTRL+ALT+DEL trick, the game shows up. HUD no longer f**k'd up. Water (when not in falls) looks almost transparent. Barely any performance hit (2-3fps drop only).


Besides the HUD f**king up and the water suddenly becoming almost transparent when not going through falls, it actually looks a bit better. Not worth it though.

All tests have all settings (that are not mentioned) at their highest option. AF is also at 16x.


Is it true that this game only uses a max of 2 GB of system RAM? I thought I read of a .exe fix that enables > 2 GB of system RAM. Would that make any difference in performance?

Well, I used Large Address Aware for that. And it seems you have to do it to increase uGrids.
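For context on what that patch actually does: the ~2 GB cap comes from the Large Address Aware bit in the executable's PE header, and LAA patchers just flip that bit. As a rough sketch (standard PE/COFF layout, nothing Skyrim-specific, and not a substitute for a real patcher), here's how you could check whether an .exe already has the flag set:

```python
import struct

# Bit in the COFF Characteristics field (per the PE/COFF spec)
IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def is_large_address_aware(data: bytes) -> bool:
    """Return True if a PE image has the LAA bit set in its COFF header."""
    if data[:2] != b"MZ":
        raise ValueError("not a DOS/PE executable")
    # e_lfanew: offset of the PE signature, stored at 0x3C in the DOS header
    pe_off, = struct.unpack_from("<I", data, 0x3C)
    if data[pe_off:pe_off + 4] != b"PE\x00\x00":
        raise ValueError("missing PE signature")
    # Characteristics sits 22 bytes past the signature:
    # 4-byte "PE\0\0" + 18 bytes of preceding COFF file-header fields
    chars, = struct.unpack_from("<H", data, pe_off + 22)
    return bool(chars & IMAGE_FILE_LARGE_ADDRESS_AWARE)
```

Usage would just be `is_large_address_aware(open("TESV.exe", "rb").read())`; note the flag only buys you anything on a 64-bit OS (or a 32-bit OS booted with /3GB).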
 
Another good writeup, which for any other game these days would be fine, but this game has the benefit (or hindrance) of having an EXTREMELY customizable engine, straight from the .ini files.

Once ati gets their crossfire working, I'd like to see a comparison with beyond-Ultra settings such as:
ugridstoload at 7, 9, 11
Shadows set to further and higher quality settings (there's like 6 different lines to edit in ini)
Water rendering increased in quality (most of the settings have zero performance impact)
Adjusting LOD settings (again, a ton of settings to edit)
Modding the executable to be large address aware
and a bunch of other little settings I'm forgetting.
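As a concrete example of the first tweak in that list: the commonly circulated uGrids edit goes in Skyrim.ini, something like the fragment below. Caveat: the buffer value is forum/tweak-guide lore rather than anything official - the usual rule of thumb is uExterior Cell Buffer = (uGridsToLoad+1)^2 - and higher uGrids values are known to bloat saves and cause instability, so back up first.

```ini
[General]
uGridsToLoad=7
; rule of thumb from tweak guides: (7+1)^2 = 64
uExterior Cell Buffer=64
```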

Some of the tweaks seem to be almost user specific though, shadows bugging out for me at one setting will look perfect on another guy's machine (at least according to forum posts and settings I've tinkered with).

This game has potential to look much better than it does out of box, the FXAA injector that brightens and sharpens the post process filter is something I suggest everyone grab (at the lower preset, the higher one makes things a bit too cartoony bright and sharp).
 
Considering how little SLI helps, would breaking up SLI and assigning one GPU to physics processing instead help more?


"We even experienced some shadow feathering effects that appeared randomly, which could be a bug, or just a nature of the dynamic shadows."

I found that setting the shadow z-buffer option to 1 fixes this. Specifically, it fixes the flickering where shadows appear and disappear randomly depending on the slightest movement of your character.
 
Not bad for the very first outing of this new engine (okay, GameBryo iteration) of theirs but I'd still blame Bethesda and not just AMD. 6950 hardware framelimit wtf.
 
I7 920?

How dare you!

Upgrade that ancient cpu! Hrumph!

Actually, yeah I am going to take a look and see how my 2500K at 4.4 measures up. There was a difference in the BF series but with this older game I doubt it is as impactful.

Good writeup as usual.

As much as I love seeing how games perform on the latest and greatest hardware, it's nice to see results on hardware that most of us are actually likely to own and use from day to day.

Plus, it makes me less likely to splurge on an upgrade to my i7 920. :D
 
Now we just need that high texture pack like the one from oblivion.

I am also surprised you guys didn't mention the horses and how they can scale any mountain with little to no issue. :D

HAHA! I have already done some pretty impressive stuff with the horses on mountains, it's great! Adds to the already ridiculous explorability of the game! ;)
 
Game definitely sticks to one thread per core of my i7 w/HT.

Water is very disappointing. In many places it flows out of objects instead of around them. It could have been much flashier with DX10/11. Really a shame that graphics were likely held back for console compatibility.

I do think that the graphics are nice compared to Oblivion, but I'm still disappointed in the lack of modern graphical features.
 
The Whiterun spot was mentioned as one of the most intensive spots in the game, and yet there was no comparison of the FPS Nvidia/AMD cards get there at different resolutions to confirm whether it's a CPU bottleneck or not?
 
As an out-of-the-box experience, the game's disappointing to the graphics enthusiast, but if it's anything like Oblivion on the PC, there will be several hundred texture packs, HD models and graphical addons to push your machines to the limit. (not to mention additional user content to keep on playing until those graphical mods come out)
 