The Elder Scrolls V: Skyrim Performance & IQ Preview @ [H]

The thing you need to realize is that some people can't play it at all because of the problems they're having with video drivers.

Really? Who isn't using either NVIDIA or AMD drivers?

If you're referring to SLI/CF, it's easy: turn it off.
 
On AMD cards you can either run RadeonPro w/ the Oblivion profile or the fixed DLL. I get pretty decent scaling using the DLL fix at 6048x1080 on my 2x 6950's - Ultra settings, FXAA enabled (no MSAA). This game is freaking amazing.

On NV cards I'm getting OK scaling (~50%), but the game isn't that demanding anyway (I'm playing on mixed High settings with two 8800 GTs).

Using a framerate limiter can smooth out the engine's choppiness on dual-card rigs from either brand.
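If you don't want to run an external limiter, Skyrim also has a built-in clamp in Skyrim.ini. I haven't tested it myself, and from what I've read it ties the simulation speed to the framerate (dips below the cap play out in slow motion), so an external limiter is probably the safer bet:

[General]
; engine framerate clamp; 0 (default) = off
; caution: game speed is tied to this value, so FPS dips cause slow motion
iFPSClamp=60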

I'm pretty excited about my weekend of Skyrim...
 
So, is there any chance that Bethesda will patch the game to use more cores? Quads have been out for fucking ever; it's so stupid to get bottlenecked by the CPU when two cores are just sitting there...

I really doubt that will happen. It would take extensive code changes to add more threads to the game. I don't see what the big issue with performance is though, I have a pretty weak AMD proc and a couple of 5770s and the game runs fine without crossfire. :confused:
 
In my limited experience, I feel like the game doesn't scale up very well. I can play it fine on Xbox-level settings (low to medium, I'm guessing), but I wanted a little more than that, since I just built my computer last year.

My friend is playing this smoothly on an i7 with Intel onboard HD graphics...

It's just mind-boggling, that's all. I should be able to play it on High, and I mostly can, but the second I enter a town or look at some rocks :confused: I suddenly drop to 10 FPS.

Looking at my performance meters, nothing is maxed out, so I don't know where exactly this performance drop is coming from. Nobody on the Bethesda forums seems to know either, but there's plenty of speculation from a ton of people experiencing the exact same thing.

I just wish Bethesda would come in and put an end to all the rumors, and do whatever they need to do to get the game running well on new computers.
 
When Oblivion came out I was playing at 12-15 FPS on my system at the time. This is not the type of game where you need 60 FPS all the time.

The game runs great on my system, which has a fairly weak CPU and low-ish end GPU. It drops to around 25 FPS in the larger cities, but that's fine for this type of game. My framerate was lower in Oblivion with all the mods I had installed. In the wilderness and in caves/dungeons it's ridiculously smooth.

I have set everything to Ultra except shadows, which are at High (setting them to Ultra cut my frame rate in half in dungeons, for some reason). Also, I use only FXAA, which makes MSAA redundant for me. I think FXAA looks almost as good as MSAA + transparency AA, and it's much faster.
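If you'd rather set that up by hand than through the launcher, I believe these are the relevant lines in SkyrimPrefs.ini (under [Display]; treat this as a sketch of my setup, not gospel):

[Display]
; 0 = MSAA off (2/4/8 to enable)
iMultiSample=0
; post-process FXAA on
bFXAAEnabled=1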
 
This game can be an absolute hog... but only when modded. I'm using the uGridsToLoad=9 ini tweak and it can drop to 30 FPS at times. Looks great, though.
 
The WSGF guys have finally fixed the Eyefinity issues; at last I can play the game at 5960x1080. Too bad AMD still hasn't released a CAP for the game, so my FPS gets a bit too low at times, though it looks like it might be CPU-bound anyway, in which case it's Bethesda's fault. They really should patch the game to not lag so badly in places like towns.
 
I get drops to 30 FPS, modded or not. It's just an optimization issue.

Also, I tried that grids tweak and the game crashed constantly. :(
 
It would be awesome if Nvidia/Bethesda fixed multi-GPU performance, but I'm not optimistic.

NVIDIA is working on this with Bethesda. It's just a matter of Bethesda making the changes to make their game multi-GPU friendly; a CrossFire/SLI profile can only go so far toward scaling. At a certain point with multi-GPU, when a game doesn't account for it properly, NVIDIA and AMD have to choose between introducing corruption and just living with low scaling.
 
You do realize that the developers edited and created the entire game *on a PC* right?

As a number of developers have pointed out, driver overhead (and other things) make it really difficult to get close-to-bare-metal performance on PCs.

While you might justifiably say that they could have spent more time optimizing the PC version, to accuse them of making a "console port" shows a misunderstanding of the development process.

Realistically, business decisions are going to drive schedules and releases. You're not going to hold up the release of a product to spend more time optimizing for the lowest sales platform. It doesn't make any sense.

I place the blame squarely on Microsoft; their choices in driver architecture and other areas are part of what has made it so difficult and costly to optimize software for the PC. I feel as though they have spent far more time optimizing graphics and memory management on the Xbox than they have on Windows, and it shows.

A few top-tier developers have pointed that out, maybe, but what about the 99% who are perfectly content that the driver abstracts away a lot of the low-level complexity and per-architecture compatibility issues? There's a reason we are where we are today. Unless your goal is to raise the bar of technical and system expertise required for the average company to release a game, you'd be cutting off your nose to spite your face by eliminating the driver (which, by the nature of its existence, will add some overhead).
 
Ugh, that one!

Arkham Asylum looked a lot like paid crippling (Nvidia even went so far as to have them call it "Nvidia™ Multi Sample Antialiasing" in the options, as if Nvidia invented antialiasing); it seemed to work just fine in the GOTY edition, so they made liars of themselves. That game has Nvidia slapped all over its box, settings, interface, opening videos, advertisements, and hell, even its instruction manual.

On the subject of Skyrim, though, I skipped MSAA altogether. FXAA looks good enough for me and has a minimal performance hit. I love my 120Hz monitor, too. So nice to have a game running so ridiculously smoothly.

Actually, if you're interested in the facts, NVIDIA implemented the MSAA themselves. If they hadn't, there'd be *no* MSAA in the game at all. On top of that, leading up to the game's launch, the latest AMD drivers didn't support MSAA on the format needed by some effects, which caused corruption on AMD hardware. AMD added the necessary support in later drivers.

That's why they originally labeled it and locked it down as an NVIDIA feature.
 
I always look at the scenario, and unless Bethesda refused to give AMD any access to the game beforehand, I can only blame AMD in this particular situation.

That is what happened. Bethesda gave the companies access to the game barely before launch. That left enough time for simple things like making a CrossFire/SLI profile, but not for any driver fixes or releases.
 
The game is solid gold. Great story, deep character design; what more could you want? Well, maybe hi-res textures, but even without them it's still the best thing I've played all year.
 
Actually, they used standard Microsoft code, so they had no right to label it like they owned it.

By that logic, every game should be free. I should have no right to withhold my software product from you simply because you didn't pay for it. Games use standard Microsoft DirectX code... if I put money and time from my life into making something valuable via a "standard" API, should I have to give it away for free?

However, I will say that if you tried to universalize that sort of thing, it would be bad, with each side having spotty feature support. But it's clearly not at that level today, so for little features here and there where AMD or NVIDIA write the code *for* the developers, it's not the end of the world. And obviously, for things like tessellation, where the art really has to be designed around the specifics of the tessellation mode, it would be bad.
 
Remember Dragon Age 2, when NV was caught with its pants down for a month until a driver fixed so many bugs and performance problems?

Shit happens.

As far as the performance problem that existed there, that wasn't chance or NVIDIA dropping the ball. You can read between the lines.

Heh, you almost got me there. :)
Anyway, for some games the ball is in AMD's court, and Skyrim is apparently one of those. StarCraft II maybe too; I didn't look at it in detail. But for Batman: AA and other games with vendor locks (or quasi-locks like Crysis 2), that was not the case.

What the fuck is a "quasi-lock"? There was no lock there at all. Now it seems like you're just reaching.
 
I get drops to 30 FPS, modded or not. It's just an optimization issue.

Also, I tried that grids tweak and the game crashed constantly. :(

The game is very CPU-intensive. The grids tweak needs a modded exe (patched with CFF Explorer) so it can use more RAM, and the buffers must not be set too high. I'm playing with ugrids 9 and an exterior buffer of 120 without crashes.
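For reference, the Skyrim.ini side of that tweak ends up looking something like the lines below. The rule of thumb I've seen is an exterior buffer of at least (uGridsToLoad+1)^2, i.e. 100+ for ugrids 9; my exact values are just a sketch:

[General]
; default is 5; higher values load more distant cells at full detail
uGridsToLoad=9
; rule of thumb: at least (uGridsToLoad+1)^2
uExteriorCellBuffer=120
; many guides also raise iPreloadSizeLimit alongside this tweak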

I've found the SLI support very lousy. I'm running TRI-SLI GTX 580s, using the latest beta drivers with the Skyrim profile. I've even tried tweaking the Skyrim profile in Nvidia Inspector.

Basically, MSI Afterburner reports an average of 40% GPU utilization for each GPU. A well-optimized SLI game would be above 75% per GPU (e.g. Battlefield 3, WoW, etc.). I average 60 FPS in Skyrim @ 2560x1600, on TRI-SLI. Not good if you ask me (it should be more like 80+).

Unfortunately, this isn't surprising given that the game was primarily designed for consoles, and today's aged consoles are more CPU-centric, which is why Skyrim scales far better with the CPU.

It would be awesome if Nvidia/Bethesda fixed multi-GPU performance, but I'm not optimistic.

You acknowledge that it scales very well with the CPU, and yet you call for an SLI "fix"? SLI scales perfectly with this game... if you let it, by removing the bottleneck. Try using SGSSAA and watch your GPU usage go up. FPS will stay mostly the same, obviously, because you're often CPU-bottlenecked.

Honestly, I don't get why people jump to premature conclusions when they could instead do some simple testing that takes 5 minutes. But blaming Nvidia/AMD is easier ;)
 
I have been bitten by the pathetic ATI driver support over the last few months (Rage, Deus Ex, BF3, and now Skyrim). Seriously, WTF, ATI! This is getting old fast. My next card will be an NVIDIA.

I solved my problem by replacing my CF setup with an SLI one. No need to tell me it's not cost effective, I don't care. :D So far, it feels "smoother" when playing Skyrim. It's not a night and day difference, but it's a noticeable improvement.
 
Anyone got any news about a fix yet? Even a date for a patch would be nice.

I would also like to know this. I'm swapping out gaming rig 1 from my signature for another system sporting dual unlocked 6950s, and I've definitely lost some of the urgency due to the Skyrim issues. Having said that, my 470 SLI is doing amazingly well at 3150x1680 NV Surround with DNA Ultra Extended settings. I was sure VRAM would be a limiting factor, but I'm 15 hours in and haven't noticed any significant slowdown or jittering.

Also, my cards are both seeing 85%+ usage with 60 FPS vsync on. In the DNA ini generator you can set hardware threads up to 8; are those of you feeling CPU bottlenecks running with this setting?
 
What is the setting?

edit: iNumHWThreads=X ?
 
Yes, DNA shows options from 1 to 8 there. I didn't do much testing with stock settings, but set to 8 on my 2600K it has been smooth and gives heavy utilization on the GPUs. My GPU demand could also simply be due to the higher resolution from NV Surround; I'd be interested to hear if this makes an impact for you.
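For anyone hunting for it: the generator writes it into the [General] section of Skyrim.ini, something like the line below. Whether the engine actually honors it is another question entirely:

[General]
; community tweak circulated via the DNA generator; its actual effect is unverified
iNumHWThreads=8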
 
Where is this iNumHWThreads? I cannot find it in either the ini or on DNA's generator...
 
Need help.

I'm running SLI 3GB 580s overclocked to 900MHz, an i7-2600K @ 4.8GHz (Hyper-Threading off), 8GB RAM, and the latest NVIDIA drivers (285.79).

I run Skyrim at 2560x1440 with the DNA Extreme ini tweak, all in-game settings on Ultra, the FXAA injector tweak (preset 1), and no MSAA at all, and I can't maintain 60 FPS 100% of the time. It was dropping to the high 40s (which I didn't mind), but at certain times it got below 30 (which I don't like at all). In the end, to make it run smoother (never really below 50 FPS so far, though I'm not that far into the game), I had to drop anisotropic filtering to 8x and cut object distance to half. That isn't too bad; I rarely notice the object drop-off, but it seriously helps the frame rate.

I'm only getting 35-55% GPU usage in SLI, so I tried all the fancy AA settings etc., and I did get my GPU usage up to 50-70%... but it ran worse and didn't look a whole lot better.

I understand this game isn't a shooter, and I wouldn't have thought the odd frame rate drop would bother me, but it is annoying after going from a constant 60 FPS in BF3 to this in Skyrim. I understand the game is CPU-dependent, but are these issues down to bad coding/console porting? And if so, what are the chances of a patch coming out to fix them?

Has anyone out there been able to get this game to run at Ultra, 2560x1440, 60+ FPS all the time?
Any suggestions on tweaks I can do? I just want a constant 55+ FPS at Ultra settings; I'm not really fussed about AA because at this high a resolution I don't notice much difference.

I'm fairly new to PC gaming after making the switch from consoles, so I've never really had to bother with tweaks, MSAA settings, etc., so go easy.
 
We're all just going to have to wait for a patch, which Beth has reported they're working on.
 
Where is this iNumHWThreads? I cannot find it in either the ini or on DNA's generator...

They seem to have removed it from the most recent version of their generator; I guess it must not have been doing what they thought.
 
A serious suggestion for the 580 SLI post above: disable your third card. I'm running with three 580s as well, and when I tried out 1080p, I found one card had better FPS than three cards. My theory is that since the game is hugely CPU-limited (because of the poor threading), all that happens when you throw too much GPU power at it is that the CPU gets further bogged down dealing with that inefficiency.
 
We're all just going to have to wait for a patch, which Beth has reported they're working on.

Did they say the patch is going to help any of us out in terms of performance? I don't think they did, IIRC.
 
I'm only running 2x 3GB 580s, not three. And I tried running off one GPU; I get around 95% usage, but it doesn't run any better.
 
Anyone notice that the third GPU in triple CrossFire is not used at all?!

Run Afterburner and watch GPU utilization. Only two of my three 6970s are being used!
 