Fallout 4 Performance and Image Quality Preview @ [H]

If you want to feel input lag, just try out NBA 2K15/2K16. There's a world of difference running at 120fps with vsync off compared to 60fps vsync on.
 
That could get interesting with FreeSync and G-Sync. I hope you investigate.

It makes no difference with G-Sync; if the FPS goes above 60, things like computer terminals go out of whack. I had to limit my FPS to 60 even with my G-Sync monitor.
 
Zarathustra[H];1041969509 said:
Yeah, it's all placebo effect.

Little kiddies who think they are super sensers, and can perceive humanly impossible differences at the single digit ms level.

Very similar to audiophile snobs. It's all in their heads.

Can't really agree. VSync definitely makes input feel really sluggish depending on the game... like unwanted mouse acceleration. I stopped using it and just run borderless windowed with the framerate locked to 58 or so, and usually that is enough to get rid of most tearing. A GSync monitor is the ultimate goal, but I can't bring myself to pay $600-700 for one.
 
This game is also heavily reliant on RAM speed. There is a noticeable difference in frame rate between 1600 and 2400 RAM.

That is both unusual and very interesting.

I've been running DDR3 at 1866MHz for years without ever feeling the need to overclock or upgrade it.

Maybe there is finally a reason to get faster RAM :p
 
Zarathustra[H];1041971652 said:
That is both unusual and very interesting.

I've been running DDR3 at 1866MHz for years without ever feeling the need to overclock or upgrade it.

Maybe there is finally a reason to get faster RAM :p

It mostly affects the low-FPS areas, so it's really useful to know. I also have run lower-speed RAM for years. I tested my RAM from 1400 to 2000 in a trouble spot, and the min framerate went from 38-40 to 50-52.
 
All I know is that with a clocked 6600K and 980ti at stock, you don't need even remotely fast RAM.
I have DDR4 3000 that is only stable at 2500MHz currently, caused by a pants mobo BIOS update.

Even then, with the game maxed and Ultra Godrays at 1080p, only about 30 to 40% GPU is needed while hugging 58fps (fps limited with Afterburner).
CPU use is around 50% on each core.

With 1.5x DSR for AA, GPU use increases to about 55%, and fps is a flat 58fps, zero variation.
No need for fast RAM on Skylake.
 
Also, it's kind of misleading to directly compare DDR3 vs DDR4 bandwidth.
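For what it's worth, here's a rough back-of-the-envelope sketch (my own illustrative CL values and helper names, not anything from the article) of why that comparison is slippery: at the same transfer rate, DDR3 and DDR4 have the same theoretical peak bandwidth per 64-bit channel, and the thing that actually differs is latency once you convert CAS cycles into nanoseconds, and even that gap is smaller than the raw MHz suggests.

```python
# Back-of-the-envelope RAM math (illustrative CL values, not benchmarks).
# Peak bandwidth per 64-bit channel = transfer rate (MT/s) x 8 bytes.
# First-word CAS latency (ns) = CL cycles x clock period = 2000 * CL / MT/s.

def peak_bw_gbs(mt_per_s: float) -> float:
    return mt_per_s * 8 / 1000  # GB/s per channel

def cas_latency_ns(mt_per_s: float, cl: int) -> float:
    return 2000 * cl / mt_per_s

for name, mt, cl in [("DDR3-1600 CL9", 1600, 9),
                     ("DDR3-2400 CL11", 2400, 11),
                     ("DDR4-3000 CL15", 3000, 15)]:
    print(f"{name}: {peak_bw_gbs(mt):.1f} GB/s per channel, "
          f"CAS ~{cas_latency_ns(mt, cl):.1f} ns")
```

So a faster kit really does buy bandwidth, but the latency gap between typical DDR3 and DDR4 kits is much smaller than the MHz numbers make it look.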
 
Zarathustra[H];1041969511 said:
Back when I played CS (and when I tried GO) I played with vsync on.

It's all in your head, man.

No, you are wrong; CS:GO with vsync on is an input lag fest. It's one of the recommended things to turn off when you first play the game.
 
No, you are wrong; CS:GO with vsync on is an input lag fest. It's one of the recommended things to turn off when you first play the game.

It depends on your monitor and peripherals mostly... anyone concerned about input lag should by nature have a high-response mouse, keyboard and, more importantly, monitor. For example, with a monitor like the BenQ XL series with a high refresh rate + Instant Mode and AMA, you play at 6ms with CRT-like responsiveness with Vsync ON. Anyone concerned about input lag in CS:GO has that kind of monitor, or a CRT.
 
It depends on your monitor and peripherals mostly... anyone concerned about input lag should by nature have a high-response mouse, keyboard and, more importantly, monitor. For example, with a monitor like the BenQ XL series with a high refresh rate + Instant Mode and AMA, you play at 6ms with CRT-like responsiveness with Vsync ON. Anyone concerned about input lag in CS:GO has that kind of monitor, or a CRT.

But with the in-game vsync off there is little to no input lag. With it on, it feels like you are moving the crosshair through mud. I have like 1000 hours in CS:GO, so I've had some time to test things out.
 
No, you are wrong; CS:GO with vsync on is an input lag fest. It's one of the recommended things to turn off when you first play the game.

I've played CS in all its forms since beta 4 in 1999. Between the original (1.6 and before), CS:S, and GO, I probably have in excess of 10,000 hours in game.

All that time I played with vsync on. Tearing is awful.

I was pretty damned good too, when I played more.

I never felt the need to turn it off.
 
Zarathustra[H];1041969509 said:
That's not to say that input lag can't be a problem, but if you get a decent screen with low input lag, stick with one GPU (multi-GPU inherently worsens input lag) and make sure you can hit and stay at 60fps flat, you are good to go.

I - for one - can't wait to go back to a single GPU. I have high hopes for the next gens from AMD/Nvidia to drive my 4k screen with one GPU, in the games I enjoy.

I don't give a fuck. I'll pay next gen Titan prices to avoid SLI/CFX, and sell my 980ti's

SLI/CFX left such a bad taste in my mouth years ago that I'll likely never go back: microstutter, huge fps deltas, poor driver support, extra mobo cost + heat + PSU requirements, decreased OC potential, etc. I'm interested in how DX12 handles multiple GPUs, but for me, I have cautious optimism at best. Wait, nope... it's straight-up pessimism. It's gonna suck.

And same here: I just want one GPU to drive a 4K screen :D If I have to watercool it, I will. If I'm lucky, maybe I can even downsample a bit past 4K.

Cabezone said:
This game is also heavily reliant on RAM speed. There is a noticeable difference in frame rate between 1600 and 2400 RAM.

I saw this in some PC gaming magazine over the weekend. Found the numbers (lowest fps / avg fps):

1600MHz: 36.0/54.6
2133MHz: 39.0/61.0
2400MHz: 44.0/66.9

Couldn't believe it. Might just have to OC my RAM (don't even know what speed it's running at).
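(If anyone else doesn't know what their RAM is running at, something like this quick sketch works on Windows, assuming the classic wmic tool is still present; CPU-Z's Memory tab or Task Manager's Memory view report the same thing.)

```python
# Rough sketch: query installed DIMM speed on Windows via the classic wmic tool.
# (wmic is deprecated on newer Windows builds but still shipped on most systems.)
import subprocess

result = subprocess.run(
    ["wmic", "memorychip", "get", "Speed,Capacity,PartNumber"],
    capture_output=True, text=True, check=False
)
print(result.stdout)
```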
 
1600MHz: 36.0/54.6
2133MHz: 39.0/61.0
2400MHz: 44.0/66.9

I'd also be curious to see how dual/triple/quad channel RAM compares (also, ganged vs. unganged would be interesting), and whether the extra bandwidth from multiple channels helps, or if it's more a matter of raw speed per channel.
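As a rough reference (theoretical peaks only, my own arithmetic), total bandwidth scales linearly with channel count, so the question is really whether the game can come anywhere near saturating it:

```python
# Theoretical peak bandwidth = channels x transfer rate (MT/s) x 8 bytes per 64-bit channel.
def total_bw_gbs(channels: int, mt_per_s: float) -> float:
    return channels * mt_per_s * 8 / 1000

for channels in (1, 2, 3, 4):
    print(f"{channels}-channel DDR3-1600: {total_bw_gbs(channels, 1600):.1f} GB/s peak")
```

Whether Fallout 4 cares more about that total figure or about per-channel latency is exactly the kind of thing a deeper test could settle.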
 
I find these RAM speed related performance differences to be interesting as well. I also would like to see this examined more in-depth.
 
But with the in-game vsync off there is little to no input lag. With it on, it feels like you are moving the crosshair through mud. I have like 1000 hours in CS:GO, so I've had some time to test things out.

Again, it just depends on hardware and peripherals. It's not the same playing with a cheap $5 mouse as with a professional gaming mouse at a 1000Hz report rate on an adjustable surface, plus a gaming monitor with AMA override. That allows fully Vsync ON with little to non-existent input lag.

Just look at this (I was only able to find numbers for the XL2420T and XL2410T, which are a bit slower in response than the newer ones):

[Attached image: input_lag_1.jpg — input lag comparison chart]

And just look at what the "Instant Mode" does:
Of note here on the XL2420T is the 'Instant Mode' option listed within the OSD 'picture' menu. This feature is designed to bypass some of the internal scaling and electronics to reduce input / display lag for gaming. It's a useful feature when implemented well, and gives the user a way to reduce the possible delay they might experience as compared with a CRT. We will test the input lag with and without instant mode active.

Lower input lag than a CRT... :D
 
Oh, and BTW, if you haven't noticed it, look at the vanilla XL2410T (20ms) versus the XL2410T (5.6ms) with Instant Mode enabled: almost a full 15ms cut. And the 2420T is already old; we now have faster panels with faster response times.
 
Some of you are awfully hard to please. I'm a professional cg artist, and a huge graphics whore, and I'm fine with the graphics. Huge open world games have to balance quality with performance. Would a more efficient and scalable game engine be better? Sure. But I haven't had any issues with the game. Only a couple bugs that were minor.
 
Zarathustra[H];1041975125 said:
I've played CS in all its forms since beta 4 in 1999. Between the original (1.6 and before), CS:S, and GO, I probably have in excess of 10,000 hours in game.

All that time I played with vsync on. Tearing is awful.

I was pretty damned good too, when I played more.

I never felt the need to turn it off.

I'm with you; only my point of reference is Q3A. VSync only for me. Tearing is distracting as hell. It's something I just refuse to live with.

I think a lot of people get confused about what is causing input lag when it can actually be the sum of many variables.

I've never had any issue at all playing with VSync. My typical setup doesn't introduce much in the way of latency though. Even when I've played on other setups that were slightly more laggy where input is concerned, I can much more easily adapt my playing to that than to have tearing on the screen.
 
I think a lot of people get confused about what is causing input lag when it can actually be the sum of many variables.

So, in the same application, you have:

VSync Off = no input lag
VSync On = terrible input lag / sluggish movement

Given that no other options have changed, what other variables are involved? Genuinely curious.
 
So, in the same application, you have:

VSync Off = no input lag
VSync On = terrible input lag / sluggish movement

Given that no other options have changed, what other variables are involved? Genuinely curious.

I wasn't talking about in-game software variables. I was referring to input latency variables caused by outside sources: monitors/TVs/smoothing algorithms/slow peripherals/erroneously selected display driver settings/mismatched driver and in-game settings/etc. Anything that someone might get some extra latency from, not realize it, and then blame VSync. It may show itself when VSync is selected, but I'd be willing to bet that in most cases it's not the actual VSync setting causing the issue.

Also, the game's implementation, internal timing, and other developer related items could factor into this as well.

Think of it this way. Think back to your favorite upright arcade games. Nearly every single one of them would have been synced to the vertical refresh of the monitor, which on NTSC systems in the US and Japan would be 60Hz. Did you ever experience anything but precise input in them? They all used VSync (disclaimer: maybe you could find an example of one that didn't, but it would be the exception).

Think of game consoles before they became relatively underpowered to do what they should. Think about something like Soul Calibur on the Dreamcast. That was synced. Did it ever feel laggy? Not to me it didn't.

Latency comes from all over the place and can't be inherently correlated with syncing to a 60Hz display. As long as the frames are ready on time, every time, and no further latency is introduced by a slow panel, display circuitry, wireless input device, slow controller IC inside the input device, poorly coded driver, OS dependency, etc., your input will translate fairly instantaneously to what's on screen with VSync on. That's a long list of possibilities, though, where something else could cause slop. I'd be looking to those first. I could see extra software layers, translation, and abstraction causing some of this in some cases (in theory).
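To put some rough numbers on the "as long as the frames are ready on time" part (my own illustrative figures, not measurements): at 60Hz a refresh is about 16.7ms, so a frame that's ready in time waits at most one refresh, while a GPU-bound game that lets the driver's render-ahead queue fill (commonly up to 3 frames by default) can stack several refreshes of delay on top, before the monitor or the mouse even enter the picture.

```python
# Rough worst-case display-side delay from syncing to a fixed refresh
# (illustrative only; ignores panel processing, USB polling, game logic, etc.).

def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

def worst_case_sync_delay_ms(refresh_hz: float, queued_frames: int) -> float:
    # Up to one refresh waiting for the next vblank, plus any frames already
    # sitting in the render-ahead queue when the game is GPU-bound.
    return (queued_frames + 1) * frame_time_ms(refresh_hz)

for hz, queue in [(60, 0), (60, 3), (120, 0), (120, 3)]:
    print(f"{hz}Hz, {queue} queued frames: "
          f"up to ~{worst_case_sync_delay_ms(hz, queue):.1f} ms added")
```

Which fits both sides of this argument: VSync on a setup that holds 60fps with an empty queue adds a barely noticeable refresh or so, while VSync plus a full queue on a struggling GPU is the "crosshair through mud" feeling.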
 
I agree with you.

Personally, I think people just need some metric to quantify in order to explain their poor gaming skills.

Got your ass shot up? Oh, it's lag, I'm actually much better...
 
Some of you are awfully hard to please. I'm a professional cg artist, and a huge graphics whore, and I'm fine with the graphics. Huge open world games have to balance quality with performance. Would a more efficient and scalable game engine be better? Sure. But I haven't had any issues with the game. Only a couple bugs that were minor.

Stop being reasonable. Don't you know that's not what the internet is for?
 
^^ It was actually pretty good. I haven't seen any of the old Mad Maxes, didn't care to see this new one when it came out, but saw it on Amazon Prime and said why the hay not.

Wasn't disappointing. Good flick - this particular scene was quite awesome actually. Release the hounds has a whole new meaning to it. :D
 
^^ It was actually pretty good. I haven't seen any of the old Mad Maxes, didn't care to see this new one when it came out, but saw it on Amazon Prime and said why the hay not.

Wasn't disappointing. Good flick - this particular scene was quite awesome actually. Release the hounds has a whole new meaning to it. :D

First and second old ones were pretty good. First one takes place in a pre-collapse world where chaos is rising and anarchy is on the horizon. Second one is a few years later, after the fall. Third one was pretty weak to me.

This one, from what I've seen, is a bit like a remake of the second one, only bigger and more fun.
 
Seeing this image reminds me that I still haven't seen this movie! What is WRONG with me?!?

It wasn't very good; you haven't missed much.

Don't get me wrong. A lot of good effects, but essentially it was two really long action scenes with a two to five minute break in the middle.

Very little in the way of character development or story telling.

I was kind of disappointed.

The action scenes were cool, but it was like two hours of the same scene more or less. About 29 minutes in it started getting boring.
 
My son's Fury X is now getting an average of 51fps at 4K with the new drivers, presets at Ultra. It's still dipping down into the high 30s for minimums though. Seems to be stutter-free as well. Looks like a good driver update, finally.
 
That seems a bit low, with 390s getting similar rates.

What CPU and memory speed does he have?
 
That seems a bit low, with 390s getting similar rates.

What CPU and memory speed does he have?

This is with everything on, godrays etc. He gets about another 10fps on top of that if he lowers the godray settings. I just checked out overclock3d.net's new benches and they are showing roughly the same thing, with the 980ti only beating it by a little over 1 frame. You're saying the 390s are also getting this now? That would be damn cool for them.
 
This is with everything on, godrays etc. He gets about another 10fps on top of that if he lowers the godray settings. I just checked out overclock3d.net's new benches and they are showing roughly the same thing, with the 980ti only beating it by a little over 1 frame. You're saying the 390s are also getting this now? That would be damn cool for them.

If all these disparate cards are getting about the same results, it suggests that something else is the bottleneck.

With all the comments above on RAM speed, I wonder if that is it...
 
Why not test Godrays Low vs Ultra? Or even Off vs Ultra?

NV's own test guide shows Low performs ~50% faster than Ultra, given that the 980Ti goes from 58fps to 90fps.

Their own screenshot comparison shows almost no difference. In fact, the videos show Low looks more realistic due to diffusion of light rays around objects, whereas Ultra retains sharp light rays like lasers.
 
What was the CPU and clock speed used for this preview?
I looked hard for it in the article and in this thread but can't find a mention.
 
What was the CPU and clock speed used for this preview?
I looked hard for it in the article and in this thread but can't find a mention.

Brent's machine is an i7-3770K at 4.8GHz with 16GB of 1600MHz CAS9 RAM... so I guess that's what they used in the article.
 