GTX 460 SLI with SFR Benchmarks

eddieck

Gawd
Joined
Dec 13, 2009
Messages
1,010
Does anyone have benchmarks on GTX 460 SLI using SFR? Every review I've read seems to be using AFR, and that's unacceptable to me because:

  • The extra input lag AFR creates
  • Microstuttering
  • Issues with VSync + triple buffering
I am currently considering an SLI setup and am about to pull the trigger on a new SLI-supporting board and another 460, but AFR is unacceptable for the listed reasons. I'm aware SFR will naturally have lower scaling (though I plan to use SSAA which means rendering at 3840x2160, where SFR scaling is better) and that's fine, as long as it's still somewhat of a performance improvement from a single card setup.
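For context on the scaling remark, the pixel arithmetic works out like this (my numbers, assuming a 1920x1080 native display; the post only states the 3840x2160 render target):

```python
# Pixel-count arithmetic behind the SSAA remark (illustrative only;
# assumes a 1920x1080 native display, which the post doesn't state).
native_pixels = 1920 * 1080
ssaa_pixels = 3840 * 2160
print(ssaa_pixels // native_pixels)  # 4 -> 2x2 SSAA quadruples the pixels rendered
```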
 
AFR adds input lag? Never heard/felt that... Maybe I'm just used to lag now that I use an LCD.
 
AFR adds input lag? Never heard/felt that... Maybe I'm just used to lag now that I use an LCD.

Yep, supposedly 2 frames of input lag, meaning about 33 ms on a 60 Hz monitor. With my single-card setup I specifically set the render-ahead limit to 0 in the NVCP, disabling it.
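A quick back-of-the-envelope check on that figure (this just computes the arithmetic; the 2-frame claim itself is from the post above):

```python
# Added latency from queued frames at a fixed refresh rate (illustrative only).

def added_latency_ms(extra_frames: int, refresh_hz: float) -> float:
    """Latency added when `extra_frames` frames are queued ahead of display."""
    frame_time_ms = 1000.0 / refresh_hz
    return extra_frames * frame_time_ms

# Two extra frames on a 60 Hz monitor:
print(round(added_latency_ms(2, 60), 1))  # ~33.3 ms
```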
 
Oooo... something I've never given more than an ounce of thought.

I am interested to see the answers to your question.
 
AFR adds input lag? Never heard/felt that... Maybe I'm just used to lag now that I use an LCD.

I know that a certain Mr. Pabst's site is not well regarded around here, but googling it is the only article I can find on the subject with good illustrations.


Check out this link to a review of the 1999 ATI Rage Fury MAXX, a dual-GPU solution that exclusively used AFR. It was criticized heavily at launch for the extra input lag it introduced:



I too would be very interested to see what the practical impacts are of AFR vs. SFR in modern titles as I have been researching getting another video card as of late.

Maybe someone with an SLI setup can volunteer to do some testing?
 
Zarathustra[H];1036389389 said:
I know that a certain Mr. Pabst's site is not well regarded around here, but googling it is the only article I can find on the subject with good illustrations.

I may be wrong, but wasn't THG actually respected back when he was doing the reviews? It seems these days other editors are managing the site. I'm not even sure if Tom still has any role at the site, as I haven't seen any posts from him in the past few years.

Zarathustra[H];1036389389 said:
Maybe someone with an SLI setup can volunteer to do some testing?

Absolutely. I'll be getting a GTX 580 SLI setup soon, though it may not be on the 9th. (If anyone else wants to do it before I get my cards, that'd be cool.)
 
Yeah, I've wanted to see [H] do a review of this very thing, where they check out the different rendering modes for pros/cons. I'm going to assume that AFR1, i.e. the standard mode, by far gives the best scaling, but it would be interesting to see what the other modes could do with CFX/SLI issues that are keeping many of us from purchasing it.

Interestingly, when I had my 9800GX2 and played ARMA 2, I found my fps increased by 40-50% when switching from AFR1 mode to AFR2 mode. SLI was working in AFR1 mode too, I definitely had higher fps than in single-GPU mode, but for some reason AFR2 mode gave me far more fps and actually made the game playable. I didn't think much more of it at the time, though, so I didn't do any more extensive testing.
 
Maybe we can ask Brent if he is willing to do a test like this for the [H]. After all he's done some good SLI writeups as of late...

We can't be the only people interested in this info...
 
Zarathustra[H];1036392263 said:
Maybe we can ask Brent if he is willing to do a test like this for the [H]. After all he's done some good SLI writeups as of late...

That would be awesome but I don't think Brent or the rest of the [H] crew have the time right now. GTX 580 launch and probably articles on 580 SLI, as well as Cayman.
 
Yes, AFR adds input latency. Depending on the framerate, you may or may not notice it.

But the OS also allows queueing of frames anyway. Vista and Win7 let an application call Present on up to 3 frames that the graphics driver hasn't yet consumed before the OS blocks the application's rendering thread. The only reason multi-GPU is more likely to show input latency than a single GPU is that you're more likely to have frames queued up so the driver can process them somewhat in parallel. Think of it as sending off two frames to be processed instead of one: you have to stop taking input when you go to render, and the new input won't be visible until frame 3 in a 2-way SLI config, whereas it may show up on frame 2 in a single-GPU configuration.

So depending on your framerate you may or may not see this difference. If the game is running slow enough, you're more likely to feel that extra n milliseconds of input latency.
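The frame-numbering argument above can be sketched as a toy model (purely illustrative, not a driver measurement; the frame counts are the ones stated in the explanation):

```python
# Toy model of when freshly sampled input becomes visible, given how many
# frames were already submitted ahead of the frame that samples the input.
# Illustrative sketch of the reasoning above, not actual driver behavior.

def first_frame_showing_input(frames_in_flight: int) -> int:
    """1-based frame number on which new input reaches the screen.

    With `frames_in_flight` frames already queued when input is sampled,
    the input lands in the next frame after those drain.
    """
    return frames_in_flight + 1

print(first_frame_showing_input(1))  # single GPU, 1 frame queued  -> 2
print(first_frame_showing_input(2))  # 2-way SLI AFR, 2 queued     -> 3
```

At a fixed refresh rate, that one extra frame of queue is exactly one extra frame-time of latency, which is why it is easier to feel at low framerates.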
 