Dual GPUs and Input Lag

Bop

Has an extensive test ever been done to determine the precise impact of using two GPUs versus one on input lag?

I was considering a dual GPU solution, but this issue usually makes me lean toward a more expensive single GPU solution instead, even if it means giving up some performance.

The best info I could find was from Anand (http://www.anandtech.com/show/2803/7). Judging from that data, it *appears* that a single GPU with a render-ahead limit of 1 may have up to ~20ms less GPU time than a dual GPU setup.

The article stated it was an area they would like to explore later, but here we are, over a year later, with no report...
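
For a rough sense of where a gap like ~20ms could come from, here is a back-of-envelope sketch. The numbers are my own illustrative assumptions, not measurements from the article; the only idea is that each extra frame sitting in the render queue costs roughly one frame time of input lag.

# Rough estimate of added input lag from extra queued frames.
# Illustrative numbers only, not measurements.
def queue_lag_ms(fps, extra_queued_frames):
    frame_time_ms = 1000.0 / fps
    return extra_queued_frames * frame_time_ms

print(queue_lag_ms(50, 1))   # 20.0 -> one extra buffered frame at 50 fps is ~20 ms
print(queue_lag_ms(60, 1))   # ~16.7 ms for the same thing at 60 fps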
 
If there is any lag it isn't noticeable even if you're an EXXXTREEEEME gaming pro who competes for money. Stuff like microstuttering can be more noticeable, but usually only when you really push the cards past their limits, in which case you just tone down the IQ/resolution and it's fine.
 
I have never noticed any lag or microstutter in any game ever due to Dual GPU.

Where I do notice it is in my RC Simulator, Phoenix. I can feel it in the helicopter.
 
With my SLI setup with "render ahead=0" there is no input lag. In SLI, the frames are processed in real time and not buffered, so there would be no input lag.
 

I thought I read somewhere that you shouldn't adjust the render ahead with a dual GPU setup since it may cause problems. I may be wrong, however, since it works fine on your setup. Also, I don't think the extra lag comes solely from render ahead but from the GPUs syncing frames.

I found some additional information here: http://developer.nvidia.com/object/sli_best_practices.html

One additional detail worth noting is that while frame throughput is the same on SLI systems as it is on non-SLI systems, frame latency is reduced due to parallelism. For example, if a typical frame takes 30ms to render, the effective latency of those frames is only ~15ms (assuming an SLI system with two GPUs). Thus, increasing the number of frames buffered in SLI does not linearly increase input lag as one might expect. Actual results will depend on how well your application is scaling in SLI.

I guess this information suggests that *buffer* latency will actually be the same. I'm still not sure if there are any other factors in multi-GPU computation that would increase latency, though.
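
To put numbers on that paragraph, here is a tiny arithmetic sketch using the 30ms example from the SLI document, assuming two GPUs doing alternate-frame rendering with ideal scaling (my simplification, not anything from nVidia):

# Toy AFR math based on the 30 ms example in the SLI document above.
# Assumes ideal 2-way scaling; real results depend on the game.
gpu_frame_time_ms = 30.0        # one GPU spends 30 ms on each frame
num_gpus = 2
queue_depth = 3                 # frames buffered ahead of display

frame_interval_ms = gpu_frame_time_ms / num_gpus        # a frame completes every ~15 ms
single_gpu_queue_lag = queue_depth * gpu_frame_time_ms  # ~90 ms to drain the queue
sli_queue_lag = queue_depth * frame_interval_ms         # ~45 ms with two GPUs

print(frame_interval_ms, single_gpu_queue_lag, sli_queue_lag)

So under those assumptions a deeper queue drains roughly twice as fast on the dual GPU setup, which is how I read "does not linearly increase input lag."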
 
If there is any lag it isn't noticeable even if you're an EXXXTREEEEME gaming pro who competes for money. Stuff like microstuttering can be more noticeable, but usually only when you really push the cards past their limits, in which case you just tone down the IQ/resolution and it's fine.

It depends on what game you play. I have no doubt that in TF2, for example, the 70ms difference in input lag you can have between different setups would be enough to ruin the timing needed for a good rocket jump, or your aim as a scout. You would not have to be a pro to notice that. If the maximum delay an SLI setup can add is 20ms, I'm not sure that would be noticeable, but these things have a way of adding up.

I have spent a lot of time looking into this the past few days, as I just set up my first SLI system 2 days ago. I have been playing with the settings a lot and can't tell you I notice any definite additional input lag. So far, what I have found on the subject of input lag is pretty confusing once you add the related variables of average framerate, screen refresh rate, vsync, triple buffering, max prerendered frames, etc. As far as I can tell:

-dual GPU rendering will add some input lag, but the amount is pretty small. I suspect that if it allows you to keep the framerate above the monitor refresh rate at all times, or if it increases your framerate from 30 fps to 60 fps or more, those benefits will outweigh the tiny GPU synchronization overhead. Also, many of the input lag issues become less significant once you move away from 60Hz LCDs to the 120Hz models. You need a decent amount of GPU muscle to keep those screens fed at 1080p.

-max prerendered frames should be set to 3 per Anand, or at least equal to the number of GPUs per nVidia. (Some rough numbers on what the queue adds are sketched after this list.)

-vsync can add lag, but mainly if fps is lower than refresh rate.

-Triple buffering reduces input lag with vsync per Anand:
http://www.anandtech.com/show/2794/4
But, nVidia says the opposite:
"Triple buffering can reduce performance because of the increased memory consumption, and will offer no benefit if the frame output rate is higher than the refresh rate of the monitor. Additionally, triple buffering can introduce slightly higher input latencies because of prolonged storage times."

You can see many issues with input lag are less important if you can maintain a high fps, which is much easier to do with two GPUs. I'd like to see the followup from Anand detailing dual GPU effects. On the other hand, if you are trying to decide between two older video cards or one newer, faster card, I'd definitely take the latter. The simplicity, lower power, and lower overhead are worth the extra cost. I think dual GPU setups only make sense with high end cards.
 
Well, I don't necessarily want to narrow the thread to my personal setup, but I guess it'd focus the discussion on higher end setups. I'm essentially looking at either a 6950 CF setup or a single GTX 580. Whichever GPU I choose is mostly overkill for the resolution I run at, but I like to have a minimum fps of 60. I'd like to get a 120Hz monitor, but I'm really hesitant to give up my G2400WD and lose the 16:10 aspect ratio (1920x1200, almost zero input lag). Coming from my older monitor with 25ms of input lag, I could immediately tell the difference in gameplay.
 

If you are sensitive to frame rate fluctuations, a dual GPU setup will drive you nuts. Generally, they do not improve min FPS either. I'd go with the 580.
 
Too bad we don't have much real information to draw from.

I bet it's more important to be able to stay over your monitor's refresh rate than it is to avoid synchronization lag. At 1920x1200 @ 60Hz that can be hard to do in some games, even with only 4x AA. The 6950 CF is about 50% faster than a GTX 580, and still only manages 36fps in Metro 2033. SLI/CF setups are very powerful now, but still not overkill at 1920x1200.

http://www.techpowerup.com/reviews/ASUS/Radeon_HD_6950_CrossFire/12.html

On the other hand, if it turns out dual GPU setups are inferior for twitch games, you still have the option of assigning the second GPU to antialiasing and PhysX only, at least with nVidia drivers.
 
If you are sensitive to frame rate fluctuations, a dual GPU setup will drive you nuts. Generally, they do not improve min FPS either.

Not true:

http://www.hardwareheaven.com/reviews/1061/pg10/gigabyte-nvidia-geforce-gtx-580-sli-review-fallout-new-vegas.html

Fallout NV, 1920x1200, 4xAA/16xAF:
GTX 580 - min 21 fps
GTX 580 SLI - min 59 fps

There are several other games tested there. On average the min framerate improves by about 50% with SLI. The main benefit appears to be with games that have low min framerates with single cards.
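
(A quick check of the arithmetic in that worst case, using only the numbers quoted above; the ~50% figure is the average across the other games in the review.)

# Min-FPS improvement in the Fallout NV numbers quoted above.
single_min_fps = 21
sli_min_fps = 59
print((sli_min_fps - single_min_fps) / single_min_fps)   # ~1.81 -> min fps nearly tripled in this extreme case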
 
Fallout NV is a poorly optimized mess of a console port. While you're right, it's a poor example to use.
 

Something is wrong with their testing. A GTX 580 is only managing 19 min FPS, compared to 39 on an HD 5870?

And I said "generally." There are SOME cases where it improves min FPS, but they are few and far between.

For example, if you run into a scene that drops your FPS to 20, it's unlikely dual GPUs will improve on that.
 
I didn't know FO NV was a console port, but yeah, the Gamebryo engine is a mess. I just chose the most dramatic example.

All of the other games tested show a solid improvement in min FPS, particularly in the games where you need it the most. SLI scaling is quite a bit better than it used to be.
 