My Personal Feelings About nVidia Surround

suiken_2mieu

So, I just kinda realized how bad a solution nVidia Surround is for surround gaming. There are a lot of things they're doing wrong.



1. First and foremost, NOT ENOUGH VRAM. Their cards hit the wall in a lot of tests, and I wouldn't think of going nVidia until there's a card with 2GB of VRAM or more. 12MP is a lot to render, folks, and it still needs AA (rough math in the sketch after this list).
--I'm aware of the new 560's with 3GB of VRAM, but I'm sure they're gonna cost out the butt. I'm just poor, so bleh.

2. It takes AT LEAST 2 cards in SLI to work. There are a lot of games that don't like SLI. Basically, when SLI doesn't work, it really doesn't work; when it does, it works well. But it's not gonna work well with a lot of obscure games. On top of that, there's the frame issue I thought up: with two cards, each rendered frame has to be cut up and sent to the card that owns the right outputs, where it may be cut again and sent out the correct port. That creates lag, because every monitor has to be kept in sync with the others. Dumb. If the card could handle at least 3 monitors per GPU, the only thing needed would be a transfer of the frame, a clean cut, and out the ports. Done. No real syncing required.
--I'm aware of the 295, but that's not 3 screens per GPU, it's 2 per GPU, and there are two GPUs... DUMB.

3. The lack of different configurations. It's 3 monitors in landscape or portrait, or 3 projectors in landscape, or 3 3D monitors in landscape. All of them require exactly 3 displays. Wtf?
--There's nothing to say about this except that they do allow a 4th monitor to be extended on the desktop, just not in a fullscreen game. DUMB.
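
Since people always ask where that "12MP" number comes from, here's a quick back-of-the-envelope on point 1 (rough numbers only; the bytes-per-pixel and 4x MSAA assumptions are mine, and real driver/game overhead will vary):

```python
# Rough framebuffer math for 3x2560x1600 surround. Assumed numbers, not measured:
# 4 bytes color + 4 bytes depth/stencil per sample, 4x MSAA, nothing else counted.
WIDTH, HEIGHT, MONITORS = 2560, 1600, 3

pixels = WIDTH * MONITORS * HEIGHT                  # 7680 x 1600 = 12,288,000 (~12.3 MP)

BYTES_COLOR = 4                                     # 32-bit RGBA
BYTES_DEPTH = 4                                     # 24-bit depth + 8-bit stencil
MSAA = 4                                            # 4x multisampling multiplies those buffers

framebuffer_mb = pixels * (BYTES_COLOR + BYTES_DEPTH) * MSAA / 1024**2
print(f"{pixels / 1e6:.1f} MP, ~{framebuffer_mb:.0f} MB for color+depth at 4x MSAA")
# ~12.3 MP and roughly 375 MB before a single texture or vertex buffer is loaded,
# which is why 1GB-1.25GB cards run out of headroom at this resolution.
```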

So in summary, I want nVidia to get serious about surround on their next GPU line. Make 2GB the baseline, allow more display outputs per GPU, and allow more display configurations. (I mean, why can't we have 2 screens?)

Rant over. Questions? Comments?
 
With their current setup, 2GB on a 384-bit interface is impossible. Second, unlike AMD, nVidia does not FORCE users onto DisplayPort, which does allow monitor daisy chaining but requires active adapters for more than one DVI display per card. And third, I don't know of any single GPU powerful enough to drive 3x2560x1600 monitors, which is what you'd get from 3 DVI interfaces. Everything nVidia has done is for a reason. Perhaps not the most logical, but it makes more sense than what you're suggesting. Anyway, hardware engineers look at these things and figure them out for a reason; we gamers don't know that much about the inner workings of the hardware, which is why we don't build the cards ourselves. I don't think AMD's solution is that much better, to be perfectly honest. Why pick on one company? Why not just say "problems with multi-monitor gaming," since a few of these points apply to both sides?
 
They are coming out with better cards (memory-wise). As for the 2-card requirement, that simply sells more cards, so why should they change it? And DP has been as much a liability as a benefit for AMD. I think we'll need one more generation before all the bugs get worked out.
 

I see no problem with DP. It's a new standard that's taking hold. IMO, the issue lies with monitor vendors not implementing it in the cheap-o monitors some people love using, and not implementing it in their "high end" monitors earlier.

Saying AMD forces people to use DP is rather like saying nVidia forces users to buy multiple GPUs for surround to work. I see no problem with either solution, really. However, I'd rather have a single-card setup over a multi-card one any day.


That being said, even off of one card, Eyefinity has its fair share of issues (the only glaring one is refresh rates with certain setups).

The technical reason behind this is that AMD created a new bit of output logic (ASIC) that allows up to 6 display outputs from a single top-end GPU core. They didn't change the number of TMDS drivers on each GPU core for whatever reason (die space, power consumption, voltages, licensing, whatever), so those stayed at 4. Dual-link DVI takes up 2 TMDS links per connection, and also takes up two "display outputs" from the AMD GPU. So after all is said and done, up to 2 TMDS-driven connections may be used at a time, normally bonded together and shared between DVI and HDMI (both of which rely on the TMDS clock signal). DP doesn't need this, so it takes the remaining outputs. All 6 outputs can be retrained for DP and DP++ use (so the TMDS links aren't severed on the Eyefinity6 card variants). VGA apparently leeches off a DVI clock signal in its current implementation on both nVidia and AMD GPUs.

nVidia was caught off guard, or didn't care to change their output logic in time, and their output logic only has enough room for 2 simultaneous outputs per card. As a result, you can only use two monitors at a time, no matter what mixture of HDMI, DVI, dual-link DVI, DP, or VGA there is. So that's why nVidia Surround needs multiple GPUs to go beyond two displays.
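
If it helps, here's that output budget written out as a toy counting model (the 6-output/4-TMDS and 2-output totals are from what I described above; the per-connector "costs" are my own simplification, not anything from AMD or nVidia documentation):

```python
# Toy model of the per-GPU display-output budgets described above. Illustrative only:
# the totals come from the post, the per-connector costs are a simplification.

# cost per connector: (display outputs consumed, TMDS links consumed)
COST = {
    "DP": (1, 0),        # DisplayPort carries its own clock, no TMDS link needed
    "HDMI": (1, 1),
    "DVI-SL": (1, 1),    # single-link DVI
    "DVI-DL": (2, 2),    # dual-link DVI eats two outputs and two TMDS links
}

def amd_fits(connectors):
    """Evergreen-era AMD GPU per the post: 6 display outputs, 4 TMDS links total."""
    outputs = sum(COST[c][0] for c in connectors)
    tmds = sum(COST[c][1] for c in connectors)
    return outputs <= 6 and tmds <= 4

def nvidia_fits(connectors):
    """Fermi-era nVidia GPU per the post: two simultaneous displays, full stop."""
    return len(connectors) <= 2

print(amd_fits(["DVI-DL", "DVI-DL", "DP"]))       # True  -> the classic Eyefinity trio
print(amd_fits(["DVI-DL", "DVI-DL", "DVI-DL"]))   # False -> out of TMDS links, hence the DP dongle
print(nvidia_fits(["DVI-DL", "DVI-DL", "DP"]))    # False -> a third display means a second card (SLI)
```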

I'm tired, my eyes are barely hanging on, and I still have a bit of watercooling testing to do on my mITX machine. So good night/morning/afternoon to all.
 
Oh, don't get me wrong, I don't think there's anything wrong with DisplayPort. But for such a new interface, not fully supporting more than one DVI monitor isn't the smartest move. It would be better for monitor companies to switch to DisplayPort, but there's a huge conflict there with HDMI, so I suspect DisplayPort won't move up much until something radical happens. While I do agree that AMD's solution is cleaner, GPUs at the moment still lack the horsepower to drive this stuff, so I'm not entirely sure AMD's solution is any better. Yeah, I also have to agree with the time; good night or good morning to all (depends on your timezone).
 
The ones making it hard on consumers are the monitor manufacturers. AMD got the ball rolling in 2007 by being the first to include it on all their cards, so there's a sizable installed base and monitor manufacturers can't use the chicken-and-egg excuse. Fast forward to today and display manufacturers are still making products with HDMI, and with VGA as well. It's ridiculous considering DisplayPort is completely royalty-free and HDMI is not. Thankfully AMD and Intel are forcing the issue and will stop supporting DVI and VGA by 2015, supporting only HDMI and DisplayPort from then on.
 

The issues I brought up are nVidia Surround specific. The only one that's even remotely an Eyefinity issue is the "small" amount of display configurations. We don't have PLP or multi-resolution setups from either card manufacturer.

People tell me (a lot) that no one card is powerful enough to play at 7680x1600. But games play way better on my 5870 2GB at 7680x1600 than they did on a 4870 1GB at 3840x720 (that was a couple years ago).

I guess it might be because I don't have to run everything at max and 60 frames per second (I was a console gamer before I was a PC gamer, I just don't care). As long as I get playable framerates with a good amount of AA and good textures, I'm fine.

The DP requirement is really a non-issue now (imho). You can get a DP-to-single-link-DVI adapter for $30 or less, and they work flawlessly. If you want to go higher than 1920x1200, it requires a dual-link DVI adapter (which I got used off eBay for $35). So it's much more affordable than buying an extra card. And this only applies if you didn't buy DP monitors.

That brings up another issue. SLI, when it works, is great, and when it doesn't, it's terrible. I find CrossFire (two cards) is compatible with more games (though the scaling isn't as good).

Also, I think it's kind of messed up to say I'm picking on them when these are valid issues. Taking "they did it for a reason" and our own ignorance as reasons not to question what they did and why they did it is dumb as hell. I've been using multi-monitor rigs since the TH2Go, and each solution has had its quirks, but I really don't think nVidia has taken surround seriously. I mean, we're on the third GPU generation that can do surround; you'd think they'd have added more features by now.

Put it this way: the only way to play a game that doesn't work well with SLI in surround is with SoftTH.
 
As it stands, Eyefinity doesn't work with more than 2-way CrossFire. That's a problem, as two HD6970s aren't meaty enough to drive much more than 5760x1200. Meanwhile, Vision Surround requires SLI outright, so even if you have a 3GB GTX580, you can't run 5040x1050, which the card would be reasonably capable of, without a second card. Problems on both sides, really.

The DisplayPort thing is only a problem because of all the cheap and nasty (yet highly priced) DP->DVI dongles out there, and the fact that many monitors don't have DP on them. If it were me, I'd be using DisplayPort on all my displays if I were starting an Eyefinity PC today. However, annoyingly, they don't make an Eyefinity6 card for the HD6 series, not even an Eyefinity3, so you still need at least one dongle even if you do have DP on your monitor. Infuriating.
 

I upgraded from a 5870 to SLI for 120Hz on 3 screens. If AMD puts out a DisplayPort adapter that is certified to work at 120Hz, I'd consider them again. Until then, I won't be buying AMD again.
 