NVIDIA & Multi-display Gaming Editorial @ [H]

One could say the same thing about the implementation of Anti-Aliasing ;)
No, not really. UE3 demands a shader MSAA resolve because it uses a deferred rendering approach to calculate lighting, which is apparently not the most trivial thing to implement (UE3 still doesn't support anti-aliasing out of the box, I believe). That's what NVIDIA wrote for Rocksteady. There's no shader, or any other code, that needs to be written for Eyefinity functionality; you don't have to touch anything. If the renderer can output the desired resolution, Eyefinity is supported.
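For anyone curious what "shader MSAA resolve" actually means in a deferred engine, here's a rough conceptual sketch in Python. This is not UE3 or NVIDIA code; the toy shade() function and the subsample values are made up purely to show why you have to light each subsample and then average, rather than letting the hardware average the G-buffer first:

```python
# Conceptual sketch only: why a deferred renderer can't rely on the
# fixed-function MSAA resolve and needs a custom per-sample resolve instead.
import numpy as np

def shade(normal_z):
    # Hypothetical nonlinear lighting term (think of a specular highlight).
    return np.clip(normal_z, 0.0, 1.0) ** 8

# Four MSAA subsamples of one pixel straddling a geometry edge:
# two subsamples face the light, two face away.
subsample_normal_z = np.array([1.0, 1.0, 0.2, 0.2])

# Averaging the G-buffer attributes first and lighting once (what a plain
# resolve before lighting would amount to) gives the wrong answer:
resolve_then_shade = shade(subsample_normal_z.mean())    # ~0.02, edge goes dark

# Lighting every subsample and averaging the lit colors afterwards
# (the "shader resolve" a deferred engine has to do itself) is correct:
shade_then_resolve = shade(subsample_normal_z).mean()    # ~0.5, properly blended

print(resolve_then_shade, shade_then_resolve)
```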

I'm only assuming that NVIDIA Surround works similarly to Eyefinity, which is a fair assumption considering that a handful of games were demoed at CES running Surround (or is it 3D Surround? I'm still way the hell confused about that).
 
From what I've read about UE3 when I researched the Batman topic, it doesn't support AA in DX9 mode but does in DX10. It's been a while since I've fired up UT3, but I'm almost positive it had an AA option in there for both my nVidia and ATI based rigs. I'll double check when I get home.
 
I wish there was more info on NVIDIA 3D Vision Surround. I hate when companies throw you a little bone.
I'm curious whether we will require a 120Hz display to do 3D, and if so, what about those people who won't go out and buy a shitty TN panel? Will they be out of luck? I don't remember seeing any specs remotely close to H-IPS and 120Hz.

Also, I hope they will let us do Portrait-Landscape-Portrait.
 
From what I've read about UE3 when I researched the Batman topic, it doesn't support AA in DX9 mode but does in DX10.
Correct. AA in DX9 is unsupported (but it'll work via a workaround for NVIDIA users). DX10 requires shader MSAA resolve.
 
I wish there was more info on NVIDIA 3D Vision Surround. I hate when companies throw you a little bone.
I'm curious whether we will require a 120Hz display to do 3D, and if so, what about those people who won't go out and buy a shitty TN panel? Will they be out of luck? I don't remember seeing any specs remotely close to H-IPS and 120Hz.

Also, I hope they will let us do Portrait-Landscape-Portrait.

Yes, 3D still requires 120Hz displays, powered 3D shutter glasses, and special code to work. Every frame must effectively be rendered twice.
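To illustrate what "rendered twice" means in practice, here's a minimal sketch in plain Python/NumPy. The eye-separation constant and the translation-only view matrix are illustrative assumptions; a real engine would also apply the camera rotation and per-eye asymmetric projection frustums:

```python
# Minimal sketch of stereo rendering: the same scene is drawn once per eye
# with a small horizontal camera offset, so GPU work roughly doubles.
import numpy as np

INTERAXIAL = 0.065  # assumed eye separation in meters

def eye_view_matrix(base_view, eye_offset_x):
    # Shift the camera sideways for this eye (translation-only for brevity).
    offset = np.eye(4)
    offset[0, 3] = -eye_offset_x
    return offset @ base_view

def render_stereo_frame(draw_scene, base_view):
    # Two full scene passes per frame; with 120Hz shutter glasses the display
    # alternates eyes, so each eye effectively sees 60 images per second.
    for eye_offset in (-INTERAXIAL / 2, +INTERAXIAL / 2):
        draw_scene(eye_view_matrix(base_view, eye_offset))

# Example usage with a stand-in draw call:
render_stereo_frame(lambda view: print("drawing with view:\n", view), np.eye(4))
```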
 
After further thought, I take back my original statement.

I think Kyle was correct to write what he did.

Regardless of the lack of actual proof that Nvidia and Eidos had some sort of shady deal, the bottom line is that ATi owners not only didn't get AA, they got punished for it.

That will hurt the industry in general, and at least someone with a voice said something.

Thanks, on behalf of all of us gamers (I'm sure), for saying something, Kyle. Sorry for commenting before really giving the issue the time and critical thinking it deserved.
 
I'm just wondering if SLI is a definite requirement for a three-panel display. Yes, I know almost all the cards have two DVI ports, so you need to get that third output somehow, but I have my old 8800GT as a PhysX card (might as well, I wasn't able to find a buyer) alongside my GTX 280 as the main card... so technically I've got four ports... yeah, I know we'll know for sure with a future driver release. Basically I want to avoid upgrading anything for the next while and just get monitors, so I'm hoping! :D
 
ChrisRay stated @ Rage3D that two monitors could be used for 3D Surround on a single card.
SLI will be required for three-monitor 3D Surround.
 
ChrisRay stated @ Rage3D that two monitors could be used for 3D Surround on a single card.
SLI will be required for three-monitor 3D Surround.
Who would use two monitors for anything 3D though? You'd be staring straight at the bezels.
 
ChrisRay stated @ Rage3D that two monitors could be used for 3D Surround on a single card.
SLI will be required for three-monitor 3D Surround.

I tried two monitors and the bezels completely kill the experience. Unless we get bezel-less monitors or use two projectors, it's of little use.
In driving games there needs to be no discontinuity at all; a seamless screen is needed, or as near as dammit.
In FPS games, the bezel splits your character and it isn't pretty.
 
I'm just wondering if SLI is a definite requirement for a three-panel display. Yes, I know almost all the cards have two DVI ports, so you need to get that third output somehow, but I have my old 8800GT as a PhysX card (might as well, I wasn't able to find a buyer) alongside my GTX 280 as the main card... so technically I've got four ports... yeah, I know we'll know for sure with a future driver release. Basically I want to avoid upgrading anything for the next while and just get monitors, so I'm hoping! :D

You couldn't find a buyer for an 8800GT?

I hope that there aren't too many issues with 3x SLI and PhysX and three monitors. Three 280s should have enough power to run just about anything at 5760x1200 maxed out, since one 280 can run just about any game at max settings at 1920x1200.
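To put rough numbers on that claim, 5760x1200 is exactly three times the pixels of a single 1920x1200 panel, so the shading load scales by about 3x (ignoring bezel compensation and any SLI scaling overhead). A quick, purely illustrative sanity check:

```python
# Back-of-the-envelope pixel math behind "three GPUs for three monitors".
single = 1920 * 1200      # 2,304,000 pixels on one panel
triple = 5760 * 1200      # 6,912,000 pixels across three panels
print(triple / single)    # 3.0 -- roughly 3x the fill/shading work
```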

I do wonder if AMD will respond to this. I can't think of such a big feature ever being introduced to a legacy GPU with only a driver update.

Much has been said of AMD's value, and it's been good over the last 18 months, but if Fermi looks good, the multi-display lock-in concerns that Kyle raised don't arise, and they add solid multi-monitor support to legacy GTX 2XX GPUs, nVidia could end up looking pretty good at the end of the day.
 
I tried two monitors and the bezels completely kill the experience. Unless we get bezel-less monitors or use two projectors, it's of little use.
In driving games there needs to be no discontinuity at all; a seamless screen is needed, or as near as dammit.
In FPS games, the bezel splits your character and it isn't pretty.

That's why you need THREE monitors; it's not nearly as bad with three. Not that I've tried it, but this issue comes up from time to time and three really is FAR better than two.
 
I hope that there aren't too many issues with 3x SLI and PhysX and three monitors. Three 280s should have enough power to run just about anything at 5760x1200 maxed out, since one 280 can run just about any game at max settings at 1920x1200.

I do wonder if AMD will respond to this. I can't think of such a big feature ever being introduced to a legacy GPU with only a driver update.

Much has been said of AMD's value, and it's been good over the last 18 months, but if Fermi looks good, the multi-display lock-in concerns that Kyle raised don't arise, and they add solid multi-monitor support to legacy GTX 2XX GPUs, nVidia could end up looking pretty good at the end of the day.

Very true, they could come out of this smelling like roses if they want to. We will see how it shakes out in the coming weeks and months I guess.
 
That's why you need THREE monitors; it's not nearly as bad with three. Not that I've tried it, but this issue comes up from time to time and three really is FAR better than two.

I'm more than clear that I need three screens, having tried two already; that's why I wrote the post you quoted :)
 
Who would use two monitors for anything 3D though? You'd be staring straight at the bezels.
No one for long. I was just passing along what ChrisRay stated.
Even he stated the obvious about the bezels.
 
Much has been said of AMD's value, and it's been good over the last 18 months, but if Fermi looks good, the multi-display lock-in concerns that Kyle raised don't arise, and they add solid multi-monitor support to legacy GTX 2XX GPUs, nVidia could end up looking pretty good at the end of the day.

Yeah... where does that leave DisplayPort, though? I'm sure it's still the display interface of tomorrow, but if NVidia can get multi-display setups working with DVI plugs (on old hardware to boot), why couldn't AMD get it working on a whole new card they were about to launch?

Y'know, there are some very real incentives for NV to actually succeed at some of that... If they can get it working on old hardware, it's gonna move a lot of old cards (even after Fermi's launch) as people who never would've tried SLI start buying a second (old) card to try this out... Personally, I'd be one of 'em. I haven't done SLI since the Voodoo days, but if they enable multi-display on my GTX 260 I'm definitely getting a second one and trying it out.
 
I can't express how glad I am that someone is taking a stand on this, and has the forethought to do it before Nvidia has already done something shady. I'm quite sick of their bullshit with PhysX.

Actually, that reminds me: I still need to write up my findings with Batman: AA showing the majority of the PhysX effects running perfectly fine on a quad-core CPU, despite CPU usage being pretty bad (40%).

Yeah, that would be helpful. If possible, lower your OC down to the 2.8 or 3.0GHz region for one run of the tests. It will keep the "few people run that heavily OCed" complaints to a minimum.
 
Much has been said of AMD's value, and it's been good over the last 18 months, but if Fermi looks good, the multi-display lock-in concerns that Kyle raised don't arise, and they add solid multi-monitor support to legacy GTX 2XX GPUs, nVidia could end up looking pretty good at the end of the day.

To think Fermi is going to look good is to ignore the laws of physics and myriad data to the contrary.

Also, most computers out there with Nvidia GPUs in them don't have SLI capability. They can't do three-monitor 3D Vision Surround with legacy GeForce cards. For that matter, what percentage of currently selling computers and motherboards support SLI? 3%?... 4%? ALL of them can run three-monitor Eyefinity, though. Nvidia didn't even have a prototype single-board Fermi running a three-monitor setup.

I don't see how Nvidia isn't going to end up looking like crap at the end of the day.
 
To think Fermi is going to look good is to ignore the laws of physics and myriad data to the contrary.

Also, most computers out there with Nvidia GPUs in them don't have SLI capability. They can't do three-monitor 3D Vision Surround with legacy GeForce cards. For that matter, what percentage of currently selling computers and motherboards support SLI? 3%?... 4%? ALL of them can run three-monitor Eyefinity, though. Nvidia didn't even have a prototype single-board Fermi running a three-monitor setup.

I don't see how Nvidia isn't going to end up looking like crap at the end of the day.

Dude, wtf? Have you seen a Fermi? If not, then your laws-of-physics argument is pretty weak, wouldn't you say? You can't apply ANY law of physics to something you know nothing about.

ALL high-end enthusiast boards these days support both SLI AND Crossfire, like the one I have in my system and like many others.

And for the sake of argument, for those of us who have GTX 2XX SLI setups: if these drivers do indeed support older GPUs and it works well (once again, neither I nor you know how well this is going to work), then I will have gotten a cool new feature, one that Kyle has been saying is the next big deal in PC gaming, for FREE! Well, for no more money than I've already spent.

The next BIG thing in gaming on almost two-year-old hardware? Sorry, if it works well and AMD doesn't support this tech on the Radeon 4000s, then that's a BIG plus for nVidia. Sure, no DX11, but with my two-year-old hardware I'd have the potential to experience better multi-monitor gaming than all the folks who ran out and bought 5000s to do it. Hey, that'll work for me.

I have no idea how this is going to play out. I do know that I was about to buy a 5770 or 5870 and decided to wait until I know more. Hell, if multi-monitor works well with my 280s I might just hold on to them a little longer.

nVidia has my interest for another 60 days. I'm not lacking any gaming punch right now; the only thing I don't have is multi-monitor, and that may be only a driver away.

And if all this is crap, I'll just take out the three 280s and put in three 5870s. You know, on one of the 3% or 4% of motherboards that support SLI!
 
Bolded part: I think most of the posters here don't really understand, or just don't care, because they are, what, 16 to 20 year olds who (and please understand I'm not taking shots at you guys) don't understand business. Contracts are contracts, and you don't use your money to help competitors.


I disagree about one big TV vs. Eyefinity though. If that TV could achieve the resolutions Eyefinity can, then hell yes I would take it, because there would be no bezels.

But until a 55-inch TV can do... what is it, 5760x1200? ...Eyefinity is better IMO.

I don't know. 4K TVs are pretty much on the way, and of course there will be a price premium at first. You lose a little horizontal, but (and the standard has pretty much been hammered out) at 4096x3072 you can pretty much nuke any need for AA. I think that's enough resolution for me, for now, especially on a 50"+ LED/LCD ;)
 
X58/P55 support both CF and SLI.

A Newegg search turned up 15 Intel socket 1156/1366 SLI boards out of 140 total: 5 socket 1156 and 10 socket 1366. Twelve of those boards were $200 and above.

And that's the aftermarket, build-your-own segment. Mass-market computers with SLI are going to be a small fraction of that.

ALL of the motherboards and mass-market computers can run Eyefinity.
 
Eyefinity won't ever sway console gamers.

People buy consoles because they are cheaper, easier, and you can play them from your couch. Eyefinity is even MORE expensive (when PC gaming is already more expensive). Very few people can afford (or are willing to pay) that much money for two extra displays and a beefy card to power them. And you still won't play it from your couch.

While I'm sure it's ridiculously awesome (if only I had the income...), it's just not as revolutionary as it's made out to be.

Once, or if, it becomes a more affordable or easier solution, then the word "revolutionary" should come back.
 
But NVIDIA didn't ship any code. Presumably, Eidos owns whatever NVIDIA wrote for Batman. I'd guess that absolves NVIDIA from being subject to antitrust suits.

They didn't lock out AMD, either. AMD could have attempted to work with Eidos but either didn't or Eidos locked AMD out. Again, the focus is directed toward Eidos.

More like ATI/AMD is continuing on its old path of doing f*ck-all for developers. TWIMTBP doesn't offer a lot of money; it offers a lot of NDA hoops (Nvidia's programmers have to sign strict NDAs because they are being given access to a company's game). Nvidia may be a PITA at times, but comparing the two major vendors' developer programs (now that Larrabee has been kicked to the curb by Intel), AMD (and ATI on its lonesome before that) is the biggest loser on the planet. You know why you see TWIMTBP everywhere? Because Nvidia is willing to make an effort (backed up with man-hours, aka money!) working with developers. For every game ATI does that with, Nvidia will do 20.

AMD (and ATI before the AMD purchase) bought, or tried to buy, developers with cash. Nvidia bought, and continues to buy, developers with free code while asking them to pop up a logo. Which program has been more successful at getting games? Maybe AMD should pull their head out of their a$$ and figure it out.
 
Dude, wtf? Have you seen a Fermi? If not, then your laws-of-physics argument is pretty weak, wouldn't you say? You can't apply ANY law of physics to something you know nothing about.

If NOTHING were known about Fermi, you would be right.

But that's not the case, is it?

A great deal is known about Fermi. Nvidia itself has officially released a great deal of data about it. We know its overall design, how many transistors it has, what fabrication node and process it's being manufactured on, how big it is, and that a large chunk of those transistors are for GPGPU functions. The laws of physics apply. Compared to Cypress, it will cost more to fabricate. It will consume more power. It will run hotter. That's just flat-out built into the chip design. There is no magic wand here. No thinking, rational person with a grounding in basic physics could think otherwise. What remains to be seen is how much performance it will have relative to its increased cost, power draw, and heat. The deafening silence from Nvidia on that front is not promising.
 
A Newegg search turned up 15 Intel socket 1156/1366 SLI boards out of 140 total: 5 socket 1156 and 10 socket 1366. Twelve of those boards were $200 and above.

And that's the aftermarket, build-your-own segment. Mass-market computers with SLI are going to be a small fraction of that.

ALL of the motherboards and mass-market computers can run Eyefinity.

Because Eyefinity can run on one card, and I still don't know if nFinity can on Fermi yet. Do you?

As for the mass market, honestly we're not talking about a mass-market product yet. $200 for a motherboard is piss money. And to REALLY drive Eyefinity you need at least a 5850 and three monitors, which is about an $800 proposition.
 
If NOTHING were known about Fermi, you would be right.

But that's not the case, is it?

A great deal is known about Fermi. Nvidia itself has officially released a great deal of data about it. We know its overall design, how many transistors it has, what fabrication node and process it's being manufactured on, how big it is, and that a large chunk of those transistors are for GPGPU functions. The laws of physics apply. Compared to Cypress, it will cost more to fabricate. It will consume more power. It will run hotter. That's just flat-out built into the chip design. There is no magic wand here. No thinking, rational person with a grounding in basic physics could think otherwise. What remains to be seen is how much performance it will have relative to its increased cost, power draw, and heat. The deafening silence from Nvidia on that front is not promising.

Yes, we know all about it except, as you just said, performance. Unless it's super hot and/or extremely power hungry, do you really think the average PC hardware enthusiast looking for the best performance is going to care?
 
Well, expect it to consume no more than 300 watts, since that is the maximum in the PCI Express 2.0 spec. That's probably nVidia's Achilles' heel. I wouldn't be surprised if they need to downclock the card to meet the spec, thus compromising its "stock" performance.
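For reference, that 300W ceiling is just the sum of the PCI-SIG power budgets: 75W from the x16 slot, 75W from a 6-pin connector, and 150W from an 8-pin connector. A trivial breakdown:

```python
# Where the 300W ceiling for a single card comes from (PCI-SIG power budgets):
slot_w = 75        # PCIe x16 slot
six_pin_w = 75     # one 6-pin PEG connector
eight_pin_w = 150  # one 8-pin PEG connector
print(slot_w + six_pin_w + eight_pin_w)  # 300 watts total
```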
 
Well, expect it to consume no more than 300 watts, since that is the maximum in the PCI Express 2.0 spec. That's probably nVidia's Achilles' heel. I wouldn't be surprised if they need to downclock the card to meet the spec, thus compromising its "stock" performance.

That could leave quite a bit of room to OC, if you can cool it and power it, that is. We will find out in a month or three.


Eyefinity won't ever sway console gamers.

People buy consoles because they are cheaper, easier, and you can play them from your couch. Eyefinity is even MORE expensive (when PC gaming is already more expensive). Very few people can afford (or are willing to pay) that much money for two extra displays and a beefy card to power them. And you still won't play it from your couch.

While I'm sure it's ridiculously awesome (if only I had the income...), it's just not as revolutionary as it's made out to be.

Once, or if, it becomes a more affordable or easier solution, then the word "revolutionary" should come back.

It's not meant to. Why would they bother? By and large, console buyers buy on price and convenience, not quality, performance, or features. You don't advertise Ferrari's new bells and whistles in the projects. It's a waste of time.

As for affordable, it is. People have this idea in their heads that it takes $$$$ to buy a decent gaming rig, when it does not. Your typical $600-$800 family PC bought in the last year or so is usually only a GPU upgrade away from being a damn decent game player. Far better than any of the current consoles as well, at this point in their life cycle.
 
For that matter, what percentage of currently selling computers and motherboards support SLI? 3%?... 4%? ALL of them can run three-monitor Eyefinity, though. Nvidia didn't even have a prototype single-board Fermi running a three-monitor setup.
That's a fair point, actually: three monitors on a single card is a competitive advantage. Shall we start keeping score?

Advantages for AMD:
--Available now.
--Three monitors on a single card.

Advantages for nVidia:
--Can use VGA outputs.
 
As for the mass market, honestly we're not talking about a mass-market product yet. $200 for a motherboard is piss money. And to REALLY drive Eyefinity you need at least a 5850 and three monitors, which is about an $800 proposition.

Two $120 19" monitors and a $130 5750 = $370 for the average non enthusiast pc gamer to enjoy eyefinity after he visits a friend or a friend of a friend, experiences it first hand and falls in love with it. No AA or AF and some eye candy turned down is what most pc gamers experience. See the configuration stats over at Steam.

And a soon-arriving $30 DisplayPort adapter.
 
I don't know if the article mentions it or someone else has already asked; my apologies in advance if I sound like a broken record.
Will Nvidia Surround have to be set up with 3D-compatible monitors, or can I just use my non-3D-compatible 20" Dell monitors?
 
For everything, not just 3D glasses, IMO. Unfortunately, there's no such thing as a 120Hz IPS panel.

Seriously. I have no intention of even using those goofy glasses but if a reasonably priced 120Hz IPS panel came out I would be all over it.
 
Two $120 19" monitors and a $130 5750 = $370 for the average non enthusiast pc gamer to enjoy eyefinity after he visits a friend or a friend of a friend, experiences it first hand and falls in love with it. No AA or AF and some eye candy turned down is what most pc gamers experience. See the configuration stats over at Steam.

And a soon-arriving $30 DisplayPort adapter.

Most people are not going to want to do Eyefinity with only two monitors. And a 5750 is cutting it REAL close. And that's still the price of a laptop. I don't know why you think this is, or will be anytime soon, the average person's setup. It's not, and if you lowball it, it's going to suck and you'll wonder why people like it.
 
For everything, not just 3D glasses, IMO. Unfortunately, there's no such thing as a 120Hz IPS panel.

I was under the impression you could use non-120Hz monitors just to get the three screens running without any fancy-dancy 3D.
 
Seriously. I have no intention of even using those goofy glasses but if a reasonably priced 120Hz IPS panel came out I would be all over it.


Forgive me if I'm out in left field on this, but do any IPS panels have fast enough response times to even display at 120Hz under normal conditions? I mean, unless my math is off, 1000/120 ≈ 8.3, so you would need a real-world, worst-case minimum response time of about 8ms or less to truly display at 120Hz. I know a lot of them are "claiming" low response times, but since the ratings the manufacturers use are not all that reliable or uniform, I don't know how well they can be believed.
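Spelling out the arithmetic in that post (illustrative only):

```python
# Frame-time math behind the response-time question: at 120Hz each frame is
# on screen for about 8.3 ms, so a panel would need a true worst-case pixel
# response of roughly 8 ms or less to keep up.
for refresh_hz in (60, 75, 120):
    frame_time_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz} Hz -> {frame_time_ms:.1f} ms per frame")
```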
 