Leaked 256 driver supports 3D Vision Surround

Status
Not open for further replies.

vjcsmoke

Supreme [H]ardness
Joined
Dec 5, 2006
Messages
4,511
I want 64 bit drivers. Beta or whatever is fine, just gimme gimme! Nuff said.
 

Naieve

Gawd
Joined
Jan 20, 2010
Messages
749
+1, but I really don't want to bother with them if they're too flaky; I'd rather wait till they're reasonably solid.

Apparently they are only flaky in a couple of DX11 games. I bet if those games were switched to DX9 it would straighten up some.

I'd rather have beta drivers so I could at least play a few games in Surround.
 

Brackle

Old Timer
Joined
Jun 19, 2003
Messages
7,918
This is great news :)

Now the real question is, can you go higher than 1080p monitors?
 

Unknown-One

[H]F Junkie
Joined
Mar 5, 2005
Messages
8,905
This is great news :)

Now the real question is, can you go higher than 1080p monitors?
DVI doesn't have the bandwidth to go much higher than 1920x1200 at 120Hz. You'll have to drop to a lower refresh rate to use higher resolutions, which means Nvidia's 3D glasses won't work.

There should be nothing stopping you from running plain old triple monitor (without 3D glasses) at higher resolutions, though.
 

Brackle

Old Timer
Joined
Jun 19, 2003
Messages
7,918
DVI doesn't have the bandwidth to go much higher than 1920x1200 at 120Hz. You'll have to drop to a lower refresh rate to use higher resolutions, which means Nvidia's 3D glasses won't work.

There should be nothing stopping you from running plain old triple monitor (without 3D glasses) at higher resolutions, though.

No, see, NV Surround is supposed to only support 1080p monitors because of an SLI bandwidth issue.

That does not mean it IS true; I am just wondering if these pre-beta drivers support higher than 1920x1080 in NV Surround gaming.
 

Naieve

Gawd
Joined
Jan 20, 2010
Messages
749
No, see, NV Surround is supposed to only support 1080p monitors because of an SLI bandwidth issue.

That does not mean it IS true; I am just wondering if these pre-beta drivers support higher than 1920x1080 in NV Surround gaming.

The actual problem is that single-link DVI can at most run slightly above 1920x1200 at 60Hz; SLI bandwidth has nothing to do with it. To run 1920x1080 at 120Hz you need the extra bandwidth of dual-link DVI to physically deliver the information to the monitor. Regular Surround gaming is stated to support up to 2560x1600 x 3, FYI; that is stated in the white paper.
 

GoldenTiger

Fully [H]
Joined
Dec 2, 2004
Messages
22,454
Nope, they were planning on holding Surround gaming back till the end of June to finish working out the bugs being reported in this thread.

The question now is: will they let the x64 driver leak, will they release a beta package of Surround with the 24th release now that the cat is out of the bag, or will they merely stay quiet and make the x64 crowd wait for the initial production version of Surround gaming in June?

This is Nvidia, people, so I am assuming they will stay quiet and make the x64 crowd, like myself, wait till June hoping for a leak.

edit:

http://www.bit-tech.net/news/hardware/2010/05/20/nvidia-surround-gaming-hit-by-delays/1

I was under the impression that it was 3D Vision Surround, not Nvidia Surround (their Eyefinity equivalent), that was slated for June. Regardless, I don't care about that... I'm much more interested in the reported performance boosts that will stick these cards even further ahead :D.
 

kllrnohj

Supreme [H]ardness
Joined
Apr 1, 2003
Messages
6,845
Regular Surround gaming is stated to support up to 2560x1600 x 3, FYI; that is stated in the white paper.

And that is the part that is supposedly wrong. Rumor has it that Surround Vision (*NOT* 3D) was also limited to 1080p due to SLI bandwidth issues.
 

Naieve

Gawd
Joined
Jan 20, 2010
Messages
749
And that is the part that is supposedly wrong. Rumor has it that Surround Vision (*NOT* 3D) was also limited to 1080p due to SLI bandwidth issues.

No way; if this were true they would be forced to drop 3D Surround entirely, as it wouldn't be possible, which it obviously is since they already demonstrated the system. 3D requires twice the bandwidth, so without a doubt they can do regular Surround.

Let's look at it this way:

5760x1080 = ~6.2 megapixels; double that for 3D at twice the Hz = ~12.4 megapixels

2560x1600 = ~4.1 megapixels; times 3 screens = ~12.3 megapixels

As they have already done a public demonstration of 3D Surround, there can be no doubt the bandwidth exists for regular Surround up to 2560x1600x3.
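For the record, the pixel arithmetic here can be checked in a couple of lines. This is a quick sketch using raw pixel counts only (ignoring color depth and blanking intervals):

```python
# Raw pixel counts behind the comparison: triple 1080p doubled for 3D
# vs triple 2560x1600 at normal refresh.
surround_3d = 5760 * 1080 * 2    # 3x 1920x1080, doubled for 3D at twice the Hz
surround_30 = 2560 * 1600 * 3    # 3x 2560x1600 at 60Hz

print(f"3D Surround:  {surround_3d / 1e6:.1f} megapixels")   # ~12.4
print(f"3x 2560x1600: {surround_30 / 1e6:.1f} megapixels")   # ~12.3
```

The two workloads come out within about a percent of each other, which is the whole point of the comparison.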
 

Timbo

n00b
Joined
Apr 13, 2010
Messages
5
As they have already done a public demonstration of 3D Surround, there can be no doubt the bandwidth exists for regular Surround up to 2560x1600x3.

I agree with this, however... ATi have a "little secret" ( http://www.techreport.com/articles.x/18521/2 ) in that ATi's CrossFire bridge is bandwidth-limited to about 4 megapixels. I understand the SLI bridge to be similarly (if not worse) affected (and while GF100 may have had SLI bridge tweaks, they can't go back and change the 200s now).

I say this because CrossFire is actually broken at 3x30" (mine runs at 8064x1600, almost 13 megapixels with BC) and does start to be impacted (albeit hard to discern) at between 4 and 6 megapixels, progressively getting worse thereafter. I'm actually done with CrossFire on triple 30"s (because of the above) and am watching Nvidia's version closely, because NV *must* overcome the bandwidth limitations of SLI for their 120Hz/1080 systems to run (as Naieve correctly stated, if there's sufficient bandwidth for 120Hz/1080, then there's probably sufficient for 3x30"). How they do this, though, I'm keen to see...

cheers
 
Last edited:

ThunderGod66

Limp Gawd
Joined
Aug 5, 2004
Messages
369
Please play some games in NV Surround without 3D and take pics and video if you can. Please let us know how it runs.

I got BC2 to run... someone mentioned here that DX11 games had problems, so I forced it to DX9 in the ini file and it runs fine.

I took a bad pic with my iPhone. It's really hard to get it all in the picture because three 23" monitors are really wide. I had Fraps running, but you can't see it because of how bad the pic quality is... it says 78fps at that moment, though. I had everything on high and I think AA was on 2x and AF on 4x. Just messing around with settings still to see what's playable.




 

Unknown-One

[H]F Junkie
Joined
Mar 5, 2005
Messages
8,905
I agree with this, however... ATi have a "dirty little secret" (Tech Report's words, not mine, though they are correct) in that ATi's CrossFire bridge is bandwidth-limited to about 4 megapixels. I understand the SLI bridge to be similarly (if not worse) affected (and while GF100 may have had SLI bridge tweaks, they can't go back and change the 200s now).

It's likely they're using bandwidth from the PCIe slot itself to supplement the bandwidth of the SLI bridge. After all, SLI can function without a bridge (all traffic over PCIe), so it's reasonable to assume they can use the SLI bridge in concert with PCIe to get the inter-card bandwidth they need.
 

ThunderGod66

Limp Gawd
Joined
Aug 5, 2004
Messages
369
Age of Conan in DX10, max details with SSAO off. 16xQ AA gives me 20-30fps in town and about 50fps out in the emptier areas; 4x AA gives more like 30 up to the high 70s.
It's kind of fishbowl-ish on the sides, but I don't know if that's intended or something with the FOV I should change. Doesn't bother me because it's more peripheral stuff while playing.






 
Last edited:

bigdogchris

Fully [H]
Joined
Feb 19, 2008
Messages
18,435
Can someone explain to me why Nvidia requires two GPUs (each being more powerful than a single ATI card) to do what ATI is doing with one? This is a serious question, btw. I can't find the answer. Is it just to sell more graphics cards?
 

Unknown-One

[H]F Junkie
Joined
Mar 5, 2005
Messages
8,905
Can someone explain to me why Nvidia requires two GPUs (each being more powerful than a single ATI card) to do what ATI is doing with one? This is a serious question, btw. I can't find the answer. Is it just to sell more graphics cards?

Nvidia's cards only have two RAMDACs, which means each card can output to two displays. You need a second card to connect a third monitor.

ATi's cards share the same limitation; they only have two RAMDACs, so you can only get two DVI signals. They allow you to have a third display by also including DisplayPort, which doesn't need a RAMDAC.
 

Matrices

Supreme [H]ardness
Joined
Feb 5, 2003
Messages
5,256
Very nice. Now maybe I need to reconsider the $300+ I was thinking of spending to watercool my two 470s; maybe it's better spent on a new monitor setup!
 

bizzmeister

2[H]4U
Joined
Apr 26, 2010
Messages
2,322
I got BC2 to run... someone mentioned here that DX11 games had problems, so I forced it to DX9 in the ini file and it runs fine.

I took a bad pic with my iPhone. It's really hard to get it all in the picture because three 23" monitors are really wide. I had Fraps running, but you can't see it because of how bad the pic quality is... it says 78fps at that moment, though. I had everything on high and I think AA was on 2x and AF on 4x. Just messing around with settings still to see what's playable.





Thanks for responding, man. But did you even try to play it in DirectX 11?
 

Kowan

Supreme [H]ardness
Joined
Jun 28, 2000
Messages
5,316
Ha ha! Nice try to cover up your own assholeness. So let's see who's conveying information properly here: the OP posts a screenshot clearly showing he's using 3 DVI ports (NOT his HDMI port), you see a totally unrelated HDMI error in a screenshot, call him a liar, and now you're telling him to get over himself and "convey information in a more complete way". If the forum rules allowed it I would call you a ***** ** ****.

HB
Well said. :D
 

noquarter

Gawd
Joined
Jan 7, 2010
Messages
606
No way; if this were true they would be forced to drop 3D Surround entirely, as it wouldn't be possible, which it obviously is since they already demonstrated the system. 3D requires twice the bandwidth, so without a doubt they can do regular Surround.

Let's look at it this way:

5760x1080 = ~6.2 megapixels; double that for 3D at twice the Hz = ~12.4 megapixels

2560x1600 = ~4.1 megapixels; times 3 screens = ~12.3 megapixels

As they have already done a public demonstration of 3D Surround, there can be no doubt the bandwidth exists for regular Surround up to 2560x1600x3.

3D doesn't require twice the bandwidth itself. Though there is overhead and other data, the majority of what's sent across the SLI link is the frame buffer, and the frame buffer is a static size determined by the resolution (5760x1080 = 6,220,800 pixels = ~24.9MB at 32-bit color).

What 3D does require is double the frames to give you your normal fps. You could claim from this that 3D requires double the bandwidth, but if you get 30fps in non-3D you would only pull 15fps in 3D, which means it always requires the same bandwidth out of the SLI link as non-3D for any given setup.

For what it's worth, the SLI link is supposedly capable of 1GB/s, which would allow for ~43 frames per second at 5760x1080 (not including overhead and other data besides the frame buffer) without hitting the PCI-e bus if it has to send the complete frame buffer, or ~86 frames per second if the driver is clever enough to send 1/3 one way and 2/3 the other way on the following frame.

The CrossfireX link does 0.9GB/s, but you are able to double them up for 1.8GB/s (unsure if you can do this with SLI). I have a feeling some of the stuttering people report in Crossfire Eyefinity is because of data being pumped across the PCI-e bus when the Crossfire link is saturated. The bandwidth exists in the PCI-e bus, but I believe ATI has complained about latency and sync issues when hitting it. Also, with Crossfire the max fps automatically matches the 'clever driver' guesstimate I gave for nVidia, since the slave card only has to send its frame buffer on the frames it was responsible for (every other frame); if you double-bridge it, this'd be about 155 frames. Perhaps there exists some other significant overhead, or the link can't always provide the instantaneous bandwidth required; it's hard to find information about how these links work :(
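The frame-buffer math above can be sanity-checked with a short script. This is a sketch only: it assumes 32-bit color and reads the quoted "1GB/s" as 1 GiB/s, which is what makes the ~43fps figure come out:

```python
# SLI-link throughput estimate for a 5760x1080 Surround frame buffer.
BYTES_PER_PIXEL = 4                          # assuming 32-bit color
frame_bytes = 5760 * 1080 * BYTES_PER_PIXEL  # ~24.9 MB per frame
link_bps = 1024**3                           # "1GB/s" link, read as 1 GiB/s

fps_whole = link_bps / frame_bytes           # full frame buffer crosses the link every frame
fps_split = link_bps / (frame_bytes / 2)     # alternating 1/3 and 2/3 averages half a frame

print(f"frame size:     {frame_bytes / 1e6:.1f} MB")  # ~24.9
print(f"whole-frame:    {fps_whole:.0f} fps")         # ~43
print(f"split transfer: {fps_split:.0f} fps")         # ~86
```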
 
Last edited:

ThunderGod66

Limp Gawd
Joined
Aug 5, 2004
Messages
369
Thanks for responding, man. But did you even try to play it in DirectX 11?

Yeah, I posted earlier in the thread that I tried BC2 but all the textures were messed up... it was like I was playing under the game world and walking through the ground. Kind of like it looks when your video card is about to die. That was singleplayer and multiplayer... I tried both, and I tried changing every graphics option while in DX11 and nothing had an effect. I'm fine with using DX9 for now... I mean, the drivers aren't even officially out, so who knows what I'm using right now.
 

Michaelius

Supreme [H]ardness
Joined
Sep 8, 2003
Messages
4,684
Hmm, now if only Nvidia allowed SLI on ATI chipsets.

Also, what happens to the display configuration when you disable SLI?
 

zehoo

Limp Gawd
Joined
Aug 22, 2004
Messages
429
Looking good :p


Now we just need the game devs to fix that horrible FOV issue with the stretching on the sides :p
 

Timbo

n00b
Joined
Apr 13, 2010
Messages
5
I have a feeling some of the stuttering some people report in Crossfire Eyefinity is because of data being pumped across the PCI-e bus when the Crossfire link is saturated...

You're correct. I believe this is the root cause of multi-GPU issues with Eyefinity (exacerbated by low VRAM on 1GB GPUs, putting even more demand on the PCIe bus).

It's likely they're using bandwidth from the PCIe slot itself to supplement the bandwidth of the SLI bridge. After all, SLI can function without a bridge (all traffic over PCIe), so it's reasonable to assume they can use the SLI bridge in concert with PCIe to get the inter-card bandwidth they need.

The PCIe bus is also saturated. This is what ATi already does, and it fails. I'm well aware that there's little apparent benchmark difference between PCIe 8x and 16x when CrossFire and SLI are not trying to reach beyond what the bridge can carry. However, when the bridge is saturated (beyond, say, 4 megapixels) that all changes. This is why I say I'm keen to see how they are going to achieve this, as I firmly believe the PCIe bus is already overburdened and will be of little benefit to a saturated bridge.

it's hard to find information about how these links work :(

Agreed. It'd be great to see links to anything that can shed further light on these...

Cheers...
 
Last edited:

Unknown-One

[H]F Junkie
Joined
Mar 5, 2005
Messages
8,905
3D doesn't require twice the bandwidth itself

We're talking about DVI bandwidth (which is what sets a hard limit on screen size for 3D Vision). 1920x1080 at 120Hz requires exactly double the DVI bandwidth of 1920x1080 at 60Hz.

A dual-link DVI cable simply doesn't have the bandwidth to carry 2560x1600 at 120Hz... it barely has the bandwidth to handle 2560x1600 at 60Hz. That's why Nvidia's 3D glasses won't work on anything larger than 1920x1080 or 1920x1200; those are the highest resolutions you can run at 120Hz over DVI.
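A rough way to see this: dual-link DVI tops out at a 330 Mpixel/s pixel clock (two 165 MHz TMDS links), and real display modes also need blanking intervals on top of the raw pixel rate, so the figures below flatter the modes, not the cable:

```python
# Raw pixel rate of each mode vs the dual-link DVI ceiling (2 x 165 MHz TMDS).
DUAL_LINK_LIMIT = 2 * 165_000_000  # pixels per second

modes = {
    "1920x1080 @ 120Hz": 1920 * 1080 * 120,
    "1920x1200 @ 120Hz": 1920 * 1200 * 120,
    "2560x1600 @ 60Hz":  2560 * 1600 * 60,
    "2560x1600 @ 120Hz": 2560 * 1600 * 120,
}
for name, pixel_rate in modes.items():
    verdict = "fits" if pixel_rate <= DUAL_LINK_LIMIT else "does not fit"
    print(f"{name}: {pixel_rate / 1e6:5.0f} Mpx/s, {verdict}")
```

Only 2560x1600 at 120Hz blows past the ceiling even before blanking is counted, which is exactly why 1920x1080/1200 are the largest 120Hz modes DVI can carry.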
 

Matrices

Supreme [H]ardness
Joined
Feb 5, 2003
Messages
5,256
All this talk of the PCI-E bus being "saturated" - where is the evidence? A simple test should suffice: apply supersampling to a game and benchmark it on a 16x/16x platform and an 8x/8x platform. Make the supersampling level comparable to 5760x1080 and compare results.
 

Unknown-One

[H]F Junkie
Joined
Mar 5, 2005
Messages
8,905
Considering triple 2560x1600 at 60Hz actually requires LESS data throughput than triple 1920x1080 at 120Hz, it might not be so bad.
 

WBurchnall

2[H]4U
Joined
Oct 10, 2009
Messages
2,622
Age of Conan in DX10, max details with SSAO off. 16xQ AA gives me 20-30fps in town and about 50fps out in the emptier areas; 4x AA gives more like 30 up to the high 70s.

Hmm, nice screenshots. There don't seem to be any graphical glitches, which is always good for a leaked beta driver. Can anyone with Eyefinity chime in and say how these 20-30fps-in-town numbers compare on their system? I know WoW in cities is usually CPU-limited, so we might be seeing the same thing in AoC, in which case this might not be the best game to use as a benchmark for how 3D Stereo performs, I imagine...

On another note, those screenshots seem unsettling to me. I used to play AoC back when it first came out and remember the good old days of the graphically and gameplay-impressive Tortage, which died out shortly after Tortage itself. I must be remembering Age of Conan in a better light than it actually was, because for 'max details' those graphics don't seem as impressive as they used to. Back at launch, AoC seemed beautiful compared to WoW's Saturday-morning-cartoon graphics or FFXI's fairly dated look (the two MMORPGs I had played most recently at the time), but after playing LOTRO on ultra-high quality settings it looks a bit uglier than I remembered. I guess my MMORPG graphical standards have moved up a notch.
 

Naieve

Gawd
Joined
Jan 20, 2010
Messages
749
"You could claim from this that 3d requires double the bandwidth, but if you get 30 fps in non-3d you would only pull 15fps in 3d which means it would always require the same bandwidth out of the SLI link as non-3d for any given set up."

Correct me if I'm wrong, which I might very well be since I haven't really been very interested in 3d, but I thought 3d required a steady 120fps being pushed to the monitors to keep the 3d effect going.

"For what it's worth, the SLI link is supposedly capable of 1GB/s, which would allow for 43 frames per second at 5760x1080 (not including overhead and other data besides frame buffer) without hitting the PCI-e bus if it has to send the complete frame buffer, or ~86 frames per second if the driver is clever enough to send 1/3 one way and 2/3 the other way on the following frame."

The one thing about using the SLI link for video is that you only need to transfer one screen over from the main card, which cuts the load by 1/3.

"The CrossfireX link does 0.9GB/s but you are able to double them up for 1.8GB/s (unsure if you can do this with SLI). I have a feeling some of the stuttering some people report in Crossfire Eyefinity is because of data being pumped across the PCI-e bus when the Crossfire link is saturated. The bandwidth exists in the PCI-e bus but I believe ATI has complained about latency and sync issues when hitting the PCI-e bus. Also with Crossfire the max fps is automatically in the 'clever driver' guesstimate I gave for nVidia since the slave card only has to send its frame buffer on the frames it was responsible for (every other frame), and if you double bridge it this'd be about 155 frames. Perhaps there exists some other significant overhead or the link can't always provide the instantaneous bandwidth required though, it's hard to find information about how these links work"

Eyefinity runs off of a single card; CrossFire is completed in the regular manner without any need to transfer video to the DVI outputs on the second card. If you are saying the link is saturated simply from the information load, then obviously they need to cut it down. As has been mentioned, more RAM helps (the 480 has a half gig more), and I'm pretty sure cutting down on pre-rendered frames would lower the bandwidth load as well.
 
Last edited:

Naieve

Gawd
Joined
Jan 20, 2010
Messages
749
It's supposed to support GTX260 SLI, GTX275 SLI, GTX280 SLI, GTX285 SLI, and the GTX295 (dual GPU card with three working outputs, so you only need one).

I'm pretty sure that is going to be added later. So far Surround is only on the 400 series; once the 256-series drivers support their flagship product decently, I bet they'll concentrate on the older models.
 

noquarter

Gawd
Joined
Jan 7, 2010
Messages
606
We're talking about DVI bandwidth (which is what sets a hard limit on screen size for 3D Vision). 1920x1080 at 120Hz requires exactly double the DVI bandwidth of 1920x1080 at 60Hz.

A dual-link DVI cable simply doesn't have the bandwidth to carry 2560x1600 at 120Hz... it barely has the bandwidth to handle 2560x1600 at 60Hz. That's why Nvidia's 3D glasses won't work on anything larger than 1920x1080 or 1920x1200; those are the highest resolutions you can run at 120Hz over DVI.

Oops, Naieve's post was a response to a post about SLI bandwidth, so I assumed that's what we were talking about. You are definitely right about the DVI bandwidth limitation. But AFAIK Nvidia has limited it to 1920x1080 per monitor even in regular Surround, which shouldn't be a DVI limitation and hints at an SLI limitation.

Matrices said:
All this talk of the PCI-E bus being "saturated" - where is the evidence? A simple test should suffice: apply supersampling to a game and benchmark it on a 16x/16x platform and an 8x/8x platform. Make the supersampling level comparable to 5760x1080 and compare results.

You could, as long as you set the AA high enough to run the card out of VRAM and force it to use system RAM. But I think the real issue is PCIe latency, not bandwidth, though there are clearly some bursts of data capable of briefly saturating the PCIe bus on single GPUs, or there wouldn't be a 5% drop in fps from 16x to 8x.
 

noquarter

Gawd
Joined
Jan 7, 2010
Messages
606
"You could claim from this that 3d requires double the bandwidth, but if you get 30 fps in non-3d you would only pull 15fps in 3d which means it would always require the same bandwidth out of the SLI link as non-3d for any given set up."

Correct me if I'm wrong, which I might very well be since I haven't really been very interested in 3d, but I thought 3d required a steady 120fps being pushed to the monitors to keep the 3d effect going.
It doesn't require 120fps; it just requires 120Hz, which is entirely independent of fps, and this requirement only exists to reduce flickering from the shutter glasses (the same flickering you would get on CRTs at 60Hz). Other than this 120Hz requirement, the fps is just like normal fps, where 30 is barely playable, 60 is most desirable, etc., except that a card that could do 30fps non-3D would only be able to do 15fps since it has to render each scene twice.

"For what it's worth, the SLI link is supposedly capable of 1GB/s, which would allow for 43 frames per second at 5760x1080 (not including overhead and other data besides frame buffer) without hitting the PCI-e bus if it has to send the complete frame buffer, or ~86 frames per second if the driver is clever enough to send 1/3 one way and 2/3 the other way on the following frame."

The one thing about using the SLI link for video is that you only need to transfer one screen from the main card over. Which cuts the load by 1/3.

Right, well, on odd frames one screen needs to be transferred from the main card to the secondary, and on even frames two screens need to be transferred from the secondary to the main. The question is whether the drivers are smart enough to cut up the frame before sending; they should be, but it's conceivable they can't transfer a partial frame buffer (remember, it isn't rendered as three 1920x1080 frames but as one giant 5760x1080 frame).

"The CrossfireX link does 0.9GB/s but you are able to double them up for 1.8GB/s (unsure if you can do this with SLI). I have a feeling some of the stuttering some people report in Crossfire Eyefinity is because of data being pumped across the PCI-e bus when the Crossfire link is saturated. The bandwidth exists in the PCI-e bus but I believe ATI has complained about latency and sync issues when hitting the PCI-e bus. Also with Crossfire the max fps is automatically in the 'clever driver' guesstimate I gave for nVidia since the slave card only has to send its frame buffer on the frames it was responsible for (every other frame), and if you double bridge it this'd be about 155 frames. Perhaps there exists some other significant overhead or the link can't always provide the instantaneous bandwidth required though, it's hard to find information about how these links work"

Eyefinity runs off of a single card, crossfire is completed in the regular manner without any need to transfer video to the DVI on the second card. If you are saying the link is saturated simply from the information load, then obviously they need to cut it down. As has been mentioned more ram, the 480 has a half gig more, also I'm pretty sure cutting down on pre-rendered frames would lower the bandwidth load as well.

Eyefinity runs off a single card, but when you CrossFire them, the secondary card still needs to send its entire frame (5760x1080) to the primary for output. Nvidia's implementation may actually have an edge here, because sending the entire 5760x1080 frame every other frame incurs higher bandwidth spikes (no data, big data, no data, big data) than sending 1920x1080 on odd frames and 3840x1080 on even frames (small, medium, small, medium).

The memory capacity doesn't play a role in this bandwidth requirement; I'm only talking about the output frame, which needs to be sent to the appropriate video outputs, and the SLI bridge is the fastest way to get it there.
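The two transfer patterns described above can be compared directly. A sketch, again assuming 32-bit color; the "Crossfire-style" and "SLI-style" labels are just names for the two behaviors described, not anything either vendor has documented:

```python
# Per-frame bridge traffic: whole frame every other frame vs alternating chunks.
BPP = 4
full_frame = 5760 * 1080 * BPP   # entire Surround frame
one_screen = 1920 * 1080 * BPP
two_screens = 3840 * 1080 * BPP

crossfire_style = [0, full_frame] * 2      # no data, big data, no data, big data
sli_style = [one_screen, two_screens] * 2  # small, medium, small, medium

# Same average bandwidth, very different peaks:
print(f"avg:  {sum(crossfire_style) / 4 / 1e6:.1f} MB vs {sum(sli_style) / 4 / 1e6:.1f} MB")
print(f"peak: {max(crossfire_style) / 1e6:.1f} MB vs {max(sli_style) / 1e6:.1f} MB")
```

Both schemes move the same data on average, but the alternating-chunk pattern caps its worst-case burst at two screens' worth instead of three, which is the smoother-spikes argument made above.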
 
Last edited:

darkpaw

2[H]4U
Joined
May 29, 2008
Messages
2,279
Anyone using the leaked drivers able to comment on the rumor that they work over DVI only and not over DVI->HDMI adapters?

If I wasn't working today I'd be installing 32-bit on another partition to test.
 

Murry

Limp Gawd
Joined
Dec 24, 2005
Messages
215
I am playing in 3D Vision Surround with two GTX 480s and three Samsung 2233RZ displays, which are, in fact, 3D Vision Ready.

I took a pic since apparently that's worth a thousand words.



Can you take another pic with all 3 monitors in view? Thanks :) Now let me clean the drool off my keyboard. Awesome!!!!!!!!!
 
Last edited:

Murry

Limp Gawd
Joined
Dec 24, 2005
Messages
215
PhysX, 3D Vision, and now Surround, all in one. Crank that sh*t up!!!!!!
 
Status
Not open for further replies.
Top