thehybridfrog
Limp Gawd
- Joined
- Oct 8, 2009
- Messages
- 422
This is great news!! And here I was worried that NV Surround was a myth.
It's not a myth; people saw it live at CeBIT.
I want 64 bit drivers. Beta or whatever is fine, just gimme gimme! Nuff said.
+1, but I really don't want to bother with them if they're too flaky. I'd rather wait until they're reasonably solid.
DVI doesn't have the bandwidth to go much higher than 1920x1200 at 120Hz. You'll have to drop to a lower refresh rate to use higher resolutions, which means Nvidia's 3D glasses won't work.
This is great news
Now the real question is, can you go higher than 1080p monitors?
DVI doesn't have the bandwidth to go much higher than 1920x1200 at 120Hz. You'll have to drop to a lower refresh rate to use higher resolutions, which means Nvidia's 3D glasses won't work.
There should be nothing stopping you from running plain old triple monitor (without 3D glasses) at higher resolutions, though.
No, see, NV Surround is supposed to only support 1080p monitors because of an SLI bandwidth issue.
That does not mean it IS true; I am just wondering if these pre-beta drivers support higher than 1920x1080 in NV Surround gaming.
Nope, they were planning on holding surround gaming back till the end of June to finish working out the bugs being reported in this thread.
The question now is: will they let the x64 driver leak, will they release a beta package of Surround with the 24th's release now that the cat is out of the bag, or will they merely stay quiet and make the x64 crowd wait for the initial production version of Surround gaming in June?
This is Nvidia, people, so I am assuming they will stay quiet and make the x64 crowd, myself included, wait until June hoping for a leak.
edit:
http://www.bit-tech.net/news/hardware/2010/05/20/nvidia-surround-gaming-hit-by-delays/1
FYI, regular Surround gaming is stated to support up to 2560x1600 x 3; that is in the white paper.
And that is the part that is supposedly wrong. Rumor has it that Surround Vision (*NOT* 3D) was also limited to 1080p due to SLI bandwidth issues.
As they have already done a public demonstration of 3d surround, there can be no doubt the bandwidth exists for regular surround up to 2560x1600x3.
Please play some games in NV Surround without 3D and take pics and video if you can. Let us know how it runs.
I agree with this, however... ATi have a "dirty little secret" (Tech Report's words, not mine, though they are correct) in that ATi's Crossfire bridge is bandwidth-limited to about 4 megapixels. I understand the SLI bridge to be similarly (if not worse) affected. (And while GF100 may have had SLI bridge tweaks, they can't go back and change the 200-series cards now.)
Can someone explain to me why Nvidia requires two GPUs (each more powerful than a single ATI card) to do what ATI is doing with one? This is a serious question, btw; I can't find the answer. Is it just to sell more graphics cards?
I got BC2 to run...someone mentioned here that DX11 games had problems so I forced it to DX9 in the ini file and it runs fine.
I took a bad pic with my iphone. It's really hard to get it all in the picture because 3 23" monitors are really wide. I had fraps running but you can't see because of how bad quality the pic is...it says 78fps at that moment though. I had everything on high and I think AA was on 2x and AF on 4x. Just messing around with settings still to see what's playable.
Well said.
Ha ha! Nice try to cover up your own assholeness. So let's see who's conveying information properly here: the OP posts a screenshot clearly showing he's using 3 DVI ports (NOT his HDMI port), you see a totally unrelated HDMI error in a screenshot, call him a liar, and now you're telling him to get over himself and "convey information in a more complete way". If the forum rules allowed it I would call you a ***** ** ****.
HB
No way. If this were true they would be forced to drop 3D Surround entirely, as it wouldn't be possible, which it obviously is, since they already demonstrated the system. 3D requires twice the bandwidth, so there is no doubt they can do regular Surround.
Let's look at it this way:
5760x1080 ≈ 6.2 megapixels; double that for 3D at twice the refresh rate ≈ 12.4 megapixels
2560x1600 ≈ 4.1 megapixels; times 3 screens ≈ 12.3 megapixels
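The arithmetic above can be sketched quickly. This is just the poster's pixels-per-refresh comparison, assuming (as the post does) that the bridge load scales with resolution times refresh rate:

```python
# Sketch of the megapixel comparison above, assuming the SLI-bridge load
# scales with pixels pushed per refresh (resolution x screens x refresh factor).
def megapixels(width, height, screens=1, factor=1):
    """Total megapixels per refresh across all screens."""
    return width * height * screens * factor / 1e6

surround_3d = megapixels(5760, 1080, factor=2)    # one wide surface at double refresh
surround_30in = megapixels(2560, 1600, screens=3)  # three 30-inch panels

print(f"5760x1080 in 3D:      {surround_3d:.2f} MP")   # ~12.44 MP
print(f"2560x1600 x3 screens: {surround_30in:.2f} MP")  # ~12.29 MP
```

The two loads come out nearly identical, which is the post's whole argument: if 3D at 5760x1080 works, triple 2560x1600 should fit in the same budget.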
Thanks for responding, man. But did you even try to play in DirectX 11?
Looking good
Now we just need the game devs to fix that horrible FOV issue with the stretching on the side monitors.
I have a feeling some of the stuttering some people report in Crossfire Eyefinity is because of data being pumped across the PCI-e bus when the Crossfire link is saturated...
It's likely they're using bandwidth from the PCIe slot itself to supplement the bandwidth of the SLI bridge. After all, SLI can function without a bridge (all traffic over PCIe), so it's reasonable to assume they can use the SLI bridge in concert with PCIe to get the inter-card bandwidth they need.
it's hard to find information about how these links work
3D doesn't itself require twice the bandwidth.
Is this ever coming to 2xx-series cards? Or am I boned?
Age of Conan in DX10, max details with SSAO off. 16xQ AA gives me 20-30fps in town, about 50fps out in the emptier areas. 4x AA gives more like 30fps up to the high 70s.
It's supposed to support GTX260 SLI, GTX275 SLI, GTX280 SLI, GTX285 SLI, and the GTX295 (dual GPU card with three working outputs, so you only need one).
We're talking about DVI bandwidth (which is what sets a hard limit on screen size for 3D Vision). 1920x1080 at 120Hz requires exactly double the DVI bandwidth of 1920x1080 at 60Hz.
A dual-link DVI cable simply doesn't have the bandwidth to carry 2560x1600 at 120Hz... it barely has the bandwidth to handle 2560x1600 at 60Hz. That's why Nvidia's 3D glasses won't work on anything larger than 1920x1080 or 1920x1200, as those are the highest resolutions you can run at 120Hz over DVI.
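A rough sanity check of that limit: dual-link DVI tops out at a 330 MHz pixel clock (two 165 MHz links). The blanking-overhead factor below is an assumption (reduced-blanking timings add very roughly 10% on top of the active pixels), so treat these as ballpark numbers:

```python
# Approximate pixel-clock check against the dual-link DVI ceiling.
DUAL_LINK_DVI_MHZ = 330.0   # 2 x 165 MHz TMDS links
BLANKING_OVERHEAD = 1.10    # assumed ~10% for reduced-blanking timings

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock required, in MHz, including blanking."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1920, 1080, 120), (1920, 1200, 120),
                 (2560, 1600, 60), (2560, 1600, 120)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= DUAL_LINK_DVI_MHZ else "exceeds dual-link DVI"
    print(f"{w}x{h}@{hz}Hz: ~{clk:.0f} MHz ({verdict})")
```

Under these assumptions 1920x1200@120Hz squeaks in, 2560x1600@60Hz barely fits, and 2560x1600@120Hz needs far more than the link can carry, matching the post.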
Matrices said: All this talk of the PCI-E bus being "saturated" - where is the evidence? A simple test should suffice: apply supersampling to a game and benchmark it on a 16x/16x platform and an 8x/8x platform. Make the supersampling level comparable to 5760x1080 and compare results.
It doesn't require 120fps, it just requires 120Hz, which is entirely independent of fps, and this requirement only exists to reduce flickering from the shutter glasses (the same flickering you would get on CRTs at 60Hz). Other than this 120Hz requirement the fps is just like normal fps, where 30 is barely playable, 60 is most desirable, etc., except that a card that could do 30fps non-3D would only be able to do 15fps, since it has to render each scene twice.
"You could claim from this that 3D requires double the bandwidth, but if you get 30fps in non-3D you would only pull 15fps in 3D, which means it would always require the same bandwidth out of the SLI link as non-3D for any given setup."
Correct me if I'm wrong, which I might very well be since I haven't really been very interested in 3d, but I thought 3d required a steady 120fps being pushed to the monitors to keep the 3d effect going.
"For what it's worth, the SLI link is supposedly capable of 1GB/s, which would allow for 43 frames per second at 5760x1080 (not including overhead and other data besides frame buffer) without hitting the PCI-e bus if it has to send the complete frame buffer, or ~86 frames per second if the driver is clever enough to send 1/3 one way and 2/3 the other way on the following frame."
The one thing about using the SLI link for video is that you only need to transfer one screen's worth from the second card over, which cuts the load to a third of the full frame buffer.
"The CrossfireX link does 0.9GB/s but you are able to double them up for 1.8GB/s (unsure if you can do this with SLI). I have a feeling some of the stuttering some people report in Crossfire Eyefinity is because of data being pumped across the PCI-e bus when the Crossfire link is saturated. The bandwidth exists in the PCI-e bus but I believe ATI has complained about latency and sync issues when hitting the PCI-e bus. Also with Crossfire the max fps is automatically in the 'clever driver' guesstimate I gave for nVidia since the slave card only has to send its frame buffer on the frames it was responsible for (every other frame), and if you double bridge it this'd be about 155 frames. Perhaps there exists some other significant overhead or the link can't always provide the instantaneous bandwidth required though, it's hard to find information about how these links work"
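The 43fps and ~86fps figures in the quotes above can be reproduced with a quick estimate, assuming a 32-bit frame buffer and treating "1GB" as 2^30 bytes (both assumptions, not anything Nvidia has published):

```python
# Back-of-the-envelope: frames/sec a bridge link can carry at 5760x1080,
# assuming 4 bytes per pixel and ignoring protocol overhead.
BYTES_PER_PIXEL = 4  # assumed 32-bit frame buffer

def link_fps(width, height, link_gbytes_per_sec, share=1.0):
    """Frames/sec a link can carry; `share` is the fraction of each
    frame that actually has to cross the link."""
    frame_bytes = width * height * BYTES_PER_PIXEL * share
    return link_gbytes_per_sec * 2**30 / frame_bytes

full = link_fps(5760, 1080, 1.0)               # whole frame buffer every frame
clever = link_fps(5760, 1080, 1.0, share=0.5)  # alternating 1/3 and 2/3 splits average to 1/2

print(f"~{full:.0f} fps full transfer, ~{clever:.0f} fps with the split")  # ~43 and ~86
```

Those match the quoted numbers, which suggests that's the arithmetic the earlier poster was using.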
Eyefinity runs off a single card; Crossfire is completed in the regular manner without any need to transfer video to the DVI outputs on the second card. If you are saying the link is saturated simply from the information load, then obviously they need to cut it down. As has been mentioned, more RAM helps (the 480 has half a gig more), and I'm pretty sure cutting down on pre-rendered frames would lower the bandwidth load as well.