surround view release?

ziggymon

n00b
Joined
Oct 8, 2008
Messages
20
Anyone know when this feature is going to be available?
I'm seriously considering buying an ATI card for its Eyefinity technology, but if Nvidia will be doing the same while I use my 88xx series cards, that would save me some coin, better spent on a couple of GF100s when they're released.
 
The whole Surround View thing by nVidia seems like a slap in the face to us current users, especially telling us you HAVE TO HAVE SLI to make it work. I haven't seen a game, out now or coming in the future, that a single 5870 couldn't run at high resolution and high FPS. So you would in fact save money by buying one card that is capable of 3 monitors and DirectX 11, which games are starting to use now.

Buddy, if I had two GTX 260's like you do, I would sell them and get the 5870, or hell, even the 5970. Then I would use the PhysX hack and pair that GTS 250 with the 5xxx series card. With the money you were gonna spend on $600 Fermi cards, go get some nice panels. That's just my opinion and thoughts on what I would do. Not to mention Fermi is gonna use a lot of power in that SLI combo.
 
The feature is supposedly available upon release of Fermi.

Needing SLI to run it isn't optimal, but at least it gives backwards compatibility on older setups, which is a good thing. ATI put extra display pipelines in their GPUs, so they could add DisplayPorts for single-card multi-screen setups. Nvidia might not have done that with Fermi, but they have supported multi-monitor setups for a long time on Quadro cards with SLI Mosaic.

Nvidia could have chosen to downplay multi-monitor gaming, since they didn't have it. Instead, they offer it within SLI. I am glad they did, because once you go to 3 screens, it's not good to lock yourself to one vendor, and having the option to choose between ATI and Nvidia every generation is a good thing in my book. :)
 
The whole Surround View thing by nVidia seems like a slap in the face to us current users, especially telling us you HAVE TO HAVE SLI to make it work.

slap in the face? lol.

damn nvidia for not retroactively shipping every GT200 owner an expansion slot daughter card containing a third digital display output!

how dare they!
 
Requiring you to have SLI should eliminate the need for a DisplayPort monitor or adapter though, right?
 
ATIs drivers are a slap in the face.

And none of us really know for sure how nv is gonna do the surround. We'll have to wait and see. I want Eyefinity/nfinity, but I want at least 5x1 portrait. I'd guess that Nvidia would require 3 cards for that setup. Not ideal, but probably better than the DisplayPort debacle. We'll just have to wait and see how the dust settles on the Eyefinity6 vs. Fermi, etc.
 
I recall running Far Cry on an Nvidia 6800 GT or Ultra across two 20" monitors at 3200x1200 back in 2004... can't imagine why it would be so hard to duplicate this feature now.
 
Actually, having working drivers that support multiple cards plus surround view is what's needed, considering how GPU-intensive DX11 games at uber resolutions will be; that's something ATI hasn't fixed yet with their CrossFire + Eyefinity setups. So having dual Fermis running games at 5760x1200, for example, should be expected if you like 60 FPS, and then add 3D on top if you want that as well.
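To put numbers on how demanding those uber resolutions are, here's a quick back-of-the-envelope comparison (pure pixel-count arithmetic, nothing vendor-specific):

```python
# Pixels per frame: one 1920x1200 screen vs. a 5760x1200
# three-wide surround setup.
single = 1920 * 1200
surround = 5760 * 1200

# The surround setup has exactly three times the pixels to shade.
print(surround / single)   # 3.0

# Pixels the GPU(s) must fill every second to hold 60 FPS.
print(surround * 60)       # 414720000
```

Triple the pixels per frame means roughly triple the fill-rate and shading load, which is why a card that handles one screen comfortably can struggle at surround resolutions.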
 
nvidia's drivers have usually been a step ahead of ati's as far as taking advantage of the hardware and features, especially SLI.
 
nvidia's drivers have usually been a step ahead of ati's as far as taking advantage of the hardware and features, especially SLI.

I think that has more to do with Nvidia generally releasing their cards before ATI and less to do with drivers. You can't have a driver support a hardware feature if the hardware isn't released yet ;)

Of course, this generation it's the complete opposite, with Nvidia being months late to the game.
 
The whole Surround View thing by nVidia seems like a slap in the face to us current users. Specially telling us you HAVE TO HAVE SLI to make it work.

I'm failing to see how nvidia is slapping me in the face. Is it because I can buy another used gtx 285 for $250 (if I catch another one at that price) and run a surround setup, instead of going out and buying a $400+ video card? Or is it a slap because it's backwards compatible with the 200 series, unlike ati's? I personally think it's all quite cool, and will be beneficial to me, since I bought a 285 (and a power supply to match it) not long ago.
 
ATIs drivers are a slap in the face.
This ^. I got fed up with waiting, so I got a 5850: nothing but terrible drivers. My three-screen setup was flickering in Windows, flickering in game with a corrupted image shown every few seconds. My monitors have DisplayPort, so that was not an issue. The 5850 went back and was refunded the same day.

Waiting on Fermi now, since I've seen nothing but hotfix after hotfix from ATI and still no resolution, just new issues like lock-ups in game and grey screens. That was my first and only experience with ATI; I've been with Nvidia since the company started.
 
The real issue here isn't drivers or the actual support. It's how many IGAs, or separate streams, the GPU can support.

To properly output 3 separate screens, the GPU needs to be able to support 3 separate IGAs, IIRC. "Having to use SLI" just means that you need that second GPU to support the 3rd output in full 3D.

There is no reason Nvidia can't add this feature to their current-gen products that support SLI. The real question is whether it's worth it to them in order to compete with Eyefinity.

ATI just thought ahead with the new Cypress GPUs and added the ability to run 3 IGAs from one GPU, which allows support for 3 displays.

"Forcing you to use SLI" is not the proper term, as it's a hardware limitation of the GPU. You NEED SLI for it to run. The fact is Nvidia just didn't think about needing more than two 3D outputs when the GPU was designed.
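The constraint described above boils down to simple division; here's a toy sketch (the controller counts per card follow this post's description, not verified vendor specs):

```python
import math

def cards_needed(displays: int, controllers_per_gpu: int) -> int:
    """Minimum number of GPUs when each card can only drive a
    fixed number of independent display controllers (IGAs)."""
    return math.ceil(displays / controllers_per_gpu)

# A GT200-style card with 2 independent outputs: 3 screens need SLI.
print(cards_needed(3, 2))   # 2

# A Cypress-style card with 3 IGAs: one card covers 3 screens.
print(cards_needed(3, 3))   # 1

# The 5x1 portrait setup mentioned earlier, on 2-output cards.
print(cards_needed(5, 2))   # 3
```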
 
I'm failing to see how nvidia is slapping me in the face. Is it because I can buy another used gtx 285 for $250 (if I catch another one at that price) and run a surround setup, instead of going out and buying a $400+ video card? Or is it a slap because it's backwards compatible with the 200 series, unlike ati's? I personally think it's all quite cool, and will be beneficial to me, since I bought a 285 (and a power supply to match it) not long ago.

Don't you end up with a little... less... of a future-proof system in the end, since you don't have a DirectX 11 capable video card if you go the 2x GTX 285 route? I'd personally say 2x GTX 285 will be able to power probably all of this year's games and most of next year's at a good triple-monitor resolution with no AA but some AF. You wouldn't get the DX11 features though, which might encourage/make SLI 2xx series users want to upgrade to Fermi, requiring them to purchase two new cards at probably $400 each versus a single $400 card from the competitor's camp. I've really enjoyed my 8800 GTS 640MB but would be kind of torn if I had to spend twice as much to stay on nVidia's side versus ATI's. There would need to be some kind of compelling advantage or feature that's not 3D, as I don't want to buy three 120Hz monitors for upwards of $1000.00.

It's starting to seem like more and more PC games in the future will be DX11 capable. I'd be willing to bet that in 2011 at least 60% of games will be DX11 and the 40% that are not will be console ports. 2010 will probably be more like 20%/80%, so to me the DX11 issue plus the pricing issue would push me into the ATI camp a bit. I'm just hoping Fermi's prices are more competitive than I'm expecting.

So yes, if you want DX11 + surround monitors, you'd need to buy two Fermis; a GTX 285 wouldn't cut it due to hardware limitations (DX10 only). I also wonder a bit about how long it'll take for driver support to be added for the 2xx series. nVidia's main concern, and I imagine a good portion of its driver team, will be trying to push out quality Fermi drivers that at LEAST match or exceed where the 57xx/58xx series will be in 'March'-ish.
After all, if nVidia came out six months late, requiring SLI for surround view, with bad/buggy drivers and a worse price:performance ratio, nVidia's support base, even among fans, might be hurting. So will the drivers for surround on the 2xx series arrive at the same time? A month after Fermi's release? Two months? Three?
 
You make a good point that I won't be able to use DirectX 11 by sticking with a 285. It's also a little saddening that I won't be able to, since I see tessellation as the next big thing after normal mapping; it's a similar principle. That said, how many worthwhile games are going to be DirectX 11 by the time Fermi is out? Not many, as is currently evidenced by the hoarding of DX11 titles for ATI benchmarks.

I've found a pretty decent rhythm for myself. I usually wait one generation, and get the most powerful one from that. For instance, I picked up my 9800 gtx+ right around the time the gtx 280 was coming out. Later on, I got a bigger monitor, and it was starting to stutter in places. Sold my 9800 to a friend and upgraded to a 285 back in October. Both times I got those cards for a fraction of what they cost initially. Heck, my fiance is still using a GTS 250 for her 1440x900 monitor, and it's more than enough power for that.

As a result, I'm very interested in the fact that nvidia will allow me to extend my current card with sli and surround. I would've been significantly more interested in ati if they had done that with EyeFinity and the 4890, for instance. I guess I'm just not much of an early adopter, which is why I'm so interested in being able to get more use out of my 285.
 
Funny how people buy and use GTX 295s for a single screen but then complain they need more than one card for THREE DISPLAYS?????

I say the ATI solution for 3 displays on a single card is crap and the POSSIBLE Nvidia solution of at least 2 cards is great! More cards = more frames per second.
 
Funny how people buy and use GTX 295s for a single screen but then complain they need more than one card for THREE DISPLAYS?????

I say the ATI solution for 3 displays on a single card is crap and the POSSIBLE Nvidia solution of at least 2 cards is great! More cards = more frames per second.
I was really excited to see ATI with their three video outputs. I have three screens at home for professional use, I only game on the central screen. At present I use an 8400GS to drive the 3rd screen.
 
I'm failing to see how nvidia is slapping me in the face. Is it because I can buy another used gtx 285 for $250 (if I catch another one at that price) and run a surround setup, instead of going out and buying a $400+ video card? Or is it a slap because it's backwards compatible with the 200 series, unlike ati's? I personally think it's all quite cool, and will be beneficial to me, since I bought a 285 (and a power supply to match it) not long ago.


It's a slap in the face because there are some of us who actually own motherboards that don't have dual PCI-E x16 slots... not to mention people stuck with motherboards that have dual x16 slots with the second one running at 4x instead of 8x...


Funny how people buy and use GTX 295s for a single screen but then complain they need more than one card for THREE DISPLAYS?????

I say the ATI solution for 3 displays on a single card is crap and the POSSIBLE Nvidia solution of at least 2 cards is great! More cards = more frames per second.


*ahem* and more people with even less common sense and knowledge making posts..*ahem*
 
It's a slap in the face because there are some of us who actually own motherboards that don't have dual PCI-E x16 slots... not to mention people stuck with motherboards that have dual x16 slots with the second one running at 4x instead of 8x...

You realize you are talking about adding a feature to an existing product right? Why would somebody who was previously happy with a GT2xx card be pissed off because a new feature/option was made available to them?
 
I for one simply cannot wait for nfinity support using 2xx GPUs; my GTX 285s are collecting dust. We'll see...

Edit: The more I play around with these 5770's, Eyefinity, and ATI's drivers, the more problems I'm having; lately when I game I've been using only one of my displays... just some real random crap has been happening. Not to say Nvidia's drivers will be any better, but I can honestly say I've had far fewer problems running Nvidia than ATI. I'm far from a fanboy of either camp, as I've owned and run 4870 > 4870 CF > 4870x2 > 4890 CF and now 5770 CF, with Nvidia counterparts in between.
 
You realize you are talking about adding a feature to an existing product right? Why would somebody who was previously happy with a GT2xx card be pissed off because a new feature/option was made available to them?


Trinibwoy, I think he was referring to having to SLI Fermi. Some motherboards that support SLI have one 16x PCI Express 1.0 or 2.0 slot (we'll say 1.0), whereas the '2nd' slot for SLI runs at perhaps only 4x or 8x PCI Express 1.0 speed, or possibly worse yet, both PCI Express slots drop to 8x. Hence, with that second Fermi, you're going to get a bit less performance.
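For rough scale: PCIe 1.x signals at 2.5 GT/s per lane with 8b/10b encoding, which works out to about 250 MB/s usable per lane per direction, so slot width scales bandwidth linearly:

```python
# Approximate usable PCIe 1.x bandwidth per slot width, per
# direction (2.5 GT/s per lane, 8b/10b -> ~250 MB/s per lane).
MB_PER_LANE = 250

for lanes in (16, 8, 4):
    print(f"x{lanes}: ~{lanes * MB_PER_LANE} MB/s")
# x16: ~4000 MB/s
# x8:  ~2000 MB/s
# x4:  ~1000 MB/s
```

Whether a x4 or x8 slot actually bottlenecks a given game depends on the workload; the point is just that the second slot can have a quarter of the bandwidth of the first.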

Someone might have bought one of these motherboards thinking 'Ohh, I'll never need to SLI. I'd rather buy one GPU and upgrade each year with a new GPU rather than buying two GPUs at a time and upgrading once every year and a half or two years.'

Now this person finds out he HAS to SLI whether he likes it or not in order to get surround view. It's kind of a slap in the face in that he now has to change his buying habits to fit nVidia, gets his PCI Express slots working slower, etc. This guy's situation isn't so bad.

Just imagine John Hitchcock, who buys a motherboard that's CrossFire, or one that doesn't have SLI support at all. He goes over to his friend's house, sees how AWESOME nVidia 3D surround view is, and decides 'I WANT THAT', but he only has one Fermi. He considers buying a second Fermi, but wait! His motherboard doesn't support it, and he has to buy another motherboard to get that second Fermi, adding another $200.00-$300.00 onto the cost. So he goes out, buys the 2nd Fermi card, buys the motherboard, etc.

Later, he's surfing the Internet and finds out ATI 57xx/58xx users get Eyefinity while only having to buy one card, the card's cheaper, and they didn't need to buy a new motherboard for ATI's solution. John Hitchcock may find himself asking 'Why the heck did I buy nVidia in the first place? I just got ____ up the ____ and omgosh it hurts. That was a real slap in the face.'


From another possible slap-in-the-face viewpoint, some 2xx users never intending to SLI may have bought a power supply that's more than sufficient for running one card but not for running two. E.g. two Fermis, or even two 2xx cards, might draw a bit too much juice for a 600-watt power supply if you've overclocked your processor and have a bunch of hard drives internally, a few externally, a DVD-RW, a web camera, other USB devices, a Blu-ray drive, etc. Now to get this surround view, they also might need to buy a new PSU to go along with everything else.
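A rough power-budget sketch shows how a 600 W unit gets tight with a second high-end card (every wattage below is an illustrative guess, not a measured figure):

```python
# Hypothetical draw estimates for the kind of loaded system
# described above -- all numbers are illustrative guesses.
load_watts = {
    "CPU (overclocked)": 140,
    "motherboard/RAM": 60,
    "drives/fans/USB devices": 60,
    "GPU #1": 220,
    "GPU #2": 220,
}

total = sum(load_watts.values())
print(total)          # 700
print(total > 600)    # True: past the 600 W unit's rating
```

Drop the second GPU from the budget and the same system fits a 600 W unit with headroom, which is exactly the situation a single-card buyer planned for.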
 
Trinibwoy, I think he was referring to having to SLI Fermi. Some motherboards that support SLI have one 16x PCI Express 1.0 or 2.0 slot (we'll say 1.0), whereas the '2nd' slot for SLI runs at perhaps only 4x or 8x PCI Express 1.0 speed, or possibly worse yet, both PCI Express slots drop to 8x. Hence, with that second Fermi, you're going to get a bit less performance.


While you do have to use SLI for nfinity/surround view, any board that supports SLI has a minimum of 8x on the PCIe slots. Boards that claim CrossFire support, on the other hand, I have seen sporting one 16x and one 4x slot. That's a reason why Nvidia certifies the boards that get SLI approved.
 
Trinibwoy, I think he was referring to having to SLI Fermi. Some motherboards that support SLI have one 16x PCI Express 1.0 or 2.0 slot (we'll say 1.0), whereas the '2nd' slot for SLI runs at perhaps only 4x or 8x PCI Express 1.0 speed, or possibly worse yet, both PCI Express slots drop to 8x. Hence, with that second Fermi, you're going to get a bit less performance.

So Nvidia owes those people something because their motherboard doesn't have two full-speed PCIe slots? :confused: If Surround View requires SLI and you want Surround View, then obviously you would have to upgrade. Just like Eyefinity requires a Cypress card, so if you want it you have to upgrade. Or is that also a slap in the face? I'm not following...

John Hitchcock may find himself asking 'Why the heck did I buy nVidia in the first place? I just got ____ up the ____ and omgosh it hurts. That was a real slap-in-the-face'.

If poor John has buyer's remorse because he didn't compare all the solutions on the market before buying that's his problem.
 