Fermi vs. 5870 + Eyefinity

It is no "afterthought". Nvidia has used it in their Quadro line as "Mosaic" to power multi-monitor setups for quite some time. Now they are using SLI'd GeForce GPUs to drive the more demanding 3D gaming on 3 LCDs.

Just because they had it on the Quadro line, it doesn't mean it wasn't an afterthought to put it on GeForce...
 
Yes, I'm very sorry I used facts to question a subjective claim about Cypress performance on 3-monitor setups. I humbly apologize for offending you with empirical evidence. Please excuse my mistake.
Scratch the first comment as well; you're just as thick as the rest of them. You inferred overall performance from one set of benchmarks that didn't even use Eyefinity and then claimed that your conclusion must be correct, despite never having even used the hardware or the setup being described. Where's the objectivity again? Yeah, you're a real asset here.

Now get smart and stop replying or I'm going to keep beating you over the head with this.
 
Is it just me, or is this whole thread based on the assumption that there will only be one "Fermi" card?
 
Scratch the first comment as well; you're just as thick as the rest of them. You inferred overall performance from one set of benchmarks that didn't even use Eyefinity and then claimed that your conclusion must be correct, despite never having even used the hardware or the setup being described. Where's the objectivity again? Yeah, you're a real asset here.

Your rhetoric is quite compelling in the face of contradictory evidence. The more you talk, the less real the facts become.

Now get smart and stop replying or I'm going to keep beating you over the head with this.

Oh no, you're FAR too entertaining for that :)

Is it just me, or is this whole thread based on the assumption that there will only be one "Fermi" card?

Nah, it's based on the assumptions that 1) everyone is gonna buy 3 monitors and 2) an HD 5870 is fast enough to drive such a setup and CrossFire isn't necessary.
 
Actually I think you might.

Q: Why is Nvidia requiring SLI for 3 monitors?
A: Because they don't have a single card, Fermi included, with three digital display outputs.

Q: Why not if they are planning to do "stereo surround"?
A: They weren't planning on it until they realized ATi had a good idea they could steal.

Q: Why the hell would I buy an Nvidia GPU if I was planning on driving 3 monitors?
A: I must have around $2,000 ready to blow on two power-hogging video cards and a couple more monitors.

The point is: Nvidia had no intention of integrating multi-monitor setups into gaming. They responded by figuring out how to get two of their cards to supply video to 3 monitors.

  • Why SLi? - because stereo 3D renders every frame twice (once per eye), making it twice as demanding as 2D
  • Nvidia *already* has Eyefinity in Mosaic for Quadro cards :p
  • No idea why *you* would buy an Nvidia GPU
  • - I guess you wouldn't

Nvidia already had 'Eyefinity' for Quadro long before AMD released it
- they call it "Mosaic". It is clearly not an "afterthought".
 
Pretty sure it's already been proven by the [H] reviews of the 5870 that it can and does drive 3 monitors at full resolution with a lot of eye candy turned on.

So I don't see how you can argue this point as being untrue.
 
Your rhetoric is quite compelling in the face of contradictory evidence. The more you talk, the less real the facts become.
You haven't proved a single thing. You linked to a single set of benchmarks not using Eyefinity and are relying on inference, despite a complete lack of experience, to prove a point. Which is a joke. So keep it up; I'll keep showing how moronic your argument is and will link back to this whenever you keep trying to present your inane, baseless opinions as fact.
Oh no, you're FAR too entertaining for that :)
The fact that you're trying to laugh this off without showing any evidence to prove your point shows I've done my job shining a light on your ineptitude. Try to manipulate yourself out of this one; I don't really care. I've already pointed out the problem; anything beyond this is just a victory lap for me.
  • Why SLi? - because stereo 3D renders every frame twice (once per eye), making it twice as demanding as 2D
  • Nvidia *already* has Eyefinity in Mosaic for Quadro cards :p
  • No idea why *you* would buy an Nvidia GPU
  • - I guess you wouldn't

Nvidia already had 'Eyefinity' for Quadro long before AMD released it
- they call it "Mosaic". It is clearly not an "afterthought".
NVIDIA has a software hack, which is not Eyefinity, so no.
Pretty sure it's already been proven by the [H] reviews of the 5870 that it can and does drive 3 monitors at full resolution with a lot of eye candy turned on.
So I don't see how you can argue this point as being untrue.
For most people it isn't. What you have here is a couple of NVIDIA fan boys arguing ad nauseam in the hope that people will just give up and they can think themselves right. I think it's funny to see them squirm and enlist in some jihad for a tech company. Keep the entertainment coming :D.
 
Pretty sure it's already been proven by the [H] reviews of the 5870 that it can and does drive 3 monitors at full resolution with a lot of eye candy turned on.

So I don't see how you can argue this point as being untrue.

This. And what Mr. K6 is saying. It has been proven that a 5870 can power 3 displays at medium-high settings and retain great frame rates. Just look at [H]'s reviews.

You might have to sacrifice some IQ for increased FOV and immersiveness, but hell, for some people, it is worth it. Ask Kyle. ;)
 
You haven't proved a single thing. You linked to a single set of benchmarks not using Eyefinity and are relying on inference, despite a complete lack of experience, to prove a point. Which is a joke. So keep it up; I'll keep showing how moronic your argument is and will link back to this whenever you keep trying to present your inane, baseless opinions as fact.

Come now, let's not equate hard numbers with "inane, baseless opinions". The latter is your specialty. Let's see - one card gets < 40 fps on a single monitor in a given title. So at triple the resolution, should performance go up or down? You only have one try to answer the question. You would be much better at this if you would focus on the numbers and not some whimsical notion that you're right because you say so.
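
To put numbers on that reasoning, here is a minimal sketch (Python), assuming performance scales roughly inversely with rendered pixel count - a worst-case, fill-rate-bound approximation, using the 40 fps single-monitor figure from above:

    # Rough fps estimate when going from one monitor to three,
    # assuming fps scales inversely with pixel count (worst case).
    def estimated_fps(fps_single, pixels_single, pixels_target):
        return fps_single * pixels_single / pixels_target

    single = 1920 * 1200   # one 24" 16:10 panel
    triple = 3 * single    # a 5760x1200 three-panel group

    print(estimated_fps(40.0, single, triple))  # ~13.3 fps

Real games rarely scale this badly - CPU-bound and geometry-bound work does not triple with the pixels - and how far they deviate from naive pixel scaling is exactly what this argument is about.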
 
NVIDIA has a software hack, which is not Eyefinity, so no.
Ridiculous. Nvidia was demoing its 3-panel Surround (in 2D) back at Nvision08 using GTX 280 SLI. It is no SW hack, as it was implemented in Quadro long before AMD even thought of Eyefinity :p
:rolleyes:
 
Come now, let's not equate hard numbers with "inane, baseless opinions". The latter is your specialty. Let's see - one card gets < 40 fps on a single monitor in a given title. So at triple the resolution, should performance go up or down? You only have one try to answer the question. You would be much better at this if you would focus on the numbers and not some whimsical notion that you're right because you say so.
You can't throw up numbers and think that automatically makes you correct. What are you, the Glenn Beck of video cards? Like I said, you're inferring from a single set of benchmarks that no game would run on any Eyefinity setup with a 5870, because one game got below 60 FPS. I've used Eyefinity and so has the [H] team, and both of us agree that a 5870 is more than enough for most Eyefinity setups and almost every game out there. You, on the other hand, have no experience, and for some reason think that by using your awesome powers of deduction you have outsmarted everyone. Give me a break.

Ridiculous. Nvidia was demoing its 3-panel Surround (in 2D) back at Nvision08 using GTX 280 SLI. It is no SW hack, as it was implemented in Quadro long before AMD even thought of Eyefinity :p
:rolleyes:
So it's an older software hack; it's still a software hack. To claim otherwise just shows how green with envy you are. You two are ridiculous.
 
So it's an older software hack; it's still a software hack. To claim otherwise just shows how green with envy you are. You two are ridiculous.

Do you even understand what a "hack" is?
- to say Mosaic is a "hack" shows ignorance; it puts down a competent, competing technology.

There are advantages and disadvantages to Eyefinity and to Surround. With Eyefinity, you need DisplayPort for at least one of the LCDs (Cypress has only two clock generators for its legacy outputs, so the third display must be DisplayPort) and you cannot play in 3D.

With Surround, you must have SLi, but that is not an issue for most high-end gamers, as they will be looking at playing in 3D. Lower-end gamers can use GT 200 SLi to power their Surround in either 2D or 3D.

I do not understand the "green with envy" comment
:confused:

I am not hurting in any way by currently running an i7 920 at 4.0 GHz and HD 4870-X3 TriFire in my PC. I just decided to skip the 5870 until the refresh (5890) is out, and then I will decide whether to get a single 5890 or crossfired 5870s. And I will be getting the GTX 470 also; perhaps the GTX 480. :p
:cool:
 
Do you even understand what a "hack" is?
- to say Mosaic is a "hack" shows ignorance; it puts down a competent, competing technology.

There are advantages and disadvantages to Eyefinity and to Surround. With Eyefinity, you need DisplayPort for at least one of the LCDs (Cypress has only two clock generators for its legacy outputs, so the third display must be DisplayPort) and you cannot play in 3D.

With Surround, you must have SLi, but that is not an issue for most high-end gamers, as they will be looking at playing in 3D. Lower-end gamers can use GT 200 SLi to power their Surround in either 2D or 3D.

I do not understand the "green with envy" comment
:confused:

I am not hurting in any way by currently running an i7 920 at 4.0 GHz and HD 4870-X3 TriFire in my PC. I just decided to skip the 5870 until the refresh (5890) is out, and then I will decide whether to get a single 5890 or crossfired 5870s. And I will be getting the GTX 470 also; perhaps the GTX 480. :p
:cool:
Because NVIDIA Surround is a software hack, a band-aid if you will. The underlying technology might have been present and used for some time, and it can be fantastic, but that doesn't mean it wasn't a knee-jerk reaction to Eyefinity. Eyefinity is a hardware solution that has a wide range of applications (beyond gaming, oh god, who can say such words). I've stated before that I think Eyefinity is ahead of the monitors needed to support it; that doesn't mean I don't think it's great or don't like its implementation.

And I'm talking purely about triple-screen support, because NVIDIA's 3D sucks and is a joke, and we need much better 3D technology before I jump on that boat. Most high-end gamers don't have multi-GPU setups, and most high-end gamers aren't looking at playing in 3D. If you want to label yourself as nonpartisan, great, we need more people like that on the forums; just don't make ridiculous statements that make it seem otherwise.
 
I don't see what the big deal is. I'm currently using 3x24s with a single 5870 and for the most part can throw anything I want at it - primarily L4D2, modded Oblivion, BioShock, Fallout 3 and a ton of others. My frame rate has been pretty solid, which I honestly didn't expect; it seems like the 5870 enjoys whatever the hell I throw at it.

With a single card, most of the main games I play (L4D2, HEAVILY modded Oblivion) can be run maxed at 5760x1200 just fine; I've never encountered any serious lag. From a hardware standpoint I've run everything from a 7800GT and 7900GT to 8800 GTs in SLi. That's not to say I haven't had my share of ATi's as well: 9800 Pro, X800 and now a 5870, with probably a second at some point after Fermi depending on price drops - but I've even questioned the use of it, since most games that I play don't need it whatsoever, which I completely wasn't expecting.

But even so, the single-card performance of the 5870 has wowed me. I honestly didn't think it would push L4D2 maxed at 8x AA as well as it does. I mean, don't get me wrong, it's no Crysis as far as demands go, but I'm loving it so far.

The only game I have that it can't run maxed is, of course, Crysis (not Warhead), which I really don't play too much; however, it plays about the same as my 8800 GTs did at 1920x1200, so getting that performance from a single card at 5760x1200 has been pretty sweet.

The whole point about Nvidia's Surround panel is ridiculous. If nVidia had their way, it'd have been years until we got their version of the Eyefinity tech, if at all, so the fact that people are getting it as soon as they are (especially on legacy cards) is a huge bonus. Just enjoy it; it's awesome to have.
 
Because NVIDIA Surround is a software hack, a band-aid if you will. The underlying technology might have been present and used for some time, and it can be fantastic, but that doesn't mean it wasn't a knee-jerk reaction to Eyefinity. Eyefinity is a hardware solution that has a wide range of applications (beyond gaming, oh god, who can say such words). I've stated before that I think Eyefinity is ahead of the monitors needed to support it; that doesn't mean I don't think it's great or don't like its implementation.

And I'm talking purely about triple-screen support, because NVIDIA's 3D sucks and is a joke, and we need much better 3D technology before I jump on that boat. Most high-end gamers don't have multi-GPU setups, and most high-end gamers aren't looking at playing in 3D. If you want to label yourself as nonpartisan, great, we need more people like that on the forums; just don't make ridiculous statements that make it seem otherwise.

Ridiculous. Nvidia has also been pioneering multi-monitor setups that extend beyond gaming; I saw it firsthand at Nvision08, at their GTC09 and at CES2010. Nvidia and ATi have been paralleling each other for years.

As to Nvidia's Surround, you may choose to run it in 2D. I have not spent enough time with 3D Vision gaming, nor Surround 3D Vision gaming, to give a final judgment call on it myself.

You certainly cannot call yourself "non-partisan" since you spend most of your time building up ATi graphics while ridiculing Nvidia's efforts. :p
... I also don't think you are qualified to say what HW a "high end" gamer uses
 
Ridiculous. Nvidia has also been pioneering multi-monitor setups that extend beyond gaming; I saw it firsthand at Nvision08, at their GTC09 and at CES2010. Nvidia and ATi have been paralleling each other for years.
Doesn't mean their implementation is anything close to Eyefinity, now does it? They might have been working on the technology longer, but they never did anything with it. Also, from what we've been shown, or at least the rumors we know about, it seems at most equal to Eyefinity and in many respects inferior. You don't think it's funny that they announced Surround soon after AMD launched Eyefinity? If AMD hadn't entered that market, how long do you think it would have taken them to get something out there?
As to Nvidia's Surround, you may choose to run it in 2D. I have not spent enough time with 3D Vision gaming, nor Surround 3D Vision gaming, to give a final judgment call on it myself.
That's a good call, I respect the stance of holding off judgment before experiencing the technology.
You certainly cannot call yourself "non-partisan" since you spend most of your time building up ATi graphics while ridiculing Nvidia's efforts. :p
... I also don't think you are qualified to say what HW a "high end" gamer uses
Pointing out that one company has flaws doesn't mean I'm partisan. I'm not saying "Eyefinity is better because AMD rulez!" I'm saying I think Eyefinity's implementation and technology are better. It's very likely that on the surface the two solutions could operate so similarly that it would be tough to tell which is which in a blind test.

And I build at least five systems a year for other people, most of them for gamers. I think I have a generally good idea about what's out there on the market. If you think most high-end gamers are sporting a multi-GPU setup, you are seriously out of touch with the real world. These are seriously niche technologies that few people buy into.
 
I'm currently using 3x26" displays with a single 5870. With 2x AA and 4x AF, everything is playable in Eyefinity - from Shift through Mass Effect 2, Dragon Age, Age of Conan and Star Trek Online - though pushing AA to 4x chokes the games.

However, I'll take Eyefinity over any single-display setup; it's just great and much improves gaming immersion. I'll take it over the 3D that nvidia makes - actually over any 3D system, be it ATi, nvidia, even intel - my eyes can't bear the stress of gaming in 3D glasses, and after 15 minutes I have to take a break. So for me the multi-display setup is the winning solution, but 3D is not.

I doubt there will be any game that will require much more power than a 5870 to be pulled off with all the eye candy, so I'd rather get a 2nd 5870 and 2 more displays (as I'll have to give away the current setup in a week = hard life of a reviewer :p) than buy 2 nvidia cards and then 3x 120Hz displays.

Eyefinity will give me the same level of immersion and will be much cheaper.

And for ports - I use 1x DVI, 1x HDMI, 1x DisplayPort - no problem with setting up the system.
 
Doesn't mean their implementation is anything close to Eyefinity, now does it? They might have been working on the technology longer, but they never did anything with it. Also, from what we've been shown, or at least the rumors we know about, it seems at most equal to Eyefinity and in many respects inferior. You don't think it's funny that they announced Surround soon after AMD launched Eyefinity? If AMD hadn't entered that market, how long do you think it would have taken them to get something out there?
That's a good call, I respect the stance of holding off judgment before experiencing the technology.
Pointing out that one company has flaws doesn't mean I'm partisan. I'm not saying "Eyefinity is better because AMD rulez!" I'm saying I think Eyefinity's implementation and technology are better. It's very likely that on the surface the two solutions could operate so similarly that it would be tough to tell which is which in a blind test.

And I build at least five systems a year for other people, most of them for gamers. I think I have a generally good idea about what's out there on the market. If you think most high-end gamers are sporting a multi-GPU setup, you are seriously out of touch with the real world. These are seriously niche technologies that few people buy into.

Again, you have to realize that AMD/ATi debuted Eyefinity with the 5000 series. Nvidia is doing the *same thing* with their GF100 series. It is a FACT that Nvidia is about 4-6 months "late". But you cannot say with any certainty that it is a "knee-jerk" reaction to AMD's Eyefinity. If so, you give Nvidia a LOT of credit for implementing a competing solution in 6 months.

You and I have a very different definition of "high end" gamer. I co-own a tech site.
;)
 
  • Why SLi? - because stereo 3D renders every frame twice (once per eye), making it twice as demanding as 2D
  • Nvidia *already* has Eyefinity in Mosaic for Quadro cards :p

    Nvidia already had 'Eyefinity' for Quadro long before AMD released it
    - they call it "Mosaic". It is clearly not an "afterthought".


  • I think that it was an afterthought, but I can't see how that is relevant for other than PR reasons. The point is that they are coming with a solution for the consumer market.

    Yes, SLI Mosaic is something that Nvidia has had for years, and there has also been a similar software solution called softTH, which could be used for both ATI and Nvidia cards.

    However, Nvidia hadn't offered this to consumers before Eyefinity and currently isn't offering it. Cypress was built with multi-monitor functionality in mind; therefore it has many more display pipelines built into the GPU than the 4000 series.

    As with softTH, a dual-GPU solution is a requirement. I don't think that Nvidia chose dual GPU as a solution because that's best, but because they haven't built their cards capable of running multiple monitors on a single card. Nvidia then HAS TO use two cards.

    Having the option of using BOTH CrossFire and a single GPU on a multi-monitor setup is a bonus, not a drawback. Forcing the use of CrossFire or SLI is a drawback, since it provides fewer options. Come on - even though one should use SLI for 3x 120Hz 3D Surround, it's not a good thing that you have to use it to get multi-monitor gaming at all.

    But both are offering multi-monitor gaming, and we should all be happy for that. This way, we will not be tied to a single vendor after we've bought 3 screens.

    Nvidia Surround and ATI Eyefinity give choices. You should all be happy about that, instead of having a pissing contest about who was first out. Which solution is optimal is something we will first see when Nvidia actually releases Nvidia Surround.
 
Nvidia Surround and ATI Eyefinity give choices. You should all be happy about that, instead of having a pissing contest about who was first out. Which solution is optimal is something we will first see when Nvidia actually releases Nvidia Surround.

Here I totally agree.

I am just aware that Nvidia has been working on multi-monitor for gaming for quite some time - at least a year and a half, since they demoed it at Nvision08.

The advantages of Surround over Eyefinity are that it will be available for older SLi cards and that DisplayPort is not required.

I see trade-offs: each manufacturer chose what they felt was best. I am looking forward to testing and playing with both Surround and Eyefinity. I cannot say which I will like best atm.
 
Again, you have to realize that AMD/ATi debuted Eyefinity with the 5000 series. Nvidia is doing the *same thing* with their GF100 series. It is a FACT that Nvidia is about 4-6 months "late". But you cannot say with any certainty that it is a "knee-jerk" reaction to AMD's Eyefinity. If so, you give Nvidia a LOT of credit for implementing a competing solution in 6 months.
But it's not the same thing - that's my point. It seems like you're presenting two points: 1) NVIDIA was doing multi-monitor gaming/apps before AMD and 2) they're the same thing. However, it doesn't matter how much longer NVIDIA was working on it if AMD developed a superior solution and came out with it first, because as consumers that's the only benefit we get. Also, if NVIDIA was working with multi-monitor for so long, why should they be given credit for finally getting off their asses and giving it to customers, considering it's a software-only solution? Seriously, some of your arguments are very weak. And let me again say that on the surface Surround may work very much the same as Eyefinity. However, it is not the same thing, it is not a hardware-implemented solution, and it has the immediate disadvantage of requiring multi-GPU to run, unlike Eyefinity. And that's only from the little we know as of now.
You and I have a very different definition of "high end" gamer. I co-own a tech site. ;)
Evidently, but when you're marketing a tech solution, would you rather market it to 10,000 people or 1,000,000 people?
Nvidia Surround and ATI Eyefinity give choices. You should all be happy about that, instead of having a pissing contest about who was first out. Which solution is optimal is something we will first see when Nvidia actually releases Nvidia Surround.
You make some excellent points and I share your sentiments. Competition is a great thing, and how it plays out remains to be seen.
 
You miss the point.

Nvidia implemented this identical multi-monitor solution long ago with Quadro. It is called Mosaic, and they just ported it over to their GeForce drivers. I believe that besides GF100, GT 200 owners should also be able to drive 3 monitors in SLi.

Nvidia also offers their Surround 3-panel solution in 3D; this is something AMD has yet to do with Eyefinity.
- Nvidia's Surround is far more demanding performance-wise than Eyefinity, since it has to render three 120Hz LCD panels in 3D, so they limit it to SLi-powered PCs.
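
For a back-of-envelope sense of that demand, here is a sketch assuming stereo 3D renders each frame twice (once per eye) to feed a 120Hz panel, with a hypothetical 1920x1080 panel resolution:

    # Pixel throughput: one panel at 60 fps in 2D vs. three 120 Hz
    # panels in stereo 3D (2 eyes x 60 fps = 120 rendered frames/s each).
    panel = 1920 * 1080                # hypothetical 120 Hz panel

    single_2d = panel * 60             # one panel, 60 fps, 2D
    surround_3d = 3 * panel * 120      # three panels, stereo 3D

    print(surround_3d / single_2d)     # 6.0 -> roughly six times the work

Roughly six times the pixel work is the plain arithmetic behind pointing 3D Surround buyers at SLI.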

Honestly, I doubt the 3D thing will catch on much at all. You need the monitors, the glasses and the games. The tech isn't going to be much more advanced than "Avatar", which, while kinda neat to watch, was far from flawless.

No matter which front you want to back, dual video cards will give you much better performance. This isn't too hard to figure out. Duh. But the fact is that you CAN drive 3 screens NOW with ONE ATi card and have decent graphics settings while the games remain very playable. With all the money Nvidia has to back its R&D dept, I doubt very much that they couldn't have released Fermi a long time ago. My guess is that they are going to back 3D more than multi-monitor support, simply because they know they don't have the upper hand, and this is where they will surely fail. The only way ATi can fail with Eyefinity is a lack of marketing and of getting developers to enter some simple FOV coding. If ATi hired some of the marketing guys from Apple, they would shove Nvidia into the ground - this is why the iPod is prominent.
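
The "simple FOV coding" mentioned above mostly boils down to one formula: hold the vertical FOV fixed and widen the horizontal FOV with the aspect ratio (the Hor+ approach). A minimal sketch, with the 60-degree vertical FOV chosen purely for illustration:

    import math

    # Hor+ scaling: derive the horizontal FOV from a fixed vertical FOV
    # and the display's aspect ratio.
    def horizontal_fov(vertical_fov_deg, aspect):
        v = math.radians(vertical_fov_deg)
        return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

    print(horizontal_fov(60, 16 / 10))  # ~85.5 deg on one 16:10 panel
    print(horizontal_fov(60, 48 / 10))  # ~140.3 deg across three of them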
 
Honestly, I doubt the 3D thing will catch on much at all. You need the monitors, the glasses and the games. The tech isn't going to be much more advanced than "Avatar", which, while kinda neat to watch, was far from flawless.

No matter which front you want to back, dual video cards will give you much better performance. This isn't too hard to figure out. Duh. But the fact is that you CAN drive 3 screens NOW with ONE ATi card and have decent graphics settings while the games remain very playable. With all the money Nvidia has to back its R&D dept, I doubt very much that they couldn't have released Fermi a long time ago. My guess is that they are going to back 3D more than multi-monitor support, simply because they know they don't have the upper hand, and this is where they will surely fail. The only way ATi can fail with Eyefinity is a lack of marketing and of getting developers to enter some simple FOV coding. If ATi hired some of the marketing guys from Apple, they would shove Nvidia into the ground - this is why the iPod is prominent.
First of all, I guarantee 3D is the next big thing in both gaming and television. AMD is exploring it now and they will have their own 3D. Sony is working on it for the PS platform, as is MS for the Xbox. So it is definitely going to "catch on" :p

I get your point; AMD is the ONLY company with DX11 also
- but that is changing in a few weeks to a few months... there is no way to proclaim either company has the better solution - except *now*, when AMD has the only solution.

There are a lot of companies that were "first" but failed later on to competitors who implemented it better or had better marketing. So that point is moot.

Evidently, but when you're marketing a tech solution, would you rather market it to 10,000 people or 1,000,000 people?
Multi-GPU is far more than 10,000 people. Also, you market your TOP product so that brand-name recognition will filter through the entire lineup.
 
First of all, I guarantee 3D is the next big thing in both gaming and television. AMD is exploring it now and they will have their own 3D. Sony is working on it for the PS platform, as is MS for the Xbox. So it is definitely going to "catch on" :p

I get your point; AMD is the ONLY company with DX11 also
- but that is changing in a few weeks to a few months... there is no way to proclaim either company has the better solution - except *now*, when AMD has the only solution.

There are a lot of companies that were "first" but failed later on to competitors who implemented it better or had better marketing. So that point is moot.


Multi-GPU is far more than 10,000 people. Also, you market your TOP product so that brand-name recognition will filter through the entire lineup.

That's quite a few generalizations. 3D as the "next big thing" in what manner? I would suggest linking your first statement with your last statement. This kind of 3D isn't going to revolutionize the gaming industry any time soon, if ever. 3D is a value-added product that happens to be possible from indirect improvements in panel tech. 3D, let's be reminded, was possible before LCDs took a shit on the market. True 120Hz might be the next big thing, which in turn feeds 3D. Furthermore, Nvidia hasn't busted the 3D market wide open, and if/when 3D becomes prevalent, the honour won't be Nvidia's.

The point is NOT moot. Eyefinity already works. No one here is going to argue it's ATI's fault that developers don't support larger FOVs. No one is going to argue that DisplayPort is the wrong way to implement Eyefinity (adapters suck, but considering the length of time DP has been present and its flexibility in monitor applications, the blame rests on manufacturers).

DX11 already works. The ability for developers to progress with DX11 is on an ATI card, so right now, the gaming industry is better because of ATI.

ATI has pushed the envelope. Competitors may implement better solutions, but we can clearly see they (Nvidia) weren't going to implement shit had ATI not raised the bar. I would wager a guess the same is true of LCD manufacturers and DP.
 
That's quite a few generalizations. 3D as the "next big thing" in what manner? I would suggest linking your first statement with your last statement. This kind of 3D isn't going to revolutionize the gaming industry any time soon, if ever. 3D is a value-added product that happens to be possible from indirect improvements in panel tech. 3D, let's be reminded, was possible before LCDs took a shit on the market. True 120Hz might be the next big thing, which in turn feeds 3D. Furthermore, Nvidia hasn't busted the 3D market wide open, and if/when 3D becomes prevalent, the honour won't be Nvidia's.

The point is NOT moot. Eyefinity already works. No one here is going to argue it's ATI's fault that developers don't support larger FOVs. No one is going to argue that DisplayPort is the wrong way to implement Eyefinity (adapters suck, but considering the length of time DP has been present and its flexibility in monitor applications, the blame rests on manufacturers).

DX11 already works. The ability for developers to progress with DX11 is on an ATI card, so right now, the gaming industry is better because of ATI.

ATI has pushed the envelope. Competitors may implement better solutions, but we can clearly see they (Nvidia) weren't going to implement shit had ATI not raised the bar. I would wager a guess the same is true of LCD manufacturers and DP.

You should write PR for AMD. :p

I stand by what I wrote. 3D is the next big thing; it was what CES featured endlessly. I did not go into any details because they are not relevant to the topic - I just answered you and your attempt to minimize it because AMD is late compared to Nvidia.

You don't seem to get the fact that Surround is only 'late' because Fermi is.

DX11 works because of Microsoft - not AMD. All of the devs in the TWIMTBP program have had GF100 for quite some time, so the industry is not being held back by Nvidia in any way, shape or form. Nvidia also has a workable physics implementation, while ATi appears to be waiting for a 3rd party to implement it for them. Nvidia is miles ahead of Stream with CUDA.

So what? Each vendor pushes the envelope and technology in their own way. I am impressed with what both AMD and Nvidia have accomplished in the industry independently of each other, and they push each other to compete and become better at what they do. I love both companies.
 
Eyefinity has a much larger appeal than 3D, though, due to its affordability as far as new technology goes. There is no cheap way to get a 3D system, not to mention there are quite a few games that will not work with it. At least with Eyefinity there are hacks that force the game to run at a large resolution setting (like Mirror's Edge). And on top of that, the loss of FPS is much smaller in Eyefinity than it is in 3D.

Surround wouldn't exist if ATI hadn't made Eyefinity. And you can tell it's been hastily put together, and you have to have SLI to use it, which will nearly double its price in comparison to Eyefinity.

I love capitalistic competition because it forces companies to give us the best product out there, and right now ATI has it. That doesn't necessarily mean, though, that they will have it in a year.
 
You should write PR for AMD. :p

I stand by what I wrote. 3D is the next big thing; it was what CES featured endlessly. I did not go into any details because they are not relevant to the topic - I just answered you and your attempt to minimize it because AMD is late compared to Nvidia.

You don't seem to get the fact that Surround is only 'late' because Fermi is.

DX11 works because of Microsoft - not AMD. All of the devs in the TWIMTBP program have had GF100 for quite some time, so the industry is not being held back by Nvidia in any way, shape or form. Nvidia also has a workable physics implementation, while ATi appears to be waiting for a 3rd party to implement it for them. Nvidia is miles ahead of Stream with CUDA.

So what? Each vendor pushes the envelope and technology in their own way. I am impressed with what both AMD and Nvidia have accomplished in the industry independently of each other, and they push each other to compete and become better at what they do. I love both companies.

I'm the PR rep?

You even mentioned Surround worked on non-Fermi cards. What is the technological reasoning for Nvidia Surround having to coincide with Fermi?

CES has had 30" OLED displays along with a dozen other varieties of OLED. Let me know when you buy one. Make sure you get the 3D version.

I never said DX11 works because of ATI. I said DX11 is only running on ATI cards right now. Hence, DX11 is playable in the gaming community because of ATI. The argument isn't moot just because developers have green DX11 cards.

The topic is about Fermi vs. 5870 + Eyefinity. The here-and-now, which has been going on for 6 months now, is the 5870 + Eyefinity. Even when Fermi is released, 5870 + Eyefinity will be the mature choice. Add another 3 months and maybe Fermi will be a solution to contemplate, but in my opinion, the day Fermi is released it will still be in the consumer's best interest to go ATI.
 
I'm the PR rep?

You even mentioned Surround worked on non-Fermi cards. What is the technological reasoning for Nvidia Surround having to coincide with Fermi?

CES has had 30" OLED displays along with a dozen other varieties of OLED. Let me know when you buy one. Make sure you get the 3D version.

I never said DX11 works because of ATI. I said DX11 is only running on ATI cards right now. Hence, DX11 is playable in the gaming community because of ATI. The argument isn't moot just because developers have green DX11 cards.

The topic is about Fermi vs. 5870 + Eyefinity. The here-and-now, which has been going on for 6 months now, is the 5870 + Eyefinity. Even when Fermi is released, 5870 + Eyefinity will be the mature choice. Add another 3 months and maybe Fermi will be a solution to contemplate, but in my opinion, the day Fermi is released it will still be in the consumer's best interest to go ATI.

You sound like Huddy when he goes on a "Nvidia is holding back the industry" rant
- you can take that as a compliment, too ;)

Surround has worked on Quadro for some time. Nvidia demoed it on GeForce back at Nvision08 and clearly they have been working on it while AMD has been working on Eyefinity.

We don't know what shape Fermi will be in when it arrives. Agreed. The topic IS about Fermi vs. Eyefinity, so clearly we MUST talk about an as-yet-unreleased product. We do know the feature set, and that is what we are comparing. :p

We will know in a few weeks, and we'll at least know something "major" Monday morning with Nvidia's announcement on Twitter and Facebook.
 
You can't throw up numbers and think that automatically makes you correct. What are you, the Glenn Beck of video cards? Like I said, you're inferring from a single set of benchmarks that no game would run on any Eyefinity setup with a 5870, because one game got below 60 FPS. I've used Eyefinity and so has the [H] team, and both of us agree that a 5870 is more than enough for most Eyefinity setups and almost every game out there. You, on the other hand, have no experience, and for some reason think that by using your awesome powers of deduction you have outsmarted everyone. Give me a break.

Fine, you win. Numbers don't matter. And Glenn Beck? lolz.....
 
I don't believe in 3D games being the "next thing" - it's too much stress on the eyes; lots of people who have been using nvidia's 3D solution can't play more than an hour a day. My eyes can take less than 15 minutes, and the effects are not that great. I'll take Eyefinity hands down - at least it lets me have fun for more than a few minutes.

Besides, for 3D movies to kick into the consumer's living room, prices need to come down - a 3k USD TV set, then we're talking about 1k for Blu-ray, plus another 2k for some good Denon home theater - I can't see anyone sane making that switch for just one or two titles, no 3D channels and so on.

The next generation of TV sets, like the ones LG is working on that will not require glasses due to a special glass covering, might be the "big thing" the current generation will not be, IMO.
 
First of all, I guarantee 3D is the next big thing in both gaming and television. AMD is exploring it now and they will have their own 3D. Sony is working on it for the PS platform, as is MS for the Xbox. So it is definitely going to "catch on" :p

I'm going to hold you to that. 3D has tried to take off for *years* now. There was a huge push for it like 10 years ago (or more?); it didn't catch on then, and I'm betting it still won't catch on now. The glasses just suck.
 
Surround has worked on Quadro for some time. Nvidia demoed it on GeForce back at Nvision08 and clearly they have been working on it while AMD has been working on Eyefinity.

We don't know what shape Fermi will be in when it arrives. Agreed. The topic IS about Fermi vs. Eyefinity, so clearly we MUST talk about an as-yet-unreleased product. We do know the feature set, and that is what we are comparing. :p

We will know in a few weeks, and we'll at least know something "major" Monday morning with Nvidia's announcement on Twitter and Facebook.

ATI has shown off their 3D technology but hasn't implemented it. Like Nvidia with Surround, they have the tech but are holding off to showcase it when they can market it with a significant impact.

The thing is, no one is talking about ATI 3D because no one cares. I haven't heard of any ATI supporters holding out until ATI 3D is released. You're defending Nvidia Surround because we don't know anything yet. All Nvidia has to do is release it. So if we MUST talk about it, let's talk about the delay because they want Fermi to make a bigger splash. CES was in January; they could have released it then, but instead we're looking at 3 months later. And let's not forget they've had it since Nvision'08.

A few weeks has been a few weeks for 6 months; let's not go down that path. I'm not willing to allow Nvidia to lead me around like that. Yes, the question is Fermi vs. 5870 + Eyefinity, and as I've said, that question is answered. Maybe in a few weeks we'll know differently, but the ability for anyone to back up Nvidia's claims is weak. What you've seen at Nvision'08 hasn't materialized for the consumer, what we saw at Nvidia's previous announcement was a fake card, and what we saw at CES was old promises.

This isn't about ATI support; it just happens that there is nary a thing that Nvidia is doing RIGHT NOW that has bearing on this discussion. Nvidia is building hype at the cost of the consumer. And what is worse, because of NO competition from Nvidia, ATI is free to improve their tech at a leisurely pace. Innovation is going on, and Nvidia has made significant contributions, but these contributions aren't the culmination that is Fermi.
 