Fermi vs. 5870 + Eyefinity

Yossarian22

[H]ard|Gawd · Joined Apr 27, 2009 · Messages: 1,799
To start off, this thread isn't meant to be a troll thread, I'm asking legitimate questions here.

Let us assume for a moment that Fermi came out tomorrow, it was $600, and it beat the 5870 in most if not all applications by 30% (that's a bit generous, since people have been speculating at 20%).

What would the incentive be as a gamer to buy a $600 Fermi that can only drive one display at a time? nFinity only works with SLI. For approximately $100 more ($700 bones) you could get an HD 5870 and 2 more monitors (this assumes you already have a 20", 22", or 23"). The entire 5xxx series is already above MSRP due to lack of competition, yields, and what have you; expect its price to drop (not sure HOW much) when Fermi releases.

I mean, with the consolization of games (they aren't really advancing in requirements due to consoles), what is the point of driving one 1920x1080 display with super badass Fermi when you could drive three (3) displays with a 5870 and still be able to crank the settings? All for the same-ish price, at the moment about $100 more.
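Quick back-of-the-envelope math on what I mean (my own rough numbers; the $400-card / $150-per-monitor split below is just an assumption to land on the ~$700 figure above):

```python
# Back-of-the-envelope comparison of the two setups described above.
# Prices are rough street prices quoted in this thread, not official MSRPs.
fermi_cost = 600                    # hypothetical Fermi, one 1080p display
eyefinity_cost = 400 + 2 * 150      # ~$400 HD 5870 + two extra ~$150 monitors

px_single = 1920 * 1080             # one 1080p panel
px_triple = 3 * 1920 * 1080         # three 1080p panels side by side

print(f"Fermi setup:     ${fermi_cost}, {px_single / 1e6:.1f} MP of screen")
print(f"Eyefinity setup: ${eyefinity_cost}, {px_triple / 1e6:.1f} MP of screen")
print(f"~${eyefinity_cost - fermi_cost} more for {px_triple / px_single:.0f}x the pixels")
```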

Do you see what I'm getting at here? Unless nFinity/Fermi can drive 3 displays at once and the Eyefinity thing isn't for you, what is the point, really, of getting a Fermi (this assumes you don't do NSA@home)? :confused:
 
Once it's released, many will be considering whether it's worth it to them.
Many bought 5870s and 5970s at prices way above MSRP.
Some let the upgrade bug nudge them into hitting "order now" and have regrets later.
Some are totally happy with their choice.
 
The point is that a single card driving 3 monitors gives you crap FPS. NVIDIA is correct in that 2 cards in SLI will give you a much better result with 3 monitors at one time.

ATI doesn't allow Crossfire with Eyefinity. Either that, or they can't make it work.
 
The point is that a single card driving 3 monitors gives you crap FPS. NVIDIA is correct in that 2 cards in SLI will give you a much better result with 3 monitors at one time.

ATI doesn't allow Crossfire with Eyefinity. Either that, or they can't make it work.

CF with Eyefinity has worked since the 9.12 hotfix drivers released in December. The upcoming 10.3 drivers will also add bezel management.
 
The point is that a single card driving 3 monitors gives you crap FPS. NVIDIA is correct in that 2 cards in SLI will give you a much better result with 3 monitors at one time.

ATI doesn't allow Crossfire with Eyefinity. Either that, or they can't make it work.

You have no idea what you're talking about. You're either flat-out unaware of what you speak of, or flat-out lying.
 
I would NEVER run 3 monitors on a single card and expect good results. Anything less than 60, no, even less than 100 FPS in most games just doesn't cut it.
 
The point is that a single card driving 3 monitors gives you crap FPS. NVIDIA is correct in that 2 cards in SLI will give you a much better result with 3 monitors at one time.

You say that as if Nvidia made the conscious choice to only allow SLI - they didn't. It is just the only way that they can get enough outputs. Nvidia was blind-sided by Eyefinity just like everyone else was.

I would NEVER run 3 monitors on a single card and expect good results. Anything less than 60, no, even less than 100 FPS in most games just doesn't cut it.

Haha, haha, ha... ha. Playing on an LCD?
 
Yes, 3x 120Hz ones.

Well, probably 90%+ have 60Hz monitors, so for them it doesn't matter. And if you spend money on 120Hz monitors, then you should be aware that you need a much larger graphics card setup.
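Rough numbers on that gap (my own arithmetic, purely illustrative):

```python
# Rough fill-rate comparison: the frame budget halves at 120 Hz while the
# pixel count triples with three panels. Illustrative numbers, not benchmarks.
w, h = 1920, 1080

single_60 = w * h * 60          # one 1080p panel at 60 fps
triple_120 = 3 * w * h * 120    # three 1080p panels at 120 fps

print(f"1x 1080p @ 60 Hz:  {single_60 / 1e6:6.0f} Mpix/s, {1000 / 60:.1f} ms per frame")
print(f"3x 1080p @ 120 Hz: {triple_120 / 1e6:6.0f} Mpix/s, {1000 / 120:.1f} ms per frame")
print(f"That's about {triple_120 / single_60:.0f}x the pixels per second to push.")
```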

But saying EyeFinity doesn't work on a single card solution in general, is just a load of bull.

If you have a high refresh rate, super high resolution, super high details and such, then yes, it probably will become a problem with a single card - but you could just scale your needs down a bit to make it actually work. I presume playing Eyefinity > SuperDuperJedi graphics - at least that is what I think :)

Back to the point - isn't nFinity going to be software-driven? Because hardware-driven usually kinda beats software solutions. So I suspect that nVidia is going to have some serious issues in the beginning, and you'll probably have to buy the high-end cards in SLI to make it work - time will tell, and I hope that I will be pleasantly surprised :)
 
Yes, 3x 120Hz ones.

Jesus, you can tell what he runs because it's in his sig... oh wait.


Just like to point out that 99% of people can't tell the difference between 60 and 100 FPS.




carry on trolling
 
The point is that a single card driving 3 monitors gives you crap FPS. NVIDIA is correct in that 2 cards in SLI will give you a much better result with 3 monitors at one time.

ATI doesn't allow Crossfire with Eyefinity. Either that, or they can't make it work.

Yeah, and the massive delay of Fermi is goodwill from Nvidia: it lets you spend more time with your mom, which reduces family violence.
 
Do you see what I'm getting at here? Unless nFinity/Fermi can drive 3 displays at once and the Eyefinity thing isn't for you, what is the point, really, of getting a Fermi (this assumes you don't do NSA@home)? :confused:

Strange question. Aren't you assuming that everyone is going to go out and buy 3 monitors?
 
It probably depends on a couple of things. I suppose the sort of enthusiast that would drop the cash and desk space for a three-monitor setup may not mind paying an extra few hundred to ensure they can run on high or highest settings by getting another card.
I would hazard a guess that a lot of enthusiasts don't mind which brand, and would care more about which setup benefits them most, with budget as a secondary constraint.
 
Strange question. Aren't you assuming that everyone is going to go out and buy 3 monitors?

Mate, this is an enthusiast forum :D

I'm not really sure how two monitors is better than three; I've had dual monitors before, and having the screen split down the middle while playing was a pain.
 
Being an enthusiast has nothing to do with whether or not you have 3 monitors. Millions of people buy mid/high end video cards. Millions of people aren't going to buy 3 monitors. Let's assume every single person with a 3 monitor setup goes with Cypress. Is that really going to impact Fermi sales? Nope.
 
I would NEVER run 3 monitors on a single card and expect good results. Anything less than 60, no, even less than 100 FPS in most games just doesn't cut it.



I have a single 5870 pushing three 19" monitors.
Max Settings in most games - smooth as silk.

Dirt 2 - everything maxed - 72 fps
COD World at War - 180 fps - medium/high settings

Can you really see 100 fps? You can't even see 60 fps.


Eyefinity vs. Nvidia's Solution - Eyefinity is hardware-based. Nvidia's is software-based and an afterthought - maybe Nvidia nudged SoftTH to send them their stuff.
 
To start off, this thread isn't meant to be a troll thread, I'm asking legitimate questions here.

Let us assume for a moment that Fermi came out tomorrow, it was $600, and it beat the 5870 in most if not all applications by 30% (that's a bit generous, since people have been speculating at 20%).

What would the incentive be as a gamer to buy a $600 Fermi that can only drive one display at a time? nFinity only works with SLI. For approximately $100 more ($700 bones) you could get an HD 5870 and 2 more monitors (this assumes you already have a 20", 22", or 23"). The entire 5xxx series is already above MSRP due to lack of competition, yields, and what have you; expect its price to drop (not sure HOW much) when Fermi releases.

I mean, with the consolization of games (they aren't really advancing in requirements due to consoles), what is the point of driving one 1920x1080 display with super badass Fermi when you could drive three (3) displays with a 5870 and still be able to crank the settings? All for the same-ish price, at the moment about $100 more.

Do you see what I'm getting at here? Unless nFinity/Fermi can drive 3 displays at once and the Eyefinity thing isn't for you, what is the point, really, of getting a Fermi (this assumes you don't do NSA@home)? :confused:

You are making a lot of assumptions. Hang on a bit and you will have less speculation.

The highest-end gamer is going to be looking at GTX 480 SLI to drive three 120Hz LCDs for 3D gaming.

Eyefinity vs. Nvidia's Solution - Eyefinity is hardware-based. Nvidia's is software-based and an afterthought - maybe Nvidia nudged SoftTH to send them their stuff.
It is no "afterthought". Nvidia has used it in their Quadro line as "Mosaic" to power multi-monitor setups for quite some time. Now they are using SLI'd GeForce GPUs to drive the more demanding 3D gaming on 3 LCDs.
 
I mean, with the consolization of games (they aren't really advancing in requirements due to consoles), what is the point of driving one 1920x1080 display with super badass Fermi when you could drive three (3) displays with a 5870 and still be able to crank the settings? All for the same-ish price, at the moment about $100 more.

That is the key, isn't it? Nvidia's solution is an afterthought just to catch up to ATi, and it isn't as practical. I haven't played a game yet that my 5870 hasn't smashed through.
 
That is the key, isn't it? Nvidia's solution is an afterthought just to catch up to ATi, and it isn't as practical. I haven't played a game yet that my 5870 hasn't smashed through.

You miss the point.

Nvidia has implemented this identical multi-monitor solution long ago with Quadro. It is called Mosaic and they just ported it over to their GeForce drivers. I believe that besides GF100, GT 200 owners should also be able to drive 3 monitors in SLI.

Nvidia also offers their Surround 3-panel solution in 3D; this is something AMD has yet to do with Eyefinity.
- Nvidia's Surround is far more demanding performance-wise than Eyefinity, since it has to render three 120Hz LCD panels in 3D, so they limit it to SLI-powered PCs.
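Rough math on why (my own estimate, assuming the 120Hz panels show 60 frames per eye, so the GPU renders 120 distinct views per second per panel; these are not benchmark numbers):

```python
# Illustrative pixel-throughput estimate for 2D Eyefinity vs. 3D Surround.
# Assumes stereo 3D on a 120 Hz panel means 120 rendered views/s (60 per eye).
w, h = 1920, 1080

eyefinity_2d = 3 * w * h * 60    # three panels, 2D, 60 fps target
surround_3d = 3 * w * h * 120    # three panels, stereo 3D, 60 fps per eye

print(f"Eyefinity, 2D @ 60 fps: {eyefinity_2d / 1e6:.0f} Mpix/s")
print(f"3D Surround @ 120 Hz:   {surround_3d / 1e6:.0f} Mpix/s "
      f"({surround_3d / eyefinity_2d:.0f}x the work, before any 3D driver overhead)")
```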
 
Just for some clarification, does Nvidia's multi-monitor solution require two of the same cards, since it's specifically SLI?

What's the advantage of doing this over, say, SoftTH, which also requires two cards, but where the second card is just there for the extra monitor port and so can be a cheap card?

I see a lot of people claiming this, but aren't there games that chug even on one 24" monitor with a single 5870? So how are you guys getting world-beating performance on three monitors?
No, the 5870 won't get 100 FPS on triple-wide. But it's good enough for me. I think what's more exciting right now is that both companies are starting to support surround gaming, so moving forward I'm assuming every gen after this will support surround gaming. This is an exciting prospect since the monitor investment isn't tied to your video card purchase.
 
Just for some clarification, but does nvidia multi monitor require two of the same cards since it's specifically SLI?

Good question. Since it's all software-based, it may be possible to just use some uber-cheapo card for the extra display outputs. No idea if they'll do that, though.

No, the 5870 won't get 100 FPS on triple-wide. But it's good enough for me. I think what's more exciting right now is that both companies are starting to support surround gaming, so moving forward I'm assuming every gen after this will support surround gaming. This is an exciting prospect since the monitor investment isn't tied to your video card purchase.

Agreed, just pointing out that claims of 5870s tearing up 3 monitors seem to be nonsense.
 
Agreed, just pointing out that claims of 5870s tearing up 3 monitors seem to be nonsense.
Why? Because you've actually used a 5870 and your opinion is worth something? Oh wait...

I'll take a hardware solution any day. I like Eyefinity's flexibility and functionality; I think we'll have to wait for monitors to catch up for it to reach its full potential. Add in the fact that Eyefinity is getting bezel compensation and other tweaks soon, and it's looking good. Even if on the surface NVIDIA's solution is exactly the same, the SLI-only requirement kills it. It'd have to add something else, like support for non-identical displays, to make itself stand out.
 
To start off, this thread isn't meant to be a troll thread, I'm asking legitimate questions here.

Let us assume for a moment that Fermi came out tomorrow, it was $600, and it beat the 5870 in most if not all applications by 30% (that's a bit generous, since people have been speculating at 20%).

That's a simple calculation.

You can buy two PROBABLY very hot, very noisy, barely (if at all) overclockable Unobtanium cards to do 'Eyefinity' at $600/ea. = $1200 that can run every game in sight at very high or maxed-out settings.

or ...

You can buy two DEFINITELY very overclockable, very cool-running, and very quiet PowerColor AX5850s for $300/ea. = $600 that will run every game in sight in Eyefinity at very high or maxed-out settings.
 
To start off, this thread isn't meant to be a troll thread, I'm asking legitimate questions here.

Let us assume for a moment that Fermi came out tomorrow, it was $600, and it beat the 5870 in most if not all applications by 30% (that's a bit generous, since people have been speculating at 20%).

What would the incentive be as a gamer to buy a $600 Fermi that can only drive one display at a time?
(snip)

It's pretty much a given that Fermi can't drive 3 displays with a single card. However, there are several assumptions in the OP:
1. Everyone wants 3 display gaming (and doesn't care about the thick bezel in the middle issue with their current monitor). Also, by extension, people who already have multiple displays with mis-matched resolutions / sizes will now sell those extra displays and get matching displays.
2. A single Fermi can only drive 1 display - this may turn out to be true, but it's also possible that it could drive 2 (but not 3 with a single card).
3. Fermi will cost $600 and (on average) beat the 5870 by 30%.
4. A single 5870 can drive 3 displays with settings cranked. The 5870 is a brilliant card (I have one) but it can't drive demanding games on 3 monitors with settings cranked.
5. People are perfectly happy with their 5870s and see no value in anything Fermi will offer.
6. People have only a single PC and will choose either a 5870 OR a Fermi :D

As you can see, if some of your assumptions are challenged, the reason for picking a Fermi will become clearer. Not to mention there are those who will buy the single fastest card available as long as it's actually better / faster and has some value proposition. Nvidia is not insane, they are not going to commit market suicide by releasing a card nobody wants at the volumes they can ship. If the 30% performance delta is true, then I can only see a $600 price tag while the volumes are really low - because they know they will be able to sell a small number even at that price.
 
Negative. Nvidia never had time to stick a DisplayPort on Fermi (hardware + software + firmware issues) when their knee jerked violently due to the world's positive reaction to Eyefinity and they added 3D Surround.
 
It's pretty much a given that Fermi can't drive 3 displays with a single card. However, there are several assumptions in the OP:
1. Everyone wants 3 display gaming (and doesn't care about the thick bezel in the middle issue with their current monitor). Also, by extension, people who already have multiple displays with mis-matched resolutions / sizes will now sell those extra displays and get matching displays.
2. A single Fermi can only drive 1 display - this may turn out to be true, but it's also possible that it could drive 2 (but not 3 with a single card).
3. Fermi will cost $600 and (on average) beat the 5870 by 30%.
4. A single 5870 can drive 3 displays with settings cranked. The 5870 is a brilliant card (I have one) but it can't drive demanding games on 3 monitors with settings cranked.
5. People are perfectly happy with their 5870s and see no value in anything Fermi will offer.
6. People have only a single PC and will choose either a 5870 OR a Fermi :D

1. 99.99% is close enough to make that argument beyond moot.
2. We already know Fermi will be able to drive two monitors and require SLI to drive three.
3. It will probably cost MORE and perform LESS.
4. But TWO can, which is its competition ... TWO Fermis in SLI.
5. Again, that 99.99% thingie.
6. ??
 
You miss the point.

Actually I think you might.

Q: Why is Nvidia requiring SLI for 3 monitors?
A: Because they don't have a single card with (3) digital display outputs, including Fermi.

Q: Why not if they are planning to do "stereo surround"?
A: They weren't planning on it until they realized ATi had a good idea they could steal.

Q: Why the hell would I buy an Nvidia GPU if I was planning on driving 3 monitors?
A: I must have around $2,000 ready to blow on two power-hogging video cards and a couple more monitors.

The point is: Nvidia had no intention of integrating multi-monitor setups into gaming. They responded by figuring out how to get two of their cards to supply video to 3 monitors.
 
Strange question. Aren't you assuming that everyone is going to go out and buy 3 monitors?

Being an enthusiast has nothing to do with whether or not you have 3 monitors. Millions of people buy mid/high end video cards. Millions of people aren't going to buy 3 monitors. Let's assume every single person with a 3 monitor setup goes with Cypress. Is that really going to impact Fermi sales? Nope.

I'm not saying multi-monitor gaming is "the way," but what I'm saying is: if Fermi is supposed to be this super powerful monster of a video card, what is the sense of driving just one 2560x1600 panel? Video cards are catching up; games and resolutions are not. That's the point I'm trying to make.

For comparison, think about a few years ago, or whenever you were still sporting a 17" CRT and running games at 1024x768 with your 8400GT or 7900GS thinking you were hot shit, then you picked up a new 22" at 1680x1050... rut-ro, there go those frames.
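Same kind of jump, in rough numbers (my own arithmetic, purely illustrative):

```python
# Pixel counts for the resolution jumps mentioned above. Illustrative only.
resolutions = [
    ('17" CRT, 1024x768', 1024 * 768),
    ('22" LCD, 1680x1050', 1680 * 1050),
    ('30" LCD, 2560x1600', 2560 * 1600),
    ('3x 1080p Eyefinity', 3 * 1920 * 1080),
]

base = resolutions[0][1]
for name, px in resolutions:
    print(f"{name:22s} {px / 1e6:4.1f} MP  ({px / base:.1f}x the old CRT)")
```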

I just think that with these new powerful video cards, we've run into that situation (again). The video cards have caught up, but most of the games and certainly the resolutions are lacking.

It's like Nigel Tufnel. Where do you go once you're on 10? How do you step it up a notch? Add more monitors! But not everyone enjoys MM gaming or they haven't experienced it, but logically it is the next place to go.

Although for a moment, I forgot we're talking about computer enthusiasts. I was going to make the comparison to cars and how people tune them up even though the speed limit on the residential road is still 25 mph. It's more of a "look at what I accomplished/dick size" thing. Same with video cards, for whatever reason.

And yeah, I believe Fermi will be $600-ish at launch.
 
You miss the point.

Nvidia has implemented this identical multi-monitor solution long ago with Quadro. It is called Mosaic and they just ported it over to their GeForce drivers. I believe that besides GF100, GT 200 owners should also be able to drive 3 monitors in SLI.

Nvidia also offers their Surround 3-panel solution in 3D; this is something AMD has yet to do with Eyefinity.
- Nvidia's Surround is far more demanding performance-wise than Eyefinity, since it has to render three 120Hz LCD panels in 3D, so they limit it to SLI-powered PCs.

ATI is also working on a 3D solution. They demonstrated it at CES on one monitor and will apply it to Eyefinity in the future, most likely in the 6800 series. Nvidia hasn't shipped 3D Surround themselves yet, given how Fermi has been delayed, and it will be difficult to get a second card for SLI when it's released.
 
Negative. Nvidia never had time to stick a DisplayPort on Fermi (hardware + software + firmware issues) when their knee jerked violently due to the world's positive reaction to Eyefinity and they added 3D Surround.

I'm with you on getting blind-sided by Eyefinity, but I'm not so sure they neglected ports because of a design failure. I'm guessing they look at nFinity only from a marketing perspective. And from there, the Fermi equivalent of the 295 can quite easily have 3 ports on it (if not factory, then a vendor custom PCB). They will have a "single" card sporting nFinity, which will be the fastest card on the market.

ATI has been getting grief over not having CrossFire support for Eyefinity, and for obvious reasons. How is it that a 4850 X2 has four ports but a 5970 doesn't? It's highly likely that if you have the funds for 3 monitors, you're looking to add CrossFire to the equation. Nvidia will have already done that for you. Do I find this to be full of win? No, but I think from a marketing perspective Nvidia isn't going to lose much in the general public's eye.
 
I'm with you on getting blind-sided by Eyefinity, but I'm not so sure they neglected ports because of a design failure. I'm guessing they look at nFinity only from a marketing perspective. And from there, the Fermi equivalent of the 295 can quite easily have 3 ports on it (if not factory, then a vendor custom PCB). They will have a "single" card sporting nFinity, which will be the fastest card on the market.
From what I've read, how many ports are supported is part of the chip design = there is no AIB workaround and Unobtanium will ALWAYS require SLI for MM gaming.

It was the whole reason for ATI's extreme secrecy about Eyefinity until it launched, so Nvidia wouldn't bake 3-way monitor support into its chip.

As with everything else Unobtanium, Nvidia is left sucking hind teat.
 
Why? Because you've actually used a 5870 and your opinion is worth something? Oh wait...

Of course not. I get all my information on hardware by reading random people's claims on forums. Objective reviews with hard numbers are utterly useless in comparison. Instead of mouthing off with nonsense, how about you read the link I posted and respond to that, if it's not too much trouble?

I'm not saying multi-monitor gaming is "the way," but what I'm saying is: if Fermi is supposed to be this super powerful monster of a video card, what is the sense of driving just one 2560x1600 panel? Video cards are catching up; games and resolutions are not. That's the point I'm trying to make.

I think you're grossly underestimating how stressful modern games are. The likes of Crysis, STALKER and Metro 2033 will still put the smackdown on Fermi @ 2560x1600 if Cypress numbers are anything to go by.
 
Honestly though, I think this will be hard to decide until it comes out and we see how good it is.
 
Of course not. I get all my information on hardware by reading random people's claims on forums. Objective reviews with hard numbers are utterly useless in comparison. Instead of mouthing off with nonsense, how about you read the link I posted and respond to that, if it's not too much trouble?
Right, exactly why I am here to curb your bullshit. There are too many people on these forums who read a chart or two and think that gives them a free pass to talk out of their asses. Now, if you want to post "hey, I saw this review and it looked like the 5870 was getting below 60 FPS in some games, what do you guys make of it?" that's a conversation starter. If you want to start talking about something like you have a clue (when you don't), you're going to get called on it. A 5870 will have no problem running 99% of the games out there on 3x 24" monitors with maximum settings, and will probably take a good swing at 3x 30". If you're going to play Crysis all day then you'll need more power, but anything short of that isn't really a problem.
 
It's your prerogative to soak up other people's subjective opinions to your liking. I'll stick to the facts, if that's OK with you.
Just because it's in a graph doesn't mean it's official. You're exactly what I'm talking about, but at least you have the sense to be quiet and back down when you've realized your error (if not the maturity to admit it).
 
Yes, I'm very sorry I used facts to question a subjective claim about Cypress performance on 3-monitor setups. I humbly apologize for offending you with empirical evidence. Please excuse my mistake.
 
From what I've read, how many ports are supported is part of the chip design = there is no AIB workaround and Unobtanium will ALWAYS require SLI for MM gaming.

I remember seeing a 4xxx-series card with HDMI out instead of DVI. Not sure if I'll find it, but here is an article on ATI's 4770 and one on Nvidia's 9500. This doesn't prove it either way.

What do ATI and Nvidia graphics chips actually support? If you can convert a DP signal using a simple adapter, I highly doubt the limitation for AIBs is in the chip design. The same goes for ATI's 6-DisplayPort Edition. I'm guessing it's nothing more than implementing the software to support 6 displays that is stopping it from appearing on the market.

Nvidia just has to check the box; I don't think they are going to try to take over Eyefinity. Personally, if ATI wants to add some legitimacy to Eyefinity, we need to see more Eyefinity-friendly monitors. The 6-DisplayPort card is about as bad as Nvidia's SLI requirement, considering what it demands. Any monitor you currently own now costs more to run on your new card, and if you don't want to add to the cost of the new monitors you'll be buying, you have to go with DP-friendly monitors, limiting your choices. And to top it off, I can't believe the vast majority of high-performance 120Hz monitors have a non-symmetrical bezel. Who the hell planned that?
 