5870 owners - Would you sell your card and get a Fermi IF....

5870 owners - would you sell your card for a Fermi IF.....

  • Fermi arrived with faster performance, Nfinity, true geometry, and the same exact price as the 5870

    Votes: 103 48.4%
  • Keep it no matter what - you cannot pull this 5870 / 5970 from my hands!

    Votes: 110 51.6%

  • Total voters
    213
Well, Nfinity is worse, the 5xxx range sports plenty of tessellation power, and there is no way in hell Nvidia is launching a flagship product at the same price as the 5870.

I see no reason to upgrade. Even if the part is faster, I don't need the additional power for anything yet, and I have a 2560x1600 native LCD...
 
I'm selling mine even if Fermi is $500+ and only 30% faster. 1GB is not enough memory for 2560x1600. I'll probably hold onto the 5970 until Fermi is out, but I'd ditch my barely used XFX 5870 today if I had a decent offer.
 
I already sold both 5970s in anticipation. That said, I don't think Fermi will be anything truly groundbreaking in the short term, and I certainly don't have any hate for the 5800/5900 series; they are great cards.

As mentioned above, I think 1GB is not enough for playing at 2560x1600, and I hope Fermi changes that.
 
I'm selling mine even if Fermi is $500+ and only 30% faster. 1GB is not enough memory for 2560x1600. I'll probably hold onto the 5970 until Fermi is out, but I'd ditch my barely used XFX 5870 today if I had a decent offer.

Huh?!

I played about 90% of my games at 2560x1600 just fine with a 512MB 4870 CrossFire rig, and play 100% of my games at 2560x1600 just fine with a 1GB 5970.

The few games that struggle at 2560x1600 aren't struggling because of a lack of memory, I'm fairly certain.
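For what it's worth, the raw numbers behind this argument are easy to sketch. Here's a back-of-the-envelope calculation (my own illustration, assuming 32-bit color and simple double buffering, ignoring AA buffers and driver overhead) showing that the framebuffer itself barely dents 1GB; it's texture data that actually fills the card:

```python
# Rough framebuffer-size estimate at 2560x1600, assuming 32-bit color
# (4 bytes per pixel) and double buffering. AA and driver overhead ignored.
width, height = 2560, 1600
bytes_per_pixel = 4   # RGBA8
buffers = 2           # front + back buffer

framebuffer_mb = width * height * bytes_per_pixel * buffers / 1024**2
print(f"{framebuffer_mb:.2f} MB")  # 31.25 MB -- a tiny slice of 1GB
```

So whether 1GB is "enough" hinges mostly on texture detail, not resolution alone, which fits both sides of this argument.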
 
Sure, if it had all the same features, was faster, and cost the same as a 5870 (~$400), it would be a no-brainer to get one. But I really doubt it will cost the same. I'm expecting Fermi to retail at $499 or $599 when it first debuts.
 
The first option, Nfinity? Did you forget you need two Fermi cards for that? Did they forget to mention that in your shilling classes?

If I can get two Fermi cards with better performance for the price of a 5870, I'll naturally go do that, but that's not going to happen.
 
Here's the real question: what if Fermi STILL requires two cards for triple-monitor gaming? We're talking at least $800 to side with Nvidia and get the Eyefinity equivalent. Even if they cost the same as the 5870, were faster, and had OMG TRUE GEOMETRY, there's no way in hell I'm buying two of them for my three monitors.

It's funny how neutral you first claimed to be, shuttleLuv, but when people started disagreeing with you and shooting down your arguments, you fell back into your team colors and let your true fanboyism shine.
 
I didn't even vote and I already know the results.
Seriously, the first option is almost completely impossible, but it would be an obvious yes if it ever happened.
 
ShuttleLuv, please get this through your head: "true geometry" is just marketing speak for DX11's tessellation. It is *NOT* special or unique. Nvidia's marketing team is simply taking a standard, *required* feature and making it sound all special and awesome because this is their first attempt at hardware tessellation. So please, *PLEASE* stop touting "true geometry" as some zomg-awesome Fermi feature; it's just DX11, nothing more. As a matter of fact, the 5xxx series already has "true geometry" - imagine that! :rolleyes:

Here's the real question: what if Fermi STILL requires two cards for triple-monitor gaming? We're talking at least $800 to side with Nvidia and get the Eyefinity equivalent. Even if they cost the same as the 5870, were faster, and had OMG TRUE GEOMETRY, there's no way in hell I'm buying two of them for my three monitors.

It's funny how neutral you first claimed to be, shuttleLuv, but when people started disagreeing with you and shooting down your arguments, you fell back into your team colors and let your true fanboyism shine.

I'm pretty sure Nvidia already stated that Fermi will only have two video outputs and that SLI is required for Nfinity. I could be wrong, though; I haven't bothered reading much about Fermi, as so much of it thus far has been fanboy drivel or marketing slides.
 
What's this nonsense about 1GB not being enough for 1600p? I just got through enjoying some Crysis goodness at 4800x1600, which is a significantly larger number of pixels than 2560x1600.

And I don't want to hear that nonsense about the main display having all the action and the side monitors only showing blah blah blah.

In Crysis at that resolution, the entire game is wide open across all 3 monitors; the aspect and field of view are perfect, and the geometry is correct.

The video card is pushing 7.7 million pixels in Crysis [read that over a few times...].

The game played fine, with settings spread across medium and high [object quality on very high].

Every game I play, new to old, I play at my native resolution... my lowly 5850 ($250) with 1GB doesn't even spin the fan up, and all this on a 500W Thermaltake middle-of-the-road power supply...

My point is... team green isn't going to be able to touch that price/performance out of the gate - maybe this time next year...
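The pixel counts in that post do check out, for what it's worth; a quick sanity check (my own arithmetic, nothing more):

```python
# Compare total pixels: one 2560x1600 panel vs. a 4800x1600 triple-monitor spread.
single = 2560 * 1600      # one 30" panel
triple = 4800 * 1600      # the 3-monitor setup described above

print(f"{single / 1e6:.2f} million pixels")   # 4.10 million
print(f"{triple / 1e6:.2f} million pixels")   # 7.68 million -- the ~7.7M quoted
print(f"{triple / single:.3f}x the load")     # 1.875x a single 2560x1600 screen
```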
 
Flamebait poll. I would pick option 3: wait until the next generation of cards and games is released, because upgrading now, when games don't seem to stress my system, is pointless.
 
I am going to keep my 5850 just because I don't want to have to sell it. Also I have a feeling the new card from Nvidia is going to be expensive.
 
Oh yeah, I almost forgot that I like to put my old cards in other systems, and a 230W+ card doesn't make that easy. So no, even if it was a ton faster I would not sell my card, because I know it will go in a secondary system and I want it to be quiet, not loud.
 
To the non-fanboys this poll reads:
If you could sell your current card and get something significantly better for the same money, would you?
Yes
No

To the fanboys this reads:
Would you sell ATI and buy Nvidia?
Yes
No

The results of this poll are more an indictment of the state of these forums than anything.
 
There's nothing wrong with leaning either direction; both have advantages. nVidia is really touting a lot of features that, while carrying a high cost, may be worth the premium for many people. The problem with this poll is how leading the question is.

Let me exaggerate this poll:
Would you rather buy Fermi if it could....

***be cheap and powerful, cure cancer, and solve world hunger :):):)
or
***Hey man, Fermi is great and all, and IMMA LET YA FINISH, but I am an ultimate fanboy of ATI and you'll have to pry the 5800 cards from my cold dead hands even if it contributes to global warming!!
 
I find it interesting that a lot of people here were clamoring for CrossFire support when using Eyefinity, and even though a hotfix has been released it’s still somewhat buggy and borderline unusable; but now that it is known NV Surround requires SLI, those same people are saying SLI being required for multi-monitor gaming is stupid, not worth it, and makes NV Surround worse than Eyefinity.

NV Surround worse than Eyefinity? Hardly, it properly supports SLI and bezel management right from the start and doesn’t require an expensive adapter. Yes, requiring SLI in order to get multi-monitor goodness can be viewed as a downside, but that doesn’t make it any worse than Eyefinity. Both have their strengths and weaknesses. Personally, if I’m gaming at 7680x1600 or 5760x1200, I’m going to want multiple-GPUs, regardless of which solution I’m using (Eyefinity or NV Surround).
 
At the same price as the 5870 and with better performance? Hmm, I don't think I'm ready to add $5/month to my electric bill.
 
This is HardForum, so you can't consider cost. The amount of money I spent on my computer in the last year alone could have bought a small car. I know that sounds really sad, but this is HardForum, where people have badass setups.

If Fermi comes in faster, which it should, I'm buying it. Cost is not an issue; it's not like Nvidia's card is going to be hundreds more than ATI's. I think what it boils down to is value: do you feel you are getting value for your money?!
 
To the non-fanboys this poll reads:
If you could sell your current card and get something significantly better for the same money, would you?
Yes
No

To the fanboys this reads:
Would you sell ATI and buy Nvidia?
Yes
No

The results of this poll are more an indictment of the state of these forums than anything.

Not necessarily. I voted "no" because even if Fermi is faster at the same price, I don't want to have to buy a *second* card because I want to run Eyefinity. And that is on top of the hassle of selling the card and paying shipping etc... Fermi would have to be significantly faster at the same price AND support single card nFinity for me to be interested in going through that effort. Otherwise I'll hang onto my 5870 for a year or two. I don't need to have the latest and greatest.

@UtherLazarus: You can't claim NV Surround isn't worse than Eyefinity until people have actually used it. You *claim* it properly supports SLI, but you don't have any idea if it does. For all we know it requires SLI, but only 1 card is doing the rendering and the other is just for output. Scaling could be utter crap for all we know. You also claim it supports bezel management, but again, we don't know that. ATI could also have bezel management in place before Fermi launches, making that point moot anyway.
 
Where is the option of keeping my 4870 for another year till the 6000 series comes out?
 
If and when Nvidia comes out with something to beat ATI, they would only make it fast enough to win by a little, and then say, "We're the king now. That will cost you 800 bucks." And Nvidia says, "Thank you for your business and have a good day. SUCKER!!!"
 
Honestly - I would. I like my 5870, but the whole stuttering w/ vsync thing really annoys the hell out of me. Plus, I simply like the Nvidia driver release system better. I'd rather have a beta a week instead of a release a month. Finally, the PhysX ATI hack works, but it's still pretty weird.
If Fermi were to hit at a comparable price with the same/better performance I think I might jump on it.

The one drawback would be HDMI audio implementation. Would the Fermi cards be able to handle lossless audio via HDMI? If not, then I'll stick with the 5870 'til they do.
 
If Fermi is faster, it won't be at the exact same price because AMD would drop their price soon.
 
@UtherLazarus: You can't claim NV Surround isn't worse than Eyefinity until people have actually used it. You *claim* it properly supports SLI, but you don't have any idea if it does. For all we know it requires SLI, but only 1 card is doing the rendering and the other is just for output. Scaling could be utter crap for all we know. You also claim it supports bezel management, but again, we don't know that. ATI could also have bezel management in place before Fermi launches, making that point moot anyway.

Read:
http://www.nvidia.com/object/IO_86775.html

and

http://www.nvidia.com/object/quadro_sli_mosaic_mode.html

Unless Nvidia flat-out lied, NV Surround will support bezel management from the start and will render two screens with one card and the third screen with the other. My take? It's simply SLI Mosaic (limited to a 1x3 config) enabled on GT 2xx and GF1xx cards. Why did they not keep the SLI Mosaic name? I would suggest because:

1. The decision to add support was made late in the game
2. The next iteration of Fermi cards will support NV Surround on a single card

Those two points are just speculation on my part.

However, by your logic it's just as wrong to claim NV Surround is worse than Eyefinity, but I don't see you jumping all over the person who said "Nfinity is worse" as if it were a known fact. I never said NV Surround was better or worse actually. I simply said both schemes have their strengths and weaknesses, based on what we know currently.
 
Fermi will be so magical it will just add geometry to every game you play on it. It's going to make the GoldSrc engine look like Crysis.
 
Read:
http://www.nvidia.com/object/IO_86775.html

and

http://www.nvidia.com/object/quadro_sli_mosaic_mode.html

Unless Nvidia flat-out lied, NV Surround will support bezel management from the start and will render two screens with one card and the third screen with the other. My take? It's simply SLI Mosaic (limited to a 1x3 config) enabled on GT 2xx and GF1xx cards. Why did they not keep the SLI Mosaic name? I would suggest because:

1. The decision to add support was made late in the game
2. The next iteration of Fermi cards will support NV Surround on a single card

Those two points are just speculation on my part.

However, by your logic it's just as wrong to claim NV Surround is worse than Eyefinity, but I don't see you jumping all over the person who said "Nfinity is worse" as if it were a known fact. I never said NV Surround was better or worse actually. I simply said both schemes have their strengths and weaknesses, based on what we know currently.

I'm not saying Nvidia is flat out lying - just that bezel management might not be there at launch, or that ATI will already have it by then. As for SLI, that doesn't address how well it scales in games. I'm just trying to say that we really don't *know* how nFinity will behave whereas we do *know* how Eyefinity behaves. I certainly wouldn't claim that nFinity is worse than Eyefinity, either.

Honestly - I would. I like my 5870, but the whole stuttering w/ vsync thing really annoys the hell out of me.

Uh, changing your card won't change how VSync works. The "stuttering" you refer to is simply due to how VSync works; it isn't a card or driver bug. You could try forcing triple buffering.
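For anyone curious why plain VSync can feel like stutter: with double buffering, a frame that misses the refresh deadline has to wait for the next refresh, so the effective frame rate snaps to integer divisors of the refresh rate (60, 30, 20, 15... on a 60Hz panel). A simplified sketch of that quantization (an idealized model of my own, not a measurement of any particular card or driver):

```python
import math

def vsync_fps(render_fps, refresh_hz=60):
    """Idealized frame rate under double-buffered VSync: each frame
    occupies a whole number of refresh intervals, so the output rate
    snaps to refresh_hz / n for some integer n."""
    if render_fps >= refresh_hz:
        return float(refresh_hz)
    # A frame slower than one refresh interval waits for the next vblank.
    refreshes_per_frame = math.ceil(refresh_hz / render_fps)
    return refresh_hz / refreshes_per_frame

for fps in (75, 59, 45, 29):
    print(f"{fps} fps rendered -> {vsync_fps(fps):g} fps displayed")
# 75 -> 60, 59 -> 30, 45 -> 30, 29 -> 20
```

Dropping from 60 straight to 30 the moment rendering dips to 59 fps is exactly the jerkiness people describe, and triple buffering helps because the GPU can keep rendering into a third buffer instead of stalling for the swap.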

Finally, the PhysX ATI hack works, but it's still pretty weird.
If Fermi were to hit at a comparable price with the same/better performance I think I might jump on it.

The weird thing is that a company would intentionally disable their own hardware.
 
I'm not saying Nvidia is flat out lying - just that bezel management might not be there at launch, or that ATI will already have it by then. As for SLI, that doesn't address how well it scales in games. I'm just trying to say that we really don't *know* how nFinity will behave whereas we do *know* how Eyefinity behaves. I certainly wouldn't claim that nFinity is worse than Eyefinity, either.

Fair enough, one thing is for sure though... it will be interesting to see how things play out ;)
 
Uh, changing your card won't change how VSync works. The "stuttering" you refer to is simply due to how VSync works; it isn't a card or driver bug. You could try forcing triple buffering.


No - I've been running games with VSync on since the days of Quake 1. Image tearing drives me bonkers. This isn't mouse lag or anything like that. With my 5870, about half of my games stutter and jerk around when mouse-looking and moving at the same time. This was never the case with any Nvidia cards I've ever owned, including the GTS 250 that I usually use as a PhysX card in the same machine.
Triple buffering doesn't fix this issue, and the others who have had the same issue haven't come up with any fixes except "it's an issue with PowerPlay." This infamous PowerPlay seems to be the root of a lot of problems.
While I've had issues with individual games on an Nvidia card, I've never had something so annoying across the board.
 
Huh?!

I played about 90% of my games at 2560x1600 just fine with a 512MB 4870 CrossFire rig, and play 100% of my games at 2560x1600 just fine with a 1GB 5970.

The few games that struggle at 2560x1600 aren't struggling because of a lack of memory, I'm fairly certain.

Try out some texture mods. You'll be singing a different tune for sure.
 