5870 owners - Would you sell your card and get a Fermi IF....

5870 owners - would you sell your card for a Fermi IF.....

  • Fermi arrived with faster performance, Nfinity, true geometry addition, same exact price as 5870

    Votes: 103 48.4%
  • Keep it no matter what - you cannot pull this 5870 / 5970 from my hands!

    Votes: 110 51.6%

  • Total voters
    213
Yeah, you know this, but you were calling people fanboys for complaining about it, making it seem like nVidia had everything we wanted right out of the gate and we were simply complaining because it came from nVidia. The thought of having "options" never even occurred to you until I brought it to your attention. That's when you promptly started making up stories as if you had inside info on AMD's and nVidia's intentions with multi-monitor gaming, and then tried to shift the attention back to the performance of the cards.
 
Bro, I lived through the 90's. Microsoft's antitrust trial. I know all about monopolies, and I do not like them. Choice is a powerful thing. It really is. But you and I both know full well that Eyefinity CrossFire support, and maybe nFinity as a whole, were added late in the game to stay afloat. When the real benches come out, we'll be scouring them for game benchmarks to compare against the 5870 and 5970.
 
I call you and the rest "fanboys", but I have yet to see any of you say you like nVidia.

Well, I would say it's hard to like nVidia when their next generation GPU is several months late, and the only things coming out are marketing slides.
 
What's it to you if they're a little late? Rather than push out some half-assed GPU like the FX 5800 series, why not get it right a little late? If they had released it way back when they were supposed to, we'd be hearing about how they rushed out a half-assed product. Damned if you do, damned if you don't, I suppose. :rolleyes: Every company releases PR slides. It's called business. Take an economics class; hell, look at our world. Everything is business. The job shedding in the economy is proof. Before a company goes down, they'll shed the little guy. Before nVidia gets down on its knees for ATI, it will release a PR slide emphasizing its GPU's accomplishments. ATI has done this. They continue to. And even worse, remember Quack.exe?

http://www.techreport.com/articles.x/18332

How soon people forget. Don't act like ATI is all innocent; they are not. Business first, as always.

5870 PR slides from September 09

 
What's it to you if they're a little late? Rather than push out some half-assed GPU like the FX 5800 series, why not get it right a little late? If they had released it way back when they were supposed to, we'd be hearing about how they rushed out a half-assed product. Damned if you do, damned if you don't, I suppose. :rolleyes: Every company releases PR slides. It's called business. Take an economics class; hell, look at our world. Everything is business. The job shedding in the economy is proof. Before a company goes down, they'll shed the little guy. Before nVidia gets down on its knees for ATI, it will release a PR slide emphasizing its GPU's accomplishments. ATI has done this. They continue to. And even worse, remember Quack.exe?

http://www.techreport.com/articles.x/18332

How soon people forget. Don't act like ATI is all innocent; they are not. Business first, as always.

5870 PR slides from September 09


Because it's still a half-assed, power-hungry, hot chip that's going to take your whole bank account to buy. Anything else? Go buy your friggin' Fermi card already and save us from your crap. Oh wait, you can't. Damn, that's right, it doesn't exist. Guess we have to deal with your crap for another two months.

*shoots self*
 
What's it to you if they're a little late? Rather than push out some half-assed GPU like the FX 5800 series, why not get it right a little late? If they had released it way back when they were supposed to, we'd be hearing about how they rushed out a half-assed product. Damned if you do, damned if you don't, I suppose. :rolleyes: Every company releases PR slides. It's called business. Take an economics class; hell, look at our world. Everything is business. The job shedding in the economy is proof. Before a company goes down, they'll shed the little guy. Before nVidia gets down on its knees for ATI, it will release a PR slide emphasizing its GPU's accomplishments. ATI has done this. They continue to. And even worse, remember Quack.exe?

http://www.techreport.com/articles.x/18332

How soon people forget. Don't act like ATI is all innocent; they are not. Business first, as always.

5870 PR slides from September 09
And then AMD proceeded to release the product soon after and on schedule. Meanwhile, NVIDIA is dicking around with nothing to show BUT PR slides, and the market is stagnating. You act like anyone who isn't in the Fermi circle jerk is an AMD fanboy. They aren't; they just aren't fans of mediocrity.
 
I don't think there is anything wrong with PR slides or a late product launch, but I can't stand fanboys who drool over a PR slide without any real-world product or numbers. These drooling fanboys then go to forums spouting crap as if it is the second coming and will change the world forever.
 
It does make it worse. Not everyone needs a dual-GPU setup for Eyefinity; if you're using something like 22" (1680x1050) panels or below, you can power three of those on one 5870 or 5970. You could probably get away with a 5970 for 3x24" (1920x1200 or 1080p) panels and run more or less everything apart from Crysis at max settings.

Eyefinity will have bezel management; that's in the pipeline, and its CrossFire support is obviously going to improve over time. I suspect that with a major driver update looming from ATI that adds game profiles, we'll be able to set lots of CrossFire-specific settings on a per-game basis (just my prediction).


I wouldn't say it's worse, though. Both implementations have their good and bad points. For all we know, NV Surround will perform better than Eyefinity. It wouldn't surprise me, as SLI Mosaic has been around for a while now and nvidia has had a lot of time to work at perfecting it. Anyone who thinks NV Surround isn't a limited form of SLI Mosaic is fooling themselves. With that being said, I still think that being able to support multi-monitor gaming on a single card is a big advantage for Eyefinity, and we'll see how that pans out. Is the demand for a multi-monitor rig that much greater than the demand for a multi-GPU rig? I don't know, but I bet outside communities like this there is little regard for either. However, the argument can be made that because Eyefinity works on a single card, it has more opportunity to be accepted by the mainstream, and that I can agree with.

Just as Eyefinity will probably support bezel management at some uncertain point in the future, NV Surround will probably support a single-card implementation at some uncertain point in the future. I'm just trying to see the best in both implementations, and enjoying the fact that both ATi and nvidia are pushing for more support of multi-monitor gaming, which I think we can all agree is thanks to ATi making it far more accessible with Eyefinity.
 
How does it not make sense? If you have a 5870 and Fermi came with said features at the same price, would you ditch your 5870?

How does it make sense? Why would I want to spend another $400 to get what I already have? Unlike some people, my 5870 is running flawlessly. No big cursors, no stuttering, no coil whine, no gray screen, no vertical lines, etc... and I run everything maxed out.
How is getting a fergie (and I doubt you'll see it at $400 regardless) gonna improve on that?
It isn't bringing anything new to the table as far as I can tell.
 
What's it to you if they're a little late? Rather than push out some half-assed GPU like the FX 5800 series, why not get it right a little late? If they had released it way back when they were supposed to, we'd be hearing about how they rushed out a half-assed product. Damned if you do, damned if you don't, I suppose. :rolleyes: Every company releases PR slides. It's called business. Take an economics class; hell, look at our world. Everything is business. The job shedding in the economy is proof. Before a company goes down, they'll shed the little guy. Before nVidia gets down on its knees for ATI, it will release a PR slide emphasizing its GPU's accomplishments. ATI has done this. They continue to. And even worse, remember Quack.exe?

http://www.techreport.com/articles.x/18332

How soon people forget. Don't act like ATI is all innocent; they are not. Business first, as always.

5870 PR slides from September 09

I'm fairly new to the forums, and I don't really keep track of who says what. But I gotta say, with the way you come across defending nvidia, I get the feeling that nvidia could take a shit, slap an air cooler on it, call it the Fumy, and you'd still claim it was revolutionary. Sorry, but that's how you come across...
 
How does it not make sense? If you have a 5870 and Fermi came with said features at the same price, would you ditch your 5870?

Because the ATI 6000 series is coming soon enough after Fermi is out. It's stupid to waste time and money on Fermi instead of waiting for the 6000 series.
 
To the non-fanboys this poll reads:
If you could sell your current card and get something significantly better for the same money would you?
Yes
No

To the fanboys this reads:
Would you sell ati and buy nvidia?
Yes
No

The results of this poll are more an indictment of the state of these forums than anything.


Maybe, but nvidia fanboys would never buy an ati card :D
 
Bro, I lived through the 90's. Microsoft's antitrust trial. I know all about monopolies, and I do not like them. Choice is a powerful thing. It really is. But you and I both know full well that Eyefinity CrossFire support, and maybe nFinity as a whole, were added late in the game to stay afloat. When the real benches come out, we'll be scouring them for game benchmarks to compare against the 5870 and 5970.

Yes, added in a hotfix shortly after release... Very late in the game :rolleyes:. Clearly the hardware had the capability from the start. Let's see if Fermi can remove the SLI requirement in a driver update, and then we'll see which technology was an afterthought.
 
I run a pretty high-end system, except for the video card (well, for now), and I've been very close to getting a 5970 on many occasions; I would have if they'd been in stock. But the more I think about it, the more I realize I don't need it. That was the SHINY, NEED IT NOW wearing off.

I can still play any game coming out at max settings at 1920x1200 (no DX11), so why would I upgrade? So I can go from 60 fps in vsync to 60 fps in vsync?

I don't know, it seems pointless. I'll be looking to upgrade when the 5800 refresh comes out, or the 6k series at the end of the year. At the moment hardware is many generations ahead of games. There is really no reason to upgrade if you have a quad-core and a 2xx or a 48xx or better. Just sit tight.
 
How does it not make sense? If you have a 5870 and Fermi came with said features at the same price, would you ditch your 5870?

Okay, I'll spell it out for you.

Option 1's biggest problem is that the likelihood of GTX380 being priced equal to HD5870 seems incredibly small. GF100 is a big chip, which means no matter how Nvidia prices their GF100, AMD will always be able to sell the HD5870 for less and still make more profit than Nvidia. If GF100 ends up faster than the HD5870 (likely) then AMD will cut prices until the HD5870 offers similar or better performance/$ so that they still get sales. That's the way the market will stabilize because of basic free market principles. If this doesn't make sense to you, go back to school.

However, in the magical hypothetical world where GF100 is actually the same price as the 5870, it still seems like a lousy proposition. If I already had an HD5870, why would I spend $400+ for a small-ish 30% gain? Especially when the HD5870 slaughters current games (as you have said yourself) there doesn't seem much value in getting 30% higher framerates. You could argue that future games would make better use of GF100's power, and you'd be right... But if I'm going to wait for future games then I have time to wait for future graphics cards too, where I'd likely get more bang for those same 400 bucks (plus interest!). edit: okay, I could sell the 5870 and help reduce those costs, but I certainly wouldn't get anywhere close to MSRP with GF100 romping around at that price ;)

So Option 1 is pretty lame.

Option 2 says that I'd want to keep my 5870 no matter what. There could be certain theoretical (but unlikely) scenarios that would convince me to get rid of my 5870. To vote for option 2 would be like admitting fanboyism. It seems like you're trying to troll.

edit:However, in retrospect I can see that you're trying to be lighthearted about this, so I guess I shouldn't have been upset that your poll options weren't completely literal. Sorry.

Here's some info from a recent Tech Report article:
http://www.techreport.com/articles.x/18332

^^^ That last quote is what gets me. This is exactly what I was saying in my other Fermi thread. Fermi appears to be a game changer. This is the card that will be the forefather of many cards to come that aspire to beat it geometry-wise. Future cards will undoubtedly outdo this card's geometry, but this is the card that will put the focus on geometry on the map. Mark my words.

It is certainly interesting that Nvidia has taken the effort to parallelize their tessellator and rasterizer operations. It's forward-thinking, so kudos to Nvidia for taking initiative and trying to innovate instead of simply doubling their shader cores.

With that said, it's not like AMD's Evergreen tessellator and ROPs are unimpressive either. AMD just hasn't made a big stink about it, because they've had hardware tessellation support for years and years.

Furthermore, bundling these functions into Fermi's GPCs might actually be problematic for budget parts. Imagine a chip with only a single GPC: it would have lower fillrate and less shader power, and so would be targeted at lower resolutions... but it would also have less geometry power. Unfortunately, geometry load stays consistent between resolutions, meaning that a low-end Fermi part would actually struggle with geometry in new games designed for tessellation. I could be wrong; I'm just guessing.
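To illustrate that last point with a back-of-envelope sketch (the numbers below are invented for illustration, not real specs for any card):

```python
# Back-of-envelope sketch with invented numbers (not real specs for any GPU).
# The point: pixel work scales with resolution, but per-frame geometry work
# does not, so a cut-down chip with proportionally fewer geometry units still
# faces the full triangle load of a tessellation-heavy game.

FRAME_TRIANGLES = 2_000_000  # hypothetical tessellated scene, per frame
FPS = 60

for name, (w, h) in [("low-end, 1280x720", (1280, 720)),
                     ("high-end, 2560x1600", (2560, 1600))]:
    pixel_rate = w * h * FPS           # fill work: scales with resolution
    tri_rate = FRAME_TRIANGLES * FPS   # geometry work: stays constant
    print(f"{name}: {pixel_rate / 1e6:.0f} Mpix/s fill, "
          f"{tri_rate / 1e6:.0f} Mtri/s geometry")
```

Fill demand drops by roughly 4x at the lower resolution, but the triangle demand doesn't drop at all, which is why a chip that scales its geometry units down along with everything else could get squeezed.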
 
This is a post not meant to be taken seriously. It's to see whether, in a situation like that, gamers would let go of their 5870s. A way to weed out fanboys. One thing I find funny, though, is that ATI guys wanted CrossFire support for Eyefinity, but now that Fermi needs SLI to do nFinity, they don't put much emphasis on it. I think requiring dual GPUs for it is somewhat cumbersome, but in the long run it will probably ensure a better, smoother experience. Eyefinity / nFinity needs serious power.

Bad news: Your poll is too simple. It mostly fails as a fan*** detector.
Even if I decided Fermi was better, selling my current card is pain in the ass, so I prolly would not bother.
Even if I decided Fermi was better, my current card is performing quite well in all my games, so I prolly would not bother.
Even if I decided Fermi was better, I have serious issues with Nvidia's hinder-and-block behaviour that hurts gamers, so I prolly would not bother.
Even if I decided Fermi was better,..........

You get the point now? You don't need to be an AMD fan*** to not buy Fermi. You could just be satisfied with your current situation, hate Nv's behavior, or any number of other things. Trying to decide who is and is not a fan*** from it would be somewhat difficult.

Now if your poll was about somebody already in the market for a new card it would be different.
If Nv stopped their block and hinder tactics, Fermi was better for the same price, and I was in the market for a card, then yes I would buy one.

Good news: Your poll does not totally fail as a fanboy detector. It marks out one quite clearly.
 
Because it's still a half-assed, power-hungry, hot chip that's going to take your whole bank account to buy. Anything else? Go buy your friggin' Fermi card already and save us from your crap. Oh wait, you can't. Damn, that's right, it doesn't exist. Guess we have to deal with your crap for another two months.

*shoots self*

What does that make a 5970? I hear the VRMs run obscenely hot, and forget about overclocking that POS. Also, what's the average price tag on that POS?

You're just posting BS based on completely unconfirmed sources. At least wait for Charlie to comment on the subject, since he has been right for the most part, whereas Fudzilla is a fucking joke.
 
Fermi is going to be super expensive, super hot, and I honestly don't know about perf yet. I do not put any weight on the benchmarks NVIDIA has shown.

Now, can we act like adults please?
 
I agree, those PR slides are always a joke. AMD does the same thing. I'm sure Nvidia will release a slightly scaled-down Fermi, something like the GTX 260 was to the GTX 280, hopefully for a reasonable price. Although I'm sure they will gouge for that first month, like always. Where Nvidia really needs help is if Fermi doesn't scale down; they will be absent from the mainstream market, much like now.

Now, on the super-hot topic: what was the 4870? You had your choice between super hot and super loud, in my experience. How many cards had degraded memory chips? What happens when you run OCCT fullscreen at any decent resolution on any reference 4870/90/X2? I would also be willing to bet that Fermi will run cooler than Hemlock. There is no overclocking on air with that thing (the 5970) from what I hear. Not that the VRMs on my 280 don't run hot, just nowhere near as hot as ATI's X2 monsters.

I would gladly pay a little more for a faster single-GPU card. If Fermi does turn out to be as fast as Hemlock, it deserves a slightly higher price. Let's be honest, CrossFire is a headache. It seems like when you really need it, it doesn't scale. I can go into more detail, since I am speaking from experience and not trolling.

Lastly, why does Eyefinity come into every topic? What percentage of either company's customers would end up using it? Like 1%, if that? I always laugh when I see people running Eyefinity on three TN panels. I'm sure that looks great with the viewing angles, backlight bleed, color accuracy, etc. of those TN panels.

I hope Fermi delivers, or at least causes ATI to drop the price of the 5870, since at this point I can't justify spending over 30% more than I paid for my GTX 280 over a year ago, and the 5850 doesn't sound like much of an upgrade, tbh.

The fact that they aren't showing any benchmarks has me more worried.

This I agree with, but it sounds like Charlie has been right all along. Expect a paper launch in late February at best.
 
I felt like voting because I don't think 5850 owners should be excluded.

I voted keep the card because we know that two Fermi cards are required for nFinity. We know that nVidia will not sell Fermi at $150/$200. Therefore, the 5850/5870 is the better option, since it allows multi-screen gaming at a cheaper price.

Who cares about raw horsepower? Give me more features at playable framerates! I'd choose 3x24" LCDs over a 30" LCD any day.
 
I only own one game that I'd want more than 1 GB of video memory for: GTA IV. With everything maxed, I cannot turn the view distance up to where I want it. The game won't allow it because of a lack of video memory. I have an Asus 5870.
 
I voted keep the card because we know that two Fermi cards are required for nFinity. We know that nVidia will not sell Fermi at $150/$200. Therefore, the 5850/5870 is the better option, since it allows multi-screen gaming at a cheaper price.

That cheaper price gets you a crappy experience, though. Dealing with 25 fps, or having to drop settings to medium, is a bad tradeoff for three screens. Both ATI and Nvidia need multiple GPUs to do it right. ATI is only definitely better for the few people who don't want to power their setup properly.
 
I felt like voting because I don't think 5850 owners should be excluded.

I voted keep the card because we know that two Fermi cards are required for nFinity. We know that nVidia will not sell Fermi at $150/$200. Therefore, the 5850/5870 is the better option, since it allows multi-screen gaming at a cheaper price.

Who cares about raw horsepower? Give me more features at playable framerates! I'd choose 3x24" LCDs over a 30" LCD any day.

I'm sure everything must look really uniform across those three TN panels. :rolleyes:

From the benchmarks I have seen, you need the raw horsepower for Eyefinity, especially if you want playable framerates.

That cheaper price gets you a crappy experience, though. Dealing with 25 fps, or having to drop settings to medium, is a bad tradeoff for three screens. Both ATI and Nvidia need multiple GPUs to do it right. ATI is only definitely better for the few people who don't want to power their setup properly.

QFT
 
I've got a question, and unfortunately on these boards I feel I have to mention that I'm a neutral video card purchaser. I've only skimmed two posts about Fermi so far, both started by that shuttleLuv guy, and I agree he's definitely a fanboy, so I can see why people are a bit harsh in criticizing him. I also don't have anywhere near the technical knowledge of most people around here. That's why I'm asking this.

I am wondering why people seem to be putting down NVidia's emphasis on geometry. Obviously DX11 tessellation support is nothing new, and claiming so is just wrong. The extra computing power dedicated to geometry did honestly get me a bit excited, though. I've thought for quite a while, especially after seeing what Crysis was capable of, that shaders can make things look pretty near realistic, but even in those beautiful scenes I wished the models and everything could have a higher poly count. Is that something the extra geometry power in Fermi would be capable of providing? I also understand that it's up to the devs, well, mostly the artists, to provide higher-poly-count objects, but people didn't exactly like it when the software came before the hardware, as in Crysis' case, so I figured providing the power was a step in the right direction. Oh, and would higher-res heightmaps benefit from increased geometry power? Stuff like that.

Am I wrong in thinking these things? I don't want to fall victim to NVidia's propaganda, and the way you guys are talking makes me feel like I'm missing something.

And one last thing: is it fair to make this comparison? Tessellation is to geometry as LoD is to textures. I'm going to watch some vids on tessellation now that I have time. Hope I'm not too far off.
 
I've got a question, and unfortunately on these boards I feel I have to mention that I'm a neutral video card purchaser. I've only skimmed two posts about Fermi so far, both started by that shuttleLuv guy, and I agree he's definitely a fanboy, so I can see why people are a bit harsh in criticizing him. I also don't have anywhere near the technical knowledge of most people around here. That's why I'm asking this.

I am wondering why people seem to be putting down NVidia's emphasis on geometry. Obviously DX11 tessellation support is nothing new, and claiming so is just wrong. The extra computing power dedicated to geometry did honestly get me a bit excited, though. I've thought for quite a while, especially after seeing what Crysis was capable of, that shaders can make things look pretty near realistic, but even in those beautiful scenes I wished the models and everything could have a higher poly count. Is that something the extra geometry power in Fermi would be capable of providing? I also understand that it's up to the devs, well, mostly the artists, to provide higher-poly-count objects, but people didn't exactly like it when the software came before the hardware, as in Crysis' case, so I figured providing the power was a step in the right direction. Oh, and would higher-res heightmaps benefit from increased geometry power? Stuff like that.

Am I wrong in thinking these things? I don't want to fall victim to NVidia's propaganda, and the way you guys are talking makes me feel like I'm missing something.

And one last thing: is it fair to make this comparison? Tessellation is to geometry as LoD is to textures. I'm going to watch some vids on tessellation now that I have time. Hope I'm not too far off.

Well, you're in luck: I happen to have asked a 3D artist who does this for a living, and here's your answer. Take note, Shuttlelove: http://www.xtremesystems.org/forums/showpost.php?p=4217368&postcount=1963

Oh that's right looks like us geometry haters will be eating our words................................











in 10 - 15 years.
 
I'm sure everything must look really uniform across those three TN panels. :rolleyes:

From the benchmarks I have seen, you need the raw horsepower for Eyefinity, especially if you want playable framerates.

Who said anything about TN panels?! You know what happens when you assume.

I am using 2x IPS panels and one TN panel at the moment. Sure, there is some color shift, but is it worth purchasing another 22" IPS panel? That's arguable.
I can tell you do not own a 5850/5870 yet based on your biased comments. You're just deluding yourself and trying to justify NOT purchasing it because you're either an nvidia fanboy or you can't afford it.

No, you do NOT need more horsepower for "playable" framerates. At 5040x1050 I can play Dirt 2, Shift, Dragon Age, Grid, and most games at max settings with a Radeon 5850. My framerates in Grid are consistently over 90 fps, and in Shift near 60 fps. Crysis Warhead at mixed mainstream/gamer settings averages 30 fps. That's playable.
 
That cheaper price gets you a crappy experience, though. Dealing with 25 fps, or having to drop settings to medium, is a bad tradeoff for three screens. Both ATI and Nvidia need multiple GPUs to do it right. ATI is only definitely better for the few people who don't want to power their setup properly.

So since you own a 5870, you are one of those who doesn't want to power his setup properly? :eek:
The 5850 gives great framerates with max details in most games at 5040x1050. I would not call it a "crappy" experience by any stretch. Your argument is only valid for those running 3x24" (or larger) screens.

Are you seriously going to argue that it's not the fastest card right now? Or are you implying that Fermi will do better at nFinity performance? I hope it does, because it needs TWO cards. Compare it to CrossFire 5850s/5870s and then you have a fairer comparison.
Wait and see what they're going to sell and how it performs; then you can start the bashing about "powering your setup properly".
 
I've got a question, and unfortunately on these boards I feel I have to mention that I'm a neutral video card purchaser. I've only skimmed two posts about Fermi so far, both started by that shuttleLuv guy, and I agree he's definitely a fanboy, so I can see why people are a bit harsh in criticizing him. I also don't have anywhere near the technical knowledge of most people around here. That's why I'm asking this.

I am wondering why people seem to be putting down NVidia's emphasis on geometry. Obviously DX11 tessellation support is nothing new, and claiming so is just wrong. The extra computing power dedicated to geometry did honestly get me a bit excited, though. I've thought for quite a while, especially after seeing what Crysis was capable of, that shaders can make things look pretty near realistic, but even in those beautiful scenes I wished the models and everything could have a higher poly count. Is that something the extra geometry power in Fermi would be capable of providing? I also understand that it's up to the devs, well, mostly the artists, to provide higher-poly-count objects, but people didn't exactly like it when the software came before the hardware, as in Crysis' case, so I figured providing the power was a step in the right direction. Oh, and would higher-res heightmaps benefit from increased geometry power? Stuff like that.

Am I wrong in thinking these things? I don't want to fall victim to NVidia's propaganda, and the way you guys are talking makes me feel like I'm missing something.

No, what people are putting down is Nvidia's insanely heavy marketing and how strongly some people believe it will be revolutionary. Faster is always welcome, but that is what people expect, especially given the rumored price and transistor count. Fermi is most likely going to be faster than the 5870, but that doesn't make it special or unique by any means.

Geometry is just another name for polygons, and video cards have been able to push more and more of them pretty much every year. Fermi is just continuing tradition, not setting a precedent.

And one last thing: is it fair to make this comparison? Tessellation is to geometry as LoD is to textures. I'm going to watch some vids on tessellation now that I have time. Hope I'm not too far off.

Kind of. Tessellation is dynamic and can be used to add and remove polygons based on a heightmap. It's basically taking the idea of bump maps but actually creating the polygons rather than faking them.
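To make that concrete, here's a minimal CPU-side sketch of the idea (the heightmap function below is a made-up stand-in; real DX11 tessellation runs in GPU hardware, but the principle is the same):

```python
import math

def heightmap(u, v):
    """Stand-in for a grayscale heightmap texture sample (invented function)."""
    return 0.1 * math.sin(2 * math.pi * u) * math.sin(2 * math.pi * v)

def tessellate_quad(level):
    """Subdivide a unit quad into a (level x level) grid and displace each
    new vertex along the surface normal (+z here) by the heightmap value.
    Real new polygons get created, where a bump map would only fake the
    lighting on the original flat quad."""
    verts = []
    for i in range(level + 1):
        for j in range(level + 1):
            u, v = i / level, j / level
            verts.append((u, v, heightmap(u, v)))  # z = displaced height
    return verts

# Higher tessellation level -> more real geometry from the same base mesh.
print(len(tessellate_quad(4)), "vertices at level 4")    # 25
print(len(tessellate_quad(64)), "vertices at level 64")  # 4225
```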
 
Who said anything about TN panels?! You know what happens when you assume.

I am using 2x IPS panels and one TN panel at the moment. Sure, there is some color shift, but is it worth purchasing another 22" IPS panel? That's arguable.
I can tell you do not own a 5850/5870 yet based on your biased comments. You're just deluding yourself and trying to justify NOT purchasing it because you're either an nvidia fanboy or you can't afford it.

No, you do NOT need more horsepower for "playable" framerates. At 5040x1050 I can play Dirt 2, Shift, Dragon Age, Grid, and most games at max settings with a Radeon 5850. My framerates in Grid are consistently over 90 fps, and in Shift near 60 fps. Crysis Warhead at mixed mainstream/gamer settings averages 30 fps. That's playable.

So, are 3 IPS panels comparable in price to a single 30"? That seems to have been your original point.
 
People need to stop posting from their perspective and think of it from NVIDIA's perspective. No one is doing this.

You are NVIDIA. You were the talk of the town with DX10 and the 8800GTX, G80 core.

Since then, you have re-branded cards and changed model numbers so many times it's not funny.
You release the 200 series, great cards. But ATI have the 4 series.

You (NVIDIA) didn't see this coming and not at this price. You scramble. You drop prices..... ATI are able to drop more and release newer cards (4890, etc).

You announce GF100, Fermi, targeted towards GPGPU and not gaming, where you were built from the ground up. Sounds like ties to IW too; backstabbing your core audience.

Your competitor, ATI, release their 5 series. You sit back and feel almost sick... tell everyone your card is coming in October.

Delays, TSMC problems, probably internal conflict too.

Board partners see the new 5 series cards' pricing/performance... they begin to doubt. They ask you for a discount on wholesale GPUs/reference boards. You decline because you're already running on RAZOR-THIN MARGINS.

Board partners stop making/selling your cards. By end of Nov/Dec, there's a worldwide shortage of high-end 275s/285s/295s. You then (because you declined to cut your prices) discontinue the high-end 200 series.

Yet, all this time keyboard warriors bag ATI for their "paper launch", "stock shortages" and "price gouging".

The CEO steps out with what appears to be a near-final card, nearly 2-3 months ago, yet you (NVIDIA) are clearly lying, as there's nothing on show apart from that. At another tech function, NVIDIA show off some sled and rocket.

WOW. This is going to sell cards people!!

You don't see EyeFinity coming.
You don't see the lower power consumption yet arse-kicking performance coming (180W for 5870).

You delay, delay and refine tech.

At the tech function for the rocket sled, you show off "Surround Gaming". Clearly, NVIDIA didn't build the Fermi for multi monitor gaming (it's a GPGPU first, gaming second... NVIDIA's words).

3-monitor gaming (to keep up with your competitor) requires 2x cards. 2x cards require an SLI motherboard. We find out high-end GF100s will use 285W each. That's nearly 600W of power dedicated just to the GPUs (to run 3 outputs). It also requires a more expensive and more limited range of SLI-equipped motherboards.

After all of that, people still think NVIDIA are coming out of this unscathed?

A company that once led the market. A company that once innovated, instead of going stagnant in 2006/2007. Now we get product releases with bugs/issues (295). We get products that never get the right support (3D Vision). We get products that aren't made for certain functions, yet are released to keep up with competitors' offerings (Nvidia Surround).

After all of that, you still think this 6-month-delayed product, a beast to build, that will use more power and wasn't built for gaming or 3-monitor support, is the answer? We were shown a product that was a clear FAKE; if that card was real, why have we yet to see any real benchmarks?

It's just amazing how blind people can truly be...
 
Here's the real question: what if Fermi STILL requires 2 cards for triple-monitor gaming? We're talking at least $800 to side with nVidia and get the Eyefinity equivalent. Even if they were the same cost as the 5870, faster, and had OMG TRUE GEOMETRY, there's no way in hell I'm buying two of them for my three monitors.


I'm so sick of the ATI drivers. Even if two cards are needed, I'm back on board the nvidia train.

I'll throw these two 5870s in my other rig and run them @ 1920x1200.

The last ATI 10.1 drivers were a big disappointment for me, and there is a point when you just can't take the frustrations anymore.
 
So, are 3 IPS panels comparable in price to a single 30"? That seems to have been your original point.

Sure. NEC EA231wmi. 1920x1080 and $300. It's a tradeoff between extra FOV and vertical viewing angles.

2560x1600 or 5760x1080? Peripheral vision is a huge benefit in games. I'd still go for the three smaller screens. Rotate the NEC EA231s into portrait for a 3240x1920 desktop resolution. 30" LCDs are far too expensive in my opinion. Multiple monitors are better, especially for normal desktop use.
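For anyone checking the arithmetic on those resolutions, a quick sketch:

```python
# Arithmetic behind the resolutions above: three 1920x1080 panels.
W, H, N = 1920, 1080, 3

landscape = (N * W, H)  # side by side in landscape
portrait = (N * H, W)   # rotated 90 degrees: each panel is 1080 wide, 1920 tall
print("landscape:", landscape)  # (5760, 1080)
print("portrait: ", portrait)   # (3240, 1920)
```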
 
If it arrived at the same price and had higher performance (the same-price thing won't happen), then it would be senseless not to get the highest possible price/performance ratio. Brand loyalty is silly.
 
Gotta love the new-video-card time of year. So many panties in so many bunches, it's an underwear lover's wet dream.
 
Sure. NEC EA231wmi. 1920x1080 and $300. It's a tradeoff between extra FOV and vertical viewing angles.

2560x1600 or 5760x1080? Peripheral vision is a huge benefit in games. I'd still go for the three smaller screens. Rotate the NEC EA231s into portrait for a 3240x1920 desktop resolution. 30" LCDs are far too expensive in my opinion. Multiple monitors are better, especially for normal desktop use.


:D

Wow, I didn't know. I may have to look into that monitor. I would love to chuck my crappy TN panel.

Thanks, and I agree; at that price I would also go with three of these in Eyefinity over a 30".
 
I'm so sick of the ATI drivers. Even if two cards are needed, I'm back on board the nvidia train.

I'll throw these two 5870s in my other rig and run them @ 1920x1200.

The last ATI 10.1 drivers were a big disappointment for me, and there is a point when you just can't take the frustrations anymore.

So you think the drivers for a brand-new Nvidia architecture will somehow be better right out of the gate than the ATi drivers for cards that have been on the market for four months now (six months by the time Fermi is expected to launch)?



I highly doubt it.
 