Fermi/GF100 tidbits: NDA expiry, performance, pricing.

Not really. A 5870 is $400+, the 5970 is $600+, and you need more than one for the best performance, so the high-end market really hasn't changed all that much.

Yes really. Look what ATI did with the 4000 series: they proved you can have high-end performance without a high-end price tag. The only reason the 5000 series is that expensive is that Nvidia hasn't put out anything to contend with it. Which brings us back to the consumer standpoint: we need Nvidia to come out with a video card close to ATI's price point so that we consumers won't have to spend $500+ on a high-end graphics card. Will Nvidia lose money? Yes, but I bet in the long run they will draw more customers to their products, just like ATI did. The $500 price point scares people away, especially in times like these.
 
Small interest here, but... in any case, my 2560x1600 makes good use of these cards ;). As far as "OMG 500 ZOH NOES" goes, if the performance is there, it will be worth it. Worth is subjective... I would rather have a spectacular time with my gaming than a good one, and since I do a lot of it, it's easily worthwhile. All I see in this thread is "I don't want it so no one else does" rather than any compelling arguments. Also, LOL @ people acting like ATI altruistically drove down prices. On pricing, nothing is different now from 10 years ago: the high end costs more and isn't penny-for-penny as efficient, but it's much better overall, just like 99% of other products.

Yes dude, I mostly agree. We know about diminishing returns in high-end products.

Take a typical luxury car priced at around $60k and then take another, higher-end niche luxury car priced around $80-90k. Different? Yes, somewhat. A lot more car for the money? No. Worth the extra $30k? Hell no. However, that's how it is at the ultra high end.
 
You, sir, fall into that general-populace factor I was talking about. Common-sense folk like you and myself remember the days of the GeForce wars. I know I won't ever pay that much for a video card again; $300 is my limit.

So sanity prevails in the enthusiast/home computing market. But NVIDIA (et al.) are trying to grow into the HPC/supercomputing market, leveraging the same chips along the way (CUDA/OCL FTW). NVIDIA might not need to charge more than $300-400 anymore, because they will be charging exorbitant fees for the Tesla and Quadro cards.
 
First off, will these new nVidia cards do "nFinity" like AMD's Eyefinity? I know there is an option coming where old nVidia cards can run triple-screen gaming, BUT only with 2 cards in SLI :rolleyes: Will just one single new-generation nVidia card do it?

Plus, these are due in March? That will be 6-7 months after the 5800 series was released; I would think ATI has a refresh coming by then? And by March, I expect the chip makers will have their factories in full swing for AMD, with no shortages.

Supposedly Nvidia's surround will only work with (at least?) 2 cards, since it will have one card handle the left and center monitors while the second card handles the right display, using something like its "Quadro Plex" technology (a toy sketch of the rumored split follows the link below):

http://www.nvidia.com/object/qplex_display_configurations.html
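
Purely as a toy illustration of that rumored arrangement (the display names, resolutions, and the left+center/right split below are assumptions drawn from the rumor, not any real nVidia API):

```python
# Toy model of the rumored GF100 surround setup: two GPUs in SLI, with
# GPU 0 driving the left and center displays and GPU 1 the right one.
# Pure speculation from this thread, not a real nVidia interface.
from dataclasses import dataclass


@dataclass
class Display:
    name: str
    width: int
    height: int


displays = [Display("left", 1920, 1080),
            Display("center", 1920, 1080),
            Display("right", 1920, 1080)]

# Rumored assignment: card 0 gets left + center, card 1 gets right.
gpu_outputs = {0: displays[:2], 1: displays[2:]}

surround_width = sum(d.width for d in displays)
for gpu, outs in gpu_outputs.items():
    span = sum(d.width for d in outs)
    names = ", ".join(d.name for d in outs)
    print(f"GPU {gpu}: {names} -> {span}x{outs[0].height} "
          f"of the {surround_width}x{displays[0].height} surround span")
```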
 
Yes really. Look what ATI did with the 4000 series: they proved you can have high-end performance without a high-end price tag. The only reason the 5000 series is that expensive is that Nvidia hasn't put out anything to contend with it. Which brings us back to the consumer standpoint: we need Nvidia to come out with a video card close to ATI's price point so that we consumers won't have to spend $500+ on a high-end graphics card. Will Nvidia lose money? Yes, but I bet in the long run they will draw more customers to their products, just like ATI did. The $500 price point scares people away, especially in times like these.

No. If ATI could get away with it, they would price as high as nV. Also, the 4xxx cards were lower priced because they weren't up to snuff at higher resolution or AA settings. They would have rotted if they had charged what nV did at the time. The market is basically still the same.
 
With TWICE the performance of a 285 and DX11, they will. I know I will! :D

You're making the egocentric mistake of assuming that everybody is like a person who owns a tri-SLI machine. And that's not even getting into the questionable logic and reasoning that went into making that decision.
 
You're making the egocentric mistake of assuming that everybody is like a person who owns a tri-SLI machine. And that's not even getting into the questionable logic and reasoning that went into making that decision.

People who could afford it and wanted the absolute fastest setup possible went tri-SLI. I know if I could have afforded it, I would have.
 
No. If ATI could get away with it, they would price as high as nV. Also, the 4xxx cards were lower priced because they weren't up to snuff at higher resolution or AA settings. They would have rotted if they had charged what nV did at the time. The market is basically still the same.

I don't agree with that at all; you obviously don't know anything about marketing or looking at the big picture. ATI is the popular choice among consumers now because of their marketing approach with the 4000 series. That's not to say that Nvidia will do the same; I bet they won't, but they should consider it. Who knows, by this time next year Nvidia may be back on top again. Wyldfire did make a really good point, though: Nvidia does have a lot of tech to supply funds for. I bet there is a lot of truth in what he says.

Not up to snuff compared to what at the time of release? The 9800GTX?
http://www.fudzilla.com/content/view/8111/40/
 
$520? Pfft. When will Nvidia learn? The days of $500 video cards are over. The general populace isn't going to spend that kind of money on a video card. It's a niche market at best.

Did you read that? It kicks the fuck out of the 5870, and the 5870 is what I was going to get next.

-Actually, let me say this: IF Nvidia has a GOOD Eyefinity-style solution, then YES, I will pay $520 for it.

If Nvidia has a crappy Eyefinity-style solution, I'll still be jumping for joy, because the prices for the 5870 should drop.

See? All of the fanboys who were screaming that Nvidia is finished should be jumping for joy. This product will help lower the cost of the ATI cards.

No matter what card I end up getting, I, the consumer, have won.

None of us should be cheering the demise of a competing company. Talking shit? Sure... but predicting the demise of a company with perverse glee... negative.

And that douche Charlie Demerjian (or whatever his name is) just got a big STFU, in the form of a solid product, from Nvidia.
 
Did you read that? It kicks the fuck out of the 5870, and the 5870 is what I was going to get next.

-Actually, let me say this: IF Nvidia has a GOOD Eyefinity-style solution, then YES, I will pay $520 for it.

If Nvidia has a crappy Eyefinity-style solution, I'll still be jumping for joy, because the prices for the 5870 should drop.

See? All of the fanboys who were screaming that Nvidia is finished should be jumping for joy. This product will help lower the cost of the ATI cards.

No matter what card I end up getting, I, the consumer, have won.

None of us should be cheering the demise of a competing company. Talking shit? Sure... but predicting the demise of a company with perverse glee... negative.

And that douche Charlie Demerjian (or whatever his name is) just got a big STFU, in the form of a solid product, from Nvidia.

Well said, good sir... well said.
 
Did you read that? It kicks the fuck out of the 5870, and the 5870 is what I was going to get next.

-Actually, let me say this: IF Nvidia has a GOOD Eyefinity-style solution, then YES, I will pay $520 for it.

If Nvidia has a crappy Eyefinity-style solution, I'll still be jumping for joy, because the prices for the 5870 should drop.

See? All of the fanboys who were screaming that Nvidia is finished should be jumping for joy. This product will help lower the cost of the ATI cards.

No matter what card I end up getting, I, the consumer, have won.

None of us should be cheering the demise of a competing company. Talking shit? Sure... but predicting the demise of a company with perverse glee... negative.

And that douche Charlie Demerjian (or whatever his name is) just got a big STFU, in the form of a solid product, from Nvidia.

+1. I want the fracking best card nVidia can crank out; I don't understand why anyone wouldn't. When AMD comes out with its refresh, it's unlikely I'd buy it, and deep down I might hope that it's not too much better than what I would have just bought. But in reality I would want AMD to put out the best it can, just to keep the pressure and the competitive cycle going.
 
I don't agree with that at all; you obviously don't know anything about marketing or looking at the big picture. ATI is the popular choice among consumers now because of their marketing approach with the 4000 series. That's not to say that Nvidia will do the same; I bet they won't, but they should consider it. Who knows, by this time next year Nvidia may be back on top again. Wyldfire did make a really good point, though: Nvidia does have a lot of tech to supply funds for. I bet there is a lot of truth in what he says.

Not up to snuff compared to what at the time of release? The 9800GTX?
http://www.fudzilla.com/content/view/8111/40/

The GTX 260 and GTX 280, which were released before the 4870.
 
-Actually, let me say this: IF Nvidia has a GOOD Eyefinity-style solution, then YES, I will pay $520 for it.

None of us should be cheering the demise of a competing company. Talking shit? Sure... but predicting the demise of a company with perverse glee... negative.



Totally agree about the fanboy part :rolleyes: You're a tool and king douchebag if you LOVE one company over the other :rolleyes: Unless you work for AMD or nVidia, why be a fanboy? :rolleyes:

I have been building my own rigs since the '90s, and over those years I have happily owned both nVidia and ATI cards. I NEVER care whose brand is on the card; this shit is all made in some Taiwanese factory anyway. Who cares? The tech is all the same Asian-made wires and chips etc...

What I do when a new generation of video card is released is read the reviews of the new card on 3 or 4 different websites, and make my decision based on the price and performance compared to what else is available at that time. Over the years nVidia is on top for a while, then ATI is top dog; it seems to go back and forth every couple of years.

I really don't give two fucks what brand-name card is in my computer. I mean, why the heck would I?
 
I don't agree with that at all; you obviously don't know anything about marketing or looking at the big picture. ATI is the popular choice among consumers now because of their marketing approach with the 4000 series. That's not to say that Nvidia will do the same; I bet they won't, but they should consider it. Who knows, by this time next year Nvidia may be back on top again. Wyldfire did make a really good point, though: Nvidia does have a lot of tech to supply funds for. I bet there is a lot of truth in what he says.

Not up to snuff compared to what at the time of release? The 9800GTX?
http://www.fudzilla.com/content/view/8111/40/

The GTX 260/280, as they were released before the 4870 (though right around the same time).
 
So WTF, is this a paper launch? So maybe by Christmas you fanboys can order one and then bitch about them being out of stock, blah blah?
 
So WTF, is this a paper launch? So maybe by Christmas you fanboys can order one and then bitch about them being out of stock, blah blah?

LOL... so anyone who wants the best performance is a stupid fanboy now, I see. Sounds more to me like you're an ATI fanboy who can't stand that nVidia's coming out with a better card.
 
I have no interest in Eyefinity.

Neither do I. Giant black bars running across my screen? No thank you. Also, having to buy two or three monitors instead of one? No thank you.

There's no point in going top-of-the-line these days unless you 1) care about multiple monitors, 2) want to show off in your sig, or 3) want to play Crysis for the 20th time on higher settings.

I can't even name any PC exclusives coming out soon that are going to break my PC's balls. I have a 2-year-old computer and it STILL runs everything at 1920x1200 with no lag at all.

Let's look at the top 2 (and only 2) PC game developers: Blizzard and Valve. You can run their games on a TOASTER. And maxing them out doesn't require $1500 of tri-SLI GeForce 100OMGWTFBBQ hardware.

Everything else is a console port which, again, can be maxed out by mid-range hardware.
 
So which one is more cost-efficient?

I'm not sure exactly what you're comparing in your question, but I'm pretty sure it can't be answered right now anyway, because the information necessary to make that determination for nVidia's parts is not out yet.
 
Ummm, yeah, hello... the whole point of this thread is that the specs will be posted tomorrow, aka midnight East Coast time if you're the simple type.
 
Neither do I. Giant black bars running across my screen? No thank you. Also, having to buy two or three monitors instead of one? No thank you.

There's no point in going top-of-the-line these days unless you 1) care about multiple monitors, 2) want to show off in your sig, or 3) want to play Crysis for the 20th time on higher settings.

I can't even name any PC exclusives coming out soon that are going to break my PC's balls. I have a 2-year-old computer and it STILL runs everything at 1920x1200 with no lag at all.

Let's look at the top 2 (and only 2) PC game developers: Blizzard and Valve. You can run their games on a TOASTER. And maxing them out doesn't require $1500 of tri-SLI GeForce 100OMGWTFBBQ hardware.

Everything else is a console port which, again, can be maxed out by mid-range hardware.

Do you ever run any mods? I can think of quite a few games that will break my rig's balls while all modded up. It always amazes me how many people actually post stuff like this. My favorite part of PC gaming is being able to throw mods at my games.
 
Do you ever run any mods? I can think of quite a few games that will break my rig's balls while all modded up. It always amazes me how many people actually post stuff like this. My favorite part of PC gaming is being able to throw mods at my games.

That, or 2560x1600 + AA + AF + high settings still has a lot of framerate dips even with a 5870 in most games... add in mods + uber texture packs for Oblivion, the Half-Life 2 Cinematic Mod, GTA4 at highest settings, Fallout 3, et al., and you're talking insane requirements that even that card can't meet alone. The image quality blows lower-end rigs out of the water, just as they in turn blow consoles out of the water.
 
That, or 2560x1600 + AA + AF + high settings still has a lot of framerate dips even with a 5870 in most games... add in mods + uber texture packs for Oblivion, the Half-Life 2 Cinematic Mod, GTA4 at highest settings, Fallout 3, et al., and you're talking insane requirements that even that card can't meet alone. The image quality blows lower-end rigs out of the water, just as they in turn blow consoles out of the water.

I have run modded Oblivion and modded HL2, and both ran fine. Even modded, those games are years old and aren't that taxing. AF is not taxing either, and AA sort of hits diminishing returns after 4x.

Also, I've found that 30-inch monitors are simply too large for PC gaming. I sit way closer to my PC than to my TV when gaming, and a 30-inch screen viewed from only a foot away would not work. I wouldn't be able to see the entire screen at once, which is annoying and would undoubtedly get me killed.
 
Also, I've found that 30-inch monitors are simply too large for PC gaming. I sit way closer to my PC than to my TV when gaming, and a 30-inch screen viewed from only a foot away would not work. I wouldn't be able to see the entire screen at once, which is annoying and would undoubtedly get me killed.

Have you actually used one at home for anything more than a few minutes? I sit with my eyes about 2.5 feet from my screen, and I can see the whole monitor just fine ;)... all I see is better detail. As far as "AA hitting diminishing returns" goes, 4x @ 2560x1600 with high-res texture packs or newer games can still tax it to 30-40 fps minimums, if not lower in some cases, which is too low for comfort ;). Personally I prefer 8x MSAA with transparency AA (Adaptive on ATI) so that bushes/sprites look nice and don't shimmer, which is way more taxing than plain 4x.
 
I love how all these sites are throwing up the same regurgitated paper technical releases about cores, shaders, pipelines, cache, etc. We want HANDS-ON reviews.
 
Have you actually used one at home for anything more than a few minutes? I sit with my eyes about 2.5 feet from my screen, and I can see the whole monitor just fine ;)... all I see is better detail. As far as "AA hitting diminishing returns" goes, 4x @ 2560x1600 with high-res texture packs or newer games can still tax it to 30-40 fps minimums, if not lower in some cases, which is too low for comfort ;). Personally I prefer 8x MSAA with transparency AA (Adaptive on ATI) so that bushes/sprites look nice and don't shimmer, which is way more taxing than plain 4x.

Fair enough. I just cannot justify paying $600 for a card anymore. I bought a PS3 for $299; srsly, that's a far better deal than paying twice that for a graphics card.

I'll stick to gaming on my tiny 24-incher, where I don't need to pay out the ass to run games at high settings.

PC gaming is in such a crap hole these days anyway; if it weren't for Blizzard and Valve, I'd just quit PC gaming altogether.
 
Fair enough. I just cannot justify paying $600 for a card anymore. I bought a PS3 for $299; srsly, that's a far better deal than paying twice that for a graphics card.

I'll stick to gaming on my tiny 24-incher, where I don't need to pay out the ass to run games at high settings.

PC gaming is in such a crap hole these days anyway; if it weren't for Blizzard and Valve, I'd just quit PC gaming altogether.

Agreed on PC gaming. I bought Fallout 3 a while back and couldn't play it past a few hours. Fallout 2 I spent hundreds of hours playing again and again. I bought Dragon Age not too long ago and it's sitting around, while BioWare has hit a new low in my book (as has Gamespot.com for rating it so highly for nothing interesting). I played and loved KOTOR, BG1 & 2, etc. for hundreds of hours.

Diablo 3 is going to be awesome though, and it will probably run @ 60 fps on integrated graphics. I know I'm going to get StarCraft 2 even if it is a huge rip-off. I was never a big competitive RTS player, but I did love Warcraft 3.

I had a lot of fun in MW2... for about 5 hours. Sigh. The good stuff doesn't last and the rest is shit.

How long has it been since the PS3 hit the market? 3 years now? I'm getting one in March... when FF13 comes out. I think I did the same thing with the PS2: got it for FFX. I never had a PS1, but I probably would have gotten one for FF8 (I ended up playing FF7 on PC and FF8 on the PS2).
 
Agreed on PC gaming. I bought Fallout 3 a while back and couldn't play it past a few hours. Fallout 2 I spent hundreds of hours playing again and again. I bought Dragon Age not too long ago and it's sitting around, while BioWare has hit a new low in my book (as has Gamespot.com for rating it so highly for nothing interesting). I played and loved KOTOR, BG1 & 2, etc. for hundreds of hours.

Diablo 3 is going to be awesome though, and it will probably run @ 60 fps on integrated graphics. I know I'm going to get StarCraft 2 even if it is a huge rip-off. I was never a big competitive RTS player, but I did love Warcraft 3.

I had a lot of fun in MW2... for about 5 hours. Sigh. The good stuff doesn't last and the rest is shit.

How long has it been since the PS3 hit the market? 3 years now? I'm getting one in March... when FF13 comes out.

About 3 years. I bought one a few months ago and I like it. There are far more games coming out for it than for the PC.

The graphics aren't $600 gpu quality, but honestly, it's not that bad if you sit back a little and don't TRY to look for every jagged edge and blurry texture. In motion, and from a distance, it looks pretty good.
 
Oh come on now, everyone knows that the Nvidia cards run Far Cry 2 better, but when they run Dark Void, ATI is nowhere to be seen? :rolleyes:

Well, to be fair, Dark Void is not out yet, and they probably don't have a personal copy yet to benchmark in a simulated test environment.
 
Wow, you need 2 GF100 cards to run 3 monitors... kind of a drawback / half-assed backwards attempt IMO.

Of course, not having to deal with a DisplayPort adapter is a nice bonus... but still.

I will hold off on making a decision about getting one until I see it in Kyle's hands and a review on the website.

I do not trust any benchmarks from any company when they're run on systems that company provides.

Now bring on March!!!
 
I do not trust any benchmarks from any company when they're run on systems that company provides.

I don't think too many people would trust numbers of this sort. However, nVidia has kind of set the bar high here. In releasing some real benchmark numbers, they have now set a target of at least a third more performance over a 5870; at least that's my take. Whatever part this is, it had better consistently beat a 5870 by close to a third, assuming this will be a $500 to $600 card. If a 5870 is around $400, then $520 for a third more performance would be right in line with the 5870's price versus performance (quick sanity check below).

But this is STILL just all speculation. We need real cards and numbers now to move on from here.
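
For what it's worth, here's that price/performance parity math as a quick sketch. The $400 and $520 figures are this thread's street/rumored prices, not anything official, and the "a third more performance" number is the rumor too:

```python
# Rough price/performance parity check using the thread's rumored numbers.
# All figures are street/rumored prices from this discussion, not official.

hd5870_price = 400.0   # assumed HD 5870 street price (USD)
gf100_price = 520.0    # rumored GF100 price (USD)
perf_gain = 1.0 / 3.0  # "at least a third more performance" (rumored)

# Price at which the new card would exactly match the 5870's
# dollars-per-unit-of-performance.
parity_price = hd5870_price * (1.0 + perf_gain)

print(f"Parity price for +{perf_gain:.0%} performance: ${parity_price:.0f}")
print(f"Rumored price: ${gf100_price:.0f} "
      f"({'under' if gf100_price < parity_price else 'over'} parity)")
# -> Parity price for +33% performance: $533
# -> Rumored price: $520 (under parity)
```

So at $520, the rumored card would actually come in slightly under linear price-per-performance scaling against a $400 5870, which squares with the "right in line" read above.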
 