It's starting to show (Fermi) (NEW official MSRPs)

GoldenTiger

Official MSRPs are $399.99 for the GTX 470 and $549.99 for the GTX 480.


(Attached screenshot: preordergtx480pc.jpg)


I pre-ordered my 480 for now, can always cancel prior to ship if it's not a good card per reviews... if it is, PNY has a lifetime warranty so it's all good :).

PNY 480: http://www.pcsuperstore.com/products/11112353-PNY_Technologies-VCGGTX480XPB.html

PNY 470: http://www.pcsuperstore.com/products/11112352-PNY_Technologies-VCGGTX470XPB.html

Their prices have always been accurate in the past on prior launches such as GTX 280 and 5870 (MSRP) so there's no reason to assume they'd suddenly be wrong this time around.
 
PNY's lifetime warranty is for the shelf-life of the product. Add $50-$100 for Canadians lol.
 
PNY's lifetime warranty is for the shelf-life of the product. Add $50-$100 for Canadians lol.

Ew, wasn't aware of that snafu with their warranty... thanks for the heads-up. Apparently once they end production and run out of stock, your warranty is over.
 
I thought official prices were $349 and $499 respectively, or is that site just gouging before they even launch?
 
Well, is the purchase price the "your cost"? If so, $10 over previous MSRP is better than expected.
 
I thought official prices were $349 and $499 respectively, or is that site just gouging before they even launch?

They COULD be gouging the MSRP, but honestly, I saw their GTX 280 and 5870 preorders with spot-on MSRPs and slightly lower "Your cost" listed. So, take that as you will; my hunch is they're not gouging.

http://www3.pny.com/support/media//...a-bd8a152eb045/XLR8 Warranty - 9-01-09_v1.pdf

Apparently PNY's warranty is lifetime, not shelf life. See the above link.

Well, is the purchase price the "your cost"? If so, $10 over previous MSRP is better than expected.

Yes, plus shipping.
 
Jesus, you all should just wait until reviews. I can't believe some of you commit to buying a card before a single review is out... I know the OP did mention elsewhere he can still cancel his preorder.
 
Will never buy PNY again. Being in Canada, I end up paying $50-$100 in brokerage for every RMA I send in to those clowns because they can't label a damn package properly for international shipping.
 
They COULD be gouging the MSRP, but honestly, I saw their GTX 280 and 5870 preorders with spot-on MSRPs and slightly lower "Your cost" listed. So, take that as you will; my hunch is they're not gouging.

http://www3.pny.com/support/media//...a-bd8a152eb045/XLR8 Warranty - 9-01-09_v1.pdf

Apparently PNY's warranty is lifetime, not shelf life. See the above link.



Yes, plus shipping.

Well, we have to see how PNY defines the video card and under which category. If it claims a lifetime warranty, you'd look at this:

http://www3.pny.com/support/media//...21-bae0-e0a276b6b9f0/Lifetime Warrenty_v1.pdf

edit:
XLR8 is 3 years.
XLR8 GTX is 1 year.

Shows you how much PNY cares about the GTX series. But nonetheless, if the box says lifetime warranty, I'd presume the above PDF applies.
 
dont think of it as costing 550 dollars, think of it as making 300 when you sell it to a fanboi for 850 :)
 
Well, we have to see how PNY defines the video card and under which category. If it claims a lifetime warranty, you'd look at this:

http://www3.pny.com/support/media//...21-bae0-e0a276b6b9f0/Lifetime Warrenty_v1.pdf

edit:
XLR8 is 3 years.
XLR8 GTX is 1 year.

Shows you how much PNY cares about the GTX series. But nonetheless, if the box says lifetime warranty, I'd presume the above PDF applies.

Your link is an older version of the terms and conditions, from 2006. The one I linked is the latest from late 2009. Note the "Rev" date at the bottom of each :).

dont think of it as costing 550 dollars, think of it as making 300 when you sell it to a fanboi for 850 :)

There's some positive thinking :p .
 
http://www3.pny.com/NewsEvents/PDFS/PNY_Announces_Lifetime_Warranty_for_XLR8_GTX_Series.pdf

Thus, the warranty converts to the lifetime warranty policy upon registration, which means shelf life.

The way I read that is that it is for the other cards. All new releases should switch from 1-year to the lifetime I linked, upon registration. All old cards such as the ones listed in that prior press release (from August, vs. the terms I linked from September) would go to the old one which was shelf life. In any case, there's enough doubt here I think I will be calling to make sure.
 
I thought official prices were $349 and $499 respectively, or is that site just gouging before they even launch?

Remember what happened with the 58X0 series ATI cards? Retail prices ended up $50-$100 higher due to demand and short supply and those prices are still here today, 5 months later.

Fermi is going to be like that...except WAAAY worse because its stocks will be ridiculously low. I bet you won't see any significant volume until end of May.
 
Regarding lifetime warranties being only for shelf life: it's true for many products, not just PNY. Make sure you read the terms or get confirmation from the manufacturer first. I have friends who work at a large computer retailer, and they are told to instruct customers that lifetime means only shelf life in most cases.
 
Will never buy PNY again. Being in Canada, I end up paying $50-$100 in brokerage for every RMA I send in to those clowns because they can't label a damn package properly for international shipping.

Thank god XFX and EVGA know what they are doing. Love the giant "warranty/repair replacement" sticker they slap on the box.
 
Man, I hope that's not true. For a card that performs about 5-10% better than a 5870, that's way too much. I highly doubt it will drive prices down if so. :(
 
Man, I hope that's not true. For a card that performs about 5-10% better than a 5870, that's way too much. I highly doubt it will drive prices down if so. :(

Not really a known thing yet on the performance, and the rumor mills didn't get the MSRP right even.
 
Well GoldenTiger, last time I ended up siding with you when you chose the 280 over ATi's offerings. While in theory, a much cheaper 4850 CF would have stomped the 280, the immature drivers and CF overhead and other issues led to me selling my 4850s and getting a 280. We thought you were nuts then, but time proved you right.

What will time tell us this round? I have a feeling the heat will be the biggest issue, followed closely by a lack of perceived value over the 5870, but there's no way to know for sure now...
 
Thank god XFX and EVGA know what they are doing. Love the giant "warranty/repair replacement" sticker they slap on the box.


Yes, that's all it takes. Otherwise the smacktards at customs feel the need to tax us based on the insured replacement value of what's in the box, not reading the fine print that it is an RMA/warranty replacement/repair.
 
Friday can't get here fast enough. I am basically a long-time ATI guy, but my EVGA 275 SC has been great. With these prices, I don't know. I am an impulse buyer and could get one. I have spent top dollar before, until cheapness set in.
 
The GTX 470 price is not that bad considering the 5850's price is normally $300 (though in reality you can grab one for around $250 or lower).

The GTX 480 price is too much compared to the 5870 and the rumored performance difference.
 
Ordered just for the hell of it. Figure I'll cancel if the reviews are bad and just stick with my 5870.
 
Well GoldenTiger, last time I ended up siding with you when you chose the 280 over ATi's offerings. While in theory, a much cheaper 4850 CF would have stomped the 280, the immature drivers and CF overhead and other issues led to me selling my 4850s and getting a 280. We thought you were nuts then, but time proved you right.

What will time tell us this round? I have a feeling the heat will be the biggest issue, followed closely by a lack of perceived value over the 5870, but there's no way to know for sure now...

:) Glad someone remembers (and thank you); I also called the 8800GT launch right off. RL's been treating me roughly lately, and I got sidetracked into flame wars recently this time around... still, been thinking ;).

Anyway, my honest hunch is... at 1920x1200 and 2560x1600 with 2x-4x AA it's going to be at least a good 30-35% faster on minimum framerates, with maybe a 20-25% average framerate boost (on those titles), across all games. At 1920x1080 and below without AA it'll have a far slimmer lead, which will cause many to see it as not worth it (and it probably wouldn't be for them). In short, probably similar gap-wise to the 5870's lead over the GTX 280. I absolutely don't believe the rumors that claim it is tied with or below the 5870: it doesn't make logical sense, because a) it would be product suicide to release something slower, that much later, and that much pricier, and b) the price is higher than expected, meaning they have to think they have something to offer. (Another thing is that apparently reviewers were told $350/$500 prices, because the admin of another site acted surprised when I mentioned these $400/$550 ones, meaning the change was last-minute... additionally, Best Buy's system lists the 470's MSRP at $399.99; I called with their internal SKU and asked a sales rep.)

Note that I do *not* have any inside connections with reviewers or nVidia... I just make educated guesses, so take it as you want. I'm simply connecting dots from catching posts that get deleted on various forums, as well as the timing of posts/specs/price/etc. Also making me lean toward these admittedly optimistic figures is that the FUD machine is in overdrive, with targets all over the map and mostly trending toward the improbable (merely matching the 5870, or stupidly-good-not-going-to-happen). Simply put, I think the rumor sites are being fed different rumors while mostly missing the actual picture.

Everyone can feel free to throw pies at me if I'm wrong :p .

EDIT: To touch on the heat issue: the heatsink looks incredibly beefy and well-designed: I don't think it's going to be a problem, though it will certainly limit air-cooled overclocks a bit (I don't think it'll be as bad as "it can overclock 5% and that's IT!!" level). It'll run warm, but not uncomfortably hot. They were probably planning on a higher TDP but shied away, leaving a slightly-more-than-needed cooling solution here. Why do I say only a bit? Not one, but two manufacturers already display voltage adjustment and "overclocking boost" features on their boxes (Asus being one, I forget the other offhand). They wouldn't include even a marketing bullet point for something that's going to get end users raging that it doesn't work or overheats/crashes their cards. They would at most tag it in a small line on the back of the box and leave it at that. These badges are large and center on the front of the packaging.
 
I'm guessing you don't remember the launch of the R600. It played out very similar to how Fermi is playing out. I think Charlie's right about much of his information - although he stretches it a bit with his pure hate for nvidia. The TDP isn't going to be what it is because performance is great, rather because they had to push as much power into the card as possible to get performance that would best ATI's offering.

You can't just make a good product out of thin air, and the existing 88/g92/gtx200/gtx300/whatevers couldn't be extended further with just name changes and die shrinks.

While you may not have any inside information, there are plenty of people who do; in the days of the internet, the information is bound to get passed around at some point. Most general public opinion of products prior to launch has turned out to be true, i.e. R600, Phenom. I'd wager there are only a few times, if any, that the public is truly surprised by the final product when it comes to computer hardware.

If your promise holds, I have a feeling you'll be seeing a lot of pies.. :p
 
Well when I say heat, what I really mean is performance per watt.

If Fermi burns an absurd amount of power (approaching 1.5x or 2x the 5870's) for a small performance gain (5% average with some 10% edge cases), there's no way I'm buying it. Not when ATi can double the performance of their previous architecture while increasing load power by only 5W.

I mean, yes, performance/$ is important to me, but if the card offers a really low performance/watt figure, that points to an extremely inefficient design and I won't buy it. I would rather have a really efficient, optimized design that is a fraction slower than a logic-spam design (which, let's face it, is what GT200 was) that guzzles an absurd amount of power just to keep up.
 
I'm guessing you don't remember the launch of the R600. It played out very similar to how Fermi is playing out.

While you may not have any inside information, there are plenty of people who do; in the days of the internet, the information is bound to get passed around at some point...

If your promise holds, I have a feeling you'll be seeing a lot of pies.. :p


I didn't follow the R600 (2900XT) rumors much at all at that point in time. I've followed card rumors since the V3 3000 though :).

While it may be similar, I think there's something a bit suspicious with how varied/wild the rumors and slam-machine are this time around. While people with inside info do pass it out, much of that is drowned out/lost in the sea of fake leaks and outright bull that are posted as well. There's too little signal in all this noise right now.

So, it is my hunch that the GTX480 will perform as I said above... if not, I guess I'd better get a fork out to clear some of those pies away!
 
Well when I say heat, what I really mean is performance per watt.

that guzzles an absurd amount of power just to keep up.

I'd say it'll guzzle some extra power, definitely more than the performance gained, but I don't think it'll be as bad as a 1.5x ratio. My guess would be 1.2:1 perf/watt ratio vs. the 5870.
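To put that perf/watt ratio in concrete terms, here's a minimal Python sketch of how such a ratio would be computed; the fps and board-power numbers in it are purely hypothetical placeholders, not leaked or measured figures.

Code:
# Rough perf-per-watt comparison of the kind guessed at above.
# All numbers are hypothetical placeholders, not measured values.

def perf_per_watt(avg_fps, board_power_w):
    """Average frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

# Hypothetical scenario: a card ~20% faster that draws ~45% more power
hd5870 = perf_per_watt(avg_fps=60.0, board_power_w=188.0)   # ~0.32 fps/W
gtx480 = perf_per_watt(avg_fps=72.0, board_power_w=275.0)   # ~0.26 fps/W

# A ratio above 1 means the 5870 is the more efficient card in this scenario
print("5870 : GTX 480 perf/W ratio = %.2f : 1" % (hd5870 / gtx480))  # ~1.22 : 1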
 
Hmm, if the actual prices did turn out to be $550 for the GTX 480 and $400 for the GTX 470, that wouldn't terribly surprise me, to be honest. The idea of $350/$500 pricing seemed a little bit too good to be true in a way. I think the way they'll justify the extra pricing will probably be along the lines of "Well, you get PhysX, 3D, etc., and possibly 5-10% more performance (not including PhysX or 3D, of course)."
 
http://www.xtremesystems.org/forums/showpost.php?p=4304409&postcount=1360
Supposed nVidia "reviewer's guide" benchmark #'s, avg. fps listed.

Hmmmm... if true (and the watermark matches what I've seen in the past, though who knows... remember we're still in rumor-ville), the numbers for 2560x1600 look like this percentage-wise for the average framerates (I'm guessing minimums would be higher on the 480 vs. the 5870, much like how the 5870 vs. the 280 had very high minimum FPS gains but lesser average ones):

Far Cry 2 2560x1600 4x/16x: 40.5% advantage
Hawx 2560x1600 4x/16x: 12.5% advantage
Resident Evil 5 2560x1600 4x/16x: 16.6% advantage
Crysis Warhead 2560x1600 4x/16x: 17% advantage (but still unplayable)
Batman AA 2560x1600 4x/16x: 39.3% advantage
Left4Dead 2560x1600 4x/16x: 6.4% advantage (CPU-bound?)
Battleforge 2560x1600 4x/16x: 23.5% advantage
World In Conflict 2560x1600 4x/16x: 23.7% advantage
Aliens vs. Predator 2560x1600 4x/16x: 25% advantage
Battlefield BC2 2560x1600 4x/16x : 2.3% advantage (lol)
Metro 2033 2560x1600 4x/16x: 38.5% advantage (but unplayable, 18 fps vs. 13... no settings really specified either as far as tessellation)
Bioshock 2 2560x1600 1x/16x: 11% advantage
CoD5 2560x1600 4x/16x: 15.7% advantage
CoD6 MW2 2560x1600 4x/16x: 14.5% advantage
Dirt 2 2560x1600 4x/16x: 10.4% advantage

No system specs, no driver info, etc., so take with a few pebbles of salt until official reviews show up. If this turns out relatively the same in actual reviews vs. the Catalyst 10.3, it's not a bad launch in my opinion: if not, that's going to evaporate much of the difference. :eek:
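For anyone curious how the per-game "advantage" percentages above and the averaged figures quoted below get derived, here's a minimal Python sketch; apart from the Metro 2033 pair mentioned in the list, the fps values are made-up placeholders for illustration.

Code:
# Per-game advantage from average-fps pairs, plus a simple overall average.
# Only the Metro 2033 pair comes from the list above; the others are
# made-up placeholders.

benches = {
    # game: (gtx480_avg_fps, hd5870_avg_fps) at 2560x1600 4xAA/16xAF
    "Metro 2033": (18.0, 13.0),
    "Game A":     (50.0, 40.0),
    "Game B":     (62.0, 58.0),
}

def advantage_pct(a, b):
    """Percentage by which card A leads card B on average fps."""
    return (a / b - 1.0) * 100.0

per_game = {game: advantage_pct(a, b) for game, (a, b) in benches.items()}
for game, adv in per_game.items():
    print("%s: %+.1f%%" % (game, adv))

# Arithmetic mean of the per-game advantages; this is how a figure like
# "24% average" is typically produced, weighting every title equally.
print("average advantage: %.1f%%" % (sum(per_game.values()) / len(per_game)))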


annihilat0r on XS linked thread said:
I've made the calculations:

GTX 480 beats the HD 5870 by 24% average at 1920 + 4xAA, and 23% average at 2560 + 4xAA.

GTX 470 performs nearly completely equal to HD5870 on both settings.

if those benches are correct. :p

EDIT:
Gilgamesh said:
http://futuremark.yougamers.com/forum/showpost.php?p=1346470&postcount=1

"They used the 10.2 ATi cats for their comparison by the way."

The 10.2s, from claims I've seen on forums, are around 3%-5% slower on average than the 10.3s... so I'd say it's still pretty accurate on the 5870's performance.


If your promise holds, I have a feeling you'll be seeing a lot of pies.. :p

Seems I'm saved!!!! I can put my silverware away for now... ;).


=============

Anddddd the big parade of people who said GoldenTiger was a 100% bs'ing troll with a low IQ and lacking the common sense of a turnip... amazing how they all vanished, isn't it?

Nvidia obviously spent the last few months wringing the last bit of performance out of the GTX 480 while staying under the 300-watt spec, so as to eke out some kind of performance 'win' over the 5870, without which they couldn't have realistically released the card at all. That leaves any old 5870 able to kick its ass with nothing more than a standard CCC overclock, because the GTX 480 will have zilch overclocking headroom to counter with.

So for all practical purposes, the GTX 480 won't even have a performance edge on the 5870.

No performance edge
Costs substantially more
Runs far hotter (and louder)

How is that not a massive fail?

WorldExclusive said:
Do you have anything else to do? All of your posts have been filled with BS. Ignore this guy.

GoldenTiger said "Originally Posted by GoldenTiger
How do you know this info about a 512 part being NEEDED to beat the 5870? Are you an nVidia engineer sir?

LOL @ saying Charlie would be "right", he has predicted 448, 480, and 512 in the last month alone. He can't be wrong when he claims everything possible as fact ."

-----
Common sense of a rutabaga?

A 295W TDP + 512 shaders + barely beating the 5870's performance = at the outer edge of power and using ALL possible shaders = absolute utilization and maximization of all resources available to Nvidia's engineers, and they are still only able to barely eke out a performance win.

It's obvious to even that rutabaga that ANYTHING LESS wouldn't get job no. 1 done. And job no. 1 for Nvidia's engineers IS to get a performance win over the 5870.

GoldenTiger posted: "Right, just because I don't bash nvidia every silly way possible like you, I'm a troll. Get serious...

Besides, you keep talking about how it's slow, hot, late, pricier, and performs poorly: why on earth would someone want a product like that, buddy?"
---------

it's funny, your tone completely changes every time NV is about to release a card :p Enthusiasm is good, but you always seem to lean to the green side!

either way, the overall rumors are consistent. it's hard to ignore them when there is not 1 single source saying anything to contradict them



Dashit said:
Originally Posted by GoldenTiger
We can't see it for now anyway! Besides, I thought most here didn't care anyway, given all the remarks about how fail Fermi is and the silly speculation.
----------------------
No most remarks are towards you since you like spamming the crap outta Fermi threads.

You're almost as big of a troll as DualOwn.

how is this version of fermi a success again?

7+ months late? check
no appreciable performance gain over ATi offerings? check
definitely going to be limited availability (to quote nvidia, mass availability starting on the 6th of next month)? check
massive power draw compared to competition? check

again where is the success in this part?

So which one beats the 5870 by 5-10%? The 480 512SP Ultra, or the 480 480SP normal?

Boy, this is getting all confusing now. Especially if they send out the Ultra as review samples, but only the normal GTX 480 and the GTX 470 are released.

/confused

Problem is that OBR is a known NV fanboy and he often spreads whatever NV PR tells him - in the past 6 months he created lots of articles about how awesome Fermi will be (performance like the 5970, reviews before Xmas 2009 for sure, etc.) - several of them were soon deleted by his superiors because they were just BS.

Long story short: Charlie is much more trustworthy source than OBR.

He was not in the past. "Hate" is a very motivating driving force. I believe he is spot on as before, or very close; the problem, though, is that I understand more and more why he dislikes Nvidia so much.

Charlie is either going to look like a genius or he is going to take a dirt nap.

How about the fact that it is 8 months late, running too hot (already one major buyer, Oak Ridge, has dumped them), and is on the 3rd A revision already, with B1 being done (if Kyle's heard it and feels confident enough to say that, I believe him).

The 40nm process has leakage problems. ATI adapted. NV didn't. They tried desperately and did patching to get around what is really a redesign issue.

It's a fundamental engineering problem. It IS costing NV a lot, and I doubt any A-revision Fermi cards will make any profit. Using risk wafers for retail cards? Ballsy. It's going to be EXTREMELY interesting to see their actual yields, but I very much doubt we ever will, as it will be kept under the carpet.

It's accurate {EDIT by GoldenTiger: this is in reference to Charlie's article claiming the Dirt2 results are fudged}. The Dirt demo does not recognize GF100 cards and thus switches automatically to DirectX 9. Not sure if that's a big deal tho.

So we are supposed to roll everything back to DX9 to make Fermi look better now? Sad, nVidia, sad. Too bad Semi Accurate is spot on. Irony is a funny thing.

These are DX11 parts, if we have to cripple everything to give Nvidia a fighting chance then what is the point of Fermi ...

Guess GoldenTiger wasn't such a big FUD-spreading troll with his educated guesses after all, huh? ;)
 
If that's the case, I'd see myself getting a 470 with the proceeds from the 5850s I just sold, EXCEPT I NEED 2 FOR 3D SURROUND. We still have no idea when that will release, either. :(
 
Hmm, I wonder why that chart on the linked forum doesn't include the 5970 :)? I guess nVidia didn't want to include dual-GPU cards... but wait... SLI'd GTX 480s are listed ;). I guess they wanted to be seen as being on 'top' even if there's a faster card or setup out there from a competitor. In that case, just don't show it on your fancy graph and pretend it doesn't exist, I suppose?

@GoldenTiger
I think your stats are a bit skewed as a picture of how the actual product may perform. If you went through and evaluated the two cards where playable frame rates occur, you'd find they are much closer. If you removed the top 3 performers, which 'coincidentally?' are all TWIMTBP games, the performance advantage would drop to around 10%. Take for example Crysis Warhead, which by your own account you said was unplayable under the circumstances where the GTX 480 has a 17% advantage. Why don't we look at an arguably playable-fps situation, shall we?

Crysis Warhead at 1920x1200 with 4xAA is 30.7 fps for the 5870 versus 30.3 fps for the GTX 480, which is somewhat playable, and at this resolution the 5870 actually has an extremely tiny 1% advantage. Still, it would bring down the claimed "24%" average advantage that you quoted. At 2560x1600 with 4xAA, you're down to 16.2 fps for the 5870 and 19.0 fps for the GTX 480. So yes, it pulls ahead by 17%, but what's the advantage of pulling ahead at ultra-high resolution if the frame rate is unplayable?

That would be like me claiming my car on the highway does 22 mph while your car on the highway does 19 mph. Sure, mine would be faster, but it's a moot point, as cars will still be piling up behind both of us, honking at us to move faster, passing us whenever the opportunity arises, and shouting profanities at us as they do so. Neither one of us would be getting drivable highway speeds.

I do find some of the information on that chart odd and strange. According to the chart, when you're running at 2560x1600 with 8xAA and enable SLI in games like Tom Clancy's H.A.W.X., you get a 99.3% performance improvement! Almost 100% scaling, which is practically unimaginable.

Admittedly, one of the games where nVidia seems to shine the most is H.A.W.X., but with the 10.3 drivers, according to ATI:

Improves up to 15% with ATI Radeon HD 5970 graphics products
Improves up to 20% on ATI Radeon HD 5800 Series products and ATI Radeon HD 5700 Series products

There's another game that might turn in ATI's favour when you take into account the 10.3 driver enhancements.

A 24% improvement sounds nice, and I'm sure it would be, but if we start looking at resolutions that give us playable frame rates and take 10.3 into account, that number looks overly optimistic at best, yet the roughly 33% greater cost of the GTX 480 (assuming $550) versus the $412 5870 (newegg.com's pricing) remains.
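To make the "playable settings only" point concrete, here's a minimal Python sketch that recomputes the average advantage with unplayable results filtered out; the Crysis Warhead fps pairs are the ones quoted above, while the H.A.W.X. row is a made-up placeholder.

Code:
# Recompute the average advantage using only results where both cards stay
# at or above a playability floor. The Crysis Warhead pairs are quoted in
# the post above; the H.A.W.X. row is a hypothetical placeholder.

PLAYABLE_FPS = 30.0

results = [
    # (game, setting, gtx480_fps, hd5870_fps)
    ("Crysis Warhead", "1920x1200 4xAA", 30.3, 30.7),
    ("Crysis Warhead", "2560x1600 4xAA", 19.0, 16.2),
    ("H.A.W.X.",       "2560x1600 4xAA", 90.0, 80.0),
]

def advantage(a, b):
    return (a / b - 1.0) * 100.0

playable = [r for r in results if r[2] >= PLAYABLE_FPS and r[3] >= PLAYABLE_FPS]

all_avg = sum(advantage(a, b) for _, _, a, b in results) / len(results)
play_avg = sum(advantage(a, b) for _, _, a, b in playable) / len(playable)

print("average advantage, all results:   %+.1f%%" % all_avg)   # ~+9.5%
print("average advantage, playable only: %+.1f%%" % play_avg)  # ~+5.6%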
 
I would like it if the 470 is equal to the 5870; maybe we'll have some price wars and I can get another 5870 for $300 or less with CB. SLI'd GTX 480s are going to be too expensive and produce too much heat for my rad.
 
http://www.xtremesystems.org/forums/showpost.php?p=4304409&postcount=1360
Supposed nVidia "reviewer's guide" benchmark #'s, avg. fps listed.

Hmmmm... if true (and the watermark matches what I've seen in the past, though who knows... remember we're still in rumor-ville), the numbers for 2560x1600 look like this percentage-wise for the average framerates (I'm guessing minimums would be higher on the 480 vs. the 5870, much like how the 5870 vs. the 280 had very high minimum FPS gains but lesser average ones):

Far Cry 2 2560x1600 4x/16x: 40.5% advantage
Hawx 2560x1600 4x/16x: 12.5% advantage
Resident Evil 5 2560x1600 4x/16x: 16.6% advantage
Crysis Warhead 2560x1600 4x/16x: 17% advantage (but still unplayable)
Batman AA 2560x1600 4x/16x: 39.3% advantage
Left4Dead 2560x1600 4x/16x: 6.4% advantage (CPU-bound?)
Battleforge 2560x1600 4x/16x: 23.5% advantage
World In Conflict 2560x1600 4x/16x: 23.7% advantage
Aliens vs. Predator 2560x1600 4x/16x: 25% advantage
Battlefield BC2 2560x1600 4x/16x : 2.3% advantage (lol)
Metro 2033 2560x1600 4x/16x: 38.5% advantage (but unplayable, 18 fps vs. 13... no settings really specified either as far as tessellation)
Bioshock 2 2560x1600 1x/16x: 11% advantage
CoD5 2560x1600 4x/16x: 15.7% advantage
CoD6 MW2 2560x1600 4x/16x: 14.5% advantage
Dirt 2 2560x1600 4x/16x: 10.4% advantage

No system specs, no driver info, etc., so take with a few pebbles of salt until official reviews show up. If this turns out relatively the same in actual reviews vs. the Catalyst 10.3, it's not a bad launch in my opinion: if not, that's going to evaporate much of the difference. :eek:


Did you notice something fishy about that?

Every test is at 2560x1600 4x/16x, which clearly blows past the 1GB of VRAM on the 5870 in order to get those gaps.

What about the raw performance on a common monitor? I would say it's within +/- 5-10% of the 5870....

Overall, that benchmark is NOT for those who use anything lower than 2560x1600, and it would be funny to see if the 5870 2GB takes back the crown at the same settings... if it ever comes out.
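As a rough illustration of the VRAM point, here's a back-of-the-envelope Python sketch of memory use at 2560x1600 with 4x MSAA; the extra-target and texture figures are pure guesses, not measurements from any of these games.

Code:
# Back-of-the-envelope memory estimate at 2560x1600 with 4x MSAA, to show
# why high-res AA settings can push past a 1GB framebuffer. The extra
# render targets and texture budget are pure guesses.

def rt_bytes(width, height, msaa_samples, bytes_color=4, bytes_depth=4):
    # Each MSAA sample stores its own color and depth value
    return width * height * msaa_samples * (bytes_color + bytes_depth)

MB = 1024 * 1024
main_target   = rt_bytes(2560, 1600, msaa_samples=4)        # ~125 MB
extra_targets = 3 * rt_bytes(2560, 1600, msaa_samples=1)    # post-FX/G-buffer guess, ~94 MB
textures_etc  = 850 * MB                                     # placeholder for textures, geometry, etc.

total = main_target + extra_targets + textures_etc
print("rough total: %.0f MB vs. 1024 MB on a stock 5870" % (total / MB))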
 
Hmm, I wonder why that chart on the linked forum doesn't include the 5970 :)? I guess nVidia didn't want to include dual-GPU cards... but wait... SLI'd GTX 480s are listed ;). I guess they wanted to be seen as being on 'top' even if there's a faster card or setup out there from a competitor. In that case, just don't show it on your fancy graph and pretend it doesn't exist, I suppose?

@GoldenTiger
I think your stats are a bit skewed as a picture of how the actual product may perform. If you went through and evaluated the two cards where playable frame rates occur, you'd find they are much closer. If you removed the top 3 performers, which 'coincidentally?' are all TWIMTBP games, the performance advantage would drop to around 10%. Take for example Crysis Warhead, which by your own account you said was unplayable under the circumstances where the GTX 480 has a 17% advantage. Why don't we look at an arguably playable-fps situation, shall we?

Crysis Warhead at 1920x1200 with 4xAA is 30.7 fps for the 5870 versus 30.3 fps for the GTX 480, which is somewhat playable, and at this resolution the 5870 actually has an extremely tiny 1% advantage. Still, it would bring down the claimed "24%" average advantage that you quoted. At 2560x1600 with 4xAA, you're down to 16.2 fps for the 5870 and 19.0 fps for the GTX 480. So yes, it pulls ahead by 17%, but what's the advantage of pulling ahead at ultra-high resolution if the frame rate is unplayable?

That would be like me claiming my car on the highway does 22 mph while your car on the highway does 19 mph. Sure, mine would be faster, but it's a moot point, as cars will still be piling up behind both of us, honking at us to move faster, passing us whenever the opportunity arises, and shouting profanities at us as they do so. Neither one of us would be getting drivable highway speeds.

I do find some of the information on that chart odd and strange. According to the chart, when you're running at 2560x1600 with 8xAA and enable SLI in games like Tom Clancy's H.A.W.X., you get a 99.3% performance improvement! Almost 100% scaling, which is practically unimaginable.

Admittedly, one of the games where nVidia seems to shine the most is H.A.W.X., but with the 10.3 drivers, according to ATI:



There's another game that might turn in ATI's favour when you take into account the 10.3 driver enhancements.

A 24% improvement sounds nice, and I'm sure it would be, but if we start looking at resolutions that give us playable frame rates and take 10.3 into account, that number looks overly optimistic at best, yet the roughly 33% greater cost of the GTX 480 (assuming $550) versus the $412 5870 (newegg.com's pricing) remains.

Nice post. Earlier this week a certain person gave me a lecture about using canned benchmarks, and now that same somebody posts benchmarks straight from Nvidia showing a 24% average improvement over the 5870, and this is supposed to prove that the GTX 480 is an amazing card worth $550?

As you point out, there are a few very odd things in the results. These have also been noticed in the other threads where these benchmarks are posted.

If I were thinking of buying Fermi, I would actually be worried by these benchmarks. They are straight from Nvidia: we don't know what kind of systems they ran them on, how they ran them, nothing, but I bet they made sure to give every advantage to their own card. And I am sure ATI does the same with their own benchmarks. I don't trust benchmarks from any graphics card company. That's why I would be worried. If 24% is the best they can get from in-house benchmarks, how much is that going to drop in the real world, like when the card gets reviewed and tested here? And if it drops a lot, how can they justify the $550 price? A price of around $450 would have been perfect.

We still know nothing about the card: power, heat, noise levels, etc.

The benchmarks above look OK, nothing special. Nothing that would make me rush out and buy one. So I hope that they are wrong (oh, don't misunderstand me, I mean wrong on the low side) and that the real reviews prove it to be 30% to 40% better than the 5870.

Friday will tell the story.
 