HardOCP called it like it is.
AMD hyped this up as much as possible. In the end, their hype was empty. They delayed legitimate reviews because the card wouldn't be able to handle a legitimate review.
The expectations that some people have with these RMAs are just unreal.
You are literally complaining about the serial number not being a number you like.
Come on, everyone knows that all 7XXX series cards can overclock to at least 1.5GHz. If you don't compare Nvidia cards at stock to AMD cards at 1.5GHz, then you are an Nvidia shill, fanboy, and employee.
Anyways, on planet reality, the GTX 670 is always the better choice.
So the 7950 is better than the GTX 670 because the GTX 670 "cheats" by getting more performance in games, and because the 7950 has extra VRAM and memory bandwidth that doesn't actually translate into higher performance in any game other than Crysis 1?
Okay.
So Crysis 1 and Metro 2033 prove that an overvolted 7950, which uses about 50% to 70% more power than a cheaper and quieter GTX 670, is better because it is slightly faster than the GTX 670 in those two games?
Okay. Whatever dude. I guess if you put 1.5 volts into the 7950 and invest in a liquid...
If the 7990 is even made, it will be almost completely pointless. The 7970 uses more power than the 6970, and it uses far more power than the actual competition (the GTX 670/680). This means a dual GPU card like the 7990 will have to be underclocked in order to meet power limits. Remember, AMD...
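The power-limit argument can be sketched with back-of-the-envelope numbers. The TDP and connector figures below are rough public estimates (PCIe slot plus two 8-pin connectors), not official AMD specifications:

```python
# Rough sketch of the dual-GPU power-budget problem.
# Figures are approximations, not official board specs.

SINGLE_GPU_TDP = 250   # approx. board power of one 7970-class GPU, in watts
PCIE_SLOT = 75         # watts deliverable through the PCIe slot
EIGHT_PIN = 150        # watts per 8-pin auxiliary connector

board_limit = PCIE_SLOT + 2 * EIGHT_PIN   # 375 W with two 8-pin plugs
naive_dual = 2 * SINGLE_GPU_TDP           # 500 W if neither GPU is underclocked

# Fraction of its normal power budget each GPU can actually get --
# the reason dual-GPU cards ship with reduced clocks and voltage.
power_headroom = board_limit / naive_dual
print(f"Board limit: {board_limit} W, naive dual-GPU draw: {naive_dual} W")
print(f"Each GPU gets roughly {power_headroom:.0%} of its normal power budget")
```

Under these assumptions, two full-speed GPUs would want 500 W on a board that can legitimately draw 375 W, so each chip has to be cut back to about three quarters of its normal power envelope.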
Computer Case
http://www.newegg.com/Product/Product.aspx?Item=N82E16811163178
$660
Motherboard
http://www.newegg.com/Product/Product.aspx?Item=N82E16813131803
$470
Four 6GB Video Cards
http://www.newegg.com/Product/Product.aspx?Item=N82E16814133347
$15,992
DVD Drive...
You addressed the price part of my argument, and ignored the performance part. It is the combination of the two that is annoying to me. Do you think people would be as annoyed at a $550 price if it was 45-55% better than the GTX 580? My hypothetical argument is about the performance of the...
Or we realize that AMD isn't the only company on the planet that will be making 28nm parts, and Nvidia's 28nm parts should have absolutely no trouble matching a card that uses ~250 watts and only provides 20% more performance than a GTX 580.
Nvidia is so close to the 7970's performance that it...
It should be a lot more than 25% better since this is a "real" generation improvement due to the die shrink.
I don't imagine Nvidia's 5-6 billion transistor 28nm chip being "only" 25% better than the GTX 580.
Note the use of the ancient i7-920 in his "reviews", combined with automatic benchmark tools that greatly reduce CPU load.
Also note the lack of frame-time graphs to show consistency.
AMD intentionally shipped garbage motherboards to reviewers in order to get worse reviews! Also, anyone that says otherwise is an Intel shill, puppet, employee, AND investor.
1) Denial
2) Anger
3) Bargaining
4) Depression
5) Acceptance
What graphics card(s) and CPU can run The Witcher 2 completely maxed out with a minimum FPS of 40? 1920x1080 resolution and 4x Anti-Aliasing is required. Absolutely every setting must be maxed out.
I primarily use newegg for that reason. There really isn't any competition with their ability to sort items usefully. For example, try searching for a GTX 460 1GB on any other site. Amazon will insist on giving links to every GTX 560 1GB, GTS 450 1GB, GTS 550 1GB, and GTX 460 1GB SE mixed...
I'd estimate a 10-20% upgrade. This is not a path I would go down. You would be better served waiting a year for 28nm cards. By then, a real performance increase should be practical.
"Someone in this forum called the performance increase from GTX 480 to GTX 580 "pathetic". I wonder what's the word used for the performance jump from Cypress to Cayman..."
Pathetic is a good word.
The problem I have with these leaked numbers is how close the 6870 is to the rumored 6950's performance. I guess the point of the 6950 under these rumors is to compete against the full-powered GF114 GTX 560, but that seems like a tiny performance "tier" to aim a new, hyped GPU at.
I guess...