Leaked Radeon R9 290X Benchmarks?

They have an ace up their sleeve: Maxwell is coming within 3-4 months.
Titan has been out for 8+ months already, and the GTX 780 for 3+. They've already sold a crapton of them, and they can afford to drop the prices a little and still sell them, since ATI as always has driver problems, and no support for CUDA/PhysX/LightBoost.
Seriously, with this small a performance gap, Nvidia has nothing to fear. They will just release Maxwell in Q1 2014 and crush ATI again until 2015...

Just to give an idea, GK110 is like 18 months old, and ATI is only now catching up with it in performance.
The question is: how much has Nvidia advanced in the last 18 months while ATI was catching up? I have the feeling Maxwell will easily be double the performance of a Titan...

With those sorts of feelings you clearly have no idea whatsoever how this tech works, at all. If we look at how the new nodes are going... the efficiency gains are getting smaller, which means that although you have more real estate to pack in more transistors, the efficiency isn't really there.

So, even with 20nm, and given that GK104 and GK110 are already very efficient... I would expect a 30% performance increase (at the same TDP), tops. And what if there isn't a new node? Well... expect 10%, and be happy about it.

But DOUBLE the performance of a Titan? On the same node? Sure, at 450W TDP apiece. Otherwise... :rolleyes:
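Quick back-of-envelope, if anyone wants the math (a rough sketch, assuming performance scales roughly linearly with power when you stay on the same node; the ~10% perf-per-watt gain is a made-up figure, Titan's 250W TDP is the only hard number here):

```python
# Rough estimate: power needed to double Titan performance on the same 28nm node,
# assuming performance scales ~linearly with power at a fixed perf-per-watt.
titan_tdp_w = 250          # GTX Titan board TDP
perf_target = 2.0          # "double the performance of a Titan"
perf_per_watt_gain = 1.10  # hypothetical ~10% efficiency gain from a tweaked design

required_tdp_w = titan_tdp_w * perf_target / perf_per_watt_gain
print(f"Required TDP: ~{required_tdp_w:.0f} W")  # ~455 W, hence the 450W jab above
```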


GK110, the chip used in Titan, was presented at the beginning of 2012, and then put in Teslas.
Only the next year was it released in consumer products, Titan first and the 780 later.

The GTX 6xx series was Nvidia trolling, since ATI was so far behind that Nvidia had no reason to release GK110 for consumers. So they just released a low-cost, high-margin GK104 video card.
And this was ATI's fault, since not only are they always playing catch-up, they also suck at it. How long do we actually have to wait until ATI can put some fire under Nvidia's ass? How long do we have to wait until we actually have some fucking competition in this expensive market?

Ffs, the new consoles are coming, and shit like Star Citizen is gonna push the computing requirements stupidly high, and Nvidia/Intel aren't even bothering to increase performance on their stuff, just trying to milk customers as much as they can with power-saving crap, while AMD/ATI just stands and watches as we're getting ripped off, and computing-power progress in the PC field moves at a snail's pace...

Sorry for the rant, but this is getting really really annoying.

Wrong on all counts. Tesla featured GK110 in... November 2012. Had yields been that superb and whatnot, the thing would have been released much, much earlier.

Also, for your information, GK104 was always stated to be the competitor for the Tahiti cards (I don't care what you believe, since you have a clear bias), since it takes very little understanding to see that GK104 is a gaming-focused chip whereas GK110 is a compute-focused (or multi-purpose) one.
 
Hehe, I'm well aware of my present location; this is [H]. Go big or go home.

But even on a site like this there has to be some common sense and decisions based on logic :p

We're both entitled to our opinions.

And for me, I will take 1600p on very high with no AA to get an 80fps average and 40fps minimums.

While there are others that need to have 1600p with 8x AA, getting a 45fps average and 20fps minimums, and they are fine with that.

I will sacrifice max settings for performance; others prefer to throw $1000 at the problem.

To each their own.



I would drop 4x SSAA and go to 4x MSAA and like magic 60fps!

And when even throwing $1000 at the "problem" doesn't get you a 60fps minimum (ffs, that Crysis Warhead benchmark only has one card that can beat a 60fps average), what then? I want smooth gameplay first, but if I have to make the equivalent of a mortgage payment to get it, I want the bells and whistles to go along with it.
 
I doubt it. Nvidia has extremely loyal customers.

I have purchased AMD/ATI cards over a few generations. I had 5870s in CrossFire and they ran really well. The thing for me with those cards at the time was BFBC2 and the major stuttering issue with that game. I had those cards for a good year and a half. It just depends on the performance of the cards and such. I will go either way... I am not loyal to either company.
 
They have an ace up their sleeve: Maxwell is coming within 3-4 months.
Titan has been out for 8+ months already, and the GTX 780 for 3+. They've already sold a crapton of them, and they can afford to drop the prices a little and still sell them, since ATI as always has driver problems, and no support for CUDA/PhysX/LightBoost.
Seriously, with this small a performance gap, Nvidia has nothing to fear. They will just release Maxwell in Q1 2014 and crush ATI again until 2015...

Just to give an idea, GK110 is like 18 months old, and ATI is only now catching up with it in performance.
The question is: how much has Nvidia advanced in the last 18 months while ATI was catching up? I have the feeling Maxwell will easily be double the performance of a Titan...

If they release Maxwell that early it won't be on 20nm; most reports indicate we'll see 20nm GPUs around Q3 2014. And if it's not on 20nm, there's no chance of it doubling Titan's performance.
 
It doesn't make sense that Nvidia would release something that is 100% faster than their current offering. Performance is a finite commodity: eventually nodes, efficiency, and design will either hit their limits for the medium (silicon) or become unfeasibly expensive. If you release a product that rushes toward that finish line unnecessarily, yeah, you might gain market share, but you've just reduced your brand's long-term viability. This is why the ~20% increases in performance seem pretty consistent across generations. None of the companies want to get to that silicon finish line...
 
It doesn't make sense that Nvidia would release something that is 100% faster than their current offering.
N has done this for years; as soon as A announces something new that will beat them, it's "NO, we have a new card and it will blow that away, just you wait. 100% is just a small thing, and by the time it's out you'll have forgotten we said that."


"IF you have it, show it!!!!"
"Uh, it's on the way. DON'T BUY, just wait and see."
And 6-8 months later, here it is: hidden behind web pages that only have a timer until it hits 0, and then it tells you how many days before they show the cardboard model. WEEEEEEE

But don't buy; then on that day, some time in the future, you get to see it.
And again it's 4% faster than that OLD A card. Aren't you glad you waited?

you may get in line to kiss our ass.


================================================

None of the companies want to get to that silicon finish line...

Well, there are two ways to look at that.
Increase quality and cut the prices instead of bleeding us for 4% each release.
Make improvements to the cards, and NOT putting connectors on the back is not an improvement.
Or keep on giving us 4-5% for about $200 each year and saying they are now kicking ass; most of what they do is just the same old thing called new.
Come up with an AA that has no performance penalty and no blur; that's something, but then they couldn't make money off it.

This (planned) leapfrog rape is just crazy.


Ever notice how, when something new comes out, they release a final driver that is 30% faster on the old card?
It's because they keep them slower until the next release.
OH WOW, it's the battle of the driver wars.
Both of them do it: the cards are released throttled in many ways, then when the other guy sets his a little higher, "well, we can go 8% on this driver update." LEAD.
Over and over.

They say things like "oh, there's a shortage and we just can't produce them"; 5 days later we get raped for $100+ because there just aren't any; the next day the market is flooded and the price stays the same.

You may think I am an N hater, but I am just an industry hater, a hater of the F we are getting from these two clowns.
 
They have an ace up their sleeve: Maxwell is coming within 3-4 months.
Titan has been out for 8+ months already, and the GTX 780 for 3+. They've already sold a crapton of them, and they can afford to drop the prices a little and still sell them, since ATI as always has driver problems, and no support for CUDA/PhysX/LightBoost.
Seriously, with this small a performance gap, Nvidia has nothing to fear. They will just release Maxwell in Q1 2014 and crush ATI again until 2015...

Just to give an idea, GK110 is like 18 months old, and ATI is only now catching up with it in performance.
The question is: how much has Nvidia advanced in the last 18 months while ATI was catching up? I have the feeling Maxwell will easily be double the performance of a Titan...

Hahaha, very funny post!! GK110 is 18 months old now? So it was ready in March of last year? And Nvidia decided to hold back millions of dollars of R&D until November because...? I really would like to see what answer you have to that question.

ATI? Who are they again? You mean AMD. And yes, they do have driver issues; most of them are in multiple-card configurations. But where have you been for the last year and a half? Nvidia's drivers aren't much better these days; they have had their own share of issues. AMD has no support for PhysX or CUDA? Please tell me, how is that AMD's fault?

Nvidia's stuff sells really well, but AMD gained back market share in the first two quarters of this year, which shows their GPUs have been selling really well too. As for AMD playing catch-up: hmmm, surely it's only since the release of Titan that AMD has been playing catch-up? I mean, the GHz Edition of the 7970 is faster than the 680.

Maxwell is coming in three to four months? So then it's going to be 28nm, and you are saying it's going to be double the performance of the Titan? Can I please have whatever you are smoking? People more technical than me will explain it better, but there is no way they can double the performance of the Titan on a 28nm process.

And even if they somehow do manage to release Maxwell on a 20nm process, it still won't be double the performance of the Titan. The days of massive performance increases from die shrinks are gone.


GK110, the chip used in Titan, was presented at the beginning of 2012, and then put in Teslas.
Only the next year was it released in consumer products, Titan first and the 780 later.

The GTX 6xx series was Nvidia trolling, since ATI was so far behind that Nvidia had no reason to release GK110 for consumers. So they just released a low-cost, high-margin GK104 video card.
And this was ATI's fault, since not only are they always playing catch-up, they also suck at it. How long do we actually have to wait until ATI can put some fire under Nvidia's ass? How long do we have to wait until we actually have some fucking competition in this expensive market?

Ffs, the new consoles are coming, and shit like Star Citizen is gonna push the computing requirements stupidly high, and Nvidia/Intel aren't even bothering to increase performance on their stuff, just trying to milk customers as much as they can with power-saving crap, while AMD/ATI just stands and watches as we're getting ripped off, and computing-power progress in the PC field moves at a snail's pace...

Sorry for the rant, but this is getting really really annoying.


I love the way you have this idea that Nvidia held back GK110 because AMD didn't offer any competition. They didn't hold anything back; they didn't have anything else. In today's declining PC market, you think they held back GK110 from the Tesla market? Really? That they had an order for 22,000 of them for the Oak Ridge supercomputer and didn't fill it because AMD wasn't competing on the consumer GPU front?

Please tell me how that makes any kind of business sense. I will answer for you: it doesn't. GK110 wasn't ready until the middle of November 2012.

You do realise that the new consoles all have AMD APUs in them?

Power-saving crap? Energy efficiency is now one of the most important things no matter what you are building: cars, houses, etc. Like it or not, that's the way it is. Performance per watt is the new king.
 
GK110, the chip used in Titan, was presented at the beginning of 2012, and then put in Teslas.
Only the next year was it released in consumer products, Titan first and the 780 later.

The GTX 6xx series was Nvidia trolling, since ATI was so far behind that Nvidia had no reason to release GK110 for consumers. So they just released a low-cost, high-margin GK104 video card.
And this was ATI's fault, since not only are they always playing catch-up, they also suck at it. How long do we actually have to wait until ATI can put some fire under Nvidia's ass? How long do we have to wait until we actually have some fucking competition in this expensive market?

Ffs, the new consoles are coming, and shit like Star Citizen is gonna push the computing requirements stupidly high, and Nvidia/Intel aren't even bothering to increase performance on their stuff, just trying to milk customers as much as they can with power-saving crap, while AMD/ATI just stands and watches as we're getting ripped off, and computing-power progress in the PC field moves at a snail's pace...

Sorry for the rant, but this is getting really really annoying.

Wow...are you for real?
 
Wow, people actually believe this whole 'the 680 is actually a 660' thing... The 7970 was the top-end card, faster than the 680 for a while before Nvidia could bin enough chips to release the Titan. You're saying that Nvidia just LET AMD sit there for nearly a year?
 
I'm shocked there are people who actually think the 680 was the original high-end part. Nvidia took advantage of AMD's underwhelming part, and we the gamers got screwed.
 
Wow, people actually believe this whole 'the 680 is actually a 660' thing... The 7970 was the top-end card, faster than the 680 for a while before Nvidia could bin enough chips to release the Titan. You're saying that Nvidia just LET AMD sit there for nearly a year?

Aye, there are; look at the post below yours for an example :D
 
Wow, people actually believe this whole 'the 680 is actually a 660' thing... The 7970 was the top-end card, faster than the 680 for a while before Nvidia could bin enough chips to release the Titan. You're saying that Nvidia just LET AMD sit there for nearly a year?

Please correct me if I'm wrong.
I thought the GTX 680 was faster at first, until AMD upped the 7970 to the GHz edition.

Moving on....it's almost time for the video stream to start. Here's hoping AMD delivers some killer specs and killer prices!
 
Please correct me if I'm wrong.
I thought the GTX 680 was faster at first, until AMD upped the 7970 to the GHz edition.

Moving on....it's almost time for the video stream to start. Here's hoping AMD delivers some killer specs and killer prices!

Also drivers.
 
A quote that I think people need to remember from Ryan at AnandTech:
snip
If you want the point hammered home a little more... what is Pitcairn's die size? What is the rumored die size of this chip? Not a coincidence.

Didn't think anything needed hammering. Again, IMO, I thought this series would be a marginal increase in shader horsepower, as the 79xx series seems to have bandwidth pretty well in check.
I'm very impressed with the route AMD took, very efficient, etc.
As for Pitcairn = small die: the R9 290X had a rumored die larger than the 79xx's, which didn't automatically mean it had a 512-bit wide memory bus. But thanks anyway :cool:
 
I have purchased AMD/ATI cards over a few generations. I had 5870s in CrossFire and they ran really well. The thing for me with those cards at the time was BFBC2 and the major stuttering issue with that game. I had those cards for a good year and a half. It just depends on the performance of the cards and such. I will go either way... I am not loyal to either company.

Just because you aren't doesn't mean the rest of their fanbase isn't. I sold my GTX 680 after finding out it was worth more than it was when Nvidia did this: http://www.techpowerup.com/173110/n...ll-evbot-support-from-gtx-680-classified.html
http://www.xbitlabs.com/news/graphi...Partners_to_Validate_Own_Product_Designs.html

I've had my fair share of problems from both vendors, but as far as business practices go, Nvidia lies more in the gray area for me.
 
I'm shocked there are people who actually think the 680 was the original high-end part. Nvidia took advantage of AMD's underwhelming part, and we the gamers got screwed.

I literally lol'd at this post. You do realize that, despite rampant bias from many posters here (and the [H] reviews themselves), the 7970 GHz was a faster card than the 680? The performance gap only widened when overclocking was taken into account.

This kind of total imaginary memory syndrome is typical of the Nvidia fanboys that infest this site.
 
I literally lol'd at this post. You do realize that, despite rampant bias from many posters here (and the [H] reviews themselves), the 7970 GHz was a faster card than the 680? The performance gap only widened when overclocking was taken into account.

This kind of total imaginary memory syndrome is typical of the Nvidia fanboys that infest this site.

The GHz part was faster, but AMD didn't launch the GHz card until after the GTX 680 was released. When the GTX 680 launched, it was faster than the 7970 (hence the price drop and GHz edition launch).

This kind of total imaginary memory syndrome is typical of the AMD fanboys that infest this site.
 
I may have missed this somewhere, and I realize reviews and benchmarks aren't out, but as a sort of odd question...

Which one of these would you say is closest to 2x6950s? I have one 6950 currently, and I figure my next card should outperform two of them in Crossfire.

Either that, or I should just buy a second 6950 and go Crossfire :D
 
It's not that hard to make your card faster than a card already on the market; all Nvidia did with the GTX 680 was set the clocks high enough to beat the HD 7970 that had been in their lab for 6 months, which I'm sure they got from a Best Buy store.

The HD 7970 GHz BIOS was just AMD's way of showing Nvidia that they can play the same game by upping the clocks.

Titan/GTX 780 really appeared too early, as Nvidia already knew AMD's plans; that's why those cards exist, since the R9 290X was supposed to appear Q1-Q2 2013.

The days of both companies releasing cards within days of each other are gone, and a fair shootout doesn't exist anymore, so to speak.
 
Titan/GTX 780 really appeared too early, as Nvidia already knew AMD's plans; that's why those cards exist, since the R9 290X was supposed to appear Q1-Q2 2013.

The days of both companies releasing cards within days of each other are gone, and a fair shootout doesn't exist anymore, so to speak.

Hardly too early.
They managed to milk the market at price points which would be impossible to reach with a competitor around.

Considering we have factory-overclocked 780 cards that are faster than Titan, they can react by making a 780 Ti and dropping prices.
 
The GHz part was faster, but AMD didn't launch the GHz card until after the GTX 680 was released. When the GTX 680 launched, it was faster than the 7970 (hence the price drop and GHz edition launch).

This kind of total imaginary memory syndrome is typical of the AMD fanboys that infest this site.

Being an anti-Nvidia fanboy is a far cry from being an AMD fanboy. An obvious distinction I'd never expect you to understand, of course.
 
Mmmmm if this is all true, then gimmeh dat Never Settle bundle and this looks like a mighty enticing early upgrade for that price/performance. :)
 
thanks for narrowing it down


hahahahhaha
Haha sorry.

Well, with the 280X @ $299, I would expect the 290 @ $449-$499 and the 290X @ ~$599.

The 7970 launched at $549, and supposedly the 290X does beat the GTX 780, so they could potentially price it @ $649, though not likely.
 
I can't be the only person who was surprised that so much detail was revealed about the mid-range cards and so few details (none, really) were given for the flagship cards.

What's the deal with that? It almost makes me wonder if AMD still hasn't finished putting the final touches on the 290X.
 
(I'm on my tiny phone so I can't read the brand)

What 4K monitors were they using? Were they playing at 30Hz or 60Hz?
(I thought all/most 4K monitors weren't yet capable of high refresh rates)
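For context, a quick back-of-envelope on why that was the case (a rough sketch; the ~10% blanking overhead is an assumption, real timings vary): uncompressed 4K@60 simply doesn't fit in HDMI 1.4's data rate, so 60Hz panels in 2013 had to use DisplayPort 1.2, usually driven as two MST tiles.

```python
# Rough check of 4K@60 bandwidth vs. the links available in 2013.
width, height, bpp = 3840, 2160, 24   # 4K UHD at 24 bits per pixel
blanking_overhead = 1.10              # rough ~10% for reduced-blanking timings (assumption)

def link_gbps(refresh_hz):
    return width * height * bpp * refresh_hz * blanking_overhead / 1e9

hdmi_1_4_data_gbps = 8.16             # 340 MHz TMDS, 8b/10b encoding
dp_1_2_data_gbps = 17.28              # HBR2 x4 lanes, 8b/10b encoding

print(f"4K@30: {link_gbps(30):.1f} Gbps -> fits HDMI 1.4 (~{hdmi_1_4_data_gbps} Gbps of data)")
print(f"4K@60: {link_gbps(60):.1f} Gbps -> needs DP 1.2 (~{dp_1_2_data_gbps} Gbps), "
      "typically run as two MST tiles on 2013 panels")
```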
 
I can't be the only person who was surprised that so much detail was revealed about the mid-range cards and so few details (none, really) were given for the flagship cards.

What's the deal with that? It almost makes me wonder if AMD still hasn't finished putting the final touches on the 290X.

They don't want to give anything away before they have to.
 
Preorder prices are always quite high. Let's see if that is true here...
 
It looked damn smooth regardless of the refresh rate. Pretty intense for two cards.
 
I own my own little PC shop in SA. I have a good client base, nothing like the big companies, but since the Mantle news dropped I've gotten 30 back orders for R9 290X BF4 editions.
To put that into perspective, when the Titan launched I had 6 back orders. That's a big difference.

A close friend of mine owns the largest online shop in SA and he has confirmed the exact same thing: a huge number of back orders, more than we've ever seen with anything from AMD.

http://www.overclock.net/t/1429286/...-strike-the-599-price-point/580#post_20883574
 