AMD R9 390X, Nvidia GTX 980 Ti and Titan X Benchmarks Leaked

I know Nvidia is gonna Nvidia, but surely, seeing the leaked price/performance ratio (if at all accurate) of the new AMD cards, NV will make the logical decision to price somewhat competitively. Right? Right, guys?

The Titan X will have a three-month head start or more. After that, they can still sell it to professional users. There will be no price drop. They will wait until the 390X drops and then toss out the 980 Ti, just like last round. Meanwhile, they are still showering in money from the 9xx series. Pascal will probably be held back until it's time to pull the same trick all over again: wait for AMD to blow its wad, then simply counter with a mid-range chip.
 
Ever wonder if it was from an internal meeting? Ever wonder if it's part of a 40-slide presentation? Ever wonder if it's a controlled leak meant to show what's actually coming, to counter the Titan X release? Ever wonder in general?

Watercooling has been in the pipeline, and we've known that for a long time, ever since AMD partnered with Asetek (does the 295X2 ring any bells?).
An 8GB edition alongside a 4GB one is common sense, given that AMD is pushing 4K, where VRAM is needed.

See where I am going? Seems you don't!

You can go ahead and assume whatever you want; I won't argue against assumptions.
 
The Titan X will have a three-month head start or more. After that, they can still sell it to professional users. There will be no price drop. They will wait until the 390X drops and then toss out the 980 Ti, just like last round. Meanwhile, they are still showering in money from the 9xx series. Pascal will probably be held back until it's time to pull the same trick all over again: wait for AMD to blow its wad, then simply counter with a mid-range chip.

The supposed professional angle kills me. It's a GeForce card. There is no driver or support for professional applications. None! Zero! Zilch! Nada! No professional would use it. The number of people who would buy this card for its DP probably wouldn't pay for the logo on the shroud, never mind making it worthwhile to manufacture.
 
Actually, it's rumored that the Titan X will have DP neutered, so it's basically a pure gaming card and worthless to the professional segment.
 
AMD won't blow anything; it's a "pure promise" thing. Nothing is faster or more amazing in IQ than Maxwell. OC to 1500+ MHz, Samsung memory to 8100 MHz, on stock air at stock volts? I have four of them and have already tested seven; of course I kept the GALAX 980 HOF because I love them.

TITAN X: yes, I agree that the reference design will NOT go further, because it's obvious that a 980 Ti with 8GB is about to appear...

But AMD, man, seriously? Grab one, load up Project Cars (the most demanding game ever), grab a good 30-inch IPS panel, connect the AMD card to it, then test Maxwell... you will CLEARLY see the difference in IQ and performance.

I am an overclocker, tester, reviewer, and gamer just because I love it, and I have tested almost everything.

Kind Regards
Sergio
 
AMD won't blow anything; it's a "pure promise" thing. Nothing is faster or more amazing in IQ than Maxwell. OC to 1500+ MHz, Samsung memory to 8100 MHz, on stock air at stock volts? I have four of them and have already tested seven; of course I kept the GALAX 980 HOF because I love them.

TITAN X: yes, I agree that the reference design will NOT go further, because it's obvious that a 980 Ti with 8GB is about to appear...

But AMD, man, seriously? Grab one, load up Project Cars (the most demanding game ever), grab a good 30-inch IPS panel, connect the AMD card to it, then test Maxwell... you will CLEARLY see the difference in IQ and performance.

I am an overclocker, tester, reviewer, and gamer just because I love it, and I have tested almost everything.

Kind Regards
Sergio

Nvidia focus group member, obviously.
Nice try.
Nothing beats AMD, and nothing has for a long time.
 
AMD won't blow anything; it's a "pure promise" thing. Nothing is faster or more amazing in IQ than Maxwell. OC to 1500+ MHz, Samsung memory to 8100 MHz, on stock air at stock volts? I have four of them and have already tested seven; of course I kept the GALAX 980 HOF because I love them.

TITAN X: yes, I agree that the reference design will NOT go further, because it's obvious that a 980 Ti with 8GB is about to appear...

But AMD, man, seriously? Grab one, load up Project Cars (the most demanding game ever), grab a good 30-inch IPS panel, connect the AMD card to it, then test Maxwell... you will CLEARLY see the difference in IQ and performance.

I am an overclocker, tester, reviewer, and gamer just because I love it, and I have tested almost everything.

Kind Regards
Sergio

You claim to be a professional, but you don't seem like one. GPU clocks are meaningless in isolation; Hawaii offers more performance per MHz. And a 256-bit bus needs its RAM running twice as fast just to break even with a 512-bit bus.
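
To put numbers on that bus-width point, here's a minimal back-of-the-envelope sketch (the data rates are the publicly listed GDDR5 speeds of the GTX 980 and 290X, rounded):

```python
# Peak memory bandwidth: bus width (bits) / 8 bytes * effective data rate (GT/s).
def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    return bus_bits / 8 * data_rate_gtps

print(bandwidth_gbs(256, 7.0))   # GTX 980: 256-bit @ 7 GT/s  -> 224.0 GB/s
print(bandwidth_gbs(512, 5.0))   # 290X:    512-bit @ 5 GT/s  -> 320.0 GB/s
print(bandwidth_gbs(256, 10.0))  # 256-bit card needs 10 GT/s -> 320.0 GB/s
```

Which is the point: the narrower bus only keeps up by running its memory far faster (or, in Maxwell's case, by leaning on delta color compression).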

Now, is GM204 faster than Hawaii? Marginally. Is it more efficient while gaming? Definitely. Superior IQ, though? Apparently only in your eyes. Better or more efficient in other, non-gaming loads? It doesn't appear to be. Yet you make it sound like Maxwell walks on water. Professional? I doubt it.
 
Hmm, so the marketing slides are saying ~55% improvement over the 290X (allegedly). Wouldn't this put the 390X behind the Titan X (assuming the numbers we've seen are factual)?

Assuming AMD's graphs are 100% true, the 390X will probably be faster than the Titan X no matter how you slice it: 40-45% over the 980, which matches the Chiphell leaks.
The only Titan X numbers I've seen bandied about were 30-40%.

RIP Nvidia.
 
Assuming AMD's graphs are 100% true, the 390X will probably be faster than the Titan X no matter how you slice it: 40-45% over the 980, which matches the Chiphell leaks.
The only Titan X numbers I've seen bandied about were 30-40%.

RIP Nvidia.

Well, the leaked numbers all put the new Titan at 30-40% over the 980, which is already, what, 20% faster than a 290X? (Pulling that out of my ass; someone correct me.) So, I don't know. I'd love to see the 390X beat out the Titan, but it'll probably be super close. Competition is great, at any rate.
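
For what it's worth, those guesses can be multiplied out. A quick sketch; every number in it is the thread's speculation, not a measurement:

```python
# Chain the rumored relatives together, normalizing the 290X to 1.00.
r290x = 1.00
gtx980 = r290x * 1.20    # the "what, 20% faster than a 290x?" guess
titan_x = gtx980 * 1.35  # midpoint of the rumored 30-40% over the 980
r390x = r290x * 1.50     # the leaked slide's ~50% over the 290X

print(f"Titan X ~ {titan_x:.2f}x a 290X")  # ~1.62x
print(f"390X    ~ {r390x:.2f}x a 290X")    # 1.50x
```

On those (shaky) numbers the Titan X edges it out, which is also what the earlier "~55% over the 290X, still behind Titan X?" question was getting at. "Super close" seems like the honest summary.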
 
I believe the leaked benchmarks are as real as it gets. Stock-clock-wise, the 390X should be on par with or faster than the Titan X, because the 390X is going to have 4096 SPs, which is ~45% more than the 290X. Hence, it's not surprising to see the 390X land ~50% faster (taking memory, etc. into account).

The Titan X has 50% more CUDA cores than the GTX 980, but the clocks are bumped down, and this restricts its performance. So I don't think it's going to be 50% faster than the GTX 980; 30-40% looks about right.
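
The arithmetic behind both estimates is easy to sanity-check with naive cores-times-clock scaling. A sketch; the clocks are the leaked/listed boost figures, and real games never scale perfectly linearly:

```python
# Naive throughput scaling: shader count * clock, relative to the older card.
def scaling(cores_new: int, ghz_new: float, cores_old: int, ghz_old: float) -> float:
    return (cores_new * ghz_new) / (cores_old * ghz_old)

# 390X vs 290X: 4096 vs 2816 SPs at the same ~1.05 GHz per the leaked specs.
print(f"390X vs 290X:   {scaling(4096, 1.05, 2816, 1.05):.2f}x")  # ~1.45x

# Titan X vs GTX 980: 50% more cores (3072 vs 2048) but a lower boost clock
# (~1.08 GHz vs ~1.22 GHz) -- exactly the restriction described above.
print(f"Titan X vs 980: {scaling(3072, 1.08, 2048, 1.22):.2f}x")  # ~1.33x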

Overclocking-wise, no idea, but I've read that the Titan X overclocks easily, so it could easily end up 50% faster than the GTX 980. However, I don't want to dwell too much on overclocked performance, because we should all know by now that overclocking shortens the lifespan of the GPU, so whether it makes sense to overclock to the max is another matter.
 
I imagine Kyle, Steve, Brett, & co. have been having a wry chuckle over this thread, since they've likely had the card for at least a week.
 
I'd be curious to know how much, if at all, the 300W limit in the specification is holding the card back. Both the 290X and Fiji XT list ~290W TDP, a few percent under the spec. The memory change should have saved a little power, and there are ~45% more shading units on a die only ~25% larger. They shrank the individual processors and left the GPU clock the same, according to the leaked specs.
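
One way to frame that question is the crude dynamic-power rule of thumb, P ∝ area × V² × f. This is pure speculation as a sketch, not a power model of Fiji; the voltage ratio in particular is an assumption, since nobody outside AMD knows it:

```python
# Crude scaling: P ~ active area * V^2 * f. All ratios below are guesses
# taken from the leaked specs; the flat voltage ratio is a pure assumption.
tdp_290x = 290.0    # W, as listed
area_ratio = 1.25   # ~25% larger die
volt_ratio = 1.00   # unknown; assumed unchanged
freq_ratio = 1.00   # same GPU clock per the leak

naive_tdp = tdp_290x * area_ratio * volt_ratio**2 * freq_ratio
print(f"{naive_tdp:.0f} W")  # ~362 W -- well past the 300 W spec
```

If Fiji really lands near ~290W, the difference has to come out of HBM's power savings and efficiency work in the shader array, which is exactly why the 300W ceiling is an interesting question.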

If the 390x and Titan X are roughly comparable in speed, one is still nearly double the price of the other.

In regards to drivers, CrossFire/SLI, and optimizations: with the move toward DirectX 12 and Mantle/OpenMantle/Volcano/Vulkan, does past performance even matter? I've seen people point out that it would help AMD's CPU business, but isn't the better argument that it removes Nvidia's ability to out-optimize AMD in the driver? With very little driver, there is very little room for the kind of crazy optimization they would have done in the past. It also seems likely that all the games that have been supporting Mantle will quickly adopt Vulkan/DX12 through patches. And if the 390X is a new architecture, who knows what the GCN changes will bring for all those titles?

All those points aside, I'm really curious to see what either card will do paired with an APU under Vulkan/DX12. It seems rather likely that using the APU for physics, or even vertex shading, before handing off to the discrete GPU would be a huge plus for a lot of games.
 
Well, the latest rumor is that the 390X WCE will cost $700+, so if the Titan X is $1000, it still undercuts it by a decent amount. The 390X WCE will probably be a tiny bit faster than the cut-down GM200 and priced similarly, and that's where the real competition will take place.

Anyone else concerned about the Titan X blower? The last Titans I owned got hot very fast with that blower, so I don't have much confidence in it. I wish NVIDIA allowed AIBs to build custom designs, especially for a $1000 halo part. I'm not going to bother with SLI this time around; it seems to be a dying technology, especially with the new Unreal Engine not supporting it natively.
 
Do you ever get tired of being wrong? It fully supports CUDA, which has plenty of professional applications.

No, I'm not wrong. Do you ever get tired of giving answers that don't apply to the post you've quoted? Hardware support for CUDA is not support for professional apps.
 
Do you ever get tired of being wrong? It fully supports CUDA, which has plenty of professional applications.

That's not unique to the Titan X, though. The Kepler Titan's selling point was its FP64 capability (1/3 of FP32 instead of 1/24). But since rumors indicate FP64 will be gimped in the Titan X, it really is being relegated to a pure gaming card.

Per KitGuru:

KitGuru said:
However, keep in mind that Nvidia’s “Maxwell” was not designed to handle professional computing tasks, therefore, it does not support native double precision FP64 compute capabilities. Even the GM200 will not be able to beat its predecessor, the GK110 in high-performance computing tasks (e.g., simulations) that require FP64.
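
The FP64 arithmetic makes the "relegated to gaming" point concrete. A sketch, assuming the widely rumored 1/32 rate for GM200 (only the Kepler ratios were confirmed at the time, and the FP32 peaks are approximate public figures):

```python
# FP64 peak = FP32 peak * the ratio NVIDIA enables on that product.
def fp64_tflops(fp32_tflops: float, ratio: float) -> float:
    return fp32_tflops * ratio

print(f"Kepler Titan: {fp64_tflops(4.5, 1/3):.2f} TFLOPS FP64")   # ~1.50
print(f"GTX 780:      {fp64_tflops(4.0, 1/24):.2f} TFLOPS FP64")  # ~0.17
print(f"Titan X:      {fp64_tflops(6.1, 1/32):.2f} TFLOPS FP64")  # ~0.19, if the rumor holds
```

Which lines up with the KitGuru quote: even with more FP32 throughput, a 1/32 rate would leave GM200 far behind GK110 for double-precision work.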
 
it seems to be a dying technology, especially with the new Unreal Engine not supporting it natively.

Interesting. I haven't heard anything about that.
Do you have a link where I can read more, or is it a snippet from an interview?

I wonder why that is.
Edit: All I could find was an answer Paul Oliver gave on the UE4 support forum.
 
Interesting. I haven't heard anything about that.
Do you have a link where I can read more, or is it a snippet from an interview?

I wonder why that is.
Edit: All I could find was an answer Paul Oliver gave on the UE4 support forum.

The link you have is the most salient one. Basically, you have to hack in AFR support, which I don't think many developers will do, and given how popular Unreal Engine is with developers, SLI/CrossFire are going to get less and less support. The reason I gave up my 980 SLI setup for a single Titan X is partly the poor SLI support. NVIDIA still hasn't figured out how to do DSR/MFAA + G-Sync yet, so it sucked not having the latest features, and after reading about the lack of native UE4 support, I decided to give it up in all future builds. I'm just going to jump from big chip (e.g., Titan X) to the next generation's big chip each time I upgrade. Instead, I'm going to spend the extra $800-$1000 I would have put into a second Titan X on custom watercooling, and have my buddy from T|I mod me a vBIOS ;)
 
The link you have is the most salient one. Basically, you have to hack in AFR support, which I don't think many developers will do, and given how popular Unreal Engine is with developers, SLI/CrossFire are going to get less and less support.

Traditional SLI/CrossFire, at any rate; DX12's multi-GPU handling will simply take over.
 
I find it hilarious that people are saying things like "the 390X obliterates the Titan X on the benches." No, it barely scores higher, and there's far more at play than a static benchmark whose difference between the cards will go unnoticed in-game.

The 390X and Titan X basically tied on the bench. But, as I said, there's much more at play, and here's why Nvidia is still king.

It's all about the complete package when you choose your video card solution.

Long story short, if you don't want to read below: Nvidia is better for multi-card; AMD is tolerable if you use a single card. So it's not just about a few points higher on a benchmark; it's about drivers, heat, power usage, and noise. The whole package. Nvidia is still king. If you think differently, you're wrong. I won't be coming back to check on comments/retorts, so don't bother ;)


--Nvidia Pros--
- more power efficient
- less heat
- no need for huge water cooling with tubing and a reservoir/fans that you have to figure out where to mount. That's going to be fun with CrossFire
- you get what you pay for
- market share says it all
- smoother gameplay experience with SLI, despite what some reviews say. I saw it TWICE with my own eyes, as my examples below explain.
- D R I V E R S, OMG, duh!!!! I don't care if your hardware is a hair faster; you're only as good as your drivers. SLI profiles are WAY, WAY better than CrossFire's. More games support SLI, and more will take advantage of SLI over CF. Nvidia drivers are stable, have fewer issues, and are updated WAY FASTER than Catalyst. Adaptive V-Sync, PhysX, etc.
I could go on and on about drivers. Such a huge bonus for Nvidia.

EXAMPLE 1
I've had problems with CrossFire setups at two different points: CrossFire 5870s, and 290Xs. With the 5870s, I couldn't get the drivers to run both cards in Bad Company 2 unless I used older Catalyst drivers. That was fine until WoW got a new patch one day and the older Catalyst drivers started making WoW show flashing textures everywhere. Unplayable.
So I was forced to update to the newest Catalyst to fix WoW, which then re-broke Bad Company 2 so it wouldn't use both video cards. Goddamn AMD drivers...

EXAMPLE 2
Trying the dual 290Xs a friend gave me for free. They were worth that price. I removed my SLI 580s, fully removed and cleaned out the drivers, installed the latest Catalyst, and suddenly Diablo 3 had something like bad frame pacing. It was jaggy and chuggy, as if V-Sync weren't enabled even though it was. The same thing happened in WoW. It was a worse experience than the 580 SLI setup, so I actually put the 580s back in.


-AMD Pros-
- they tend to be less expensive, so if you have a smaller budget, AMD will look better to you
- if you are a fan of the underdog, you'll like AMD
- if you only run one video card, CrossFire problems aren't an issue, and AMD is tolerable despite the heat, noise, and power usage, with far fewer driver issues. In fact, the AMD fanboys who defend them probably aren't talking about CrossFire


So it's not just about a few points higher on a benchmark; it's about drivers, heat, power usage, and noise. The whole package. Nvidia is still king. I don't care if you disagree; you're wrong.
 
Cute. It has no bearing on reality, but I guess it's fun for some.
 
That's a great post for the kids on LinusTechTips.com, but nobody here needs a pro/con list for Nvidia vs. AMD. We're all acutely aware of what each company brings (or doesn't bring) to the table. You could have just said "I don't like AMD for reasons already known" and everyone here would have understood what you meant.
 
Nvidia is still king. If you think differently, you're wrong. I won't be coming back to check on comments/retorts, so don't bother ;)

So if you don't even care about the discussion, why the hell did you make this post?
 
So if you don't even care about the discussion, why the hell did you make this post?

I guess he is just trolling... I bet he is nothing more than a kid who is still in school and living on his parents' money... lol.
 
By the way, all these fanboy arguments always amuse me... lol.

I am an AMD fan, but I don't really bother arguing with others. It's my money, and I buy what I want.
 
It's like a sales-force marketing slide: go on forums and rehash the talking points in order to sell cards to anyone trying to be "informed."

As to his huge point on drivers: with the introduction of DX12/Vulkan, do drivers really matter? All the efficiency comes from shifting the driver's work from the driver/API to the application. Like I pointed out earlier, you can't optimize something you don't have control over, so both vendors are on equal footing there. I'm sure this is why all the major engines were playing around with Mantle as quickly as they were. Same with everyone getting on board with the Vulkan and Linux push: it's not the Windows market game devs would be targeting, it's the Linux/Android/Mac market, with all the handhelds, streaming boxes, and to some degree consoles. In a couple of years there should be ample opportunity to re-release old games on different platforms and make some quick cash.

Same point with SLI/CrossFire. With DX12/Vulkan that is left up to the developer, so any ability to make profiles will be very limited for future games. So unless someone wants an SLI/CrossFire setup that will melt through their case just to play old games, what does it matter?
 
They're like SemiAccurate: about 50/50.

Just my opinion. Chiphell (who leaked the benchmarks) has been known to show fake benchmarks, but has also shown it can be accurate. Chiphell is the one who showed how GOOD the 970 was in benchmarks, and I for one DID NOT believe I would get 780 Ti performance for $329...

Anyway, my $0.02.

EVGA GTX 970 FTW. Probably $350 now, and it's faster than a 780 Ti:
http://www.anandtech.com/bench/product/1354?vs=1072

But this cheaper card is really just meant for 1080p gaming. At 2560x1440 or 3440x1440 you probably want to go with a GTX 980.
 
Remember the 512-bit Radeon 2900 XT? :) It was supposed to be the second coming of Christ. I'm not saying the 390X will be, not even close, but let's wait until some real numbers are out.
 
The numbers are percentages, not FPS; 290X = 100%.

Fiji XT was 149.2% at 4K, so ~50% faster than the 290X.
Fiji XT was 139.9% at 1600p, so ~40% faster than the 290X.

If the $549 price and the slightly-better-than-$1K-Titan-X performance are correct, AMD looks to have a winner here.

Price gouging will likely be insane if they're even slightly supply-constrained at launch, though.

A $550 price on the 390X is a pipe dream. It'll be $750 for the 4GB card, and the 8GB one in June will be even more. The Titan X may debut at $1000, but it'll drop in price when the 390X comes out.

With DX12, even though it's slightly more expensive, the Titan X will be the faster, better card.
 
AMD Master Race!

1. 8.6 TFLOPS (~27% faster than the Titan X) (this spec will matter; rough math below)
2. 768 GB/s bandwidth (128% faster than the Titan X) (this spec will be moot)
3. 8GB VRAM (33% less than the Titan X) (this spec will be moot too)
4. dual 8-pin (it's going to be hot and power-hungry no matter what) (Nvidia wins here)
5. $700 price point (this will be a much better overall value, hot and hungry or not)
6. multi-GPU PCIe bridge/scaling (AMD wins here)
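
The headline TFLOPS figure in item 1 checks out against the standard peak-FP32 formula. A sketch; the 390X clock is the rumored spec and the Titan X clock is its boost figure, so treat the delta as approximate:

```python
# Peak FP32 = 2 ops per FMA * shader count * clock (GHz), in TFLOPS.
def fp32_tflops(shaders: int, ghz: float) -> float:
    return 2 * shaders * ghz / 1000

r390x = fp32_tflops(4096, 1.05)    # ~8.6 TFLOPS, matching item 1
titan_x = fp32_tflops(3072, 1.08)  # ~6.6 TFLOPS at boost

print(f"390X:    {r390x:.1f} TFLOPS")
print(f"Titan X: {titan_x:.1f} TFLOPS")
print(f"Delta:   {r390x / titan_x - 1:.0%}")  # ~30%; close to the ~27% in item 1
```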

 
As to his huge point on drivers: with the introduction of DX12/Vulkan, do drivers really matter? All the efficiency comes from shifting the driver's work from the driver/API to the application. Like I pointed out earlier, you can't optimize something you don't have control over, so both vendors are on equal footing there. I'm sure this is why all the major engines were playing around with Mantle as quickly as they were. Same with everyone getting on board with the Vulkan and Linux push: it's not the Windows market game devs would be targeting, it's the Linux/Android/Mac market, with all the handhelds, streaming boxes, and to some degree consoles. In a couple of years there should be ample opportunity to re-release old games on different platforms and make some quick cash.

Same point with SLI/CrossFire. With DX12/Vulkan that is left up to the developer, so any ability to make profiles will be very limited for future games. So unless someone wants an SLI/CrossFire setup that will melt through their case just to play old games, what does it matter?

Um, yeah, drivers always matter; they're huge. The efficiency of DX12 doesn't come from shifting the workload to the application; it comes from removing a CPU bottleneck by using all CPU cores by default.

I'm not spouting conjecture: the DX12 tests have shown 150% boosts for Nvidia and a 400% boost for AMD, yet AMD was still behind despite that 400%, so obviously drivers and hardware matter. The rest of your post is nonsense and ranting. You seem to think SLI/CrossFire is going away; DX12 is going to cause the opposite. With its shared pooling of VRAM and system RAM, multi-GPU solutions have never been more effective.
 