AMD R9 390X, Nvidia GTX 980 Ti and Titan X Benchmarks Leaked

Actually, back when I designed and sold enthusiast gaming PCs, I experienced the exact opposite. When people have the money to spend, they think LESS about the purchase. I had a customer who would whip out his credit card as soon as I told him we had a product that was more expensive than what he owned; he just wanted to say he had the best. People at that level of disposable income are VERY susceptible to brand loyalty, marketing, and price gouging: they RARELY do their research. I had 13-year-old kids dragging their wealthy parents through the store, credit cards out, ready to buy, demanding Titan Black cards for their new gaming systems. When I said the 780 Ti could do the same damn thing for damn near half the price, they didn't believe me and threatened to go elsewhere.

People with the money to spend do less research. Trust me.


That's true, people that don't work hard for their money do that ;) or second-generation money.

There are people that just want the best, period.

Also, do you expect 13-year-old rich kids to do their research? I wouldn't think they would, and the parents probably don't even know what the different graphics cards are. But if the parent were an avid gamer making their own money, they would do their research.
 
Which they then delayed again until Q3 '15 for mass production.
Nvidia isn't going to launch a new architecture on a new node with a 500mm²+ ASIC.

Don't forget this part.

which essentially means another delay, albeit a slight one.

Even if 16nm were ready now, I'd expect Pascal in late Q3 or into Q4 just to maximize profit from the holiday buying season, unless they can maintain their market share with Maxwell.

I do agree that the lack of competition has allowed NVIDIA to stretch its release schedule. Maxwell is already over a year old and they still have not released the full chip yet.

My point is that 16nm will be available not long after June. The 390X might have been interesting last year up against the 980. Halfway into 2016 is far too late.
 
AMD drivers beg to differ. I had nothing but issues with them.
We can play the anecdote game back and forth, but there are plenty of people who do just fine with AMD drivers. Yes, I'd like them to update their drivers more frequently, but since everything I play works, this is more of a wish than a complaint.
Launching in June will give the 390X a very short lifespan, considering 16nm will be out soon after.
TSMC is going to be late on 16/14nm, though. How late is hard to say, but the rumor mill is suggesting mid to late 2016 for volume parts.
If AMD can match Titan X performance, then I really have to question why they would market it at $600.
Possibly to take market share away from nV. Also, volume at $1K price points for GPUs is absolutely tiny; they wouldn't sell very many.
 
Don't forget this part.
When they push it up two quarters only to push it back three quarters, you have to start questioning WTF TSMC is doing.

Q2 '15 -> Q4 '14 -> Q1 '15 -> Q2 '15 -> Q3 '15.

Even if 16nm were ready now, I'd expect Pascal in late Q3 or into Q4 just to maximize profit from the holiday buying season, unless they can maintain their market share with Maxwell.
Meh, that is when they projected it for one of their contracts, so yeah.
Hopefully they can release the Pascal Tesla on time so they don't have to pay penalties like with GK110.

I do agree that the lack of competition has allowed NVIDIA to stretch its release schedule. Maxwell is already over a year old and they still have not released the full chip yet.
Not sure where you are seeing lack of competition or Nvidia resting on their laurels...
They had the same gap between releases back with Kepler.

My point is that 16nm will be available not long after June. The 390X might have been interesting last year up against the 980. Halfway into 2016 is far too late.
What? Where do you get halfway into 2016? The 390X isn't meant to compete with the GTX 980.
If the 390X isn't interesting, then I guess the Titan X isn't interesting either...
 
From what I understand, TSMC 16nm FF and FF+ is a hybrid 20/16nm process. So Samsung/GF 14nm should be an improvement. I wonder how long before NV switches to Samsung/GF :p

TSMC's 16nm FinFET (CLN16FF) and 16nm FinFET+ (CLN16FF+) process technologies rely on the back-end-of-line (BEOL) interconnect flow of the company's 20nm SOC (CLN20SOC) fabrication process, but use FinFET transistors instead of planar transistors. Such a hybrid approach provides additional performance and/or power savings, but does not allow chip sizes to shrink significantly compared to chips made on the 20nm SOC technology. The proven BEOL interconnect flow means it is easier for TSMC to start mass production of chips using its 16FF and 16FF+ manufacturing technologies.
 
From what I understand, TSMC 16nm FF and FF+ is a hybrid 20/16nm process. So Samsung/GF 14nm should be an improvement. I wonder how long before NV switches to Samsung/GF :p

20SOC is 20nm planar LP.

16FinFet is 20nm FinFet HP.
16FinFet+ is 20nm FinFet HPP (or HPM).

They share many of the same tools and libraries with the 20nm planar process, 20SOC, which is why it was supposed to ramp so quickly.
It was supposed to enter risk production at the beginning of Q3 '14, was pushed to Q4 '14, and finally entered in Nov '14.

Edit: 16nm is just marketing speak. All the big foundries do it. Gate pitch shows it is 20nm-based. FinFETs offer enough of a speed/power benefit that they decided to market it as a "half-node."
 
Nope, Crossfire is not smoother right now. I personally went through a 290X Crossfire setup and ended up putting the SLI 580s back in. That's how much choppier the 290Xs were compared to the smoothness of Adaptive Vsync.

AMD DOESN'T HAVE ADAPTIVE VSYNC

Games are less compatible with Crossfire than with SLI. AMD's drivers have been crap since forever.

The 390X having that much more performance at the same power draw as the 290X isn't the point. It's that a Titan X doesn't use nearly as much power, and neither do the 980s. I'm not a fan of AMD's philosophy of making stuff loud, hot, and power-hungry for just a little more performance.

But it all comes down to the drivers, really. AMD doesn't have adaptive vsync, which is a HUGE deal for giving you tear-free, jitter-free motion. If they had that, I'd consider them.
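
For anyone who hasn't used it, the gist of adaptive vsync is simple. Here's a minimal sketch of the logic in Python pseudologic (the names and numbers are mine, this is not any real driver API):

```python
REFRESH_HZ = 60
FRAME_BUDGET = 1.0 / REFRESH_HZ  # ~16.7 ms per refresh at 60 Hz

def present_frame(render_time_s: float) -> str:
    """Adaptive vsync, roughly: sync to the refresh while the GPU keeps
    up, and tear instead of stuttering when it falls behind."""
    if render_time_s <= FRAME_BUDGET:
        # Frame finished inside the refresh window: wait for vblank.
        # No tearing, no added judder.
        return "flip on vblank (vsync on)"
    # Frame took longer than one refresh: flipping immediately tears,
    # but avoids plain vsync's fallback to 30 fps.
    return "flip immediately (vsync off)"

print(present_frame(0.010))  # fast frame -> flip on vblank (vsync on)
print(present_frame(0.025))  # slow frame -> flip immediately (vsync off)
```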

I know Crossfire is not all sunshine and rainbows, but I generally don't have many problems. It would be nice for them to release profiles instead of me having to look online to find a solution or just using AFR, which seems to work on most titles.

But screen tearing? I haven't seen screen tearing in over 2 years. I don't know if it's just my 144Hz monitor or what, which wouldn't make sense, but I haven't seen screen tearing on my 7950 or my Crossfire 290Xs.
 
Yes a 144Hz panel definitely reduces screen tearing. I pretty much don't notice it unless I actively look for it.
 
Yes a 144Hz panel definitely reduces screen tearing. I pretty much don't notice it unless I actively look for it.
It's better, but tearing was still pretty noticeable to me at 144Hz. For things like flickering lights in game, the tearing seemed basically the same as at 60Hz.

I even made this video of Dead Space 3 with insane tearing from the flickering lights I just mentioned. https://www.youtube.com/watch?v=Y3T6chyW2Vo
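
The napkin math on why 144Hz shrinks tears but doesn't remove them (a rough Python sketch, my numbers only):

```python
# Without vsync a tear can land on any refresh; a higher refresh rate
# only shortens how long each torn frame stays on screen.
for hz in (60, 144):
    interval_ms = 1000 / hz
    print(f"{hz} Hz: a tear persists at most ~{interval_ms:.1f} ms")

# 60 Hz:  a tear persists at most ~16.7 ms
# 144 Hz: a tear persists at most ~6.9 ms
# A light flickering every frame changes the whole screen's brightness,
# so the mismatch line stays high-contrast and visible even at ~7 ms.
```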
 
I've been paying attention since before GeForce was ever created. However, as you stated, there have been many times ATi (especially with the Radeon 9800 Pro) demolished NVIDIA but still suffered from the poor-driver image. My personal experiences with AMD Crossfire (mobile) were pretty poor overall, and I think those that say AMD drivers are on par with NVIDIA are just lying to themselves or haven't delved deeply enough into the issue.

Perception isn't created simply from marketing, that's my point. At some point the marketing has to be backed up with substance, especially with enthusiast-level graphics technology; we aren't buying $4 cereal. Most consumers in the Titan X/390X segment have done their homework and concluded that NVIDIA is worth the premium, and that isn't out of sheer loyalty or brainwashing like some claim, but rather recognizing that NVIDIA does in fact offer a superior experience, and they are willing to pay for that.

People like me who purchase $1000 Titans aren't stupid. We've been in this game far longer than most, and to us NVIDIA is worth a premium, and that (justified) perception has obviously been passed on to the market as a whole given NVIDIA's lion's share of AIB sales. So you're right: even if AMD bundles a superior AIO with the 390X, they cannot market it at $800+ because their drivers and ecosystem overall are inferior to NVIDIA's.

I'd say the opposite: anyone who would buy a $1000 video card with 12GB of RAM isn't that smart.

To use anywhere near that 12GB, you're going to need two video cards to even push that many pixels, which is $2000.
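
Rough numbers, if anyone wants them (framebuffer math only; real usage is mostly textures, so treat this as a sketch):

```python
# Back-of-envelope: even 4K framebuffers barely dent 12GB.
width, height = 3840, 2160
bytes_per_pixel = 4                      # 32-bit RGBA
buffers = 3                              # e.g. triple buffering
fb_mb = width * height * bytes_per_pixel * buffers / 2**20
print(f"4K triple-buffered framebuffers: ~{fb_mb:.0f} MB")  # ~95 MB
# Getting anywhere near 12GB means massive texture/asset budgets at
# settings a single GPU can't push at playable framerates anyway.
```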

And if you don't buy a second card, you're stuck with a $1000 card with slightly higher performance than that of a 6GB card that costs half as much.

And the problem with a $1000 video card is that when you go to sell it, no one wants to buy it, because the card that came out 6 months later has equal or higher performance and is brand new, and no one wants to drop $500-600 on a second-hand card.
 
It's better, but tearing was still pretty noticeable to me at 144Hz. For things like flickering lights in game, the tearing seemed basically the same as at 60Hz.

I even made this video of Dead Space 3 with insane tearing from the flickering lights I just mentioned. https://www.youtube.com/watch?v=Y3T6chyW2Vo

A video from 2013 of a game that has probably been patched and fixed is proof?

What about a video showing the shadow flickering in recent GameWorks games from Nvidia? Does that count too?

Gotcha
 
I'd say the opposite: anyone who would buy a $1000 video card with 12GB of RAM isn't that smart.

To use anywhere near that 12GB, you're going to need two video cards to even push that many pixels, which is $2000.

And if you don't buy a second card, you're stuck with a $1000 card with slightly higher performance than that of a 6GB card that costs half as much.

And the problem with a $1000 video card is that when you go to sell it, no one wants to buy it, because the card that came out 6 months later has equal or higher performance and is brand new, and no one wants to drop $500-600 on a second-hand card.

You are 110% correct. Couldn't have explained it better myself.
 
A video from 2013 of a game that has probably been patched and fixed is proof?

Gotcha
Do you have reading comprehension problems? There is nothing to be patched. I made that video in that spot of Dead Space 3 to show that 144Hz does not stop all tearing and seems to have little impact on scenes like that. For normal scenes without flickering lights, then yes, tearing is greatly reduced at 144Hz, and no one denies that.
 
Do you have reading comprehension problems? There is nothing to be patched. I made that video to show that 144Hz does not stop all tearing and seems to have little impact on scenes like that. For normal scenes without flickering lights, then yes, tearing is greatly reduced, and no one denies that.


There is a difference between flickering (game issue/driver issue) and Screen Tearing.
 
There is a difference between flickering (game issue/driver issue) and Screen Tearing.
That is screen tearing I am showing there. It's the flickering lights in the game that were causing the huge tears across the screen. Flickering lights back in the original FEAR game are what made me start using vsync in the first place, because they caused lots of annoying tearing.
 
That is screen tearing I am showing there. It's the flickering lights in the game that were causing the huge tears across the screen.

You said it was flickering. Flickering in-game isn't screen tearing.

I think you are the one who needs to read up on the difference between them.

That issue in the video is a game/driver issue
 
You said it was flickering. Flickering in-game isn't screen tearing.

I think you are the one who needs to read up on the difference between them.

That issue in the video is a game/driver issue
Please learn to read closer. AGAIN, it's the flickering lights within the game that were causing the excessive screen tearing in that scene.
 
Please learn to read closer. AGAIN, it's the flickering lights within the game that were causing the excessive screen tearing in that scene.

No it is not. In that video the screen isn't tearing at all. That is a game/driver bug, bro.

You honestly must not see/know what screen tearing is.

Anyway we are off topic. Back on topic.
 
So... what are the chances these new Nvidia/AMD cards will work on the good old X58 chipset?

Skipped the GTX 980s for lack of new games; I'd like to grab a new-gen card to replace my Crossfire system.
 
No it is not. In that video the screen isn't tearing at all. That is a game/driver bug, bro.

You honestly must not see/know what screen tearing is.

Anyway we are off topic. Back on topic.
Get it through your thick skull that I specifically made that video to show that 144Hz can't help much when there are flickering lights in the game causing massive tearing. If I turned vsync on, all the tearing would stop.
 
Get it through your thick skull that I specifically made that video to show that 144Hz can't help much when there are flickering lights in the game causing massive tearing. If I turn vsync on, there is no tearing at all.

So what do you think about those leaked benchmarks? I mean, that new Titan X looks beastly, eh?

Cannot wait for reviews this week!
 
If we're talking about the difference between $600 and $1300, then I believe the downsides are justified... And if you're only running one card, Crossfire problems are a non-issue.

Benchmarks could be considered a global average of a card's performance, which would include games where AMD has poor optimization. That would mean that, despite AMD's bad performance in some titles, the 390X would still be as fast as or faster than the Titan X averaged across all games.
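
To put the averaging point in concrete terms, a toy example with made-up numbers (the ratios below are hypothetical, not from any leak):

```python
from statistics import geometric_mean  # Python 3.8+

# Hypothetical per-game fps ratios (390X / Titan X), purely illustrative,
# including a couple of titles where AMD optimization is poor.
ratios = [1.10, 1.05, 1.15, 0.85, 0.90, 1.08]
print(f"overall: {geometric_mean(ratios):.2f}x")  # ~1.02x: ahead on average
```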

The games it's faster in will suck and the games it's slower in will prove that AMD has shit drivers. Rinse/repeat.
 
So... what are the chances these new Nvidia/AMD cards will work on the good old X58 chipset?
Barring PCIe bugs in either the X58 chipset or the video card, it should work fine. No way to know for sure yet, though.
 
Could you post an argument?

Your post was ridiculous! Nobody needs to post anything to prove that, and your attempt at a derail is worse. We aren't discussing AMD vs. Intel CPUs.
 
....

One day I shall turn him to the dark side... or make him get a 6-core Intel so his games can breathe again. That i7-3770K really has gotten long in the tooth for the settings he wants to use while streaming. He's a good guy, but he can't see the whole picture sometimes.

Uh, if you really want to do him a favor, tell him to overclock that 3770K to around 4.5GHz and make sure Hyper-Threading is enabled in the BIOS.
 
http://videocardz.com/55124/amd-radeon-r9-390x-wce-could-this-be-real

Looks like watercooling has been confirmed, and so has the 8GB of memory.

The leaker claims this slide is part of an in-house presentation called ‘2015 Future of Radeon’ that will be shown next week to AIB partners (not to the public).
Question.
Does this mean AMD's partners haven't even seen Fiji yet? Wouldn't that mean they haven't even started designing their 390/390X cards? So they have to do all of the design, mass production, and shipping by June?

Or are we going to be stuck on reference cards for 3+ months like what happened with Hawaii? Because seriously, the release schedule is already pushed back far enough.
 
Companies don't show their products to their partners through a PPT slide.
Ever wonder if it was from an internal meeting? Ever wonder if it's part of a 40-slide presentation? Ever wonder if it's a controlled leak meant to show what's actually coming, to combat the Titan X release? Ever wonder in general?

Watercooling has been in the pipeline, and we've known that for a long time, ever since AMD partnered with Asetek (295X2 ring any bells?).
An 8GB edition alongside a 4GB one is common sense, given that AMD is pushing 4K, where VRAM is needed.

See where I am going? Seems you don't!
 
AMD's actions will have no effect on Titan pricing based on the original Titan / Black / Z.

I know Nvidia is gonna Nvidia, but surely, seeing the leaked price/performance ratio of the new AMD cards (if at all accurate), NV will make the logical decision to price it somewhat competitively. Right? Right, guys?
 
I know Nvidia is gonna Nvidia, but surely, seeing the leaked price/performance ratio of the new AMD cards (if at all accurate), NV will make the logical decision to price it somewhat competitively. Right? Right, guys?

If history repeats itself... maybe, as a last resort. But as always, they will try their hand at all types of marketing to create demand at extremely high prices before bowing down and ending the product line. Only if they are stuck will they price competitively.
 
This train is off the tracks and headed for a wall... let's try and get it back on track, eh?
 