Supposed picture of the 390X WaterCooled Edition.

Considering the efficiency gains from HBM, it's possible this card will be faster and use less power than the Titan X.
There's about 50W between the Titan X and the reference 290X right now. If the 390X stays around 290-300W, with the additional savings from HBM it could potentially use less power than the Titan X.

The WCE is a special edition card; I assume it exists because things went well between AMD and Asetek on the 295X2. :)
It's too bad Nvidia sucks at making friends.
 
There's VRAM on that board, so who would think it's a 390X?

No one. They state that it is an R9 285 ITX for size comparison; notice the different PCB lengths.

Edit: This is probably against the rules, but I just saw this on the Anandtech forums. Probably not legit at all, but I'm starving for rumors/gossip.

"Originally Posted by Cloudfire777 View Post
People, remember that 360/370/380 are OEM chips. Keep your eyes open for 370X, 380X and 390/390X.

Here are the latest prices I've heard from Korea (rumor, per moderators' request):

R9 370X - $229
R9 380X - $349
R9 390 - $429
R9 390X 4GB - $549
R9 390X 8GB - $599

You may notice that the R9 380X costs $100 more than the R9 290X.
Let's just say the GTX 980 will get some serious competition.

There's a reason why AMD only announced and listed the non-X chips"

That looks similar... hmmm... He is just going off of the Tahiti launch pricing and throwing an 8GB version in for $50 more, LoL.
Leave his crap out of here. He was also adamant that the new GPU family would be 20nm...
I'm still expecting my numbers to be much closer.
 
Well I hope you're wrong for the sake of my wallet, because an 8GB 390X Lightning would be north of $800 by your numbers. :eek: I dunno if I can justify spending that much on a 28nm card. :(
 
It really just depends on the dynamics of the market: what Nvidia has waiting, and what AMD has to offer at what price points.
If Nvidia launches a cut-down GM200 as a GTX 980 Ti and holds a full/boosted GM200 for a GTX 980 Ti Black, that could push the GTX 980 down, with the 980 Ti and Black slotting into their respective spots.

I would foresee a 390X Lightning costing roughly the same as a WCE.
 
I guess I'll just make do with a 980 Ti when it comes out... don't want my comp to be a heater.
 
You realize the 980 Ti will use more power than a Titan X if it's the same GPU with only 6GB of memory but clocked higher?

You do know Titan Xs right now throttle on the stock cooler?

So your statement is... well, let's just say it makes you look like a troll lol
 
Are there going to be non-reference AIB cards at launch for this card? I'm talking about the air-cooled cards here.
 
There will be, from what the rumors say. On launch day as well!

I would be shocked if AMD did another horrible launch like the 290/290x.
 
You do know Titan Xs right now throttle on the stock cooler?
http://www.hardocp.com/article/2015..._gtx_titan_x_video_card_review/3#.VUygQ_lViko

HardOCP said:
What you will notice is that there are a few changes in clock speed while gaming, but none of those changes ever dropped below the base clock of 1000MHz. In fact, most of the time the clock speed was well above the boost clock of 1075MHz. The average clock speed is 1150MHz. At the times when the clock speed did drop, these were still right at the boost clock.

We ran this test in every other game we used here in this evaluation as well. The absolute lowest clock speed we ever experienced was 1050MHz for a few seconds in Dying Light. At 1050MHz the clock speed is still above the base clock of 1000MHz, and still considered a "boost" since it exceeds the base clock. This did not happen in every game either.

Our conclusion is that the GeForce GTX TITAN X is not clock throttling, at least for all the testing we have done today in all the games used.
 
^^^ BOOM! Exactly.

The fact is AMD has to put these AIO water coolers on their cards because they are inefficient designs. I guess since most of us put AIO coolers on our CPUs, it's only a matter of time before it becomes standard to do the same on graphics cards. I just don't understand why AMD can't design a card with the same power/efficiency as Nvidia is doing these days.

The 290s were a joke; let's hope these cards actually deliver on what AMD says they will.
 
You realize the 980 Ti will use more power than a Titan X if it's the same GPU with only 6GB of memory but clocked higher?
But HBM uses 50% less power than GDDR5, and it's going to help the power ceiling on the 390X tremendously, at least according to many in this thread. Literally cutting the amount of GDDR5 in half will also reduce the power used by the 980 Ti by 50% over the Titan. So if HBM gives AMD a bunch of headroom for the power ceiling, then the 980 Ti will get just as much, giving it room for the extra clocks.
You do know Titan Xs right now throttle on the stock cooler?
You got some sort of proof of this? Because all the reviews I've seen, like the [H] review, have shown the Titan doesn't throttle (you know, the way the AMD cards actually did). The only place I really see this assertion repeated is by AMD fans in AMD threads.
So your statement is... well, let's just say it makes you look like a troll lol
Irony
 
Considering the efficiency gains from HBM, it's possible this card will be faster and use less power than the Titan X.
There's about 50W between the Titan X and the reference 290X right now. If the 390X stays around 290-300W, with the additional savings from HBM it could potentially use less power than the Titan X.

The WCE is a special edition card; I assume it exists because things went well between AMD and Asetek on the 295X2. :)
It's too bad Nvidia sucks at making friends.


Each VRAM chip uses like 5 watts, not too much. So with 12 chips on the Titan X, if HBM gives you a 50% improvement in power, that's only 30 watts saved. Not much; the GPU is what uses most of the power.
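Quick sanity check of that math (a rough sketch; the 12-chip count and 5W-per-chip figure are the claims above, not measured values):

Code:
chips = 12                            # GDDR5 chips claimed for the Titan X
watts_per_chip = 5                    # rough per-chip draw under load
gddr5_total = chips * watts_per_chip  # 60 W for all the VRAM combined
hbm_saved = 0.5 * gddr5_total         # a 50% improvement saves 30 W
print(gddr5_total, hbm_saved)         # 60 30.0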

The die, if it comes out with 4000+ shader units, is going to be big; possibly even bigger than the Titan X's chip, which is 601 mm².
 
^^^ BOOM! Exactly.

The fact is AMD has to put these AIO water coolers on their cards because they are inefficient designs. I guess since most of us put AIO coolers on our CPUs, it's only a matter of time before it becomes standard to do the same on graphics cards. I just don't understand why AMD can't design a card with the same power/efficiency as Nvidia is doing these days.

The 290s were a joke; let's hope these cards actually deliver on what AMD says they will.

OMFG... How many times does it have to be said that AMD is not putting AIO coolers on the 390X because it has to? Air coolers are quite capable of dissipating 500W by themselves, and the 390X won't be consuming 500W. AIO coolers are simply superior to air coolers: yes, they can dissipate more heat, but they're also more compact, they're quieter, they exhaust their heat outside the case, and they function as case ventilation.

Please stop reading so many Nvidia propaganda leaflets. If you actually stop and think about it, AIO coolers are simply the next step in GPU cooling, the same as vapor chamber coolers were a step up from solid block coolers and heat pipes were a step up from plain finned heatsinks.
 
Sounds like they made another space-heater card, based on them immediately jumping to water cooling. I'd be more impressed if, like Nvidia, they could make cards that perform well while running cool and quiet on air.
The reference cooler on NVIDIA cards is anything but quiet...
 
Each VRAM chip uses like 5 watts, not too much. So with 12 chips on the Titan X, if HBM gives you a 50% improvement in power, that's only 30 watts saved. Not much; the GPU is what uses most of the power.

The die, if it comes out with 4000+ shader units, is going to be big; possibly even bigger than the Titan X's chip, which is 601 mm².

30W is not much? That's the difference between 170W and 200W, or 220W and 250W; it's 10-15% depending on the original TDP. So it's a significant reduction: that means 30 more watts you can spend elsewhere, or 30 fewer watts the card will use overall.
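In percentage terms (just restating the example TDPs above):

Code:
for tdp in (200, 250):
    print(f"30 W is {30 / tdp:.0%} of a {tdp} W card")
# 30 W is 15% of a 200 W card
# 30 W is 12% of a 250 W card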
 
Yes, that is true, but a 50% reduction in power usage for HBM, what does that entail? Same frequency, same VRAM amount, etc.? If that's the case, then HBM gen 1 will have no savings, since its frequency is higher and it's going to be 4GB vs 3GB.

http://www.hotchips.org/wp-content/...Bandwidth-Kim-Hynix-Hot Chips HBM 2014 v7.pdf

From a 1:1 comparison of every aspect, it's a 42% savings in power.
 
No, but it's using literally half the GDDR5 that the Titan X does, so literally half the VRAM will use 50% less power.
Only in situations where the Titan X exceeds 6 GB, which is basically never.
The only time it would be "50%" is when the Titan X is running at 12 GB usage.

Even then it's fuzzy math. Never seen anyone refer to less VRAM usage as a power-saving feature.

You should work for Nvidia's marketing team.
Nvidia isn't giving you less VRAM with the 980 Ti, they're giving you more power savings! The 390X has icky 8 GB HBM; why would you want to blow so many watts on a measly 2 GB extra? :p
 
Umm... pretty sure RAM is powered regardless of whether it's holding data or not.
 
Not seeing a difference on the 290X Vapor-X 4 vs 8.
The 4 GB is clocked ~20MHz higher.

If anybody else wants to dig up more 3vs6, 4vs8, etc. tests, that would be helpful.

[image: pEODc6w.png]
 
Well, that pretty much proves that the 50% power savings HBM will have over GDDR5 is completely insignificant; while technically true, the RAM using 1W vs 2W total won't make a hill of beans of difference for the card's thermal design.
 
^^^ BOOM! Exactly.

The fact is AMD has to put these AIO water coolers on their cards because they are inefficient designs. I guess since most of us put AIO coolers on our CPUs, it's only a matter of time before it becomes standard to do the same on graphics cards. I just don't understand why AMD can't design a card with the same power/efficiency as Nvidia is doing these days.

The 290s were a joke; let's hope these cards actually deliver on what AMD says they will.

You could write for PCPER
 
It doesn't prove anything other than that the difference between 4 GB and 8 GB is a few watts at most, while utilizing an uncertain amount of the VRAM.
The results tell you nothing about the total power consumption of GDDR5 in the first place.
 
Electrically there is no difference between used and unused RAM. From the moment the computer is turned on to the moment it is turned off, all RAM is constantly powered and refreshed numerous times per second. The RAM has no concept of "oh, this bank isn't used, power it down," so it doesn't matter if you're using the full 12GB of RAM on a Titan X or just whatever minimal amount is required for DOS mode. All that matters is the type of chip and how many you have.

Why do you think GPUs have had bare RAM on the back side for ages without heat problems? The heat generated by RAM is negligible because the chips don't use much power. The 50% power savings on HBM will be negligible; a 100% power savings would have been negligible. The point isn't that going from 12GB to 6GB on the 980 Ti will save power; the point is that all these people saying "oh, HBM will give the 390X more thermal headroom because it saves 50% power!" are full of shit, since 50% of a watt or two doesn't matter when you're talking about a 250-300W card.
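The proportion argument in actual numbers (using this post's own 1-2W VRAM and 250-300W card figures):

Code:
for vram_w in (1, 2):
    saved = 0.5 * vram_w              # "50% of a watt or two"
    for card_w in (250, 300):
        print(f"{saved} W saved on a {card_w} W card = {saved / card_w:.2%}")
# every combination lands between 0.17% and 0.40% of total board power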
 
Electrically there is no difference between used and unused RAM.
50% of a watt or two
We're not talking about being powered on or off, we're talking about being actively used.
Why are we having this long discussion when you could just link a source that concretely shows GDDR5 using 1 or 2W under load?

According to AMD/Hynix it's about 80W, but look at the specs they're using.

http://i.imgur.com/ob7XkZg.jpg
 
Only if everything is at 1:1 will you see that power savings: amount of VRAM, frequency, etc.

That graph is only indicative of GDDR5 being pushed to get the same bandwidth. The problem with that is, the more you increase voltage and frequency with GDDR5, the higher the temps you run into, and this will in turn increase power consumption due to resistance. As the temperature of the silicon increases, resistance increases, which causes more leakage; to overcome leakage you have to supply more power, increase voltage, or have better cooling.

The 290X has around 300 GB/s; that puts it at around 50 watts.
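That's just a linear read of the chart (assuming its GDDR5 point is roughly 80W at 500 GB/s, as cited in this thread, and that power scales proportionally with bandwidth; both are rough assumptions):

Code:
watts_at_500 = 80                     # ~80 W GDDR5 figure from the slide
bw_290x = 300                         # GB/s, the figure above
print(watts_at_500 * bw_290x / 500)   # 48.0 -> "around 50 watts"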
 
We're not talking about being powered on or off, we're talking about being actively used.
Why are we having this long discussion when you could just link a source that concretely shows GDDR5 using 1 or 2W under load?

According to AMD/Hynix it's about 80W, but look at the specs they're using.

http://i.imgur.com/ob7XkZg.jpg

So it looks like they are stating power consumption as a cost relative to the amount of bandwidth. So HBM 1.0 is 1/3 the power of GDDR5 at the same BW, but HBM 1.0 has like 4.5 times the BW. So it could use up to 135W?
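Back-of-the-envelope with those ratios (assumes the ~80W GDDR5 figure above and linear scaling of power with bandwidth, which likely isn't exact):

Code:
gddr5_w = 80                     # ~80 W, per the AMD/Hynix slide
hbm_same_bw = gddr5_w / 3        # 1/3 the power at equal bandwidth, ~27 W
hbm_full_bw = hbm_same_bw * 4.5  # scaled to 4.5x the bandwidth
print(round(hbm_full_bw))        # 120 -- same ballpark as the ~135 W guess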
 
I believe the leaked 390X spec is 640 GB/s; that chart is showing data for 500 GB/s.
It's also from Dec 2013, though.
 
You got some sort of proof of this? Because all the reviews I've seen, like the [H] review, have shown the Titan doesn't throttle (you know, the way the AMD cards actually did). The only place I really see this assertion repeated is by AMD fans in AMD threads.

Have you seen this chart from TPU?

[image: perf_oc.gif]


Maxing out both the power and thermal limits, even on stock clocks, improves performance a tad. Granted, it's only a 3.7% difference, but clearly there is some slight degree of power or thermal throttling going on.
 
Exactly. Even Anandtech's own review shows it using more power than a 290X, and that it throttles.

Now, temp-wise, the Titan X destroys the 290X.

http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/16

"Speaking of clockspeeds, taking a look at our average clockspeeds for GTX Titan X and GTX 980 showcases just why the 50% larger GM200 GPU only leads to an average performance advantage of 35% for the GTX Titan X. While the max boost bins are both over 1.2GHz, the GTX Titan has to back off far more often to stay within its power and thermal limits. The final clockspeed difference between the two cards depends on the game in question, but we&#8217;re looking at a real-world clockspeed deficit of 50-100MHz for GTX Titan X." <-----called throttling

More Throttling: http://www.tomshardware.com/reviews/nvidia-geforce-gtx-titan-x-gm200-maxwell,4091-6.html

Saying I am a troll because numerous websites state the Titan X throttles? I mean, the proof is out there.

Now, will custom BIOSes fix that? Oh fuck yea, but when people look at reviews on websites to see what is best, 99% of people don't give a fuck about BIOS updating.

Just my $0.02
 
That shitty reference cooler is the worst decision AMD has made in years.
It will never stop haunting them. Some time in the future, when AMD is dead and gone, you'll be able to trace day-zero back to October 24, 2013.
 
Exactly. Even Anandtech's own review shows it using more power than a 290X, and that it throttles.

Now, temp-wise, the Titan X destroys the 290X.

http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/16

"Speaking of clockspeeds, taking a look at our average clockspeeds for GTX Titan X and GTX 980 showcases just why the 50% larger GM200 GPU only leads to an average performance advantage of 35% for the GTX Titan X. While the max boost bins are both over 1.2GHz, the GTX Titan has to back off far more often to stay within its power and thermal limits. The final clockspeed difference between the two cards depends on the game in question, but we’re looking at a real-world clockspeed deficit of 50-100MHz for GTX Titan X." <-----called throttling
So you think that boosting to less than the maximum but still well above the stock clock speed is throttling?
 
Do you want to call it downclocking then? I mean, AMD's 290X throttles below its 1GHz desired speed on the stock cooler.

Wouldn't it be called the same thing? Or since it's Nvidia, it's not that big of a deal?
 
That shitty reference cooler is the worst decision AMD has made in years.
It will never stop haunting them. Some time in the future, when AMD is dead and gone, you'll be able to trace day-zero back to October 24, 2013.

I agree. That is why they are coming out with an AIO CLC water-cooled card. They want NO ISSUES with throttling.

Best decision ever. I couldn't care less about the power a card draws if the performance is there. I DO care about cards throttling. That's why I love my Gigabyte GTX :) If I keep it below 62C it never throttles, and the power limit is never close to its max.

That's why I stayed as far away as possible from the 290/290X. Sure, the AIB heatsinks fixed the issue, but it took a while. Too long, if you ask me. AMD dropped the ball hard on that release.
 
Do you want to call it downclocking then? I mean, AMD's 290X throttles below its 1GHz desired speed on the stock cooler.

Wouldn't it be called the same thing? Or since it's Nvidia, it's not that big of a deal?
It depends on how you look at it. AMD's marketing claimed the 290X clock speed was 'up to' 1GHz, and when things got too hot for too long, the clock speed tumbled. Nvidia's marketing lists the Titan X clock speed as 'at least 1GHz' and the boost clock as 1075MHz. It consistently stays above both listed clock speeds using the reference cooler.
 
So you think that boosting to less than the maximum but still well above the stock clock speed is throttling?

Well, AMD never actually listed the 290X's real stock speed, so can we say that the 290X technically never throttles by that logic? (Answer: hell no)

AnandTech had a major gripe about it:

Anyhow, as we can see, in everything, even the shortest benchmark, the sustained clockspeeds are below 1000MHz. Out of all of our games Rome 2 fares the worst in this regard, dropping to 907MHz, while other games like Metro and Crysis aren’t far behind at 910MHz-930MHz. FurMark does one better yet and drops to 727MHz, which we believe to be 290X’s unlisted base clockspeed, indicating it has to drop out of boost mode entirely to bring performance/heat in check with cooling under quiet mode. 290X simply cannot sustain its peak boost clocks under quiet mode; there’s not enough cooling to handle the estimated 300W of heat 290X produces at those performance levels.

...

Consequently this is why we’re so dissatisfied with how AMD is publishing the specifications for the 290X. The lack of a meaningful TDP specification is bad enough, but given the video card’s out of the box (quiet mode performance) it’s disingenuous at best for the only published clockspeed number to be the boost clock. 290X simply cannot sustain 1000MHz in quiet mode under full load.

...

Given this, we find AMD’s current labeling practices troubling. Although seasoned buyers are going to turn to reviews like ours, where the value of a card will be clearly spelled out with respect to both performance and price, to only list the boost clock is being deceitful at best. AMD needs to list the base clockspeeds, and they’d be strongly advised to further list an average clockspeed similar to NVIDIA’s boost clock. Even those numbers won’t be perfect, but it will at least be a reasonable compromise over listing an “up to” number that is currently for all intents and purposes unreachable.

Ok so they included some shameless self-promotion at the very end, but they do bring up a good point.
 
It depends on how you look at it. AMD's marketing claimed the 290X clock speed was 'up to' 1GHz, and when things got too hot for too long, the clock speed tumbled. Nvidia's marketing lists the Titan X clock speed as 'at least 1GHz' and the boost clock as 1075MHz. It consistently stays above both listed clock speeds using the reference cooler.

To me, downclocking is downclocking. If a video card says it boosts to a desired speed, I expect it to be able to maintain that speed with fans at 100% and the power limit maxed out.

With the Titan X, you very rarely (if at all) see it able to maintain its boost speed. Why is that? The stock cooler can't keep up with the GPU, and also the HORRIBLE power limit they set on the Titan X.

Basically, Nvidia limited the Titan X by either temps or power limit, which makes it very rarely sustain its rated boost speed. (Which is why, if you want your MAX potential, flash that bitch with a new BIOS!!!)
 
I don't think you're looking at the numbers correctly. Again, the Titan X's clock speed is set at 1GHz, boost clock speed is 1075MHz. It consistently boosts beyond 1075MHz with no modification to fan speed or power limit.
 