GTX 980 May Lose The Performance Crown To AMD’s R9 380X in February 2015

Joined
Jul 20, 2013
Messages
524
So many nvidian mutants <-- to all of ya, I can't wait to post benchies of the mighty 390x destroying your gtx prius

And here's some more pics of your GTX Prius watt guzzling:


yea, GTX Prius using 240 watts. In Tom's own words, "new graphics card's increased efficiency is largely attributable to better load adjustment and matching." Leave it to nvidians to release a tdp of 150w for their reference 970 and then ship non-reference 970s with 240w+ tdps.

#MAXFAIL
 

Kor

2[H]4U
So many nvidian mutants <-- to all of ya, I can't wait to post benchies of the mighty 390x destroying your gtx prius

And here's some more pics of your GTX Prius watt guzzling:


yea, GTX Prius using 240 watts. In Tom's own words, "new graphics card’s increased efficiency is largely attributable to better load adjustment and matching." Leave it to nvidians to release a tdp of 150w for their reference 970 and then ship non-reference 970s with 240w+ tdps.

#MAXFAIL
Are you a real person?
 

DejaWiz

Fully [H]
A factory-OC 970 for $330-370 lays waste to any previously released GPU but does consume more power than a stock-clocked card, and you're complaining about that while saying the R9 390X needs water cooling and power consumption doesn't matter because it's going to own everything?
 

Unknown-One

[H]F Junkie
yea, GTX Prius using 240 watts. In Tom's own words, "new graphics card's increased efficiency is largely attributable to better load adjustment and matching." Leave it to nvidians to release a tdp of 150w for their reference 970 and then ship non-reference 970s with 240w+ tdps.

#MAXFAIL
TDP = Thermal Design Power = The amount of heat the cooling system is designed to cope with, in watts.

As an extreme example: A card can consume 800w of power, and have a 100w TDP. That just means it's INCREDIBLY efficient, and is using 700w to do actual work and only converting 100w into waste heat.

All your chart shows is that Maxwell is, in fact, very efficient. It's drawing ALL that power, using most of it up doing real work, and only producing 150w of heat. Not bad.

And here's some more pics of your GTX Prius watt guzzling:
The reference model is only pulling 177w in your chart. Not seeing the big deal here.

The overclocked card is pulling significantly more power, but that's generally the case with ANY overclocked card. Your complaint seems to be with MSI's non-reference-specced card, not Maxwell in general.
 

Lord_Exodia

Supreme [H]ardness
TDP = Thermal Design Power = The amount of heat the cooling system is designed to cope with, in watts.

As an extreme example: A card can consume 800w of power, and have a 100w TDP. That just means it's INCREDIBLY efficient, and is using 700w to do actual work and only converting 100w into waste heat.

All your chart shows is that Maxwell is, in fact, very efficient. It's drawing ALL that power, using most of it up doing real work, and only producing 150w of heat. Not bad.


The reference model is only pulling 177w in your chart. Not seeing the big deal.

The overclocked card is pulling significantly more power, but that's generally the case with ANY overclocked card. Your complaint seems to be with MSI's non-reference-specced card, not Maxwell in general.
Dayum OwnD! Thanks Fixedmy7970 for pointing out how unbelievably awesome maxwell is. You should come to nvidia's side. Water is good. :D
 

Tamlin_WSGF

2[H]4U
TDP = Thermal Design Power = The amount of heat the cooling system is designed to cope with, in watts.

As an extreme example: A card can consume 800w of power, and have a 100w TDP. That just means it's INCREDIBLY efficient, and is using 700w to do actual work and only converting 100w into waste heat.

All your chart shows is that Maxwell is, in fact, very efficient. It's drawing ALL that power, using most of it up doing real work, and only producing 150w of heat. Not bad.


The reference model is only pulling 177w in your chart. Not seeing the big deal here.

The overclocked card is pulling significantly more power, but that's generally the case with ANY overclocked card. Your complaint seems to be with MSI's non-reference-specced card, not Maxwell in general.
Lol! If a card consumes 800W of power, most of that wattage (99.9999 percent of it) will be transformed into heat. As you might remember from school, "energy cannot be destroyed or created, only transformed" (law of conservation of energy).

A card can be more efficient by doing more calculations per watt, but the energy used will be transformed into heat. The energy (watts) doesn't disappear when the GPU uses it; it only gets transformed.

The rated TDP of a card is simply the heat produced under certain circumstances (maximum heat for the chosen load).
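Tamlin_WSGF's energy balance can be sketched in a few lines. Every figure below (total draw, fan power, display-signal power) is an illustrative assumption, not a measurement:

```python
# Energy balance for a graphics card under load (illustrative numbers).
# Conservation of energy: electrical power in equals power out in some form,
# and for a GPU almost everything comes out as heat.
power_draw_w = 177.0  # total electrical draw at slot + connectors (assumed)
fan_mech_w = 3.0      # mechanical power actually moving air (assumed, small)
signal_out_w = 0.05   # power leaving via the display cable (assumed, tiny)

heat_w = power_draw_w - fan_mech_w - signal_out_w
heat_fraction = heat_w / power_draw_w

print(f"heat: {heat_w:.2f} W ({heat_fraction:.1%} of draw)")
```

Even with generous allowances for the fan and the display signal, well over 98% of the electrical draw leaves the card as heat.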
 

n=1

2[H]4U
I cannot BELIEVE you guys are still taking fixedmy7970 seriously at this point...
 
Joined
Jul 20, 2013
Messages
524
brah, believe it nvidia won this last battle with better compression and load adjusting, something amd will do and fix like with frame stuttering. you nvidians know the 390x is gonna smoke your puny gtx prius, that's why you're hating
 

DejaWiz

Fully [H]
brah, believe it nvidia won this last battle with better compression and load adjusting, something amd will do and fix like with frame stuttering. you nvidians know the 390x is gonna smoke your puny gtx prius, that's why you're hating
We can't hate something that isn't out and won't be reviewed for months. And given AMD's track record for sensationalism prior to a product launch, well, remember Bulldozer?
 

xLokiX

Limp Gawd
remember maxwell where nvidia claims 2x performance over kepler when it's really 20%
It is 2x the power of a GTX 680, which is the card it's technically replacing. And when the 980 Ti comes out it should be 2x the power of the 780 Ti, which is the card that it will replace.
 

wonderfield

Supreme [H]ardness
Um, they could have gimped a firepro, just as nvidia did

they chose not to.

im not sure where the confusion is....
Which one? The W9100, the highest-end FirePro, uses the same chip as the 290X, the Hawaii XT. That chip, as used on both parts, is fully enabled: all they could have done to push past 290X performance was to unrestrict DP performance, which would have gotten them minimal performance gains in gaming scenarios, and boost clock speeds, which they have little to no headroom to do. As I already said, the 290X is the practical limit of what AMD is capable of in single-GPU configurations. There's no wonderful chip in their technology stack that is currently more capable than the one they sell you on the 290X.

The Titan, on the other hand, had an entirely different chip than what was previously available on consumer parts. With the Titan, NVIDIA brought down a chip they had intended to bring to consumers earlier, were it not for AMD's lack of competitiveness.

What a crock. Titan was a pure rip off and people fell for it.
I never suggested it wasn't. As previously alluded by other posters, the Titan was an opportunity NVIDIA seized due to AMD's lack of competitiveness.

As an extreme example: A card can consume 800w of power, and have a 100w TDP. That just means it's INCREDIBLY efficient, and is using 700w to do actual work and only converting 100w into waste heat.
The only work a GPU does is 'move' electrons around. This work is measured in thousandths of a watt. Computing pixel color values is not work.

The work being done by the fan is multiple orders of magnitude greater than what's being done in a GPU.
 

fanboy

[H]ard|Gawd
It is 2x the power of a 680gtx, which is the card its technically replacing. And when the 980ti comes out it should be 2x the power of the 780ti, which is the card that it will replace.
Why would it replace the 680GTX if the 770GTX is a 680GTX done better??
 

Lord_Exodia

Supreme [H]ardness
Why would it replace the 680GTX if the 770GTX is a 680GTX done better??
What he's trying to say is that when Nvidia announced Maxwell they didn't say it would be 2x as fast as the GTX 780; they said it would be 2x the performance/watt of the Kepler architecture they were just launching (the GTX 680 at the time). The entire refresh of Kepler used their GK110 GPU, which was their flagship Quadro chip. Nvidia didn't even need to spin off a full-on architecture refresh to fight AMD last round; they just utilized the GK110 that had launched several months earlier as a Quadro GPU.
 

xLokiX

Limp Gawd
What he's trying to say is that when Nvidia announced Maxwell they didn't say it would be 2x as fast as the GTX 780; they said it would be 2x the performance/watt of the Kepler architecture they were just launching (the GTX 680 at the time). The entire refresh of Kepler used their GK110 GPU, which was their flagship Quadro chip. Nvidia didn't even need to spin off a full-on architecture refresh to fight AMD last round; they just utilized the GK110 that had launched several months earlier as a Quadro GPU.
Pretty much, just didn't want to take any potshots. I know the butthurt is strong enough in this thread already. JK
 

Lord_Exodia

Supreme [H]ardness
Adding to my last comment I will say that they did improve the GK110 a little and came out with a rev B but that wasn't a full on architecture refresh like we've seen in the past.

anyway I hope nvidia does lose the performance crown. I'm scared they will get complacent like Intel if AMD doesn't stuff a boot up their ass. The last thing we need is incremental gpu upgrades like 10-15% each generation :eek:
 

LordEC911

[H]ard|Gawd
Adding to my last comment I will say that they did improve the GK110 a little and came out with a rev B but that wasn't a full on architecture refresh like we've seen in the past.
Of course it wasn't an architectural refresh; a metal respin is almost never an architectural refresh unless there is something majorly wrong with the ASIC design (see Fermi).
 

ebduncan

[H]ard|Gawd
Adding to my last comment I will say that they did improve the GK110 a little and came out with a rev B but that wasn't a full on architecture refresh like we've seen in the past.

anyway I hope nvidia does lose the performance crown. I'm scared they will get complacent like Intel if AMD doesn't stuff a boot up their ass. The last thing we need is incremental gpu upgrades like 10-15% each generation :eek:

You don't have to worry about AMD not being competitive with Nvidia. Nvidia will finish strong in 2014, but the tide will change in 2015. As always, if you need a graphics card today it's hard to look past the 970 or the 980. However, if you can wait, AMD will not disappoint with their next release. All in all, I think everyone is in for a big surprise, and I strongly feel the next-generation AMD cards will make the current Maxwell cards look low end. Just a hunch though.
 
Joined
Nov 11, 2008
Messages
831
I guess every forum needs its own set of trolls...
Would be nice to see a Tonga-based flagship of sorts undercut the 970 in price, though I have doubts at this point whether it'd be worth it.

R9 3xx series leaks can't come fast enough.
 

Hakaba

Gawd
Funny how "May" and "Speculation" lead to so many green vs. red arguments/bashing/hatred/sheep calling.

Oh well, flashed 6950, just hold out a little longer and see what AMD brings to the table next. It may just be another buy-what-offers-the-best-price-to-performance-ratio decision again.
 

Unknown-One

[H]F Junkie
Lol! If a card consumes 800W of power, most of that wattage (99.9999 percent of it) will be transformed into heat. As you might remember from school, "energy cannot be destroyed or created, only transformed" (law of conservation of energy).
As I said, a card with a 100w TDP that consumes 800w of power would imply a VERY efficient card that uses most of the power for actual work (converting very little into waste heat).

Go back and re-read my post. This example scenario was clearly stated.

A card can be more efficient by doing more calculations per watt, but the energy used will be transformed into heat. The energy (watts) doesn't disappear when the GPU uses it; it only gets transformed.
You're assuming 100% of the electricity consumed by the card is converted into heat. This is incorrect. If TDP was calculated accurately, the maximum power consumption of the card will always be higher than the TDP by some amount.

The only work a GPU does is 'move' electrons around. This work is measured in thousandths of a watt. Computing pixel color values is not work.
This does not invalidate what I said. A card that consumes 800w of power and only produces 100w of heat is possible (again, as an EXTREME example).

This was simply meant to illustrate that TDP doesn't equate to power consumption. The two CAN be wildly different.
 

wonderfield

Supreme [H]ardness
As I said, a card with a 100w TDP that consumes 800w of power would imply a VERY efficient card that uses most of the power for actual work
What work? What do you believe is going on within a GPU? What, in your theoretical example, requires the equivalent of one horsepower to perform? That's enough power to propel a typical scooter, and its rider, at speeds up to and in excess of 40 feet per second.

You're assuming 100% of the electricity consumed by the card is converted into heat. This is incorrect.
It assumes the overwhelming and vast majority is converted to heat. Graphics cards are not energy black holes, nor do they violate the law of conservation of energy. They consume electrical energy, doing so little work with it as to be of little interest to anyone, and expend the vast majority of that energy in the form of heat.
 

-Strelok-

[H]F Junkie
Video cards do tons of work. They make so much graphics energy! What is this heat you speak of?
 

n=1

2[H]4U
You don't have to worry about AMD not being competitive with Nvidia. Nvidia will finish strong in 2014, but the tide will change in 2015. As always, if you need a graphics card today it's hard to look past the 970 or the 980. However, if you can wait, AMD will not disappoint with their next release. All in all, I think everyone is in for a big surprise, and I strongly feel the next-generation AMD cards will make the current Maxwell cards look low end. Just a hunch though.
Actually the 980 is pretty forgettable tbh unless you're the type that just has to have the latest and greatest at whatever cost.

Now the 970 is truly shaping up to be the next 8800 GTX (almost caught myself typing GTX 8800, damn nVidia and the confusing names :D), and that price/performance ratio is going to give AMD massive headaches until they release something to compete. For now all AMD can do is drop prices on the R290(X) and indeed that's what's happening.

I've no doubt AMD will come up with an answer, but timing is everything in the GPU market. If they don't get anything out before the end of this year, they're going to lose out massively during the holiday sales. They could keep cutting prices but you can only go so low before you cut into your margins.
 

Unknown-One

[H]F Junkie
What work? What do you believe is going on within a GPU? What, in your theoretical example, requires the equivalent of one horsepower to perform? That's enough power to propel a typical scooter, and its rider, at speeds up to and in excess of 40 feet per second.
Already covered this, see again:
This does not invalidate what I said. A card that consumes 800w of power and only produces 100w of heat is possible (again, as an EXTREME example).

This was simply meant to illustrate that TDP doesn't equate to power consumption. The two CAN be wildly different.
It was an extreme example to illustrate the point, nothing more. I'm not saying such a card would exist; it's simply a hilariously over-extended example of how TDP is NOT directly linked to power consumption.

Go ahead, replace those numbers with ones from the real-world. A card with a 145w TDP drawing 177w. Doesn't matter, my point stands. TDP still isn't 1:1 with power consumption, and is still lower than power consumption, EXACTLY as expected.

It assumes the overwhelming and vast majority is converted to heat.
I didn't say it wasn't. I clearly pointed out that the GTX 970 in his chart (a card with a 145w TDP) is drawing 177w. That's 82% of the power the card is drawing being converted into heat.

So yeah, I already covered this.

Graphics cards are not energy black holes, nor do they violate the law of conservation of energy.
Never said they were, never said they did.

They do, however, use some power without converting it into heat.

They consume electrical energy, doing so little work with it so as to be of little interest to anyone, and expend the vast majority of that energy in the form of heat.
Yup, as I said, 82% in the case of the GTX 970.

But, here's the thing, NONE of what you said has any real relevance. My point remains, TDP doesn't equate to actual power consumption, and TDP will always be lower than actual power consumption.
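For what it's worth, the "82%" figure above is just the ratio of the rated TDP to the measured draw; whether that ratio describes heat output is exactly what the rest of the thread disputes. A quick check of the arithmetic, using the numbers from the post:

```python
tdp_w = 145.0   # NVIDIA's rated TDP for the reference GTX 970
draw_w = 177.0  # measured power draw cited in the thread

ratio = tdp_w / draw_w
print(f"{ratio:.0%}")  # prints "82%"
```

So the quoted percentage is reproduced, but note it compares a fixed rating against a measurement, not heat against draw.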
 

ebduncan

[H]ard|Gawd
I'm dying of laughter.

Graphics cards do not do work. They take DC electricity and convert it into heat. Some energy is converted into mechanical energy, mainly for the fans.

I guess some folks never took physics. Sorry I couldn't help myself.
 

n=1

2[H]4U
Wait you mean pushing electrons around is NOT doing work now? My entire life has been a lie..
 

Lord_Exodia

Supreme [H]ardness
I'm dying of laughter.

Graphics cards do not do work. They take DC electricity and convert it into heat. Some energy is converted into mechanical energy, mainly for the fans.

I guess some folks never took physics. Sorry I couldn't help myself.
Performing calculations and converting them into 3d and sending them to the monitor is not work?

Oh wait.. LOL are you splitting hairs here and saying the card itself doesn't do the work but the GPU <-- does? Is that the inside joke I'm missing? (edited your quote to emphasize)
 

Unknown-One

[H]F Junkie
I'm dying of laughter.

Graphics cards do not do work. They take DC electricity and convert it into heat.
Graphics cards DO NOT convert 100% of the energy they consume into heat. This is a fact.

Therefore, TDP (thermal design power) will always be lower than ACTUAL power consumption. Exactly as I've stated, repeatedly.

Some energy is converted into mechanical energy, mainly for the fans.
Dunno the wattage of the fan on the stock cooler on the GTX 970, but I do know the fans on the EVGA AXC cooler pull about 13w when spun all the way up. Even accounting for that, that still leaves the rest of the GTX 970 consuming 164w of power and only producing around 145w of heat.

I guess some folks never took physics. Sorry I couldn't help myself.
Really not sure what you're laughing at, physics agrees with the point I'm making.
 

LordEC911

[H]ard|Gawd
I'm sorry but I agree with the "dying of laughter" post.

Where do you think that energy goes?
 

wonderfield

Supreme [H]ardness
I wish I could be laughing my ass off like the rest of you, but the ignorance just upsets me.

It was an extreme example to illustrate the point, nothing more.
I don't care. The question is as applicable regardless of the extremity of the example. What work would this theoretical graphics card be doing with the equivalent of one horsepower of energy? Define the work being done.

I didn't say it wasn't. I clearly pointed out that the GTX 970 in his chart (a card with a 145w TDP) is drawing 177w. That's 82% of the power the card is drawing being converted into heat.
No. The TDP is a constant: it is defined by NVIDIA. Its TDP is 145W regardless of whether the card draws 6W (idling) or 260W (transient load). A graphics card drawing 177W expends approximately 177W of energy as heat. It's not some magical ratio of its power consumption to its rated TDP.

My point remains, TDP doesn't equate to actual power consumption, and TDP will always be lower than actual power consumption.
No. See above. Furthermore, your so-called 'point' was that a graphics card's efficiency is an aspect of how much power it draws relative to its TDP, not that the rated TDP will always be lower than power consumption (which is also comically wrong). You're trying to slowly dilute your argument such that it seems more agreeable, but you're doing quite a poor job of it.

Performing calculations and converting them into 3d and sending them to the monitor is not work?
Not in any physical sense, no.
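wonderfield's point, that TDP is a fixed rating while draw varies with load, can be made concrete. The draw figures below are the ones cited in the post; treating dissipation as equal to draw is the standard physical approximation, not a measurement:

```python
TDP_W = 145.0  # rating defined by NVIDIA; it does not change with load

# Instantaneous draws from the post: idle, measured load, transient spike.
for draw_w in (6.0, 177.0, 260.0):
    heat_w = draw_w  # to a very good approximation, dissipation = draw
    print(f"draw {draw_w:6.1f} W -> heat ~{heat_w:6.1f} W, TDP still {TDP_W} W")
```

The rating stays at 145 W whether the card is idling at 6 W or spiking to 260 W; only the instantaneous draw (and therefore heat) moves.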
 

ebduncan

[H]ard|Gawd
Wait you mean pushing electrons around is NOT doing work now? My entire life has been a lie..
By the definition of work, it does not.

Performing calculations and converting them into 3d and sending them to the monitor is not work?

Oh wait.. LOL are you splitting hairs here and saying the card itself doesn't do the work but the GPU <-- does? Is that the inside joke I'm missing? (edited your quote to emphasize)
There is no inside joke here; GPUs do not perform work.

Graphics cards DO NOT convert 100% of the energy they consume into heat. This is a fact.

Therefore, TDP (thermal design power) will always be lower than ACTUAL power consumption. Exactly as I've stated, repeatedly.


Dunno the wattage of the fan on the stock cooler on the GTX 970, but I do know the fans on the EVGA AXC cooler pull about 13w when spun all the way up. Even accounting for that, that still leaves the rest of the GTX 970 consuming 164w of power and only producing around 145w of heat.


Really not sure what you're laughing at, physics agrees with the point I'm making.
You're dead wrong. ALL of the electrical energy consumed by the graphics card is directly converted to heat. Unless the graphics card is breaking the laws of physics, this holds true. Basically what you're saying is that 100% of the power goes in and gets lost somewhere.

Here is the definition of the law conservation of energy

"The law of conservation of energy states that the total amount of energy in a system remains constant ("is conserved"), although energy within the system can be changed from one form to another or transferred from one object to another. Energy cannot be created or destroyed, but it can be transformed."

There are three types of energy possible in a graphics card: electrical energy, mechanical energy, and heat energy. Electrical energy goes in; it can be transformed either into mechanical energy (i.e. fans or pumps) or into heat.

I'm still laughing.
 

KazeoHin

Supreme [H]ardness
By the definition of work, it does not.



There is no inside joke here; GPUs do not perform work.



You're dead wrong. ALL of the electrical energy consumed by the graphics card is directly converted to heat. Unless the graphics card is breaking the laws of physics, this holds true. Basically what you're saying is that 100% of the power goes in and gets lost somewhere.

Here is the definition of the law conservation of energy

"The law of conservation of energy states that the total amount of energy in a system remains constant ("is conserved"), although energy within the system can be changed from one form to another or transferred from one object to another. Energy cannot be created or destroyed, but it can be transformed."

There are three types of energy possible in a graphics card: electrical energy, mechanical energy, and heat energy. Electrical energy goes in; it can be transformed either into mechanical energy (i.e. fans or pumps) or into heat.

I'm still laughing.
Not to derail the thread more, but this discussion is quite interesting.

My take is that if 100% of the electrical energy input is converted to heat, why would we need a ground/earth line?
 

The Mac

Supreme [H]ardness
Electrical energy requires a return path by way of a closed circuit from the energy source (potential); all electrons flow towards ground.

a ground is a convenient way of indicating that path versus the input.

A common ground allows all circuits to share the same return path greatly simplifying design.

So, for example: you take a battery, ground the anode (-), and the cathode (+) becomes your signal; you send it through a lightbulb, then ground the other side. (Electrons actually flow from - to +, opposite to conventional current, which seems backwards, but the signal becomes the positive holes the electrons have flowed from.)

now you have a complete circuit.

Oversimplification, but good enough for basic theory.
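The Mac's battery-and-lightbulb circuit reduces to Ohm's law; the component values here are made up purely for illustration:

```python
# Closed circuit: battery (+) -> lightbulb -> ground return to battery (-).
emf_v = 9.0       # battery voltage (assumed)
bulb_ohms = 45.0  # lightbulb resistance (assumed)

current_a = emf_v / bulb_ohms  # Ohm's law: I = V / R
power_w = emf_v * current_a    # P = V * I, all dissipated in the bulb

print(f"I = {current_a:.2f} A, P = {power_w:.2f} W")  # I = 0.20 A, P = 1.80 W
```

The same current flows out of the battery, through the bulb, and back via ground; none of it is consumed, only the energy it carries.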
 

Relayer

[H]ard|Gawd
Graphics cards DO NOT convert 100% of the energy they consume into heat. This is a fact.

Therefore, TDP (thermal design power) will always be lower than ACTUAL power consumption. Exactly as I've stated, repeatedly.


Dunno the wattage of the fan on the stock cooler on the GTX 970, but I do know the fans on the EVGA AXC cooler pull about 13w when spun all the way up. Even accounting for that, that still leaves the rest of the GTX 970 consuming 164w of power and only producing around 145w of heat.


Really not sure what you're laughing at, physics agrees with the point I'm making.
If a card truly had a TDP limit of 177W and drew 800W, it would go up like a Roman candle. Nice try, claiming these O/C cards aren't converting the extra power they use into heat. We know virtually all of it is just that: heat.
 

Dayaks

Supreme [H]ardness
Unless it's moving something (air, vehicle, vibrations), the power turns into heat or light. Something on the electromagnetic spectrum.

Edit: I guess you could say a 100% efficient card would consume 5W, dissipate 0 vibrations or heat, and pass on all the electrons through the HDMI cable (or what have you) to the monitor.

Considering the display cable can handle what, 5 watts? We can pretty much say the vast majority of energy used is turned into heat or vibrations...
 