GTX 980 May Lose The Performance Crown To AMD’s R9 380X in February 2015

Because if you can't fight for a company whose sole purpose is to separate you from as much of your money as it can, then what else is there to fight for? Peace? Love? Freedom? No, for these are trivial things. Graphics cards, however... they make the world a better place. Long live <insert company name here>!!!
 
So if I buy a 95w CPU and overclock it from 3.4GHz to 5.0GHz then it is only dumping 95w of heat into my case. Awesome! Thx!
Incorrect. You've blown past the reference settings, ergo the reference TDP no longer applies.

This should be obvious.
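For a rough sense of why the reference TDP goes out the window once you overclock: dynamic power scales roughly linearly with clock speed and with the square of core voltage. A quick back-of-the-envelope sketch (the voltage figures here are made-up illustrative values, not specs for any particular chip):

```python
# Rough dynamic-power scaling: P ~ C * V^2 * f
# The voltages below are hypothetical examples, not taken from any datasheet.
base_power_w = 95.0    # rated TDP at stock settings
base_clock_ghz = 3.4
base_vcore = 1.25      # assumed stock core voltage (illustrative)

oc_clock_ghz = 5.0
oc_vcore = 1.40        # assumed overclocked core voltage (illustrative)

scaled_power_w = base_power_w * (oc_clock_ghz / base_clock_ghz) * (oc_vcore / base_vcore) ** 2
print(f"Estimated heat at 5.0 GHz: ~{scaled_power_w:.0f} W")  # ~175 W, nearly double the 95 W rating
```

Even ignoring the extra leakage you get at higher voltage, the stock 95w figure clearly stops meaning anything.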

No, he is saying it is easier to cool because it is more efficient! LoL
I'm saying it's easier to cool because it's designed to output only 165w of heat rather than 250w of heat.

Only if you buy the model with the attached leaf blower!
Are you seriously still missing the point of that entire example? Wow, sad... the lack of reading comprehension is astounding.
 
Coming from you, that means less than nothing.
What is that supposed to mean, exactly? A vague attempt at a personal attack without actually addressing any of the topics or issues at hand? Pathetic.

Anyway, I'm not the one failing spectacularly at understanding the point and purpose of a simple illustrative example, you are. Get back to me when you finally understand the simple concept of a device NOT converting all the power it consumes into heat :rolleyes:
 
Get back to me when you finally understand the simple concept of a device NOT converting all the power it consumes into heat :rolleyes:
Graphics cards are not energy black holes, nor do they violate the law of conservation of energy. They consume electrical energy, doing so little work with it so as to be of little interest to anyone, and expend the vast majority of that energy in the form of heat.
You request; I generously and selflessly give. If only everyone had the capacity to do the same.
 
You request; I generously and selflessly give. If only everyone had the capacity to do the same.
I already responded to that quote ages ago. Saying that the energy consumed by a graphics card that isn't converted into heat is doing "so little work so as to be of little interest to anyone" is far from the truth. Fans alone, before considering any other forms of energy conversion, can account for more than 10w of power consumption that DOES NOT get converted into heat. That's a 10w difference between power consumption and TDP, right off the bat.

I repeat: I never said graphics cards are energy black holes. I never said they "violate the law of conservation of energy" in any way. A graphics card can consume 177w of power and output 145w of heat without violating any laws of physics.

I repeat: Graphics cards do not convert 100% of the power they consume into heat. That's why there's a deficit between power consumption and heat output (TDP), with TDP always being lower than actual power consumption.
 
Every time I think this thread will finally die, it gets more oomph.

So where's the misunderstanding? I think it's that some people think TDP = total power draw from the PSU, while others think TDP = a design specification that the cooling solution is built to handle. I think that's the root cause of the bantering? :)
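To make those two readings concrete, here's a tiny sketch using the 165w figure from earlier in the thread plus a hypothetical board-power measurement (the 180w number is purely illustrative, not a benchmark):

```python
# Two readings of "TDP", using the 165w rating discussed above and a
# hypothetical measured board power of 180w under load.
tdp_rating_w = 165.0            # what the vendor sizes the cooling solution for
measured_board_power_w = 180.0  # assumed draw from the PSU under load (illustrative)

# Reading 1: TDP == total power draw from the PSU
#   -> the card should never pull more than 165w, which the measurement contradicts.
# Reading 2: TDP == thermal design target for the cooler
#   -> the heatsink/fan is sized for ~165w of sustained heat, and the card's own
#      power target (mentioned in the EVGA Precision posts below) reins in excursions.
print(f"Rated TDP:             {tdp_rating_w:.0f} W")
print(f"Measured board power:  {measured_board_power_w:.0f} W")
print(f"Gap being argued over: {measured_board_power_w - tdp_rating_w:.0f} W")
```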
 
If we all really want to have some fun, I can make a thread in the CPU section called...

"Intel Core i7 Haswell-E May Lose The Performance Crown To AMD's upcoming CPUs in 2015"

:)
 
If we all really want to have some fun, I can make a thread in the CPU section called...

"Intel Core i7 Haswell-E May Lose The Performance Crown To AMD's upcoming CPUs in 2015"

:)

Whoa hold on there... Careful with that one! So hard... To resist troll urge...
 

WTF are you doing?

I already responded to that quote ages ago. Saying that the energy consumed by a graphics card that isn't converted into heat is doing "so little work so as to be of little interest to anyone" is far from the truth. Fans alone, before considering any other forms of energy conversion, can account for more than 10w of power consumption that DOES NOT get converted into heat. That's a 10w difference between power consumption and TDP, right off the bat.

I repeat: I never said graphics cards are energy black holes. I never said they "violate the law of conservation of energy" in any way. A graphics card can consume 177w of power and output 145w of heat without violating any laws of physics.

I repeat: Graphics cards do not convert 100% of the power they consume into heat. That's why there's a deficit between power consumption and heat output (TDP), with TDP always being lower than actual power consumption.

So that 10W the fan is consuming (also, do your research: 10W is a ridiculous amount of fan power)... where do you think that power is going? What form of energy do you think it eventually ends up as? (Here's a tip: that air the fan is moving... the fan is accelerating the molecules in the air, and once that air stops moving, that energy ends up as heat.)
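To put a rough number on that: the mechanical power actually carried off by the moving air is tiny, and even that fraction thermalizes once the air slows down. A quick sketch, where the airflow and outlet-speed figures are made-up ballpark values rather than measurements of any particular cooler:

```python
# Kinetic power carried by the airflow: P_kinetic = 0.5 * rho * Q * v^2
# rho = air density, Q = volumetric flow rate, v = bulk outlet velocity.
# The flow and velocity below are illustrative guesses, not measured values.
rho_air = 1.2                   # kg/m^3
cfm = 30.0                      # assumed airflow through the cooler
q_m3_per_s = cfm * 0.000471947  # convert CFM to m^3/s
v_outlet = 5.0                  # m/s, assumed bulk exit velocity

p_kinetic_w = 0.5 * rho_air * q_m3_per_s * v_outlet ** 2
print(f"Kinetic power in the airflow: ~{p_kinetic_w:.2f} W")  # ~0.2 W
# Everything else the fan draws is already heat (motor losses, turbulence,
# bearing friction), and this last ~0.2 W becomes heat once the air stops moving.
```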
 
If the GTX 980 doesn't lose the performance crown to (at least) the R9-390x, then AMD may be in trouble.
 
If the GTX 980 doesn't lose the performance crown to (at least) the R9-390x, then AMD may be in trouble.

I take it you mean in the standalone graphics card arena? Because on that I will agree.

But business-wise they still bring in billions of dollars a year. Purpose-built CPUs and other custom chips are AMD's bread and butter in CPU and other small-chip design and manufacturing. See the fact that their chips are used for just about everything in both of the main consoles on the market today.
 
Ah, so that's why these next-gen consoles are utter shit.

i keed i keed :D
 
I take it you mean in the standalone graphics card arena? Because on that I will agree.

But business-wise they still bring in billions of dollars a year. Purpose-built CPUs and other custom chips are AMD's bread and butter in CPU and other small-chip design and manufacturing. See the fact that their chips are used for just about everything in both of the main consoles on the market today.


AMD is still in huge trouble in the CPU department and in making their APU strategy pay off. The graphics department is literally the only part of their business keeping them afloat right now. While they are starting to find success in the non-discrete area, it isn't anything to brag about by a long shot. They've stemmed the bleeding, not stopped it. I still see them being bought out in the next 10 years. They just don't have the influence, R&D, or money they once had.

Nvidia is going to be in some serious shit in the future themselves, which is why they went ARM years ago when they couldn't secure an x86 license.
 
So that 10W the fan is consuming (also, do your research: 10W is a ridiculous amount of fan power)
Did do my research. EVGA specs the fans on the ACX cooler to draw up to 13w combined.

The stock fan on the GTX 780 / Titan reference cooler also draws a ridiculous amount of power (don't know the exact number on that one) when running at full speed, to the point that you can actually watch the "Power %" graph in EVGA Precision jump up a few percent if you force the fan to 100% speed while the card is idle. Unplugging the GPU fan from the graphics card and powering it from an alternative source can actually make the card slightly less likely to drop out of boost clocks, because it doesn't bump into the power target as easily.

...where do you think that power is going?
Turning the fan blades, obviously... How is this even a question?

That air that fan is moving...it's accelerating the molecules in the air, and once that air stops moving, ends up as heat.
The tiny amount of heat generated by the fan jostling air molecules about is not counted as part of TDP, and is therefore irrelevant to the discussion at hand. Most of the power consumed by the fan is spent overcoming inertia and air resistance, not creating heat.
 
THIS THREAD TITLE IS AWESOME!!!! A card launching 6 months later is gonna steal the performance crown from a 6-month-old card, U DON'T SAY!!!!

The only benefit from this is for the consumer, with price wars between AMD and Nvidia. The way AMD is pricing (or dropping the prices on) their current lineup, expect the same for the 980s 6 months from now. No more crazy $800 pricing schemes.
 
I take it you mean in the standalone graphics card arena? Because on that I will agree.

But business-wise they still bring in billions of dollars a year. Purpose-built CPUs and other custom chips are AMD's bread and butter in CPU and other small-chip design and manufacturing. See the fact that their chips are used for just about everything in both of the main consoles on the market today.

Yeah, I wasn't damning the company as a whole. :D Just saying that if they put out a GPU 6 months later that's just matching what's already out there, they're in trouble. I wouldn't want Nvidia vs AMD to become like Intel vs AMD. Intel owns the high-end CPU market and can price their products as high as they want because of the lack of competition up there. I think we've already witnessed a little bit of that in the past, since AMD really didn't have an answer for the Titan/Titan Black at the time. I know the 290X is a great card and wasn't horribly far away in performance, but that didn't change the $1k price on those Titans.
 
THIS THREAD TITLE IS AWESOME!!!! A card launching 6 months later is gonna steal the performance crown from a 6-month-old card, U DON'T SAY!!!!

The only benefit from this is for the consumer, with price wars between AMD and Nvidia. The way AMD is pricing (or dropping the prices on) their current lineup, expect the same for the 980s 6 months from now. No more crazy $800 pricing schemes.

Since there are leaked pictures of shrouds with watercooling holes cut in them... if true, crazy pricing might just be starting! :)
 
If we all really want to have some fun, I can make a thread in the CPU section called...

"Intel Core i7 Haswell-E May Lose The Performance Crown To AMD's upcoming CPUs in 2015"

:)

This one comes with an LN2 tube pre-installed.
 
My next thread.

AMD 390X may lose the performance crown to NVIDIA's next-generation card in 12 months. I wonder if it will troll along to 6 pages of dogshit.

I literally just read this at work while drinking water and spat it all out laughing. (But you may want to extend that to 7 pages now)...

That being said, I hope AMD still puts up a good fight against Nvidia. Competition is a good thing for all of us. I'd like to have an 8GB VRAM card and some sweet performance for 4K gaming and beyond!
 
This thread serves as a good soap opera. Kudos to the OP, who either played devil's advocate or was an idiot fanboy vying for breathing space.
 
I will likely skip the 380 series because I have 290s in Crossfire. I just don't see the benefit of constantly jumping from one generation to the next. It's a bad financial choice as well, because you will never recoup the money spent on the previous card; might as well just wait until it can't perform the task at hand.
 
I will likely skip the 380 series because I have 290s in Crossfire. I just don't see the benefit of constantly jumping from one generation to the next. It's a bad financial choice as well, because you will never recoup the money spent on the previous card; might as well just wait until it can't perform the task at hand.

I sell when the card is fairly new most of the time. If you sell your 290Xs before the new cards come out, you'll get back much more cash than if you wait for the 490Xs to release.

I dunno if that's correct for everybody but that's what I usually do. ;)
 
I sell when the card is fairly new most of the time. If you sell your 290Xs before the new cards come out, you'll get back much more cash than if you wait for the 490Xs to release.

I dunno if that's correct for everybody but that's what I usually do. ;)

^Agreed... I do the same, especially when I have a spare GPU to get me through the month before release. :D
 
^Agreed... I do the same, especially when I have a spare GPU to get me through the month before release. :D

So keep a generation-old GPU around to power the displays, and sell the current one a month before release so you don't see the value of your card drop in half?
 
I have had many AMD cards in the past. I hope they can make a quiet, cool-running, powerful card in the near future. I will buy whatever, from whomever. Nvidia has just been better for the last few generations. I don't mind buying and selling every generation if there is something interesting to experience. It is a hobby as far as I am concerned. There sure was a lot of fanboyism and passion on this topic...
 
I have had many AMD cards in the past. I hope they can make a quiet, cool-running, powerful card in the near future. I will buy whatever, from whomever. Nvidia has just been better for the last few generations. I don't mind buying and selling every generation if there is something interesting to experience. It is a hobby as far as I am concerned. There sure was a lot of fanboyism and passion on this topic...

You do realize that people make non-reference AMD cards, don't you? They are both cool-running and quiet.
 
I'm gonna hold out for the next die shrink... then I'm jumping in at the first good sale price... my 7970 will hold me over till then, np.
 
The title of this thread is absolutely hilarious.

*May* lose the performance crown? Every card ever made will eventually lose its performance crown. That's the nature of incremental and substantial improvements in this industry.

And the only reason I picked up a GTX 980 is that I upgraded from a GTX 580, so it was a MASSIVE performance upgrade for me, and I didn't feel like settling for the price-compromise part this time around (the 970), even if it is an exceptional part.
 