GTX 980 May Lose The Performance Crown To AMD’s R9 380X in February 2015

Discussion in 'AMD Flavor' started by fanboy, Sep 29, 2014.

  1. xLokiX

    xLokiX Limp Gawd

    Messages:
    262
    Joined:
    Jun 10, 2007
    Because, if you can't fight for a company whose sole purpose is to separate you from as much of your money as it can, then what else is there to fight for? Peace? Love? Freedom? No, for these are trivial things; graphics cards, however... they make the world a better place. Long live <insert company name here>!!!
     
  2. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,886
    Joined:
    Mar 5, 2005
    Incorrect. You've blown past the reference settings, ergo the reference TDP no longer applies.

    This should be obvious.

    I'm saying it's easier to cool because it's designed to output only 165w of heat rather than 250w of heat.

    Are you seriously still missing the point of that entire example? Wow, sad... the lack of reading comprehension is astounding.
     
  3. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    Coming from you, that means less than nothing.
     
  4. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,886
    Joined:
    Mar 5, 2005
    What is that supposed to mean, exactly? A vague attempt at a personal attack without actually addressing any of the topics or issues at hand? Pathetic.

    Anyway, I'm not the one failing spectacularly at understanding the point and purpose of a simple illustrative example, you are. Get back to me when you finally understand the simple concept of a device NOT converting all the power it consumes into heat :rolleyes:
     
  5. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    You request; I generously and selflessly give. If only everyone had the capacity to do the same.
     
  6. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,886
    Joined:
    Mar 5, 2005
    I already responded to that quote ages ago. Saying that the energy consumed by a graphics card that isn't converted into heat is doing "so little work so as to be of little interest to anyone" is far from the truth. Fans alone, before considering any other forms of energy conversion, can account for more than 10w of power consumption that DOES NOT get converted into heat. That's a 10w difference between power consumption and TDP, right off the bat.

    I repeat: I never said graphics cards are energy black holes. I never said they "violate the law of conservation of energy" in any way. A graphics card can consume 177w of power and output 145w of heat without violating any laws of physics.

    I repeat: Graphics cards do not convert 100% of the power they consume into heat. That's why there's a deficit between power consumption and heat output (TDP), with TDP always being lower than actual power consumption.
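
    For illustration only, here is a minimal sketch of the arithmetic being claimed above, taking the poster's own figures at face value (177w drawn, 145w TDP, and roughly 13w of fan power per the EVGA ACX spec cited later in the thread); whether that deficit really exists is exactly what the rest of this argument is about.

    # Hypothetical numbers lifted straight from the posts in this thread; nothing here is measured.
    board_power_draw_w = 177.0  # claimed total power the card consumes
    rated_tdp_w = 145.0         # claimed heat output the cooler is specced for
    fan_power_w = 13.0          # EVGA ACX fans, up to ~13w combined (per a later post)

    non_heat_budget_w = board_power_draw_w - rated_tdp_w
    print("Claimed power-vs-TDP deficit: %.0f w" % non_heat_budget_w)                      # 32 w
    print("Fan share of that deficit: %.0f%%" % (100 * fan_power_w / non_heat_budget_w))   # ~41%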
     
    Last edited: Oct 8, 2014
  7. The Mac

    The Mac [H]ardness Supreme

    Messages:
    4,492
    Joined:
    Feb 14, 2011
    Obviously that's being measured with an APC, minus the system usage.
     
  8. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,192
    Joined:
    Feb 22, 2012
    Every time I think this thread will finally die, it gets more oomph.

    So where's the misunderstanding? I think it's that some people think TDP = total power draw from the PSU. Others think TDP = a design specification of a system to build a cooling solution to. I think that's the root cause of the bantering? :)
     
  9. Lord_Exodia

    Lord_Exodia [H]ardness Supreme

    Messages:
    6,997
    Joined:
    Sep 29, 2005
    If we all really want to have some fun, I can make a thread in the CPU section called...

    "Intel Core i7 Haswell-E May Lose The Performance Crown To AMD's upcoming CPUs in 2015"

    :)
     
  10. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,192
    Joined:
    Feb 22, 2012
    Whoa hold on there... Careful with that one! So hard... To resist troll urge...
     
  11. Parja

    Parja [H]ardForum Junkie

    Messages:
    11,682
    Joined:
    Oct 4, 2002
    WTF are you doing?

    So that 10W the fan is consuming (also, do your research, 10W is a ridiculous amount of fan power)...where do you think that power is going? What form of energy do you think it eventually ends up as? (Here's a tip: that air the fan is moving...the fan is accelerating the molecules in the air, and once that air stops moving, the energy ends up as heat.)
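
    A sketch of the energy-balance point being made here, with nothing measured and the motor/airflow split invented purely for illustration: in steady state, everything the fan draws eventually degrades to heat in the room.

    fan_power_in_w = 10.0                # electrical power into the fan (the disputed figure)
    motor_losses_w = 3.0                 # hypothetical split between motor losses and airflow
    air_kinetic_power_w = fan_power_in_w - motor_losses_w
    # The moving air slows down against the heatsink fins, the case, and the room air,
    # so its kinetic energy turns into heat as well, and the totals reconcile:
    heat_eventually_w = motor_losses_w + air_kinetic_power_w
    assert heat_eventually_w == fan_power_in_w   # 10w in, 10w of heat out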
     
  12. DraginDime

    DraginDime [H]ard|Gawd

    Messages:
    1,426
    Joined:
    Jan 12, 2012
    If the GTX 980 doesn't lose the performance crown to (at least) the R9-390x, then AMD may be in trouble.
     
  13. Grimlaking

    Grimlaking 2[H]4U

    Messages:
    2,912
    Joined:
    May 9, 2006
    I take it you mean in the standalone graphics card arena? Because on that I will agree.

    But business-wise they still bring in billions of dollars a year. Designing and manufacturing purpose-built CPUs and other chips is AMD's bread and butter; see the fact that their chips are used for just about everything in both of the main consoles on the market today.
     
  14. n=1

    n=1 2[H]4U

    Messages:
    2,388
    Joined:
    Sep 2, 2014
    Ah, so that's why these next-gen consoles are utter shit.

    i keed i keed :D
     
  15. Liger88

    Liger88 2[H]4U

    Messages:
    2,657
    Joined:
    Feb 14, 2012

    AMD is still in huge trouble in the CPU department and in making their APU strategy pay off. The graphics department is literally the only part of their business keeping them afloat right now. While they are starting to find success in the non-discrete area, it isn't anything to brag about by a long shot. They've stemmed the bleeding, not stopped it. I still see them being bought out in the next 10 years. They just don't have the influence, R&D, or money they once had.

    Nvidia is going to be in some serious shit in the future themselves which is why they went ARM years ago when they couldn't secure an x86 license.
     
  16. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,886
    Joined:
    Mar 5, 2005
    I did do my research. EVGA specs the fans on the ACX cooler to draw up to 13w combined.

    The stock fan on the GTX 780 / Titan reference cooler also draws a ridiculous amount of power (don't know the exact number on that one) when running at full speed, to the point that you can actually watch the "Power %" graph in EVGA Precision jump up a few percent if you force the fan to 100% speed while the card is idle. Unplugging the GPU fan from the graphics card and powering it from an alternative source can actually make the card slightly less likely to drop out of boost clocks, because it doesn't bump into the power target as easily.

    Turning the fan blades, obviously... How is this even a question?

    The tiny amount of heat generated by the fan jostling air molecules about is not counted as part of TDP, and is therefore irrelevant to the discussion at hand. Most of the power consumed by the fan is spent overcoming inertia and air resistance, not creating heat.
     
    Last edited: Oct 8, 2014
  17. samm

    samm [H]ard|Gawd

    Messages:
    1,757
    Joined:
    Jul 17, 2013
    THIS THREAD TITLE IS AWESOME!!!! A card launching 6 months later is gonna steal the performance crown from a 6-month-old card, U DONT SAY!!!!

    The only benefit from this is for the consumer, with price wars between AMD and Nvidia. The way AMD is pricing (or dropping prices on) their current lineup, expect the same for the 980 six months from now. No more crazy $800 pricing schemes.
     
  18. DraginDime

    DraginDime [H]ard|Gawd

    Messages:
    1,426
    Joined:
    Jan 12, 2012
    Yeah, I wasn't damning the company as a whole. :D Just saying if they put out a GPU 6 months later that's just matching what's already out there, they are in trouble. I wouldn't want Nvidia vs AMD to be like Intel vs AMD. Intel owns the high-end CPU market and can price their products as high as they want because of the lack of competition up there. I think we've witnessed a little bit of that in the past already, since AMD really didn't have an answer for the Titan/Titan Black at the time. I know the 290X is a great card and wasn't horribly far away in performance, but that didn't change the $1k price on those Titans.
     
    Last edited: Oct 8, 2014
  19. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,192
    Joined:
    Feb 22, 2012
    Since there are leaked pictures of shrouds with watercooling holes cut in them... if true, crazy pricing might just be starting! :)
     
  20. Trimlock

    Trimlock [H]ardForum Junkie

    Messages:
    15,157
    Joined:
    Sep 23, 2005
    This one comes with a LN2 tube pre-installed.
     
  21. clayton006

    clayton006 Gawd

    Messages:
    980
    Joined:
    Jan 4, 2005
    I literally just read this at work while drinking water and spat it all out laughing. (But you may want to extend that to 7 pages now)...

    That being said, I hope AMD still puts up a good fight against Nvidia. Competition is a good thing for all of us. I'd like an 8GB VRAM card and some sweet performance for 4K gaming and beyond!
     
  22. spintroniX

    spintroniX Gawd

    Messages:
    957
    Joined:
    Apr 7, 2009
    goddamn this thread is so full of win. Can a mod please sticky this shit?
     
  23. madgun

    madgun [H]ard|Gawd

    Messages:
    1,767
    Joined:
    May 11, 2006
    This thread serves as a good soap opera... Kudos to the OP, who either played devil's advocate or was an idiot fanboy vying for breathing space.
     
  24. doz

    doz [H]ardness Supreme

    Messages:
    4,853
    Joined:
    Jun 16, 2009
    :eek:
     
  25. fanboy

    fanboy [H]ard|Gawd

    Messages:
    1,057
    Joined:
    Jul 4, 2009
    It brings the best out of us..:p
     
  26. TekRok

    TekRok 2[H]4U

    Messages:
    2,217
    Joined:
    Dec 9, 2005
    I will likely skip the 380 series because I have 290s in Crossfire. I just don't see the benefit of constantly jumping from one generation to the next. It's a bad financial choice as well because you will never recoup the money for the previous card; might as well just wait until it can't perform the task at hand.
     
  27. Lord_Exodia

    Lord_Exodia [H]ardness Supreme

    Messages:
    6,997
    Joined:
    Sep 29, 2005
    I sell when the card is fairly new most of the time. If you sell your 290Xs before the new cards come out, you'll get back much more cash than if you wait for the 490Xs to release.

    I dunno if that's correct for everybody but that's what I usually do. ;)
     
  28. Araxie

    Araxie [H]ardness Supreme

    Messages:
    6,378
    Joined:
    Feb 11, 2013
    ^agree.. I do the same, especially when I have a spare GPU to tide me over for the month before release.. :D..
     
  29. outers

    outers Limp Gawd

    Messages:
    317
    Joined:
    Dec 10, 2013
    So keep a generation-old GPU around to power the displays and sell the current one a month before release so you don't see its value drop in half?
     
  30. Sodapopjones

    Sodapopjones [H]ard|Gawd

    Messages:
    1,833
    Joined:
    Aug 22, 2012
  31. Comixbooks

    Comixbooks Ignore Me

    Messages:
    13,650
    Joined:
    Jun 7, 2008
    Power efficiency will be through the roof.
     
  32. dccmadams

    dccmadams [H]ard|Gawd

    Messages:
    2,010
    Joined:
    Nov 25, 2007
    I have had many AMD cards in the past. I hope they can make a quiet, cool-running, powerful card in the near future. I will buy whatever, from whoever. Nvidia has just been better for the last few generations. I don't mind buying and selling every generation if there is something interesting to experience. It is a hobby as far as I am concerned. There sure was a lot of fanboyism and passion on this topic...
     
  33. Relayer

    Relayer [H]ard|Gawd

    Messages:
    1,527
    Joined:
    Jun 5, 2011
    You do realize that people make non-reference AMD cards, don't you? They are both cool running and quiet.
     
  34. primetime

    primetime [H]ardness Supreme

    Messages:
    6,012
    Joined:
    Aug 17, 2005
    I'm gonna hold out for the next die shrink... then I'm jumping in at the first good sale price. My 7970 will hold me over till then, np.
     
  35. w00t69

    w00t69 Limp Gawd

    Messages:
    337
    Joined:
    Aug 9, 2004
    The title of this thread is absolutely hilarious.

    *May* lose the performance crown? Every card ever made will always lose its performance crown. This is the nature of incremental and substantial improvements in the industry.

    And the only reason I picked up a GTX 980 is because I upgraded from a GTX 580, so it was a MASSIVE performance upgrade for me, and I didn't feel like buying the price-compromise part this time around (the 970), even if it is an exceptional part.