GTX 980 May Lose The Performance Crown To AMD’s R9 380X in February 2015

Discussion in 'AMD Flavor' started by fanboy, Sep 29, 2014.

  1. fixedmy7970

    fixedmy7970 Gawd

    Messages:
    524
    Joined:
    Jul 20, 2013
    So many nvidian mutants in here. To all of ya: I can't wait to post benchies of the mighty 390X destroying your GTX Prius.

    And here's some more pics of your GTX Prius watt guzzling:
    [IMG: power consumption chart]

    Yeah, GTX Prius using 240 watts. In Tom's own words, the "new graphics card's increased efficiency is largely attributable to better load adjustment and matching." Leave it to nvidians to release a TDP of 150W for their reference 970 and then ship non-reference 970s with 240W+ TDPs.

    #MAXFAIL
     
    Last edited: Oct 5, 2014
  2. Kor

    Kor 2[H]4U

    Messages:
    2,175
    Joined:
    Mar 31, 2010
    Are you a real person?
     
  3. DejaWiz

    DejaWiz Oracle of Unfortunate Truths

    Messages:
    19,059
    Joined:
    Apr 15, 2005
    A factory-OC 970 for $330-370 lays waste to any previously released GPU, but because it consumes more power than a stock-clocked card you're complaining about it, all while saying the R9 390X will need water cooling and that power consumption doesn't matter because it's going to own everything?
     
    Last edited by a moderator: Oct 5, 2014
  4. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,886
    Joined:
    Mar 5, 2005
    TDP = Thermal Design Power = The amount of heat the cooling system is designed to cope with, in watts.

    As an extreme example: A card can consume 800w of power, and have a 100w TDP. That just means it's INCREDIBLY efficient, and is using 700w to do actual work and only converting 100w into waste heat.

    All your chart shows is that Maxwell is, in fact, very efficient. It's drawing ALL that power, using most of it up doing real work, and only producing 150w of heat. Not bad.

    The reference model is only pulling 177w in your chart. Not seeing the big deal here.

    The overclocked card is pulling significantly more power, but that's generally the case with ANY overclocked card. Your complaint seems to be with MSI's non-reference-specced card, not Maxwell in general.
     
    Last edited: Oct 5, 2014
  5. Lord_Exodia

    Lord_Exodia [H]ardness Supreme

    Messages:
    6,997
    Joined:
    Sep 29, 2005
    Dayum, OwnD! Thanks, Fixedmy7970, for pointing out how unbelievably awesome Maxwell is. You should come to Nvidia's side. Water is good. :D
     
  6. Tamlin_WSGF

    Tamlin_WSGF 2[H]4U

    Messages:
    2,974
    Joined:
    Aug 1, 2006
    Lol! If a card consumes 800W of power, most of that wattage (99.9999 percent of it) will be transformed into heat. As you might remember from school, "energy cannot be destroyed or created, only transformed" (law of conservation of energy).

    A card can be more efficient by doing more calculations per watt, but the energy used will still be transformed into heat. The energy (watts) doesn't disappear when the GPU uses it; it only gets transformed.

    The rated TDP of a card is simply the heat produced under certain circumstances (maximum heat at a chosen load).
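    A minimal sketch of that energy bookkeeping, in Python; the wattages below are assumptions chosen for illustration, not measurements from this thread:

    # Rough energy balance for a graphics card under load.
    # Nearly all of the electrical input ends up as heat; the small
    # remainder leaves as fan work and the display signal.
    electrical_in_w = 177.0    # assumed total board power draw
    fan_mechanical_w = 3.0     # assumed useful work done moving air
    display_signal_w = 0.5     # assumed power leaving over the display cable

    heat_w = electrical_in_w - fan_mechanical_w - display_signal_w
    heat_fraction = heat_w / electrical_in_w

    print(f"Dissipated as heat: {heat_w:.1f} W ({heat_fraction:.1%})")
    # -> Dissipated as heat: 173.5 W (98.0%)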
     
  7. n=1

    n=1 2[H]4U

    Messages:
    2,388
    Joined:
    Sep 2, 2014
    I cannot BELIEVE you guys are still taking fixedmy7970 seriously at this point...
     
  8. fixedmy7970

    fixedmy7970 Gawd

    Messages:
    524
    Joined:
    Jul 20, 2013
    Brah, believe it: Nvidia won this last battle with better compression and load adjusting, something AMD will also do and fix, like with frame stuttering. You nvidians know the 390X is gonna smoke your puny GTX Prius; that's why you're hating.
     
  9. DejaWiz

    DejaWiz Oracle of Unfortunate Truths

    Messages:
    19,059
    Joined:
    Apr 15, 2005
    We can't hate something that isn't out and won't be reviewed for months. And given AMD's track record for sensationalism prior to a product launch, well, remember Bulldozer?
     
  10. fixedmy7970

    fixedmy7970 Gawd

    Messages:
    524
    Joined:
    Jul 20, 2013
    Remember Maxwell, where Nvidia claims 2x the performance over Kepler when it's really 20%?
     
  11. xLokiX

    xLokiX Limp Gawd

    Messages:
    262
    Joined:
    Jun 10, 2007
    It is 2x the power of a GTX 680, which is the card it's technically replacing. And when the 980 Ti comes out, it should be 2x the power of the 780 Ti, which is the card that it will replace.
     
  12. Syphon Filter

    Syphon Filter 2[H]4U

    Messages:
    2,595
    Joined:
    Dec 19, 2003
    Quite. It's not exactly rocket science, is it...
     
  13. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    Which one? The W9100, the highest-end FirePro, uses the same chip as the 290X: Hawaii XT. That chip, as used on both parts, is fully enabled: all they could have done to push past 290X performance was unrestrict DP performance, which would have yielded minimal gains in gaming scenarios, and raise clock speeds, for which they have little to no headroom. As I already said, the 290X is the practical limit of what AMD is capable of in single-GPU configurations. There's no wonderful chip in their technology stack that is currently more capable than the one they sell you in the 290X.

    The Titan, on the other hand, had an entirely different chip than what was previously available on consumer parts. With the Titan, NVIDIA brought down a chip it had intended to bring to consumers earlier, and would have were it not for AMD's lack of competitiveness.

    I never suggested it wasn't. As previously alluded to by other posters, the Titan was an opportunity NVIDIA seized due to AMD's lack of competitiveness.

    The only work a GPU does is to 'move' electrons around. That work is measured in thousandths of a watt. Computing pixel color values is not work.

    The work being done by the fan is multiple orders of magnitude greater than what's being done in the GPU.
     
  14. fanboy

    fanboy [H]ard|Gawd

    Messages:
    1,057
    Joined:
    Jul 4, 2009
    Why would it replace the GTX 680 if the GTX 770 is a GTX 680 done better?
     
  15. xLokiX

    xLokiX Limp Gawd

    Messages:
    262
    Joined:
    Jun 10, 2007
    The 770 is just a 680 with slightly faster VRAM. The 980 is still 2x as fast as the 770.
     
  16. Lord_Exodia

    Lord_Exodia [H]ardness Supreme

    Messages:
    6,997
    Joined:
    Sep 29, 2005
    What he's trying to say is that when Nvidia announced Maxwell, they didn't say it would be 2x as fast as the GTX 780; they said it would deliver 2x the performance per watt of the Kepler architecture they were just launching (the GTX 680 at the time). The entire Kepler refresh was built on the GK110 GPU, which was their flagship Quadro chip. Nvidia didn't even need to spin off a full-on architecture refresh to fight AMD last round; they just reused the GK110 that had launched several months earlier as a Quadro GPU.
     
  17. xLokiX

    xLokiX Limp Gawd

    Messages:
    262
    Joined:
    Jun 10, 2007
    Pretty much, just didn't want to take any potshots. I know the butthurt is strong enough in this thread already. JK
     
  18. Lord_Exodia

    Lord_Exodia [H]ardness Supreme

    Messages:
    6,997
    Joined:
    Sep 29, 2005
    Adding to my last comment: they did improve the GK110 a little and came out with a rev B, but that wasn't a full-on architecture refresh like we've seen in the past.

    Anyway, I hope Nvidia does lose the performance crown. I'm scared they will get complacent like Intel if AMD doesn't stuff a boot up their ass. The last thing we need is incremental GPU upgrades of 10-15% each generation :eek:
     
  19. LordEC911

    LordEC911 [H]ard|Gawd

    Messages:
    1,461
    Joined:
    Jul 7, 2013
    Of course it wasn't an architectural refresh; a metal respin almost never is, unless there's something majorly wrong with the ASIC design (see Fermi).
     
  20. ebduncan

    ebduncan [H]ard|Gawd

    Messages:
    2,005
    Joined:
    Feb 1, 2008

    You don't have to worry about AMD not being competitive with Nvidia. Nvidia will finish strong in 2014, but the tide will change in 2015. As always, if you need a graphics card today it's hard to look past the 970 or the 980. However, if you can wait, AMD will not disappoint with their next release. All in all, I think everyone is in for a big surprise, and I strongly feel the next-generation AMD cards will make the current Maxwell cards look low-end. Just a hunch, though.
     
  21. runudownquick

    runudownquick Gawd

    Messages:
    831
    Joined:
    Nov 11, 2008
    I guess every forum needs its own set of trolls...
    Would be nice to see a Tonga-based flagship of sorts undercut the 970 on price, though I have doubts at this point whether it'd be worth it.

    R9 3xx series leaks can't come fast enough.
     
  22. Hakaba

    Hakaba Gawd

    Messages:
    639
    Joined:
    Jul 22, 2013
    Funny how "may" and "speculation" lead to so many green-vs-red arguments/bashing/hatred/sheep-calling.

    Oh well, flashed 6950, just hold out a little longer and we'll see what AMD brings to the table next. It may just be another "buy whatever offers the best price-to-performance ratio" decision again.
     
  23. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,886
    Joined:
    Mar 5, 2005
    As I said, a card with a 100w TDP that consumes 800w of power would imply a VERY efficient card that uses most of the power for actual work (converting very little into waste heat).

    Go back and re-read my post. This example scenario was clearly stated.

    You're assuming 100% of the electricity consumed by the card is converted into heat. This is incorrect. If TDP were calculated accurately, the maximum power consumption of the card would always be higher than the TDP by some amount.

    This does not invalidate what I said. A card that consumes 800w of power and only produces 100w of heat is possible (again, as an EXTREME example).

    This was simply meant to illustrate that TDP doesn't equate to power consumption. The two CAN be wildly different.
     
    Last edited: Oct 5, 2014
  24. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    What work? What do you believe is going on within a GPU? What, in your theoretical example, would require the equivalent of one horsepower to perform? That's enough power to propel a typical scooter, and its rider, at speeds up to and in excess of 40 feet per second.

    It assumes the overwhelming majority is converted to heat. Graphics cards are not energy black holes, nor do they violate the law of conservation of energy. They consume electrical energy, do so little work with it as to be of little interest to anyone, and expend the vast majority of that energy in the form of heat.
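    For scale, a quick conversion of the roughly 700W of supposed non-heat output from the earlier hypothetical (the 800W draw / 100W TDP example) into horsepower; the watts-per-horsepower constant is standard, the rest is just arithmetic:

    # Express the hypothetical 700 W of "useful work" in horsepower.
    WATTS_PER_HP = 745.7          # one mechanical horsepower, a standard constant

    hypothetical_work_w = 700.0   # from the 800 W draw / 100 W TDP example above
    print(f"{hypothetical_work_w / WATTS_PER_HP:.2f} hp")
    # -> 0.94 hp, i.e. roughly one horsepower of supposed non-heat output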
     
  25. -Strelok-

    -Strelok- [H]ardForum Junkie

    Messages:
    10,002
    Joined:
    Dec 2, 2010
    Video cards do tons of work. They make so much graphics energy! What is this heat you speak of?
     
  26. n=1

    n=1 2[H]4U

    Messages:
    2,388
    Joined:
    Sep 2, 2014
    Actually, the 980 is pretty forgettable, tbh, unless you're the type that just has to have the latest and greatest at whatever cost.

    Now the 970 is truly shaping up to be the next 8800 GTX (almost caught myself typing GTX 8800, damn nVidia and the confusing names :D), and that price/performance ratio is going to give AMD massive headaches until they release something to compete. For now, all AMD can do is drop prices on the R9 290(X), and indeed that's what's happening.

    I've no doubt AMD will come up with an answer, but timing is everything in the GPU market. If they don't get anything out before the end of this year, they're going to lose out massively during the holiday sales. They could keep cutting prices, but you can only go so low before you cut into your margins.
     
  27. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,886
    Joined:
    Mar 5, 2005
    Already covered this, see again:
    It was an extreme example to illustrate the point, nothing more. I'm not saying such a card would exist; it's simply a hilariously over-extended example of how TDP is NOT directly linked to power consumption.

    Go ahead, replace those numbers with ones from the real world. A card with a 145w TDP drawing 177w. It doesn't matter; my point stands. TDP still isn't 1:1 with power consumption, and is still lower than power consumption, EXACTLY as expected.

    I didn't say it wasn't. I clearly pointed out that the GTX 970 in his chart (a card with a 145w TDP) is drawing 177w. That's 82% of the power the card is drawing being converted into heat.

    So yeah, I already covered this.

    Never said they were, never said they did.

    They do, however, use some power without converting it into heat.

    Yup, as I said, 82% in the case of the GTX 970.

    But, here's the thing, NONE of what you said has any real relevance. My point remains, TDP doesn't equate to actual power consumption, and TDP will always be lower than actual power consumption.
     
    Last edited: Oct 6, 2014
  28. ebduncan

    ebduncan [H]ard|Gawd

    Messages:
    2,005
    Joined:
    Feb 1, 2008
    I'm dying of laughter.

    Graphics cards do not do work. They take DC electricity and convert it into heat. Some energy is converted into mechanical energy, mainly for the fans.

    I guess some folks never took physics. Sorry, I couldn't help myself.
     
  29. n=1

    n=1 2[H]4U

    Messages:
    2,388
    Joined:
    Sep 2, 2014
    Wait, you mean pushing electrons around is NOT doing work now? My entire life has been a lie...
     
  30. Lord_Exodia

    Lord_Exodia [H]ardness Supreme

    Messages:
    6,997
    Joined:
    Sep 29, 2005
    Performing calculations, converting them into 3D, and sending them to the monitor is not work?

    Oh wait... LOL, are you splitting hairs here and saying the card itself doesn't do the work but the GPU <-- does? Is that the inside joke I'm missing? (Edited your quote to emphasize.)
     
  31. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,886
    Joined:
    Mar 5, 2005
    Graphics cards DO NOT convert 100% of the energy they consume into heat. This is a fact.

    Therefore, TDP (thermal design power) will always be lower than ACTUAL power consumption. Exactly as I've stated, repeatedly.

    I don't know the wattage of the fan on the GTX 970's stock cooler, but I do know the fans on the EVGA ACX cooler pull about 13w when spun all the way up. Even accounting for that, that still leaves the rest of the GTX 970 consuming 164w of power and only producing around 145w of heat.

    Really not sure what you're laughing at; physics agrees with the point I'm making.
     
    Last edited: Oct 6, 2014
  32. LordEC911

    LordEC911 [H]ard|Gawd

    Messages:
    1,461
    Joined:
    Jul 7, 2013
    I'm sorry but I agree with the "dying of laughter" post.

    Where do you think that energy goes?
     
  33. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    I wish I could be laughing my ass off like the rest of you, but the ignorance just upsets me.

    I don't care. The question is as applicable regardless of the extremity of the example. What work would this theoretical graphics card be doing with the equivalent of one horsepower of energy? Define the work being done.

    No. The TDP is a constant: it is defined by NVIDIA. Its TDP is 145W regardless of whether the card draws 6W (idling) or 260W (transient load). A graphics card drawing 177W expends approximately 177W of energy as heat. It's not some magical ratio of its power consumption to its rated TDP.

    No. See above. Furthermore, your so-called "point" was that a graphics card's efficiency is a function of how much power it draws relative to its TDP, not that the rated TDP will always be lower than power consumption (which is also comically wrong). You're trying to slowly dilute your argument so that it seems more agreeable, but you're doing quite a poor job of it.

    Not in any physical sense, no.
     
  34. ebduncan

    ebduncan [H]ard|Gawd

    Messages:
    2,005
    Joined:
    Feb 1, 2008
    By the definition of work, it does not.

    There is no inside joke here; GPUs do not perform work.

    You're dead wrong. ALL of the electrical energy consumed by the graphics card is directly converted to heat. Unless the graphics card is breaking the laws of physics, this holds true. Basically, what you're saying is that 100% of the power goes in and gets lost somewhere.

    Here is the definition of the law of conservation of energy:

    "The law of conservation of energy states that the total amount of energy in a system remains constant ("is conserved"), although energy within the system can be changed from one form to another or transferred from one object to another. Energy cannot be created or destroyed, but it can be transformed."

    There are three types of energy possible in a graphics card: electrical energy, mechanical energy, and heat energy. Electrical energy goes in; it can be transformed either into mechanical energy (i.e. fans or pumps) or into heat.

    I'm still laughing.
     
  35. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,860
    Joined:
    Sep 7, 2011
    Not to derail the thread more, but this discussion is quite interesting.

    My question is: if 100% of the electrical energy input is converted to heat, why would we need a ground/earth line?
     
  36. The Mac

    The Mac [H]ardness Supreme

    Messages:
    4,492
    Joined:
    Feb 14, 2011
    Electrical energy requires a return path by way of a closed circuit from the energy source (potential). All electrons flow towards ground.

    A ground is a convenient way of indicating that return path versus the input.

    A common ground allows all circuits to share the same return path, greatly simplifying design.

    So, for example, you take a battery, ground the anode (-), and the cathode (+) becomes your signal; you send it through a lightbulb, then ground the other side. (Conventional current flows from + to -, which seems backwards because the electrons themselves drift from - to +; the "signal" follows the positive holes left where the electrons have flowed from.)

    Now you have a complete circuit.

    It's an oversimplification, but good enough for basic theory.
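    A worked number for that battery-and-bulb loop, using Ohm's law; the voltage and resistance below are assumed purely for illustration:

    # Closed circuit: battery (+) -> lightbulb -> ground/return path -> battery (-).
    # Assumed example values.
    battery_v = 1.5               # volts, e.g. a single cell
    bulb_resistance_ohm = 10.0

    current_a = battery_v / bulb_resistance_ohm   # Ohm's law: I = V / R
    power_w = battery_v * current_a               # P = V * I, dissipated in the bulb

    print(f"Current around the loop: {current_a:.2f} A")
    print(f"Power dissipated in the bulb: {power_w:.3f} W")
    # The same current returns through ground; the energy it delivered
    # is released in the bulb, mostly as heat and light.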
     
    Last edited: Oct 6, 2014
  37. Relayer

    Relayer [H]ard|Gawd

    Messages:
    1,527
    Joined:
    Jun 5, 2011
    If a card truly had a TDP limit of 177W and drew 800W, it would go up like a Roman candle. Nice try, claiming that these OC'd cards aren't converting the extra power they use into heat. We know virtually all of it is just that: heat.
     
  38. The Mac

    The Mac [H]ardness Supreme

    Messages:
    4,492
    Joined:
    Feb 14, 2011
    Pretty basic thermodynamics, really...
     
  39. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,199
    Joined:
    Feb 22, 2012
    Unless it's moving something (air, a vehicle, vibrations), the power turns into heat or light: something on the electromagnetic spectrum.

    Edit: I guess you could say a 100% efficient card would consume 5W, dissipate no heat or vibration, and pass all the electrons on through the HDMI cable (or what have you) to the monitor.

    Considering the display cable can handle what, 5 watts? We can pretty much say the vast majority of the energy used is turned into heat or vibrations...
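    Putting the thread's own figures together gives a rough upper bound on the non-heat share; the 177W board draw, the ~13W fan figure, and the ~5W display-cable ceiling are all quoted in earlier posts, and are treated here as generous upper bounds:

    # Upper bound on how much of the card's draw could leave as anything
    # other than heat, using numbers quoted earlier in the thread.
    board_power_w = 177.0   # measured draw of the reference GTX 970 (from the chart)
    fan_power_w = 13.0      # quoted fan draw at full speed (itself ending up as heat anyway)
    signal_power_w = 5.0    # generous ceiling for power leaving over the display cable

    non_heat_fraction = (fan_power_w + signal_power_w) / board_power_w
    print(f"At most {non_heat_fraction:.0%} leaves as non-heat; "
          f"at least {1 - non_heat_fraction:.0%} becomes heat.")
    # -> At most 10% leaves as non-heat; at least 90% becomes heat.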
     
  40. -Strelok-

    -Strelok- [H]ardForum Junkie

    Messages:
    10,002
    Joined:
    Dec 2, 2010
    So what exactly is TDP?