Maxwell reviews starting to appear

Discussion in 'nVidia Flavor' started by Stoly, Sep 18, 2014.

  1. jojo69

    jojo69 [H]ardForum Junkie

    Messages:
    10,368
    Joined:
    Sep 13, 2009
    agreed, crossfire has been working well for a while as well
     
  2. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    17,187
    Joined:
    Jan 28, 2014
    Indeed. I could go quad-SLI on my 850W PSU with the 980 if I had the motherboard for it. It would be just a little more power draw with 4 980s compared to running 2 780s right now.
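    (Side note: a minimal back-of-the-envelope sketch of that comparison in Python, using Nvidia's official board TDPs of 165 W for the GTX 980 and 250 W for the GTX 780; actual draw under load will differ.)

[CODE]
# Rough GPU power-budget comparison: 4x GTX 980 vs 2x GTX 780.
# Board TDPs only; measured draw under load will differ.
GTX_980_TDP = 165  # W
GTX_780_TDP = 250  # W

quad_980 = 4 * GTX_980_TDP  # 660 W
dual_780 = 2 * GTX_780_TDP  # 500 W

print(f"4x GTX 980: {quad_980} W of TDP")
print(f"2x GTX 780: {dual_780} W of TDP")
print(f"Difference: {quad_980 - dual_780} W")  # roughly 160 W more
[/CODE]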
     
  3. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    Average frame rates only and no noise testing in this article make it nearly worthless.

    I really don't understand why you guys bother.
     
  4. Lord_Exodia

    Lord_Exodia [H]ardness Supreme

    Messages:
    6,997
    Joined:
    Sep 29, 2005
    It's the grand order of things. Look at the GPUs and CPUs of today: they pull less power, are built on smaller process nodes, run cooler, and still do more and drive higher resolutions than those of years past. My 5nm GPUs will drive 3x 4K 120Hz monitors better than my 28nm Maxwell cards do. Believe it. :)
     
  5. Blackstone

    Blackstone 2[H]4U

    Messages:
    3,062
    Joined:
    Mar 8, 2007
    I am going to buy TWO of these 980 bad boys and SLI them for my 60-inch 1080p plasma. Then I am going to log onto BF4 and crank up the supersampling AA. I still won't be able to aim because I suck.

    As long as 4GB is enough video memory for the next two years at 1080p, this card is awesome.

    One question: they mentioned a feature regarding upsampling at 1080p resolution. I wonder how much video memory that will use. I hope 4GB is enough. That is the ONLY issue here for me.

    By the time I get into 4k there will be a whole new generation of cards available. Once you go Plasma you can't go back.
     
  6. tonyftw

    tonyftw [H]ard|Gawd

    Messages:
    1,817
    Joined:
    Mar 21, 2013
    LOL
     
  7. n=1

    n=1 2[H]4U

    Messages:
    2,322
    Joined:
    Sep 2, 2014
    Possibly, we'll see.
     
  8. jojo69

    jojo69 [H]ardForum Junkie

    Messages:
    10,368
    Joined:
    Sep 13, 2009
    Of course, my point is that we will ALWAYS be asking more of them.

    Also, we are soon going to run up against atomic scale limitations on process shrinkage.
     
  9. Lord_Exodia

    Lord_Exodia [H]ardness Supreme

    Messages:
    6,997
    Joined:
    Sep 29, 2005
    That is true. It looks like silicon is nearing the end of the road and we are moving to better processes.

    I guess my point is that things will always move forward, and the more they are able to refine processes and control leakage, the less power will be required to do the same graphics work or more. Maybe (and hopefully) the days when we need a 250W TDP GPU to do what we need are gone.
     
  10. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,854
    Joined:
    Mar 5, 2005
    Looks like it uses the standard Titan cooler, so we already know how it sounds...
     
  11. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    I wouldn't worry about that too much right now. Intel will hit that wall two to three years before the GPU vendors do, and the GPU vendors still have several tricks up their sleeves that aren't dependent on node size, one of the key ones being DRAM stacking. Past that, NVIDIA and AMD still have the opportunity to move toward a tiled or hybrid ray-traced architecture, where multi-GPU becomes a much more realistic solution. A typical 2020 video card might have an array of discrete GPUs.

    At the end of the day, NVIDIA and AMD will do what they need to do to keep moving performance forward and selling products. If that means the rendering paradigm needs to change, that's what will happen.
     
  12. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    Not without information on the fan profile.
     
  13. jojo69

    jojo69 [H]ardForum Junkie

    Messages:
    10,368
    Joined:
    Sep 13, 2009
    and all that will take power

    that is all I'm saying
     
  14. naticus

    naticus [H]ard|Gawd

    Messages:
    1,962
    Joined:
    Feb 6, 2011
    This.
     
  15. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,854
    Joined:
    Mar 5, 2005
    Which we have, because the RPMs are listed in the tables in the OP's post...

    We know what cooler it is and we know what RPM it's running at in each test. What more do you want? lol
     
  16. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    Where are fan speeds listed?

    EDIT: They're in the power consumption charts. Don't they realize images aren't text-searchable?
     
  17. SilverSliver

    SilverSliver Beat It To Deformation

    Messages:
    10,950
    Joined:
    Feb 23, 2004
    Zero fucks given about power draw unless that means you can OC the tar out of it.
     
  18. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,854
    Joined:
    Mar 5, 2005
    I just told you, click the link in the OP's post and scroll down...

    Here, I copy/pasted it for you. RPM figures are given:

    [IMG: fan RPM and temperature figures copied from the linked review]

    Not bad at all for Nvidia's tried-and-true reference cooler: the 780 Ti runs its fan at 3697 RPM vs. the GTX 980 at 2787 RPM when being stressed with Furmark (and note that the 780 Ti is throttling badly and STILL overheating, while the 980 appears to be doing just fine).
     
  19. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    It will. But supplying power to many ICs and to multiple cards is not overly difficult.
     
  20. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    You quoted the edit. Near as I'm aware, today is not Feign Ignorance Day.
     
  21. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,854
    Joined:
    Mar 5, 2005
    Well, again, Nvidia's current Boost technology monitors temps and power consumption to determine when to IGNORE your overclock and throttle the card.

    A more efficient card = less throttling = your overclock actually means something. You should be overjoyed at these power consumption figures if you're overclocking :D (a rough sketch of the idea is at the end of this post)

    I actually quoted you BEFORE your edit. I updated the quote when you updated your post.

    Figured I might as well post the image anyway (and what I said wasn't wrong, I DID tell you where to get the information :p ), since there seems to be a general lack of reading comprehension on this forum. If it didn't help you, it might help someone else.
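    (For illustration, a loose sketch of the throttling idea described above, in Python. This is not Nvidia's actual Boost algorithm; the limits, base clock, and step size are made-up placeholder values.)

[CODE]
# Toy model of boost-style throttling: the requested overclock only sticks
# while both the power and temperature limits have headroom.
# All limits and constants below are placeholder assumptions.
POWER_LIMIT_W = 250
TEMP_LIMIT_C = 80
BASE_CLOCK_MHZ = 1126
STEP_MHZ = 13  # boost clocks move in small bins

def effective_clock(requested_mhz, power_draw_w, temp_c):
    """Step the clock down until the card sits inside both limits."""
    clock = requested_mhz
    # In this toy model, power is assumed to scale linearly with clock.
    while clock > BASE_CLOCK_MHZ and (
        power_draw_w * clock / requested_mhz > POWER_LIMIT_W or temp_c > TEMP_LIMIT_C
    ):
        clock -= STEP_MHZ
        temp_c -= 0.5  # assume each step sheds a little heat
    return clock

# A more efficient card (less power at the same clock) keeps more of its overclock:
print(effective_clock(1400, power_draw_w=230, temp_c=70))  # holds 1400
print(effective_clock(1400, power_draw_w=290, temp_c=83))  # steps down to ~1205
[/CODE]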
     
  22. Bobalias_LeShay

    Bobalias_LeShay Limp Gawd

    Messages:
    287
    Joined:
    Sep 27, 2013
    How do you SLI on a mini-ITX board with only one PCIe slot? I suppose some people are using SFX PSUs with uATX, but those cases almost always use full-size ATX PSUs. SFX is generally used for mini-ITX, no?
    Personally? Not since my GTX 295. I swore never again. Although it looks like it's gotten substantially better, the same main problems remain: lower performance per dollar, more heat, higher power requirements, mediocre scaling in the best case and no scaling in the worst, no mini-ITX support, microstutter, no use of the video outputs on the secondary card, reduced PCIe bandwidth per card, more physical PCIe slots taken up (meaning SLI on uATX deprives you of any free slots for RAID/sound/wireless/PCIe SSD/etc. cards), and so on.

    Don't mistake my meaning. SLI is the best (and only) solution for those who need the absolute maximum graphics performance possible. If that's you, do it. But for the vast majority of people, even on [H], the benefits just don't outweigh the drawbacks in my mind. It's also disingenuous to claim that multiple cheap cards being much faster than last gen's single card at a similar price is a dramatic improvement. Comparing top-end single-card speeds and prices is much more fair.
     
  23. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,854
    Joined:
    Mar 5, 2005
    There are plenty of MicroATX cases using SFX power supplies, actually.
     
  24. jojo69

    jojo69 [H]ardForum Junkie

    Messages:
    10,368
    Joined:
    Sep 13, 2009
    I'm not saying that it is; all I have been saying is that the 250-ish watt TDP envelope of the PCIe form factor is not going away any time soon. We will push resolution, frame rate, and eye candy such that we will be building fire-breathing, power-hungry setups for a long time.

    I have been reading about the "demise of the discrete GPU" for years ffs...ummmm, no.

    Until all four walls, the ceiling, and the floor are covered with displays at resolutions and refresh rates my brain cannot discern from real life, we will be bolting up systems to strive for that.
     
  25. GibsonEX

    GibsonEX Gawd

    Messages:
    680
    Joined:
    Feb 21, 2008
     
  26. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    And I'm suggesting you not think too much about what PCIe enables or doesn't enable. Like traditional rasterization, PCIe will eventually and invariably be replaced with something else, as needs demand. NVIDIA, in fact, already has their own solutions.

    I'm not talking about the demise of discrete GPUs; I'm talking about the inevitability of many GPUs. When we eventually hit process walls, we probably won't care. It won't impede performance evolution.
     
  27. Bobalias_LeShay

    Bobalias_LeShay Limp Gawd

    Messages:
    287
    Joined:
    Sep 27, 2013
    I think he was referring more to the physical size of a graphics card. Much more than 250W in that volume starts to get pretty tricky to cool.
     
  28. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    Yes, there is that. That's a harder problem to solve.
     
  29. xorbe

    xorbe [H]ardness Supreme

    Messages:
    5,979
    Joined:
    Sep 26, 2008
    What he said up there: the 970 at $325, if true, is going to be the home-run card. I'm waiting for the Titan 900 or whatever it will be called.
     
  30. SilverSliver

    SilverSliver Beat It To Deformation

    Messages:
    10,950
    Joined:
    Feb 23, 2004
    Don't really care about the boost technology. Just flash the BIOS to force what you want.
     
  31. Teenyman45

    Teenyman45 2[H]4U

    Messages:
    2,267
    Joined:
    Nov 29, 2010
    It looks like some of those tests were run in a quiet mode or worse for the 290X: it was listed as clocked at 894 or 742 MHz for both the power draw and temperature results (quick numbers on that gap below). The performance chart needs to indicate whether the 290X was throttling during gameplay or whether it was given enough airflow to stay at its full 1,000 MHz.

    If the 970 and 980 really do price out as leaked, I think the biggest factor will be whether the 980 has unlocked voltage control. If it can be heavily, and reliably, OC'ed like the 580s or the AMD 280X cards, then the 980 might be worth the price.
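    (Quick arithmetic on those listed clocks, for reference; 1,000 MHz is the 290X's advertised boost clock.)

[CODE]
# How far the listed 290X clocks sit below the advertised 1,000 MHz boost clock.
SPEC_MHZ = 1000
for observed_mhz in (894, 742):
    shortfall = 1 - observed_mhz / SPEC_MHZ
    print(f"{observed_mhz} MHz is {shortfall:.1%} below the {SPEC_MHZ} MHz spec")
[/CODE]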
     
  32. tsuehpsyde

    tsuehpsyde [H]ardness Supreme

    Messages:
    6,595
    Joined:
    Oct 22, 2004
  33. Liger88

    Liger88 2[H]4U

    Messages:
    2,657
    Joined:
    Feb 14, 2012

    lol, I about shit myself at the 7Gbps memory being "9.3Gbps effective". You know something's up when they don't bother comparing against their last generation and go all the way back to the GTX 680.

    I'm not giving them the excuse that 2-year upgrade cycles are more the norm than yearly ones, blah blah; let's just point out that Nvidia bullshits on their slides just as much as AMD does.

    Gave me a hearty laugh.
     
  34. Parja

    Parja [H]ardForum Junkie

    Messages:
    11,536
    Joined:
    Oct 4, 2002
    All links to reviews must be cleared through wonderfield before they're allowed to be posted!
     
  35. Digital Viper-X-

    Digital Viper-X- [H]ardForum Junkie

    Messages:
    13,619
    Joined:
    Dec 9, 2000
    If it's true about perf/watt, they will have a lot of headroom for a Ti and a Titan model. I can see it now: a Ti with 10-15% more CUDA cores, a Titan with 30%+ more memory/bandwidth.
     
  36. wonderfield

    wonderfield [H]ardness Supreme

    Messages:
    7,396
    Joined:
    Dec 11, 2011
    It's a great idea.
     
  37. Unknown-One

    Unknown-One [H]ardForum Junkie

    Messages:
    8,854
    Joined:
    Mar 5, 2005
    Even after flashing a custom BIOS, my GTX 780 still throttles when running Furmark (and that's with the Power Target at 300% and temps hovering at only 65°C). There's simply no getting around it.

    Like I said: "A more efficient card = less throttling = your overclock actually means something. You should be overjoyed at these power consumption figures if you're overclocking"
     
  38. sblantipodi

    sblantipodi 2[H]4U

    Messages:
    3,425
    Joined:
    Aug 29, 2010
    Can't wait for reviews.
     
  39. madgun

    madgun [H]ard|Gawd

    Messages:
    1,749
    Joined:
    May 11, 2006
    Should have great overclocking potential with those power figures.

    Impressive, most impressive. Should be a delicious prospect for SLI. Less heat and only 350W at the wall!
     
  40. Parja

    Parja [H]ardForum Junkie

    Messages:
    11,536
    Joined:
    Oct 4, 2002
    Meh, Furmark is a terrible stress test anyway. It maxes out temps, but there are plenty of things that are more stressful.