NVIDIA Could Tease Next-Gen 7nm Ampere at GTC 2019

Discussion in 'HardForum Tech News' started by Megalith, Mar 16, 2019.

  1. Megalith

    Megalith 24-bit/48kHz Staff Member

    Messages:
    13,004
    Joined:
    Aug 20, 2006
    It isn’t clear whether NVIDIA will have any surprises to share at next week’s GPU Technology Conference (GTC), but some speculate the company could reveal aspects of its next-generation architecture, “Ampere,” which will purportedly be built on the 7nm node. TweakTown and TechSpot suggest it could be the right time to do so, as the luster of Volta and Turing continues to wear off. The former predicts it won’t be a gaming part, however, suggesting “a new GPU architecture tease that will succeed Volta in the HPC/DL/AI market.”

    For now, the Ampere name is attached to NVIDIA's future 7nm GPUs. If that's the case, the Ampere GPUs would bring power-efficiency improvements, higher clock rates, and perhaps higher memory bandwidth. Now would be a good time for NVIDIA to make a big announcement, considering the company just had one of the worst fiscal quarters it's ever had. Consumer and investor faith in the company is slipping, especially since adoption of RTX technology has been much slower than expected.
     
  2. DF-1

    DF-1 2[H]4U

    Messages:
    2,567
    Joined:
    Jun 17, 2011
    nvidia could tease 0.002nm too.
     
    Madmeerkat55 likes this.
  3. Snishsnoosh

    Snishsnoosh n00b

    Messages:
    20
    Joined:
    Oct 3, 2018
    Courage to breach the $2000 barrier?
     
  4. Zam15

    Zam15 [H]Lite

    Messages:
    65
    Joined:
    Jul 27, 2015
    Waiting on a 7nm part to upgrade; feels like this will be one of the last decent jumps in performance as node shrinks get harder and more expensive. Hoping to get another 4 years out of my rig. Running 980s in SLI circa 2014.

    Needs to run all current and near future titles and next gen console ports @ 4K60 or better.
     
  5. TourGuide

    TourGuide n00b

    Messages:
    28
    Joined:
    Nov 10, 2009
    Nah - They'll come in a bit under 2k with this one - $1900 or $1950. Then short supplies will push past that magic number.

     
  6. oldmanbal

    oldmanbal [H]ard|Gawd

    Messages:
    2,043
    Joined:
    Aug 27, 2010
    Since they are FINALLY separating GPUs into true gaming and workstation divisions, maybe this will be the generation that goes all in on performance, power be damned. It would be beautiful if they dropped the silly tensor/RTX cores and went flush with shaders. Picture a 3GHz Ti with over 2x the cores of the current gen. A gamer can dream~
     
    FrgMstr likes this.
  7. Brian_B

    Brian_B 2[H]4U

    Messages:
    3,166
    Joined:
    Mar 23, 2012
    Doesn’t really make sense unless they want to write off Turing as failed.

    As soon as they tease 7nm everyone is just going to camp and wait for it rather than buy turing.

    Heck, a lot of people are doing that anyway, so why the hell not... it would just be a subtle way for nV to admit Turing isn’t doing what they want without spooking Wall Street too badly
     
  8. Hakaba

    Hakaba Gawd

    Messages:
    641
    Joined:
    Jul 22, 2013
    Ohh maybe an upgrade is incoming next year.
     
  9. STrooperTK421

    STrooperTK421 Limp Gawd

    Messages:
    450
    Joined:
    Oct 8, 2009
    Heh, yeah, and from what we saw with the 20XX cards and their pricing, you can expect to have to mortgage your house in order to get one unless the fucking miners snap 'em up first. :rolleyes:

    I'm just about done with PC gaming completely. I'm sick of paying to be a beta tester for these companies, and of "escaped samples" that could burn my damn house down if I leave my PC on. I'm old and I remember when PC gaming really meant something; now it's just another corporate cocksucker cash grab. Just my opinion.
     
  10. odditory

    odditory [H]ardness Supreme

    Messages:
    5,527
    Joined:
    Dec 23, 2007
    Except it's not a secret that 7nm was coming this year.
     
  11. Brian_B

    Brian_B 2[H]4U

    Messages:
    3,166
    Joined:
    Mar 23, 2012
    Really?

    I'd say it's no secret that nVidia would move to 7nm sooner than later. But no one has said 2019, especially since they still don't have the full Turing lineup out. Most rumor sites were pegging Ampere at 2020. I still don't see anything indicating that it will be out this year in this rumor... just a hint that it could be announced.

    Regardless, just an announcement will cause people to sit and wait. Means a poor sales cycle gets even worse, but if it can move the stock price needle, that seems to be the game nVidia is playing.
     
  12. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,229
    Joined:
    Apr 22, 2006
    NVidia is in no rush to 7nm, if they were, they wouldn't be releasing a full stack of 16nm+ Turing parts from 2080 Ti all the way to 1660 so far and 1650 likely soon. They are more power efficient at 16nm+ than AMD is at 7nm, and it is probably pretty cost effective and very high yield at this point.

    They will definitely want to get all the 16nm+ sales exhausted before moving on to 7nm.
     
    andrewaggb likes this.
  13. Zam15

    Zam15 [H]Lite

    Messages:
    65
    Joined:
    Jul 27, 2015
    NVIDIA is on 12nm....
     
  14. Elf_Boy

    Elf_Boy 2[H]4U

    Messages:
    2,337
    Joined:
    Nov 16, 2007
    I wont pay $2399.99 for a GPU.

    Might be stuck on the 1080ti for a while.
     
  15. JargonGR

    JargonGR Limp Gawd

    Messages:
    493
    Joined:
    Dec 16, 2006
    Hey nVidia, start with a 20% discount on your GPUs and work your way to higher sales from there...otherwise good luck....I am enjoying the show and eagerly await your next Q results.
     
  16. Lakados

    Lakados [H]ard|Gawd

    Messages:
    1,620
    Joined:
    Feb 3, 2014
    I really hope these are for the Quadro and Tesla series, I need some upgrades and the GP100’s are still 9K a pop in Canada.
     
  17. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,229
    Joined:
    Apr 22, 2006
    In reality, 16nm+ is the more apt description, since the process didn't actually shrink, just like the GF "12nm" that AMD uses, which is really 14nm+ and the exact same size it was the previous year when they called it 14nm.
     
    N4CR likes this.
  18. Azphira

    Azphira [H]ard|Gawd

    Messages:
    1,822
    Joined:
    Aug 18, 2003
    The price for the 2080ti in 2019 is not all that ridiculous when compared to the Quantum3D Obsidian X-24 in 1998.
     
  19. Patton187

    Patton187 Gawd

    Messages:
    670
    Joined:
    Feb 12, 2012
    Same exact situation.
     
  20. Mode13

    Mode13 Gawd

    Messages:
    645
    Joined:
    Jun 11, 2018
    Speaking of price.. Are there any hard numbers on Turing sales yet? I have a feeling the 1660 exists due to poor sales.

    It's pretty bad when the most budget card on the market is over $200, at least it performs very well, but I bet when the next round of consoles drop we'll be seeing yet another round of decline in PC gaming and thus sales. This is becoming cyclical.
     
  21. oldmanbal

    oldmanbal [H]ard|Gawd

    Messages:
    2,043
    Joined:
    Aug 27, 2010
    I agree and disagree with some of your points. I'm firmly in the camp that the high end offering should come in around $500-$600 not $1200. The fact that games aren't really pushing graphics over the last 3 years doesn't give most gamers any reason to upgrade over a 970 class card. After changing to 4k, I know that it looked crisper, but it was still the same games with a little better resolution, nothing game changing.

    Regarding your quips about burning your house down, I try not to treat sensationalized media reports as typical scenarios. Anyone who bought a defective card either exchanged it or refunded it. I've had multiple AMD flagship cards arrive DOA on launch and had to return them for another roll. I think we can all attest that Nvidia wanted to maintain its profits following the crypto bubble and, to the dismay of many gamers, clearly took a route that made them really consider whether an upgrade offered any better experience for the massive increase in price over old refreshes. As I've done many times before, I got two Tis for SLI just to sell one off a few weeks afterwards due to lack of compatibility in the games I play most. To be honest, even one 2080 Ti is more than enough for everything I need right now, and may be for years to come.
     
    GoldenTiger likes this.
  22. Mode13

    Mode13 Gawd

    Messages:
    645
    Joined:
    Jun 11, 2018
    It gets sensationalized but it does happen. I had two 8800 gtx cards burn on me under warranty. Reality is the power supply kicked off immediately and the cards were just left with charred VRMs and some smoke, but I'm sure freak accidents and corner cases exist. I can name quite a few cards going back in time that were pretty well known for failure all the way back to the v3 voodoo which was when people started modding their own fans in.

    So I don't think some cards having higher failure rates is new; what is new is paying $1200 for a graphics card. So I'm with you all there, it's too expensive, especially these days when the average Joe has way less spending cash than he did in 1999..
     
  23. LuxTerra

    LuxTerra Limp Gawd

    Messages:
    211
    Joined:
    Jul 3, 2017
    Why? Why must the price be sticky and why must the market only have products at prices you desire to pay?


    To be clear, I'm not thrilled with the $1200 price tag either and thus, I don't have one. I'll stick with my 1080Ti for a while longer.

    That said, should the high end sports car only cost $50k? Should a high end home top out at $500k? I understand that in general we all want to buy the best and are happy when the price we want to pay for being top dog is aligned with the market. However, why should there be no higher end products that cost more than I'm willing to pay? It's really just that you grew accustomed to paying that amount for the ego of having the best. Any $500-$600 card you purchase today blows any 5, 10, or 15 year old card out of the water. You're still getting more for the same price point (ignoring inflation), just not the ego of being able to say you have the best.

    Also, this has been beaten to death, but inflation really matters. That doesn't make a 2080Ti relatively inexpensive, but rather means that your fixed $500-$600 won't buy a loaf of bread at some point in the future. It's silly to have fixed price constructs in our minds.

    The real issue is that the exponential scaling of semiconductors has seriously spoiled us. Sort of like the US manufacturing situation after WW2, it was never a long-term sustainable situation...things were always going to change.
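    The inflation point above can be sketched with a quick back-of-the-envelope calculation. Note this is just an illustration: the ~2% annual rate and the $600/11-year figures are assumed for the example, not official CPI data.

    ```python
    def inflation_adjust(price, years, annual_rate=0.02):
        """Compound a historical price forward at an assumed annual inflation rate."""
        return price * (1 + annual_rate) ** years

    # A hypothetical $600 flagship from 11 years earlier, at an assumed 2%/year:
    adjusted = inflation_adjust(600, 11)
    print(round(adjusted, 2))  # about 746.02 in today's dollars
    ```

    So even before any change in market behavior, a fixed dollar figure quietly loses a double-digit percentage of its buying power over a GPU's typical ownership span.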
     
    Last edited: Mar 17, 2019
    GoldenTiger and DooKey like this.
  24. DooKey

    DooKey [H]ardness Supreme

    Messages:
    8,067
    Joined:
    Apr 25, 2001
    Average Joe should be playing consoles and maybe should just stick to his phone for games if that's the case.
     
    GoldenTiger likes this.
  25. DooKey

    DooKey [H]ardness Supreme

    Messages:
    8,067
    Joined:
    Apr 25, 2001
    Says the guy running a $1200 cpu with other ultra expensive products to go along with it. LOL.
     
    GoldenTiger likes this.
  26. oldmanbal

    oldmanbal [H]ard|Gawd

    Messages:
    2,043
    Joined:
    Aug 27, 2010
    Nvidia reacted to crypto-era over-MSRP pricing by setting MSRP to new heights, adjusting for what cards were actually selling for during the peak of the crypto boom. Now that gamers are buying the cards again, the prices don't make any sense. You can complicate it as much as you want, but I just spelled it out for you.
     
  27. LuxTerra

    LuxTerra Limp Gawd

    Messages:
    211
    Joined:
    Jul 3, 2017
    You completely ignored the question. The GPU market is now far more complex than just gamers and crypto.

    The GPU you buy today at any given inflation adjusted price is faster than any GPU you've EVER been able to buy at that price point. What's changed is your money doesn't buy you the status it previously did. What's changed is instead of having the best at a price point you prefer, you must balance other competing requirements to subjectively determine value.
     
    Last edited: Mar 18, 2019
  28. oldmanbal

    oldmanbal [H]ard|Gawd

    Messages:
    2,043
    Joined:
    Aug 27, 2010
    When pricing is subjective, inflation doesn't equal market price.
     
  29. smarenwolf

    smarenwolf [H]Lite

    Messages:
    95
    Joined:
    May 7, 2018
    Please explain that when taking a look at a 1080Ti vs a 2080.
    They have basically the same performance within margin of error.
    The 1080Ti cost around 700.
    The 2080 costs around 700.
    How exactly is this "the fastest GPU ever"?

    It's a stagnation, and the next better model costs _twice_ the amount.
     
  30. NickJames

    NickJames [H]ardness Supreme

    Messages:
    6,611
    Joined:
    Apr 28, 2009
    Is consumer card mining still a thing? I can't imagine with the huge drop in BTC that it's even profitable at this point.
     
  31. Hashiriya415

    Hashiriya415 [H]Lite

    Messages:
    105
    Joined:
    Mar 17, 2019
    Does this mean a new card for fall?
     
  32. Ranulfo

    Ranulfo [H]ard|Gawd

    Messages:
    1,561
    Joined:
    Feb 9, 2006
    When the now middle of the road "performance" GPU costs more than a mid level cpu and mobo combined (and in some cases ram included) you might have a problem with runaway prices. Especially when that cpu will likely last you 2 times longer than the GPU will.
     
  33. LuxTerra

    LuxTerra Limp Gawd

    Messages:
    211
    Joined:
    Jul 3, 2017
    You’re only looking at a subset of capabilities and metrics. The CUDA core/shader performance is about the same for the same money (not surprising based on specs and Nvidia's announcement focus). However, there are additional units on the 2080 which make it significantly faster than a 1080Ti in other tests. How much value that provides is individually dependent on your use case. Turing cards also have dual INT/FP execution that a 1080Ti does not. There are other changes too.

    It’s like Intel and AVX512 support, single or dual issue. It’s much faster than AVX2 if you have the right problem. It’s about the same if you don’t. RTX shouldn’t tempt you in this case any more than AVX512.

    Any particular subsystem may/may not be faster and you can cherrypick tests to make virtually any point you want. However, across all tests, the 2080 is at least as fast and much faster in some (where RT, tensor, or simultaneous INT/FP provides a benefit).

    A 2080 is faster than a 1080Ti, just not in everything. Nothing new about that in tech or any other market. E.g. for cars, sometimes 0-60 drops, or skid pad Gs increase, or lap times around some circuits decrease, or whatever. Rarely is every metric significantly improved in a single generation.

    Edit: Here's a similar example. More CPU cores doesn't always make a faster CPU. It can in some tests, but in others it can actually cause a regression. Threadripper is amazing, but more expensive than Ryzen and not faster in every test. However, in general, we consider Threadripper to be the "more powerful" CPU even if it's slower in some cases. The option to buy Threadripper, even if it's an expensive CPU, is a great option to have. Despite the fact that it's not the right CPU for many and priced well above average.
     
    Last edited: Mar 20, 2019
  34. LuxTerra

    LuxTerra Limp Gawd

    Messages:
    211
    Joined:
    Jul 3, 2017
    And the point?

    Pricing is always subjective in a free market. Inflation just gives us a way to normalize, not for subjectivity, but for time, since the value of currency is not constant.