AMD to reveal next-gen ‘Nvidia killer’ graphics card at CES?

Discussion in 'AMD Flavor' started by Mega6, Nov 18, 2019.

  1. Darth Ender

    Darth Ender Limp Gawd

    Messages:
    486
    Joined:
    Oct 11, 2018
    Unlike processors, where going faster has meaningful results, there is objectively less use in going faster than needed for graphics... so the market for such cards is extremely small. AMD doesn't have the money to make massive investments in things that won't sell volume, especially when they can't deliver everything they want at the same time due to money and manufacturing constraints, so strategy and priority will almost always trump chasing the fastest-PC title.

    I just don't see AMD treating the title of fastest graphics card as high a priority as getting their GPUs into laptops by improving power efficiency, and maintaining their push into the larger markets at mid-range and low-end price points.

    They are looking at Intel, not Nvidia.

    Nvidia is a temporary player... as soon as Intel gets into the discrete graphics game, it's game over for Nvidia, since their only market will be top-end PC graphics and compute cards, which is unlikely to be able to sustain them. Then they'll get cannibalized by either Intel or AMD (probably Intel, unless they file for bankruptcy).
     
  2. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,264
    Joined:
    Aug 1, 2005
    Yes, it's a sign that either all is not well with Ampere, or they think Big Navi isn't a threat. I would think it's more likely the former.
     
  3. auntjemima

    auntjemima [H]ardness Supreme

    Messages:
    5,299
    Joined:
    Mar 1, 2014
    AMD guys always bet on the former, and they always fall on their faces. Keep dreaming.
     
    GoldenTiger likes this.
  4. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,408
    Joined:
    Jun 13, 2003
    No. Graphics are still orders of magnitude too slow.
     
    tungt88, GoldenTiger and Maddness like this.
  5. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,408
    Joined:
    Jun 13, 2003
    *fixed in bold
     
    GoldenTiger likes this.
  6. PhaseNoise

    PhaseNoise 2[H]4U

    Messages:
    2,097
    Joined:
    May 11, 2005
    I really have no idea how you would reach these conclusions.
     
  7. Darth Ender

    Darth Ender Limp Gawd

    Messages:
    486
    Joined:
    Oct 11, 2018
    Because Intel owns the laptop and OEM market right now. When they push out their discrete graphics cards, they'll have an easy way to undercut costs to system builders in both the PC and laptop markets (both of which are basically Nvidia-only markets, and those GPUs primarily live in Intel machines).

    If Nvidia is pushed out of the laptop and OEM PC market to a significant degree, they'll quickly see their value drop and their cash dwindle, which plays into their ability to invest in R&D, and if they fall to second place they immediately become irrelevant. They won't be able to compete on price (which Intel and AMD will volley over), and they won't be able to fund performance without volume sales...

    Do not underestimate the position Intel will be in when its discrete cards are ready to go to market in the near future. They will slaughter not only Nvidia's but, to some extent, AMD's low-end and laptop market share, and probably eat into the cash-cow mid-range market. And it'll happen quickly. Nvidia has nowhere to pivot, no alternative markets to grow into. It's living on borrowed time until Intel drops the hammer.
     
    rgMekanic and 5150Joker like this.
  8. UnknownSouljer

    UnknownSouljer [H]ardness Supreme

    Messages:
    6,046
    Joined:
    Sep 24, 2001
    That... actually makes a lot of sense. Nvidia also hasn't made a lot of friends; they've spent the last decade burning bridges, and more or less selling because they're dominant and for no other reason. Just another reason why they might have trouble.
    My one note would be that they might be able to continue selling into the machine learning space. As has been noted, Google has bought tons of off-the-rack Nvidia cards for AI and other R&D. If they can grow that space, the space with the deep pockets, there might be some room for them as a supercomputer supplier. But there are a ton of ifs.

    Intel has to show competency in the GPU space though. And it turns out (as AMD has clearly shown) developing excellent GPUs that aren't hamstrung architecturally is incredibly difficult. From the very little I've studied, hamstringing an architecture is actually really easy to do by accident. Obviously no one does it intentionally.

    But anyway, Intel can't even get off of 14nm and has more or less been stalled in terms of major improvements since launching the i7. It's all been incremental. After 10 years it adds up to a lot, but they've had to do it at high power usage, high cost, and with a bunch of hacks (leading to dozens of new vulnerabilities).

    Anyway, perhaps I say all that to say that nothing is decided yet, by a long shot. Who knows, maybe Nvidia will develop ARM chips for third parties and make a run at being a system integrator. There are still things they have competency in, or that they can pivot to, even if in slightly different spaces.
     
    Last edited: Nov 22, 2019
  9. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,408
    Joined:
    Jun 13, 2003
    Laptops are really where Intel is already... starting. AMD hasn't shipped a competitive mobile product in a decade, and you really want Intel CPUs for burst performance and battery life, and Nvidia GPUs simply because AMD cannot get performance per watt down; in laptops, power draw and heat output are much harder limits.

    Ice Lake has shown enough real improvements that I'd have one now if I could get an XPS13 or similar with adaptive sync. Maybe that's for round two, but it's a 'must have' for me, even on integrated graphics.

    As for Nvidia, they'll have to keep pushing the envelope to stay relevant. Intel hasn't targeted the MX250 hard yet, but we can expect them to really start there, and to likely pursue a solution similar to their collaboration with AMD where they build a smaller HBM part and stick it on the same substrate as the CPU.

    Intel won't get the whole mobile 'discrete' market from Nvidia quickly but I do expect them to make significant inroads at least due to providing a solution that's more space efficient if nothing else.

    As much of a monster as Intel is at volume silicon fabrication, GPUs are going to take time. And one of the trends that Intel is up against in the mobile space is ARM on Chromebooks and the like: this is a space that Nvidia can compete with Intel in, or potentially out-compete them. Especially if they're say willing to work with Samsung on higher-performance ARM SoCs.

    Beyond that, it's really anyone's guess at the discrete high-end. No one can really predict where the market is going to expand the most.
     
  10. Darth Ender

    Darth Ender Limp Gawd

    Messages:
    486
    Joined:
    Oct 11, 2018
    Intel is not going to be starting from scratch; they've been slowly developing their graphics for well over a decade now, and they make sure their stuff is standards-compliant with the graphics APIs... technically they have _the_best_ open-source drivers. And if you trust Intel's marketing benchmarks (why would you, though?), their current graphics solution is already beating AMD's APUs watt for watt in performance. That's extremely significant, since it could mean that the same exact arch put into a desktop power envelope with GDDR6/HBM2 would have parity with AMD's current top-end graphics solutions. That's tech Intel has right now. They're just not implementing it that way.
     
  11. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,408
    Joined:
    Jun 13, 2003
    Quite right, the IP is fairly well advanced. The challenge is one of scale. They absolutely can do it; the issue is that it will take them time to scale it up. The question is where the market will be by then.
     
  12. PhaseNoise

    PhaseNoise 2[H]4U

    Messages:
    2,097
    Joined:
    May 11, 2005
    And what NV can and will do. It would be very foolish to think they have been, or will be, resting on their laurels and simply saying "GG" while markets pass them by.

    Even the most staunch opponents of NVidia recognize how competent and ruthless they are.
     
  13. Darth Ender

    Darth Ender Limp Gawd

    Messages:
    486
    Joined:
    Oct 11, 2018
    Nvidia may be evil, but Intel was evil when Nvidia was just a wee baby... it's not that I think they're unaware and doing nothing. I just don't see what they can do, as they've made no moves outside of supercomputers to divorce themselves from dependence on their own competitors to exist.

    edit: in fact, their best option may be throwing the monopoly card out there and attacking both Intel and maybe even AMD for leveraging their CPU market dominance to unfairly push out competition in the discrete graphics market.
     
  14. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,408
    Joined:
    Jun 13, 2003
    If you're basing your opinions of businesses on morals then you're going to be really, really disappointed ;)
     
  15. Darth Ender

    Darth Ender Limp Gawd

    Messages:
    486
    Joined:
    Oct 11, 2018
    If you're not basing your opinion of companies on some form of morals, then your world is full of nothing but grey players, and that's basically sociopathy.

    Good and evil are relative... history has tallied more objectively evil things for some than for others.
     
  16. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,408
    Joined:
    Jun 13, 2003
    The world is gray.
     
  17. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,377
    Joined:
    Feb 22, 2012
    I am more interested in how AMD convinced all these gullible people that they are somehow “the good guys”.

    A few months ago I tried to find Intel's efficiency per mm^2 for iGPUs, but I couldn't find anything. I was interested in extrapolating it out.
     
    GoldenTiger, 5150Joker and auntjemima like this.
  18. Mega6

    Mega6 2[H]4U

    Messages:
    2,208
    Joined:
    Aug 13, 2017
    Funny how this thread turned from an Intel/Nvidia apologist thread, to wonderment at how Nvidia and Intel can possibly be bad for gouging consumers, to how AMD can possibly be good. Yet AMD exhibits none of the bad characteristics of Nvidia or Intel - ever.

    The GPP program, price gouging, monopolistic strong-arm tactics proven in court... where does that come from? AMD is no "Angel", but Intel and Nvidia are definitely MORE tainted and toxic in their behavior. Corporate behavior comes from the Top -> Down; the top executives determine the overt behavior. Draw your own conclusions from the data, or just stick your head back in the sand and deny reality. But don't try to tell me of innocence where there is proven guilt, and don't try to shovel shit where there is none.
     
    Zuul likes this.
  19. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,264
    Joined:
    Aug 1, 2005
    The fact that you call me an "AMD guy" is pretty funny.
     
    N4CR, TurboGLH and Gideon like this.
  20. auntjemima

    auntjemima [H]ardness Supreme

    Messages:
    5,299
    Joined:
    Mar 1, 2014
    If the shoe fits.

    I run all kinds of GPUs in my rigs, so whichever company offers me the best bang for my buck in the situation gets it. But even I can see that unless these new cards are good for compute, they are dead in the water like most of their current lineup.
     
    GoldenTiger and IdiotInCharge like this.
  21. MangoSeed

    MangoSeed Gawd

    Messages:
    579
    Joined:
    Oct 15, 2014
    I think you have that backwards. We need both faster CPUs (AI, animation) and GPUs (lighting, physics) but there is objectively a lot more horsepower needed for the latter.
     
    PhaseNoise and IdiotInCharge like this.
  22. Darth Ender

    Darth Ender Limp Gawd

    Messages:
    486
    Joined:
    Oct 11, 2018
    Producing frames you will never see, because the GPU is faster than the refresh rate of the viewing medium, is not important to most people except benchmarkers. Sure, some games need the fastest card to manage that, but for the vast majority of monitors and even games, they do not. That is why the top tier of graphics cards is far less interesting than the top tier of CPUs. A CPU over-performing leaves you room to do more things at once; most people won't run multiple games at once.

    That's the general gist I was making.
     
  23. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,408
    Joined:
    Jun 13, 2003
    Yeah, so maximum framerates -- more specifically, the shortest frametimes -- are not really that interesting.

    What needs to be considered are the minimum framerates as expressed in terms of maximum frametimes. Essentially, we don't care about average framerates. We care about the frames that can cause interruptions in gameplay, and this is where faster CPUs actually do wind up helping.

    ...they're also not here on the [H].

    They're both rather interesting.

    Over-performing in which way?

    Let me know when you're getting < 8.3ms frametimes in literally everything.
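    (For reference, that 8.3 ms figure is just the frame budget at 120 Hz: the frametime target is the reciprocal of the refresh rate. A minimal Python sketch of the conversion; the helper name is illustrative:)

    ```python
    # Frametime budget: the time the GPU has to deliver each frame
    # at a given refresh rate, in milliseconds.
    def frametime_ms(refresh_hz: float) -> float:
        return 1000.0 / refresh_hz

    for hz in (60, 120, 144, 240):
        print(f"{hz:>3} Hz -> {frametime_ms(hz):.2f} ms per frame")
    # 120 Hz works out to ~8.33 ms, i.e. the "< 8.3 ms" target above.
    ```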
     
  24. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,264
    Joined:
    Aug 1, 2005
    Dead in the water? Navi is cheaper than NVIDIA cards, and in new top DX 12/Vulkan titles they're outperforming Super cards that cost $80-100+ more. At the midrange to lower end, they've got NVIDIA covered. How is that dead in the water? I like NVIDIA cards too (and it's all I've owned for years), but even I can say that despite Navi not really being revolutionary or bringing anything new to the table, it has at least put AMD back on the map vs NVIDIA and is doing well. Big Navi might be a surprise wake-up call for NVIDIA.
     
  25. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,377
    Joined:
    Feb 22, 2012
    I always saw pushing RTX as a possible misstep, and it seems to have panned out very much in AMD's favor so far. If AMD pushed out a cheaper, faster card than a 2080 Ti, even if RT were mediocre, I think a lot of people would be interested in it. RT is fun to play with but definitely not a “must have”.
     
    N4CR, 5150Joker and Maddness like this.
  26. MangoSeed

    MangoSeed Gawd

    Messages:
    579
    Joined:
    Oct 15, 2014
    Oh I get what you're saying. Not sure it's a valid comparison though. The only way you have spare GPU performance is if you intentionally reduce the workload. By no means is the 2080 Ti overkill for the latest games at today's highest resolutions.

    Most people don't do enough on their PCs to warrant today's top CPUs so I would argue the opposite. A 64 thread desktop CPU is cool but not very useful for most folks. With GPUs you can crank resolution etc. There isn't much you can do as an end user to take advantage of all that CPU horsepower.
     
    Maddness and Dayaks like this.
  27. Fremunaln

    Fremunaln Limp Gawd

    Messages:
    175
    Joined:
    May 3, 2019
    RDNA is interesting. I truly want it to be the Zen of GPUs, but the leaked 5500 benchmarks seem like more of the same from AMD. They seem to have issues with power consumption. Unless this flagship is a hidden Ryzen waiting to be unveiled, I'm thinking "finally a flagship card, but with caveats that make it less attractive than a Ti".
     
  28. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,922
    Joined:
    Sep 7, 2011

    Yet.
     
  29. MangoSeed

    MangoSeed Gawd

    Messages:
    579
    Joined:
    Oct 15, 2014
    Sure, but the point is we can max out the best GPU hardware today. No need to wait for "yet".
     
  30. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,377
    Joined:
    Feb 22, 2012
    Let's be honest, it's AMD; there will be caveats. As long as it's not priced like there are none, it'll do fine.
     
    GoldenTiger, Auer, 5150Joker and 2 others like this.
  31. Phelptwan

    Phelptwan [H]ardness Supreme

    Messages:
    6,498
    Joined:
    Jul 20, 2002
    Even if they have the capability, no way they have the capacity.
     
  32. oldmanbal

    oldmanbal 2[H]4U

    Messages:
    2,071
    Joined:
    Aug 27, 2010
    The problem now is that ray tracing is going to end up being a standard and will require a substantial area of each GPU die to be devoted solely to it. The tipping point is going to be what matters more: ray tracing, or shader/polygon crunching? A game with a fully ray-traced real-time engine and a GPU with more horsepower dedicated to that, or something of a hybrid like we are seeing now.
     
  33. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,408
    Joined:
    Jun 13, 2003
    Given what is available for 'realistic' 3D rendering, there's got to be a 'limit' there somewhere. Obviously we're nowhere close, but essentially that's 'how much' ray tracing is needed.

    For rasterization, we're likely to hit a tipping point of sorts where raster performance is tied more to resolution than anything else, and once we hit a resolution that exceeds most human eyesight, further gains will start tipping toward ray tracing instead.

    Consider that something like 8k240 would probably be enough for the sake of argument. That's about 16 times the raw pixels per second of 4k60, easily achievable in a few decades.
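    (The "16 times" figure checks out. A quick sanity check of the arithmetic in Python, assuming the standard UHD resolutions:)

    ```python
    # Raw pixel throughput for a display mode: width x height x refresh rate.
    def pixels_per_second(width: int, height: int, hz: int) -> int:
        return width * height * hz

    rate_4k60 = pixels_per_second(3840, 2160, 60)
    rate_8k240 = pixels_per_second(7680, 4320, 240)

    # 8K has 4x the pixels of 4K, and 240 Hz is 4x 60 Hz: 16x total.
    print(rate_8k240 / rate_4k60)  # -> 16.0
    ```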

    And that's in terms of raw performance; stuff like variable shading and the various tricks used by consoles and VR headsets to 'lighten the load' brings the real performance needed back to earth pretty quickly.


    Last, we're not likely to see a move away from a hybrid solution. There are so many things that rasterizers are significantly faster at that incur no or very little visual penalty. It's a more complicated route to take, but perhaps machine learning may be employed to help get us there.
     
  34. noko

    noko [H]ardness Supreme

    Messages:
    4,417
    Joined:
    Apr 14, 2010
    A game with a fully ray-traced real-time engine will have to wait many years for the hardware to do that, and we will be talking about some rather high resolutions as well. Current RTX cards do limited ray-tracing calculations that are used for better lighting effects in games. Virtually all current RTX-enabled games still mostly use traditional rendering methods, with some help from the RT cores.
     
    Mega6 likes this.
  35. pippenainteasy

    pippenainteasy Gawd

    Messages:
    629
    Joined:
    May 20, 2016
    I read "Nvidia killer" as a 400W RTX 2080 Super competitor at $50 less.
     
    tungt88, KazeoHin, Dayaks and 2 others like this.
  36. tungt88

    tungt88 [H]ard|Gawd

    Messages:
    2,006
    Joined:
    Jan 14, 2008
    Pretty much -- as usual, I'm sure that AMD has a bit of "wriggle room" should NVIDIA pull out something else, like an RTX 2070 Super Elite Pro (very unlikely, but it has happened in the past, like the GTX 560 Ti 448).
    So it's possible (under such conditions) that the price goes down to $79 (or so) less -- we saw that with AMD's RX 5700 XT vs the RTX 2060 Super.
     
  37. Grimlaking

    Grimlaking 2[H]4U

    Messages:
    2,941
    Joined:
    May 9, 2006
    Sadly, I don't see this happening. It would be a huge feather in their cap if AMD did pull this off, but I don't see it right now.