AMD to reveal next-gen ‘Nvidia killer’ graphics card at CES?

Unlike processors, where going faster has meaningful results, there is objectively less use in going faster than needed for graphics... so the market for such cards is extremely small. AMD doesn't have money to make massive investments in things that won't sell volume, especially when they can't deliver everything they want at the same time due to money and manufacturing constraints, so strategy and priority will almost always trump aiming for the fastest-card title.

I just don't see AMD treating the title of fastest graphics card as high a priority as getting their GPUs into laptops by improving power efficiency, and maintaining their push into the larger mid-range and low-end price points.

They are looking at Intel, not Nvidia.

Nvidia is a temporary player... as soon as Intel gets into the discrete graphics game, it's game over for Nvidia, since their only remaining market will be top-end PC graphics and compute cards, which is unlikely to sustain them. Then they'll get cannibalized by either Intel or AMD (probably Intel, unless they file for bankruptcy).
 
Then releasing ANOTHER card above anything AMD has is some sign? What about the 2070 Super... 2080... 2080 Super... 2080 Ti... RTX Titan?

How is a 2080 Ti Super some kind of "sign"?

Yes, it's a sign that either all is not well with Ampere, or they think Big Navi isn't a threat. I would think it's more likely the former.
 
Nvidia is a temporary player... as soon as Intel gets into the discrete graphics game, it's game over for nvidia **AMD discrete graphics**, since their only remaining market will be top-end PC graphics and compute cards, which is unlikely to sustain them. **No, lol.** Then they'll get cannibalized by either Intel or AMD (probably Intel, unless they file for bankruptcy). **AMD is more likely to fail.**

*fixed in bold
 
Nvidia is a temporary player... as soon as Intel gets into the discrete graphics game, it's game over for Nvidia, since their only remaining market will be top-end PC graphics and compute cards, which is unlikely to sustain them. Then they'll get cannibalized by either Intel or AMD (probably Intel, unless they file for bankruptcy).

I really have no idea how you would reach these conclusions.
 
Because Intel owns the laptop and OEM market right now. When they push out their discrete graphics cards, they'll have an easy way to undercut costs to system builders in both the PC and laptop markets (both of which are basically Nvidia-only markets, and those machines primarily live on Intel platforms).

If Nvidia is pushed out of the laptop and OEM PC market to a significant degree, they'll quickly see their value drop and their cash dwindle, which plays into their ability to invest in R&D, and if they fall to second place they immediately become irrelevant. They won't be able to compete on price (which Intel and AMD will volley over), and they won't be able to fund performance without volume sales.

Do not underestimate the position Intel will be in when its discrete cards are ready to go to market in the near future. They will slaughter not only Nvidia's but, to some extent, AMD's low-end and laptop market share, and probably eat into the cash-cow mid-range market. And it'll happen quickly. Nvidia has nowhere to pivot, no alternative markets to grow into. It's living on borrowed time until Intel drops the hammer.
 
Because Intel owns the laptop and OEM market right now. When they push out their discrete graphics cards, they'll have an easy way to undercut costs to system builders in both the PC and laptop markets (both of which are basically Nvidia-only markets, and those machines primarily live on Intel platforms).

If Nvidia is pushed out of the laptop and OEM PC market to a significant degree, they'll quickly see their value drop and their cash dwindle, which plays into their ability to invest in R&D, and if they fall to second place they immediately become irrelevant. They won't be able to compete on price (which Intel and AMD will volley over), and they won't be able to fund performance without volume sales.

Do not underestimate the position Intel will be in when its discrete cards are ready to go to market in the near future. They will slaughter not only Nvidia's but, to some extent, AMD's low-end and laptop market share, and probably eat into the cash-cow mid-range market. And it'll happen quickly. Nvidia has nowhere to pivot, no alternative markets to grow into. It's living on borrowed time until Intel drops the hammer.

That... actually makes a lot of sense. Nvidia also hasn't made a lot of friends; they've spent the last decade burning bridges and more or less selling because they're dominant and for no other reason. Just another reason why they might run into trouble.
My one note would be that they might be able to keep selling into the machine-learning space. As has been noted, Google has bought tons of off-the-shelf Nvidia cards for AI and other R&D. If they can grow that space, the one with the deep pockets, there might be some room for them as a supercomputer supplier. But there are a ton of ifs.

Intel has to show competency in the GPU space, though. And it turns out (as AMD has clearly shown) that developing excellent GPUs that aren't architecturally hamstrung is incredibly difficult; from the little I've studied, hamstringing a design is actually really easy to do by accident. Obviously no one does it intentionally.

But anyway, Intel can't even get off 14nm and has more or less stalled on major improvements since launching the i7. It's all been incremental. After ten years it adds up to a lot, but they've had to do it at high power usage, high cost, and with a bunch of hacks (leading to dozens of new vulnerabilities).

Anyway, perhaps I say all that to say that nothing is decided yet, by a long shot. Who knows, maybe Nvidia will develop ARM chips for third parties and make a run at being a system integrator. There are still things they have competency in, or adjacent spaces they can pivot to.
 
Because Intel owns the laptop and OEM market right now. When they push out their discrete graphics cards, they'll have an easy way to undercut costs to system builders in both the PC and laptop markets (both of which are basically Nvidia-only markets, and those machines primarily live on Intel platforms).

If Nvidia is pushed out of the laptop and OEM PC market to a significant degree, they'll quickly see their value drop and their cash dwindle, which plays into their ability to invest in R&D, and if they fall to second place they immediately become irrelevant. They won't be able to compete on price (which Intel and AMD will volley over), and they won't be able to fund performance without volume sales.

Laptops are really where Intel is already... starting. AMD hasn't shipped a competitive mobile product in a decade, and you really want Intel CPUs for burst performance and battery life, and Nvidia GPUs simply because AMD cannot get performance per watt down; in laptops, power draw and heat output are much harder limits.

Ice Lake has shown enough real improvement that I'd have one now if I could get an XPS 13 or similar with adaptive sync. Maybe that's for round two, but it's a 'must have' for me, even on integrated graphics.

As for Nvidia, they'll have to keep pushing the envelope to stay relevant. Intel hasn't targeted the MX250 hard yet, but we can expect them to really start there, and likely to pursue a solution similar to their collaboration with AMD, where they build a smaller HBM part and put it on the same substrate as the CPU.

Intel won't take the whole mobile 'discrete' market from Nvidia quickly, but I do expect them to make significant inroads, if only by providing a more space-efficient solution.

Nvidia has nowhere to pivot, no alternative markets to grow into. It's living on borrowed time until Intel drops the hammer.

As much of a monster as Intel is at volume silicon fabrication, GPUs are going to take time. And one of the trends Intel is up against in the mobile space is ARM on Chromebooks and the like: this is a space where Nvidia can compete with Intel, or potentially out-compete them, especially if they're willing to work with Samsung on higher-performance ARM SoCs.

Beyond that, it's really anyone's guess at the discrete high-end. No one can really predict where the market is going to expand the most.
 
Intel is not going to be starting from scratch; they've been steadily developing their graphics for well over a decade now, and they make sure their hardware is standards-compliant with the graphics APIs. Technically, they have the best open-source drivers. And if you trust Intel's marketing benchmarks (why would you, though?), their current graphics solution is already beating AMD's APUs watt for watt in performance. That's extremely significant, since it could mean that the exact same architecture, put into a desktop power envelope with GDDR6/HBM2, would have parity with AMD's current top-end graphics solutions. That's tech Intel has right now. They're just not implementing it that way.
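As a back-of-the-envelope illustration of that extrapolation (all numbers here are hypothetical; the post cites no concrete figures), a naive power-scaling estimate might look like this. Real GPUs scale sub-linearly with power, so a fudge factor is applied:

```python
def extrapolated_perf(base_perf, base_watts, target_watts, scaling_efficiency=0.7):
    """Naively extrapolate relative GPU performance into a larger power envelope.

    scaling_efficiency < 1.0 models the fact that extra watts buy
    disproportionately little performance (voltage/frequency scaling).
    All inputs are hypothetical illustration values, not measured data.
    """
    extra_watts = (target_watts - base_watts) * scaling_efficiency
    return base_perf * (base_watts + extra_watts) / base_watts

# A hypothetical 15 W iGPU scaled to a 225 W desktop envelope at 70% efficiency:
print(round(extrapolated_perf(1.0, 15, 225, 0.7), 1))  # 10.8
```

With perfect linear scaling the same math gives 15x, which is roughly the "parity with top-end desktop parts" claim above. The point is only that the claim is a linear extrapolation from a marketing benchmark, not a measurement.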
 
Intel is not going to be starting from scratch; they've been steadily developing their graphics for well over a decade now, and they make sure their hardware is standards-compliant with the graphics APIs. Technically, they have the best open-source drivers. And if you trust Intel's marketing benchmarks (why would you, though?), their current graphics solution is already beating AMD's APUs watt for watt in performance. That's extremely significant, since it could mean that the exact same architecture, put into a desktop power envelope with GDDR6/HBM2, would have parity with AMD's current top-end graphics solutions. That's tech Intel has right now. They're just not implementing it that way.

Quite right, the IP is fairly well advanced. The challenge is one of scale. They absolutely can do it; the issue is that it will take them time to scale it up. The question is where the market will be by then.
 
Quite right, the IP is fairly well advanced. The challenge is one of scale. They absolutely can do it; the issue is that it will take them time to scale it up. The question is where the market will be by then.

And what NV can and will do. It would be very foolish to think they have been, or will be, sitting on their laurels and simply saying "GG" while markets pass them by.

Even Nvidia's staunchest opponents recognize how competent and ruthless they are.
 
Nvidia may be evil, but Intel was evil when Nvidia was just a wee baby... It's not that I think they're unaware and doing nothing; I just don't see what they can do, since they've made no moves outside of supercomputers to divorce themselves from dependence on their own competitors in order to exist.

edit: In fact, their best option may be throwing the monopoly card out there and attacking both Intel and maybe even AMD for leveraging their CPU market dominance to unfairly push out competition in the discrete graphics market.
 
Nvidia may be evil, but Intel was evil when Nvidia was just a wee baby... It's not that I think they're unaware and doing nothing; I just don't see what they can do, since they've made no moves outside of supercomputers to divorce themselves from dependence on their own competitors in order to exist.

edit: In fact, their best option may be throwing the monopoly card out there and attacking both Intel and maybe even AMD for leveraging their CPU market dominance to unfairly push out competition in the discrete graphics market.

If you're basing your opinions of businesses on morals then you're going to be really, really disappointed ;)
 
If you're not basing your opinion of companies on some form of morals, then your world is full of nothing but grey players, and that's basically sociopathy.

Good and evil are relative... history has tallied more objectively evil acts for some than for others.
 
I am more interested in how AMD convinced all these gullible people that they are somehow “the good guys”.

A few months ago I tried to find Intel's efficiency per mm² for iGPUs, but I couldn't find anything. I was interested in extrapolating it out.
 
Funny how this thread turned from an Intel/Nvidia apologist thread into wonderment at how Nvidia and Intel could possibly be bad (by gouging consumers), and then at how AMD could possibly be good. Yet AMD exhibits none of the bad characteristics of Nvidia or Intel. Ever.

The GPP (GeForce Partner Program), price gouging, monopolistic strong-arm tactics proven in court: where does that come from? AMD is no "angel", but Intel and Nvidia are definitely MORE tainted and toxic in their behavior. Corporate behavior flows from the top down; the top executives determine the overt behavior. Draw your own conclusions from the data, or stick your head back in the sand and deny reality. But don't try to tell me of innocence where there is proven guilt, and don't try to shovel shit where there is none.
 
The fact that you call me an "AMD guy" is pretty funny.

If the shoe fits.

I run all kinds of GPUs in my rigs, so whatever company offers me the best bang for my buck in the situation gets it. But even I can see that unless these new cards are good for compute, they are dead in the water like most of their current lineup.
 
Unlike processors where going faster has meaningful results ....there is objectively less use out of going faster than needed for graphics...

I think you have that backwards. We need both faster CPUs (AI, animation) and faster GPUs (lighting, physics), but objectively a lot more horsepower is needed for the latter.
 
Producing frames you will never see, because the card is faster than the refresh rate of the viewing medium, is not important to most people except benchmarkers. Sure, some games need the fastest card to do that, but for the vast majority of monitors and even games, they do not. Which is why the top tier of graphics cards is far less interesting than the top tier of CPUs. A CPU over-performing leaves you room to do more things at once; most people won't run multiple games at once.

That's the general gist I was making.
 
Producing frames you will never see, because the card is faster than the refresh rate of the viewing medium, is not important to most people except benchmarkers.

Yeah, so maximum framerates (more specifically, the shortest frametimes) are not really that interesting.

What needs to be considered are the minimum framerates as expressed in terms of maximum frametimes. Essentially, we don't care about average framerates. We care about the frames that can cause interruptions in gameplay, and this is where faster CPUs actually do wind up helping.

but for the vast majority of monitors and even games, they do not.

...they're also not here on the [H].

Which is why the top tier of graphics cards is far less interesting than the top tier of CPUs

They're both rather interesting.

A CPU over-performing leaves you room to do more things at once. Most people won't run multiple games at once.

Over-performing in which way?

Let me know when you're getting < 8.3 ms frametimes in literally everything.
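For reference, the 8.3 ms figure is just the frame budget of a 120 Hz display, and the "minimum framerate" framing above amounts to looking at the worst frametime in a capture rather than the average. A quick sketch (the frametime samples are invented for illustration):

```python
def frame_budget_ms(refresh_hz):
    """Time budget per frame, in milliseconds, for a given refresh rate."""
    return 1000.0 / refresh_hz

def worst_case_fps(frametimes_ms):
    """Framerate implied by the single longest frametime in a capture.

    Average FPS hides stutter; the longest frame is the one you feel.
    """
    return 1000.0 / max(frametimes_ms)

# 120 Hz leaves about 8.3 ms per frame.
print(round(frame_budget_ms(120), 1))  # 8.3

# A capture averaging well above 100 fps, but with one 25 ms stutter:
print(worst_case_fps([8.0, 8.1, 8.2, 25.0]))  # 40.0
```

The second result is the point: a run that looks like "120+ fps" on average can still dip to an effective 40 fps whenever one frame blows its budget.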
 
If the shoe fits.

I run all kinds of GPUs in my rigs, so whatever company offers me the best bang for my buck in the situation gets it. But even I can see that unless these new cards are good for compute, they are dead in the water like most of their current lineup.

Dead in the water? Navi is cheaper than NVIDIA's cards, and in new top DX12/Vulkan titles it's outperforming Super cards that cost $80-100+ more. From the midrange on down, they've got NVIDIA covered. How is that dead in the water? I like NVIDIA cards too (they're all I've owned for years), but even I can say that, despite Navi not being revolutionary or bringing anything new to the table, it at least put AMD back on the map against NVIDIA and is doing well. Big Navi might be a surprise wake-up call for NVIDIA.
 
Dead in the water? Navi is cheaper than NVIDIA's cards, and in new top DX12/Vulkan titles it's outperforming Super cards that cost $80-100+ more. From the midrange on down, they've got NVIDIA covered. How is that dead in the water? I like NVIDIA cards too (they're all I've owned for years), but even I can say that, despite Navi not being revolutionary or bringing anything new to the table, it at least put AMD back on the map against NVIDIA and is doing well. Big Navi might be a surprise wake-up call for NVIDIA.

I always saw pushing RTX as a possible misstep, and so far it seems to have panned out very much in AMD's favor. If AMD pushed out a cheaper, faster card than a 2080 Ti, even with mediocre RT, I think a lot of people would be interested in it. RT is fun to play with but definitely not a “must have”.
 
A CPU over-performing leaves you room to do more things at once. Most people won't run multiple games at once.

That's the general gist I was making

Oh, I get what you're saying. I'm not sure it's a valid comparison, though. The only way you have spare GPU performance is if you intentionally reduce the workload; by no means is the 2080 Ti overkill for the latest games at today's highest resolutions.

Most people don't do enough on their PCs to warrant today's top CPUs, so I would argue the opposite. A 64-thread desktop CPU is cool but not very useful for most folks. With GPUs you can crank resolution and settings; there isn't much you can do as an end user to take advantage of all that CPU horsepower.
 
RDNA is interesting. I truly want it to be the Zen of GPUs, but seeing the leaked 5500 benchmarks, it looks like more of the same from AMD; they seem to have issues with power consumption. Unless this flagship is a hidden Ryzen waiting to be unveiled, I'm thinking "finally a flagship card, but with caveats that make it less attractive than a Ti".
 
RDNA is interesting. I truly want it to be the Zen of GPUs, but seeing the leaked 5500 benchmarks, it looks like more of the same from AMD; they seem to have issues with power consumption. Unless this flagship is a hidden Ryzen waiting to be unveiled, I'm thinking "finally a flagship card, but with caveats that make it less attractive than a Ti".

Let's be honest, it's AMD; there will be caveats. As long as it isn't priced as if there are none, it'll do fine.
 
I'm pretty sure AMD has the ability right now, so it's frustrating to see them not release an RTX 2080 equivalent. Forget Navi 2, bring that out whenever, but in the meantime give us a Navi 2080 Ti. They used to bring out the entire lineup: the 6000 series, 7000 series, and x4xx all had low-, mid-, and high-end cards. Now AMD's lineup is a mess, with the low/mid/high(er)-end slots filled by completely different generations: a 570 4GB, 580, 590, Vega 56 cards, and 57xx cards. They have the tech, so why not bring out the entire Navi lineup and scrap everything else?

We get an announcement that Navi 2 will be announced at CES, and yet we currently have only two mid-range Navi cards.

Even if they have the capability, there's no way they have the capacity.
 
I always saw pushing RTX as a possible misstep, and so far it seems to have panned out very much in AMD's favor. If AMD pushed out a cheaper, faster card than a 2080 Ti, even with mediocre RT, I think a lot of people would be interested in it. RT is fun to play with but definitely not a “must have”.

The problem now is that ray tracing is going to end up being a standard and will require a substantial area of each GPU die devoted solely to it. The tipping point is going to be which matters more: ray tracing, or shader/polygon crunching? A game with a fully ray-traced real-time engine and a GPU with more horsepower dedicated to that, or something of a hybrid like we are seeing now?
 
The tipping point is going to be which matters more: ray tracing, or shader/polygon crunching? A game with a fully ray-traced real-time engine and a GPU with more horsepower dedicated to that, or something of a hybrid like we are seeing now?

Given what is available for 'realistic' offline 3D rendering, there's got to be a 'limit' there somewhere. Obviously we're nowhere close, but essentially that limit is 'how much' ray tracing is needed.

For rasterization, we're likely to hit a tipping point of sorts where raster performance is tied more to resolution than anything else, and once we hit a resolution that exceeds most human eyesight, further gains will start tipping toward ray tracing instead.

Consider, for the sake of argument, that something like 8K240 would be enough. That's about 16 times the raw pixels per second of 4K60, easily achievable in a few decades.
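That 16x figure checks out: 8K has four times the pixels of 4K, and 240 Hz is four times 60 Hz:

```python
def pixels_per_second(width, height, refresh_hz):
    """Raw pixel throughput of a display mode."""
    return width * height * refresh_hz

# 8K240 vs 4K60: (2x width) * (2x height) * (4x refresh) = 16x the pixels/sec.
ratio = pixels_per_second(7680, 4320, 240) / pixels_per_second(3840, 2160, 60)
print(ratio)  # 16.0
```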

And that's in terms of raw performance; tricks like variable-rate shading and the various techniques consoles and VR headsets use to 'lighten the load' bring the real performance needed back to earth pretty quickly.


Lastly, we're not likely to see a move away from a hybrid solution. There are so many things that rasterizers are significantly faster at with little or no visual penalty. It's a more complicated route to take, but perhaps machine learning can be employed to help get us there.
 
The problem now is that ray tracing is going to end up being a standard and require a substantial area of each gpu die to be devoted solely to. The tipping point is going to end up being what is more important? Ray tracing or shader/polygon crunching. A game with a fully ray traced real time engine and a gpu that has more horsepower dedicated to that, or something of a hybrid like we are seeing now.
A game with a fully ray-traced real-time engine will have to wait many years for the hardware, and at the rather high resolutions we'd be talking about, even longer. Current RTX cards do a limited amount of ray-tracing work, used for better lighting effects; virtually all current RTX-enabled games still rely mostly on traditional rendering methods with some help from the RT cores.
 
I read "Nvidia killer" as a 400W RTX 2080 Super competitor at $50 less.

Pretty much. As usual, I'm sure AMD has a bit of "wriggle room" should NVIDIA pull out something else, like an RTX 2070 Super Elite Pro (very unlikely, but it has happened in the past, like the GTX 560 Ti 448).
So it's possible (under such conditions) that the gap goes to $79 or so less; we saw that with AMD's RX 5700 XT vs. the RTX 2060 Super.
 
Sadly, I don't see this happening. It would be a huge feather in their cap if AMD pulled this off, but I don't see it right now.
 
Sadly, I don't see this happening. It would be a huge feather in their cap if AMD pulled this off, but I don't see it right now.
TSMC's 7nm+ is in full production, and yields are as good as their 7nm. Now, who do you think is using that capacity? It may well be AMD building up a stockpile of GPUs for a real hard launch. If AMD jumped on this very early, which it appears they did, there won't be as much contention for the process initially, meaning they may be able to produce as many chips as they can buy. We may well be very surprised early next year.
 
AMD/ATI has had generational strides and design/technology strides well ahead of Nvidia before, but they still couldn't pull out the performance win.

Because when it comes to actual price per performance, they just don't have ENOUGH of an edge.

They have won hands down on TCO, and that is great for server farms and corporate use... but in our market, meaning the gamer/consumer market, we care about performance and stability (hence why early RTX cards were so risky). Cost of ownership we by and large ignore. Sure, we bitch, but the cards still sell and we still buy them even when they are expensive.

So what AMD really needs is a cost-competitive, generational, overall performance win out of the box. If they can do that and effectively 'Maxwell' Nvidia, then I will be pleased... I just don't see that happening as of yet.

Also keep in mind that Nvidia has more market clout and pure $$ resources to win this race. If they get stagnant like Intel did, they will feel the pain. I just haven't seen them getting stagnant.
 
AMD/ATI has had generational strides and design/technology strides well ahead of Nvidia before, but they still couldn't pull out the performance win.

Because when it comes to actual price per performance, they just don't have ENOUGH of an edge.

They have won hands down on TCO, and that is great for server farms and corporate use... but in our market, meaning the gamer/consumer market, we care about performance and stability (hence why early RTX cards were so risky). Cost of ownership we by and large ignore. Sure, we bitch, but the cards still sell and we still buy them even when they are expensive.

So what AMD really needs is a cost-competitive, generational, overall performance win out of the box. If they can do that and effectively 'Maxwell' Nvidia, then I will be pleased... I just don't see that happening as of yet.

Also keep in mind that Nvidia has more market clout and pure $$ resources to win this race. If they get stagnant like Intel did, they will feel the pain. I just haven't seen them getting stagnant.
AMD could beat Nvidia now if they had more access to 7nm capacity. Next year will be very interesting, especially if AMD releases its 7nm+ cards earlier than expected: an improvement to RDNA (how much is anyone's guess) plus a process improvement, which combined can be a rather big result. I'm not sure how ready Samsung's 7nm EUV is; it seems to have been in development forever. Nvidia's huge 12nm chips don't equate to the same performance in smaller chips; even shrinking the Turing architecture down to a 7nm process would leave very large dies. Ampere will need some more magic, I'd say, to get the size down for good enough yields (for the Tesla line that probably doesn't matter). So for next year: big Ampere restricted mostly to Tesla, small Ampere for gamers. I see AMD winning in performance for gamers.
 