AMD CEO Confirms 7nm Navi High-End Radeon RX Graphics Cards and 4th Gen Ryzen CPUs For Notebooks

They could but it would burn a lot of their resources to do so. They simply cannot outspend either Intel or Nvidia and have to prioritize where their limited D&D budget goes.

Hmm a D&D budget, I am assuming they are researching the Paladin class :) But yes AMD has to be careful how they spend their money, Lisa Su has done well so far.
 
They could but it would burn a lot of their resources to do so. They simply cannot outspend either Intel or Nvidia and have to prioritize where their limited D&D budget goes.

I think their biggest challenge is being able to get such products made. Toss 2x whatever midrange GPU they're shipping at the time onto a die (monolithically, not CFX-on-a-package) with double the memory bandwidth and they'd be right there at the top.

However, they have to convince a fab to build them, and part of that negotiation is the ability for AMD to commit to future business to get access to that capacity and that's where the lack of certainty as to whether such a part would actually be successful comes in to play. Essentially, AMD lacks the command of the market necessary to have leverage with fabs to build risky parts.
 
I think their biggest challenge is being able to get such products made. Toss 2x whatever midrange GPU they're shipping at the time onto a die (monolithically, not CFX-on-a-package) with double the memory bandwidth and they'd be right there at the top.

However, they have to convince a fab to build them, and part of that negotiation is the ability for AMD to commit to future business to get access to that capacity and that's where the lack of certainty as to whether such a part would actually be successful comes in to play. Essentially, AMD lacks the command of the market necessary to have leverage with fabs to build risky parts.

Fabs will build just about anything if you pay them. Contracts are contracts... and it's not as if AMD doesn't pay its bills. They will build anything AMD wants to contract. The risk is all AMD's. You think TSMC really gives a toss if AMD pays for setup and production of 100,000 parts that don't move? Why would they. Yes AMD has to make those decisions... and they did, which is why we still have Polaris in the market. They chose to focus on fixing their x86 line.

I have worked in industrial supply, and if I ordered 100,000 parts from a MFG and paid for them on delivery, WTF do they care if they sit in my warehouse or I move them in a week? (Other than that I would likely have to reorder.) :) Contract producers produce when contracted if you pay your bills... sales are your problem.

AMD and NV both have a massive advantage over Intel in the fab dept. Owning your fab CAN save you money in theory, as long as everything goes according to plan. When it doesn't, you're the only company on the hook for the costs. When things don't go as planned with outside fabs... costs are shared according to the initial contract. AMD isn't getting any worse a deal in that regard than NV is... they both order, they both pay their bills, they both have 2-3 options for high-end fabs. If TSMC isn't giving either of them the timetables or cost/expense sharing details they like, they can call the competition. Just look at NV moving to Samsung for their 7nm. Not owning your fab means you are free to contract whichever fab can get the job done and deliver.

Unless you know of a case where AMD has stiffed a fab when the bill came, this is just a FUD argument. AMD, NV, Intel, Samsung, Qcom... everyone makes design decisions based on how well they expect things to fab. But if they choose to go risky, the fabs don't really much care... worst case, they may negotiate a lower % of the expense if they don't expect the design to fab well.

AMD has a major advantage over NV on the fab side of things... as they also order tons of x86 chips. One of the main reasons NV chose Samsung for Ampere is because Samsung could deliver on time. Part of the reason TSMC lost out was that they are busy fabbing parts for AMD and Apple, I guess.
 
Because Nvidia's top-end cards (2080 Ti and RTX Titan) are probably priced about 50-75% higher than they would be if we had real competition in the marketplace.

That isn't how business works though. The new price point has been set by market demand; whether you like it or not has no bearing. AMD would have a stakeholder riot if they had a 2080 Ti and were not charging close to nVidia's price.

I 100000% agree with you. But I don't think competition alone will stop it.

Stop buying $1000++ video cards... that WILL stop it.

Agreed, but the reality is the market is huge compared to the market of the '90s and '00s. There are enough people globally buying GPUs that these prices are here to stay (barring recessionary action reducing the market).
 
Fabs will build just about anything if you pay them. Contracts are contracts...

The issue that I'm getting at is that it's not that simple- fab space is allocated to highest bidders and to bidders with a good track record of repeat business. AMD is competing with their CPUs and with other companies, including Nvidia, to get that fab space. And if Apple wants it too? Think AMD is outbidding Apple for fab capacity, for GPUs?

This is the downside to farming out fabrication.
 
The issue that I'm getting at is that it's not that simple- fab space is allocated to highest bidders and to bidders with a good track record of repeat business. AMD is competing with their CPUs and with other companies, including Nvidia, to get that fab space. And if Apple wants it too? Think AMD is outbidding Apple for fab capacity, for GPUs?

This is the downside to farming out fabrication.

Fair, there is competition for fab space, no doubt. AMD is in a good position there though. If they were in a bad position, 7nm Navi wouldn't be on the market and TSMC would be bumping big Navi for Ampere. ;)

I wouldn't doubt they are outbidding Apple... Apple isn't paying through the nose for mobile parts. They are also the type of company that will pound their chest and threaten to just open a competing fab if you don't give them the shit margins they want. If you're TSMC and Apple is going to order half a million parts at 20 points... and AMD is going to order 100k at 40 points... ya, they find room for AMD. And as NV has already proven... if TSMC's answer is "wait a few months for space and this is our cost"... there is always Samsung.
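To put rough numbers on that trade-off, here's a minimal sketch (Python). The volumes, per-part prices, and margin points are all hypothetical, picked only to mirror the example above; the takeaway is that what an order is worth to a fab depends on the price per part as much as on margin points or volume.

```python
# Rough sketch of the trade-off described above: which order is worth more
# to a fab depends on volume, selling price per part, and margin points.
# All numbers below are hypothetical, chosen only to mirror the post's example.

def margin_contribution(units, price_per_part, margin_points):
    """Gross margin the fab keeps on an order (margin_points as a percentage)."""
    return units * price_per_part * (margin_points / 100)

# Hypothetical orders: a huge run of small mobile dies at thin margins vs a
# smaller run of large GPU dies at fatter margins. Prices are made up.
apple_like = margin_contribution(units=500_000, price_per_part=30, margin_points=20)
amd_like = margin_contribution(units=100_000, price_per_part=120, margin_points=40)

print(f"High-volume / low-margin order:   ${apple_like:,.0f}")
print(f"Lower-volume / high-margin order: ${amd_like:,.0f}")
```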

No doubt GlobalFoundries not going 7nm hurts AMD though... it makes it hard to negotiate when your only other fab option doesn't have the same track record.
 
That isn't how business works though. The new price point has been set by market demand; whether you like it or not has no bearing. AMD would have a stakeholder riot if they had a 2080 Ti and were not charging close to nVidia's price.

Demand in isolation means nothing. It is one part of the equation. The other is supply. The ONLY reason Nvidia can get away with their high prices is because they are the only player in the high end market, so they are really only competing with themselves.

As soon as there is a second player in the market, they will try to undercut each other and pricing should come down for both of them.

Research suggests that you don't get a truly price competitive market until you have three or more competitors though.
 
Fair, there is competition for fab space, no doubt. AMD is in a good position there though. If they were in a bad position, 7nm Navi wouldn't be on the market and TSMC would be bumping big Navi for Ampere. ;)

It's not 'none' or 'as much as they want', but 'more' or 'less'. The more they try to order, the more it's going to cost, and fabs won't build out capacity without knowing that they have a diverse demand for that additional capacity- meaning that AMD can't just say 'make us something three times the size with half the quantity' and then expect to pay the same $/mm^2 they are now.

No doubt GlobalFoundries not going 7nm hurts AMD though... it makes it hard to negotiate when your only other fab option doesn't have the same track record.

GloFo is mostly just hurting themselves, or just watching over the business till it dies. Not advancing their technology means that they're not likely to survive.

It also gives AMD an out- if they have a design that GloFo simply cannot produce, then they can start wiggling around contract requirements.

They can also start courting Samsung more, or other smaller fabs if need be.
 
I am so stupid, and such a nerd, that I would probably get a high-end AMD card (more expensive than a 5700 XT) even though a 5700 XT will suit me just fine.
And my excuse will be "if I upgrade from my 1080p screen", which I bloody well know I will not do unless it breaks or I win a substantial lotto prize. But they will have to sell such a card a lot cheaper than what the current high-end Nvidia cards cost, cuz I am never going there.

5-600 USD I could probably do, but it will have to be one helluva deal, even if I am stupid and nerdy as hell.
 
5-600 USD I could probably do, but it will have to be one helluva deal, even if I am stupid and nerdy as hell.

I paid MSRP for my 1080Ti. Never spent that much on any part except for a monitor, and that only once- because it was the height of the crypto craze, and I got a 1080Ti with built-in AIO for the price that the OEM cards were going for.

Never thought I'd feel like a US$700 GPU would be a good deal- mostly don't plan on repeating that.
 
AMD can't charge $1,200 for a comparable 2080 Ti card for two reasons:
1. RTX, at the same price and performance you will get the card with added features (just in case)
2. Brand, NVidia is the bigger player and at the same price you don't go with the smaller company.

To beat a larger and more powerful competitor you don't come out with the same performance at the same price; you need to offer better performance or better price-to-performance to convince people to switch.

It's immaterial for me as I'm not going to pay over $500 for a GPU anytime in the next 5 years.
 
1. RTX, at the same price and performance you will get the card with added features (just in case)

If AMD decides to get around to implementing DXR in hardware, they most likely can- and supposing they don't roll out their typical assed-up reference GPUs, they could easily get away with charging equivalent Nvidia MSRP minus say US$50.
 
Well for the reasons a lot of AMD fans boast about: No tensor cores, no RTX die space used and on 7nm. So they should be able to sell them for a substantially lower price right?
I think it's a case of perceived value vs. actual cost to make. I doubt making silicon with tensor cores and hardware ray tracing costs meaningfully more than excluding those features. Sure, AMD can undercut, especially if they don't have the features; it makes some sense from a marketing standpoint and on the business end, with investors. However, they still gotta make a profit of some amount. Right now, they are likely able to play a little more of a price game due to the generally higher yield of 7nm, not due to the lack of ray tracing, etc. This stuff still costs money to make.
 
And AMD is doing just fine on their relationships with fabs. It behooves anyone to do business with AMD, because whatever AMD wants to make will be spun into a version for PlayStation and Xbox. PlayStation just hit 100 million units sold, yet again. That's a relationship which a fab would want to foster.

Nvidia is lucky AMD doesn't have a mobile presence. Otherwise, AMD would likely have Switch, too. Switch was a really great deal for Nvidia to get on.
 
I think it's a case of perceived value vs. actual cost to make. I doubt making silicon with tensor cores and hardware ray tracing costs meaningfully more than excluding those features. Sure, AMD can undercut, especially if they don't have the features; it makes some sense from a marketing standpoint and on the business end, with investors. However, they still gotta make a profit of some amount. Right now, they are likely able to play a little more of a price game due to the generally higher yield of 7nm, not due to the lack of ray tracing, etc. This stuff still costs money to make.
Umm... Right now there isn't much cost benefit to putting RTX (DXR) in; there probably will be in the future. The downside is larger dies, reducing yields and increasing the chances of a defective chip. At the performance range these GPUs are in (2060 Super, 2070), it wouldn't make much sense to put in RTX, as it has been pointed out that the only real viable RTX GPU right now is the 2080 Ti. This *may* change, but more than likely it will be another generation or two before it's useful on more than just the top end.
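To put a rough number on the "larger dies reduce yields" point: a common back-of-the-envelope model is yield ≈ exp(−defect density × die area). A minimal sketch below; the defect density and die areas are made-up placeholders, not actual TSMC or NVIDIA figures, so only the direction of the effect matters.

```python
import math

# Simple Poisson yield model often used for back-of-the-envelope estimates:
# yield = exp(-defect_density * die_area). The defect density and die areas
# below are assumptions for illustration, not real foundry numbers.

def poisson_yield(die_area_mm2, defects_per_mm2):
    return math.exp(-defects_per_mm2 * die_area_mm2)

defect_density = 0.001   # defects per mm^2 (hypothetical)
base_die = 250           # mm^2, a mid-range die without RT/tensor hardware (hypothetical)
bigger_die = 300         # mm^2, the same design plus dedicated RT/tensor area (hypothetical)

print(f"Yield without extra blocks: {poisson_yield(base_die, defect_density):.1%}")
print(f"Yield with extra blocks:    {poisson_yield(bigger_die, defect_density):.1%}")
```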
 
Umm... Right now there isn't much cost benefit to putting RTX (DXR) in; there probably will be in the future. The downside is larger dies, reducing yields and increasing the chances of a defective chip

This is really only relative to an Nvidia product without RTX. We'd need the same featureset seen on the 1660 line but at a higher performance level to begin to make a comparison, and we'd still be guessing because we don't know what volume Nvidia is ordering.

At the performance range these GPUs are in (2060 Super, 2070), it wouldn't make much sense to put in RTX, as it has been pointed out that the only real viable RTX GPU right now is the 2080 Ti. This *may* change, but more than likely it will be another generation or two before it's useful on more than just the top end.

Consider that one of the largest issues with DXR we've seen is that it's been shoehorned into games that were already 'finished', from an engine development perspective. So the lower-end RTX cards- and potential future AMD cards with hardware DXR support- should become more viable for DXR going forward as developers build in support at lower levels and tune for the mix of resources that are expected to be available in the upcoming consoles.

And AMD is doing just fine on their relationships with fabs. It behooves anyone to do business with AMD, because whatever AMD wants to make will be spun into a version for PlayStation and Xbox. PlayStation just hit 100 million units sold, yet again. That's a relationship which a fab would want to foster.

It's not 'AMD'- remember that the demand for the console SoCs comes from Microsoft and Sony and Nintendo. Further, each SKU needs its own consideration because each is going to require a unique run, and that's what drives fab costs. So for every separate die AMD wants to produce they must bargain with their fabs over wafer costs and so on to make all of the sunk costs of the associated production run worth it, keeping in mind that AMD is both competing with their own wafer orders for other products and with the orders from other, far larger companies.
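A quick sketch of that per-SKU math (Python; all figures are hypothetical placeholders, not real foundry numbers): the one-time costs of a unique production run get spread over however many units that SKU ships, which is why volume commitments matter so much in these negotiations.

```python
# Sketch of why each separate die/SKU needs its own business case: the one-time
# costs of a production run (mask set, bring-up, etc.) get amortized over however
# many units that SKU ships. All figures are hypothetical placeholders.

def effective_cost_per_die(nre_cost, wafer_cost, dies_per_wafer, units):
    wafers = units / dies_per_wafer
    return (nre_cost + wafers * wafer_cost) / units

# Same hypothetical die, two very different volumes:
for units in (100_000, 5_000_000):
    cost = effective_cost_per_die(nre_cost=30_000_000, wafer_cost=6_000,
                                  dies_per_wafer=100, units=units)
    print(f"{units:>9,} units -> ~${cost:,.0f} per die after amortizing one-time costs")
```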

Nvidia is lucky AMD doesn't have a mobile presence. Otherwise, AMD would likely have Switch, too. Switch was a really great deal for Nvidia to get on.

AMD has a mobile presence- they're just too poorly performing to gain any marketshare. Remember that the SoC in the Xbox One and PS4 is a 'mobile' SoC, but in no way would it be of the appropriate performance / watt for an actual mobile device. Works well enough for a console running off mains, but that's it.

Same for notebooks really. Laptop dGPUs are hard limited by TDP, and well, as above, AMD falls short here. You see their APUs in a few budget chassis and that's about it, as the battery drain is just too high to consider them for more premium smaller laptops and their dGPUs are simply going to be significantly slower than an Nvidia dGPU for any specific chassis.

[note that there are documented cases in this space of lower-end parts outperforming higher-end parts from the same vendors due to how the performance / watt equation shakes out- this is a challenging market segment]
 
This is really only relative to an Nvidia product without RTX. We'd need the same featureset seen on the 1660 line but at a higher performance level to begin to make a comparison, and we'd still be guessing because we don't know what volume Nvidia is ordering.
I'm not really sure what you meant by this, but I was responding to chameleoneel, who said "I doubt making silicon with tensor cores and hardware ray tracing costs meaningfully more than excluding those features". I was saying there IS a real cost to it, and the benefit is still to be seen. I was just trying to point that out. I'm not saying it was the right or wrong move, just why it was considered and *most likely* why they chose to leave it out on their mid-range cards.

Consider that one of the largest issues with DXR we've seen is that it's been shoehorned into games that were already 'finished', from an engine development perspective. So the lower-end RTX cards- and potential future AMD cards with hardware DXR support- should become more viable for DXR going forward as developers build in support at lower levels and tune for the mix of resources that are expected to be available in the upcoming consoles.

This is all speculation. NVidia spent some time helping 'shoehorn' it in with/for developers, so it's not as if they had no clue and were just winging it. I do know a couple of future games have announced support, so hopefully they start rolling out and we can get some real performance numbers. I'm wondering if we might start seeing RTX (more likely DXR, since it's multi-platform) level adjustments, something like they did with shadows and reflections, etc., as those were all new technologies at one point. Minimal would only use it for small things or more focused things, while extreme would have it enabled on all surfaces that could be reflective or something (and some settings in between).

It's not 'AMD'- remember that the demand for the console SoCs comes from Microsoft and Sony and Nintendo. Further, each SKU needs its own consideration because each is going to require a unique run, and that's what drives fab costs. So for every separate die AMD wants to produce they must bargain with their fabs over wafer costs and so on to make all of the sunk costs of the associated production run worth it, keeping in mind that AMD is both competing with their own wafer orders for other products and with the orders from other, far larger companies.

I'm not exactly sure what you're arguing here. AMD is the one placing orders with the foundries as far as I know, and it does significantly increase AMD's presence even if they are different SKUs. Repeat business is a big motivator. I mean, you are correct that there is limited availability, so they are in essence competing with themselves and others. But the same could be said of Intel... they are competing for space on their own foundries and don't have an easy option to use a second foundry if needed.
"For example, in the previous calendar year AMD ordered over ten million of chips for Microsoft Xbox One and Sony PlayStation 4 to Taiwan Semiconductor Manufacturing Co. Given that sales of the two consoles are only going to increase, AMD’s chip orders will raise as well."
https://www.kitguru.net/components/...s-to-make-semi-custom-apus-cpus-gpus-for-amd/

AMD has a mobile presence- they're just too poorly performing to gain any marketshare. Remember that the SoC in the Xbox One and PS4 is a 'mobile' SoC, but in no way would it be of the appropriate performance / watt for an actual mobile device. Works well enough for a console running off mains, but that's it.
Same for notebooks really. Laptop dGPUs are hard limited by TDP, and well, as above, AMD falls short here. You see their APUs in a few budget chassis and that's about it, as the battery drain is just too high to consider them for more premium smaller laptops and their dGPUs are simply going to be significantly slower than an Nvidia dGPU for any specific chassis.

[note that there are documented cases in this space of lower-end parts outperforming higher-end parts from the same vendors due to how the performance / watt equation shakes out- this is a challenging market segment]

*This* Here's hoping that's the next place they start putting focus. They have the best APUs (performance-wise) but couldn't seem to get it together to put good products out.
 
That isn't how business works though. The new price point has been set by market demand; whether you like it or not has no bearing. AMD would have a stakeholder riot if they had a 2080 Ti and were not charging close to nVidia's price.

Well that is not quite true is it? NVIDIA has the only cards at the top three tiers. If there was legit competition prices would be forced to drop unless there was fixing. That is how the market is supposed to work. Now if 700 dollars to 1300 dollars in two years is normal inflation then please educate me.
 
Well that is not quite true is it? NVIDIA has the only cards at the top three tiers. If there was legit competition prices would be forced to drop unless there was fixing. That is how the market is supposed to work. Now if 700 dollars to 1300 dollars in two years is normal inflation then please educate me.

It may be hard to find them, but don't these start at $1000? I see a couple of new ones for $1100 on Newegg.

There was no process transistor density change for this generation, and no significant performance-per-transistor increase either, so all the performance gains had to come from making a much larger die, which is particularly painful for the top-end part.

The 2080 Ti die is 754 mm², which is the biggest die of any consumer GPU ever, and each generation of process also gets more expensive, so the die costs dwarf those of the "big" dies of prior generations.

The business reality in this situation is that the price must go up a lot to cover the increased production cost, or the product would never be built.
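For a sense of scale, here's a minimal sketch (Python) of the gross die count per 300 mm wafer for TU102 (754 mm², per the figure above) versus the previous big consumer die, GP102 in the 1080 Ti (~471 mm²). The wafer price is a placeholder rather than a real TSMC quote, and yield is ignored, so treat the per-die dollar figures as illustrative only.

```python
import math

# Back-of-the-envelope: how many more candidate dies fit on a wafer at
# ~471 mm^2 (GP102 / 1080 Ti) vs 754 mm^2 (TU102 / 2080 Ti), and what that
# does to raw cost per die. Wafer price is a placeholder, yield is ignored.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard gross-die approximation (ignores scribe lines and edge exclusion)."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

wafer_cost = 6000  # USD per 300 mm wafer -- hypothetical, not a real quote

for name, area in [("GP102 (1080 Ti, ~471 mm^2)", 471),
                   ("TU102 (2080 Ti, 754 mm^2)", 754)]:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n:.0f} dies/wafer, ~${wafer_cost / n:.0f} per gross die")
```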

It's the same reason AMD probably skipped a lot of high end competition with NVidia. If you are going to build something that would have to sell for an unsustainable profit margin, then there is no point building it.
 
Well that is not quite true is it? NVIDIA has the only cards at the top three tiers. If there was legit competition prices would be forced to drop unless there was fixing. That is how the market is supposed to work. Now if 700 dollars to 1300 dollars in two years is normal inflation then please educate me.

You pretty much missed my point. In no way would AMD deliver a $700 2080 Ti. You can whine about how we got here all you want, but a big part of that reality is what the consumer was and is willing to spend. If no one buys $1300 cards, we will cease seeing $1300 cards.
 
You pretty much missed my point. In no way would AMD deliver a $700 2080 Ti. You can whine about how we got here all you want, but a big part of that reality is what the consumer was and is willing to spend. If no one buys $1300 cards, we will cease seeing $1300 cards.

If your point is that AMD would only release 2080 Ti-level cards at $1200, I would have to say that you are not correct. I would honestly see their card going for $700 or so, at least for a reference version, if nothing else.
 
You pretty much missed my point. In no way would AMD deliver a $700 2080 Ti. You can whine about how we got here all you want, but a big part of that reality is what the consumer was and is willing to spend. If no one buys $1300 cards, we will cease seeing $1300 cards.
In my opinion, while it is fun to guess what price AMD will release at, there is really no point arguing the price point, as the only one who really knows the price point is AMD.
 
It may be hard to find them, but don't these start at $1000? I see a couple of new ones for $1100 on Newegg.
You can find the "black editions" for 1095 USD. Most cards are still well above 1200, with the majority at or near 1300. Also, I was referring to the price at launch. The 1080 Ti launched at 699, and you could buy them for that. I was wrong btw, it was less than two years before the 2080 Ti.

You pretty much missed my point. In no way would AMD deliver a $700 2080 Ti. You can whine about how we got here all you want, but a big part of that reality is what the consumer was and is willing to spend. If no one buys $1300 cards, we will cease seeing $1300 cards.

Maybe, but I am not a "whiner" and did not call you names. If two companies have a close to identical product but one has market share and name recognition, the only way to claim some of that market share is to undercut. Then the other company has to follow suit or lose market share. You see, it's not just about what people are willing to pay, it's also about what is available. Have you never heard of companies slowing production to inflate price? Not legal, but done all the time. I am an astute consumer, in my own mind. You see, I would not pay 1300 dollars for a video card, but someone or many more naive consumers might and do. That is where competition and market correction come in.
 
Have you never heard of companies slowing production to inflate price? Not legal, but done all the time. I am an astute consumer, in my own mind. You see, I would not pay 1300 dollars for a video card, but someone or many more naive consumers might and do. That is where competition and market correction come in.

I am pretty sure companies can set their production to whatever they want, and price however they want. Perfectly legal.

It's only an issue if there is collusion between companies to manipulate pricing.
 
Well that is not quite true is it? NVIDIA has the only cards at the top three tiers. If there was legit competition prices would be forced to drop unless there was fixing. That is how the market is supposed to work. Now if 700 dollars to 1300 dollars in two years is normal inflation then please educate me.

Every Nvidia product is competing at least with the next higher and next lower cards in Nvidia's lineup. In those few spaces where AMD competes, there's that, and the big one is that most people don't need a dGPU at all, as most consumer CPUs from Intel and a few from AMD come with quite competent iGPUs.
 
Nobody said they weren't, but saying they are coming doesn't imply any time frame... Better Nvidia cards are coming too... see, it gives you absolutely no information yet is factually true.

?
Yes, many are saying that AMD won't release another GPU until 2020. This tidbit further hints at AMD releasing a new, bigger GPU in just a few months' time, for the 2019 holiday season.
Dr Su has been making subtle hints about additional GPUs coming out this year (2019) & she always undersells herself.

You can play all the word games you want, but next month is September and then fall starts. So (if true), we will see a Radeon RX 5800 series with the power of a 2080 Super in games, for about $599.



And NEXT YEAR... about 13 months from now, when Nvidia releases their 1st 7nm GPU & AMD releases its 4th 7nm GPU, is when we will have the ray tracing wars and bigly powerhouses.
 
AMD releases its 4th 7nm GPU

Which is AMD's 2nd 7nm architecture, and 1st new architecture on 7nm. Ampere will be Nvidia's 1st and 1st new architecture on 7nm, so they'll be competing on equal ground. Well, sort of. AMD can't match Nvidia's 12nm architecture with their 2nd 7nm architecture, so it looks like they're probably going to be trolling the low-end again.
 
If your point is that AMD would only release 2080 Ti-level cards at $1200, I would have to say that you are not correct. I would honestly see their card going for $700 or so, at least for a reference version, if nothing else.

Given the Fury X, Vega 64, etc., where AMD was +$100 over where they should have been, I am highly skeptical they will undercut nVidia that much.
 
Given the Fury X, Vega 64, etc., where AMD was +$100 over where they should have been, I am highly skeptical they will undercut nVidia that much.

Oh no, if they feel that they have a superior product, they are absolutely going to wring every cent out of The Faithful that they can!
 
Given the Fury X, Vega 64, etc., where AMD was +$100 over where they should have been, I am highly skeptical they will undercut nVidia that much.

The Vega 64 was $499 and therefore nowhere near overpriced. However, the Fury X and Fury Nano? We will just agree not to speak of that....... ;)

Oh, and my Openbox Sapphire 5700 for $284 is faster than the Vega 64.
 
5600 Series
5700 Series (Navi 10)
5800 Series
5900 Series

Those are the only 4 AMD needs for utter dominance of the gaming world.
 
However, the Fury X and Fury Nano? We will just agree not to speak of that....... ;)

I was a bit envious of whichever version came with an AIO stock. I'm also still surprised that that wasn't a launch option straight from AMD for all HBM GPUs; those are basically tailor-made for AIO cooling.
 
I was a bit envious of whichever version came with an AIO stock. I'm also still surprised that that wasn't a launch option straight from AMD for all HBM GPUs; those are basically tailor-made for AIO cooling.

Fury X, and they actually had widespread pump problems they refused to acknowledge. Deleting community posts and everything. I had one...
 
Given the Fury X, Vega 64, ect where AMD was +$100 over where they should have been I am highly skeptical they will undercut nVidia that much.

They also used HBM, which made them more expensive.
 