Confirmed: AMD's Big Navi launch to disrupt 4K gaming

I'd be surprised if AMD tops $800. Maybe $1000 for the high end factory OC models.

No, they are not a charity, but they have gone hard on price for CPUs recently and with GPUs in the past.

But if we are talking at least 2080 Ti territory, then $799 would potentially be enticing.
 
Your business "understanding" is a bizarre fantasy.

In another thread you argued vehemently that the right move, because of the pandemic, was to simply not launch this year and let your competitors have the market, because somehow you think no sales at all are better than slower-than-normal sales. Business genius you are. :rolleyes:




Because you keep repeating nonsense that begs correction.




See, this kind of nonsense.

Other than when running a "non-profit", profit is the point. Companies exist to make profits, and they will definitely attempt to maximize margins/profits on their products. They would be incompetent if they didn't.

Fudging numbers and shifting cash around is fraud. Why are you equating profit with fraud?

It really seems that you are offended by profits (equating them with fraud), and that the thought of Saintly AMD seeking to maximize the profits on their GPUs is unthinkable to you.

And yet AMD is smart enough not to do something that will damage the very same market they are helping to maintain and create. The fact that you keep preaching profit means you feel that Intel and Nvidia price gouging was justified and that heck, everyone does it. (Hint: No, not everyone does it.)
 
I'd be surprised if AMD tops $800. Maybe $1000 for the high end factory OC models.

No, they are not a charity, but they have gone hard on price for CPUs recently and with GPUs in the past.

But if we are talking at least 2080 Ti territory, then $799 would potentially be enticing.

If it is true that AMD are releasing 3 RDNA 2 chips, then I think the lower 2 will target the 2080 Ti from below & above. Likely $500 for the 6700 XT with slightly less performance than the 2080 Ti, and $700-$800 for the 6800 XT with more performance than the 2080 Ti.

The topmost card, the 6900 XT, will be pushed to the limits to compete with the Ampere RTX 3080 Ti. The price of the 6900 XT is likely to be in the same ballpark as the 3080 Ti.
 
And yet AMD is smart enough not to do something that will damage the very same market they are helping to maintain and create. The fact that you keep preaching profit means you feel that Intel and Nvidia price gouging was justified and that heck, everyone does it. (Hint: No, not everyone does it.)

You aren't making sense. Damage the market how?

I am not "preaching profit". I am pointing out how the world works, to some people that live in an alternate fairy land (Where it seems AMD plays Oberon, king of the non profit Fairies).

Every product manager making any kind of manufactured product anywhere is going to be doing everything in their power to maximize margins/profit for that product. To do otherwise is incompetence. If they don't do it well enough, they will be replaced by someone who does it better.

The silly refrain that NVidia is nefarious for pricing GPUs like this, while AMD is saintly for pricing them the same because saintly AMD supposedly has no choice but to match the prices evil NVidia sets, is utterly absurd.

They are both just profit-driven companies trying to maximize product margins. Attributing fan-based good/evil designations is childish.
 
You aren't making sense. Damage the market how?

I am not "preaching profit". I am pointing out how the world works, to some people that live in an alternate fairy land (Where it seems AMD plays Oberon, king of the non profit Fairies).

Every product manager making any kind of manufactured product anywhere is going to be doing everything in their power to maximize margins/profit for that product. To do otherwise is incompetence. If they don't do it well enough, they will be replaced by someone who does it better.

The silly refrain that NVidia is nefarious for pricing GPUs like this, while AMD is saintly for pricing them the same because saintly AMD supposedly has no choice but to match the prices evil NVidia sets, is utterly absurd.

They are both just profit-driven companies trying to maximize product margins. Attributing fan-based good/evil designations is childish.

Keep it up; there is nothing childish about being observant and noting the facts as they are. I should not need to point out all the nefarious business practices that both Nvidia and Intel have actually pushed and accomplished over the last 15 years. Those practices did nothing but damage customer choice and slow down innovation, among other things. (Severely slow things down, in the case of Intel.)

Modern AMD is saintly in comparison, especially for what they are trying, and succeeding, to do in the market. We are now far ahead of where we were just 3 years ago and have considerably more to look forward to, because of them. The idea of profit at all costs is what got Intel into the trouble they're in now, after all.
 
It's this kind of zealous nonsense that makes debate with ardent fans pointless.

Company worship is absurd behavior. It goes way beyond bias.

Just quit now, dude. I tried and failed. The circular logic is impenetrable. Godspeed, good sir.

Because neither one of you is interested in good, honest debate. Your minds were made up, which is fine, but all you have to do is think about what has occurred in the last 3 years, and what is still occurring, to understand the point.
 
Because neither one of you is interested in good, honest debate. Your minds were made up, which is fine, but all you have to do is think about what has occurred in the last 3 years, and what is still occurring, to understand the point.

I think you need to look in the mirror on that comment. I treat companies as the for profit ventures that they are.

You are the one so biased you worship at the altar of "Saintly AMD".
 
To put this stupid argument to bed: If AMD manages to come out with a GPU that’s top of the line, Lisa Su and her team would have to be fucking idiots not to charge as much as they can get away with. Ryzen processors aren’t priced where they are because AMD is a “saint”. They are priced where they are because that is what AMD feels they can get away with. That is why they raised prices for Ryzen 3000 and Threadripper 3 CPUs compared to the previous generation.
 
To put this stupid argument to bed: If AMD manages to come out with a GPU that’s top of the line, Lisa Su and her team would have to be fucking idiots not to charge as much as they can get away with. Ryzen processors aren’t priced where they are because AMD is a “saint”. They are priced where they are because that is what AMD feels they can get away with. That is why they raised prices for Ryzen 3000 and Threadripper 3 CPUs compared to the previous generation.

Dude, they did not raise prices for the Ryzen 3000 and Threadripper 3 CPUs compared to the previous generation, at all. The 3700X and 3800X are faster than previous generations and yet cost the same or less than before. The 3900X is not only faster than the 1800X but also has more cores and threads, for the same cost. The 3950X is a next-level-up CPU, so of course it is going to cost more.

As for the Threadripper CPUs, the cost is at least equivalent for what the customer is getting: a faster CPU with considerably more cores. It is not like they are throwing the customer a bone with a slightly faster quad core for more money. If AMD comes out with a GPU that is top of the line, they are not going to be idiotic enough to charge $1200 for it. (Consumer GPU, not professional level, of course.)
 
Dude, they did not raise prices for the Ryzen 3000 and Threadripper 3 CPUs compared to the previous generation, at all. The 3700X and 3800X are faster than previous generations and yet cost the same or less than before. The 3900X is not only faster than the 1800X but also has more cores and threads, for the same cost. The 3950X is a next-level-up CPU, so of course it is going to cost more.

As for the Threadripper CPUs, the cost is at least equivalent for what the customer is getting: a faster CPU with considerably more cores. It is not like they are throwing the customer a bone with a slightly faster quad core for more money. If AMD comes out with a GPU that is top of the line, they are not going to be idiotic enough to charge $1200 for it. (Consumer GPU, not professional level, of course.)
I didn't say the prices weren't justified, but they are what they are.

Why would it be stupid? Nvidia proved that there is a market for it. If AMD believes they can get away with that price then that is what they will charge. They would be fools to price their products below what they believe the market will handle.
 
Whoever releases their cards first has a financial obligation to their shareholders to charge the most the market will bear for that product. Whoever releases second or third then gets to price accordingly, based on the numbers set by those before them.
Nvidia, AMD, and Intel are all using the same plants on the same node, so their costs will all be similar (not identical), and their pricing needs to reflect not only the physical costs but the R&D costs as well, and leave a bare minimum of 30% profit on top of that. If the company that launches first comes out of the gate at 30% and somebody is able to challenge them at a slightly cheaper price point, they have nowhere to go without risking taking a loss. A much safer point is 45-55% profit margin; it allows them to respond to changes in the market and not risk falling below that 30% point unless they really have to.
Why 30%, some may ask? Because it is a solid baseline that keeps investors happy; not meeting that baseline tends to devalue stock and anger shareholders and lenders alike. Given that supply for a given chip is not infinite and production runs are limited, strategy must be used to price accordingly, otherwise you are just going to lose money to secondary markets (aka price gougers).
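To put rough numbers on the headroom point (a sketch with a totally made-up unit cost, just to illustrate; this is not anyone's actual BOM):

```python
def price_floor(unit_cost, min_margin=0.30):
    """Lowest price that still preserves the minimum gross margin."""
    return unit_cost / (1 - min_margin)

unit_cost = 400.0  # hypothetical all-in cost to build one card

for launch_margin in (0.30, 0.50):
    launch_price = unit_cost / (1 - launch_margin)  # price at the launch margin
    floor = price_floor(unit_cost)                  # lowest price before dropping below 30%
    headroom = launch_price - floor                 # room left to respond to a competitor
    print(f"launch at {launch_margin:.0%}: price ${launch_price:.0f}, "
          f"30% floor ${floor:.0f}, room to cut ${headroom:.0f}")

# launch at 30%: price $571, 30% floor $571, room to cut $0
# launch at 50%: price $800, 30% floor $571, room to cut $229
```

In other words, launching right at the 30% baseline leaves zero room to react to a price cut, while launching at ~50% leaves a couple hundred dollars of wiggle room per card.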
 
Whoever releases their cards first has a financial obligation to their shareholders to charge the most the market will bear for that product. Whoever releases second or third then gets to price accordingly, based on the numbers set by those before them.
Nvidia, AMD, and Intel are all using the same plants on the same node, so their costs will all be similar (not identical), and their pricing needs to reflect not only the physical costs but the R&D costs as well, and leave a bare minimum of 30% profit on top of that. If the company that launches first comes out of the gate at 30% and somebody is able to challenge them at a slightly cheaper price point, they have nowhere to go without risking taking a loss. A much safer point is 45-55% profit margin; it allows them to respond to changes in the market and not risk falling below that 30% point unless they really have to.
Why 30%, some may ask? Because it is a solid baseline that keeps investors happy; not meeting that baseline tends to devalue stock and anger shareholders and lenders alike. Given that supply for a given chip is not infinite and production runs are limited, strategy must be used to price accordingly, otherwise you are just going to lose money to secondary markets (aka price gougers).

I agree, with the stipulation that margins on individual products are gross margins, and 30% is likely not enough to maintain an overall profitable, high-technology, heavy-R&D business. Here is AMD's gross and net margin over time. Gross margins of 30% usually result in negative net margins:
https://149367133.v2.pressablecdn.com/wp-content/uploads/2018/12/Lenovo-Yoga-C930-Review-01.jpg

Even NVidia went negative when gross margins fell near 30%:
https://www.macrotrends.net/stocks/charts/NVDA/nvidia/profit-margins

50%-60% is probably the healthy target in this kind of business.
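Here is a rough sketch of why that happens (hypothetical numbers, not AMD's or NVidia's actual financials): once R&D and other operating expenses come out of gross profit, 30% gross just isn't enough to stay in the black.

```python
def net_margin(revenue, gross_margin, opex):
    """Net margin after operating expenses (ignoring tax, interest, etc.)."""
    gross_profit = revenue * gross_margin
    return (gross_profit - opex) / revenue

revenue = 6_000.0   # hypothetical annual revenue, $M
opex = 2_200.0      # hypothetical R&D + SG&A, $M

for gm in (0.30, 0.45, 0.55):
    print(f"gross margin {gm:.0%} -> net margin {net_margin(revenue, gm, opex):+.1%}")

# gross margin 30% -> net margin -6.7%
# gross margin 45% -> net margin +8.3%
# gross margin 55% -> net margin +18.3%
```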
 
I agree, with the stipulation that margins on individual products are gross margins, and 30% is likely not enough to maintain an overall profitable, high-technology, heavy-R&D business. Here is AMD's gross and net margin over time. Gross margins of 30% usually result in negative net margins:
https://149367133.v2.pressablecdn.com/wp-content/uploads/2018/12/Lenovo-Yoga-C930-Review-01.jpg

Even NVidia went negative when gross margins fell near 30%:
https://www.macrotrends.net/stocks/charts/NVDA/nvidia/profit-margins

50%-60% is probably the healthy target in this kind of business.

Think your AMD link is copy pasta'd incorrectly (thankfully you didn't paste weird porn)
 
I agree, with the stipulation that margins on individual products are gross margins, and 30% is likely not enough to maintain an overall profitable, high-technology, heavy-R&D business. Here is AMD's gross and net margin over time. Gross margins of 30% usually result in negative net margins:
https://149367133.v2.pressablecdn.com/wp-content/uploads/2018/12/Lenovo-Yoga-C930-Review-01.jpg

Even NVidia went negative when gross margins fell near 30%:
https://www.macrotrends.net/stocks/charts/NVDA/nvidia/profit-margins

50%-60% is probably the healthy target in this kind of business.
Yeah, 30% is an arbitrary number, but it still works as a baseline for "abandon all hope ye who enter here" territory. That nVidia profit margin chart does need to be taken with a slight grain of salt, though, as that huge income dip occurs in early 2009, right in the hardest part of the financial crash; their margins probably dropped as low as they could go, but their sales were probably still terrible, so the volume wasn't there to cope. ATI's numbers (the AMD rebrand happened in 2010) from 2008 and 2009 look similar in shape, only with smaller profits and larger losses, again attributed to lack of sales. It probably didn't help that nVidia had a 30%+ performance lead over ATI at the same price points at that time.
 
Yeah, 30% is an arbitrary number, but it still works as a baseline for "abandon all hope ye who enter here" territory. That nVidia profit margin chart does need to be taken with a slight grain of salt, though, as that huge income dip occurs in early 2009, right in the hardest part of the financial crash; their margins probably dropped as low as they could go, but their sales were probably still terrible, so the volume wasn't there to cope. ATI's numbers (the AMD rebrand happened in 2010) from 2008 and 2009 look similar in shape, only with smaller profits and larger losses, again attributed to lack of sales. It probably didn't help that nVidia had a 30%+ performance lead over ATI at the same price points at that time.

There is always a bigger story than just what the numbers say. But a 30% profit margin is not where you want to be from a manufacturer's standpoint. The general rule of thumb is: if it costs a dollar to make, sell it for two dollars, and R&D and other expenses reduce your margin from there. It gets ugly, though, when you can't move product.
 
There is always a bigger story than just what the numbers say. But a 30% profit margin is not where you want to be from a manufacturer's standpoint. The general rule of thumb is: if it costs a dollar to make, sell it for two dollars, and R&D and other expenses reduce your margin from there. It gets ugly, though, when you can't move product.
Yeah, I remember AMD saying that 45% is their target zone for the stuff coming up, but I don't know what they have included in their cost list for calculating that 45%; they would be very silly if they weren't including some degree of their R&D budget in there.
 
I am a little gun-shy of the 4K disruptor claim; didn't the AMD R9 Fury X claim the same? Unless it is around $400, I'm not interested. We make 6 figures, so it is not an issue of affordability for me to spend more; I just refuse to spend $1200 to play a $60 game at max settings.

I made 3200 last week! Try my program to see how you too can make 6400$$ a minute. And my friend also makes 87,000$$^^ wow every 16 seconds selling this amazing product! Visit www.stupidmoneyadsonforums.cc.net.com.biz for information on how to be a money maniac.

Ok ok, I'm surprised our forums haven't been invaded like this, hah. Your 6 figures comment reminded me of it for some reason, lol.
 
I've not seen this... on a base 2000MHz card, you're saying the average Nvidia card at that base clock can OC to 2500MHz? From the articles I've read, that simply isn't the case.

First, a 2000MHz base card is grossly inaccurate.

The 2080 Ti base clock is 1350MHz; the boost clock is 1545MHz, or 1635MHz on the Founders Edition, which is likely the binned GPUs.
My 2080 Ti is factory OC'd to a 1770MHz boost clock. With my own overclocking in EVGA Precision I get 2100MHz.
2100MHz is 55.6% > base clock
2100MHz is 35.9% > boost clock
2100MHz is 28.4% > FE boost clock, or the binned GPUs' boost clock
2100MHz is 18.6% > my card's factory OC

AMD cards cannot OC this well. They run 110C+ junction temps... My card's temp reading in EVGA Precision hasn't ever hit 80C; mid 70s under load is what I've seen.
AMD GPUs are already balls to the wall from the factory, because if they were not, they simply would not be competitive.

I believe from the above chart that the 28% OC number is the one to compare to AMD. They also likely come in factory OC versions, but are those 35% over the base card's boost clock? Even the binned GPUs can't do that.
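For reference, here is the quick math behind those percentages (a small Python sketch using the clocks listed above):

```python
# Clocks quoted above, in MHz
base, boost, fe_boost, factory_oc, my_oc = 1350, 1545, 1635, 1770, 2100

def pct_over(oc, reference):
    """Percentage by which the OC clock exceeds a reference clock."""
    return (oc / reference - 1) * 100

for name, ref in [("base clock", base), ("boost clock", boost),
                  ("FE boost clock", fe_boost), ("factory OC", factory_oc)]:
    print(f"2100MHz is {pct_over(my_oc, ref):.1f}% over the {name}")

# 2100MHz is 55.6% over the base clock
# 2100MHz is 35.9% over the boost clock
# 2100MHz is 28.4% over the FE boost clock
# 2100MHz is 18.6% over the factory OC
```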
 
First, a 2000MHz base card is grossly inaccurate.

The 2080 Ti base clock is 1350MHz; the boost clock is 1545MHz, or 1635MHz on the Founders Edition, which is likely the binned GPUs.
My 2080 Ti is factory OC'd to a 1770MHz boost clock. With my own overclocking in EVGA Precision I get 2100MHz.
2100MHz is 55.6% > base clock
2100MHz is 35.9% > boost clock
2100MHz is 28.4% > FE boost clock, or the binned GPUs' boost clock
2100MHz is 18.6% > my card's factory OC

AMD cards cannot OC this well. They run 110C+ junction temps... My card's temp reading in EVGA Precision hasn't ever hit 80C; mid 70s under load is what I've seen.
AMD GPUs are already balls to the wall from the factory, because if they were not, they simply would not be competitive.

I believe from the above chart that the 28% OC number is the one to compare to AMD. They also likely come in factory OC versions, but are those 35% over the base card's boost clock? Even the binned GPUs can't do that.

The 5700 XT can overclock up to 2100MHz, just not all of them. Also, the temps you are referring to are from the reference models with the default fan curve. And no, newer AMD cards are not balls to the wall anymore.
 
I've actually had better results overclocking on AMD.

The Nvidia cards I've got either don't OC at all (even small bumps crash), or I can OC (and get a higher frequency) but performance is worse.

But a lot of it is luck of the draw, so I won't try to make any conclusions there.
 
First, a 2000MHz base card is grossly inaccurate.

The 2080 Ti base clock is 1350MHz; the boost clock is 1545MHz, or 1635MHz on the Founders Edition, which is likely the binned GPUs.
My 2080 Ti is factory OC'd to a 1770MHz boost clock. With my own overclocking in EVGA Precision I get 2100MHz.
2100MHz is 55.6% > base clock
2100MHz is 35.9% > boost clock
2100MHz is 28.4% > FE boost clock, or the binned GPUs' boost clock
2100MHz is 18.6% > my card's factory OC

AMD cards cannot OC this well. They run 110C+ junction temps... My card's temp reading in EVGA Precision hasn't ever hit 80C; mid 70s under load is what I've seen.
AMD GPUs are already balls to the wall from the factory, because if they were not, they simply would not be competitive.

I believe from the above chart that the 28% OC number is the one to compare to AMD. They also likely come in factory OC versions, but are those 35% over the base card's boost clock? Even the binned GPUs can't do that.

If you're talking 5700 XT vs 2070 Super, as we were, yes, they have the same OC overhead, as they both come stock at ~1800MHz, both hit ~2100MHz, and both gain ~7 to 10% performance from overclocking.

The 1080 Ti, 2080 Ti, etc. have much lower base clocks than the potential of the architecture, I'm assuming due to not wanting to blow TDPs out of the water.

Your 80C and 70C aren't junction temps; Nvidia doesn't report the junction temp.

Also, my 5700 XT runs at ~42C with a 60C junction.

Good post "GoodBoy"
 
Right, the junction temp tends to be about 20C over the other type of measurement. That would put my junction temps around 95C under load.

The difference is at the lower end of the barrel: the 2070 is obviously a poorer-performing version of the GPU (or might be a completely different chip). But for the AMD chip, it is their "top of the barrel".

So you are saying Nvidia's lower-tier chip is equal in performance and overclocking capability to AMD's top-tier chip? Completely agree.

Good post "Sabrewolf732"
 
Right, the junction temp tends to be about 20C over the other type of measurement. That would put my junction temps around 95C under load.

The difference is at the lower end of the barrel: the 2070 is obviously a poorer-performing version of the GPU (or might be a completely different chip). But for the AMD chip, it is their "top of the barrel".

So you are saying Nvidia's lower-tier chip is equal in performance and overclocking capability to AMD's top-tier chip? Completely agree.

Good post "Sabrewolf732"
Except he said 2070 Super, which is not "Nvidia's low end".
 
Except he said 2070 Super, which is not "Nvidia's low end".

Right, the junction temp tends to be about 20C over the other type of measurement. That would put my junction temps around 95C under load.

The difference is at the lower end of the barrel: the 2070 is obviously a poorer-performing version of the GPU (or might be a completely different chip). But for the AMD chip, it is their "top of the barrel".

So you are saying Nvidia's lower-tier chip is equal in performance and overclocking capability to AMD's top-tier chip? Completely agree.

Good post "Sabrewolf732"

And we are talking about AMD's top end as a reference model with the default fan curve. Increasing the fan curve on my reference 5700 drops temperatures significantly without any appreciable noise increase.
 
This is unquestionably subjective -- any increase in fan speed is going to increase noise. You may not hear it, someone else might.

It is unquestionably reality, despite your argument to the contrary. Otherwise, most Nvidia cards would be noisy according to your standards.
 
It is unquestionably reality, despite your argument to the contrary. Otherwise, most Nvidia cards would be noisy according to your standards.
Wrong. Have you been deafened by your Radeon cards? That's the only way you could say they're quiet...
 
Wrong. Have you been deafened by your Radeon cards? That's the only way you could say they're quiet...

Did you forget about the FX 5800 Ultra? The "Ultra" stood for how dang loud it was. The current cards on the market are not that loud unless you run them at 100%. Usually 75% and below on fan speed is quiet enough, while cooling well enough, for most people. The rest of us slap water cooling on it if it still bothers us, and reference designs are best for that. Oh, and just for reference, the FX 5800 was measured at 77 dB versus 45 dB for the 5700 XT.
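For a sense of how big that gap is, here is some rough math (a sketch only; the usual rule of thumb is that perceived loudness roughly doubles for every 10 dB, and those two figures may not have been measured the same way):

```python
fx5800_db, rx5700xt_db = 77.0, 45.0
delta = fx5800_db - rx5700xt_db      # 32 dB difference

power_ratio = 10 ** (delta / 10)     # ratio of sound power
loudness_ratio = 2 ** (delta / 10)   # rough perceived-loudness ratio (doubles per 10 dB)

print(f"{delta:.0f} dB louder: ~{power_ratio:.0f}x the sound power, "
      f"roughly {loudness_ratio:.0f}x as loud to the ear")
# 32 dB louder: ~1585x the sound power, roughly 9x as loud to the ear
```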
 
Right, the junction temp tends to be about 20C over the other type of measurement. That would put my junction temps around 95C under load.

The difference is at the lower end of the barrel: the 2070 is obviously a poorer-performing version of the GPU (or might be a completely different chip). But for the AMD chip, it is their "top of the barrel".

So you are saying Nvidia's lower-tier chip is equal in performance and overclocking capability to AMD's top-tier chip? Completely agree.

Good post "Sabrewolf732"

So my junction temp still beats your temp; what is your argument? Radeons run hot so they don't OC well? As you can see, temperature is completely dependent on the cooling method being used. The 5700 XT and 2070 Super both use a similar amount of power.

The 2070 Super and 5700 XT have the same overclocking headroom.

Don't understand the point of your post, tbh.

If you're trying to say AMD's recent high-end cards (290, Fury, Vega 64) don't have good headroom, sure, I'll buy that.
 
Wrong. Have you been deafened by your Radeon cards? That's the only way you could say they're quiet...

Wrong, have you been blinded....... oh yeah, never mind, you will never agree but hey, that is not a surprise, enjoy. :D
 
Did you forget about the FX 5800 Ultra? The "Ultra" stood for how dang loud it was. The current cards on the market are not that loud unless you run them at 100%. Usually 75% and below on fan speed is quiet enough, while cooling well enough, for most people. The rest of us slap water cooling on it if it still bothers us, and reference designs are best for that. Oh, and just for reference, the FX 5800 was measured at 77 dB versus 45 dB for the 5700 XT.

Yeah, he tends to be in the "AMD + reference blower = always bad" camp.
 
I had a Titan X Pascal; now THAT was a loud card. And I typically don't care about noise, but it was really bad.

Granted, I had it stuffed in a mini-ITX cube; I actually cut myself and bled just to get it in there. So there was pretty bad airflow, but it worked.

I think that was the loudest card I've used recently. The 5700 XT was loud at launch, but the updated drivers seemed to fix that, it's not too bad at all.

And depending on your thermals and/or noise comfort, you can definitely tweak any card to your desired levels, no need to live with stock fan settings.
 
Dude, they did not raise prices for the Ryzen 3000 and Threadripper 3 CPUs compared to the previous generation, at all. The 3700X and 3800X are faster than previous generations and yet cost the same or less than before. The 3900X is not only faster than the 1800X but also has more cores and threads, for the same cost. The 3950X is a next-level-up CPU, so of course it is going to cost more.

As for the Threadripper CPUs, the cost is at least equivalent for what the customer is getting: a faster CPU with considerably more cores. It is not like they are throwing the customer a bone with a slightly faster quad core for more money. If AMD comes out with a GPU that is top of the line, they are not going to be idiotic enough to charge $1200 for it. (Consumer GPU, not professional level, of course.)

Of course they are going to charge $1200 for a card if their offering is faster than Nvidia's offering at $1200. Or they'll charge $1000-$1100 and make Nvidia charge less.

Do you think that if their Ryzen chips were faster than Intel's at gaming, they would still be cheaper with more cores than Intel's offering? They would never charge over $1K for a consumer-grade CPU... cough... FX-60... cough.
 
The 5700 XT can overclock up to 2100MHz, just not all of them. Also, the temps you are referring to are from the reference models with the default fan curve. And no, newer AMD cards are not balls to the wall anymore.
Benchmark: Red Dead Redemption 2, 1440p, OC'd 5700 XT AE. Slight undervolt, custom fan curve; the OC settings are shown at the end of the video. The game is set to max preset slider settings. GPU clocks do vary. Recording the video takes off about 4 FPS. The card toasts my 1080 Tis in this game. No problem keeping the card cool with +50 power and a custom fan curve; it is noisy, but I couldn't care less. This was done back in December; I have not tested it lately to see if AMD made any further improvements. Got bored with the game, but I will probably go back to it.

 