AMD's best Big Navi GPU may only be a match for Nvidia Ampere's second tier

NVidia does not give out those numbers, so conjecture from the Reddit addicts is about the best anybody can offer, but they shouldn't be too far off, and to date they have been pretty accurate for both NVidia's and AMD's cards.

That's not what you claimed earlier, here let me quote you:

according to NVidia's financials they make about $45 profit on. AMD is not selling the $500 GPU, the AIB is.

So now you're backpedaling, saying NVIDIA doesn't disclose that information, and instead pointing me to Reddit nonsense.
 
XSX equal to a 2080Ti.

It's in between. But since Navi, their TFLOPS better reflect gaming performance, whereas GCN's were more reflective of compute. I expect Big Navi to simply maintain that. So if the Xbox's RDNA 2 performance is anything to go by, I think Big Navi should be at minimum 30% faster than a 2080 Ti.
 
Since when are TFLOPS an indication of gaming performance across different vendors?
It's not, but since Navi they reflect it better relative to Nvidia than GCN did. GCN was more compute-oriented.
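For reference, the TFLOPS figure being argued over is just vendor math: shader count × 2 FP32 ops per clock (an FMA) × clock speed. A minimal sketch with the publicly stated XSX and 2080 Ti configurations; how much of that peak a game actually extracts is exactly the RDNA-vs-GCN (and AMD-vs-Nvidia) difference being debated:

```python
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 throughput: shaders x 2 ops per clock (FMA) x clock speed."""
    return shaders * 2 * clock_ghz / 1000

# Publicly stated configurations:
print(f"Xbox Series X: {peak_tflops(3328, 1.825):.2f} TFLOPS")  # 52 CUs x 64 lanes
print(f"RTX 2080 Ti:   {peak_tflops(4352, 1.545):.2f} TFLOPS")  # reference boost clock
```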
 
That's not what you claimed earlier, here let me quote you:

according to NVidia's financials they make about $45 profit on. AMD is not selling the $500 GPU, the AIB is.

So now you're backpedaling, saying NVIDIA doesn't disclose that information, and instead pointing me to Reddit nonsense.

I also stated I made a math mistake. NVidia doesn't disclose specifics, but they do give profit margins. So again I will respond that $45 was incorrect; it's closer to 45%, which on a $300 part would be $135. That's for the GPU core itself, not the physical card, and given NVidia's financial reports it is accurate, unless NVidia is lying on them.
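The arithmetic in dispute, for what it's worth (the 45% margin and $300 figure are from this post; whether a company-wide gross margin applies per-GPU is exactly what's being argued):

```python
def per_unit_profit(price: float, gross_margin: float) -> float:
    """Gross profit on one unit at a given margin."""
    return price * gross_margin

print(per_unit_profit(300, 0.45))  # $135.00 on a $300 GPU core at 45%
print(per_unit_profit(300, 0.55))  # $165.00 at the ~55% figure claimed for NVIDIA below
```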
 
I also stated I made a math mistake. NVidia doesn't disclose specifics, but they do give profit margins. So again I will respond that $45 was incorrect; it's closer to 45%, which on a $300 part would be $135. That's for the GPU core itself, not the physical card, and given NVidia's financial reports it is accurate, unless NVidia is lying on them.

Actually, 45% is what AMD makes; NVIDIA's gross margin is higher, closer to 55%. And you claimed NVIDIA made $45 on a 2080 Ti, then said Reddit backed up those bullshit numbers, and now you're saying something completely different. Thanks for playing.
 
Actually, 45% is what AMD makes; NVIDIA's gross margin is higher, closer to 55%. And you claimed NVIDIA made $45 on a 2080 Ti, then said Reddit backed up those bullshit numbers, and now you're saying something completely different. Thanks for playing.
Sure man, my loss, hats off to you, good sir. I should learn to double-check my math before I post things and not try to recall them from memory before the coffee has kicked in. Next time I will take the time to pull up the source first so I can pull numbers from it, but for shits and giggles I will leave you with this excerpt from the now-archived Reddit post working out the specific die costs for the 1080 Ti and 2080 Ti, just because it's there, not because it's relevant any longer. These would be strict manufacturing costs, not including any R&D overhead or other costs they would have to factor in, and this is from back at launch; I'm sure yield rates have improved over time, but it at least serves as an example of the kind of research these Reddit guys put into trying to crunch the numbers.

Anywhoo, again, I've said my piece and I am dead wrong, so good day to you, sirs.

"http://www.semi.org/en/node/50856
The minimum silicon cost reached with 300mm diameter wafers is about $3 per square inch, resulting in a maximum cost per wafer of $400.
Quoting from: https://www.fool.com/investing/gene...omics-nvidia-corps-geforce-gtx-1080-chip.aspx
By using Silicon Edge's dies per wafer estimator tool, a wafer of GM200 chips should pack a total of 91 dies. A wafer of GP104 chips, on the other hand, should be able to cram in approximately 180 dies. This passes a basic sanity check because the GP104 is about half the size of the GM200.
Now, if we assume that the 28-nanometer process, by virtue of its maturity, has a very low defect density of 0.01 defects per square centimeter, then -- using iSine's die yield calculator tool -- NVIDIA should get around 76 good chips per 28-nanometer wafer.
Dividing the wafer cost by the number of good dies yields a cost of $58.68 per chip.
Doing the same calculation for the 16-nanometer GP104, but assuming a defect density of 0.015 defects per square centimeter (since it is a less mature process), yields around 164 good dies per wafer. Dividing the estimated 16-nanometer wafer costs by this figure leads to a die cost estimate of $47.43.
So on a GTX 1080, Nvidia spends an estimated $47.43 per GPU die including covering costs of bad yields.
Now I don't have an RTX 2080 ti, but we have some measurements;
https://www.techpowerup.com/247005/nvidia-geforce-rtx-2080-ti-tu102-die-size-revealed
The rectangular die measures 31 mm x 25 mm, or 775 mm². The package has no IHS, but a metal brace along the periphery of the fiberglass substrate distributes mounting pressure from the cooler. NVIDIA is building the "Turing" family of GPUs on TSMC 12 nm FinFET node.
I'm going to presume this is on a 300mm wafer; correct me if I'm wrong. This works out at 61 dies per wafer.
Now going with a defect density of 0.1 for a new process, which gives a pretty shocking 48.4% success rate, that pumps out 30 good dies per wafer.
-=-=-=-=-=-=-=-=-=-
Let's throw a guess out there that a 300mm 12nm wafer costs a massive $8,000 for a new process (as an immature 16nm wafer was estimated to cost around $7,500). So 8,000 divided by 30 = $266.67 (rounded up). (If you know the correct cost please say.)
This should be pretty much an absolute worst-case scenario. And that puts the RTX 2080 Ti at a per-die cost $219.24 higher than a GTX 1080's (at launch). And the price difference between the two?
  • GTX 1080 Launch MSRP: $549 - FE: $699
  • RTX 2080 Ti Launch MSRP: $999 - FE: $1,199
Realistically speaking a GPU die shouldn't have yields that bad. The Motley Fool reckoned that the immature 16nm process would have a defect density of 0.015 defects per square centimeter.
Now let's double that: a defect density of 0.03 defects per square centimeter would be pretty rough, but that gives 49 good dies, which lowers that to $163.27 per die for the RTX 2080 Ti.
And the difference between $47.43 and $163.27? $115.84.
So by my calculations, the RTX 2080 Ti GPU die alone could cost $115.84 more than a GTX 1080's to produce."
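For anyone who wants to poke at the excerpt's math: the quoted 48.4% yield and 49-good-die figures line up with Murphy's yield model, Y = ((1 − e^(−AD))/(AD))². A minimal sketch assuming that model and the post's inputs (775 mm² die, 61 candidate dies per wafer, and the guessed $8,000 wafer cost); swap in other defect densities to see how sensitive the whole estimate is to that one unknown:

```python
from math import exp

def murphy_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Murphy's yield model: Y = ((1 - e^(-A*D)) / (A*D))^2, with A in cm^2."""
    ad = (die_area_mm2 / 100) * defects_per_cm2
    return ((1 - exp(-ad)) / ad) ** 2

def cost_per_good_die(wafer_cost: float, gross_dies: int,
                      die_area_mm2: float, defects_per_cm2: float) -> float:
    """Wafer cost spread over the dies that survive defect yield."""
    good_dies = round(gross_dies * murphy_yield(die_area_mm2, defects_per_cm2))
    return wafer_cost / good_dies

# TU102 (RTX 2080 Ti), all inputs taken from the quoted post:
print(f"D=0.10: ${cost_per_good_die(8000, 61, 775, 0.10):.2f}")  # worst case, ~$266.67
print(f"D=0.03: ${cost_per_good_die(8000, 61, 775, 0.03):.2f}")  # 'rough' case, ~$163.27
```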
 
For some odd reason it seems like people have been saying this is coming for over 6 years now. I just want it to come; junk or not, let's get this thing out.
 
You think MSI makes more than AMD on a graphics card sale? Nope, no, impossible.
Yeah, I was off on my numbers and explained it poorly. AMD makes about 45% on their GPUs; the AIBs make closer to 20%. But with the new cards the bulk of the cost has been in the actual boards themselves and less in the chips, so no, MSI and the other AIBs are not making a larger margin by any means. But if an AIB sold a card for $500, maybe only $150 of that was the actual GPU and the rest was the board and its components, something like that. Either way I phrased things poorly, so any misunderstanding from that or subsequent posts is 100% on me.
 
So by my calculations, the RTX 2080 Ti GPU die alone could cost $115.84 more than a GTX 1080's to produce."

I think the problem with the above is the assumption about defect rates, which unfortunately nobody outside TSMC and NVIDIA knows. However, even if cost per die went up, so did the price of the GPU, so their margins stayed high and record profits were obtained. Also, it would be more realistic to compare the 1080 to the 2080, not the Ti. Things are different now that the 2080 Super is in the equation.
 
The Ampere 3080 is considered 2nd tier? I thought the 3070/3060 would be considered 2nd tier. If Big Navi is competitive with the 2080 Ti or 3080 at competitive pricing, then it's a big win.
 
The Ampere 3080 is considered 2nd tier? I thought the 3070/3060 would be considered 2nd tier. If Big Navi is competitive with the 2080 Ti or 3080 at competitive pricing, then it's a big win.
There are still the 3080 Ti and 3090 cards.
 
In short: any argument saying that AMD doesn't care about what happens on desktop is coming from a place of ignorance (which was my response to DukeNukemX). My overall point in both this post and the previous post is that console sales matter far less to AMD than anything happening on server/workstation or hell even desktop.
Either way it's a conflict of interest. To me it seems AMD isn't even trying. When the RX 5700s were priced higher than anyone expected, and Nvidia then countered by lowering the prices of their GPUs, forcing AMD to lower theirs again, that tells me AMD doesn't want to compete that much. Those prices should have been much lower from the start, especially without ray tracing support.

The issue is that AMD may be too reliant on semi-custom and custom offerings to stay afloat. Remember, AMD has their hardware in Stadia, the PS5, the Xbox, Apple products, etc. Maybe not Apple for much longer. AMD has less reason to appeal to the PC gaming market.
 
They should just rename this place Rumor and Speculation Forum.

Can't remember the last time we got information that wasn't second-hand from some anonymous tipster.
 
Either way it's a conflict of interest. To me it seems AMD isn't even trying. When the RX 5700s were priced higher than anyone expected, and Nvidia then countered by lowering the prices of their GPUs, forcing AMD to lower theirs again, that tells me AMD doesn't want to compete that much. Those prices should have been much lower from the start, especially without ray tracing support.

The issue is that AMD may be too reliant on semi-custom and custom offerings to stay afloat. Remember, AMD has their hardware in Stadia, the PS5, the Xbox, Apple products, etc. Maybe not Apple for much longer. AMD has less reason to appeal to the PC gaming market.
Whoever gets to the party first gets to set the pace: NVidia launches first, so they get to set the price; AMD chooses how to respond, and NVidia can counter accordingly. But I agree AMD has little interest in the consumer GPU market, especially at the higher end. Their TSMC time is precious; they have signed agreements with Microsoft, Sony, and Lenovo for large quantities of chips that AMD can't afford not to deliver. Then they have all their CPUs as well, which are selling like hotcakes and which they are already struggling to keep up with. So AMD has to keep innovating their GPUs to keep up with the server and workstation business, not to mention the consoles, but they also make major bank on the budget end of the PC market, which isn't the highest-volume segment but is still significant. I expect them to come hard at the sub-$500 market, and I'm excited to see what they bring there, but they can't afford to play at the high end. They can't spare the time at TSMC.
 
Either way it's a conflict of interest.
It can't be. As I explained in the most simple of terms, their chips provided are on contract. They make the money whether a console sells or it doesn't.

But let's say that it did matter, for whatever reason. The simplest illustration is that AMD/nVidia are weapons manufacturers whose goal is to sell to anyone that will buy. They don't really give a rat's ass who that is (they don't care if they're selling guns to both sides, in case that isn't clear; sell guns to both the US and China under contract? Sure, who cares?). And it doesn't really affect them negatively either way. Their goal is more money > less money. There are people on this forum who game on both console and desktop. Are you saying that AMD doesn't want them to buy 2 GPUs, or even 3 if they buy multiple consoles? Seriously, try to describe how that makes sense.

In order for your position to make sense, what you say about AMD would also have to hold true for nVidia. It's also not like nVidia didn't try to get those contracts. They do sell (essentially Shield) to Nintendo. Is nVidia concerned that if you buy a video card you won't buy a Switch? Really? The only reason they didn't land those contracts and AMD did is that AMD can design CPUs and GPUs together in the form of APUs, which nVidia cannot, meaning console manufacturers would have needed an extra supplier, complicating their supply chain (and also increasing cost). Otherwise both contracts could just as easily have gone nVidia's way.

To me it seems AMD isn't even trying.
If it isn't apparent, making a GPU is incredibly difficult. It takes hundreds of people and, if you count the money invested in fabs, billions of dollars.

Intel has tried multiple times to make a GPU. They have well over 10x the operating income that AMD does and haven't been able to compete in any meaningful way other than in mobile, and that's strictly because they're building GPUs into processors for low-power solutions.

So sorry you feel they aren't working hard enough to meet your expectations. Considering their operating budget, who they can afford to pay to do the design work, and the many other restrictions they have, being #2 in a two-horse race is pretty damn good. For there to be even a third option would, to reiterate, require billions of dollars in investment, not to mention incredibly talented people with deep GPU knowledge and experience.

The issue is that AMD may be too reliant on semi-custom and custom offerings to stay afloat. Remember, AMD has their hardware in Stadia, the PS5, the Xbox, Apple products, etc. Maybe not Apple for much longer. AMD has less reason to appeal to the PC gaming market.
Semi-custom keeps them afloat. They're a business. Their goal isn't to cater to you in particular but to what the market demands at large. AMD is also incredibly diversified, way more than nVidia. So if you're going to point the finger at AMD for being too reliant on the market segments they're involved in, consider that nVidia is involved in far fewer.

Considering top-end GPUs account for <1% of total sales (in the consumer market; workstation and server are different), it doesn't exactly make sense to even have the best-performing halo product. It makes far more sense to create products that offer bang for the buck, both for themselves (R&D dollars, fab space, etc.) and for their customers (most people will always be buying somewhere in the mid tier and below). These are just facts. You and every other armchair critic don't actually have to deal with the financial realities. But whether you like them or not, AMD has to, same as everyone else. Considering AMD was on the verge of bankruptcy not even 3 years ago, I'd say they're doing something right.

Another way of saying this, back to analogies, is that the Toyota Corolla is far more valuable than a Toyota Supra. Or even more to the point, a Toyota Corolla is more valuable than a Lamborghini Gallardo. You could argue that "Toyota isn't even trying," but that would be as absurd a statement about car manufacturers as it is about GPU manufacturers. AMD's interest is always in selling cards. AMD just may not be as interested in selling top-end cards as you want them to be. But to say they aren't interested at all in one of their top two segments is an absurd distortion of the facts.

And this is only after being swayed by rumormongering. Truth is, no one knows how next gen will stack up. We have, at this moment, zero hardware in hand and zero benchmarks from anyone remotely relevant. It's possible AMD will be on top this go-around; I'll say it isn't likely. But I'm thinking it will be a much closer fight, except perhaps for ultra-premium options that we have no way of knowing about one way or another.
 
Are you saying that AMD doesn't want them to buy 2 GPUs, or even 3 if they buy multiple consoles? Seriously, try to describe how that makes sense.
Most people don't buy both a console and a desktop gaming PC. Most people buy one and usually stick with it. Sony and Microsoft know this, and they don't want AMD making PC GPUs that are compelling to PC gamers. You have to admit the specs of these consoles are impressive, and as a PC gamer you may not want to spend money upgrading your hardware if you plan to buy a console.
In order for your position to make sense, what you say about AMD would also have to hold true for nVidia. It's also not like nVidia didn't try to get those contracts.
We saw what happened during the PS3/360 era, when the PC gaming market nearly died. The price of graphics cards was sky-high and nobody could run Crysis. The PC gaming market was a joke back then.

They do sell (essentially Shield) to Nintendo. Is nVidia concerned that if you buy a video card you won't buy a Switch? Really?
Well, no, because the Switch isn't exactly in the same market as PC/PlayStation/Xbox. The Switch is just too weak a game console to be competitive. A single core of my Ryzen 2700 can emulate the entire CPU of the Switch. It's a very slow machine that probably won't see many 3rd-party games from this point forward.
The only reason they didn't land those contracts and AMD did is that AMD can design CPUs and GPUs together in the form of APUs, which nVidia cannot, meaning console manufacturers would have needed an extra supplier, complicating their supply chain (and also increasing cost). Otherwise both contracts could just as easily have gone nVidia's way.
From what I've heard, Sony and Microsoft were considering both ARM and x86, but x86 won because at the time ARM's 64-bit wasn't ready. Also, Nvidia does have CPUs through ARM.

Intel has tried multiple times to make a GPU. They have well over 10x the operating income that AMD does and haven't been able to compete in any meaningful way other than in mobile, and that's strictly because they're building GPUs into processors for low-power solutions.
From what I remember about Intel's Larrabee, they had a competition between the Larrabee team and the people who make Intel's integrated graphics, and the latter won. Also, Intel wanted to integrate Pentium x86 cores into the GPU, which wasn't very efficient or effective.

Considering top-end GPUs account for <1% of total sales (in the consumer market; workstation and server are different), it doesn't exactly make sense to even have the best-performing halo product. It makes far more sense to create products that offer bang for the buck.
I'm not interested in the top-end GPU market and nobody else should be either. The problem is that AMD doesn't serve the mid-range or low-end markets very well. The RX 5700 to me seemed like a mid-range product that should have sold for $250, but that didn't happen. Strangely enough, AMD then released the 5600, which isn't all that good either and still costs nearly $300. And yet AMD doesn't have any product in the $200-$250 range. They would clearly penetrate the market more if they just sold at lower prices to gain a larger market presence. AMD's problem is that they're always $20 cheaper but also slightly slower than Nvidia. That doesn't look to change with RDNA2.
 
Most people don't buy both a console and a desktop gaming PC. Most people buy one and usually stick with it. Sony and Microsoft know this, and they don't want AMD making PC GPUs that are compelling to PC gamers. You have to admit the specs of these consoles are impressive, and as a PC gamer you may not want to spend money upgrading your hardware if you plan to buy a console.
Right. So to reiterate what you're saying: console gamers aren't PC gamers. Extrapolating from that, and from knowing the marketplace: you will (generally speaking) never convert a console gamer into a PC gamer. With those two basic premises in place, your argument argues against itself, because what you're saying is that there really is no conflict. Selling at the top of both those stacks makes AMD more money than if they only did one or the other, because each individual will only choose one or the other. In which case: I agree. Although that is literally the direct opposite of your point and position.

What Sony and MS want is irrelevant. Unless Sony and MS are going to pay AMD to give up everything else they sell in order to have exclusivity (which would cost billions), they don't have the power or the inclination to stop AMD from "selling to both sides," which is the same analogy I made before. Using your logic, Sony should want AMD to stop selling APUs to Microsoft, and Microsoft should want AMD to stop selling APUs to Sony. And all the PC part manufacturers should want AMD to stop selling to Sony and Microsoft. And all of them should care, and all of them should be preventing the others from being in each other's markets. And not only should they want that, they should be enforcing it. It doesn't work that way. At all. If anything, Sony and Microsoft care far more about their console competition than about what is happening in the desktop space.

To reiterate: Sony and Microsoft have just as little power to prevent AMD from selling into the desktop space as they do to stop AMD selling to their much more direct console competitors (namely, literally each other). Which is to say: none.

We saw what happened during the PS3/360 era, when the PC gaming market nearly died. The price of graphics cards was sky-high and nobody could run Crysis. The PC gaming market was a joke back then.
Citation needed. There has never been a time when PC gaming "almost died." PCMR aside, there has been a lot of hyperbole and a lot of overgeneralization.
If there ever was a time that gaming almost crashed completely, it was the early 1980s, with the Atari crash and Nintendo essentially bringing the entire market back from the dead.

Well, no, because the Switch isn't exactly in the same market as PC/PlayStation/Xbox. The Switch is just too weak a game console to be competitive. A single core of my Ryzen 2700 can emulate the entire CPU of the Switch. It's a very slow machine that probably won't see many 3rd-party games from this point forward.
Using your own logic, if you're buying a console you're not buying a PC. And if you're buying a console as a console gamer, you're also not likely to buy a second console. So functionally, whether you think the graphics are "relevant" or not, the way people buy the Switch and use it exclusively for gaming is no different from the Xbox, PS4, or PC.
The Switch's power is irrelevant. As is emulation, which, legalities aside, is an incredibly niche practice, and also far beyond the scope of what we're talking about.
More to the point, you're ignoring the problems with your own arguments. As you've noted yourself, it would have been possible for nVidia to get either or both of the other two console contracts. And if they had, using your logic, you're saying that nVidia would intentionally hamper their desktop products in order to appease Sony and/or Microsoft? Because that's basically your entire argument up to now. And it's absurd.

From what I've heard Sony and Microsoft were thinking of ARM and x86, but x86 won because at the time ARM's 64-bit wasn't ready. Also Nvidia does have CPU's through ARM.
Sure. But nVidia ARM is incredibly immature at best. There are, as an example, no phones I know of that use nVidia ARM chips. It's all Qualcomm and Samsung (excluding Apple). nVidia was trying to build something with the Shield, but ironically they stopped bothering with it after the Switch.
More to the point, you're continuing to argue against your own points. As you note yourself, nVidia is at the top of GPU sales and would gladly have taken both the PlayStation and Xbox contracts if they could have. And there wouldn't have been any "conflict" like you've been claiming for the last several posts.

From what I remember about Intel's Larrabee, they had a competition between the Larrabee team and the people who make Intel's integrated graphics, and the latter won. Also, Intel wanted to integrate Pentium x86 cores into the GPU, which wasn't very efficient or effective.
Great. Still not competing.

I'm not interested in the top-end GPU market and nobody else should be either. The problem is that AMD doesn't serve the mid-range or low-end markets very well. The RX 5700 to me seemed like a mid-range product that should have sold for $250, but that didn't happen. Strangely enough, AMD then released the 5600, which isn't all that good either and still costs nearly $300. And yet AMD doesn't have any product in the $200-$250 range. They would clearly penetrate the market more if they just sold at lower prices to gain a larger market presence. AMD's problem is that they're always $20 cheaper but also slightly slower than Nvidia.
AMD has a lot of problems on the GPU side, not least of which are fab space and drivers that work properly out of the box. However, the 5700 XT has been a boon for them. Even if you're not interested in it, the market has been.
Also, I'm more than sure their accountants have done a "very basic" break-even analysis. Profit matters far more than "market presence," especially considering they can only produce so many chips. With that in mind, maximizing profit per chip matters far more than selling more chips for less, since they are physically incapable of selling more.

If you don't like their pricing, their strategy, or their product lineup you have nVidia. Vote with your wallet.

That doesn't look to change with RDNA2.
I'd love to have your crystal ball. You can predict all of AMD's product stack when no one else has any clue what they're doing. With that in mind, I'd like you to tell us exactly every RDNA2 part they will release over the six months from November to June.
 
If the rumors about RDNA2 are true, then the PS5's and Xbox Series X's TFLOPS may be a terrible metric for determining their performance, and they may actually be a lot slower than we anticipated. Look at Halo Infinite's graphics on the Xbox Series X. It's not looking too different from Halo 3.



You are actually watching 343i's ineptitude. Nothing there was running on the XSX; nothing shown so far from Microsoft has been running on the console after two game shows, with Phil Spencer bragging about having a working XSX at home since December 2019, unlike Sony, whose only game show showed games running on the console.(*)

The Halo Infinite gameplay has been the best PS5 launch-day commercial Sony could ask for, to the point that the latest poll in the UK came out a lopsided 85/15 PS5/XSX. My point is that you shouldn't judge the console's capabilities based on inept developers, especially ones that look inept judging by their Glassdoor reviews.

Edit: (*) I forgot the GTA V thing at the very beginning. The XSX should be able to do better, but the rumor has always been that Microsoft is very late on the devkits and SDK, to the point that RAD Game Tools has published multiple tests of their Oodle Texture compression helper on actual game data, and the results are amazing for the PS4/PS5 and PC (for those willing to license it). Microsoft talks a lot but is lacking actual output. They still have 4 months, though, but everything points to them being a bit too raw still.
 
One day: “AMD’s Big Navi can’t compete with Nvidia’s top tier cards.”
Random AMD fan: “Huh, where are the facts in this biased article?”

Next day: “This just in, from the mouth of a completely unreliable industry source. Big Navi will be 400% faster than the 3080 Titan Ti and cost $150!”
Same AMD fan: “Hell yeah! I love you Dr. Su! Fuck Nvidia and Intel!”
 
WOW guys, anyone saying that AMD isn't interested in the GPU market anymore because they have the consoles is way off the mark.

Consoles have been a nice little earner, but even with the small market share AMD has in the GPU and CPU markets, those divisions still bring in more revenue than the consoles. And that difference is going to grow, according to estimates.

What is it? Something like 18% CPU, 30% GPU, 5% server CPU, and 12% laptop. And people think AMD isn't interested in getting more of that market share?

what are you guys smoking?
 
Most people don't buy both a console and a desktop gaming PC. Most people buy one and usually stick with it.

Even if that's true, their choice has little to do with the speed of PC graphics cards.

You have to admit the specs of these consoles are impressive, and as a PC gamer you may not want to spend money upgrading your hardware if you plan to buy a console.

An extremely short-lived comparison. How long do you expect these consoles to seem "impressive"?

We saw what happened during the PS3/360 era, when the PC gaming market nearly died. The price of graphics cards was sky-high and nobody could run Crysis. The PC gaming market was a joke back then.

Well, this is obviously false to anyone who was alive back then.
 
I don't give much credence to any of the nearly daily rumors.

I just look at facts, and do reasonable extrapolation.

We know for a fact that "Big Navi" is coming this year. AMD has repeatedly said this.
Big Navi will be RDNA 2, and RDNA 2 will have ray tracing (DXR) support.
AMD has access to the same world-class process technology as NVidia and is no longer saddled with the GlobalFoundries albatross limiting its GPUs.
RDNA 1 brought a big gain in gaming transistor efficiency. Vega was hopeless: AMD needed massively more transistors than NVidia to offer the same performance, so there was no chance AMD could compete for the top end even if they wanted to. The deficit was just too great. But with RDNA they are now close to equal. I won't quibble about who is in front; they are close enough that AMD can compete for the top end if they want to, for the first time in a long time.

So this is where we are: AMD has fixed the two things (process and architecture) that practically prevented them from competing for the high end. AMD has access to the same process tech and a GPU architecture with essentially near transistor parity, so for the first time in a long time we have real potential for competition at the top end.

So what can we extrapolate from that?

IMO, if they were each to build a 700mm² chip, they would perform similarly on the same process.
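For scale, the usual gross dies-per-wafer approximation (wafer area over die area, minus an edge-loss term) puts such a hypothetical 700mm² die at roughly 75 candidates per 300mm wafer, before any yield loss; the die size here is this post's speculation, not a leak:

```python
from math import pi, sqrt

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Common first-order estimate of candidate dies per wafer (no yield applied)."""
    radius = wafer_diameter_mm / 2
    return int(pi * radius ** 2 / die_area_mm2
               - pi * wafer_diameter_mm / sqrt(2 * die_area_mm2))

print(gross_dies_per_wafer(700))  # ~75 candidate dies for a hypothetical 700 mm^2 GPU
```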

The biggest difference will end up being who makes the bigger chip. Given recent history, NVidia usually pushes the die-size boundary farther.

If I were to place odds, they would be on NVidia building a bigger chip and eking out the top-end win again, but with AMD much closer behind. Both cards will offer significant gains over the 2080 Ti.

I expect AMD's top part will be significantly ahead of the 3080 (not the Ti), but will likely fall short of whatever crazy-train, giant-die 3090/Titan NVidia releases for $1500+...

This is definitely shaping up to be the best competitive launch in ages, and I just hope we get some good perf/$ gains. But while this may be the best-aligned launch in ages, I don't expect anyone really wants a price war to give us a big perf/$ boost.
 
…when they released the R9 390 along with the R9 Fury. It wasn't until the RX series that AMD actually released new mid- and low-range products that weren't rebrands. AMD being dominant in the console market means Sony and Microsoft won't be happy seeing AMD make better-priced GPUs for the PC market.
I don't give much credence to any of the nearly daily rumors.

I just look at facts, and do reasonable extrapolation.

Get out of here with your sound logic and reason.
 
Most people don't buy both a console and a desktop gaming PC.

Eh, but most PC gamers also buy consoles. I can't think of a single one of my PC gaming friends who doesn't own and use at least one console.

We saw what happened during the PS3/360 era, when the PC gaming market nearly died.

That was like, a golden age for PC gaming and hardware.

I'm not interested in the top-end GPU market and nobody else should be either.

Everyone wants the best card, even if they'll settle for the cut-down version. The halo part is a massive market indicator.
 
This is definitely shaping up to be the best competitive launch in ages, and I just hope we get some good perf/$ gains. But while this may be the best-aligned launch in ages, I don't expect anyone really wants a price war to give us a big perf/$ boost.

Pretty much the only two things that seem clear at this point: a) we'll finally see competition at the top end after years of disparity, and b) neither AMD nor nV will be doing the consumer any favors on price/performance. Come November, I'm expecting 2080 Ti-level performance will cost nearly what it does today.
 
dukenukemx said:
I'm not interested in the top-end GPU market and nobody else should be either. The problem is that AMD doesn't serve the mid-range or low-end markets very well. The RX 5700 to me seemed like a mid-range product that should have sold for $250, but that didn't happen. Strangely enough, AMD then released the 5600, which isn't all that good either and still costs nearly $300. And yet AMD doesn't have any product in the $200-$250 range. They would clearly penetrate the market more if they just sold at lower prices to gain a larger market presence. AMD's problem is that they're always $20 cheaper but also slightly slower than Nvidia. That doesn't look to change with RDNA2.

???

The 5700 XT is often available for ~$350 to ~$375. It's ~5-7% slower than the 2070 Super, which I've only seen as low as $499. The 5700 XT is closer to 10% faster than a 2060 Super, which I've only seen as low as $400.

????

Your argument is sound for the 5600. Hard to recommend the 5600 over the 2060 KO when the 2060 has DLSS 2.0.
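Putting the street prices quoted above into rough perf-per-dollar terms (the performance deltas and prices are the post's claims, not benchmark data):

```python
# Performance normalized to the RX 5700 XT = 1.00, per the deltas quoted above.
cards = {
    "RX 5700 XT":     (1.00, 360),  # ~$350-375 street
    "RTX 2070 Super": (1.06, 499),  # ~5-7% faster, seen as low as $499
    "RTX 2060 Super": (0.91, 400),  # ~10% slower, seen as low as $400
}

for name, (perf, price) in cards.items():
    print(f"{name}: {1000 * perf / price:.2f} perf per $1,000")
```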
 
But with RDNA they are now close to equal. I won't quibble about who is in front; they are close enough that AMD can compete for the top end if they want to, for the first time in a long time.
AMD is 'about equal' while being a node ahead.
 
Right. So to reiterate what you're saying: console gamers aren't PC gamers.
You're trying to put words in my mouth. What I'm saying is that people typically can't afford to buy every machine. That's why you have fanboys; that's why you had Genesis vs. Super Nintendo fanboys. Obviously the best thing to do is buy all the gaming machines, but people usually don't have that kind of disposable income. Some do; most don't.
Using your logic, Sony should want AMD to stop selling APUs to Microsoft, and Microsoft should want AMD to stop selling APUs to Sony.
You don't think that was a problem at some point for AMD?
And all of them should care, and all of them should be preventing the others from being in each other's markets.
Businesses have done it in the past. Ask Intel; they're experts at it.
And not only should they want that, they should be enforcing it.
Putting words in my mouth again.
It doesn't work that way.
It shouldn't work that way but history has shown that it has.

Citation needed. There has never been a time when PC gaming "almost died." PCMR aside, there has been a lot of hyperbole and a lot of overgeneralization.
I don't have any evidence other than personal experience. Not all 3rd-party games were ported to PC, and what did get ported was done terribly. I believe Batman: Arkham Knight turned things around for PC ports. Games like Dark Souls had terrible ports and made it seem like PC gamers were second-class citizens. That's why Dark Souls Remastered left a bad taste in my mouth, since From Software didn't want to fix the mess that was Dark Souls back in 2012.
Using your own logic, if you're buying a console you're not buying a PC.
Again, you're putting words in my mouth. The Switch is too weak to get some of the 3rd-party titles you see on PS4/XB1/PC. Hence, it isn't directly competing in that market.
The Switch's power is irrelevant.
I feel it is very relevant, and Switch owners will know what I mean once the PS5 and Xbox Series X are released.

And if they had, using your logic, you're saying that nVidia would intentionally hamper their desktop products in order to appease Sony and/or Microsoft? Because that's basically your entire argument up to now. And it's absurd.
If the money were greater on console, they would. But Nvidia owns the majority of the discrete GPU market. Nvidia probably wouldn't see console gaming as a lucrative enough business to let it affect their other sources of income. AMD, on the other hand, is barely scratching the discrete GPU market.

There are, as an example, no phones I know of that use nVidia ARM chips. It's all Qualcomm and Samsung (excluding Apple). nVidia was trying to build something with the Shield, but ironically they stopped bothering with it after the Switch.
Because Nvidia's SoCs weren't all that much better than Qualcomm's and Samsung's and cost more. It isn't exactly hard to build an ARM SoC; just ask China and all the Allwinner-type chips they fart out.
More to the point, you're continuing to argue against your own points. As you note yourself, nVidia is at the top of GPU sales and would gladly have taken both the PlayStation and Xbox contracts if they could have. And there wouldn't have been any "conflict" like you've been claiming for the last several posts.
Nvidia hasn't had the best history when it comes to working with others.


I'd love to have your crystal ball. You can predict all of AMD's product stack when no one else has any clue what they're doing. With that in mind, I'd like you to tell us exactly every RDNA2 part they will release over the six months from November to June.
AMD is just going to make a high-end GPU that'll be around $700, with maybe a slightly cheaper and slower version. We've seen AMD do this before with the R9 290, the Fury, and the Radeon VII. But if Nvidia's RTX 3070 costs $400 and performs like an RTX 2080 Ti, then AMD is in big trouble. That would be intentional on Nvidia's part, as I believe Nvidia doesn't want PC gamers buying these consoles instead of their GPUs.
 
They should just rename this place Rumor and Speculation Forum.

Can't remember the last time we got information that wasn't second-hand from some anonymous tipster.


https://hardforum.com/threads/confi...-to-disrupt-4k-gaming.1992290/post-1044671950

Looks like rogame of hardwareleaks.com is doubling down on his claim of Big Navi == 2x Navi 10 (5600 XT, 5700, 5700 XT)



https://www.thefpsreview.com/2020/0...have-80-compute-units-5120-stream-processors/

cc @erek @Schro

Big Navi: Navi21
  • Up to 80 compute units/5120 GPU cores
  • A die size of around 505mm²
  • 50% better performance per watt
There are 4 Navi21 variants for the gaming market:

(attachment: table of the four Navi21 gaming variants)

https://hardwareleaks.com/2020/05/23/exclusive-future-amd-gpu-stack-big-navi-navi10-refresh/


https://videocardz.com/newz/amd-sienna-cichlid-navi-21-big-navi-to-feature-up-to-80-compute-units
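If the leaked 80 CU / 5120-shader configuration is real, the same peak-FP32 math from earlier in the thread gives a rough range. The clocks below are pure assumptions for illustration; nothing about Navi21's clocks has been confirmed:

```python
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    # Peak FP32: shaders x 2 ops per clock (FMA) x clock speed
    return shaders * 2 * clock_ghz / 1000

for clock_ghz in (1.7, 1.9, 2.1):  # assumed clocks, not leaked figures
    print(f"{clock_ghz} GHz -> {peak_tflops(5120, clock_ghz):.1f} TFLOPS")
```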

 