What happened to AMD's 'Big Supply'?

AMD continues to shoot themselves in the foot.
The launch of the 6800 and 6900 series cards drew interest because, despite using slower GDDR6 versus Nvidia's GDDR6X, they offered more of it.
With Nvidia upping the ante in terms of VRAM capacity even on cards that use GDDR6, AMD may find themselves between a rock and a hard place in terms of sheer sales figures.
The fact that Nvidia has decided to increase VRAM, coupled with their additional features, may influence user decisions moving forward.
I suspect Nvidia will start bringing significantly more cards to market now, even from their lower-tier parts... RTX 3060, 3070, etc.
Every sale of an Nvidia GPU represents one less for AMD.
AMD made a great product this time out and they knew it.
The irony is... AMD may fall victim to their own success and greed this time out... much as Intel has and Nvidia will at some point.

The window between AMD's launch and the launch of Nvidia's Ti/Super/Ampere refresh cards was really AMD's time to shine. Don't know about you, but I've seen zero AMD cards here since they supposedly "launched". While Nvidia is bad as well, I at least periodically see 3090s and 3060 Tis hit over here. Not fast enough for me to pick one up as a working person, but still.
 
Consoles are targeting 4K now, so that's not really true. Consoles have had good graphics, just often shitty frame rates to maintain them.
They are targeting upscaled 4K with lots of tricks. Granted, Nvidia can use DLSS now, but still.

I also think that 10GB won't be a limiting factor, heck, not even 8GB in the foreseeable future. Especially when (if) DLSS and DirectML become more commonplace.
 
Consoles are targeting 4K now, so that's not really true. Consoles have had good graphics, just often shitty frame rates to maintain them.

Display resolution != texture quality

Every game I have has a separate setting for the two. If it truly made no difference, as you're suggesting, any PC game running at 1080p and maxed-out settings should only look as good as its PS4 counterpart, but with higher FPS. And we know that isn't the case.

That's before we even get into the myriad of other settings unrelated to display resolution that also take up VRAM.
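To put rough numbers on that (a minimal back-of-envelope sketch with made-up but plausible figures, not measurements from any particular game): render targets scale with the display resolution, while texture memory scales with the texture setting and count, so the two budgets move largely independently.

```python
# Back-of-envelope VRAM split: render targets vs. textures.
# All figures are illustrative assumptions, not data from a real game.

def render_targets_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Resolution-dependent memory: back/depth/intermediate buffers."""
    return width * height * bytes_per_pixel * buffers / 2**20

def textures_mb(count, tex_size, bytes_per_texel=1, mip_overhead=1.33):
    """Texture-setting-dependent memory (assuming block-compressed ~1 byte/texel);
    independent of the display resolution."""
    return count * tex_size**2 * bytes_per_texel * mip_overhead / 2**20

print(f"1080p render targets : {render_targets_mb(1920, 1080):8.0f} MB")
print(f"4K render targets    : {render_targets_mb(3840, 2160):8.0f} MB")
print(f"500 textures at 2K   : {textures_mb(500, 2048):8.0f} MB")
print(f"500 textures at 4K   : {textures_mb(500, 4096):8.0f} MB")
```

Bumping the texture setting moves gigabytes around, while changing the display resolution only moves tens of megabytes of render targets (more in a deferred renderer with a fat G-buffer, but the point stands).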
 
Display resolution != texture quality

Every game I have has a separate setting for the two. If it truly made no difference, as you're suggesting, any PC game running at 1080p and maxed-out settings should only look as good as its PS4 counterpart, but with higher FPS. And we know that isn't the case.

That's before we even get into the myriad of other settings unrelated to display resolution that also take up VRAM.
I'm talking more about the PS5/XBX than previous gen. The newer consoles have the capability to do 4K, and all I'm suggesting is that most games will stay within those confines. Will the occasional PC version go further? Of course, but you will be on the margins, which most won't care about.
 
I'm talking more about the PS5/XBX than previous gen. The newer consoles have the capability to do 4K, and all I'm suggesting is that most games will stay within those confines. Will the occasional PC version go further? Of course, but you will be on the margins, which most won't care about.

Current consoles have the capability to do 4K. Capability does not mean they should.

A huge part of the 4K problem is that everyone has a different opinion of what 4K capability is and what performance is acceptable. Personally, I highly doubt we will see true 4K (encompassing all textures and IQ settings at a minimum of 4K pixels, etc., without compromise) for a long time yet.

It's a stupid marketing gimmick to sell things, because buzzwords work and technical details put people to sleep.
 
I'm talking more about the PS5/XBX than previous gen. The newer consoles have the capability to do 4K, and all I'm suggesting is that most games will stay within those confines. Will the occasional PC version go further? Of course, but you will be on the margins, which most won't care about.
I know what you're talking about; I'm giving you an example from 1080p and the previous gen, which we have data for, to show that just because you have a target resolution doesn't mean it's going to look the same. Go ahead and launch a few games and you'll see that texture quality and display resolution are two completely different settings. The PS4 was targeting 1080p, but a maxed-out 1080p game on PC looks better than it does on PS4, especially games released in the last few years.

The same will be true with the PS5. Initially they will be very close, but as newer, more powerful hardware becomes available, PCs get it and consoles don't, and a 4K PS5 game won't look as good as a 4K PC game.
 
Something else to consider is that AMD is competing with Apple and other manufacturers for TSMC wafer space. Nvidia is currently on Samsung 8nm, which probably has more plentiful wafer capacity and is cheaper.

I know RTX 3000 cards are power hogs, but from a manufacturing/inventory perspective, Nvidia made a very smart decision. They are outpacing AMD in the PC GPU space, while AMD is focusing on console chips.
 
I know what you're talking about; I'm giving you an example from 1080p and the previous gen, which we have data for, to show that just because you have a target resolution doesn't mean it's going to look the same. Go ahead and launch a few games and you'll see that texture quality and display resolution are two completely different settings. The PS4 was targeting 1080p, but a maxed-out 1080p game on PC looks better than it does on PS4, especially games released in the last few years.

The same will be true with the PS5. Initially they will be very close, but as newer, more powerful hardware becomes available, PCs get it and consoles don't, and a 4K PS5 game won't look as good as a 4K PC game.
Again, I'm not disputing that PC will look better, especially as time goes on; that's a given everyone knows, with the possible exception of HDR implementation.
 
A few things to keep in mind.

1) VRAM usage in games is only going to go up, and once you saturate the 10GB on the 3080, performance will tank drastically (see the rough bandwidth sketch after this post). If you upgrade with each new GPU cycle, you probably don't need to worry much about this. But if you're an "every other generation" buyer like myself, it's an important factor.

2) AMD historically is terrible at extracting full potential from their cards at launch and needs several driver updates before that happens.

A great example of both these points is the 7970 vs. GTX 680 era. At launch, the 680 was the card to get. It was faster, far more efficient, and IIRC it was even a bit cheaper for a very short period of time before AMD reduced its price. A few months later AMD's driver team got their ish together and it was equal to or better than the 680. Not long after that, the 2GB on the 680 became a severe limiting factor and the 7970 was a very clear favorite.
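On point 1, a rough way to see why exceeding the VRAM budget hurts so badly (a sketch using approximate public bandwidth figures, not a benchmark): anything that spills over PCIe into system RAM is touched at a small fraction of local VRAM speed.

```python
# Why blowing past the VRAM budget "tanks" performance: rough bandwidth math.
# Numbers are approximate published specs, used only for an order-of-magnitude feel.

vram_bandwidth_gbs = 760     # ~RTX 3080 GDDR6X
pcie4_x16_gbs      = 32      # ~PCIe 4.0 x16, one direction

working_set_gb = 12          # hypothetical per-frame working set
spilled_gb     = working_set_gb - 10   # what no longer fits in 10 GB of VRAM

# Time to touch the whole working set once, in milliseconds.
t_fits   = working_set_gb / vram_bandwidth_gbs * 1000
t_spills = (10 / vram_bandwidth_gbs + spilled_gb / pcie4_x16_gbs) * 1000

print(f"All in VRAM : {t_fits:5.1f} ms per pass")
print(f"2 GB spilled: {t_spills:5.1f} ms per pass")
```

Even with only a sixth of the data spilled, the pass time is dominated by the PCIe traffic, which is roughly what the frame-time cliffs in VRAM-limited benchmarks look like.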

If you're on a budget, you don't need a 3080+ or 6800XT+, nor should you be gaming at 4K.
By the time the 3080's VRAM becomes "obsolete" at 4K, I will already have a new card.
Not that I would buy a 3080 (or 6800XT, for that matter).

By the time I buy a new card, those 16GB on the 6800XT will have been of no use to me.
At its current price, the 6800XT is already obsolete for me.

And wait a year so the 6800XT *might be* on par with the 3080? No thanks.
 
If you're on a budget, you don't need a 3080+ or 6800XT+, nor should you be gaming at 4K.
By the time the 3080's VRAM becomes "obsolete" at 4K, I will already have a new card.
Not that I would buy a 3080 (or 6800XT, for that matter).

By the time I buy a new card, those 16GB on the 6800XT will have been of no use to me.
At its current price, the 6800XT is already obsolete for me.

And wait a year so the 6800XT *might be* on par with the 3080? No thanks.

There are a lot of things you shouldn't do if you're on a conservative budget.
 
Current consoles have the capability to do 4K. Capability does not mean they should.

A huge part of the 4K problem is that everyone has a different opinion of what 4K capability is and what performance is acceptable. Personally, I highly doubt we will see true 4K (encompassing all textures and IQ settings at a minimum of 4K pixels, etc., without compromise) for a long time yet.

It's a stupid marketing gimmick to sell things, because buzzwords work and technical details put people to sleep.

I agree.
In my case, at the current pace it's going to take 10+ years to actually be able to game at 8K.
I have a 4K/120Hz screen and that's what I want to be able to game at, on max details.
I absolutely refuse to lower details. It's why I game at 4K. I want the best image quality I can get. 60 fps is good enough, but I'm not going to lower the details... never, even if the game is running at 35-40 fps.
And still, many modern games with the absolute best video cards are not able to reach 120fps at 4K.
So, yeah. High-end GPUs are still not able to fully utilize modern monitors with modern games.
 
I agree.
In my case, at the current pace it's going to take 10+ years to actually be able to game at 8K.
I have a 4K/120Hz screen and that's what I want to be able to game at, on max details.
I absolutely refuse to lower details. It's why I game at 4K. I want the best image quality I can get. 60 fps is good enough, but I'm not going to lower the details... never, even if the game is running at 35-40 fps.
And still, many modern games with the absolute best video cards are not able to reach 120fps at 4K.
So, yeah. High-end GPUs are still not able to fully utilize modern monitors with modern games.
I think we'll get there much sooner, but with a bag of tricks like adaptive shading, lower-precision rendering, DLSS AI-scaled textures and other AI techniques.

The days of native rendering are coming to an end. You just can't push that many pixels by brute force.
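To put the pixel counts in perspective (a quick sketch; the scale factors below only loosely mirror typical upscaler "quality"/"performance" presets, not any vendor's exact numbers), rendering internally at a fraction of the output resolution and reconstructing the rest cuts the shaded pixel count dramatically:

```python
# Shaded pixels per frame: native vs. upscaled internal resolutions.
# Scale factors loosely mirror common "quality"/"performance" upscaler presets.

def pixels(w, h):
    return w * h

targets = {"4K (3840x2160)": (3840, 2160), "8K (7680x4320)": (7680, 4320)}
scales  = {"native": 1.0, "~quality (0.67x)": 0.67, "~performance (0.5x)": 0.5}

for name, (w, h) in targets.items():
    native = pixels(w, h)
    for label, s in scales.items():
        internal = pixels(int(w * s), int(h * s))
        print(f"{name} {label:>20}: {internal / 1e6:6.2f} Mpixels "
              f"({internal / native:.0%} of native)")
```

Native 8K is roughly 33 megapixels per frame, four times 4K, which is why reconstruction tricks rather than brute force look like the only realistic path there.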
 
I think we'll get there much sooner, but with a bag of tricks like adaptive shading, lower-precision rendering, DLSS AI-scaled textures and other AI techniques.

The days of native rendering are coming to an end. You just can't push that many pixels by brute force.

Native rendering will always be better than trickery. Why would you be excited to game at upscaled 8K? It's just like throwing money at tech that can't really be used.

The 'push' to 8K is a bad joke; most of us are off that train and going with high refresh rate over ultimate pixel density.
 
Native rendering will always be better than trickery. Why would you be excited to game at upscaled 8K? It's just like throwing money at tech that can't really be used.

The 'push' to 8K is a bad joke; most of us are off that train and going with high refresh rate over ultimate pixel density.
I can see 8K for VR in the future; as for a monitor and desktop usage, I don't see it as that much of an improvement over 4K. I can see Ultra Wide and Super Ultra Wide, as in 5K, being useful. Just my opinion.

I've gamed in 4K and I'm really not that impressed by the improvement; HDR at 1440p on a good HDR game blows it away (Doom Eternal and Shadow of the Tomb Raider are my top picks, others are good too). Still, when 120Hz+ 4K/5K HDR monitors are a little bit better than today's, sign me up. As for upscaling technologies, I am not as impressed with DLSS 2.x as I thought I would be, and I'm very disappointed by the poor (or paid-off) reviews of this technology. Also, people can see things very differently, and what is an issue for one is nothing for another. DLSS is good tech with growing pains; it can indeed beat native resolution on some things while having issues on others, which probably comes down to how the developer implements it.
 
Your first wave of buyers that need to be removed from the market are those willing to pay the 50-100% mark-up that scalpers demand.
I will never understand why anyone would pay that much over MSRP other than absolute idiocy. GPUs are not collector's items whose value will multiply with time.
The newer consoles have the capability to do 4K, and all I'm suggesting is that most games will stay within those confines. Will the occasional PC version go further? Of course
Occasional? 90% of PC games will go well over what consoles can do. Consoles are typically at what PCs consider medium settings.
at the current pace it's going to take 10+ years to actually be able to game at 8K.
I find 8k completely irrelevant for the mass market. No one with a regular display (monitor 3ft away or tv 6ft away) should care or will benefit perceptually from it.
I've gamed in 4K and I'm really not that impressed by the improvement; HDR at 1440p on a good HDR game blows it away
That’s because resolution is only one part of human visual acuity, with factors such as contrast being far, far more important. Our eyes are primarily designed to notice changes in brightness, not extreme detail. That’s why the push for crazy resolutions is just marketing: it’s way easier to cram more pixels into a panel than to make an actual better panel. A 1080p OLED 27” monitor would look far better than any regular QHD LED equivalent - the improvement in contrast alone would make it perceptually much more pleasing than extra pixels. If we had a QHD OLED 27”, nobody would be talking about wanting 4K LED, it’d be irrelevant.
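The viewing-distance point above can be put in rough numbers (a sketch using the standard pixels-per-degree geometry; the ~60 PPD figure is a common rule of thumb for 20/20 acuity, not a hard limit):

```python
import math

# Pixels per degree of visual angle for common setups.
# ~60 PPD is the usual rule-of-thumb limit of normal (20/20) acuity.

def pixels_per_degree(h_pixels, diag_in, distance_in, aspect=16 / 9):
    width_in = diag_in * aspect / math.hypot(aspect, 1)
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return h_pixels / fov_deg

setups = [
    ("27in 1440p monitor @ 3 ft", 2560, 27, 36),
    ("27in 4K monitor    @ 3 ft", 3840, 27, 36),
    ("65in 4K TV         @ 6 ft", 3840, 65, 72),
    ("65in 8K TV         @ 6 ft", 7680, 65, 72),
]
for name, hpx, diag, dist in setups:
    print(f"{name}: {pixels_per_degree(hpx, diag, dist):5.1f} PPD")
```

By this rough measure, 4K at those distances already sits near or above the acuity threshold, which is why the extra pixels of 8K buy so little perceptually while better contrast is immediately visible.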
 
I will never understand why anyone would pay that much over MSRP other than absolute idiocy. GPUs are not collector's items whose value will multiply with time.
There are people whose self-worth is defined by having a thing that other people want, before other people have it.

Stupid, but true.
 
Native rendering will always be better than trickery. Why would you be excited to game at upscaled 8K? It's just like throwing money at tech that can't really be used.

The 'push' to 8K is a bad joke; most of us are off that train and going with high refresh rate over ultimate pixel density.
Normally I would agree, but DLSS can look better than native rendering, and adaptive shading doesn't seem to alter IQ at all, especially at higher resolutions. Besides, DLSS anti-aliasing is so much better than TAA.

I do like native res better, but I'm open to other options as long as IQ is not degraded.
 
Normally I would agree, but DLSS can look better than native rendering, and adaptive shading doesn't seem to alter IQ at all, especially at higher resolutions. Besides, DLSS anti-aliasing is so much better than TAA.

I do like native res better, but I'm open to other options as long as IQ is not degraded.
DLSS does not look better than native in any of the titles I've used it in; the closest is Control, but native still beats DLSS there. Maybe in Youngblood (I haven't played it).

I'll take TAA over DLSS any day. I cannot stand the graphical glitching DLSS introduces, typically around narrow objects, distant objects in motion, light sources, small details (rain), and especially combinations of narrow objects and light sources. It's distracting as hell.
 
DLSS does not look better than native in any of the titles I've used it in; the closest is Control, but native still beats DLSS there. Maybe in Youngblood (I haven't played it).

I'll take TAA over DLSS any day. I cannot stand the graphical glitching DLSS introduces, typically around narrow objects, distant objects in motion, light sources, small details (rain), and especially combinations of narrow objects and light sources. It's distracting as hell.
To each their own, but for me TAA is far worse, with apparent blurring and ghosting in motion. It depends on the game, though.
 
I will never understand why anyone would pay that much over MSRP other than absolute idiocy. GPUs are not collector's items whose value will multiply with time.

Occasional? 90% of PC games will go well over what consoles can do. Consoles are typically at what PCs consider medium settings.

I find 8k completely irrelevant for the mass market. No one with a regular display (monitor 3ft away or tv 6ft away) should care or will benefit perceptually from it.

That’s because resolution is only one part of human visual acuity, with factors such as contrast being far, far more important. Our eyes are primarily designed to notice changes in brightness, not extreme detail. That’s why the push for crazy resolutions is just marketing: it’s way easier to cram more pixels into a panel than to make an actual better panel. A 1080p OLED 27” monitor would look far better than any regular QHD LED equivalent - the improvement in contrast alone would make it perceptually much more pleasing than extra pixels. If we had a QHD OLED 27”, nobody would be talking about wanting 4K LED, it’d be irrelevant.
There is a lot of anecdotal evidence that most of the Fed's money printing is going into 'cards' (i.e. electronic purchases) and TSLA stock (don't even get me started), so at this point the least Nvidia can do is officially raise the prices on the 3060 Ti to $599, the 3070 to $799 and the 3080 to $1099. The market will easily bear that and the OOS messages will mostly stop.
 
There is a lot of anecdotal evidence that most of the Fed's money printing is going into 'cards' (i.e. electronic purchases) and TSLA stock (don't even get me started), so at this point the least Nvidia can do is officially raise the prices on the 3060 Ti to $599, the 3070 to $799 and the 3080 to $1099. The market will easily bear that and the OOS messages will mostly stop.
Nvidia pricing themselves out of their own virtually uncontested market - I like that idea. Can you replace Leather Jacket Guy?
 
Nvidia pricing themselves out of their own virtually uncontested market - I like that idea. Can you replace Leather Jacket Guy?
The hungry masses clearly want cards :) and will pay for them. The artificially low pricing only helps scalpers.
 
DLSS does not look better than native in any of the titles I've used it in; the closest is Control, but native still beats DLSS there. Maybe in Youngblood (I haven't played it).

I'll take TAA over DLSS any day. I cannot stand the graphical glitching DLSS introduces, typically around narrow objects, distant objects in motion, light sources, small details (rain), and especially combinations of narrow objects and light sources. It's distracting as hell.
That is exactly what I am finding as well, maybe not to the same degree. COD is faring well, but I have not played enough of it so far to draw a good conclusion. Control's DLSS is somewhat of a letdown; RT is not, and is a nice enhancement. Wolfenstein Youngblood has those motion artifacts you describe, and to me native is better. Shadow of the Tomb Raider is too blurry; it destroys textures. Comparing SMAA, FXAA, etc. to DLSS, DLSS wins out in general, with a big performance boost to go along with it.

As for narrow objects, one thing DLSS does very well is hair, for whatever reason, while other thin objects fall apart.
 
To each their own, but for me TAA is far worse, with apparent blurring and ghosting in motion. It depends on the game, though.

Seizure-inducing lights waving and flickering from behind what would turn out to be a fence would beg to differ.
 
Seizure-inducing lights waving and flickering from behind what would turn out to be a fence would beg to differ.
You don't necessarily need TAA to address that problem, though. Remember TRMSAA, or transparency multisample antialiasing, which is what games like Half-Life 2 were doing back in 2004. That and the other various forms of transparency AA used back in the day address the issues with transparent billboards, but for some reason those methods have fallen out of favor. It is probably far easier these days to use some form of post-process AA with how reliant games are on shaders now.
 
You don't necessarily need TAA to address that problem, though. Remember TRMSAA, or transparency multisample antialiasing, which is what games like Half-Life 2 were doing back in 2004. That and the other various forms of transparency AA used back in the day address the issues with transparent billboards, but for some reason those methods have fallen out of favor. It is probably far easier these days to use some form of post-process AA with how reliant games are on shaders now.
I was referring to what DLSS does. I did turn off TAA via a file edit in Cyberpunk, and the light noise was very amusing.
 
Estimate by HardwareTimes for Q4-2020

Overall 7nm chips — 9-10 million
PS5 + XSX/XSS — 7 million
Ryzen 5000 CPUs — 1 million
Big Navi chips — 100k-300k

https://www.hardwaretimes.com/1-mil...f-amds-7nm-capacity-at-tsmc-3-4-for-big-navi/
Explains why their stock, even after a glowing year and fourth quarter, is 11% down from their last financial call. They are looking good, but what a bad business decision in my opinion. They could have owned the GPU and CPU market right now, but consoles came first; they could have had a much higher profit margin instead of the dirt-level margin on consoles, with AMD giving up the much more lucrative PC CPU and even GPU market share. In order to get their profit margin over 45%, which was their previous goal, CPU and GPU prices had to be raised. I look at it as PC part buyers subsidizing consoles. While Microsoft and Sony, I am sure, are enjoying their most successful launch ever, all the other companies that AMD supplies are not.
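To make the quoted estimate concrete (simple arithmetic on the figures above, which are themselves only HardwareTimes' estimates):

```python
# Share of AMD's estimated Q4-2020 7nm output, per the HardwareTimes figures quoted above.
total = (9e6, 10e6)          # overall 7nm chips, low/high estimate
segments = {
    "Consoles (PS5 + XSX/XSS)": (7e6, 7e6),
    "Ryzen 5000 CPUs":          (1e6, 1e6),
    "Big Navi GPUs":            (1e5, 3e5),
}
for name, (lo, hi) in segments.items():
    # Conservative bounds: low count over high total, high count over low total.
    print(f"{name}: {lo / total[1]:.0%}-{hi / total[0]:.1%} of 7nm output")
```

On those numbers, consoles take roughly three quarters of the output while Big Navi gets only a few percent, which lines up with how scarce the cards have been at retail.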
 
Explains why their stock, even after a glowing year and fourth quarter, is 11% down from their last financial call. They are looking good, but what a bad business decision in my opinion. They could have owned the GPU and CPU market right now, but consoles came first; they could have had a much higher profit margin instead of the dirt-level margin on consoles, with AMD giving up the much more lucrative PC CPU and even GPU market share. In order to get their profit margin over 45%, which was their previous goal, CPU and GPU prices had to be raised. I look at it as PC part buyers subsidizing consoles. While Microsoft and Sony, I am sure, are enjoying their most successful launch ever, all the other companies that AMD supplies are not.
I suspect their deal to supply Sony and Microsoft probably necessitated such numbers. But agree, it does shed some light on the price hikes in the CPU department.
 
I suspect their deal to supply Sony and Microsoft probably necessitated such numbers. But agree, it does shed some light on the price hikes in the CPU department.
It also explains AMD's delay (or outright absence) of the lower-end GPU lineup that most people buy into, pushing many, I suppose, toward buying consoles and making that market even bigger than it should be. How many will be totally turned off and converted over to consoles is another thought; while PC gaming machine sales were on a steady climb, it is probably the opposite now. This has a potential longer-term effect: if this continues, developers and game makers will be more inclined to just develop for consoles.
 
I mean, I bought AMD back in March at 37. I'd say the stock is doing pretty well. Paying scalpers for Ampere is also partly laziness, just trying to check boxes off a to-do list. For me, I've been tempted because ever since buying a 65-inch CX OLED I've moved my office into the living room, working off my HTPC, and I really want a new HTPC with Ampere for the HDMI 2.1 output. Fortunately I was able to snag a PS5 yesterday to tide me over.
 