Big Navi is coming

The most important part of open source is not security, since yes, that still depends on competent people reviewing every change all the time, just like any other software.

The most important part is that once the author has grown tired of maintaining it, goes off in a direction they want but maybe you don't, or dies / goes bankrupt, etc., the software doesn't have to die with them, change with them, or abandon your needs. You (or like-minded individuals) can fork it and adapt it to your needs. Or keep updating it for new hardware. Bring it to new architectures that the original authors may not care about.

That's the most important part: the freedom to take the software to places you care about. AMD (and Intel) give you that opportunity. Nvidia does not. So as far as I'm concerned, Nvidia can release the best hardware ever to grace humanity... they'll still get a huge middle finger from my wallet.
 
The most important part is that once the author has grown tired of maintaining it, goes off in a direction they want but maybe you don't, or dies / goes bankrupt, etc., the software doesn't have to die with them, change with them, or abandon your needs. You (or like-minded individuals) can fork it and adapt it to your needs. Or keep updating it for new hardware. Bring it to new architectures that the original authors may not care about.

That's the most important part: the freedom to take the software to places you care about. AMD (and Intel) give you that opportunity. Nvidia does not. So as far as I'm concerned, Nvidia can release the best hardware ever to grace humanity... they'll still get a huge middle finger from my wallet.

And thus, a FOSS'er.

How well the hardware and software stack execute the task at hand is secondary to being able to keep using the hardware long after it's been obsoleted.

Yes, being able to keep using ancient hardware with new software is nice, but it's not a primary focus of those trying to get work done today.

[And I want to say that I'm not unsympathetic, but rather that this perspective, while absolutely worth presenting, does not rate as important for most buyers of these products.]
 
A significant amount of AMD fanboy-ism stems not from some kind of blind love of AMD, the Radeon brand name, or their color choice. It stems from the company going out of its way to be accessible and functional to a fairly large community of non-Windows users. It's reinforced by the lack of support from their main competitor, Nvidia. And while closed-source drivers exist for the more common OSes and may be good enough for someone coming in new, anyone who's been in that game for a decade or more knows the pain of the limitations of closed-source drivers. I don't even consider them an option. It's bad enough we still have to deal with closed firmware blobs.

But true enough, none of this rates as important to "buyers of these products". Then again, almost nothing a computer enthusiast cares about rates as important to the masses who buy these products. Our discussions may as well just revolve around which one has the most RGB lights.
 
It stems from the company going out of its way to be accessible and functional to a fairly large community of non-Windows users.

Not going to disagree for FOSS'ers, but remember that it's only been in the last few years that AMD has had drivers that were even on the same continent as 'functional'.

Nvidia has supported operating systems other than Windows far better. You may turn your nose up at closed-source drivers, but by the gods they worked years ago and they work today.

And despite all the Linux evangelism (and I include myself; I just installed Fedora on a newly resurrected build today!), this is still a nearly unmeasurably small fraction of the market.
 
The PC enthusiast market is extremely small.

How often do you hear about everyone here upgrading to current stuff... yet the PC market is contracting and slowing every year. The DIY build-from-components market, while fairly visible, is basically non-existent.

If it doesn't come from a big OEM with the part already in it, it's niche to the point of not really mattering.

We all live in the niche.
 
https://www.statista.com/statistics/263393/global-pc-shipments-since-1st-quarter-2009-by-vendor/

That curve is not going up.

Now, if you want to subset into "gaming"... I'm not sure where you'd get accurate numbers on that, since any computer can be used to game on to varying degrees. Maybe gaming's fraction of this shrinking market is growing year over year. Maybe the same number of people are just getting more visible. The bottom line is that the total possible number is not growing, since new PC gaming sales can't grow beyond new PC sales.

Since we know OEM sales make up the bulk of the money for manufacturers like AMD / Nvidia / Intel, I can only assume any growth in the DIY sector is small compared to the OEM side... and so is either also shrinking or too small to counteract the decline.
 
https://www.statista.com/statistics/263393/global-pc-shipments-since-1st-quarter-2009-by-vendor/

That curve is not going up.

Now, if you want to subset into "gaming"... I'm not sure where you'd get accurate numbers on that, since any computer can be used to game on to varying degrees. Maybe gaming's fraction of this shrinking market is growing year over year. Maybe the same number of people are just getting more visible. The bottom line is that the total possible number is not growing, since new PC gaming sales can't grow beyond new PC sales.

Since we know OEM sales make up the bulk of the money for manufacturers like AMD / Nvidia / Intel, I can only assume any growth in the DIY sector is small compared to the OEM side... and so is either also shrinking or too small to counteract the decline.
[attached chart: discrete GPU shipments per quarter]


Around 50 million discrete GPUs a year? Seems fairly healthy. It went down slightly over time, but I also imagine "good enough" lasts a lot longer now. Relatively flat 2014-2018 if you did a moving average.
 
Regarding the graph:

Also not growing. Stagnant at best for the last 5-6 years.

But sales went up.

Also Nvidia's best-selling card was the 1080 for Pascal. I don't know how 50 million discrete GPUs a year and a $600+ GPU being the best-selling card equates to a "niche" market. It's basically mainstream.
 
Sure you can. Because you can't meaningfully talk about them together.

Gaming software can grow and fall independently of hardware sales. Since this is a hardware thread, it makes more sense for us to look at things from that perspective.

If the argument is that gaming software is growing year over year, it seems pretty obvious that isn't driving an increase in hardware sales.

One conclusion from that is that people are gaming on older and older hardware before buying new... the other is that an ever-decreasing customer base is buying more games per capita.

Either way, manufacturers don't make money off already-sold hardware. If the market isn't growing (and it's not, judging by both PC sales data and GPU sales data), then all the enthusiasm, excitement, and sold-out launches of the latest and greatest that you see on sites like this (well, used to see on sites like this) account for an insignificant share of the overall market. We're a niche. A halo niche that manages to get more attention than we otherwise deserve, likely because the enthusiast niche provides the only/primary brand recognition these companies have. Otherwise they'd be a faceless name associated with your PC appliance, easily swappable with another faceless supplier.
 
But sales went up.

Also Nvidia's best-selling card was the 1080 for Pascal. I don't know how 50 million discrete GPUs a year and a $600+ GPU being the best-selling card equates to a "niche" market. It's basically mainstream.

Sales went up because of the mining bubble.
 
But sales went up.

Also Nvidia's best-selling card was the 1080 for Pascal. I don't know how 50 million discrete GPUs a year and a $600+ GPU being the best-selling card equates to a "niche" market. It's basically mainstream.

Three quarters out of the last eleven showed this "sales went up" idea. That is not a trend in and of itself. Definitely not when you see that it drops like a rock afterward, back to the trend it was following previously.

Most "discrete" sales are in OEM machines. I doubt anyone would attribute sales of big-box PCs to enthusiasts in general. You might as well count the Xbox and PS4 as enthusiast PC sales too, then. There's some overlap... not every PC enthusiast builds their own computer. But I'd say the vast majority of those buyers aren't enthusiasts. They're just buying a computer.
 
Three quarters out of the last eleven showed this "sales went up" idea. That is not a trend in and of itself. Definitely not when you see that it drops like a rock afterward, back to the trend it was following previously.

Most "discrete" sales are in OEM machines. I doubt anyone would attribute sales of big-box PCs to enthusiasts in general. You might as well count the Xbox and PS4 as enthusiast PC sales too, then. There's some overlap... not every PC enthusiast builds their own computer. But I'd say the vast majority of those buyers aren't enthusiasts. They're just buying a computer.

Going off the lowest point on the graph, there are 40 million discrete GPUs sold a year. Approximately 80% are Nvidia. Based on Pascal, the majority of the GPUs sold are 1080s in the $600 enthusiast range. Your argument is weak.
 
Going off the lowest point on the graph, there are 40 million discrete GPUs sold a year. Approximately 80% are Nvidia. Based on Pascal, the majority of the GPUs sold are 1080s in the $600 enthusiast range. Your argument is weak.

Based on what evidence?
 
Nvidia quarterly financial reports. I should have worded it as "the best-selling card."

Majority means something else, good catch. :)

One quarter before the 1060 arrived? Because looking at Steam, they sold a LOT more of those. More 1050, 1050 Ti, and 1070s on Steam as well.
 
One quarter before the 1060 arrived? Because looking at Steam, they sold a LOT more of those. More 1050, 1050 Ti, and 1070s on Steam as well.

Maybe it was the best-selling card based on revenue. They can't lie in quarterly reports. Regardless, it's no niche market.
 
To put things in perspective, the PS4, arguably one of the most successful consoles of all time, has sold around 100 million units lifetime.

Meanwhile, we're talking about something in the 50 million range for GPUs sold in a year. Big difference.
 
[attached chart: discrete GPU shipments per quarter]

Around 50 million discrete GPUs a year? Seems fairly healthy. It went down slightly over time, but I also imagine "good enough" lasts a lot longer now. Relatively flat 2014-2018 if you did a moving average.
OK, maybe I no longer know how to read a graph, but to me it looks like the y-axis tops out at 25 million. How do you see 50?
 
OK, maybe I no longer know how to read a graph, but to me it looks like the y-axis tops out at 25 million. How do you see 50?

Oh crap, I thought it was broken up by quarter, looking at it on my phone. It's by half-years. Just ignore my posts... it was tiny and I saw Q.

Same point stands regardless. They sell these high-dollar cards in the millions.

Remember the 970? It was a bit cheaper, starting at $330 IIRC, but they sold 3 million in the first quarter (and then got sued over the 3.5GB issue, haha).
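As a quick back-of-the-envelope on the scale involved (a sketch using only the $330 starting price and 3 million units mentioned above; real average selling prices would differ):

```python
# Rough sketch: first-quarter GTX 970 revenue, using only the figures quoted above
# ($330 starting price, ~3 million units). Real ASPs varied by model and region.
units_first_quarter = 3_000_000
starting_price_usd = 330

revenue_estimate = units_first_quarter * starting_price_usd
print(f"~${revenue_estimate / 1e9:.1f} billion from one SKU in one quarter")
# -> ~$1.0 billion, the scale behind "they sell these high-dollar cards in the millions"
```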
 
Oh crap, I thought it was broken up by quarter, looking at it on my phone. It's by half-years. Just ignore my posts... it was tiny and I saw Q.

Same point stands regardless. They sell these high-dollar cards in the millions.

Remember the 970? It was a bit cheaper, starting at $330 IIRC, but they sold 3 million in the first quarter (and then got sued over the 3.5GB issue, haha).
Wait, it is broken up by quarters. You said sales per year, not per quarter; I forgot to add up four quarters of sales. Still, dropping by half in nine years is not good.
 
Wait, it is broken up by quarters. You said sales per year, not per quarter; I forgot to add up four quarters of sales. Still, dropping by half in nine years is not good.

Yeah OK, so I wasn't wrong, I just freaked myself out. It still had quarterly dividers. Here's a yearly one for Nvidia alone.

[attached chart: yearly figures for Nvidia alone]
 
Remember the 970? It was a bit cheaper, starting at $330 IIRC, but they sold 3 million in the first quarter (and then got sued over the 3.5GB issue, haha).

Bought two, ran them in SLI (back when that worked), sold one, and just put the other in my revived ITX build. Playing BF4 (I have 1 and V, but prefer 4), Apex Legends, Ring of Elysium, XCOM 2, and Civ VI, all fine at 1440p at >100 FPS. I just have to make sure I keep the settings in check. And this system makes me wish someone made a good blower!
 
RTX 3000, XSX/XSS, PS5, TGL - certainly a lot of hardware news! Possibly the first info on Big Navi now as well:
https://www.notebookcheck.net/AMD-R...-PlayStation-5-GPU-clock-speeds.494558.0.html

If those specs hold and we really get a 2.2 GHz boost clock, then this thing should reach about 5-10% higher performance than the 3080 and be within spitting distance of the 3090. I think that's when NVIDIA will be forced to release a 3080 20GB via their AIB partners (similar to the 2060 KO), and it might feature more CUDA cores and a higher factory OC.

If the above comes to pass, AMD's 16GB 80 CU monster will cost close to the same as a 3080, maybe even more if it exceeds 3080 performance while consuming less power; it will come down to whether AMD has decent RT performance and a DLSS equivalent. The 3080 20GB will probably end up 5-10% slower than the 3090 and cost $900-$1000+, so it will still be a slightly worse value vs. AMD, but it will command the NVIDIA name and ecosystem as a reason for the premium pricing.

The big losers in all this will be the people who rushed out to buy a 3080 10GB; it's just not a great card, especially relative to what's potentially coming in the next few months. Definitely some crazy releases ahead of us to look forward to.
 
Last edited:
If those specs hold and we really get a 2.2 GHz boost clock, then this thing should reach about 5-10% higher performance than the 3080 and be within spitting distance of the 3090. I think that's when NVIDIA will be forced to release a 3080 20GB via their AIB partners (similar to the 2060 KO), and it might feature more CUDA cores and a higher factory OC.

If the above comes to pass, AMD's 16GB 80 CU monster will cost close to the same as a 3080, maybe even more if it exceeds 3080 performance while consuming less power; it will come down to whether AMD has decent RT performance and a DLSS equivalent. The 3080 20GB will probably end up 5-10% slower than the 3090 and cost $900-$1000+, so it will still be a slightly worse value vs. AMD, but it will command the NVIDIA name and ecosystem as a reason for the premium pricing.

The big losers in all this will be the people who rushed out to buy a 3080 10GB; it's just not a great card, especially relative to what's potentially coming in the next few months. Definitely some crazy releases ahead of us to look forward to.

Do you have any benches that indicate 20GB will offer better performance than 10GB?

I see a lot of noise from people but no hard data... and none of the tests I have seen have shown 10GB to be a limiting factor.
 
Do you have any benches that indicate 20GB will offer better performance than 10GB?

I see a lot of noise from people but no hard data... and none of the tests I have seen have shown 10GB to be a limiting factor.

Considering what I wrote is hypothetical and the 20GB card isn't available, my statement is more forward-looking. 10GB isn't limiting yet, but for a 4K card I can see it hitting a wall in a year or two once next-generation games come along. It's a step back from even the 2080 Ti, and reviewers like HWUB agree:
https://www.techspot.com/review/2099-geforce-rtx-3080/

The fact that the RTX 3080 simply excels at 4K gaming could be seen as the cherry on top. Doesn't necessarily future proof the GPU however, as Nvidia paired it with 10GB of VRAM which might prove insufficient in a year or two

This card should’ve had 12GB not 10GB.
 
Considering what I wrote is hypothetical and the 20GB card isn't available, my statement is more forward-looking. 10GB isn't limiting yet, but for a 4K card I can see it hitting a wall in a year or two once next-generation games come along. It's a step back from even the 2080 Ti, and reviewers like HWUB agree:
https://www.techspot.com/review/2099-geforce-rtx-3080/



This card should’ve had 12GB not 10GB.

So no data, gotcha.

Until we can map VRAM usage the way we can map system RAM usage, any talk about VRAM usage is pointless (unless benches show performance dropping off a cliff due to VRAM limitations).

We have NO clue what is actually in use and what is simply cached.
(Most games will cache all they can and eat up any VRAM available... but are nowhere near being VRAM limited.)
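As a rough illustration of how little the usual counters actually tell us, here is a minimal sketch (assuming NVIDIA's NVML Python bindings, the pynvml package, are installed): it can report how much VRAM is allocated on the device, but says nothing about how much of that allocation is an actively required working set versus assets cached because the space happened to be free.

```python
# Minimal sketch, assuming the pynvml package (NVIDIA NVML bindings) is installed.
# NVML reports *allocated* VRAM; it cannot split that into "actively required
# working set" vs "assets cached opportunistically", which is exactly the
# distinction the point above hinges on.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)     # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)      # .total / .used / .free, in bytes

print(f"total VRAM: {mem.total / 2**20:.0f} MiB")
print(f"allocated : {mem.used / 2**20:.0f} MiB")  # allocated, not necessarily needed
print(f"free      : {mem.free / 2**20:.0f} MiB")

pynvml.nvmlShutdown()
```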
 
I don't need it to see it; I guess you know better than TechSpot and many other veterans. You should start FactumTech, where you defend NVIDIA's decisions.

Their OPINION is not backed by data... I deal with data, not fuzzy warm feelings built on an ignorant foundation (e.g., used VRAM vs. cached VRAM).

Wake me up when you have data...
 
Their OPINION is not backed by data... I deal with data, not fuzzy warm feelings built on an ignorant foundation (e.g., used VRAM vs. cached VRAM).

Wake me up when you have data...

https://www.techspot.com/review/2099-geforce-rtx-3080/

Now moving to 4K we see some interesting stuff using the ‘Ultra Nightmare’ preset, which in our test sees 9GB of VRAM used.

[benchmark chart: Doom at 4K, 'Ultra Nightmare' texture pool]

Here the 3080 was 35% faster than the 2080 Ti, but quite incredibly 115% faster than the 2080. Also it was 93% faster than the 1080 Ti which does well here relative to the 2080 thanks to the extra VRAM.

This particular test is why many are expecting/claiming up to 100% performance gains over the 2080 after seeing Digital Foundry’s Nvidia promo video. The data is accurate, but without more information can be somewhat misleading.

And here is more information. By leaving all other options on ‘Ultra Nightmare’ with the exception of texture pool size, which we dropped to ‘Ultra’ reducing VRAM usage to 7GB, this is what we find...

[benchmark chart: Doom at 4K, texture pool size reduced to 'Ultra']

There is no change in performance with GPUs packing 10GB or more VRAM, but for those with 8GB like the RTX 2080, we’re seeing a 26% performance uplift and that means the 3080 is now 70% faster, not 115% faster.

9GB used by this game, already uncomfortably near the limit in 2020. The 2080's performance delta versus the 2080 Ti and the 3080 was much wider because of the VRAM wall. Keep defending NVIDIA's poor decisions.
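For what it's worth, the 115%-vs-70% swing is just the arithmetic of that 26% uplift; here's a quick sketch using only the numbers from the review excerpt above:

```python
# Back-of-the-envelope check of the TechSpot numbers quoted above.
# Normalize the RTX 2080 to 1.0 in the 9GB 'Ultra Nightmare' texture-pool test.
rtx3080 = 2.15                                # "115% faster" than the 2080 at 9GB
rtx2080_9gb_pool = 1.00
rtx2080_7gb_pool = rtx2080_9gb_pool * 1.26    # the quoted 26% uplift once VRAM fits

print(f"3080 vs 2080, 9GB pool: {rtx3080 / rtx2080_9gb_pool - 1:.0%} faster")
print(f"3080 vs 2080, 7GB pool: {rtx3080 / rtx2080_7gb_pool - 1:.0%} faster")
# -> 115% and roughly 70%, matching the review: only the 8GB card's result moved,
#    which is the "VRAM wall" the post above is pointing at.
```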
 
Used or cached?
Those numbers do not tell you what you think they do.

Keep acting like you have valid data... when you have none.

The numbers are right there in front of you. I don't care if you choose to bury your head in the sand. The 3080 10GB will be quietly swept aside once Big Navi and NVIDIA's answer to it release. I feel bad for anyone who pays more than $750 for the 3080 10GB.
 
The numbers are right there in front of you. I don't care if you choose to bury your head in the sand. The 3080 10GB will be quietly swept aside once Big Navi and NVIDIA's answer to it release. I feel bad for anyone who pays more than $750 for the 3080 10GB.

So how much of those 9GB is actually used and how much is cached assets?
Because if you cannot answer that... your posts can be safely ignored as useless static...
 
So how much of those 9GB is actually used and how much is cached assets?
Because if you cannot answer that... your posts can be safely ignored as useless static...

You obviously work at id Software and have inside information about how the engine allocates VRAM, so enlighten me. If you can't, then your post can be safely ignored as useless.
 
You obviously work at id Software and have inside information about how the engine allocates VRAM, so enlighten me. If you can't, then your post can be safely ignored as useless.

Until we can do this for VRAM usage:
[attached screenshot: detailed system RAM usage breakdown]


Then VRAM usage numbers are useless... just like your whining.

Bad data is just that...bad data.

You can then try to whine and throw fallacies around, but it doesn't change a thing... you are trying to draw a conclusion, but lack the data to support it.

It's not rocket science, so why are you acting so "ignorant"?
 