RTX 3060 12GB

Nothing about this is typical, I'd say. It seems AMD has given up on value; they're no longer trying to win market share, so they're going to maximize how much they can get per card sold. Nvidia is not a better value, although no one provides good value these days. We need Intel to enter this market and hopefully manage to disrupt it.
I think it's a lot of compounding factors, but I don't see things normalizing as long as wafer prices continue to increase no matter who gets involved in the market. We need cheaper, more widely available fabrication.
 
Didn't you see the 3070/80/90 announcement? The cards were in the oven! He's still high from the fumes 😂
He was just ghetto reballing them so they wouldn't fail during the launch presentation. ;)

This 3060 is such a shit show though, wtf Nvidia? This reminds me of AMD doing the weird China-only gimped RX 480 or whatever it was. Neither of them should do this to consumers. The 970 comes to mind too...
 
I'll admit, this is really, really bothering me. Take a random 3060 review from a reputable site, say this one from PC Perspective. Let's look at some of their 1080p performance numbers:

View attachment 334053

Let's do this math exercise with Metro Exodus, which I feel is a good representative of high-quality modern AAA game performance. According to that chart, for someone on a 1060 like myself, this is what I have to look forward to performance-wise:

1060 - 34.41 fps, 100% performance. Price: $199 = 100%
2060S - 63.68 fps, 185.1% performance. Price: $399 = 200.5%
3060 - 60.97 fps, 177.2% performance. Price: $329 = 165.3%

So, judging from my current base performance of a 1060, these are the conclusions I arrive at:

1) The 2060S would give me 185% of the performance at 200% of the price. That is a BAD deal, so I never bought it.
2) The 3060 would give me 177% of the performance at 165% of the price. That is a BETTER deal than the 2060S, and not necessarily a bad value proposition. However,
3) The value proposition has barely changed. If I wanted more performance in 2016, I could have paid more for it. If I want more performance in 2021, instead of paying the same $199 for it (that's what you expect after 5 years and 2 generations of GPU architecture: more for the same cost), I have to pay more for it, just as I would have in 2016.
4) The logical next 30 series evolution would be a 3050, which at $199 or so would give me similar performance to my current 1060. Same value I got in 2016, so why would anyone want to buy in 2021 what they could have had for the past 5 years?

Overarching conclusion: 2021 = 2016. There has been barely any low-midrange performance/price uplift (~12%) in the past 5 years from Pascal, to Turing, to Ampere. On this topic, I recommend you watch this video. It makes excellent points.
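If you want to double-check that math, here's the whole exercise as a quick Python snippet. The fps numbers are the ones from the PC Perspective chart above; prices are launch MSRPs:

```python
# Perf and price of each card relative to the 1060 baseline.
# fps: Metro Exodus 1080p numbers from the chart; price: launch MSRP.
cards = {
    "1060":  {"fps": 34.41, "price": 199},
    "2060S": {"fps": 63.68, "price": 399},
    "3060":  {"fps": 60.97, "price": 329},
}

base = cards["1060"]
for name, c in cards.items():
    perf = 100 * c["fps"] / base["fps"]      # performance vs 1060, in %
    cost = 100 * c["price"] / base["price"]  # price vs 1060, in %
    print(f"{name:>5}: {perf:6.1f}% performance at {cost:6.1f}% of the price "
          f"(perf per price dollar: {perf / cost:.2f})")
```

The last column is the whole story: the 2060S was a value regression (0.92), and the 3060 only claws back to ~1.07 after 5 years.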

Giving Nvidia (or AMD, for that matter) money to reward this behavior seems like a mistake. It's saying I'm OK with them offering me absolutely no improvement over 5 years and 2 architecture changes. And the reality is, I'm not OK with that. So I think I'm legitimately not going to buy a card this generation, and I'll keep my 1060 yet another year, hoping that next year they'll finally offer some improvement. If they don't, I'll keep NOT buying until they offer a compelling upgrade path, not more performance for more money as the years pass by. And I'll repeat this strategy until even setting games to 720p no longer gives me playable framerates. Otherwise, I can't shake the feeling that I'm getting royally ripped off.

I mean, you could have gotten a 5600 XT for around $250 last year, or a 5700 for around $279, which would have been a solid value proposition over your 1060. The 5700 XT seems DECENT. The 3060 Ti at MSRP also seems DECENT.

We're fucked with anything below $400. There is no GPU market under $400. It's insane.
 
Here's your dose of speculation for the day - Asus seems to have spilled the beans on a 3050 Ti:
https://www.theverge.com/2021/3/5/22315175/nvidia-rtx-3050-ti-gpu-asus-leak-rumor

If that pans out, considering the 3060 laptop card has half the VRAM, a 3050 Ti for desktop would likely have 8GB. You'd think it would have baseline 2060 performance; it doesn't make sense to go any lower... and if that's the case, then the 3060 12GB officially becomes terrible value, or needs a price adjustment ASAP. Say a 3050 Ti goes for $250 but has ~10% lower performance. You could pay $80 more for a 3060 12GB for a 10% increase, but the next $70 upgrade gets you ~35% more in the 3060 Ti? Yeah, the 3060 12GB is increasingly looking like bad value. At $250, and not much less performance than a 3060 12GB, I'd go with the 3050 Ti (unless I find a $399 FE 3060 Ti, which is likely not going to happen).
 
The 3070 is 25% more expensive than the 3060 Ti but only 10% faster.
$250 -> $330 is an increase of 32%, but you also get 4 GB more VRAM. Put another way: at the 3060 Ti -> 3070 rate (25% more money for a performance step), the performance alone would cost about $63 of that $80, so you're effectively paying an extra ~$17 for the +4 GB... Meh.

Value proposition is supposed to increase on cheaper cards, which is where the 3060 really goes wrong. 3050 Ti doesn't improve it.
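To put numbers on that, here's a little sketch of the marginal cost of each step up the stack. The 3050 Ti's price and performance are pure guesses from the speculation above, and the relative performance figures are rough 1080p ballparks, not measurements:

```python
# Marginal cost of each step up the (FE MSRP) Ampere ladder.
# 3050 Ti entry is speculative; perf numbers are rough 1080p ballparks
# with the 3060 12GB normalized to 100.
ladder = [
    # (name, price, relative performance)
    ("3050 Ti (rumored)", 250, 90),
    ("3060 12GB",         330, 100),
    ("3060 Ti",           400, 135),
    ("3070",              500, 148),   # ~10% over the 3060 Ti
]

for (n1, p1, perf1), (n2, p2, perf2) in zip(ladder, ladder[1:]):
    dollars = p2 - p1
    gain = 100 * (perf2 - perf1) / perf1  # % performance gained per step
    print(f"{n1} -> {n2}: +${dollars} for +{gain:.0f}% performance "
          f"(${dollars / gain:.0f} per +1%)")
```

The 3060 12GB -> 3060 Ti step comes out around $2 per +1% of performance, while the steps on either side of it cost $7-10 per +1%, which is exactly the "value should increase as you go cheaper" rule being violated.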
 
Value proposition is supposed to increase on cheaper cards, which is where the 3060 really goes wrong. 3050 Ti doesn't improve it.
Agreed. But keep in mind that post was about laptop cards. If the 3050 follows the 3060 trend, it'd double the memory to 8GB on desktop.
 
Steve's re-evaluation of the 1060 in 2021 is further nudging me to be in no rush to get a new GPU. If and when the EVGA queue materializes for me, I can get that 3060; if not, I can keep waiting.

For all of you who are still on a 1060 like me, this is certainly worth a watch.

 
Ok, I'll fess up and admit sometimes I check Yahoo news and start reading about random crap after I get through the serious news. But this time it looks like Yahoo got the scoop ahead of most of the non-tech press, and, well, you might not like it, but 3060 crypto mining performance has been unlocked. Not surprised at all.

https://www.yahoo.com/finance/news/ethereum-miners-found-way-bypass-205149434.html
https://hardforum.com/threads/debun...er-fake-news-read-op.2008497/#post-1044953297

I'm not sure it's Yahoo; it looks like https://www.benzinga.com/ buying space on the Yahoo platform, i.e. it's exactly this article:
https://www.benzinga.com/markets/cr...ers-found-a-way-to-bypass-nvidias-hash-limits

That article doesn't appear to be bylined.
 
Yahoo doesn't write their own news; they just aggregate stuff. They do mark ads, though: they're tagged with "Ad" above the headline, and the Benzinga article isn't. Also, I find it quite believable, since it's very difficult to block or nerf a specific use of a video card. Miners just fiddle with the software a bit until the card/driver doesn't recognize the workload as Ethereum mining.
 
Steve's re-evaluation of the 1060 in 2021 is further nudging me to be in no rush to get a new GPU. If and when the EVGA queue materializes for me, I can get that 3060; if not, I can keep waiting.

For all of you who are still on a 1060 like me, this is certainly worth a watch.


Yeah at this point you're better off waiting until things normalize... unless they don't for years. I don't know. It was hard for me to get my card but I got it at MSRP.
 
After looking at some results from today's restock and also comparing reviews on BB's website, it seems like the 3070 and 3090 are really the two cards worth going for.
They both have more reviews than the 3060 Ti/3080, which is especially odd for the 3090 since it costs more than double the 3080 and launched later. This suggests Nvidia is manufacturing more of them (yield quality notwithstanding) -- which isn't surprising since they cost more. The 3060 Ti and 3080 are also way more popular among scalpers since they have higher profit margins and are easier to re-sell.

If you're trying to get a 3060 Ti at MSRP, you might be better off going for the 3070 and just eating the extra $100; it seems like your chances of getting a card are way higher.

tbh I guess smart people are just queueing for all 4 FE models every restock, and grabbing whatever pops up.
 
Well, this has pushed a GIANT pause button on my upgrade plans:



There's no point in hunting for an FE 3060 Ti, or even a 3060, if my Ryzen 2600 is going to drag performance down because of how Nvidia does scheduling in software instead of in hardware like AMD does.

I have actually recognized the exact same thing HU has discovered in my own gaming: whenever I turn on DX12 for "better performance", my 1060 3GB performs like absolute garbage, so I've avoided DX12 like the plague. Funny thing is, when I had the RX 480 in this rig, the opposite was true: it always performed way better on DX12. It's a curious scenario where Nvidia's driver basically multithreads any DX11 engine in the driver even if a game developer doesn't - having removed the hardware scheduler from the GPUs themselves - while AMD still does hardware scheduling. See this excellent video suggested by HU:




So, if devs overload the primary thread - as used to be the case in DX11 - Nvidia's driver will make that issue invisible by threading the whole process, but that doesn't happen on AMD, so performance in DX11 has historically been worse there when devs overload the main thread. Funny how for years we've blamed AMD for driver bloat, and now it's coming to light that it's devs' poor coding against the DX11 multithreading standard (and Nvidia giving up on them, actually bloating their own driver, and just taking over scheduling). If anything, it confirms much of how these companies operate: AMD pushes industry standards, Nvidia puts money down to do whatever they want. I'd be terribly interested in seeing whether this is also why DX11 has refused to disappear, because Nvidia can to some extent push DX11 usage with their engineers on development teams, while AMD says just stick to standards and it'll all work fine.
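The effect HU measured can be sketched with a toy model. The millisecond numbers here are completely made up for illustration (this is not how either driver actually works internally); the point is just that the driver's extra CPU cost only shows up when the CPU is already the bottleneck:

```python
# Back-of-envelope model: software scheduling adds CPU work per frame,
# which only hurts when the CPU side is already slower than the GPU side.

def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float, driver_overhead_ms: float) -> float:
    # Frame time is limited by whichever side is slower.
    frame_ms = max(cpu_ms_per_frame + driver_overhead_ms, gpu_ms_per_frame)
    return 1000 / frame_ms

GPU_MS = 8.0        # hypothetical GPU frame cost (3060 Ti-class at 1080p)
SW_SCHED_MS = 3.0   # hypothetical extra driver CPU cost per frame

for cpu_name, cpu_ms in [("fast CPU", 5.0), ("older CPU (R5 2600-ish)", 10.0)]:
    hw = fps(cpu_ms, GPU_MS, 0.0)          # AMD-style: scheduling in hardware
    sw = fps(cpu_ms, GPU_MS, SW_SCHED_MS)  # Nvidia-style: scheduling in the driver
    print(f"{cpu_name}: hw-sched {hw:.0f} fps, sw-sched {sw:.0f} fps")
```

With these made-up numbers, the fast CPU hides the overhead entirely (GPU-bound either way), while the older CPU loses over 20% of its framerate, which is roughly the shape of what HU found.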

Bottom line - considering my current CPU, which I'm in no rush to upgrade, it seems like an Nvidia card might not make any sense at all for users of older Ryzen generations. Bye-bye 3060, let's see what the 6700 and 6600 offer.
 
New driver unlocks Eth mining if you have a display connected.
🤦
 
Well, this has pushed a GIANT pause button on my upgrade plans:



There's no point in hunting for an FE 3060 Ti, or even a 3060, if my Ryzen 2600 is going to drag performance down because of how Nvidia does scheduling in software instead of in hardware like AMD does.

I have actually recognized the exact same thing HU has discovered in my own gaming: whenever I turn on DX12 for "better performance", my 1060 3GB performs like absolute garbage, so I've avoided DX12 like the plague. Funny thing is, when I had the RX 480 in this rig, the opposite was true: it always performed way better on DX12. It's a curious scenario where Nvidia's driver basically multithreads any DX11 engine in the driver even if a game developer doesn't - having removed the hardware scheduler from the GPUs themselves - while AMD still does hardware scheduling. See this excellent video suggested by HU:




So, if devs overload the primary thread - as used to be the case in DX11 - Nvidia's driver will make that issue invisible by threading the whole process, but that doesn't happen on AMD, so performance in DX11 has historically been worse there when devs overload the main thread. Funny how for years we've blamed AMD for driver bloat, and now it's coming to light that it's devs' poor coding against the DX11 multithreading standard (and Nvidia giving up on them, actually bloating their own driver, and just taking over scheduling). If anything, it confirms much of how these companies operate: AMD pushes industry standards, Nvidia puts money down to do whatever they want. I'd be terribly interested in seeing whether this is also why DX11 has refused to disappear, because Nvidia can to some extent push DX11 usage with their engineers on development teams, while AMD says just stick to standards and it'll all work fine.

Bottom line - considering my current CPU, which I'm in no rush to upgrade, it seems like an Nvidia card might not make any sense at all for users of older Ryzen generations. Bye-bye 3060, let's see what the 6700 and 6600 offer.

You can turn off "threading optimization" in Nvidia's Driver Control Panel. You can make it universal or in a game specific profile.
 
You can turn off "threading optimization" in Nvidia's Driver Control Panel. You can make it universal or in a game specific profile.
It's not as simple as that; you misunderstand. You cannot disable the multithreaded scheduling Nvidia does, because their cards no longer have a hardware scheduler. The driver is always doing this scheduling because the cards are no longer able to. Do you really think these analyses would happen if you could solve it simply by turning off one feature? Seriously. Check nerdtech's video; it's very well explained. Even Igor's Lab is now confirming this. It's clear we're going to hear more and more about this over the next few weeks.

It still stands to reason that lower-end cards will be less affected by this, since they hit their GPU limits before the CPU becomes the bottleneck, unlike higher-end cards. Still, it's making me doubt getting a 3060 12GB, because I've seen similar behavior on my 1060 already... Although, with nothing else available for purchase, there might just be no other option.
 
Well, this has pushed a GIANT pause button on my upgrade plans:



There's no point in hunting for an FE 3060 Ti, or even a 3060, if my Ryzen 2600 is going to drag performance down because of how Nvidia does scheduling in software instead of in hardware like AMD does.

I have actually recognized the exact same thing HU has discovered in my own gaming: whenever I turn on DX12 for "better performance", my 1060 3GB performs like absolute garbage, so I've avoided DX12 like the plague. Funny thing is, when I had the RX 480 in this rig, the opposite was true: it always performed way better on DX12. It's a curious scenario where Nvidia's driver basically multithreads any DX11 engine in the driver even if a game developer doesn't - having removed the hardware scheduler from the GPUs themselves - while AMD still does hardware scheduling. See this excellent video suggested by HU:




So, if devs overload the primary thread - as used to be the case in DX11 - Nvidia's driver will make that issue invisible by threading the whole process, but that doesn't happen on AMD, so performance in DX11 has historically been worse there when devs overload the main thread. Funny how for years we've blamed AMD for driver bloat, and now it's coming to light that it's devs' poor coding against the DX11 multithreading standard (and Nvidia giving up on them, actually bloating their own driver, and just taking over scheduling). If anything, it confirms much of how these companies operate: AMD pushes industry standards, Nvidia puts money down to do whatever they want. I'd be terribly interested in seeing whether this is also why DX11 has refused to disappear, because Nvidia can to some extent push DX11 usage with their engineers on development teams, while AMD says just stick to standards and it'll all work fine.

Bottom line - considering my current CPU, which I'm in no rush to upgrade, it seems like an Nvidia card might not make any sense at all for users of older Ryzen generations. Bye-bye 3060, let's see what the 6700 and 6600 offer.


So, you're worried that at resolutions below your monitor's native res, you'll be cpu-limited to framerates which are still higher than your monitor can display?

I mean, you do you...
 
So, you're worried that at resolutions below your monitor's native res, you'll be cpu-limited to framerates which are still higher than your monitor can display?
Who said my monitor can’t display those frame rates?
 
Are you kidding me...

“A developer driver inadvertently included code used for internal development which removes the hash rate limiter on RTX 3060 in some configurations,” says an Nvidia spokesperson in a statement to The Verge. “The driver has been removed.”
 
Are you kidding me...

“A developer driver inadvertently included code used for internal development which removes the hash rate limiter on RTX 3060 in some configurations,” says an Nvidia spokesperson in a statement to The Verge. “The driver has been removed.”

Too late. It's out in the wild now.
 
Well, this has pushed a GIANT pause button on my upgrade plans:



There's no point in hunting for an FE 3060 Ti, or even a 3060, if my Ryzen 2600 is going to drag performance down because of how Nvidia does scheduling in software instead of in hardware like AMD does.

I have actually recognized the exact same thing HU has discovered in my own gaming: whenever I turn on DX12 for "better performance", my 1060 3GB performs like absolute garbage, so I've avoided DX12 like the plague. Funny thing is, when I had the RX 480 in this rig, the opposite was true: it always performed way better on DX12. It's a curious scenario where Nvidia's driver basically multithreads any DX11 engine in the driver even if a game developer doesn't - having removed the hardware scheduler from the GPUs themselves - while AMD still does hardware scheduling. See this excellent video suggested by HU:




So, if devs overload the primary thread - as used to be the case in DX11 - Nvidia's driver will make that issue invisible by threading the whole process, but that doesn't happen on AMD, so performance in DX11 has historically been worse there when devs overload the main thread. Funny how for years we've blamed AMD for driver bloat, and now it's coming to light that it's devs' poor coding against the DX11 multithreading standard (and Nvidia giving up on them, actually bloating their own driver, and just taking over scheduling). If anything, it confirms much of how these companies operate: AMD pushes industry standards, Nvidia puts money down to do whatever they want. I'd be terribly interested in seeing whether this is also why DX11 has refused to disappear, because Nvidia can to some extent push DX11 usage with their engineers on development teams, while AMD says just stick to standards and it'll all work fine.

Bottom line - considering my current CPU, which I'm in no rush to upgrade, it seems like an Nvidia card might not make any sense at all for users of older Ryzen generations. Bye-bye 3060, let's see what the 6700 and 6600 offer.

The RTX 3090 in my TR 3960X, with the inherent latency between its multiple CCXs/CCDs, was not a good match in some games. In the Ryzen 5800X, it woke the card up! For example, Flight Simulator 2020 is over 20% faster with the 5800X; it's faster at 4K than the TR 3960X was at 1440p with the RTX 3090. While it's a DX11 game, this now explains why I had poorer than expected performance: Nvidia forcing multithreaded draw calls (the software scheduler) in DX11 ran into the TR 3960X's latency between CCXs/CCDs. As a note, the 6900 XT works great with the TR 3960X and trounces the RTX 3090 in the same machine.
 
Too late. It's out in the wild now.
Yeah, this is insane. I won't speculate on whether this really was accidental or not, but they should just remove the limiter from every 3060 card. Now any miner can get a 3060, use this driver (of which by now there will be millions of copies), and mine normally. Meanwhile, if you're a gamer, you get punished with not being able to mine properly if you want to (not that I ever would, but the point is equality). This is now an ultra f'ed up situation: it screws over gamers and does nothing to discourage miners from scooping up all the available product (not that it did that much before at just a 50% hash rate; perhaps it would have at 10%).
 
I think it does more: it gives someone the blueprints to try to reverse engineer the driver and figure out how to bypass the mining lock, since now we know Nvidia was full of it when they said it was a BIOS and driver lock :) The driver doesn't really do much for miners by itself, since it's limited to the primary PCIe slot, so you can't use that driver for a mining rig with multiple cards in it. However, it does open up the stronger possibility of unlocking the 3060's mining potential.
 
Considering how these x60 Nvidia cards are built:



vs AMD's newest 6700 cards:




Well, even with all of Nvidia's issues, I think I'd go with a 3060 12GB at this point. I'm no longer considering the Ti - even though it's better value - because with my Ryzen 2600 I won't get the advertised performance. The 3060 shouldn't be too bottlenecked, though. So I'm back to attempting Newegg Shuffles and just waiting on my EVGA queue. The equation would only change for me if there's some gain to be had over my 1060 3GB from a potential 3050 Ti at $250 or so. We'll see, I guess.
 
Steve's re-evaluation of the 1060 in 2021 is further nudging me to be in no rush to get a new GPU. If and when the EVGA queue materializes for me, I can get that 3060; if not, I can keep waiting.

For all of you who are still on a 1060 like me, this is certainly worth a watch.



I was pissed last year to find out Doom Eternal wouldn't let me max out my 6GB 1060 at 1080p, but not because it's short on performance... it's run out of VRAM :rolleyes:

I was able to get 80fps with everything else maxed(ultra nightmare effects + ultra textures), so I could certainly use more VRAM. Too bad I will have to wait another year for the 2GB chips to get sanely priced!

Yeah, you could go 3060 Ti, but that's just sidestepping the problem with a small increase (the 8GB limit will show up way sooner than 12GB would)
 
Well, even with all of Nvidia's issues, I think I'd go with a 3060 12GB at this point. I'm no longer considering the Ti - even though it's better value - because with my Ryzen 2600 I won't get the advertised performance. The 3060 shouldn't be too bottlenecked, though. So I'm back to attempting Newegg Shuffles and just waiting on my EVGA queue. The equation would only change for me if there's some gain to be had over my 1060 3GB from a potential 3050 Ti at $250 or so. We'll see, I guess.
To my surprise the XC Black queue is actually moving, it's up to 9:07 as of 9/19.
The XC Gaming (backplate) is at 9:29, the $390 model.

Bottom line - considering my current CPU, which I'm in 0 rush to upgrade, it seems like an Nvidia card might not make any sense at all for users of older Ryzen generations. Bye-bye 3060, let's see what the 6700 and 6600 offer.
This is Nvidia we're talking about, I'd still buy their cards just under the assumption they'll fix it ASAP.
 
This is Nvidia we're talking about, I'd still buy their cards just under the assumption they'll fix it ASAP.

If it's an issue with the hardware scheduler in AMD vs the software one in Nvidia, I'm not sure it's "fixable." If it's just an issue of CPU overhead while the driver is running, then maybe they can.
 
To my surprise the XC Black queue is actually moving, it's up to 9:07 as of 9/19.
Well, that’s great news. Are you getting that info from the EVGA megathread Excel sheet in their forums? Because it only said 9:02 a couple of days ago. I got in at 9:29, so I’ve still got a ways to wait. ~10 min per month ain’t bad; I could get that notify by the end of May.

If it's an issue with the hardware scheduler in AMD vs the software one in Nvidia, I'm not sure it's "fixable." If it's just an issue of CPU overhead while the driver is running, then maybe they can.
Yeah, it seems like it’s not a badly behaved driver, just software doing what it needs to do in order to take over the functionality that's missing in hardware. That is not something you can fix; you can only avoid the issue by adding a hardware scheduler back into the card design, which Nvidia can only do with a new GPU architecture. Existing cards can’t be altered (and even if they could, neither the driver nor the firmware would be aware of their hardware scheduling capabilities). It’s an architectural design decision/consequence.
 
Indeed, Nvidia can take their 8 GB and shove it up their ass. Should've been 16 GB on those cards; frankly, I would have preferred 192-bit 12 GB on the 3060 Ti and 3070 even if it meant a small loss in performance.
 
Indeed, Nvidia can take their 8 GB and shove it up their ass. Should've been 16 GB on those cards; frankly, I would have preferred 192-bit 12 GB on the 3060 Ti and 3070 even if it meant a small loss in performance.
Um, no. Neither card has enough power for 12 GB anyway, especially the 3060 Ti. And that slimmed-down bus would've shown up anyway, like it does on the RX 6800 series. They made a good call given the circumstances.
 
Um, no. Neither card has enough power for 12 GB anyway, especially the 3060 Ti. And that slimmed-down bus would've shown up anyway, like it does on the RX 6800 series. They made a good call given the circumstances.
The 1080 Ti has 11 GB, and the 3060 Ti is about 25% more powerful? The 2080 Ti also has 11 GB, and the 3070 is just as powerful?
The 3070 has the same VRAM as the 2070 Super, 2070, and 1070... 4.5 years' worth of x70s... with no increase. And the 3070 is literally twice as fast as the 1070.

With RTX enabled, VRAM usage goes through the roof. These cards are absolutely going to suffer within the next 1-2 years. Nvidia intentionally gimped these cards to prevent people from skipping the 40 series (like what happened with Pascal); they'll launch new cards with 16+ GB across the board and people will be forced to upgrade.
 
The 1080 Ti has 11 GB, and the 3060 Ti is about 25% more powerful? The 2080 Ti also has 11 GB, and the 3070 is just as powerful?
The 3070 has the same VRAM as the 2070 Super, 2070, and 1070... 4.5 years' worth of x70s... with no increase. And the 3070 is literally twice as fast as the 1070.

With RTX enabled, VRAM usage goes through the roof. These cards are absolutely going to suffer within the next 1-2 years. Nvidia intentionally gimped these cards to prevent people from skipping the 40 series (like what happened with Pascal); they'll launch new cards with 16+ GB across the board and people will be forced to upgrade.
I understand what you're saying about the 80 Tis vs. the 3070. But consider everything that happened in 2020, and the GDDR6 and GDDR6X shortage. It seems like fewer RX 6000 series cards are out there than Ampere; they're mostly 16 GB, which is nice, but like I said, there are even fewer of them, and they're not as popular with miners as Ampere. Considering ray tracing is only viable at 1440p on the 10 GB GDDR6X 3080, I don't think that's a worry. Granted, 16 GB on at least the 3080 would have been nice (and preferred), but at least there's a Ti or Super coming. Considering how hard it is to buy a 3000 series card, I'd say most people will be looking to upgrade from a pre-Ampere card. So it'll work out somehow.
 