RTX 3060 12GB

I understand what you're saying about the 2080 Ti vs. the 3070, but consider everything that happened in 2020 and the GDDR6 and 6X shortage. There seem to be fewer RX 6000 series cards out there than Ampere. Most of those are 16 GB, which is nice, but like I said there are even fewer of them, and they're not as popular with miners as Ampere. Considering ray tracing is only viable at 1440p on the 10GB GDDR6X 3080, I don't think that's a worry. Granted, 16 GB for the 3080 at least would have been nice (and preferred), but at least there's a Ti or Super coming. Considering how hard it is to buy a 3000 series card, I'd say most buyers will be looking to upgrade from a pre-Ampere card. So it'll work out somehow.

Right, if you want to understand how frustrating it's been for users waiting on double-density GDDR6 RAM chips, call me back when you've gone through the months of threads waiting on the 20GB 3080 Ti (now a 12GB 384-bit card).

The fact that here in April Nvidia still doesn't have GDDR6X 16Gb chips from Micron tells you all you need to know about their availability in the rest of the video card world. Nvidia is doing its best to reduce demand for these chips by only putting six of them on a card; and because they're clocked at 15 Gbps, you can source those chips a lot more easily than AMD's 16Gb × 16Gbps liquid gold.

AMD's RX 6700 XT also uses the exact same liquid-gold 16Gb × 16Gbps chips, so I wouldn't expect to see too many of them at retail.
 
Hardware Unboxed is expanding on their findings about Nvidia driver overhead and software-based scheduling:




Oof. Seeing an RX 580 deliver 20 more frames per second than an RTX 3080 (see the 13:40 mark) on a few-years-old CPU hurts, a lot. Thankfully I can't find any RTX 3060 on the market, because it's starting to seem more and more like I should just wait for AMD's $300-range offerings instead, despite their lack of AI upscaling (for now) and worse DXR performance. I can't say for sure, but I can't shake the feeling that when my Ryzen 2600-based PC had an RX 480, I got better performance than with the 1060 3GB I have now. I thought I was crazy, because all the YouTube reviews said the opposite, but this is certainly giving credence to my suspicions.
 
This isn't the first time we've had CPU scaling issues with a new arch.

https://www.nvidia.com/en-us/geforc...-drivers-cpu-overhead-something-to-chew-upon/

The DX12 problem has been a lot easier to ignore than a Fermi-style new-arch issue, but that doesn't make it any less important today.

It took them three-ish months to fix my Bad Company 2 issues running a GTX 460 on the problematic Core 2 Duo, so I would expect the same time frame for fixes. Lucky me that I can't afford to pay scalper prices!
 
The DX12 problem has been a lot easier to ignore than a Fermi-style new-arch issue, but that doesn't make it any less important today.

It took them three-ish months to fix my Bad Company 2 issues running a GTX 460 on the problematic Core 2 Duo, so I would expect the same time frame for fixes. Lucky me that I can't afford to pay scalper prices!
The issue is coming up a lot more now that DX12 and Vulkan seem to finally be getting some traction, thanks to the new consoles. As for fixes... I'm not sure if you've been following this issue, but it is not really something fixable... it's a hardware design choice. If Nvidia implements a hardware scheduler in the next architecture, then it will no longer be a problem, but it's not something you can fix in current card designs. Either you have a hardware scheduler or you don't; if you don't, you do it in software.
 
The issue is coming up a lot more now that DX12 and Vulkan seem to finally be getting some traction, thanks to the new consoles. As for fixes... I'm not sure if you've been following this issue, but it is not really something fixable... it's a hardware design choice. If Nvidia implements a hardware scheduler in the next architecture, then it will no longer be a problem, but it's not something you can fix in current card designs. Either you have a hardware scheduler or you don't; if you don't, you do it in software.


What proof do you have that there's no hardware scheduler in NVIDIA's RTX? A software scheduler implementation optimized for older software does not necessarily mean the hardware is nonexistent.

I mean, just because the RTX cards show strange performance with Windows Hardware-Accelerated GPU Scheduling on doesn't mean anything, when the RX 5700 XT has the exact same performance inconsistencies:

https://adoredtv.com/windows-10-gpu-scheduling-on-amd-radeon-rx-5700-xt-8-games-tested/

[Chart: RX 5700 XT, GPU scheduling on vs. off at 1080p, via AdoredTV]
 
What proof do you have that there's no hardware scheduler in NVIDIA's RTX? A software scheduler implementation optimized for older software does not necessarily mean the hardware is nonexistent.
It's been pointed out in this thread. Check page 7, Hardware Unboxed's original video. Then you can watch this pretty good explainer video. When you're done with that, you can go read Anandtech's Kepler review. Kepler is when the scheduler got massively simplified (technically there is still one in hardware, just less capable than it used to be), and it's remained so since then (which is why you find instances with 1st- or 2nd-gen Ryzen CPUs where, astonishingly, an RX 580 can top a 3080). The most important bit:

[Image: the relevant excerpt from Anandtech's Kepler review]


The whole analysis/history of schedulers is a pretty good read though. It's not that Nvidia can't do it; it's that they made a design decision to move much of that work into software, leaving the hardware for basic functionality, so they could save die area and power in return. This in itself is atypical, going against the technological current, but it worked well for them for many years. AMD moved to a more complex scheduler with GCN. It's likely Nvidia will go back to a hardware scheduler in the future, or it'll just become known that with a lesser CPU you're leaving performance on the table with an Nvidia card, because your CPU can't keep up with what Nvidia is doing in software. It's not as much of an issue on AMD because they do it in hardware, thereby freeing up your CPU. I don't see how Nvidia could, then, "fix" this issue, because it's not an issue at all: it's a design decision. In order to bring back performance for everyone, not just those on the newest CPUs, they'll have to add a more complex hardware scheduler to their GPUs so the CPU doesn't have to do the work.

In the end, hardware or software scheduler, it's not a good or a bad thing, just a different approach. It made plenty of sense in the DX11 era, because DX11 was (a) bad at organizing threads across more than one CPU core, and (b) once it kinda sorta got multithreaded, devs just didn't do the work, so Nvidia took over by multithreading the tasks in their driver's scheduler. Now that we're finally starting to really move to DX12, no software can beat dedicated hardware scheduling, so when paired with a weaker CPU, the design decision that was a clear win in DX11 is starting to reveal itself as a weakness in the DX12 era. We're going forward, not backward, so I'd predict Nvidia will add a more complex hardware scheduler, if not in the 4000 series, then for sure by the 5000.
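To make the "weak CPU plus software scheduler" argument concrete, here's a toy frame-time model. All numbers are invented for illustration; this is a sketch of the reasoning, not a benchmark of any real card:

```python
# Toy model: per-frame time is limited by whichever side finishes last.
# A software scheduler adds its work to the CPU's share of each frame;
# a hardware scheduler keeps that work on the GPU. Numbers are made up.

def frame_rate(cpu_ms, gpu_ms, sched_ms, software_sched):
    """Return FPS given per-frame CPU, GPU, and scheduling costs (ms)."""
    if software_sched:
        cpu_ms += sched_ms          # driver does scheduling on the CPU
    else:
        gpu_ms += sched_ms * 0.2    # dedicated hardware handles it faster
    return 1000.0 / max(cpu_ms, gpu_ms)

# Fast CPU: the software approach costs almost nothing.
fast_sw = frame_rate(cpu_ms=5.0, gpu_ms=8.0, sched_ms=2.0, software_sched=True)
fast_hw = frame_rate(cpu_ms=5.0, gpu_ms=8.0, sched_ms=2.0, software_sched=False)

# Slow CPU: scheduling work pushes the CPU past the GPU, and the
# card can no longer show its advantage -- the HUB scenario.
slow_sw = frame_rate(cpu_ms=10.0, gpu_ms=8.0, sched_ms=2.0, software_sched=True)
slow_hw = frame_rate(cpu_ms=10.0, gpu_ms=8.0, sched_ms=2.0, software_sched=False)

print(f"fast CPU: software {fast_sw:.0f} fps vs hardware {fast_hw:.0f} fps")
print(f"slow CPU: software {slow_sw:.0f} fps vs hardware {slow_hw:.0f} fps")
```

With a fast CPU the software scheduler even comes out slightly ahead (the GPU carries less baggage), which is the DX11-era win; with a slow CPU the ranking flips, which is what the RX 580 vs. 3080 results look like.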
 
Um, no. Neither card has enough power for 12 GB anyway, especially the 3060 Ti. And that slimmed-down bus would've shown up anyway, like it does on the RX 6800 series. They made a good call given the circumstances.

Here is an edge case

You may not want to play at 4K with ray tracing on a 3070, but if you do, there are some memory-limit issues waiting in store.

https://www.extremetech.com/gaming/321267-amd-radeon-6700-xt-review

There’s some evidence to suggest that Nvidia’s decision to equip the RTX 3070 with just 8GB of VRAM really could be a limiting factor in games going forward. There’s evidence of this in both Godfall and Watch Dogs Legion, particularly WDL.

1080p and 1440p show similar patterns of performance between the three cards. Ultra detail is extraordinarily hard on both the RTX 2080 and the 6700 XT, with or without ray tracing enabled. Once we hit 4K, however, things change. Both the RTX 2080 and 3070 fall off a cliff in Godfall, where the RX 6700 XT outperforms them by over 3x.

In Watch Dogs Legion, the RTX 3070 is no less than 2.91x faster than the 6700 XT at 1440p, but loses to it at 4K. While none of the GPUs turns in a playable frame rate, the wholesale collapse of the Nvidia cards at high resolution is indicative of one thing: an insufficient VRAM buffer.

 
Spammed every card on Best Buy today and ended up with another 3060 XC. I am cursed with this card.

I actually got to checkout with a 2nd one but I got the Limit 1 error.

 
Doesn't seem to be moving too fast. I was at 9:29 and got an e-mail around March 16.
 
Some progress at least, albeit painfully slow. I'm only queued for the 3060 Ti/3070/3080 and have basically written them off.
 
The black queue is up to 9:09 AM. So, it's going to rain quite a bit by the time I get my turn on that list.
 
39 days since February 25th, and they're at 33 minutes after the hour, which means less than a minute of queue per day. Yikes.
Even slower for the Black model... they’re still on minute 8. By the time I get my notify (minute 29) there might very well be a new generation of cards available.
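Back-of-envelope on the queue pace, using the figures above (33 queue-minutes cleared in 39 days for one model; the Black at minute 8 with a notify at minute 29). The projection is just a linear extrapolation, so treat it as a rough floor:

```python
# Linear extrapolation of the EVGA notify-queue pace. Inputs are the
# figures quoted in the thread; the real queue speed obviously varies.

def days_until(current_min, target_min, mins_cleared, days_elapsed):
    """Estimate days until the queue reaches target_min at the observed pace."""
    pace = mins_cleared / days_elapsed        # queue-minutes cleared per day
    return (target_min - current_min) / pace

# Regular model: 33 minutes cleared in 39 days.
print(f"{33 / 39:.2f} queue-minutes per day")

# Black model: 8 minutes cleared in 39 days, 21 minutes still to go
# for someone queued at minute 29.
print(f"~{days_until(8, 29, 8, 39):.0f} days to go")
```

At that pace the Black-model wait works out to roughly a hundred more days, which is why "a new generation might be out first" isn't much of an exaggeration.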
 
The step-up queue is even worse than the notify queue. At this point I don't anticipate getting a step up until they release the next generation.
 
The step-up queue is even worse than the notify queue. At this point I don't anticipate getting a step up until they release the next generation.
Hmmmm...
Mine 24/7, make some money, and after a year or more when step-up becomes available -> new GPU in the box :D
 
Any good ETH for dummies tutorial to fire it up on a 3060, specifically?
1) Install the card directly in a PCIe x16 slot
2) Locate and install the dev driver without the hashrate limiter
3) Either plug in a monitor or a dummy HDMI adapter
4) Configure your OC/undervolt settings
5) Mine away at about 47 MH/s
 
1) Install the card directly in a PCIe x16 slot
2) Locate and install the dev driver without the hashrate limiter
3) Either plug in a monitor or a dummy HDMI adapter
4) Configure your OC/undervolt settings
5) Mine away at about 47 MH/s
Great starting info... I'll need to dig deeper into step 5 since this would be my first foray into mining.
 
Great starting info... I'll need to dig deeper into step 5 since this would be my first foray into mining.
Ah, then find a guide on manually setting up a mining wallet, etc., or if you want to be lazy you can use NiceHash.
 
Ah, then find a guide on manually setting up a mining wallet, etc., or if you want to be lazy you can use NiceHash.
I tried NiceHash, but it behaves a bit like spammy software... even after uninstalling, it persisted, and I had to reinstall Windows in the end, so I'd recommend caution. Not that my 1060 would get much done anyway, and I've miserably failed at getting a 3060 so far.
 
Any good ETH for dummies tutorial to fire it up on a 3060, specifically?
Good question for the Mining Cryptocurrency section of the forums. I can walk you through how I do it there. There are many different ways, but I can hopefully point you in the right direction toward the Dark Side of GPUs ;).
 
Good question for the Mining Cryptocurrency section of the forums. I can walk you through how I do it there. There are many different ways, but I can hopefully point you in the right direction toward the Dark Side of GPUs ;).
Thanks, I'll jump over there and post up what I have figured out so far...
 
1) Install the card directly in a PCIe x16 slot
2) Locate and install the dev driver without the hashrate limiter
3) Either plug in a monitor or a dummy HDMI adapter
4) Configure your OC/undervolt settings
5) Mine away at about 47 MH/s

This is enough to get you started. It takes some trial and error to find a balance. I hit about 49-50 MH/s at 120 W on the EVGA XC.
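When tuning, hashrate per watt is the number to watch, not raw MH/s. A quick sanity check using the figures above (49.5 MH/s midpoint at 120 W tuned; the 47 MH/s figure at 170 W, where 170 W is an assumed stock board power, not a measurement from this thread):

```python
# Mining efficiency comparison: undervolted vs. roughly-stock settings.
# The 120 W / 49.5 MH/s point comes from the post above; 170 W for the
# stock 47 MH/s figure is an assumption for illustration.

def mh_per_watt(mh_s, watts):
    """Hashrate efficiency in MH/s per watt of board power."""
    return mh_s / watts

tuned = mh_per_watt(49.5, 120)   # ~0.41 MH/s per W
stock = mh_per_watt(47.0, 170)   # ~0.28 MH/s per W

print(f"tuned: {tuned:.2f} MH/s per W, stock: {stock:.2f} MH/s per W")
print(f"undervolting improves efficiency by ~{(tuned / stock - 1) * 100:.0f}%")
```

Under those assumptions the undervolt is worth roughly half again the efficiency, which is why the trial-and-error on step 4 matters more than squeezing out the last MH/s.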

I tried nice hash but it behaves a bit like spammy software... even after uninstalling it persisted, I had to reinstall windows in the end, so I’d recommend caution. Not that my 1060 would get me much done anyway, and I’ve miserably failed at getting a 3060 so far.

I never installed it. I just use the zip file version, which is pretty easy to delete if you don't like it. I never had any issues with persistent installs or spammy nonsense, since I didn't install it.
 
I smell a "bait and switch" class action unless

I don't know anything about law, but I doubt a plaintiff could find much, if any, Nvidia crypto-performance marketing (I didn't check, so I could be wrong here). The video card boxes are very clearly marketed toward playing games, streaming, and so on.
 
I just looked at today's Newegg Shuffle, and it's just a "bait and overstock clearance." I can't buy any of the video cards I might want without also buying something I absolutely don't need. And Newegg doesn't even give me a choice of which unneeded motherboard or power supply or whatever.

And if I did go for (or fall for) the Shuffle combination offer, then it would be "good luck" trying to get rid of the unwanted motherboard or power supply on eBay. All the other guys who bought that combination will be dumping the unwanted component on eBay too, so final sale prices will be low.

Way to go Newegg. NOT. :mad: :blackeye:
 
For anyone who has a Gainward card (RTX 3060 12GB or another): there's a new version of their ExperTool 64-bit utility, and I'm trying it right now.

It has an OC Scan option that automatically works out how far your card can be overclocked.
I can also control the lighting with it: it can be off, light up according to GPU temperature, or cycle colors (rainbow), which isn't possible with MSI Afterburner.
I mostly like it; download it here:
https://www.gainward.com/main/vgapro.php?id=1106&tab=dl&lang=en
 
You think the 3060 is bad. Wait till Nvidia starts price gouging and misnaming their 4060 cards!! :LOL:
 