Interesting, but the lower the frames per second, the lower the absolute gain. A 20% increase takes 50FPS to 60FPS, but only takes 30FPS to 36FPS. That's what I mean by "does it really matter?" If you are getting 100FPS in a game and you increase it to 120, is that helpful? For me, yeah, I'll take it, but it's more useful getting 30FPS up to at least 50-60 so it's playable. Is it worth a 20% increase in power consumption? I mean, yeah, 20% is technically a LOT of headroom, but does it really matter to gameplay? I think it is nice if you can get 3070 performance from a 3060 Ti when overclocked, for instance, because then you just buy the 3060 Ti OC version and save some serious bucks.

Historically: the Ti 4600 had good gains, the 9700 Pro gave fairly large increases with overclocking, and so did the 5900 XT. The X1900 AIW gave very large gains in terms of performance, the HD 4870 had a fair amount IIRC, the 8800 GT had monstrous headroom, the GTX 460 had huge headroom, and the HD 7870 had massive headroom along with the HD 7970. The GTX 1070 had a fair bit of headroom. Vega 64 gave fairly large gains if you were willing to tinker A LOT, and Vega 56 had huge gains. The current 6900 XT offers a lot of headroom too. This is just from what I recall, plus a quick Google to corroborate (at least a 10% gain from overclocking, often closer to 20%).
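Just to make the "same percentage, fewer actual frames" point concrete, here's a trivial sketch (plain Python, nothing game-specific; the function name is mine):

```python
def overclock_fps(base_fps: float, gain_pct: float) -> float:
    """Frame rate after a given percentage overclocking gain."""
    return base_fps * (1 + gain_pct / 100)

# The same 20% gain is worth far fewer absolute frames
# at a lower starting frame rate:
print(overclock_fps(50, 20))   # 50FPS  -> 60FPS (+10 frames)
print(overclock_fps(30, 20))   # 30FPS  -> 36FPS (+6 frames)
print(overclock_fps(100, 20))  # 100FPS -> 120FPS (+20 frames)
```

So a fixed-percentage OC helps most exactly where the card was already fast, which is the heart of the "is it practically useful?" question.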
Also, the magenta and white is a really nice color combo. I dig it.
But, yeah, I'm not saying 20% isn't a great OC; I just question the practical usefulness.
I'm interested in how much better the 3060 Ti I have runs when OCed, and since there are no reviews, it looks like I'll need to do it myself. The thing is, to do it right, I need to run many in-game tests. How would I go about doing that? Do most games out now have built-in performance benchmarks? Or should I simply run a synthetic?
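Whichever benchmark you end up using, the "many tests" part mostly comes down to repeating the same scene several times and averaging, so run-to-run noise doesn't swamp a ~10-20% OC gain. A rough sketch of the bookkeeping (all numbers and names here are hypothetical; the FPS figures would come from a game's built-in benchmark or an overlay tool you jot results down from):

```python
import statistics

def summarize_runs(fps_runs: list[float]) -> dict:
    """Mean and spread of several repeats of the same benchmark scene."""
    return {
        "mean": statistics.mean(fps_runs),
        "stdev": statistics.stdev(fps_runs) if len(fps_runs) > 1 else 0.0,
        "runs": len(fps_runs),
    }

# Hypothetical stock vs. overclocked results for one game's benchmark:
stock = summarize_runs([71.2, 70.8, 71.5, 70.9])
oc = summarize_runs([78.4, 79.1, 78.8, 78.6])
gain_pct = (oc["mean"] / stock["mean"] - 1) * 100
print(f"stock {stock['mean']:.1f}FPS, OC {oc['mean']:.1f}FPS, +{gain_pct:.1f}%")
```

If the stdev across repeats is anywhere near the size of the stock-vs-OC difference, you need more runs (or a more repeatable test scene) before trusting the number.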