Overclocking the 2080 series

I was just reading a post by Unwinder (the maker of Afterburner); he has been running an RTX 2080 through its paces and has had it stable up to 2100MHz. Looking good so far.

Forum visitors,
I have already shared my impressions of NVIDIA Scanner technology from a software developer's point of view. Now I'd like to post my impressions from an end user's and overclocker's POV.

I was never a real fan of automated overclocking, because reliability was always the weakest spot of overclocking automation. NVIDIA Scanner is not a revolution, as many newsmakers are calling it, simply because different forms of automatic overclocking have existed in both NVIDIA and AMD drivers for a couple of decades, if not more (for example, NVIDIA has had it since the CoolBits era, and AMD had it in Overdrive). However, it was more of a toy and a marketing feature, ignored by serious overclockers, because everybody was used to the fact that it traditionally crashed much more often than it actually worked. Various third-party tools also tried to implement their own solutions for automating the overclocking process (the best of them being the excellent ATITool by my good old friend w1zzard), but reliability of the result was the key problem there as well.

So I was skeptical about the new NVIDIA Scanner too and had serious doubts about including it in MSI Afterburner. However, I changed my mind after trying it in action on my own system with an MSI RTX 2080 Ventus card. Yes, it is not a revolution, but it is certainly an evolution of this technology. During approximately two weeks of development, I ran a few hundred automatic overclock detection sessions. None of them resulted in a system crash during detection. None of them wrongly reported abnormally high clocks as stable. The worst thing I observed during detection was a GPU hang that was recovered during the scanning process, and the scanner was always able to continue after recovering the GPU at the software level, lowering the clocks until it found a stable result. In every case it detected a repeatable overclock of approximately +170MHz on my system, resulting in the GPU clock floating in the 2050-2100MHz range in 3D applications after applying that overclock. Even for the worst case (i.e. a potential system crash during detection), the Scanner API contains recovery mechanisms, meaning you can simply click “Scan” once more after rebooting the system and it will continue scanning from the point before the crash. But I simply couldn't make it crash to test that case, so I emulated it by killing the OC Scanner process during automatic overclock detection.

So the embedded NVIDIA workload and test algorithms used inside the Scanner API look really promising to me, and it will be interesting to read the impressions of other overclockers and RTX 2080 owners who try NVIDIA Scanner in action in the coming days and weeks.
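
The behaviour Unwinder describes (apply a clock offset, run the embedded test workload, recover from hangs at the software level, and back off until a stable result is found) amounts to a step-and-test search loop. Below is a minimal, hypothetical Python sketch of that general idea; it is not the actual Scanner API, which is only exposed to tool developers, and the apply_core_offset, run_test_workload, and recover_gpu helpers are made-up stand-ins with simulated behaviour so the example runs on its own.

# Illustrative sketch of an automatic overclock scan loop.
# NOTE: the helpers below are hypothetical stand-ins, not real NVAPI or
# Scanner API calls; their behaviour is simulated so the sketch runs.

STEP_MHZ = 15            # clock offset added per iteration
MAX_OFFSET_MHZ = 300     # upper bound for the search
SIMULATED_LIMIT = 170    # pretend the card is stable up to +170MHz

_current_offset = 0

def apply_core_offset(offset_mhz: int) -> None:
    """Stand-in for a driver call that applies a GPU core clock offset."""
    global _current_offset
    _current_offset = offset_mhz

def run_test_workload() -> bool:
    """Stand-in for the embedded test workload; True means no errors.
    Raises RuntimeError to mimic a GPU hang well past the stable point."""
    if _current_offset > SIMULATED_LIMIT + 2 * STEP_MHZ:
        raise RuntimeError("GPU hang")
    return _current_offset <= SIMULATED_LIMIT

def recover_gpu() -> None:
    """Stand-in for software-level GPU recovery after a hang."""
    pass

def scan_for_stable_offset() -> int:
    """Raise the offset until the workload fails, then keep the last pass."""
    best = 0
    offset = STEP_MHZ
    while offset <= MAX_OFFSET_MHZ:
        apply_core_offset(offset)
        try:
            stable = run_test_workload()
        except RuntimeError:       # hang detected mid-test
            recover_gpu()          # recover and treat this step as unstable
            stable = False
        if not stable:
            break                  # previous offset was the last stable one
        best = offset
        offset += STEP_MHZ
    apply_core_offset(best)        # leave the card at the stable offset
    return best

if __name__ == "__main__":
    print(f"Detected stable offset: +{scan_for_stable_offset()}MHz")

The real scanner also persists its progress so that a scan can resume after a reboot, which is the crash-recovery mechanism mentioned above; the sketch omits that detail for brevity.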
 
At less than 1000mV, IIRC, too. So it seems to do more with less voltage than Pascal does.
 
It's even more impressive when you take into consideration the size of the chip itself. If the price were reasonable, Nvidia would have another home run on their hands.
 
I wouldn’t worry about nVidia too much. I think they will do just fine.

If supply catches up, I am sure prices will fall. I don’t think that will happen for a long, long time, though.
 
They pretty much always do well. I would really love to buy a Ti, but I'm just not in a position to get one. I have been waiting for ray tracing to see the light of day for years; I'm just a bit upset it's out of my reach at this stage.
 