Those of you with 3000 series cards...is the only way to get 120 Hz to select the "custom" PC resolution rather than one under the Ultra HD listings? RGB and 10 bpc are the max via HDMI 2.1, correct? What are the ideal settings otherwise (both on the TV and in the Nvidia Control Panel)?
Just picked up a ROG Strix 3080 at my local Microcenter in Columbus, OH. They had a "couple" of these as well as a 3080 Tuf, Zotac 3090 and also a Strix 3090.
Almost went for the 3090 but thought better of it. Good luck to all.
I initially had a hard-on for the 3090 when it was announced, then came to my senses when the performance numbers were revealed. Not sure what I was expecting, I guess.
Honestly I just want a 3080 for now to be able to use HDMI 2.1.
For a mix of SDR (mostly) and HDR gaming at 4K 60 Hz until I move to HDMI 2.1...what's the general consensus on RGB 8-bit full vs. 4:2:2 10/12-bit limited? Lots of conflicting descriptions out there, but these two settings seem to be the most often recommended.
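For what it's worth, the underlying question (here and in the 120 Hz post above) is mostly link-bandwidth arithmetic. A rough sketch, assuming the standard CTA-861 4K timings and the usual encoding overheads (8b/10b for HDMI 2.0 TMDS, 16b/18b for HDMI 2.1 FRL); the exact figures your TV/GPU negotiate may differ:

```python
# Rough HDMI bandwidth arithmetic (sketch, not a spec-exact calculator).
# Assumes CTA-861 4K timing of 4400 x 2250 total pixels per frame.

def required_gbps(h_total, v_total, refresh_hz, bits_per_component, chroma="RGB"):
    """Raw link bandwidth a video mode needs, in Gbit/s."""
    # 4:2:2 on HDMI is carried at 24 bits/pixel regardless of component depth,
    # which is why 4:2:2 10/12-bit squeezes into the same bandwidth as RGB 8-bit.
    bpp = 24 if chroma == "4:2:2" else 3 * bits_per_component
    pixel_clock_hz = h_total * v_total * refresh_hz
    return pixel_clock_hz * bpp / 1e9

HDMI_2_0_EFFECTIVE = 18.0 * 8 / 10   # ~14.4 Gbps after 8b/10b encoding
HDMI_2_1_EFFECTIVE = 48.0 * 16 / 18  # ~42.7 Gbps after 16b/18b encoding

# 4K60 over HDMI 2.0:
print(required_gbps(4400, 2250, 60, 8))            # RGB 8-bit:    14.256 Gbps, just fits
print(required_gbps(4400, 2250, 60, 10))           # RGB 10-bit:   17.82 Gbps, does NOT fit
print(required_gbps(4400, 2250, 60, 12, "4:2:2"))  # 4:2:2 12-bit: 14.256 Gbps, fits

# 4K120 over HDMI 2.1:
print(required_gbps(4400, 2250, 120, 10))          # RGB 10-bit: 35.64 Gbps, fits
print(required_gbps(4400, 2250, 120, 12))          # RGB 12-bit: 42.768 Gbps, just over the limit
```

So the numbers back up the usual advice: at 4K60 on HDMI 2.0 you pick either RGB 8-bit full or 4:2:2 10/12-bit limited because RGB 10-bit simply doesn't fit, and at 4K120 on HDMI 2.1, RGB 10 bpc is the practical ceiling without DSC.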
First time using a TV as a monitor, but is it "normal" to have a TINY bit of overscan in certain games intermittently? And I mean tiny...like 1 mm shaved off the right side of the screen that goes away after turning the TV off and on, and even then it doesn't always show up. I have the latest...
Performance seems fine to me: 1440p on a 2080 Ti with max settings and volumetric fog turned down a notch, getting about 100-110 fps in most instances. It's just that Claptrap dancing loading screen for 2-3 minutes on startup that's annoying...is that related to selecting DX12, as I've read in some...
Just getting started, but what's the point of equipping a grenade mod when by default you can still throw unequipped grenades (as long as you have no 2nd action skill)? Is it for later, when you get different types of grenades?