RTX 2080 Super vs. 3060Ti (OEM)

Cannibal Corpse

I have the option to obtain a FREE OEM 3060 Ti (pulled from a Dell XPS) to replace my existing eVGA 2080 Super.

I know these cards might be neck and neck as far as PERFORMANCE is concerned, but feature-wise, the HDMI 2.1 port (4K120 12-bit HDR) on the 3060 Ti seems very appealing to me.

One thing that concerns me is the number of ray-tracing cores on the 3060 Ti:

RTX 2080 Super vs. RTX 3060 Ti:

Ray Tracing Cores: 48 (1st Gen) vs. 38 (2nd Gen)
Texture Rate: 348.5 GTexel/s vs. 253.1 GTexel/s
Tensor Cores: 384 (2nd Gen) vs. 152 (3rd Gen)

Should You Go for the 3060 Ti Over the 2080 Super?


If you're picking between the two, the obvious choice is the 3060 Ti. Not only is it slightly more powerful than the 2080 Super, but it's also more power efficient and significantly more affordable at SRP. Even at inflated market prices, the 3060 Ti is still more affordable and offers more FPS per dollar than the older GPU.

Don't be fooled by the 2080 Super's higher model number. Even if its equivalent 30-series card by name is the 3080 Ti, Nvidia's Ampere architecture can now deliver similar performance for half the price. So, if you're in the market for a mid-range GPU, you should go for NVIDIA's latest to get the best balance between price, performance, and efficiency.

Source:

https://www.makeuseof.com/gpu-comparison-nvidia-3060-ti-vs-2080-super/


Overall, it’s hard to argue in favor of the GeForce RTX 2080 Super against the RTX 3060 Ti, no matter what the intended use will be. Even at half its original price, the GeForce RTX 2080 Super would still not be worth the investment – not so much for the difference in performance, which is not that significant, but due to the newer architecture and technologies of the RTX 3060 Ti that will be further optimized as the new generation of games progresses.

Source:

https://premiumbuilds.com/comparisons/rtx-3060-ti-vs-rtx-2080-super/#Verdict

Should I go for it? (Remember, it's FREE, and it is a 30-series card.)
Thanks!
 
This card will be connected to my newly purchased Sony XBR A80K, which supports VRR, a 120Hz refresh rate, 4K120 12-bit HDR, and all the new bells and whistles. I'm planning to play recent games (Elden Ring, COD MWII, etc.).
 

The 3060Ti is coming off of this XPS system (sold at COSTCO Wholesale, here in the US):
https://www.costco.com/dell-xps-895...x-3060-ti---windows-11.product.100853541.html
Kind of a bummer to go from an eVGA card to a generic one, but I'd probably still take the 3060 Ti. I had an MSI one for a while; solid card.
 
I know, but if I get 4K120 12-bit HDR, that would be reason enough to replace my 2080 Super.
 
3060ti will still be complete garbage for that resolution and desired refresh rate. But yes, it would be better than a 2080 super.
 
Garbage for 4K resolution? Then what is it good for, 1080? LOL!
So are you unaware that there is a resolution between 1080p and 4K? I mean, have you not paid attention to any of the reviews? No one in their right mind would think a 3060 Ti is appropriate for modern demanding games on a 4K high-refresh-rate screen. That is just simple common sense for anyone with even the least bit of hardware knowledge. You could still play games at 4K, but obviously you'll be greatly reducing the settings if you even want to get 60 FPS. A 3060 Ti is clearly more appropriate for 1440p. At 1080p it will tear through games with ease.
 
For me, it would be the bare minimum for 1440p these days. However, I can tell you it's absolutely not enough for 4K unless you're fine with DLSS Performance mode, etc.
 
It is fine for the 2 games you mentioned specifically (Elden Ring, Modern Warfare 2).
Unless he's going to greatly reduce the settings, Elden Ring cannot run at 60 FPS at 4K on a 3060 Ti.
 
I have been able to play the new MWII at 4K with all settings set to MAX on my 2080 Super and get 60+ FPS in both the campaign and multiplayer. Elden Ring also plays nicely, hovering around 45~58 FPS on the Super, so I am not sure what you are talking about.
 
I said Elden Ring cannot get 60 FPS at 4K without reducing the settings, and then you confirm that by telling me you got 45 to 58 FPS, and then ask what I'm talking about? And I could name many more popular games that cannot average 60, never mind maintain 60, so no, a 3060 Ti is not an appropriate card for a 4K high-refresh monitor, and you just confirmed that yourself...
 
Why are you so upset and angry all the time? This is beyond ridiculous; you need to relax.
 
RTX 3060 Ti should provide a small but definite improvement over the 2080 Super. Based on this chart you should see a 10% performance improvement and maybe in ray-tracing applications it will be even better(?)

https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

I certainly wouldn't recommend paying for such a small upgrade but given that you're getting this gpu for free, it's pretty hard to recommend against it. You will also save 50W on your TDP (2080 Super 250W, 3060Ti 200W) and so this could free up some PSU capacity for you to do a CPU upgrade or something down the road if you want.
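To put that 50W in rough numbers, here's a back-of-the-envelope sketch; only the two board-power (TDP) figures come from the spec sheets, while the PSU rating and the non-GPU system draw below are hypothetical placeholders, not your actual hardware:

```python
# Rough PSU headroom estimate. Only the two GPU board-power (TDP) figures come
# from the spec sheets; the PSU rating and rest-of-system draw are assumptions.
PSU_WATTS = 650        # assumed PSU rating (placeholder)
SYSTEM_DRAW = 250      # assumed CPU + motherboard + drives + fans under load (placeholder)

GPUS = {"RTX 2080 Super": 250, "RTX 3060 Ti": 200}  # board power (TDP), watts

for name, tdp in GPUS.items():
    total = SYSTEM_DRAW + tdp
    headroom = PSU_WATTS - total
    print(f"{name}: ~{total} W total draw, ~{headroom} W headroom on a {PSU_WATTS} W PSU")
```

Whatever the real numbers are for your build, the 3060 Ti frees up roughly 50W of that budget for a future CPU upgrade.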

Also, as much as I love eVGA, the general consensus here now is that the advantages of owning an eVGA card are basically moot. Customer service was always eVGA's big advantage over the other OEMs, but that is likely no longer the case now that eVGA has exited the market. As a result, I wouldn't necessarily be partial to eVGA at this point. I don't know which OEM made your 3060 Ti, but unless it's Gigabyte (I have heard nightmarish things about Gigabyte here on this board and will never buy from them), I probably wouldn't sweat leaving eVGA too much.
 
One additional dark horse in the room, if offered both cards at the same price:

If you need to maximize frame rate amplification technologies (DLSS), especially for brute-force framerate-based motion blur reduction, and don't care about latency, the 3060 Ti with DLSS 3.0 seems to yield a significant performance advantage over the RTX 2080 Super, which only supports DLSS 2.0.

People who hate strobe/flicker based motion blur reduction need to use strobeless motion blur reduction via brute framerate technique.

DLSS 3.0 is one of the better strobeless motion blur reduction technologies at the moment if you don't care about latency (solo gaming like Cyberpunk 2077, The Witcher 3, etc). Configured in certain ways, the artifacts from DLSS are less than the side effects of strobing (strobe backlights like ULMB, ELMB, VRB, LightBoost, etc): all the flickering, the stroboscopic effects, the light loss, and the amplified jittering. And none of the light loss of strobe backlights. Brute Hz permitting brute framerate-based motion blur reduction FTW!

Currently, DLSS 3.0 can increase framerates 3-4x when configured at certain performance settings (if not CPU limited), leading to a 3-4x reduction of display motion blur in tests, especially on a high-Hz OLED, since frametime = persistence = motion blur on a sample-and-hold display.
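As a rough worked example of the frametime = persistence = motion blur point (a sketch only; the panning speed is an assumed example value, not a measurement):

```python
# Sample-and-hold motion blur sketch: blur width ≈ panning speed × frame persistence.
# The panning speed (one 4K screen width per second) is an assumed example value.
PAN_SPEED_PX_PER_S = 3840

for fps in (60, 120, 240):
    persistence_s = 1.0 / fps                  # full persistence on sample-and-hold
    blur_px = PAN_SPEED_PX_PER_S * persistence_s
    print(f"{fps:3d} fps -> {persistence_s * 1000:4.1f} ms persistence -> ~{blur_px:.0f} px of motion blur")
```

So quadrupling the framerate (60 -> 240 fps) shrinks the blur trail from ~64 px to ~16 px at that panning speed, which is the 3-4x reduction described above.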

Even though there are some softening artifacts, they're not enough to undo the 3x+ reduction of display motion blur you get without using a strobe backlight (just make sure you have the hertzroom to capture the framerate-based motion blur reduction).

Even when you dial back DLSS 3.0 settings to match DLSS 2.0 frame rates, the image quality ends up being better at the milder DLSS 3.0 settings versus maximized DLSS 2.0 settings.

Someday, flashy flickery motion-blur-reducing backlights or black-frame-inserters, as good as they are, will be obsolete for future PC gaming.

Real life does not flicker; we only need to get ever closer to real life's infinite frame rate through ever higher framerates and refresh rates.

Brute framerate-based motion blur reduction is the future of PC gaming (long-term, especially if latency is solved by adding mouse coordinates to the frame rate amplifier logic in future). It works well especially at current 3x+ framerate-increase ratios, and soon 5x-10x ratios when reprojection is added.
 
EVGA being a no-go for video cards is correct. They can't even RMA my 3080 Ti FTW3 Ultra Hydrocopper, so I am forced to settle for a 3080 Ti XC3 Ultra Hydrocopper, which is a downgrade. I am lucky, though: because of new old stock I am getting a brand new, unused card. Otherwise I'd be out of luck big-time. Never getting another EVGA card ever again, albeit I didn't know they were gonna wake up one morning and decide to leave the GPU market after 15 years or whatever, so there's that.
 
The 30 series doesn't support the DLSS 3.0 feature add (frame generation).
 
Oh, my bad. Yes, that's only a 4000-series feature! Both still use DLSS 2.0.

I hadn't had my coffee yet.
 
I'd go with the 3060 Ti over the 2080 Super. If your target is 4K 120fps, it's not going to cut it for a lot of modern games, and 4K 60fps is so-so. My concern with the "60fps" average of a 3060 Ti is the 1% dips into the 30fps range. Take a game like Elden Ring, where a well-timed roll or parry requires frame consistency, and your timing gets messed up by a frame dip. Average frame rate isn't the only thing you should look at, but rather how bad the 0.1% and 1% lows are and what situations trigger them (CPU-bound dips vs. GPU-bound dips).
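If you want to check that yourself rather than trust an FPS average, the usual approach is to log frametimes with a capture tool and compute the percentile lows from them. Here's a minimal sketch of that calculation; the frametime list is made-up sample data, not a real capture:

```python
# Minimal 1% / 0.1% "low FPS" calculation from a frametime log (milliseconds).
# The data below is fabricated for illustration: mostly 60 fps with a few dips.
frametimes_ms = [16.7] * 950 + [25.0] * 40 + [33.3] * 10

def low_fps(frametimes, worst_fraction):
    """Average FPS over the slowest `worst_fraction` of frames (0.01 = 1% lows)."""
    worst = sorted(frametimes, reverse=True)       # slowest frames first
    n = max(1, int(len(worst) * worst_fraction))
    return 1000.0 / (sum(worst[:n]) / n)

avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
print(f"average : {avg_fps:.1f} fps")
print(f"1% low  : {low_fps(frametimes_ms, 0.01):.1f} fps")
print(f"0.1% low: {low_fps(frametimes_ms, 0.001):.1f} fps")
```

In this made-up trace the average is ~58 fps but the 1% lows sit around 30 fps, which is exactly the kind of gap that ruins roll/parry timing even when the average looks fine.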
 
UPDATE
OK, I got the 3060 Ti card and installed it, and I get a garbled image in Windows 10 (BIOS graphics show fine). I tried the latest drivers and did a CLEAN install; I even completely removed the drivers (from Device Manager), and in all instances, upon booting into Windows 10, I get a garbled image.

If I replace it with my 2080 Super, Windows 10 displays everything just fine.

Any suggestions?

P.S. I am downloading an older version of the drivers to see if it helps.

Update 2
Older drivers didn't help either; I still get the garbled image.

I am using a 120Hz refresh rate over an HDMI 2.1-capable cable on my Sony A80K display. I am going to lower the refresh rate on the 2080 Super, then install the 3060 Ti and see if that helps.

Any suggestions would be highly appreciated!
 
I needed to re-reinstall my 3070 driver twice (clean) for the image to show up clear and not out of whack. Was weird. If that doesn't work, the card could be faulty, since you have tried other drivers as well. If it's the same HDMI cord that is known to be good, then we're running out of options. Are you on Win 11?
 
No I am on Windows 10.

UPDATE 3: SUCCESS!!!

OK, per your recommendation, I did a clean reinstall of the latest drivers on the 2080 Super, lowered the refresh rate from 120Hz to 60Hz, and shut down. Swapped the cards, and I indeed get a sharp image!
Now I've bumped the refresh rate back up to 120Hz, and so far so good.

The Advanced Display page in Windows reported 8-bit HDR, so I toggled Enable HDR in the Windows display settings, and it now reads 10-bit HDR.

I thought the 30-series could do 12-bit HDR.

What settings do you recommend at this point?

UPDATE 4
I can change some settings in the NVIDIA Control Panel as well, and indeed I changed it to 12-bit HDR.

What about Output Color format:

RGB

or

YCbCr444


or

YCbCr420
 
RGB > YCbCr 4:4:4 > YCbCr 4:2:0


YCbCr adds colorspace conversions. It's mostly harmless, but why add extra steps?

4:2:0 is chroma subsampling. Not great for computers. Or for anything else.
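For what it's worth, a rough bandwidth calculation helps explain both why 4:2:0 exists as a fallback and why the cable matters so much at 4K120. This is only a sketch of raw active-pixel data rates; it deliberately ignores blanking intervals and HDMI link (TMDS/FRL) encoding overhead, so real link requirements are somewhat higher:

```python
# Raw pixel-data rate for 3840x2160 @ 120 Hz at different bit depths and chroma
# formats. Ignores blanking and link encoding overhead on purpose.
WIDTH, HEIGHT, HZ = 3840, 2160, 120

# Components per pixel: RGB/4:4:4 carries 3 full samples, 4:2:2 ~2, 4:2:0 ~1.5.
CHROMA_FACTOR = {"RGB / 4:4:4": 3.0, "YCbCr 4:2:2": 2.0, "YCbCr 4:2:0": 1.5}

for fmt, factor in CHROMA_FACTOR.items():
    for bpc in (8, 10, 12):
        gbps = WIDTH * HEIGHT * HZ * bpc * factor / 1e9
        print(f"{fmt:12s} {bpc:2d}-bit: ~{gbps:5.1f} Gbit/s of pixel data")
```

Even before overhead, 4K120 RGB at 10-bit or 12-bit lands around 30-36 Gbit/s, far beyond an HDMI 2.0-class link, which is why a proper Ultra High Speed (HDMI 2.1) cable makes or breaks this setup; 4:2:0 exists to squeeze the signal onto weaker links at the cost of chroma detail.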
 
RGB gives you the full-fat color. Games look vibrant and look best in RGB HDR, imo.
I've tweaked the other settings in the Nvidia Control Panel, and some of them cause unforeseen issues, so now I just leave everything at default. RGB & 144Hz and that's it.
 
Thanks for the info.
The video card was working fine, but as soon as I restarted my computer, I got the same garbled image again. This 3060 Ti came from a brand new Dell XPS machine, so I am sure it is not faulty.
This is driving me nuts!
 
Damn. Just exhaust all options. It might be a pain in the ass, but I would.

Try another HDMI cable.
Reseat the card.
Re-plug the power connectors.
Reinstall the driver clean again.
Reinstall Windows 10 clean and let GeForce Experience install the driver automatically first.

These are the troubleshooting steps I would take. When all else fails, I always do a fresh, clean wipe and formatted install of Windows 10 or 11 64-bit.

If that doesn't work, we need more technical help from some of the other wizards here.

Make sure your motherboard BIOS is updated along with all the motherboard drivers.
 
My EVGA 3090 gave a garbage screen at 120Hz with a cable that worked with the 6900 XT and 3080 Ti. Bought a Zeskit 8K HDMI 2.1 cable and it's perfect; RGB 8/10/12-bit all work flawlessly. The old cable was loose in the EVGA 3090, while the Zeskit cable was tight; I almost could not get it into the 42" C2. In hindsight, I did have a small problem with the 3080 Ti on the old cable, but it mostly worked; not so with the 3090.

https://www.amazon.com/dp/B07S1CGQ9Z?psc=1&ref=ppx_yo2ov_dt_b_product_details
 
I have the same Zeskit cable. Still needed to reinstall the driver twice, interestingly enough, lol.
 
Great! I am going to change the cable and see if it makes any difference. It is an $80 Rocketfish cable from Best Buy.

Which cable do you recommend? And I will order it ASAP!
 
I got this one personally in 10ft, so I have extra slack to move the rig or display around as I please.

https://www.amazon.com/48Gbps-Compa...prefix=zeskit+may,electronics,110&sr=1-3&th=1

Although be aware that it may not be the cable; it may be a software or hardware issue. Just saying, it's always good to have an additional cable around for troubleshooting purposes, to do the process of elimination, you know what I mean.
 
Ya, the new cable feels good; when it has resistance going in, you know it's making good contact lol
 
It has GOT to be the cable, as when I unplug and re-plug it (from the TV side), the picture becomes normal!
 
Frankly, it was unreal how loose the plug on the old, supposedly 8K cable was. Maybe EVGA used some bigger-size HDMI receptacle (whatever it is called).
 
OK, ordered the cable, and now, even after a couple of reboots and shutdowns, the image seems to be stable and normal.
 