RTX 2080 Super vs. RTX 3080 (on a Kaby Lake)

Cannibal Corpse

[H]ard|Gawd
Joined
Sep 22, 2002
Messages
1,277
Hello all,
If I upgrade to an RTX 3080, would I notice a huge difference from my current setup? My display is a single Sony 4K (XBR-55X950G) display.

What about the ray tracing aspect? Does the 3080 have better ray tracing?

My specs:

Intel® i7-7700K 4.2GHz (Kaby Lake)
GIGABYTE Aorus GA-Z270X-Gaming K7 (rev. 1.0)
CORSAIR Vengeance LED (Red) 16GB DDR4 3200MHz
EVGA RTX 2080 Super
SEASONIC 850W Prime Titanium PSU


Thank you in advance!
 
Depends on the game, but it should be about a 55% stock performance increase, and comparing OC to OC it's probably closer to 45%, since the 3080 at stock runs much closer to its performance limits. Since you are limited to 60 Hz on that display, you have to decide if it's worth paying $800-ish to hit your framerate target.
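To make the 60 Hz cap concrete, here's a quick back-of-the-envelope sketch (plain Python; the baseline FPS numbers are made up for illustration, not benchmark results):

```python
# Rough sketch of the uplift math above -- baseline FPS values are
# hypothetical placeholders, not measurements from any benchmark.
REFRESH_CAP_HZ = 60  # the XBR-55X950G is a 60 Hz panel

def effective_fps(baseline_fps: float, uplift: float, cap_hz: float = REFRESH_CAP_HZ) -> float:
    """FPS you actually see after a given uplift, clamped to the display refresh."""
    return min(baseline_fps * (1.0 + uplift), cap_hz)

for base in (40.0, 55.0, 70.0):
    stock = effective_fps(base, 0.55)  # ~55% stock vs. stock
    oc = effective_fps(base, 0.45)     # ~45% OC vs. OC
    print(f"baseline {base:>4.0f} fps -> stock {stock:.0f}, OC-to-OC {oc:.0f} (capped at {REFRESH_CAP_HZ})")
```

The point being: any baseline above ~40 fps gets clamped to 60 either way, so on a 60 Hz panel much of the uplift is invisible.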
 
Depends on the game, but it should be about a 50% performance increase. Since you are limited to 60 Hz on that display, you have to decide if it's worth paying $800-ish to hit your framerate target.

Wow, 50% is huge. But yes, unfortunately I am limited to 60Hz.

p.s. I just looked at my joined date on these forums: 2002! I am an old fart!
 
2080 Super is a good card; I'm still riding a 2080 Ti. I'm gonna wait until Big Navi/20GB 3080s come out before deciding. For me it's about a 30% performance increase to upgrade (20% when comparing OC to OC), so I'm not in a huge rush yet, considering I too am stuck at 60 Hz.
 
2080 Super is a good card; I'm still riding a 2080 Ti. I'm gonna wait until Big Navi/20GB 3080s come out before deciding. For me it's about a 20-25% performance increase to upgrade, so I'm not in a huge rush yet, considering I too am stuck at 60 Hz.

I know; I've only seen a few games using ray tracing so far.

Another question:
Is my current CPU/MoBo/RAM *adequate* for upcoming games at 4K on a single display? In my experience, the CPU/MoBo are no longer *that big of a deal* (unlike the Win98/2000 era), as long as you have a fast GPU.
 

At this time, only if you plan on upgrading to a higher-refresh-rate display. 60 Hz at 4K is not much of a challenge for a CPU to hit, and it should be fine for indie games since they tend to be mostly single-thread limited, but AAA games are becoming more and more multithreaded. If you upgrade in the future for 4K120 you should get at least an 8-core CPU. The only game off the top of my head where a 4-core is a major problem right now is Detroit: Become Human; you really need a 6-core minimum to run that smoothly even at 60 fps.

RAM should be fine for most games, but you'll want 32GB for next-gen large open-world games; right now really only Microsoft Flight Simulator shows a major gain from 32GB AFAIK. But RAM is cheap, and 32GB kits are like $100 on sale.
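To put the 4K60 vs. 4K120 CPU question in numbers: the per-frame budget halves when the target framerate doubles. A quick sketch (plain Python, just the arithmetic):

```python
# Per-frame time budget at a given framerate target -- halving the
# frame time is why 120 fps leans so much harder on the CPU than 60 fps.
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available to produce each frame at the target framerate."""
    return 1000.0 / target_fps

for target in (30, 60, 120):
    print(f"{target:>3} fps target -> {frame_budget_ms(target):.1f} ms per frame")
```

At 60 fps the CPU has roughly 16.7 ms per frame for game logic, draw calls, etc.; at 120 fps that drops to about 8.3 ms, which is where extra cores (and games that actually spread work across them) start to matter.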
 

Thanks for the input. Yes, gaming at 120Hz is becoming the norm: I remember upgrading to a 144Hz display (still at 1080p), and the smoothness/responsiveness in FPS games was really noticeable.

Even the upcoming consoles from Sony and Microsoft are slated to be 4K120Hz systems.
 

It's mainly because they have HDMI 2.1 that they can output 4K120. But the hardware is basically 2080/2080 Super level; that's why a lot of devs like Ubisoft say they are targeting 4K30 on next gen.
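The HDMI 2.1 point is really just a bandwidth calculation. A rough sketch (plain Python; this counts raw pixel data only and ignores blanking/encoding overhead, so real link requirements are somewhat higher):

```python
# Why HDMI 2.1 matters for 4K120: raw pixel bandwidth alone already
# exceeds HDMI 2.0's ~18 Gbps limit (blanking/encoding add even more).
def raw_gbps(width: int, height: int, fps: int, bits_per_pixel: int = 24) -> float:
    """Uncompressed video bandwidth in Gbit/s (8-bit RGB = 24 bpp)."""
    return width * height * fps * bits_per_pixel / 1e9

print(f"4K60:  {raw_gbps(3840, 2160, 60):.1f} Gbps")   # fits under HDMI 2.0's ~18 Gbps
print(f"4K120: {raw_gbps(3840, 2160, 120):.1f} Gbps")  # needs HDMI 2.1's ~48 Gbps
```

So 4K60 squeaks through HDMI 2.0, but 4K120 can't fit, which is why the new consoles' HDMI 2.1 ports are what unlock it, not raw GPU power.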
 
Check HardwareCanucks' recent CPU scaling tests on the RTX 3080; they showed the CPU isn't as important as many think.
 

This thread should prove useful = )
 