Upgrading - RTX 4080 or OLED display?

OpenSource Ghost

Currently running an 8700K @ 5.0 GHz, an RTX 3060 Ti, and a 1440p G-Sync VA monitor. I have enough to either get an RTX 4080 or an OLED (4K or ultrawide resolution), but can't decide which is going to affect my gaming experience more.
 
They both will? But I think I won't be the only one saying: the 4080 isn't worth the money. The 7900XTX will likely eat its lunch for at least $300 less. If you're not going to buy a 4090 then I probably wouldn't buy an nVidia card period. Basically AMD is already poised to beat nVidia at every performance tier for less money. If you're not planning on running anything else that uses CUDA (and perhaps even if you are), I wouldn't bother.

I'd also note that if you're buying any of these top-end cards, it's a bit of a waste if you're not going to play in 4K. And to my knowledge there are zero desktop-sized OLED monitors that are 4K. There are a few 1440p options from LG and Alienware, and that's kind of the end of the list. We're all waiting for the 4K, OLED, 120+ Hz display.

However, the LG C2 42" can be had right now on Black Friday sales for <$900. Otherwise, we're still waiting for monitor perfection.
 
A 4080 is a waste with that CPU, and a new 4K monitor will be starved to hit 60 fps with that PC. Upgrade to a better CPU/mobo and pick up a cheap used 3080 Ti/3090 (or wait on the 7900 XT).
 
I second that the 8700K is unfortunately outdated. It's good paired with the 3060 Ti, but if you go up higher, the CPU will bottleneck the expensive GPU you purchased.
 
Looking at your hardware, it's all balanced somewhat alright. Your CPU is a little older; it would be my target if I were you and wanted to start upgrading, which means a new motherboard, RAM, and CPU. I was in your position and decided to go for a 12700KF, 64 GB of 6000 MHz RAM, and a Z690 to have enough power for the next 5 years. A 3060 Ti would struggle at 4K, and you're already at the 1440p sweet spot. See where the 4070 falls in January, or a used 3080 Ti can be found for great value right now. I had an OLED and returned it because it was dim and the whites looked grey; the LG C2 was a hard pass for me and didn't live up to the hype. Consider rocking what you have: just buy a new game that gets great reviews, enjoy it, and don't even worry about anything. Your setup is balanced, tbh.
 
Good advice here - getting either will utterly kill the balance you now enjoy. You won't be happy with the resulting gaming performance on a new 4K monitor and the new video card would pretty much be wasted on a 1440p monitor. And yes, the 4080 is way overpriced for what it delivers.
 
A 3060 Ti is more than capable of hitting 60+ FPS in modern games without ray tracing or DLSS at UHD 4K resolution. G-SYNC compatibility on the LG OLED makes this largely a non-issue. You probably won't be happy with performance if you try to add ray tracing, though, as it sits around a standard 2080 in terms of performance. DLSS is always an option if you want to try.

I say grab an LG OLED (I'd go with the 48" since the 42" uses a cut version of the old panel). It will give you the biggest improvement in your gaming experience right now.
 
First things first, upgrade the CPU and motherboard. The new i5 13600K is getting rave reviews as a good, affordable gaming processor; find a deal on a Z690 motherboard to go with it.

Then, later, buy a video card.

Then the 4K display. Your 3060 Ti won't be able to handle 4K in gaming. Even a 3080 just barely handles it; you really need a 3090 minimum for good FPS @ 4K, and the new 4080 is the sweet spot for 4K 120 Hz.
 
A 3060 Ti is more than capable of hitting 60+ FPS in modern games without ray tracing or DLSS at UHD 4K resolution. G-SYNC compatibility on the LG OLED makes this largely a non-issue. You probably won't be happy with performance if you try to add ray tracing, though, as it sits around a standard 2080 in terms of performance. DLSS is always an option if you want to try.

I say grab an LG OLED (I'd go with the 48" since the 42" uses a cut version of the old panel). It will give you the biggest improvement in your gaming experience right now.
You seem to be quite delusional about what a 3060 Ti can do at 4K. There are plenty of games that cannot even average, never mind maintain, 60 FPS on max settings at 4K, even without ray tracing. Assassin's Creed Valhalla, Control, Cyberpunk 2077, Dying Light 2, Microsoft Flight Simulator, Elden Ring, Red Dead Redemption 2, Watch Dogs Legion, God of War, Halo Infinite, and some others cannot even average 60, or in some cases even 50, FPS at 4K. And you are looking at minimums in the 40s and even 30s at times in the really demanding games.
 
You seem to be quite delusional about what a 3060 Ti can do at 4K. There are plenty of games that cannot even average, never mind maintain, 60 FPS on max settings at 4K, even without ray tracing. Assassin's Creed Valhalla, Control, Cyberpunk 2077, Dying Light 2, Microsoft Flight Simulator, Elden Ring, Red Dead Redemption 2, Watch Dogs Legion, God of War, Halo Infinite, and some others cannot even average 60, or in some cases even 50, FPS at 4K.
Who said anything about max settings? All those games ran fine when I still had my 2080 Ti, which isn't much stronger than the 3060 Ti.
 
These new OLED gaming displays like the LG C2 42" or Asus ROG 42" OLED are 120 Hz to 138 Hz, so gaming @ 4K and 120 fps requires a pretty beefy GPU. I have an RTX 3080 10GB and an i7 9700K on a Z390 motherboard with 32GB DDR4 3200 MHz, and had the Asus ROG 42" OLED for a brief period. Yeah, it ran games OK, but it felt jittery at times. In Halo Infinite I would range from 80 fps to 120 fps, but mostly in the 80s to 90s, at highest settings. Cyberpunk 2077 was borderline playable; I didn't watch the fps counter, but it felt stuttery.

The only games that ran really smooth for me @ 4K 120 Hz were Doom Eternal and Destiny 2; those had no problem keeping up. World of Warcraft played well too, but at times it would dip into the 60s to 70s fps, while other times it easily kept up at 120 fps.

I just upgraded and am putting together a new build this week: i9 13900K, Z790 motherboard, 32GB DDR5 5600 MHz, and an M.2 SSD (my first one), but reusing the rest of my current system's parts. I'm keeping my 3080 10GB for now, as I just can't justify the $1200+ price of the 4080, and no thanks on the power-hog 4090. I'll wait to see what the 7900 XTX is like, or might just buy a used 3090 Ti next year for dirt cheap if possible.

My opinion for you: there are good deals on combos of an i7 12700K and a Z690 motherboard, so I'd go that route first. But preferably I'd get the i5 13600K; that thing trades blows with the 12900K in games.
 
Who said anything about max settings? All those games ran fine when I still had my 2080 Ti, which isn't much stronger than the 3060 Ti.
Because your comment made it look like ray tracing would be the only compromise to getting 60 FPS. I mean, you said that even without DLSS, as long as you didn't use ray tracing, you'd get 60+ FPS. So now we are compromising on settings, and to do that you would have to compromise quite a bit, because again, there are games that are going to be in the 40s and 30s for minimums. Now if you had simply said that a 3060 Ti can get by at 4K with reduced settings and/or using DLSS, then I never would have said anything, as I fully agree with that.
 
4k OLED @ 120hz Max settings looks AMAZEBALLS

But 4k at medium settings and low frame rates looks a bit blurry and just not nearly as sharp and cool.

I wouldn't buy a 4k OLED monitor to run at lower settings.
 
I can still watch movies on a 4K TV, but my rig won't handle 4K gaming at an enjoyable framerate. Didn't know the 8700K was so outdated... By itself, upgrading to a 13900K won't be much of an improvement with the 3060 Ti.
 
I can still watch movies on a 4K TV, but my rig won't handle 4K gaming at an enjoyable framerate. Didn't know the 8700K was so outdated... By itself, upgrading to a 13900K won't be much of an improvement with the 3060 Ti.

A 13900K is a pretty big jump from an 8700K. Yeah, the GPU does most of the heavy lifting, but buying a new high-end video card like a 4080 paired with an 8700K would mean losing a ton of performance to bottlenecking. I've seen benchmarks showing old processors can make you lose like 20% performance with new video cards.
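Here's a crude back-of-envelope sketch of where a figure like that 20% comes from (the FPS numbers are hypothetical, purely to illustrate the relationship, not benchmarks):

```python
# Crude bottleneck model: the FPS you actually see is capped by
# whichever of the CPU or GPU is slower for a given game/resolution.

def delivered_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    """Delivered FPS is bounded by the slower of the two limits."""
    return min(cpu_fps_limit, gpu_fps_limit)

# Hypothetical 4K numbers: an 8700K able to feed ~100 FPS of draw calls,
# paired with a 3060 Ti (~55 FPS GPU-bound) vs. a 4080 (~130 FPS).
print(delivered_fps(100, 55))             # 55  -> GPU-bound; the old CPU barely matters
print(delivered_fps(100, 130))            # 100 -> now the CPU caps the 4080
print(1 - delivered_fps(100, 130) / 130)  # ~0.23 -> in the ballpark of that "20% lost"
```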

I understand you want a cool new 4k OLED display, because yeah, they're super awesome. But in your situation I just have to recommend you spend that money on [H]ardware first.

An 8700K is a 2017 processor, almost 6 years old. Your video card is from 2020, not that old, and actually still a good card on lower-resolution monitors.

I will repeat, I'd do your upgrade this way;

New CPU, Motherboard, RAM, and M.2 SSD, #1

New video card #2

4k OLED monitor #3
 
First of all, the notion that you should throw money into a new CPU and mobo when you only have a 3060 Ti paired with an 8700K at 5 GHz is insane to me. That chip at 5 GHz is plenty fast. I use a 3090 with it at less than 5 GHz and it absolutely screams. It is old, to be sure, but it is more than sufficient for today's games. But even if you were CPU-bottlenecked, technically speaking, the gains from going to a better GPU would still be tremendous. You WILL need a new CPU eventually, but a 4080 is in no way overkill for an 8700K at 5 GHz. The 4080 may be capable of more with a better CPU, but it will still scream.

Get the fastest card you can afford, PERIOD.

Just wait on OLED. A 4K OLED makes no sense without the better GPU. Too many pixels to push. A 1440p wide OLED could maybe be driven by your current GPU, but it depends on your tolerance for dips in framerate.
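The "too many pixels" point is plain arithmetic; a quick sketch (the ultrawide resolution is assumed to be the typical 3440x1440 panel):

```python
# Pixel counts for the displays being debated, relative to the OP's 1440p.
resolutions = {
    "1440p (current)":   2560 * 1440,
    "3440x1440 UW OLED": 3440 * 1440,
    "4K UHD OLED":       3840 * 2160,
}
base = resolutions["1440p (current)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} px ({pixels / base:.2f}x the 1440p load)")
# 4K is 2.25x the pixels of 1440p; the ultrawide is only ~1.34x,
# which is why the 1440p-wide panel is the gentler jump for a 3060 Ti.
```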

I say GPU now and watch and wait for better pricing on OLEDs as they become more common place. Then monitor.

I’d wait on CPU and board. Just keep in mind with that kind of overclock, that mobo and cpu might give up the ghost at some point over the next few years. That is the only concern I have about that CPU, however.
 
These new OLED gaming displays like the LG C2 42" or Asus ROG 42" OLED are 120 Hz to 138 Hz, so gaming @ 4K and 120 fps requires a pretty beefy GPU. I have an RTX 3080 10GB and an i7 9700K on a Z390 motherboard with 32GB DDR4 3200 MHz, and had the Asus ROG 42" OLED for a brief period. Yeah, it ran games OK, but it felt jittery at times. In Halo Infinite I would range from 80 fps to 120 fps, but mostly in the 80s to 90s, at highest settings. Cyberpunk 2077 was borderline playable; I didn't watch the fps counter, but it felt stuttery.

The only games that ran really smooth for me @ 4K 120 Hz were Doom Eternal and Destiny 2; those had no problem keeping up. World of Warcraft played well too, but at times it would dip into the 60s to 70s fps, while other times it easily kept up at 120 fps.

I just upgraded and am putting together a new build this week: i9 13900K, Z790 motherboard, 32GB DDR5 5600 MHz, and an M.2 SSD (my first one), but reusing the rest of my current system's parts. I'm keeping my 3080 10GB for now, as I just can't justify the $1200+ price of the 4080, and no thanks on the power-hog 4090. I'll wait to see what the 7900 XTX is like, or might just buy a used 3090 Ti next year for dirt cheap if possible.

My opinion for you: there are good deals on combos of an i7 12700K and a Z690 motherboard, so I'd go that route first. But preferably I'd get the i5 13600K; that thing trades blows with the 12900K in games.
You don't need games to run at your screen's refresh rate with G-SYNC.
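For anyone unfamiliar with why that is, here's a rough sketch of the idea (the VRR window bounds below are placeholders; real panels vary):

```python
# With G-SYNC/VRR, the panel refreshes when a frame is ready, so any
# frame rate inside the VRR window displays cleanly: 80 FPS shows as
# 80 Hz with no tearing, rather than juddering against a fixed 120 Hz.
def effective_refresh_hz(fps: float, vrr_min: float = 40.0, vrr_max: float = 120.0) -> float:
    """Refresh rate tracks frame rate while it stays inside the window."""
    return max(vrr_min, min(fps, vrr_max))

for fps in (80, 95, 144):
    print(f"{fps} FPS -> panel refreshes at {effective_refresh_hz(fps):.0f} Hz")
```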
Because your comment made it look like ray tracing would be the only compromise to getting 60 FPS. I mean, you said that even without DLSS, as long as you didn't use ray tracing, you'd get 60+ FPS. So now we are compromising on settings, and to do that you would have to compromise quite a bit, because again, there are games that are going to be in the 40s and 30s for minimums. Now if you had simply said that a 3060 Ti can get by at 4K with reduced settings and/or using DLSS, then I never would have said anything, as I fully agree with that.
Because ray tracing drops your frame rate by at least 40%, so if you can run a game at 70 FPS with mixed settings, adding ray tracing to that will drop it to 42 FPS.
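That math, generalized (the ~40% figure is this poster's rule of thumb; the real penalty varies by game and GPU):

```python
# Apply a flat ray-tracing penalty to a base frame rate.
def fps_with_rt(base_fps: float, rt_penalty: float = 0.40) -> float:
    return base_fps * (1.0 - rt_penalty)

print(fps_with_rt(70))  # 42.0 -> drops below a 60 FPS target, as in the example
```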
 
There are games showing a 3090 gaining performance from an upgrade from a 5900X to a 7900X, let alone a 13900K. As for the 8700K, it will bottleneck even a 3080 Ti, especially when you try running 4K.
 
If you upgrade your CPU first, you'll see almost no benefit, because it's not holding you back currently.
The 4080 is way overpriced; if you wait until next year, prices may drop with the competition and a recession coming, or you could grab an older 3090 Ti at a much lower price.

If I were you, I would do the monitor first and just turn down some settings in game until you get a new card. I would do the CPU last, and only if you feel like it's holding you back after the new graphics card. Personally, I don't think an 8700K to a 13600K will be noticeable at all if playing 4K at max settings with ray tracing.
 
Thing is, even the higher-end 3000-series cards are still selling fairly high. 3090s on Craigslist and Facebook Marketplace are going for $900+, and 3090 Tis are still over $1,000. I'm kind of shocked those cards haven't dropped a lot in price yet.

3080s are still on sale and in stock at Microcenter, and they're asking $800.

I thought that two years later the 3000-series would have dropped by half at this point, or at least a third off. But they're still priced pretty high.
 
I feel a bit is being lost here: an OLED with HDR brings a lot in its own right without running it at 4K. Everyone with a console does that most of the time and enjoys it a lot, and one day it will be driven at 4K 120 fps via good upscaling; it just isn't now.

Depending on your eyes, the game, and your sitting distance, it will be more or less of an issue to start with.

And it's not like there are many cheaper non-4K OLEDs better than a Black Friday promo C2 to choose from anyway, or I could be missing something.
 
OP, exercise caution with the OLED suggestions. OLEDs burn in, and the displays are dim enough to necessitate being in a dark, unlit room. I fell for the hype, bought one, and immediately returned it. The dim display, the extreme burn-in risk, the necessity of a pitch-dark room, the whites looking grey, and the fluctuating ABL are a few reasons why it was a very bad option for me. Unless you knew all this already, be aware; it was a rude awakening for me lol
 
With your two initial choices, you're going to run into problems. The video card isn't going to handle 4K very well, so upgrading only the monitor won't work. The video card is a bad value, and eventually the CPU is going to be a bottleneck for it. With the age and power of the system, certain incremental upgrades aren't going to work.

I've been in a very similar situation as you and am still slowly working out of it. I was on a Ryzen 2600x with a Radeon RX570. I upgraded motherboard, RAM and went with a 5800x first. I saw some gaming benefit but also reaped the benefits in other places because I can always use more CPU power. No matter what it was a required upgrade. The RX570 and 1080p monitor I have are still limiting especially since I've wanted to move up to 1440p for quite a while. I'm saving up for a Radeon 6700xt or 6750xt. Either would be massive overkill for 1080p and that's the way it will be for a while. But with the platform upgrade and then GPU upgrade I'll finally be set for 1440p when I can afford the new monitor.

It would be best for you to upgrade CPU and platform first and ironically it's probably the cheapest option. Gaming performance may or may not be noticeably better than what you currently have but you need that base to build from. Once that's done, I would recommend replacing the GPU when you get the chance and having some extra time before purchasing is only likely to net you a better value on the GPU. Only then would I look to upgrading the monitor.

You're going to end up replacing most of the components of the system but with where you're sitting right now it cannot be avoided.
 
Okay, there are a lot of opinions in this thread.

Start here:


If you want, skip to the conclusions. If you're using a 6-core CPU in a GPU-bound situation, moving to a much faster CPU doesn't matter. If he wants to move to playing games in 4K on a 4080, he'd be FINE. That's not the issue here.
 
It depends on the games the OP really plays, and whether they are GPU-bound or CPU-bound. He'll need to look up the ones he cares about and determine if his setup hits his target FPS at 4K at his desired settings. While he may be "fine" pairing a 4080 with an 8700K, that's also like saying you'd be fine running around on racing-compound tires on a 2016 Elantra four-banger. Sure, it would "work," but those tires were meant to see as much as 200 mph.

You'll see people running high-end cards on 1080p monitors, but they'll be targeting 240 Hz-360 Hz gameplay. Niche cases, but everyone has their intended goals, and most of the opinions here are about pairing things up in more ideal situations. Frame dips at high res also induce sluggish responsiveness, so playing at 4K may make for a 60 fps average, but those 1% lows into the 30s would be maddening to me.
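The reason those lows feel so bad is frame time, which is what you actually perceive; a quick conversion:

```python
# Frame time grows quickly as FPS falls, so dips hurt far more than
# the average-FPS number suggests.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (120, 60, 35):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.1f} ms per frame")
# 120 FPS ->  8.3 ms
#  60 FPS -> 16.7 ms
#  35 FPS -> 28.6 ms (a dip from 60 to 35 nearly doubles each frame's duration)
```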
 
Okay, there are a lot of opinions in this thread.

If you want, skip to the conclusions. If you're using a 6-core CPU in a GPU-bound situation, moving to a much faster CPU doesn't matter. If he wants to move to playing games in 4K on a 4080, he'd be FINE. That's not the issue here.
I'm pairing a 6900 XT with a first-gen Ryzen 1700X (also from 2017). The processor never goes above ~20% usage at 4K. I'm also running an Aorus FO48U, and this monitor will make a 3D-modeled pile of shit look good! Get an OLED; it is indeed a game changer, and there is literally no going back after experiencing one. Turn the brightness down and you won't have to worry about the burn-in boogeyman!
 
I'm pairing a 6900 XT with a first-gen Ryzen 1700X (also from 2017). The processor never goes above ~20% usage at 4K. I'm also running an Aorus FO48U, and this monitor will make a 3D-modeled pile of shit look good! Get an OLED; it is indeed a game changer, and there is literally no going back after experiencing one. Turn the brightness down and you won't have to worry about the burn-in boogeyman!

this right here - you may end up GPU-limited in several games, but everything else will look fantastic

the CPU could limit you, but if you're already running a 1440p G-Sync monitor (assuming at least 144 Hz), it should run exactly the same!
 
I'm pairing a 6900 XT with a first-gen Ryzen 1700X (also from 2017). The processor never goes above ~20% usage at 4K. I'm also running an Aorus FO48U, and this monitor will make a 3D-modeled pile of shit look good! Get an OLED; it is indeed a game changer, and there is literally no going back after experiencing one. Turn the brightness down and you won't have to worry about the burn-in boogeyman!
With all due respect: no, it's not a game changer. It's just a different display technology that has pros and cons. Also, there is going back after experiencing one. I bought one, tried it for a week, and gladly returned it, never to buy another OLED again.
The problem with people that like OLEDs is that they usually only mention the pros and shove it down everyone's throat as if it's the right choice for everyone. It's really not, though.
To the original poster of this thread: make sure you learn all the cons of OLEDs that many OLED fanboys will deflect or omit. I learned them the hard way, because owners of OLEDs usually have heavy buyer's justification to feel better about their purchase.
 
With all due respect: no, it's not a game changer. It's just a different display technology that has pros and cons. Also, there is going back after experiencing one. I bought one, tried it for a week, and gladly returned it, never to buy another OLED again.
The problem with people that like OLEDs is that they usually only mention the pros and shove it down everyone's throat as if it's the right choice for everyone. It's really not, though.
To the original poster of this thread: make sure you learn all the cons of OLEDs that many OLED fanboys will deflect or omit. I learned them the hard way, because owners of OLEDs usually have heavy buyer's justification to feel better about their purchase.
What was your issue with the OLED? My PG27UQ looks like garbage next to my FO48U, and that is/was the best LCD on the market.
 
What was your issue with the OLED? My PG27UQ looks like garbage next to my FO48U, and that is/was the best LCD on the market.
Not issue, issues. Too many. Not even worth discussing lol, there is plenty of information about it everywhere. My 50" mini-LED HDR QN90B at 144 Hz is the last display I'll use for the next 10 years easy, and I am certain it will last like a champ as long as I need it to, with no compromises and not a single worry.
 
Not issue, issues. Too many. Not even worth discussing lol, there is plenty of information about it everywhere. My 50" mini-LED HDR QN90B at 144 Hz is the last display I'll use for the next 10 years easy, and I am certain it will last like a champ as long as I need it to, with no compromises and not a single worry.
 
Going to a 4K monitor requires a much higher-end video card. And at 2560x1440, the **80s are a bit overkill. But it isn't as overkill as people say. I am on that resolution and have been using **70s for a few generations. There are always a few games that make getting a stable 80-90 FPS impossible. So the **80s can certainly help for those few very demanding games.

If you want to go to 4K, get a better GPU. If you go with ultrawide 1440, you won't need as fast of a GPU as for 4K. Could be a good middle ground if you like wide screens. Still, that would be a bit much for a 3060 Ti.
 
Going to a 4K monitor requires a much higher-end video card. And at 2560x1440, the **80s are a bit overkill. But it isn't as overkill as people say. I am on that resolution and have been using **70s for a few generations. There are always a few games that make getting a stable 80-90 FPS impossible. So the **80s can certainly help for those few very demanding games.

If you want to go to 4K, get a better GPU. If you go with ultrawide 1440, you won't need as fast of a GPU as for 4K. Could be a good middle ground if you like wide screens. Still, that would be a bit much for a 3060 Ti.

or he could just run 1440p 120 Hz upscaled on that LG TV (remember when they added that? it looks identical to native 4K for most games!) quit pretending like this is such a fucking gargantuan UPGRADE THAT IT'S COMPLETELY IMPOSSIBLE!
 
I'm playing Batman: Arkham Origins at 4K 144 Hz on a 3080 Ti Hydro Copper and it pushes up to 80-90 percent usage. That game is almost a decade old. Without relying on DLSS upscaling or downsampling or whatever (which I don't like, because I hate noticing the change in resolution or when the background looks noticeably less sharp, etc.), I would say yeah, I wouldn't mind a 4000-series card, preferably a 4090 (not at these prices), to be comfy for the next 3-5 years, and that's coming from a 3080 Ti on water lol.
 
With all due respect: no, it's not a game changer. It's just a different display technology that has pros and cons. Also, there is going back after experiencing one. I bought one, tried it for a week, and gladly returned it, never to buy another OLED again.
The problem with people that like OLEDs is that they usually only mention the pros and shove it down everyone's throat as if it's the right choice for everyone. It's really not, though.
To the original poster of this thread: make sure you learn all the cons of OLEDs that many OLED fanboys will deflect or omit. I learned them the hard way, because owners of OLEDs usually have heavy buyer's justification to feel better about their purchase.
What was your issue with the OLED? My PG27UQ looks like garbage next to my FO48U, and that is/was the best LCD on the market.
Not issue, issues. Too many. Not even worth discussing lol, there is plenty of information about it everywhere. My 50" mini-LED HDR QN90B at 144 Hz is the last display I'll use for the next 10 years easy, and I am certain it will last like a champ as long as I need it to, with no compromises and not a single worry.
I wasn't even going to reply to that, except with yeah, ok then...
But also: OLED fanboys? Really? :bigrolleyes That's a new one, and quite possibly the stupidest one yet.

Posts like that deserve to be called out, but then when it was, it got responded to with basically "too much to list". Again, like, yeah ok.

I remember when I got my BenQ 27" 144 Hz (can't recall the model number, don't care) back in 2016; I thought it was the cat's meow! But looking at it now, it just looks bad in comparison. Terrible backlight bleed along the bottom, and the bottom-right corner in particular. I never really noticed at the time; it's just the way it was. But yeah, it's a night-and-day difference. I also got an Omen 32, which by today's standards is seriously lacking in specs but still has a somewhat rich picture to it.

A reasonable question back to the OP though: why not just wait a few more weeks until the 7900 releases and gets benched? The XTX is pretty much guaranteed to be a better choice than a 4080 anyhow.
 
I wasn't even going to reply to that, except with yeah, ok then...
But also: OLED fanboys? Really? :bigrolleyes That's a new one, and quite possibly the stupidest one yet.

Posts like that deserve to be called out, but then when it was, it got responded to with basically "too much to list". Again, like, yeah ok.

I remember when I got my BenQ 27" 144 Hz (can't recall the model number, don't care) back in 2016; I thought it was the cat's meow! But looking at it now, it just looks bad in comparison. Terrible backlight bleed along the bottom, and the bottom-right corner in particular. I never really noticed at the time; it's just the way it was. But yeah, it's a night-and-day difference. I also got an Omen 32, which by today's standards is seriously lacking in specs but still has a somewhat rich picture to it.

A reasonable question back to the OP though: why not just wait a few more weeks until the 7900 releases and gets benched? The XTX is pretty much guaranteed to be a better choice than a 4080 anyhow.
Yes, OLED fanboys. Really. They only like to talk about the pros, not so much the cons. Brand-new OLEDs & QD-OLEDs are burning in. This is a fact, really. The only thing "stupid", as you suggest, is how this is overlooked. REALLY. If you want an OLED, go ahead; you just can't shove it down my throat. I had one, and meh: dim at full brightness, and it will burn in at full brightness lol.
 
xDiVo may be a bit overly enthusiastic, but he is partially correct. While you can hit the lottery and purchase an OLED that works perfectly for 4+ years, I wouldn't recommend anyone buy one for PC usage atm unless they can afford to purchase a new one every other year.

Even multimillionaire content creators like Linus have videos on their channel showing the multiple issues they've suffered with OLED screens, especially in PC usage, as game HUDs and UI elements can sit in the same place for hours at a time. The preventative tech built into modern OLEDs (such as pixel shift) is helpful but still not adequate.

That said, LG (and some other manufacturers) are now starting to make OLED monitors specifically for PC usage, and the firmware on them will probably be heavily tweaked to suit a typical PC user; we will see the effect of that when reviewers get their hands on them.

(LG has "The world's first 240Hz OLED gaming monitor" listed on their site, which you can pre-order around Xmas for $1,000 US.)
 
I do think there is room for OLED to mature for the desktop. I don't have one yet, but I would never get an LED display after seeing the black levels and contrast OLED is capable of. I was going to plunk a TV on my desk, but now I am waiting for true desktop models.
 