Rumor: 3080 31% faster than 2080 Ti

Yeah, the above needs a truckload of salt. Someone just grabbed some of the Twitter "leaks," printed up some specs on a laser printer, then took a picture.
 
Yep, also I would think their 3080 Ti/3090 would have more than 12GB of VRAM.
 
Well, while some people do, others don't... just as you feel it's not worth golf course access, season tickets, guns, ammo, w/e, he doesn't think it's worth $1200 for a GPU. He's entitled to his opinion just as you are. It doesn't make him wrong or you wrong. Some people don't spend $10k on upgrades for a $50k car and instead buy cheap cars to work on (my daily driver cost me $250 and another $60 to get on the road) and then spend a few $'s here and there to make them faster. Which person is the enthusiast? The one that paid someone else to put stuff on their car, or the one that tore into their own car? It's not just about spending money, it's about getting whatever you can out of what you have. Some people have more discretionary funds than others and different priorities. This used to be the place to buy a Celeron and almost double your frequency... when did it turn into "buy the most expensive stuff or you're not [H]"? If you can only afford to buy third-hand used computer parts and build it yourself, who cares?

That's basically my point. The high end is never a bang-for-the-buck option in any consumer luxury good. If someone can't afford something or thinks it's just "not worth the price," that's perfectly fine, but complaining about the best of the best consumer goods not being cheap just shows that person doesn't understand how business works.

For me, it's not about having the funds, or priorities. It's more about a consumer ethos: I won't pay for a product whose price I think has been artificially jacked up through measures that are insidious and go well past any reasonable free-market argument.

Further, accepting that a $1200 top-end card is reasonable is effectively saying you're OK with the smoke-and-mirrors upward pricing scheme nV has been pushing for the past 5-6 years. nV is banking on people just tacitly accepting that they can charge 30-40% more than in the past. I'm not one of those people.

This happens in EVERY consumer luxury good. The top-end line of any product is always going to be all about "smoke-and-mirrors" on the pricing. You could spend $14k on a bicycle that probably cost $1k to manufacture, $3k on audiophile headphones that probably have $50 in materials, or a million on a Ferrari. All of these things have cheaper alternatives that can be damn near as good but the extra cost is worth it to some people. 20 years ago a bicycle over $3k was unfathomable even on the pro scene but now $3k is an average bike. Manufacturers know that if their product takes off there will be more money to be made in the high-end market.

Being that I have a 2080 Ti that can run anything I throw at it with disgusting gobs of performance, I just don't personally see the need to get a 3000-series card. I will wait for the 4000 series. Or let's see what Big Navi can do, but it probably won't exceed 30% more under any metric.

I'm of the mind that if you're already at the top you might as well sell that card for top dollar right before the new cards come out and then take that cash and put it towards the new top card. $400-600 every two years and staying with the top card is better than $1200 every 4 years IMHO.
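
For what it's worth, the math on that strategy roughly checks out if the resale numbers hold; here's a quick sketch of the per-year cost under each approach (the ~$700 resale value is my own placeholder assumption, not a figure from this thread):

```python
# Quick per-year cost comparison of the two strategies described above.
# The $700 resale value is an assumed placeholder; the rest uses the
# numbers from the post.

def cost_per_year(net_outlay, years):
    return net_outlay / years

# Roll over every generation: buy the new $1200 flagship, sell the old one
# for ~$700 right before launch -> roughly $500 net every 2 years.
rollover = cost_per_year(1200 - 700, 2)

# Skip a generation: pay $1200 and hold the card for ~4 years
# (no resale counted, matching the post's framing).
skip_a_gen = cost_per_year(1200, 4)

print(f"roll over every gen: ${rollover:.0f}/yr")    # $250/yr
print(f"skip a generation:   ${skip_a_gen:.0f}/yr")  # $300/yr
```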

People like you :banghead:
Finding excuses for nearly doubling the price of the top-end card while giving half the usual performance boost.
Get out of here.

It's not excuses; it's an understanding of how markets and life in a capitalist society work. If $1200 were too much then NV wouldn't be able to sell any cards and they'd have to reduce their prices, but obviously there are enough people who do think it's worth it. Because of that, NV may even look to raise prices further in the future. Will I like it if they do? Absolutely not, but I'm sure there are still plenty of people out there with the disposable income who won't even flinch at spending $2k on the best of the best, and there will still be cards that cost what I'm willing to pay.
 
It's WCCFTech. I read the site, but their rumor articles should be taken with a salt shaker minimum, escalating up through sacks of rock salt and salt spreader trucks for GPU leaks.
For sure. I wouldn't even have posted it if it wasn't already a "rumor" thread.
 
30% in what though?

Compute? Non RT? RT?


I assume you are running less than 4K? Seems the top end will be the realm of getting 4K speeds up to 144Hz ultimately. I don't know, I just think AMD is going to be much faster than the 2080 Ti this generation in 4K and possibly close to the 3090. We will see soon, hopefully.
 
There's always a market for the uber-rich wanting to have something the dirty masses can't, and I'm sure binning existing hardware will always make that niche market profitable regardless of its size.

I'd be willing to bet that even without major hardware advances, there is still a lot of meat on the plate for performance improvements by designing the bare metal for the latest Vulkan API rather than staying focused on old, obsolete APIs. Fewer translation layers and workarounds will be the dominant improvement areas, as latency matters more than brute power above a certain threshold.

As Intel and AMD creep into Nvidia's mobile market, it'll be interesting to see how Nvidia responds. Raising prices won't do it any good in any market. Nvidia needs to undercut Intel and AMD on cost, because enthusiast performance is a niche market that the masses do not care about, and it costs more to include discrete hardware than APUs.
 
There are plenty of cases, even just for enthusiasts, where guaranteeing a speed is a benefit worth paying for.
 
I assume you are running less than 4K? Seems the top end will be the realm of getting 4K speeds up to 144Hz ultimately. I don't know, I just think AMD is going to be much faster than the 2080 Ti this generation in 4K and possibly close to the 3090. We will see soon, hopefully.

3440x1440, which is hard on a GPU.
 
Harder than the 2560 version of 1440, but still 67% less than 3840x2160, so not really that hard comparatively. But it still needs a solid video card.


3440*1440 = 4,953,600 pixels
3840*2160 = 8,294,400 pixels

4,953,600 / 8,294,400 ≈ 0.597, so about 60%. I'm not being nitpicky, but if we're going to use numbers, let's be accurate.
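
Since half the thread is comparing resolutions, here's the same arithmetic extended to the other common modes people are arguing about (nothing assumed beyond the resolutions themselves):

```python
# Pixel counts of common gaming resolutions as a fraction of 4K,
# same arithmetic as the post above.

resolutions = {
    "1080p (1920x1080)":     (1920, 1080),
    "1440p (2560x1440)":     (2560, 1440),
    "ultrawide (3440x1440)": (3440, 1440),
    "4K (3840x2160)":        (3840, 2160),
}

four_k_pixels = 3840 * 2160  # 8,294,400

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:24s} {pixels:>9,d} px  {pixels / four_k_pixels:6.1%} of 4K")

# ultrawide 3440x1440 -> 4,953,600 px, ~59.7% of 4K (about 40% fewer pixels).
```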


There is no real benefit to 4K, to be honest. Unless you're watching some 4K super hi-fi Dolby Atmos movie, 1080p with a good display at about 3 feet away looks stellar for gaming.

Now here's where everyone is going to argue, but whatever. I have tried every resolution below 5K, and I find that the most enjoyable resolution overall is 1440p, but 1080p with good anti-aliasing is outstanding.

In fact I still have my 1080p Panasonic VT25 50" plasma, which is full-on Pioneer Kuro Elite tech, and only recently has OLED surpassed the color reproduction of a top-end plasma panel. I had it professionally calibrated a few years ago as well, and the person calibrating it was like, "fuck me, this thing looks good."

Thus, if you're blowing cash on an ultra 4K HDR G-Sync $3000 monitor and a potentially $2k GPU, you really are just being neurotic at that point. I mean, Far Cry 5 can only look so good. Or whatever the title is.

If I had my way I'd rather just rock a big-ass Sony Trinitron Wega 1080p CRT, but those are unobtainium.

Even at 3440x1440 I have stellar detail and can run that at 100-130 fps depending on the title. Your 4K is lucky to hit 60. So you are trading an absolute shit ton of performance for something that doesn't even look much better to the eye.

For the new gen of GPUs, sure, we might see them hit 90 fps at 4K, but yours and others' dreams of a steady 144 fps at 4K are so funny you can almost hear me laughing through the internet. There is just no way a 3080 or 3080 Ti is going to have a 40 to 50% increase in rendering performance in one single generation, so don't hold your breath. But maybe, just maybe, Nvidia found a way to pack 30 to 40% more transistors into 7nm. We might just see that big of a jump; I'm only saying this because we've seen AMD do it with their Ryzen stuff.
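
A rough back-of-the-envelope on the 4K/144 dream, taking the "lucky to hit 60" 4K figure from this post as the baseline (an assumption, not a benchmark):

```python
# What uplift would steady 4K/90 or 4K/144 actually require over a card
# that manages ~60 fps at 4K today? (The 60 fps baseline is an assumption.)

baseline_fps = 60

for target in (90, 144):
    uplift = target / baseline_fps - 1
    print(f"{target} fps at 4K needs a ~{uplift:.0%} uplift over the baseline")

# 90 fps  -> ~50% faster
# 144 fps -> ~140% faster, which is why a single-generation jump looks unlikely
```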
 
Yeah, well 4K is nice but I feel like it's not worth it in the end, given how much you spend on your machine to barely get 60 fps.

I've done all sorts of setups, including a 7680x1440 Surround setup and a 55" 4K TV on my desk, and I'm on a 1080p ultrawide now (2560x1080).

It sounds crazy, but honestly 1080p still looks nice. Given that you can run ultra settings and get a really high, smooth refresh rate (I'm on 166 Hz), I feel it's a good trade-off.

Also good for ray tracing. Was able to play Control with RT high and medium other settings, 90 FPS, very nice. No way that would be possible at 4K.
 

Like I said in my previous post (not speaking for everyone, just me), there is no real benefit to 4K gaming. Movie watching, yes, but not gaming. I have an IPS Predator X34 as my left display and a 1080p 240Hz Predator, and I game far, far more on the 240Hz panel at 1080p than on the big widescreen. I have a 2080 Ti, so it's not the GPU holding me back. It's the refresh rate and smoothness of this 240Hz panel. That is what makes gaming so fun to me. Smoothness equates to more immersion. 60Hz is punishing to look at now, and I don't know how in the hell people can do that on their 4K screens, most of them barely hitting 35 fps on average.

But to keep this on topic in this particular reply, I sincerely hope that the 3000-series nV cards can shred the 2080 Ti in every metric, because that's good for the market.
 
Well the next Nvidia card should help make 4K more viable.

You can do it today, but once you start talking about high refresh that is a struggle.

Or, at the very least, full max settings with 60 fps minimum (maybe without ray tracing for now).
 
I'm exactly the opposite. I had an ASUS 165Hz 1080p 24" (sold it) and my Predator X34 (100Hz 34" 21:9 3440x1440); after going ultrawide I could not go back to a 16:9 monitor or go back to 1080p after experiencing the increased clarity of 1440p. Not to mention, over 100Hz I have a very hard time actually noticing the increased refresh rate. But to each their own, everyone has their own preference.
 
Yeah, I might have to go for 1440p ultrawide next. I had 1440p before, so I know what it looks like. It's definitely crisper, and 100 Hz is enough.

I think around 90 fps with VRR is about golden. I can see more; I have a 240 Hz screen too and it's nice, but it's not twice as smooth, just a little nicer.
 

To each his own. I gave my Z35P ultrawide to my wife since I missed the added resolution and DPI of a 28" 4K (even though it's only 60Hz). She gets more use out of the ultrawide as a work display since you can nicely fit three windows across the screen without having to mess with text size calibration.

In terms of what I want in the 3080, it's a totally stable 60fps at 4k.
 

40% less (he said 67% less); 60% of 4k.

Personally, I love 3440x1440. It’s a sweet spot imo.

Given their HPC part is ~54 billion transistors to the 2080 Ti's 18.6 billion, the sky is the limit. It comes down to what they think the market will support $$$-wise.
 

I disagree with everything in that "no real benefit to 4K" post.
I was constantly annoyed at how bad 1440p/120Hz looked compared to 4K/60Hz on my previous Samsung TV, a 2019 model.
Now I have a 43" 4K/120Hz monitor and a Valve Index, and I need a top-end video card.
You can stay at 1080p and enjoy it, but saying that there's no benefit to 4K is ridiculous at best.
 

Yeah, I'm on 1440p (16:9) and honestly the only thing I want is a nice rasterized performance bump so I can turn on TAA and use the sharpening filter in some games while still maintaining a frame rate north of 100 fps. Or honestly, even if more devs adopted DLSS 2.0, that would be enough of a boost with my current card.
 

Since I've never heard of a Samsung 1440p TV, that sounds more like an issue of running at a non-native resolution. BTW, I actually wish there was a nice 1440p TV for computer/console gaming in the living room.

I am firmly in the camp of staying at 1440p or lower for a PC desktop monitor and running faster and/or with higher settings, rather than merely throwing more pixels at it.
 
Yeah, the above needs a truckload of salt. Someone just grabbed some of the Twitter "leaks," printed up some specs on a laser printer, then took a picture.
The phrase is a grain or pinch of salt. A truckload of salt would be taking it very seriously.
 

Exactly, his issue was most likely due to running non-native. Even then, the reports of 1440p looking terrible on a 4K TV are just hype in my experience. I run 1440p, 1620p and 1800p on a 55" OLED and they all look very good from 8 ft away. Native 4K is best of course, but it's certainly not noticeable enough to warrant the massive performance hit.
 
I have a 1440p/165Hz monitor and two 4K/60Hz monitors.
They're both valid for different games and viewing scenarios. It doesn't need to be an either/or thing.

Sometimes I have a 4K game and a movie up simultaneously on both my 4K monitors.
Sometimes I have 4K YouTube on one, a browser on the other 4K, and a game on my 1440p monitor.
Sometimes I just have the internet on all three monitors.

Usually when doing editing that takes up two whole monitors.

Frankly I haven't used just one monitor since 2005 or so. Even when I was stuck with a laptop in a hotel room for 6 weeks last year I bought a secondary monitor.
 

Samsung Q7F
 
I have the Q7F. It's a great kit but the 1440p mode leaves something to be desired.

I'd rather look at 1080p native than 1440p upscaled.
 

Offloading anti-aliasing to the extra non-gaming cores is really good, as DLSS has shown. I was very impressed with it. Maybe they will implement it in many, many more games. I don't think it's really the devs so much as it has to be Nvidia that brings DLSS to a game, because it's nV that has to run the AI on their supercomputing clusters to calculate the DLSS data. They then have to pass that off to the devs, so the devs really can't do much until nV is done crunching it.
 

According to the site you linked, it says the TV is 4K and mentions nothing about a "Native Resolution" of 1440p.

Well... unless you're referring to "native support" under supported resolutions... but that's not the panel's native resolution. From what I can gather from the rtings site, "Native Support" means it supports that resolution and refresh rate without having to force the TV into that mode with a custom resolution profile.

Something like what you have to do with this Sony TV if you want it to run 1080p @ 120Hz: https://www.rtings.com/tv/reviews/sony/x850e
"1080p @ 120Hz Yes (forced resolution required)"

"1080p @ 120 Hz @ 4:4:4 is supported but doesn't appear by default, so a custom resolution is needed. "
 
It's not excuses; it's an understanding of how markets and life in a capitalist society work. If $1200 were too much then NV wouldn't be able to sell any cards and they'd have to reduce their prices, but obviously there are enough people who do think it's worth it. Because of that, NV may even look to raise prices further in the future. Will I like it if they do? Absolutely not, but I'm sure there are still plenty of people out there with the disposable income who won't even flinch at spending $2k on the best of the best, and there will still be cards that cost what I'm willing to pay.

Nah. That's not capitalism. It's only capitalism if it is a competitive market. Competitive markets make businesses compete for their consumers on, among other things, price.

If AMD had a comparable product to Nvidia's high end, we'd easily see some more realistic pricing here.

Don't get me wrong. I am aware that complexities and thus costs have gone up since the good old days of the best video card money could buy costing $350. I'm not expecting that to come back. We are approaching the limits of silicon, and with that everything is going to cost more.

I am, however, hoping for competition to make the likes of Nvidia stop treating high-end consumer video cards like a cash cow, where they put in minimal effort and cash out excessive profit margins. If there were real competition, the 2080 Ti would cost between $600 and $800, not $1200.
 

I've never even seen a 1440p TV.

Monitor? Of course, but TV? TV went straight from 1080p to 4k.
 

The sad thing is I did use a calculator and somehow screwed up. 1440p does seem to be the sweet spot right now for many gamers, but I personally prefer the fidelity of 4K over 1440p and over frame rates, as long as it can stay close to 60 fps. I love 32" 4K since I can see the whole screen without panning. I assume on 3440x1440 you need to pan, which I guess could be more immersive, but I'm not sure I would enjoy that. I'm not really an immersive gamer; I play mostly RPGs, MMOs and strategy games that can be played at a more casual pace.
 

I always wanted surround without the bezels, and a 34" 21:9 curved panel fills that role. I don't think the panning is too bad. It'd be hard to go back...

My TV is 4K. For some games like Control and The Witcher, where I sit on the couch 7' from a 55", I can't tell the difference between 1440p and 4K, and 1080p only slightly... which is nice since I can drop resolution and crank settings.

So I think there are a couple of factors for monitors; the distance you are from the screen and the screen size are huge ones. Then there's the actual human part...
 

I was playing Path of Exile when upgrading from a 30" 2560x1600 to a 32" 4K. The improved detail from the higher DPI was immediately obvious to me when I first played on the new screen, and fortunately my 1080 was able to (just barely) keep up, at around 45 FPS (closer to 60 FPS using the beta Vulkan renderer instead of DX11).

The extent to which 21:9 ends up being panning vs. shorter-but-showing-more (due to games fixing the vertical FOV) depends on the screen size. A 32" 4K is about 28x16 inches, a 27" 4K about 23x13". A 34" 3440x1440 is about 31x13". Compared to a 32" 16:9 it's slightly wider but much shorter; compared to a 27" 16:9 it's about the same height but much wider. A 27" 2560x1080 screen comes in at about 25x11".

32:9 superwides are intended to fill your peripheral vision. A 49" 5120x1440 ends up being about 47x13", which is about as wide as my 3-screen setup: a 32" 4K for gaming in the center and a pair of 20" 1200x1600 side screens for chat, hint sheets, etc. on either side. With that setup I do need to explicitly look at the side screens to notice anything beyond "someone said something in chat," vs. still being able to see the corners of my main screen all at once.
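
For anyone who wants to sanity-check those width/height figures, they all fall straight out of the diagonal and the aspect ratio; here's a small sketch (the panel list just mirrors the examples above, assuming square pixels):

```python
# Physical width/height of a panel from its diagonal and its resolution,
# assuming square pixels (true for every panel mentioned above).
from math import hypot

def panel_size(diagonal_in, res_w, res_h):
    scale = diagonal_in / hypot(res_w, res_h)  # inches per pixel
    return res_w * scale, res_h * scale

examples = [
    ('32" 4K',        32, 3840, 2160),
    ('27" 4K',        27, 3840, 2160),
    ('34" 3440x1440', 34, 3440, 1440),
    ('49" 5120x1440', 49, 5120, 1440),
]

for name, diag, w, h in examples:
    width, height = panel_size(diag, w, h)
    print(f"{name:16s} ~{width:.1f} x {height:.1f} inches")
```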
 