Who Needs 4K When You've Got 8K

Megalith

Not that you have to look at them all, but here are over 7,000 8K screenshots for your perusal. As someone who still games in 1080p, this is the kind of thing that is supposed to make me feel jealous and inferior, right? Not really, but I will say that these are pretty.

The gaming industry has spent the last couple of years talking up the beauty of 4K, but forget that. 8K is the real deal, people. The screenshots come courtesy of a Flickr gallery curated by user Xanvast, who has uploaded more than 7,100 shots of games in glorious 8K resolution. I've resized the photos down from their original 8K resolution to 1080p for size considerations, but you can view the gallery in full here. The photos are ridiculously pretty. Scrolling through the entire gallery just makes me want to update my desktop wallpaper. Or wallpapers, really. It's basically an album of gamer porn.
 
There is a point of diminishing returns when it comes to resolution. Sure, that many pixels will produce a cleaner image, but past a certain point the game itself will just look way too artificial. The Dreamcast played through its VGA adapter is a good example of that.

1440p is the sweet spot for this generation.
 
I haven't even seen 4K be realistically playable on most games in the HardOCP reviews. I think they are getting ahead of themselves lol
 
I expect older games like Borderlands 2 will play just fine at 8K on today's hardware, and that GPUs will manage 8K just fine in 2-3 years. Single-GPU gaming at 4K is viable right now. I expect a pair of Volta GPUs will be perfectly adequate.
 
If we can't notice the difference between 4K and 8K video, why would we notice it in games?
 
4K and 8K look great, but until game developers stop forcing 1080p UIs there's really no reason for me to ever bother with it.

I haven't even seen 4K be realistically playable on most games in the HardOCP reviews. I think they are getting ahead of themselves lol

I agree with the statement but also don't. I think 4K and VR are the only reasons the GPU market hasn't become stale. The last two generations of high-end GPUs play 1440p and 1600p just fine on a single card but suffer at anything higher than that.
 
I expect older games like Borderlands 2 will play just fine at 8K on today's hardware, and that GPUs will manage 8K just fine in 2-3 years. Single-GPU gaming at 4K is viable right now. I expect a pair of Volta GPUs will be perfectly adequate.

No, it isn't.

There is no card right now that will do a steady 60 fps at 4K.
 
There is a point of diminishing returns when it comes to resolution. Sure, that many pixels will produce a cleaner image, but past a certain point the game itself will just look way too artificial. The Dreamcast played through its VGA adapter is a good example of that.

1440p is the sweet spot for this generation.

Not to mention I'd rather have 1440p with really advanced shaders, higher-quality models, and fluid FPS versus uber-high resolutions.
 
[image: kelloggs-special-k.png]
 
No, it isn't.

There is no card right now that will do a steady 60 fps at 4K.

You don't need a steady 60 fps at 4K. A Titan XP with a G-Sync monitor will do just fine. Sure, you don't get a solid 60 fps, but the tests show that you get decent framerates.
 
You don't need a steady 60 fps at 4K. A Titan XP with a G-Sync monitor will do just fine. Sure, you don't get a solid 60 fps, but the tests show that you get decent framerates.

So for the low, low price of around 3,500 CAD plus taxes, I can play 4K with one card.

If devs weren't so lazy/shitty, they'd maybe put some effort into multi-GPU.

GTA V is the benchmark everyone should compare their games to.

It's almost two years old and it still looks better and plays better than most of the turds that are newer than it.
 
No, it isn't.

There is no card right now that will do a steady 60 fps at 4K.

I can do a steady 60 fps in many titles with my watercooled, overclocked Pascal Titan.

Not in all titles, though. Deus Ex: Mankind Divided was a no-go and required me to get creative.

Fallout 4 was a "60 fps most of the time" experience, with occasional drops.

Even some older titles, like Metro 2033, are a no-go, getting me nowhere near 60 fps.

Granted, this is at ultra settings with AA and 16x AF.

It's possible dropping the settings would have brought them to 60 fps.

That being said, I don't see the point of 8K at all.

I see no need for greater than 100-110 dpi on the desktop, as you don't look at your monitor as closely as, say, a phone or tablet. I sit 2.5 ft from my screen, so higher than 100 dpi is pointless and would just require scaling.

At 110 dpi, 4K is already a 40" screen. That is plenty large, IMHO.

At 110 dpi, 8K is an 80" screen. You'd certainly have to sit further back. That is WAY too large at 2.5 ft, and if you sit further back, you need less resolution :p

To me, the biggest value of 8K is as a DSR resolution displayed on a 42" 4K screen, for the cleaner anti-aliasing that would result.

Either way, this is moot. We are a ways off from having GPUs that can easily handle 8K resolutions.
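For anyone who wants to check the screen-size arithmetic, here's a quick sketch (my own illustration, not from the thread) of the diagonal a 16:9 panel works out to at a given DPI:

```python
# Diagonal of a panel at a given DPI: pixel diagonal divided by dots per inch.
import math

def diagonal_inches(width_px: int, height_px: int, dpi: float) -> float:
    """Diagonal size in inches for a panel with the given pixel dimensions."""
    return math.hypot(width_px, height_px) / dpi

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}.items():
    print(f'{name}: ~{diagonal_inches(w, h, 110):.0f}" at 110 dpi')
# Prints ~20" for 1080p, ~40" for 4K, and ~80" for 8K, matching the figures above.
```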
 
At 110 dpi, 8K is an 80" screen. You'd certainly have to sit further back. That is WAY too large at 2.5 ft, and if you sit further back, you need less resolution :p

Oh. Here I thought you were [H]ard. Apparently not. 80" for an 8K screen is just fine at 2.5 ft away.

The nausea will go away after a few days of gaming like that. And, it won't hurt your eyes.

8K looks good, but I don't think it's too practical yet.
 
Oh. Here I thought you were [H]ard. Apparently not. 80" for an 8K screen is just fine at 2.5 ft away.

The nausea will go away after a few days of gaming like that. And, it won't hurt your eyes.

8K looks good, but I don't think it's too practical yet.

You know, it might actually be good, but it would take some FOV adjustments, I think.

Start thinking of the gaming experience differently: render a huge FOV and move your head around to see the edges. It would probably require a healthy curve, though.

All that said, it would probably be cheaper, more convenient, and more effective to just buy a VR headset :p
 
I can do a steady 60 fps in many titles with my watercooled, overclocked Pascal Titan.

Not in all titles, though. Deus Ex: Mankind Divided was a no-go and required me to get creative.

Fallout 4 was a "60 fps most of the time" experience, with occasional drops.

Even some older titles, like Metro 2033, are a no-go, getting me nowhere near 60 fps.

Granted, this is at ultra settings with AA and 16x AF.

It's possible dropping the settings would have brought them to 60 fps.

That being said, I don't see the point of 8K at all.

I see no need for greater than 100-110 dpi on the desktop, as you don't look at your monitor as closely as, say, a phone or tablet. I sit 2.5 ft from my screen, so higher than 100 dpi is pointless and would just require scaling.

At 110 dpi, 4K is already a 40" screen. That is plenty large, IMHO.

At 110 dpi, 8K is an 80" screen. You'd certainly have to sit further back. That is WAY too large at 2.5 ft, and if you sit further back, you need less resolution :p

To me, the biggest value of 8K is as a DSR resolution displayed on a 42" 4K screen, for the cleaner anti-aliasing that would result.

Either way, this is moot. We are a ways off from having GPUs that can easily handle 8K resolutions.

Things like fonts do look better with a higher DPI. It might not be such a big deal with moving images rather than static ones, but I'm all for impossible-to-see pixels. Plus, my paranoia and OCD over dead pixels would hopefully lessen.
 
You don't need a steady 60 fps at 4K. A Titan XP with a Gsync monitor will do just fine. Sure you don't get a solid 60 fps but the tests show that you get decent framerates.

No one needs 60 fps at any resolution.

I guess a large portion of those who paid for a Titan XP and a G-Sync monitor would want more than decent, though.
 
No one needs 60 fps at any resolution.

I guess a large portion of those who paid for a Titan XP and a G-Sync monitor would want more than decent, though.

Lol, are you trolling?

For a multiplayer FPS, 60 fps is the bare minimum. You need your framerate to stay above 60 fps at all times; no drop below 60, even for a second, is acceptable.

Now, for single-player FPS titles you can get away with less, and for turn-based strategy games far less still.
 
No kidding, can you believe some vendors don't even support 8K IFFT? (Yeah, no one knows what I'm talking about, I'll shut up now.)
 
8K would be AMAZING on a 100" screen.

Much like 4K, these super-crazy resolutions don't make sense on smaller screens.
 
There is a point of diminishing returns when it comes to resolution.
Not with VR. I think 4K per eye will produce levels of realism/immersion that may be society-breaking, like the dumping of opium in China in the 1800s.
 
Who needs 8K when you have 4K?
Especially for flat-screen PC gaming.
 
So here's a stupid question... how does one go about taking an 8K screenshot? There aren't any 8K screens yet... Or is this some kind of supersampling downrez?
 
So here's a stupid question... how does one go about taking an 8K screenshot? There aren't any 8K screens yet... Or is this some kind of supersampling downrez?
With a camera.
I've got one, of a small area.
 
Not with VR. I think 4K per eye will produce levels of realism/immersion that may be society-breaking, like the dumping of opium in China in the 1800s.

Obviously the linked gallery and topic of discussion had nothing to do with VR. With respect to VR, the higher the better, but even you can agree that driving two independent 4K 'screens' is way out there right now. The current VR market is already capable of a fully immersive experience, so as it stands raw pixel count is not necessarily the driving factor.

Pixels per inch, or PPI, is a much more meaningful measurement for VR than pure pixel count.
 
There is a point of diminishing returns when it comes to resolution. Sure, that many pixels will produce a cleaner image, but past a certain point the game itself will just look way too artificial. The Dreamcast played through its VGA adapter is a good example of that.

1440p is the sweet spot for this generation.
The biggest draw of higher resolutions for me is less need for anti-aliasing, which is good because modern games do a pretty crappy job of supporting it. Hell, with good 4x SSAA, even 1080p looks great to me, but that almost never happens. With SMAA or FXAA, you always have problem areas that cause massive shimmering when they're in motion and get really distracting. Higher resolutions make those a lot less noticeable, and I think you're right about 1440p being where to aim for now.
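To make the supersampling point concrete, here's a minimal sketch (my own, not from the post) of what SSAA/DSR boils down to: render at a multiple of the target resolution, then filter down so each output pixel averages several rendered samples. It assumes Pillow is installed; the file name is hypothetical.

```python
# Sketch of the SSAA/DSR idea: each 4K output pixel averages a 2x2 block
# of 8K samples, which is what kills the shimmer on thin geometry.
# "render_8k.png" is a hypothetical capture, not a file from the thread.
from PIL import Image

hi_res = Image.open("render_8k.png")               # e.g. a 7680x4320 capture
target = (hi_res.width // 2, hi_res.height // 2)   # 3840x2160: 2x2 = 4 samples per pixel
smoothed = hi_res.resize(target, Image.LANCZOS)    # filtered downscale averages the samples
smoothed.save("render_4k_downsampled.png")
```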
 
So for the low, low price of around 3,500 CAD plus taxes, I can play 4K with one card.

If devs weren't so lazy/shitty, they'd maybe put some effort into multi-GPU.

GTA V is the benchmark everyone should compare their games to.

It's almost two years old and it still looks better and plays better than most of the turds that are newer than it.
GTA V is not hard to run at 4K. My 295X2 does it pretty easily at pretty high settings. The Witcher 3, on the other hand...
 
Graphics card prices need to come down by more than 90% to make even 4K viable for most users, and 8K is just a joke. There are only a handful of people currently willing (or stupid enough) to invest 5,000 bucks in a gaming machine.
 
A complete and total lack of need for AA is a good reason.

You'd either be playing AT 8K, or at 8K scaled down to whatever-res screen you are using.

Shimmer and artifacting would be virtually eliminated. The only remaining battles would be screen tearing and input lag.
 
A complete and total lack of need for AA is a good reason.

You'd either be playing AT 8K, or at 8K scaled down to whatever-res screen you are using.

Shimmer and artifacting would be virtually eliminated. The only remaining battles would be screen tearing and input lag.

I don't think screen tearing will be an issue when your game's running like a slideshow. 4K's demanding enough.
 
So here's a stupid question... how does one go about taking an 8K screenshot? There aren't any 8K screens yet...

There is at least one, from Dell. I'm guessing Xanvast works for Dell or their OEM and has one.
 
With 4K gaming only now breaking into the mainstream of average consumer consciousness, I doubt we'll see real 8K gaming for several more years, let alone affordable hardware to render it. 24GB GTX 1480 Ti SLI?
 
Today's GTX 1080 is fine for 4K gaming. So assuming a doubling of performance every 2 generations, a single GTX 1480 will cope and the 1580 will be the one to bring it mainstream. So 5-6 years.
 
Today's GTX 1080 is fine for 4K gaming. So assuming a doubling of performance every 2 generations, a single GTX 1480 will cope and the 1580 will be the one to bring it mainstream. So 5-6 years.

Excellent news. I'll adjust my expectations accordingly. I wonder if SLI will even be a thing in 5 years.
 
Today's GTX 1080 is fine for 4K gaming. So assuming a doubling of performance every 2 generations, a single GTX 1480 will cope and the 1580 will be the one to bring it mainstream. So 5-6 years.
A lot more than 5-6 years; performance less than doubles in two generations.
It takes approximately three generations to double performance, at least with NVIDIA.

http://www.techspot.com/article/1191-nvidia-geforce-six-generations-tested/

That doesn't consider driver-related changes.
However, we recently saw that performance hardly changes with NVIDIA drivers except for initial bug fixes.
http://www.hardocp.com/article/2017/02/08/nvidia_video_card_driver_performance_review/

It's going to be closer to 9 years before 8K performance is realizable, although AMD's competition could change that.
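As a back-of-envelope check on that timeline, here's my own sketch of the arithmetic; the ~1.5 years per GPU generation figure is an assumption, not something from the thread:

```python
# 8K pushes 4x the pixels of 4K, so roughly 4x the GPU performance is needed.
import math

perf_factor = 4                      # 8K vs 4K pixel count (7680x4320 vs 3840x2160)
doublings = math.log2(perf_factor)   # = 2 doublings of performance
gens_per_doubling = 3                # the post's estimate for NVIDIA
years_per_gen = 1.5                  # assumption: rough release cadence
years = doublings * gens_per_doubling * years_per_gen
print(f"~{years:.0f} years")         # ~9 years, matching the estimate above
```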
 