What is 8k gaming?

Does 8k dlss count as 8k gaming (assuming it looks as good)

  • Yes, what matters is what you actually see.

  • No, if it is not rendering 33 million pixels, gtfo.


Why would 8K make any sense for normal use (e.g. excluding military and medical uses), given that the human eye can only distinguish the pixels of a 40" 4K monitor from 2.3 ft?
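For anyone who wants to sanity-check that figure, here's a rough back-of-the-envelope sketch (assuming the common ~60 pixels-per-degree rule of thumb for 20/20 vision; the exact threshold varies by person):

```python
import math

def pixels_per_degree(diagonal_in, horiz_px, vert_px, distance_ft):
    """Approximate pixels per degree of visual angle for a flat panel
    viewed head-on from the given distance."""
    aspect = horiz_px / vert_px
    width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)
    px_per_inch = horiz_px / width_in
    # inches covered by one degree of visual angle at that viewing distance
    inches_per_degree = 2 * (distance_ft * 12) * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# the 40" 4K at 2.3 ft figure quoted above, plus the same panel at 8K
print(f"4K: {pixels_per_degree(40, 3840, 2160, 2.3):.0f} px/deg")   # ~53
print(f"8K: {pixels_per_degree(40, 7680, 4320, 2.3):.0f} px/deg")   # ~106
```

Roughly 60 px/deg is the usual rule of thumb for where 20/20 vision stops resolving individual pixels, so a 40" 4K panel at 2.3 ft sits close to that limit and 8K doubles it.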
 
Why would 8K make any sense for normal use (e.g. excluding military and medical uses), given that the human eye can only distinguish the pixels of a 40" 4K monitor from 2.3 ft?

I think there's more to the equation than just the ability to distinguish pixels, but I do agree it's a case of diminishing returns.

I doubt 8K is something that will interest me for 5+ years. We're barely at 4k/60. I'd much prefer 4k/120 over 8k/60.
 
I think there's more to the equation than just the ability to distinguish pixels, but I do agree it's a case of diminishing returns.

I doubt 8K is something that will interest me for 5+ years. We're barely at 4k/60. I'd much prefer 4k/120 over 8k/60.

May I ask you to elaborate on what the advantages of 8K over 4K are, other than higher pixel density (and marketing)?

Here are some disadvantages:
- LG found that brightness was lower on 8K vs 4K, and they explored using top emission for their 8K OLED (they finally found a workaround with their current bottom-emission technology).
- More pixels require more complex electronics and a more complex manufacturing process, so higher costs and, in theory, lower reliability.
- 8K requires more bandwidth, more GPU power, etc.

It seems to me that TV manufacturers should instead focus on contrast ratio (brightness), near-black detail and motion resolution, things to which the human eye is very sensitive.
 
Pretty sure most companies in the electronics biz don't really care about what's truly better - they just know that 8K is twice as many as 4K and people want bigger/better/faster/more. It's a lot easier to sell something that's "twice as good" than to explain why their older-sounding tech is better than someone else's newer-sounding tech. Kinda like how the console manufacturers jumped to 4K rather than trying to get their FPS up for HD.
HDR was tossed in before it was better than normal settings, too.
 
I don't think you can call 1440p upscaled to 8K "8K gaming"...

Having said that, DLSS is fantastic technology. Going from 1440p to 4K via DLSS seems like less of a stretch to me, but any time you say you are gaming at 4K, you should probably preface that by saying you are using DLSS if you are being technical. Realistically, lots of people feel like they have been gaming at 1080p or 4K on consoles for years, when in reality most games render at a lower resolution and upscale... Nvidia DLSS is just doing this better, so...

Going beyond 4K seems extremely pointless. Realistically, going from 1440p to 4K isn't even that huge of a deal on computer-monitor-sized displays for most folks, I would say. HDR effects etc. seem more important. I am very happy with my Asus PG27Uq 4K HDR display, don't get me wrong; but for me, the improvements compared to my Asus PG348q aren't so much in resolution as in features like local dimming (which really improves the contrast in most situations) and HDR with very high brightness, which adds a whole new dynamic to games that support it... I think we are starting to see these features on 1440p displays, but I haven't bothered to research this lately...

I see no reason to need to add "8K" to the spec list personally; 4K is already overkill in most situations and we are honestly just barely getting to the point where we can truly say 4K gaming is a thing. Realistically, for a lot of the highest-end AAA games, I have to turn down game details to attain >60 FPS if the game doesn't support DLSS. In some games, Death Stranding and Control for example, DLSS is basically required if you want to turn everything up on my 2080 Ti... I have been playing Red Dead Redemption 2 again lately and I decided to just use the GeForce Experience recommended settings because I was sick of trying to find the happy medium... That just barely produces 60 FPS on average, with what seems to be a mix of some max settings and lots of medium settings... The 3080 and 3090 will of course help with this, but that will probably be just enough to smooth out the 4K gaming experience IMHO.
 
Not sure I understand; in that example the DisplayPort 1.4 output from a Titan RTX has 32.4 Gbps of bandwidth, allowing 7680 x 4320 output.

A recent TV that supports HDMI 2.1 and a new video card that also supports HDMI 2.1 should support 48 Gbps, more than your example (or were multiple DisplayPort cables used at the same time to feed a single 8K monitor?)

I have been gaming in 8K since April 2017 when I bought the Dell UP3218K - an actual 8K resolution PC monitor that was 32". That monitor required 2x DP 1.4 cables since it did not use Display Stream Compression (DSC) to run 8K @ 60hz with one DP 1.4 cable - in other words, it was a tiled display (aka MST monitor). However, that has really been the ONLY 8K monitor on the market for consumers and it is almost the end of 2020. Only very recently has LG released their 8K OLED TV (the one Linus & MKBHD were sent for previewing/reviewing the 3090). For all intents and purposes, an 88" TV is NOT a monitor - yes it can be used for gaming in the living room or a dedicated media/theater room but for 99.9% of owners of that TV, it will not be used as a PC monitor, especially not as a gaming monitor.
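For context on why that monitor needed two cables (and how the DP 1.4 / HDMI 2.1 numbers quoted above stack up), here is a rough sketch of the arithmetic, ignoring blanking and protocol overhead, so real cable requirements are a bit higher:

```python
# Uncompressed pixel data rate in Gbps, ignoring blanking intervals and
# link-layer encoding overhead.
def raw_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

print(f"8K60, 8-bit RGB:  ~{raw_gbps(7680, 4320, 60, 24):.1f} Gbps")   # ~47.8
print(f"8K60, 10-bit RGB: ~{raw_gbps(7680, 4320, 60, 30):.1f} Gbps")   # ~59.7

# Usable payload sits below the headline link rates:
#   DP 1.4  : 32.4 Gbps link rate -> roughly 25.9 Gbps of payload per cable
#   HDMI 2.1: 48   Gbps link rate -> roughly 42.7 Gbps of payload
# So one DP 1.4 cable cannot carry 8K60 uncompressed, which is why the
# UP3218K splits the panel across two cables, and even a single HDMI 2.1
# link needs Display Stream Compression (or chroma subsampling) for 8K60 RGB.
```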

I understand your disdain for DLSS (though I personally think it's incredible), however I don't understand this "8K monitor (not a TV)" part. In the recent crop of sensational 8K gaming videos on YouTube, at least two that I can think of were using LG Signature ZX 88" TVs.

That model has a true 7,680 by 4,320 resolution, has HDMI 2.1 and supports VRR. What would have to change for that not to be considered "gaming"? I'm failing to see how another display that happened to be marketed as a "monitor" in the same situation would make any difference.

The point of my post was that I am probably one of the very few people on this forum (or online tbh) to have had actual experience (extended period of time and not just "reviewing" for a video) with a real 8K monitor and gamed on it 24/7. I was running 4x Titan Xp in 4-Way SLI with an OC'd 6950X that was actually able to handle many games in native 8K resolution.

Nvidia's recent marketing spiel about the 3090 being an "8K gaming" card is absolute trash and simply false advertising - as Steve from GN correctly stated. They shot themselves in the foot since the 3090 can't even hold 30fps in native 8K in most games. Add to this their gimping of SLI except for native implementations by game devs, and the 3090 is most definitely NOT an "8K gaming" GPU.

Here are some examples of native 8K with 4x Titan Xp with all settings maxed out (no AA of course):

Look at the scaling of the 4 GPUs in these examples and tell me SLI doesn't/didn't work well. Also, shove this in Nvidia's face and tell them that THIS is "8K gaming" and not the 1440P upscaled to 8K their 3090 does with DLSS. False advertising is false.

"Look on my works, ye mighty. And despair!" - Shelley
 
Res means nothing to me beyond 4K. My eyes are not good enough to tell much of a difference between 4K and 8K, DLSS on or off, and I am sure it is the same for most on this site. There are more important factors to me about TVs. Panel manufacturers need to make a no-compromise solution to display the image. OLED, while it is fabulous and I love my C9, is still not perfect.
 

So after all that you don't have an answer for my question about the display. The display in question was a true 7680 × 4320 display with a single HDMI 2.1 input. Your "monitor" is a display with a 7680 × 4320 resolution that required some janky multi-DisplayPort input. If you had the "TV" back in 2017, how would the experience have been different? It's the exact same resolution.

True 8K vs DLSS aside, you seem to be standing on some hill where, unless the 7680 × 4320 display being used is marketed to you as a "monitor", it's not "8K", which is preposterous.
 
So after all that you don't have an answer for my question about the display. The display in question was a true 7680 × 4320 display with a single HDMI 2.1 input. Your "monitor" is a display with a 7680 × 4320 resolution that required some janky multi-DisplayPort input. If you had the "TV" back in 2017, how would the experience have been different? It's the exact same resolution.
The question was indeed not answered, but there was a clue: multiple DisplayPort inputs would, I imagine, give higher bandwidth and less (or no) compression than a single 8K HDMI signal, and if there are zero artifacts at the junction between the tiles, it is closer to "true" 8K.

But in general it is really unclear.
For all intents and purposes, an 88" TV is NOT a monitor - yes it can be used for gaming in the living room or a dedicated media/theater room but for 99.9% of owners of that TV, it will not be used as a PC monitor, especially not as a gaming monitor.
8K gaming and PC gaming are two different subjects. In 2033 most people playing in "8K" will probably not be doing it on a PC, and for now the only 8K content that exists pretty much comes out of a PC, no? (Maybe some ultra-compressed YouTube demos and other nice tries, but it is not as if there is any movie or TV content to watch in 8K with it.) So it is a different subject.

It is not clear what an HDMI 2.1 TV like that does not do that a PC monitor does, such that it would be impossible to play a game on it at 8K resolution. Are you talking about compression of the signal because it is only one input instead of multiple?
 
So after all that you don't have an answer for my question about the display. The display in question was a true 7680 × 4320 display with a single HDMI 2.1 input. Your "monitor" is a display with a 7680 × 4320 resolution that required some janky multi-DisplayPort input. If you had the "TV" back in 2017, how would the experience have been different? It's the exact same resolution.

True 8K vs DLSS aside, you seem to be standing on some hill where, unless the 7680 × 4320 display being used is marketed to you as a "monitor", it's not "8K", which is preposterous.

How would the experience be different? Well, fitting an 88" TV on my desk in my game room, as opposed to using a 32" 8K monitor and a 30" 4K accessory display, would have been impossible. Further, to use an 88" screen practically as a PC monitor, I'd have to be sitting quite far from it, which adds to the impracticality.

If they make an 8K OLED monitor that is 32" or even 55", I'd buy it in a heartbeat over the 88" TV, assuming they use HDMI 2.1 or, even better, DP 2.0.

My point still stands that using a TV as a PC monitor is impractical and silly, especially if the former is 88".
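To put a rough number on the seating-distance point: if the reference is something like a 32" monitor about 2 ft away on a desk (my assumption, not a figure from the posts above), an 88" panel has to sit proportionally farther back to cover the same field of view:

```python
# Distance at which a larger screen fills roughly the same field of view as
# a smaller reference screen; visual angle scales ~linearly with size/distance.
def equivalent_distance_ft(big_diag_in, ref_diag_in, ref_dist_ft):
    return ref_dist_ft * big_diag_in / ref_diag_in

# Assumed reference point: a 32" monitor viewed from about 2 ft.
print(f"{equivalent_distance_ft(88, 32, 2.0):.1f} ft")   # -> 5.5 ft
```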
 
The question was indeed not answered, but there was a clue: multiple DisplayPort inputs would, I imagine, give higher bandwidth and less (or no) compression than a single 8K HDMI signal, and if there are zero artifacts at the junction between the tiles, it is closer to "true" 8K.

But in general it is really unclear.

8K gaming and PC gaming are two different subjects. In 2033 most people playing in "8K" will probably not be doing it on a PC, and for now the only 8K content that exists pretty much comes out of a PC, no? (Maybe some ultra-compressed YouTube demos and other nice tries, but it is not as if there is any movie or TV content to watch in 8K with it.) So it is a different subject.

It is not clear what an HDMI 2.1 TV like that does not do that a PC monitor does, such that it would be impossible to play a game on it at 8K resolution. Are you talking about compression of the signal because it is only one input instead of multiple?

My main point is the impracticality of using an 88" TV as a PC monitor. That is all. Not the actual technology behind it (HDMI 2.1). In fact, I agree that a single-cable (SST) HDMI 2.1 connection is better than the 2x DP 1.4 non-DSC setup that the Dell UP3218K used. However, that monitor is more than 3 years old. Here's to hoping that an 8K OLED SST monitor is available soon.

My current monitor is the 4K OLED 120Hz Dell AW5520QF - I'd love to replace that with an 8K OLED (even 60Hz) of the same size.
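To put numbers on the single-cable (SST with DSC) versus 2x DP 1.4 comparison, here is a minimal sketch at 8-bit RGB, ignoring blanking overhead and assuming the roughly 3:1 ratio DSC is commonly quoted at:

```python
# Two ways the 8K60 displays discussed in this thread get fed, compared at
# 8-bit RGB and ignoring blanking, so the margins are approximate.
PIXEL_RATE_8K60 = 7680 * 4320 * 60 * 24 / 1e9   # ~47.8 Gbps
DP14_PAYLOAD    = 25.9   # Gbps usable per DP 1.4 cable
HDMI21_PAYLOAD  = 42.7   # Gbps usable on an HDMI 2.1 link
DSC_RATIO       = 3.0    # commonly quoted "visually lossless" compression target

tiled_ok  = PIXEL_RATE_8K60 / 2 <= DP14_PAYLOAD            # each cable drives half the panel
single_ok = PIXEL_RATE_8K60 / DSC_RATIO <= HDMI21_PAYLOAD  # compress, then send on one cable

print("2x DP 1.4, no DSC (tiled, like the UP3218K):", "fits" if tiled_ok else "does not fit")
print("1x HDMI 2.1 with DSC (single-cable SST):    ", "fits" if single_ok else "does not fit")
```

Both approaches have the bandwidth; the single-cable route just gets there with compression instead of by splitting the panel into two tiles.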
 
I don't know if I'd use an 88-inch 8K TV as a monitor, purely for logistical reasons, but fuck yes will I use it for playing games. Actually a goal I'd like to reach in 5ish years.

I won't consider resolution high enough until you can't see any pixels at all and anti-aliasing is no longer needed.

For text printing this is considered to happen at 600 dpi at a minimum, so I want screen resolution to at least match that.
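Out of curiosity, here is what that bar would imply for desktop-sized panels (a rough sketch assuming 16:9 aspect ratios and treating 600 ppi as the target):

```python
import math

# Pixel dimensions a 16:9 panel of a given diagonal would need to reach
# a target pixel density.
def resolution_at_ppi(diagonal_in, target_ppi, aspect=(16, 9)):
    w, h = aspect
    diag_units = math.hypot(w, h)
    width_in  = diagonal_in * w / diag_units
    height_in = diagonal_in * h / diag_units
    return round(width_in * target_ppi), round(height_in * target_ppi)

for size in (27, 32, 40):
    print(f'{size}" @ 600 ppi -> {resolution_at_ppi(size, 600)}')
# A 27" panel alone would need roughly 14100 x 7900 pixels; even 8K
# (7680 x 4320) is nowhere near 600 ppi at desktop sizes.
```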
 
I don't think you can call 1440p upscaled to 8K "8K gaming"...

That's ultra performance mode, or 9x AI scaling, and I wasn't alluding to that. I was talking more like 3x AI scaling from 5K, which should look similar.

I guess 8K is still very irrelevant, even here. A better question would have been: is 1440p DLSS upscaled to 4K still 4K? In many games, it looks as good if not better.

As for those saying it must render that to count: it is sort of like comparing a card with 8 GB of VRAM and really bad compression versus a card with 6 GB and really good compression. I feel like most of you would choose the former because... numbers, even if the 6 GB card performed better.
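For reference, here is how the pixel counts and upscaling factors being argued about actually work out (just arithmetic on the standard resolutions, nothing specific to any one game or DLSS version):

```python
# Total pixel counts for the resolutions under discussion, and the upscale
# factor (in total pixels) from each render resolution to 8K.
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "5K":    (5120, 2880),
    "8K":    (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(f"8K is {pixels['8K'] / 1e6:.1f} million pixels")   # ~33.2 million
for name in ("1440p", "4K", "5K"):
    print(f"{name} -> 8K is a {pixels['8K'] / pixels[name]:.2f}x pixel-count jump")
```

1440p to 8K is a 9x jump (the ultra performance ratio mentioned above), 4K to 8K is exactly 4x, and 5K to 8K works out to 2.25x.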
 
A better question would have been: is 1440p DLSS upscaled to 4K still 4K?
The answer is the same: no, it is not.

Again... that is not to say it doesn't look good. That's not to say it isn't an efficient use of a GPU's resources. That is not to say that it isn't useful, even impressive.

"It looks as good if not better" is not a quantifiable metric. The 1060 6GB I've got sitting on my shelf is a 16K, 360FPS card*!

*(Upscaled from 240p. Hey, it looks fine to me.)

Do you get the point of what I (and many others) are trying to say here? Saying that 5K w DLSS is 8K is the same as saying 2+2=5 because you snuck a 1 in there when we weren't looking and it still counts.
 