What is 8k gaming?

Does 8k dlss count as 8k gaming (assuming it looks as good)?

  • Yes, what matters is what you actually see.

  • No, if it is not rendering 33 million pixels, gtfo.



Hypothetical question: If all or even most games were able to run 8k dlss rendered from 5k, would you consider that as 8k gaming?

We see that 4k dlss rendered from 1440p often looks as good as or even better than native 4k, so scaling things up, one would assume that 5k-to-8k dlss would look as good as or even better than native 8k.
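For concreteness, here's a quick back-of-envelope sketch of the pixel math behind those ratios (plain Python over the standard 16:9 resolutions; nothing DLSS-specific is assumed):

```python
# Pixel counts for the standard 16:9 resolutions discussed in this thread.
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "5K":    (5120, 2880),
    "8K":    (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.1f}M pixels")   # 8K comes out to ~33.2M

# The two upscaling jumps being compared carry the same scale factor:
print(f"1440p -> 4K: {pixels['4K'] / pixels['1440p']:.2f}x")  # 2.25x
print(f"5K    -> 8K: {pixels['8K'] / pixels['5K']:.2f}x")     # 2.25x
# And the 4x relationships mentioned later in the thread:
print(f"4K -> 8K:    {pixels['8K'] / pixels['4K']:.2f}x")     # 4x (quad, not double)
print(f"1440p -> 5K: {pixels['5K'] / pixels['1440p']:.2f}x")  # 4x
```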
 
Look at my "Dodged a Bullet" thread - the video I linked explains this. No real 8k gaming atm.

As for 1440p looking as good or better than 4k - at what size monitor? 24-27, yes; above that? Probably not.
 
so a regular Blu-ray player hooked up and using the tv's upscaling is still 8k? no, no it's not.
 
True 8K gaming is rendering at a native 8K resolution.

However, that is not saying that DLSS 2.0 isn’t amazing technology that can imitate higher native resolutions very well.
 
Screenshotters will always prefer native resolution if they care enough. From a gameplay PoV, though, if performance is there and the neural network model is good enough, then it likely doesn't matter as much.
 
Not to mention, even if it worked you'd probably be at low settings. 4K native ultra would look way better than 8K upscaled low settings.
 
If you are not at native resolution, then you are not at that resolution.

DLSS is a good technology, but ultimately it is just an evolution of upscaling and not a native experience.

8k is silly; we haven't even solidified gaming at 4k with full IQ and a consistent 60fps in demanding titles. We sure are not going to get double 4k anytime soon, even if panel manufacturers push it.
 
8k is quad 4k, not double.

Yes, but 5k is.

I am rather shocked by the results, to be honest. As I pointed out, this was a hypothetical question that assumed most games would support dlss and that the 8k dlss would look as good as or better than native 8k.

For some reason, that is not enough for most here. It's like you guys get more of a chub from rendering pixels than from getting the best possible gaming experience.
 
Not to mention, even if it worked you'd probably be at low settings. 4K native ultra would look way better than 8K upscaled low settings.

This is an irrational comparison with too many variables.

What I am talking about is 8k-quality dlss, which would be rendered at 5k, since those numbers are both 4x those of 4k dlss rendered from 1440p - ultra settings in both cases.

Those would both theoretically have similar image quality, as in Hardware Unboxed's review of dlss 2.0.
 
Take an N64 game and render it at 8K native
Then take an N64 game and render it at 720P and use DLSS to 'infer' it up to 8K.

You won't be able to tell the difference. Why? Is it because Nvidia's tech is so good?

No, it's because the fidelity of the content can't hold up to the 720P resolution, let alone 8K. The content was designed for a 480p or even 320p CRT. So you're taking something that had its assets authored for such a low resolution and then giving them WAY more resolution. You aren't revealing any more detail. You're just revealing that there is no more detail to reveal.

In other words, the reason DLSS 'looks just like native' is that the fidelity of the content is already fully exposed (or close to it) by the pre-scaled resolution, and DLSS can scale it, use its fancy inferencing and fill in gaps.

DLSS is more than upscaling, to be sure. It's MUCH more advanced and much better looking, but until we get our content to the fidelity level that REQUIRES 8K rendering, 8K DLSS is really just taking an image and making it sharper.
 
DLSS 2.0 has been shown to be as good as or better than native res when rendering at half the resolution, and it even adds a decent AA effect for hair and thin lines, so for all intents and purposes it is 8K IMO. But I think DLSS is the exception, too, and not comparable to other upscaling techniques, as pretty much all other upscaling techniques do not match the native resolution whatsoever like DLSS does. Checkerboard rendering is probably the next best thing, and that has a fairly obvious aliasing effect on hard edges and lines.

So I say yes, it counts, esp. considering that even if the new GPUs had the rasterization performance to play at "true" 8K, I would always opt to use DLSS anyway for much better performance with no hit to image quality, no asterisks needed. Now if only it weren't proprietary to Nvidia and there were a competitive open solution that AMD and consoles could use instead, so it could be implemented in many more games.
 
1) DLSS isn’t actually rendering 8K pixels, so it doesn’t count. It’s still an approximation, albeit a good one.

2) Even if you could count DLSS as 8K, only select titles currently support it - there aren't many yet - and even in the ones that do, performance varies, with titles whose graphics are easier to predict performing best.

In short, if you’re not pushing out 33 million pixels, you’re not “actually” 8K.

Either way, who even has an 8K monitor right now? That market is so small, and not even the 3090 can handle it despite what Nvidia’s marketing team wants you to think, so it’s really a non-issue at this point.
 
Take an N64 game and render it at 8K native
Then take an N64 game and render it at 720P and use DLSS to 'infer' it up to 8K.

You won't be able to tell the difference. Why? Is it because Nvidia's tech is so good?

No, it's because the fidelity of the content can't hold up to the 720P resolution, let alone 8K. The content was designed for a 480p or even 320p CRT. So you're taking something that had its assets authored for such a low resolution and then giving them WAY more resolution. You aren't revealing any more detail. You're just revealing that there is no more detail to reveal.

In other words, the reason DLSS 'looks just like native' is that the fidelity of the content is already fully exposed (or close to it) by the pre-scaled resolution, and DLSS can scale it, use its fancy inferencing and fill in gaps.

DLSS is more than upscaling, to be sure. It's MUCH more advanced and much better looking, but until we get our content to the fidelity level that REQUIRES 8K rendering, 8K DLSS is really just taking an image and making it sharper.

This is a fantastic explanation.

People get so excited about the numbers they see in the big-box stores showing an 8k OLED (or really even a 4k one) - but forget that most content is offered at 720p or 1080i, with very little at 4k.

"Ultra HD Blu-ray, and nearly all UHD streaming content from Netflix, Amazon and others, is 3,840x2,160 resolution. "
" most HDTV broadcasts including those from CBS and NBC, are still 1080i. "
" all ABC, Fox, ESPN, and their affiliated/sister channels broadcast at 720p. "

https://www.cnet.com/news/4k-1080p-2k-uhd-8k-tv-resolutions-explained/

And even when you can get 4k content, you're not guaranteed to actually receive it: network congestion can cause the provider to drop your resolution to SD:

" HD and ultra-HD resolution videos will now be switched to SD (Standard Definition) by default when streaming videos. The list of participants includes Sony, Google, Facebook, Viacom18, MX Player, Hotstar, Zee, Tiktok, Netflix and Amazon Prime Video. "

https://tech.hindustantimes.com/tec...n-breakdown-story-ih1mLFAFXjjzWB67VcqW9M.html
 
I chose 8k native simply because, at this early stage of 8k, the industry is defining terms, and it needs a reference point to speak from. 8k native is that reference point. If we call any type of upscaling "8k", no matter how advanced, then we're setting the stage for rubber truth in the future. A good example of this is the "LED TVs" we've ostensibly had for years prior to OLED, which were really just LCDs with LED backlights.

TLDR, IMO:
What’s 8k gaming? 33M pixels per frame
Can advanced scaling look as good as native 8k gaming? Yes.
Doesn’t that then make it 8k gaming? No.
 
For some reason, that is not enough for most here. It's like you guys get more of a chub from rendering pixels than from getting the best possible gaming experience.
It's not this.

It's this:
What’s 8k gaming? 33M pixels per frame
Can advanced scaling look as good as native 8k gaming? Yes.
Doesn’t that then make it 8k gaming? No.
People who are enthusiasts about technical matters tend to prefer actual technical definitions.

Telling [H]ers that DLSS-upscaled is the same as 8K because it "looks just as good" is a little bit like telling gearheads that a highly-tuned turbocharged four-cylinder is the same as a naturally aspirated V8 because it "also goes fast."
 
It's not this.

It's this:

People who are enthusiasts about technical matters tend to prefer actual technical definitions.

Telling [H]ers that DLSS-upscaled is the same as 8K because it "looks just as good" is a little bit like telling gearheads that a highly-tuned turbocharged four-cylinder is the same as a naturally aspirated V8 because it "also goes fast."

I get that analogy to some extent, but there are other factors there, such as the durability of the motor and the sweet sound of an American V8. Nothing like that applies to video cards that I know of.

Then again... https://hardforum.com/threads/help-xfx-r9-295x2-died-on-me.1977909/#post-1044098824
 
I get that analogy to some extent, but there are other factors there, such as the durability of the motor and the sweet sound of an American V8. Nothing like that applies to video cards that I know of.

Then again... https://hardforum.com/threads/help-xfx-r9-295x2-died-on-me.1977909/#post-1044098824

Then again.
It's not a perfect analogy, no, but I think you get the idea. 8k has a definition. In any other discussion about graphics card performance, when you talk resolution, you talk render resolution. IQ is largely subjective, so it's not a quantifiable metric. Nvidia is not using 8k as a technical definition, but as a marketing buzzword. I'm not surprised at all that a forum of hardware enthusiasts sees through it - and is not impressed.
 
I would have no issue calling something 8K if it looks crisper and better than native 7K; if a technology manages to look as good as or better than native without needing to render natively at 8K, it should count imo, as that is all there is to it: how it subjectively looks, and nothing else.

The "almost as good, if not as good or better" part is important here, obviously, and it will not be respected by marketing and sales teams.

And it is a good thing imo if the term gets bent. A community that insists on actual pixel count could get exactly what it wishes for: a terrible game rendered at 8K instead of a much better experience that marketing sells as 8K. Think of sub-80 Mbps 4K streaming (even sub-30 Mbps exists); we would all have been better off if Netflix could have charged for the best possible image at high bandwidth, without having to sell it with a somewhat ridiculous pixel count.
 
8K Gaming is gaming on an actual 8K monitor (not TV) with enough GPU grunt to drive ~ 60fps at 7680x4320 (33.2 million pixels).

In other words, this is real 8K gaming, not the garbage that Nvidia claims as "8K gaming":

 
8K Gaming is gaming on an actual 8K monitor (not TV)

Not sure I understand; in that example, the DisplayPort 1.4 on a Titan RTX has 32.4 Gbps of bandwidth, allowing 7680x4320.

A recent TV that supports HDMI 2.1 and a new video card that also supports HDMI 2.1 should support 48 Gbps, more than your example (or were multiple DisplayPort connections used at the same time to feed a single 8K monitor?)
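For reference, a rough sketch of the uncompressed bandwidth an 8K60 signal would need (back-of-envelope only: it ignores blanking intervals, link encoding overhead and DSC compression), which is why those link rates matter:

```python
# Raw (uncompressed) video data rate, ignoring blanking and link overhead.
def raw_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    """24 bits per pixel = 8-bit RGB."""
    return width * height * hz * bits_per_pixel / 1e9

print(f"8K @ 60 Hz, 8-bit RGB: {raw_gbps(7680, 4320, 60):.1f} Gbps")  # ~47.8 Gbps
# Compare with the link rates above: DP 1.4 = 32.4 Gbps, HDMI 2.1 = 48 Gbps.
# That is why 8K60 over DP 1.4 relies on DSC (Display Stream Compression)
# or chroma subsampling rather than an uncompressed signal.
```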
 
8K Gaming is gaming on an actual 8K monitor (not TV) with enough GPU grunt to drive ~ 60fps at 7680x4320 (33.2 million pixels).

In other words, this is real 8K gaming, not the garbage that Nvidia claims as "8K gaming":



Just look at the detail on those 2002 graphics! 😬
 
This is like asking a guy if they could marry the most physically smoking hot intellectually arousing female in the world. Except she's a transsexual. There's no physical way to tell the difference with the exception of not being able to produce children or a genetic test.

IDK.
 
This is like asking a guy if they could marry the most physically smoking hot intellectually arousing female in the world. Except she's a transsexual. There's no physical way to tell the difference with the exception of not being able to produce children or a genetic test.

IDK.
As much as I hate this analogy, I think it's a little bit more like asking a guy if he'd marry a girl he can actually see in person, or if he'd be content to marry her having only seen heavily photoshopped glamour shots of her.
 
8K Gaming is gaming on an actual 8K monitor (not TV) with enough GPU grunt to drive ~ 60fps at 7680x4320 (33.2 million pixels).

I understand your disdain for DLSS (though I personally think it's incredible), however I don't understand the "8K monitor (not a TV)" part. In the recent crop of sensational 8K gaming videos on youtube, at least two that I can think of were using LG Signature ZX 88" TVs.

That model has a true 7,680 by 4,320 resolution, has HDMI 2.1 and supports VRR. What would have to change for that not to be considered "gaming"? I'm failing to see how another display that happened to be marketed as a "monitor" in the same situation would make any difference.
 
I understand your disdain for DLSS (though I personally think it's incredible), however I don't understand the "8K monitor (not a TV)" part. In the recent crop of sensational 8K gaming videos on youtube, at least two that I can think of were using LG Signature ZX 88" TVs.

That model has a true 7,680 by 4,320 resolution, has HDMI 2.1 and supports VRR. What would have to change for that not to be considered "gaming"? I'm failing to see how another display that happened to be marketed as a "monitor" in the same situation would make any difference.
I'm making a guess that they said "no TVs" because of the rather ubiquitous upscalers found in high-res TVs to support smaller input resolutions. Just a guess. A TV with monitor-like chops is still a rarity.
 
I'm making a guess that they said "no TVs" because of the rather ubiquitous upscalers found in high-res TVs to support smaller input resolutions. Just a guess. A TV with monitor-like chops is still a rarity.
Could be, but I doubt it. For one, PC monitors also have upscalers; anything that is non-CRT has them by definition, since anything that changes a signal's resolution is an upscaler or downscaler. PC monitors usually use a bicubic algorithm; the moment a display can accept a non-native resolution and show it full screen, it must have up/down scaling ability these days (otherwise it would be quite funny to send a non-native signal to it). Some monitors, like Asus's VividPixel ones, have a complex upscaler that is part of the marketing; others have an extremely basic one (when it isn't pure stretching).

That makes the two similar in that respect. And considering the implied scenario was native 8K rendering vs. upscaling done by the PC (what Nvidia claims to be 8K is upscaled by the GPU, not by the monitor or the TV, from what I understand), the upscaler in the TV or the monitor would not be used here.
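As an aside, the "basic scaler" case is easy to picture in software. A minimal sketch, assuming Pillow is installed and a hypothetical 5K frame capture on disk (this is generic bicubic resampling, not what any specific TV scaler or DLSS does):

```python
# Generic bicubic upscaling of a captured frame from 5K to 8K.
from PIL import Image

frame = Image.open("frame_5k.png")  # hypothetical 5120x2880 screenshot
upscaled = frame.resize((7680, 4320), Image.Resampling.BICUBIC)
upscaled.save("frame_8k_bicubic.png")
```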
 
If DLSS is 8k, then my 3600x is 12 cores!

SMT/HT does a fancy job of simulating cores, just as DLSS simulates higher resolution, but it's not real.
 
If DLSS is 8k, then my 3600x is 12 cores!

SMT/HT does a fancy job of simulating cores, just as DLSS simulates higher resolution, but it's not real.

Except in terms of performance, your 3600x performs nowhere near 12 cores.

DLSS 2.0 is indistinguishable from native, and in many cases is an improvement over native. That's the argument being made.
 
I mean...technically? No. But if 8K DLSS is indistinguishable from "true" 8K, does it really matter in any appreciable way?
 
If DLSS is 8k, then my 3600x is 12 cores!

SMT/HT does a fancy job of simulating cores, just as DLSS simulates higher resolution, but it's not real.

There would be a difference between the two here, given the OP's premise that includes "assuming it looks as good": a 3600X is not as fast as a 3900X with SMT off, by a significant amount:
https://www.techpowerup.com/review/amd-ryzen-9-3900x-smt-off-vs-intel-9900k/3.html

If your 6c/12t had exactly the same performance as, or better than, any 12c/12t (if those existed in that world), then it would be a good comparison for the hypothetical scenario, I think.

If a 6c/12t that consumed fewer watts were marketed as the equivalent of a 12c/12t, and called the same because their performance was identical, would that make people angry? I think yes, because it is obviously not 12 cores. But with dlss gaming the video card is actually outputting an 8K resolution; if it looks as good as native 8K, they have achieved 8K gaming.
 
DLSS 2.0 is indistinguishable from native, and in many cases is an improvement over native. That's the argument being made.
If you can provide proof for either of those statements I'll begrudgingly relent, but I don't see how you can, short of a pixel-for-pixel comparison of true 8K vs DLSS-upscaled 8K. "Indistinguishable" and "improvement" are subjective terms when you can't attach anything quantifiable to them.

Again though, the whole problem here is the marketing speak where there's a technical definition being abused to make the card look better.

It's NOT doing 8k. It's doing 5k, then upscaling using some really neat tech that produces a fantastic looking image. No one here is trying to say that 5k upscaled looks bad or isn't impressive. It's just NOT 8k. Nvidia attempting to call it 8k is just them moving the goalposts so they can hit a nice round number that happens to be "double" the old gold standard.
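For anyone who actually wants to put a number on "indistinguishable", here's a minimal sketch of such a pixel-for-pixel comparison, assuming NumPy/Pillow and two aligned same-size screenshots (the file names are hypothetical; PSNR is a crude metric, and a perceptual one like SSIM would arguably be fairer):

```python
# Pixel-for-pixel comparison of a native-8K screenshot vs a DLSS-upscaled one.
import numpy as np
from PIL import Image

native = np.asarray(Image.open("native_8k.png"), dtype=np.float64)
dlss = np.asarray(Image.open("dlss_8k.png"), dtype=np.float64)

mse = np.mean((native - dlss) ** 2)            # mean squared error per channel
psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
print(f"MSE: {mse:.2f}  PSNR: {psnr:.2f} dB")  # higher PSNR = closer to native
```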
 
It's challenging enough to maintain 4K at 60fps in modern titles. I can't help but laugh at trying to double that resolution. Hell, even hitting 5K is a battle. Maybe you can enjoy some older games (with low-res textures) or maybe you're cool with 30fps, but I don't see the point. 8K is something to be concerned with in a few years.
 