Where are the 8K Monitors?

Depends on your usage scenario.

For me the usage scenario would be a "command center" or "battlestation" style setup where the screen is decoupled from the desk in order to allow for an increased viewing distance and a more optimal viewing angle for a larger screen.

For a theoretical 8K screen with a format similar to the 1000R curvature of the Samsung Ark, for example, in a 55-inch or 65-inch size:

..the center point of the circle. 1000R means a 1000 mm radius. That's nearly a 40-inch viewing distance for all of the pixels to remain pointed at you, equidistant from your eyes along the curve, staying on-axis relative to you all the way to the ends of the screen.
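As a quick sanity check, the radius-to-viewing-distance figure is just a millimeters-to-inches conversion (a minimal sketch):

```python
# 1000R curvature = 1000 mm radius of curvature; sitting at the circle's
# center point puts every part of the panel at the same distance from
# your eyes, so every pixel stays on-axis.
MM_PER_INCH = 25.4
radius_mm = 1000
viewing_distance_in = radius_mm / MM_PER_INCH
print(round(viewing_distance_in, 1))  # ~39.4 inches
```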


..Pixels per degree (PPD) is a much better measurement of the pixel density you'll actually be seeing than stating PPI alone.






At longer view distances on larger screens, for example 32 to 40 inches away on a 55-inch or 65-inch, you'd benefit from much higher PPI realized as PPD. A 55-inch Ark gets more like a 1440p desktop screen's pixel sizes from your perspective when sitting even 32 inches away, and worse when closer. Even at a 38-inch viewing distance it's only getting 60 PPD, which is OK but not stellar. I use my 48CX at around 65 to 70 PPD (a 60° to 55° viewing angle, distance-wise), but if it were mounted right on my desk it would be more like 50 PPD, roughly 1500p-looking.
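The PPD figures above can be sketched with a little trigonometry. This is a minimal calculator (my own helper, not from any library) using the average horizontal pixels per degree across the full viewing angle of a flat panel:

```python
import math

def ppd(diag_in, h_px, v_px, dist_in):
    """Average horizontal pixels per degree for a flat panel,
    given its diagonal, resolution, and viewing distance."""
    ar = h_px / v_px
    width_in = diag_in * ar / math.sqrt(1 + ar * ar)
    view_angle_deg = 2 * math.degrees(math.atan(width_in / 2 / dist_in))
    return h_px / view_angle_deg

print(round(ppd(55, 3840, 2160, 38)))  # 55" 4K at 38 in: ~60 PPD
print(round(ppd(55, 7680, 4320, 38)))  # same panel as 8K: ~119 PPD
```

Curvature changes the exact geometry a bit, but for rough comparisons the flat-panel numbers line up with the figures quoted in the thread.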





.. the desktop/app real estate would be much greater on a larger gaming-TV-style 8K screen. Samsung promotes the 55-inch 4K Ark as a bezel-free replacement for a multiple-monitor setup, but its quadrants of desktop real estate are only 1080p each. Most people using multiple screens are likely using at least 1440p per screen space if not 4K, and many of those seeking real estate have at least one 4K screen space in the mix, I'd guess. No matter how you look at it, 1080p screen spaces aren't going to cut it imo.

.. for gaming, on a larger-format screen like those described, you'd optimally be able to run smaller screen spaces 1:1 pixel-mapped when desired: for more demanding games, for a more encompassing field of view, or the opposite in an ultrawide format. E.g., 4K at higher Hz than native 8K; also 5K, 6K, x1600, x2160 ultrawide resolutions, etc., 1:1 without scaling. On a larger screen those smaller portions of the overall screen would still be decent sizes, including relative to your perspective, as you might sit closer to the screen while gaming in those fields and farther when using the rest of the desktop/app/OS real estate.
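To put rough numbers on those 1:1 windowed sizes (a sketch, assuming the window keeps the panel's aspect ratio so the diagonal scales linearly with pixel count):

```python
def window_diag_in(panel_diag_in, panel_h_px, window_h_px):
    """Physical diagonal of a 1:1 pixel-mapped window that has the same
    aspect ratio as the panel (pixel pitch is identical everywhere)."""
    return panel_diag_in * window_h_px / panel_h_px

# A 1:1 4K window on an 8K panel is effectively a standalone 4K monitor:
print(window_diag_in(55, 7680, 3840))  # 27.5" on a 55" panel
print(window_diag_in(65, 7680, 3840))  # 32.5" on a 65" panel
```

So even "only" the 4K portion of a 65" 8K screen is the size of a common 32" desktop monitor.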

Gaming on a large-format 8K screen (full screen, or in smaller 1:1 screen spaces) would lean on the DLSS AI upscaling and frame generation of today, but also on those technologies as they mature alongside more powerful 5000-series GPUs (2025 most likely; there should be better 8K options and competitors by then too) and the GPU generations and advancements beyond. Perhaps eventually game engines, game development, OSes, and drivers/peripherals will feed actual vector information to the upscaler so that several frames can be generated accurately, rather than a single "tween" frame interpolated by guessing what the vectors between two frames might be.


.......

"All I'll ever need."

I can understand the sentiment at this stage considering what is available right now. However, people said the same things about 1080p vs 1440p, 1440p vs 4K, 60 Hz vs 120 Hz, HDR nits, etc.

Right now I'd love a 65" 8K screen that could do 8K 120 Hz desktop / windowed games, higher Hz (~144 Hz+) at lower resolutions 1:1 letterboxed, and 4K upscaled full screen. I'm probably going to wait it out until more manufacturers come out of the deep freeze they put 8K gaming TVs into, though. More competition in pricing, models, and features, and 8K a little less green on the vine.


I guess I just don't see the visual benefit of going greater than the equivalent of 100 PPI at ~2 ft distance. Once you hit 100 PPI at that distance, seriously diminishing returns start setting in.

My rule of thumb is that if your combination of screen size, resolution and viewing distance requires you to use any scaling factor greater than 100% to comfortably use the Windows desktop UI, then you are essentially just wasting pixels, and would be better off with a lower resolution screen. It would save on a needlessly expensive screen, a needlessly expensive GPU, and on wasted power.

If it weren't for laptop manufacturers' "big number go up = sale" marketing resulting in ludicrously over-resolutioned screens being put in laptops that are way too small for the resolution, desktop scaling shouldn't even need to exist. Everyone should be running at 100% all the time.
 
I guess I just don't see the visual benefit of going greater than the equivalent of 100 PPI at ~2 ft distance. Once you hit 100 PPI at that distance, seriously diminishing returns start setting in.

My rule of thumb is that if your combination of screen size, resolution and viewing distance requires you to use any scaling factor greater than 100% to comfortably use the Windows desktop UI, then you are essentially just wasting pixels, and would be better off with a lower resolution screen. It would save on a needlessly expensive screen, a needlessly expensive GPU, and on wasted power.
It's immediately obvious when using e.g. a 5K 27" screen (220 PPI) at that distance with scaling. The sharpness of any desktop content and text is something else, making text much more pleasant to read.

The big reason why I moved away from a 49" 5120x1440 superultrawide was because it was not as sharp as I'd like. Much happier with dual 28" 4K screens at 150% scaling but I miss the single display experience without bezels.

With the only 40-43" screens worth buying being OLEDs, those are not great for 100% scaling due to their pixel structure issues. 125% scaling solves the problem.

Then as the screen size goes larger, so must the viewing distance increase, or you'll be craning your neck trying to look at a huge screen close up. This again means scaling so you can actually see what you are reading.

The great thing about 8K is that you can go pretty large while still maintaining a high PPI - so it's both sharp text/UI and lots of desktop space. The only problem is the screen sizes tend towards the extreme. Nobody wants to sit anywhere near a 70+ inch 8K TV for desktop use.
 
I guess I just don't see the visual benefit of going greater than the equivalent of 100 PPI at ~2 ft distance. Once you hit 100 PPI at that distance, seriously diminishing returns start setting in.

My rule of thumb is that if your combination of screen size, resolution and viewing distance requires you to use any scaling factor greater than 100% to comfortably use the Windows desktop UI, then you are essentially just wasting pixels, and would be better off with a lower resolution screen. It would save on a needlessly expensive screen, a needlessly expensive GPU, and on wasted power.

If it weren't for laptop manufacturers' "big number go up = sale" marketing resulting in ludicrously over-resolutioned screens being put in laptops that are way too small for the resolution, desktop scaling shouldn't even need to exist. Everyone should be running at 100% all the time.
Are you not missing the most important factor here: personal eyesight? While I kind of agree that scaling is bad (at least when you are out to maximize screen real estate, and based on today's monitors), the need for it depends heavily on a combination of resolution, screen size, viewing distance and one's personal eyesight, with the latter probably being the most decisive factor in a single buyer's decision. Then of course, most people probably only run something like Word in full screen all day and are less affected, but I would imagine few of them would be hanging around in a thread discussing 8K monitors :)
 
The great thing about 8K is that you can go pretty large while still maintaining a high PPI - so it's both sharp text/UI and lots of desktop space. The only problem is the screen sizes tend towards the extreme. Nobody wants to sit anywhere near a 70+ inch 8K TV for desktop use.
Agreed, this is what makes a 65" 8K TV the ideal desk monitor :)

As mentioned before, I see 8K for desk use mainly as a replacement for multiple monitors more than anything else. In the future I believe you are correct, and we will reach a PPI level (with GPUs able to handle it) where we don't have to see scaling as wasted screen real estate but instead as something we do to make things even crisper. For most normal people, this is probably already true. On the topic, virtual desktops are in most cases perhaps an even better replacement for multiple monitors, unless you really need to see things at the same time, which I do (or at least have managed to convince myself I do).
 
Give me an 8k 42 or 48 inch and I'll be happy. Give me a 32" of it and I'll be on cloud nine! High refresh and I'll be convinced I died and went to heaven...
 
Agreed, this is what makes a 65" 8K TV the ideal desk monitor :)

As mentioned before, I see 8K for desk use mainly as a replacement for multiple monitors more than anything else. In the future I believe you are correct, and we will reach a PPI level (with GPUs able to handle it) where we don't have to see scaling as wasted screen real estate but instead as something we do to make things even crisper. For most normal people, this is probably already true. On the topic, virtual desktops are in most cases perhaps an even better replacement for multiple monitors, unless you really need to see things at the same time, which I do (or at least have managed to convince myself I do).
I use virtual desktops heavily because they work in a mostly sensible way in MacOS. But it's still often a limitation vs just being able to cram more windows on screen at once.

For me 2x 2560x1440 worth of desktop space + a few virtual desktops is generally good enough. I found the Samsung CRG9 superultrawide a bit more convenient because in PbP mode the split of 21:9 + 11:9 was more practical, using the 11:9 portion for secondary stuff like Teams/Slack/email/calendar/notes/terminals and so on while the 21:9 portion fits a browser + IDE side by side nicely.

For me there's a point where "too much stuff on screen at once" becomes a distraction robbing my focus.
 
I guess I just don't see the visual benefit of going greater than the equivalent of 100 PPI at ~2 ft distance. Once you hit 100 PPI at that distance, seriously diminishing returns start setting in.

My rule of thumb is that if your combination of screen size, resolution and viewing distance requires you to use any scaling factor greater than 100% to comfortably use the Windows desktop UI, then you are essentially just wasting pixels, and would be better off with a lower resolution screen. It would save on a needlessly expensive screen, a needlessly expensive GPU, and on wasted power.

If it weren't for laptop manufacturers' "big number go up = sale" marketing resulting in ludicrously over-resolutioned screens being put in laptops that are way too small for the resolution, desktop scaling shouldn't even need to exist. Everyone should be running at 100% all the time.

It's "a waste" - just like 1440p was a waste vs the GPUs of the time, and 4K was similarly a waste. 16:9 vs 4:3, 1080p, etc. That perspective is biased by how limited we've been at each stage along the way. We have always been barely reaching our fingers onto the next rung of the ladder instead of having an overabundance display-wise (resolution and Hz vs GPU power, color, nits, etc.). Once hardware power/tech/pricing all advance over the years, it sort of pushes you higher, or escalates you, instead of you reaching for the next rung and struggling to pull yourself up.

At each rung of resolution (and even Hz) increase, the returns were often scoffed at, but that's because you were already in a deficit - trying to squeeze more juice out of more limited technology/power. So I agree that your position is valid for most. Against the limitations of current technology it can be a big stretch, so it's understandable that you feel that way, and that other people felt the same way about 4K resolution, or 120 Hz vs 60 Hz GPU-demand-wise, etc., when those were being reached for.

Scaling at higher resolution ultimately looks better, since it provides a much larger number of pixels per character (more pixel resolution) in text and more pixels per edge/fringe/stair-step in in-game graphics.
Also, importantly, 2D desktop graphics and imagery will be more "megapixels." 2D graphics and imagery (outside of some 3D active workspace windows in CGI authoring suites) typically get no pixel-edge fogging hacks to mask how large the pixel sizes actually are. High resolution realized as PPD is a big deal for imagery: 8K requires a 33-megapixel source. Photographers' and videographers' cameras can be 100 MP (or higher), though some of the currently recommended cameras are around 60 MP. Again the tradeoff issue comes up due to the weakness of tech overall vs where it will probably be farther into the future - specifically 8K bandwidth for streaming, uploading/downloading, drive space, etc. (for all 8K, but especially raw images/videos for pros). The same was said of 4K at one point, though. As storage tech, cable and port bandwidth, compression tech, upscaling tech, and processor/GPU power gain ground over time, that argument loses steam, and people find breathing room and eventually a comfortable plateau.

8K will give double the PPD of a 4K screen at the same size and viewing distance in games, 2D graphics, and imagery. It might give slightly less than half the physical size of text and interfaces vs 4K after scaling - depending on view distance, maybe 6K worth of desktop/app real estate - but the text and interfaces mapped at, say, "6K size" will have a much greater number of pixels per object/interface/character in an 8K grid. When sitting closer it could still essentially be a bezel-less grid of four 4K screens at 1:1 pixel mapping, too.
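The "6K worth of real estate" figure falls out of dividing the native grid by the scaling factor (a rough sketch; real Windows scaling snaps to fixed steps like 125%/150%):

```python
# Effective desktop real estate on an 8K (7680x4320) panel at common
# Windows scaling factors: UI elements grow, so the usable "grid" shrinks.
def effective_desktop(h_px, v_px, scale):
    return round(h_px / scale), round(v_px / scale)

print(effective_desktop(7680, 4320, 1.25))  # (6144, 3456) - roughly "6K"
print(effective_desktop(7680, 4320, 1.50))  # (5120, 2880) - i.e. 5K
```

Either way, every UI element is still drawn from the full 8K pixel grid, which is where the extra crispness comes from.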



. . . .

https://recoverit.wondershare.com/photo-recovery/resolution-from-4k-to-8k.html

(comparison images from the linked article)



. . . . .

https://www.tomshardware.com/news/n...olutio-gaming-thing-of-past-dlss-here-to-stay

"During their discussion with Digital Foundry's Alex Battaglia and PCMR's Pedro Valadas, Bryan Catanzaro — Nvidia's VP of Applied Deep Learning Research — stated that native resolution gaming is no longer the best solution for maximum graphical fidelity. Catanzaro went on to say that the gaming graphics industry is headed toward more significant reliance on AI image reconstruction and AI-based graphics rendering in the future."

"Catanzaro's statement was in response to a question from Valadas regarding DLSS and whether Nvidia planned to prioritize native resolution performance in its GPUs. Catanzaro pointed out that improving graphics fidelity through sheer brute force is no longer an ideal solution, due to the fact that "Moore's Law is dead." As a result, Catanzaro explained, smarter technologies like DLSS need to be implemented to improve graphics fidelity and circumvent the otherwise low gen-on-gen performance improvements seen in today's graphics hardware."

. . . . . .

Hopefully GPU manufacturers, game engine devs, game devs, OS devs, peripheral/driver devs, etc. can all move to a system where vectors (of in-game entities, scripted sequences, AI entity behaviors, character/player movement ~ peripheral inputs, FoV movement, etc.) are broadcast to AI upscaling + frame generation systems for much more accurate results, where we could have multiple frames inserted accurately.

AI could probably be implemented for better sub-sampling/scaling of text too eventually. Finding optimal patterns is something it is extremely good at.
 
8K will probably not be useful to me given that I am technically not even gaming at 4K anymore. DLSS Quality renders internally at 1440p then upscales to 4K. I would need to see an 8K screen in person to see if running at 2160p or 2880p and being upscaled to 8K with DLSS would look substantially better than 1440p DLSS upscaled to 4K. Obviously it WILL look better, I'm wondering if it's going to look better enough because of diminishing returns. Of course I am purely talking about gaming only before all the "muh productivity" guys chime in.
 
8K will probably not be useful to me given that I am technically not even gaming at 4K anymore. DLSS Quality renders internally at 1440p then upscales to 4K. I would need to see an 8K screen in person to see if running at 2160p or 2880p and being upscaled to 8K with DLSS would look substantially better than 1440p DLSS upscaled to 4K. Obviously it WILL look better, I'm wondering if it's going to look better enough because of diminishing returns. Of course I am purely talking about gaming only before all the "muh productivity" guys chime in.
It is pretty pointless for gaming IMO, even with native 4K vs native 8K. To me 8K is all about productivity. Unfortunately display makers don't agree.
 
It is pretty pointless for gaming IMO, even with native 4K vs native 8K. To me 8K is all about productivity. Unfortunately display makers don't agree.
Everyone claimed 4k was overkill before too for gaming and it isn't. I've used a 4k 24" panel before and would still take higher ppd happily on a larger screen. Bring on 8k gaming! I've been on 4k since 2014. Better distance detail ftw.
 
I want a wall of high PPD and large desktop/app real estate as a command-center-style screen space where I can run a game as a tile of any size, instead of using multiple screens with bezels. It doesn't have to be the whole 8K screen field for every game: 1:1 pixel-mapped window sizes on a 120 Hz 8K desktop and, if eventually possible on 8K despite DSC, better-performing letterboxed games 1:1 pixel-mapped at higher Hz as 4K, 5K, 6K, or various ultrawide resolutions.

All of that glorious desktop/app real estate at pretty high PPD (e.g., a 55" 8K at a ~60° viewing angle still gets an appreciable ~128 PPD), while still being able to carve out game spaces/fields for gaming sessions at fairly high Hz (even leaning on AI upscaling + frame generation for a lot of games), would be great.

. . .

Some year we'll have XR and MR glasses with very high resolution, able to essentially put high-resolution floating screens in front of you. Right now the light, sunglasses-format XR ones only do a 1080p 120 Hz screen floating in space (micro-OLED, 500-nit, 120 Hz are the best ones at the moment, I think). It looks usable, but 1080p is pretty low. The Apple MR "VR ski goggle"-sized kit is still way too bulky for regular use imo, as is the Quest 3's passthrough bulk and quality, etc. They will get there in time, however, in future models over the years. Lightweight XR/MR glasses will probably move to 4K per eye, then 6K-8K, but that won't be for quite a while yet. Apple pushed its lightweight sunglasses-style design back to 2027, so things might turn a page then.
 
Everyone claimed 4k was overkill before too for gaming and it isn't. I've used a 4k 24" panel before and would still take higher ppd happily on a larger screen. Bring on 8k gaming! I've been on 4k since 2014. Better distance detail ftw.

Well that's the thing though, would people even be gaming at 8K natively? Or just 4K/5K upscaled to 8K? I would need to see an 8K screen in person to be able to make the comparison of 4K upscaled gaming vs 8K upscaled gaming. We all know 8K upscaled will look better than 4K upscaled so that's not the question here. The question is whether or not the leap in image fidelity will be big enough. A big part of the reason why everyone (myself included) called 4K useless at first is because games were simply not made for such high resolutions. Today the story is different in that games are made to look great with 4K in mind, and we're going to experience the same thing with 8K where games are just not made for such a high resolution initially and people will call 8K useless until games are made to be rendered on 8K displays.
 
Everyone claimed 4k was overkill before too for gaming and it isn't. I've used a 4k 24" panel before and would still take higher ppd happily on a larger screen.

I have to ask why?

What could you possibly hope to benefit from it?

Even given infinite computing power at zero power use, and a free screen, I'm not sure I would do it. I just don't see what it adds over native resolution at 100% scaling 2ft from a screen. That looks absolutely perfect the way it is, and increasing resolution beyond that at those settings does nothing but give you a uselessly large screen.

If you are hoping to eliminate aliasing, well good luck. That will never happen at any resolution. You'll always need some form of anti-aliasing solution to deal with that, at any ppd.

If you want that, you don't even need the screen with crazy high ppi to accomplish it. Just play using DSR (if you have the GPU power to spare, and if you don't, a larger screen isn't going to help you anyway.)
 
I have to ask why?

What could you possibly hope to benefit from it?

Even given infinite computing power at zero power use, and a free screen, I'm not sure I would do it. I just don't see what it adds over native resolution at 100% scaling 2ft from a screen. That looks absolutely perfect the way it is, and increasing resolution beyond that at those settings does nothing but give you a uselessly large screen.

If you are hoping to eliminate aliasing, well good luck. That will never happen at any resolution. You'll always need some form of anti-aliasing solution to deal with that, at any ppd.

If you want that, you don't even need the screen with crazy high ppi to accomplish it. Just play using DSR (if you have the GPU power to spare, and if you don't, a larger screen isn't going to help you anyway.)
Pretty simple. For games, more pixels and detail in a given screen area, i.e. better textures, more pixels per puff of smoke, etc. For productivity, more pixels per icon or letter by way of scaling = better quality imagery and text. I am more than able to tell between 24, 28, and 32" 4K, with the 32" being obvious, with pixels like a 1080p 15.6" notebook screen.
 
Pretty simple. For games, more pixels and detail in a given screen area, i.e. better textures, more pixels per puff of smoke, etc. For productivity, more pixels per icon or letter by way of scaling = better quality imagery and text. I am more than able to tell between 24, 28, and 32" 4K, with the 32" being obvious, with pixels like a 1080p 15.6" notebook screen.

Is the juice worth the squeeze though? Able to tell, sure, but it still has to be pretty marginal.

At 8k we would be talking 4x the GPU power of 4k or 16x the GPU power of 1080p, and people are already scaling using DLSS and FSR to get the latest titles to run acceptably at 4k on the fastest GPU's money can buy.

How about comparing 67% scaling at 4K vs 33% scaling at 8K (same render resolution)? Do you still think there is more clarity, when the base rendering resolution is going to be the same 2560x1440 upscaled using various trickery?
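The arithmetic behind that comparison (a quick sketch, assuming the percentages are linear scale factors per axis):

```python
# 4K at a ~2/3 linear render scale and 8K at a ~1/3 linear render scale
# both land on the same ~2560x1440 internal resolution before upscaling.
def render_res(h_px, v_px, scale):
    return round(h_px * scale), round(v_px * scale)

print(render_res(3840, 2160, 2 / 3))  # (2560, 1440)
print(render_res(7680, 4320, 1 / 3))  # (2560, 1440)
```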
 
Is the juice worth the squeeze though? Able to tell, sure, but it still has to be pretty marginal.

At 8k we would be talking 4x the GPU power of 4k or 16x the GPU power of 1080p, and people are already scaling using DLSS and FSR to get the latest titles to run acceptably at 4k on the fastest GPU's money can buy.

How about comparing 67% scaling at 4K vs 33% scaling at 8K (same render resolution)? Do you still think there is more clarity, when the base rendering resolution is going to be the same 2560x1440 upscaled using various trickery?
More gpu power will come, as always ;). I heard the same arguments for 4k, "oh it's four times the pixels of 1080p, and most people game at that!". Tech evolves :).
 
More gpu power will come, as always ;). I heard the same arguments for 4k, "oh it's four times the pixels of 1080p, and most people game at that!". Tech evolves :).
Agreed. If Samsung was able to convince people they needed curved TVs a few years ago, 8K should be easy peasy :)
 
I planned to drop numerous grand on one or multiple TVs, which I just did. Before doing this I spent hours in Best Buy (literally around 6 hours over the course of multiple days and multiple locations). Conclusion: 8K is the truth, to my eyes. I saw the QN800C side by side with a similarly priced 4K Samsung mini-LED and there was noticeably more detail in the 8K, especially texture detail, facial pores, etc. That was on the same 4K feed, so the QN800C was clearly upscaling. It wasn't a mind-blowing difference, but it was noticeable.

I then saw an LG OLED Z2, the TV that costs $10k for the 77 inch and $25k for the 88 inch. I was mind-blown from about 7 feet. The store rep showed me it was running off an 8K thumb drive, which I verified later on YouTube. The other TVs were running the 4K version of the same YouTube nature loop, so I had a rare side-by-side opportunity.

In the end, I dropped the $5499 on the 85-inch Samsung QN900C. While I wasn't as mind-blown as with the Z2, the native 8K demonstration was extremely impressive from all viewing distances, and the TV was a noticeable step up from the 4K I had before.

I have a 4080, but plan to get the 5090 late 2024 or early 2025 when it releases. I may experiment with it as a monitor replacement too.

For me, 8k makes total sense at gigantic TV sizes. I wouldn't be opposed to it in a 42 inch size from a closer distance as some in the thread mentioned.

This is really me saying: yes, we will reach points of diminishing returns where an increase in resolution is not useful. However, IMHO, that "point of pointlessness" is not 8K. 8K is noticeable. It's been a few years, but I tested better than 20/20 vision at my last checkup, so this might be a factor when people claim there's no difference and I clearly see one.
 
Also, regarding VR, Quest 3 was the first minimum viable viewing experience I've seen (before that had only seen Q2 and Reverb G2, and wasn't impressed with either due to the dumpster fire ringed fresnel lenses).

To get from minimum viable to "compelling" we really need those 3.5k + per eye micro OLED that people have been talking about forever.
 
I planned to drop numerous grand on one or multiple TVs, which I just did. Before doing this I spent hours in Best Buy (literally around 6 hours over the course of multiple days and multiple locations). Conclusion: 8K is the truth, to my eyes. I saw the QN800C side by side with a similarly priced 4K Samsung mini-LED and there was noticeably more detail in the 8K, especially texture detail, facial pores, etc. That was on the same 4K feed, so the QN800C was clearly upscaling. It wasn't a mind-blowing difference, but it was noticeable.

I then saw an LG OLED Z2, the TV that costs $10k for the 77 inch and $25k for the 88 inch. I was mind-blown from about 7 feet. The store rep showed me it was running off an 8K thumb drive, which I verified later on YouTube. The other TVs were running the 4K version of the same YouTube nature loop, so I had a rare side-by-side opportunity.

In the end, I dropped the $5499 on the 85-inch Samsung QN900C. While I wasn't as mind-blown as with the Z2, the native 8K demonstration was extremely impressive from all viewing distances, and the TV was a noticeable step up from the 4K I had before.

I have a 4080, but plan to get the 5090 late 2024 or early 2025 when it releases. I may experiment with it as a monitor replacement too.

For me, 8k makes total sense at gigantic TV sizes. I wouldn't be opposed to it in a 42 inch size from a closer distance as some in the thread mentioned.

This is really me saying: yes, we will reach points of diminishing returns where an increase in resolution is not useful. However, IMHO, that "point of pointlessness" is not 8K. 8K is noticeable. It's been a few years, but I tested better than 20/20 vision at my last checkup, so this might be a factor when people claim there's no difference and I clearly see one.
What was the viewing distance though? I've seen one of the LG 8K 77" and a couple Samsung 8K models in person and at a viewing distance appropriate for the sheer size, had a hard time telling a difference to 4K because the size + viewing distance required negated my ability to tell the resolution apart.
 
8k is pointless atm. We need better screen tech first. Every type of panel has some kind of down side. Like said earlier the GPU power is not there yet. We barely get 4k and still need a lot of tricks to get there.
 
8k is pointless atm. We need better screen tech first. Every type of panel has some kind of down side. Like said earlier the GPU power is not there yet. We barely get 4k and still need a lot of tricks to get there.

A few things:

..If you followed this thread you'd see that some of us want a wall of desktop/app real estate out of an 8K screen, which would give you quads of 4K worth of real estate at high PPD. It's not all about games; that's only one facet. Gaming is an important aspect for a lot of us, but 8K should be able to provide some decent performance there too, even now and in the next couple of years.

..AI upscaling technologies, including DLSS (but also an 8K TV's own upscaling for media), can provide higher detail without added stress. That's true even now, but as we get GPUs more capable of running 4K over 120 fps/Hz, we could upscale 1440p-and-up through 4K to 8K just like we do 1440p -> 4K today. The same tricks that get us to 4K can help get us to 8K. From reports, Nvidia is concentrating heavily on upscaling and frame generation going forward.

.. 8K gaming TV manufacturers could potentially develop on-TV upscaling with negligible input lag eventually. That would allow you to send a higher-Hz 4K signal to the 8K TV, avoiding 8K high-Hz cable/port bandwidth limitations, with the TV then upscaling the signal to 8K on its end of the equation.

..Many of us would like manufacturers to allow running non-scaled, non-native resolutions letterboxed, 1:1 pixel-mapped. So while some of us might sit nearer, with our wall of screen extending somewhat outside our central human viewing angle as a bezel-less "multi-monitor" replacement, we could game in a somewhat smaller central area and at higher fps/Hz - e.g. 4K, 6K, or different ultrawide resolutions (1/2 screen, etc.) at 1:1 pixel mapping.

.. 5000-series GPUs should hit in 2025, by reports. Hopefully AI upscaling and frame generation will also mature along the way.

.. the graphics ceiling is arbitrary to begin with. The challenge for devs is to whittle game complexity down to fit "real time" on a given generation of hardware. You don't have to run every game at max settings; you can dial them in. There is also already a vast library of existing games that have ripened into better performance across successive GPU generations, drivers, patches, mods, etc.

.. Some of us are used to running more than one screen in order to get the benefits of two different screen technologies/feature sets, so worst case we could run a different screen alongside the 8K for some games in the meantime until GPU tech (5000 series and on) hits - running lower or scaled resolutions on the 8K for other games, while outside of games having four 4Ks' worth of desktop/app real estate at high PPD even on a large 55" or 65" screen.
 
What was the viewing distance though? I've seen one of the LG 8K 77" and a couple Samsung 8K models in person and at a viewing distance appropriate for the sheer size, had a hard time telling a difference to 4K because the size + viewing distance required negated my ability to tell the resolution apart.
At the same size and viewing distance, the PPD of the 8K display will be double the 4K's, but our eyes have a harder time noticing a difference above ~60 PPD. At the recommended 12-foot viewing distance from a 77" display, the 8K TV would be around 288 PPD and the 4K TV around 144 PPD. It's no surprise that one wouldn't easily be able to tell. You'd have to get closer than about 2.5 feet away to start noticing the pixel structure of the 8K TV. For reference, Apple defines a display as "Retina" quality if the PPD is greater than 53 at the recommended viewing distance.
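Those figures check out with the small-angle "center of screen" PPD formula (a sketch with my own helper function, assuming a flat 16:9 panel):

```python
import math

def center_ppd(diag_in, h_px, v_px, dist_in):
    """Pixels per degree at the center of a flat 16:9-style panel.
    Small-angle form: one degree of arc spans dist * (pi/180) of panel."""
    aspect = h_px / v_px
    width_in = diag_in * aspect / math.sqrt(1 + aspect ** 2)
    return dist_in * (h_px / width_in) * math.pi / 180

print(round(center_ppd(77, 7680, 4320, 144)))  # ~288 PPD for 8K at 12 ft
print(round(center_ppd(77, 3840, 2160, 144)))  # ~144 PPD for 4K at 12 ft
```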
 
At the same size and viewing distance, the PPD of the 8K display will be double the 4K, but our eyes have a harder time noticing a difference above 60 PPD. The recommended 12 foot viewing distance from a 77" display means the 8K TV would be 288 PPD, and the 4K TV would be 144 PPD. It's no surprise that one wouldn't be able to easily tell. You'd have to get closer than 2.5 feet away to start noticing the pixel structure of the 8K TV. For reference, Apple defines a display as "Retina" quality if the PPD is greater than 53 PPD at the recommended viewing distance.

The PPD is bad enough that we have to resort to aggressive anti-aliasing in games and massaged text sub-sampling to mask how large the pixel sizes actually are.
The 2D desktop's graphics and imagery typically get no text-ss or graphic AA style "hacks"/masks to compensate for how large the pixel structures actually are, so they benefit even more from higher PPD. Graphics/art professionals and medical imaging can especially benefit from very high PPD, but it's better in general, especially for that unmasked 2d content. If you think your PPD is good enough, turn off text-ss and graphics anti-aliasing - because that's the state the 2d desktop's graphics and imagery are already in. The desktop isn't rendered in 3d, so it is aliased with no compensation tricks to mask its pixel sizes. Text-ss isn't a perfect thing either; the less text has to lean on it, the better the text will look.

What higher PPD large resolutions are not good for is the stress on GPUs in games at full/native resolution, and the peak Hz capability of displays at native. However, scaling and AI upscaling tech can already do a lot and has the potential to advance a lot more in the future too (via gpus, but also built into the displays themselves).

I had a 15" glossy 4k laptop for a few years, for example, and PPD-wise it looked very nice by comparison to my other monitors. It got around 106 PPD at an 18 inch view distance, up to 120 PPD at 24 inches, depending on how close I was sitting. I like my 48" 4k screen at around 70 PPD personally too, though I'd love something like a 55" 8k or 65" 8k:

4k at a 64 deg viewing angle gets 60 PPD.
55" 8k at a 64 deg viewing angle (38 inch view dist.) would be 120 PPD.
65" 8k at a 64 deg viewing angle (45 inch view dist.) would be 120 PPD.
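The numbers in the list above come from simple flat-screen trigonometry (a curved panel like the Ark would shift them slightly); a minimal sketch of the arithmetic:

```python
import math

def viewing_angle_deg(diag_in, distance_in, aspect=(16, 9)):
    """Horizontal angle a flat 16:9 screen spans at a given distance (inches)."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)
    return 2 * math.degrees(math.atan((width_in / 2) / distance_in))

def ppd(horiz_pixels, diag_in, distance_in, aspect=(16, 9)):
    """Average pixels per degree across the horizontal field of view."""
    return horiz_pixels / viewing_angle_deg(diag_in, distance_in, aspect)

# 55" and 65" 8k panels at the distances above both land near 120 PPD,
# while 4k at the same ~64 deg angle is ~60 PPD:
print(round(ppd(7680, 55, 38)))   # ~119-120
print(round(ppd(7680, 65, 45)))   # ~119-120
print(round(ppd(3840, 55, 38)))   # ~60
```

Because PPD is resolution divided by the spanned angle, any screen size gives the same PPD once it fills the same viewing angle.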

It might take a little text+interface scaling - down to something like 6k worth of desktop/app real estate - for ease of reading, depending on how close you are sitting to a large 8k. But text would still be drawn with 8k worth of pixel definition (and game graphics too, depending on what you are doing and how you are doing it), through to the 2d desktop's graphics and apps.


I can definitely tell and appreciate higher PPD, but there is the question of finding the sweet spot vs gpu demands in gaming. That is where a larger 8k used for high PPD desktop real estate - which could theoretically let you run 4k, 6k, and various uw resolutions 1:1 pixel mapped and letterboxed - would be welcome. Your gpu could run the lower rez faster, and the screen might be able to run at higher hz than 8k native.


When a screen fully fills your central human viewing angle, the numbers are as listed below. People tend to have tvs that don't fill that angle in their living room, since it's often set up more like arena seating for multiple viewers rather than a single viewer, and since living room/furniture/wall layouts generally aren't suited to sitting close to the screen - so the PPD they get in that kind of scenario is typically a lot higher than if the screen filled their whole 60 to 50 degree focal view. For example, with a 77" TV you'd have to sit around 58 inches away - nearly 5 feet - to get a 60 deg viewing angle. I think most people sit a lot farther than that. I sit about 8 - 9 feet from mine, which results in only a ~35 deg viewing angle, close to half of what it would take to fill your central viewing angle. To fill my whole central viewing angle at that same kind of distance I'd need a ~145 inch tv.
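The same trigonometry can be inverted to ask how close you'd have to sit, or how big a screen you'd need, for a given viewing angle; a sketch assuming a flat 16:9 panel:

```python
import math

ASPECT = (16, 9)  # assuming a flat 16:9 panel

def width_of(diag_in):
    w, h = ASPECT
    return diag_in * w / math.hypot(w, h)

def distance_for_angle(diag_in, angle_deg):
    """Viewing distance (inches) at which the screen spans angle_deg horizontally."""
    return (width_of(diag_in) / 2) / math.tan(math.radians(angle_deg / 2))

def diagonal_for_angle(distance_in, angle_deg):
    """Screen diagonal (inches) needed to span angle_deg at a given distance."""
    w, h = ASPECT
    width = 2 * distance_in * math.tan(math.radians(angle_deg / 2))
    return width * math.hypot(w, h) / w

print(round(distance_for_angle(77, 60)))      # ~58 inches for 60 deg on a 77"
print(round(diagonal_for_angle(9 * 12, 60)))  # ~143 inch diagonal at 9 ft
```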
https://qasimk.io/screen-ppd/

At the human central viewing angle of 60 to 50 degrees, every 8k screen of any size gets around 127 to 154 PPD.
.
At the human central viewing angle of 60 to 50 degrees, every 4k screen of any size gets around 64 to 77 PPD.
.
At the human central viewing angle of 60 to 50 degrees, every 2560x1440 screen of any size gets only 43 PPD to 51 PPD.
.
At the human central viewing angle of 60 to 50 degrees, every 1920x1080 screen of any size gets only 32 PPD to 38 PPD
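That size-independence is just division - the whole horizontal resolution spans the whole viewing angle, so average PPD is horizontal pixels over degrees regardless of panel size:

```python
# At a fixed viewing angle, average PPD = horizontal pixels / degrees spanned,
# independent of screen size -- which is why every screen of a given
# resolution lands in the same range once it fills the same angle.
for name, px in [("8k", 7680), ("4k", 3840), ("1440p", 2560), ("1080p", 1920)]:
    print(f"{name}: {px / 60:.0f} PPD at 60 deg, {px / 50:.0f} PPD at 50 deg")
```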

Quote of mine guesstimating some phone and tablet PPDs in usage:
Say you are viewing a samsung s20+ phone from 12 inches away to watch a video up fairly close without it being right at your face. That phone is 3200x1440. For the sake of argument let's say it's a 6.7 inch display. That would be a 30 degree viewing angle (spanning half of your central viewing angle in the middle of your FoV) and would result in around 112 PPD. Holding it any closer would give a lower PPD.

The iPhone 12 Pro Max is 2778×1284 at 6.7 inches, so it would get a similar 98 PPD or so at a 12 inch view distance. The regular 12 Pro would get around 90 PPD. The iPhone 14 Pro Max is 2796x1290, also at 6.7 inches, so it's still around 98 ppd at a 12 inch view distance.
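The same formula works for phones by passing the aspect ratio as a parameter, since they aren't 16:9 (sizes here are the marketing diagonals, so treat the figures as approximations):

```python
import math

def ppd(horiz_pixels, diag_in, distance_in, aspect):
    """Average PPD for a flat screen held in landscape; aspect = (long, short)."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)
    angle = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return horiz_pixels / angle

print(round(ppd(3200, 6.7, 12, (20, 9))))    # Galaxy S20+ at 12": ~112
print(round(ppd(2778, 6.7, 12, (19.5, 9))))  # iPhone 12 Pro Max at 12": ~98
```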
 
What was the viewing distance though? I've seen one of the LG 8K 77" and a couple Samsung 8K models in person and at a viewing distance appropriate for the sheer size, had a hard time telling a difference to 4K because the size + viewing distance required negated my ability to tell the resolution apart.
At home I sit 8 feet away from a 77 inch a95l. It'll be the same for the 85 inch QN900c I just ordered for a different room; I might increase that to 9 feet. I can clearly see the difference at that distance. People sit too far away for a cinematic experience. I went from being bored with movies to binge-hauling blu rays after going to this setup, because it's amazingly immersive.

My use case is about 40% 4k blu ray, 30% streaming and 20% gaming.

No, there's not a ton of 8k content, but I plan on watching everything available, and eventually someone will break the barrier. Weirdly enough, PBS is rumored to get something soon.

 
At the same size and viewing distance, the PPD of the 8K display will be double the 4K, but our eyes have a harder time noticing a difference above 60 PPD. The recommended 12 foot viewing distance....

http://www.hometheaterengineering.com/viewingdistancecalculator.html

According to this, the recommended THX viewing distance for 77 inches is 8.5 feet, not 12. 8-8.5 feet has been consistent with my preference of getting to a 35-40 degree viewing angle.
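Back-solving that calculator's output, the 8.5 ft figure for a 77" screen corresponds to roughly a 36-40 degree horizontal viewing angle - my inference, not an official THX constant:

```python
import math

def distance_for_angle_ft(diag_in, angle_deg, aspect=(16, 9)):
    """Viewing distance (feet) at which a flat screen spans angle_deg horizontally."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)
    return (width_in / 2) / math.tan(math.radians(angle_deg / 2)) / 12

# A ~36-40 deg target angle reproduces the ~8-8.5 ft figure for a 77" screen:
print(round(distance_for_angle_ft(77, 36), 1))  # ~8.6 ft
print(round(distance_for_angle_ft(77, 40), 1))  # ~7.7 ft
```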

At that distance it's very easy to spot a 4k versus 8k difference side by side if they're both playing native content for their respective resolution. I could personally see a difference back to about 11-12 feet as I had this exact scenario in a Best Buy. I'd say for most people it may be that 8-9 foot range.

People always post these supposedly scientific statements about what PPI can be perceived, according to some chart a guy made from a one-time vision study, and to me they're BS and rarely align with real world experience.
 
What was the viewing distance though? I've seen one of the LG 8K 77" and a couple Samsung 8K models in person and at a viewing distance appropriate for the sheer size, had a hard time telling a difference to 4K because the size + viewing distance required negated my ability to tell the resolution apart.

It's hard to find a situation with like-for-like content in the respective 4k and 8k resolutions for each set. I was lucky enough to get that, and the difference was striking. It's an issue of granular detail - it comes out especially in sand, hair, clothing texture, things of that nature. It's not going to give you an orgasm and send you to the heavenly realm just by looking at it, which is what some people seem to expect, but it's noticeably more detailed, and therefore more compelling. That's my opinion anyway.
 
People always post these supposedly scientific statements about what PPI can be perceived, according to some chart a guy made from a one-time vision study, and to me they're BS and rarely align with real world experience.
There was a real world test (in a way - it used unrealistically high quality lossless content): https://www.techhive.com/article/578376/8k-vs-4k-tvs-most-consumers-cannot-tell-the-difference.html

If your comparison was not blind, and if the 4k vs 8k comparison used two different monitors (instead of showing both on the 8k - the 4k content given a basic multiply-by-4 pixel upscale vs the 8k version - really the only possible test that supports a conversation here), then we have to take your in-person experience with a grain of salt. Theater chains did some real life tests as well: going from 2k to 4k, for people with 20/20 vision the difference stopped being noticeable real quick after the first few rows, and people were watching 2k content on super giant IMAX screens just fine.

On an 88 inch monitor showing either a lossless 8k scan of a 70mm movie like Dunkirk (i.e. a multi-TB file that took a special system to even play back at the time) or a newly remade animated film rendered in native 8k, people with 20/20 vision or better sitting just 5 feet from the screen were blind tested, and the results were not impressive: most people rated 4k better than or the same as 8k (which opens the door that people were just guessing, even those who preferred 8k).

Some people, for example, can make the mistake of comparing a new over-$10,000 TV that looks better than the 4k model they have at home and attributing the difference they see to the resolution change, or of comparing higher bitrate 8k content to lower bitrate 4k content and attributing the change to the resolution rather than the higher bitrate. People who sell 8k TVs will either know this or be given demo material by people who do.
 
There was a real world test (in a way - it used unrealistically high quality lossless content): https://www.techhive.com/article/578376/8k-vs-4k-tvs-most-consumers-cannot-tell-the-difference.html

If it was not blind, and if the 4k vs 8k comparison used two different monitors (instead of both on the 8k - a basic multiply-by-4 pixel upscale of the 4k content vs the 8k version, really the only possible test to have a conversation here), we will take your in-person experience with a grain of salt. Theater chains did some real life tests as well.

On an 88 inch monitor with a lossless 8k scan of a 70mm movie like Dunkirk, or a new natively rendered animated film, people with 20/20 vision or better sitting just 5 feet from the screen were blind tested; the results were not impressive, with most people rating 4k better than or the same as 8k.

The only way for something like this to be definitive would be to (1) get a gigantic sample size, (2) demonstrate no difference in distinguishing 8k and 4k beyond chance (50%) outside of a standard deviation, and (3) replicate the study several times.

People can take my A/B comparison with a grain of salt all they want, to your point, but I'd take a $1000 bet to distinguish them in an A/B test all day long (even with kicking me out of the room and switching the TVs around continually) and be a rich man at 8 feet from a 77 inch 8k and a 77 inch 4k displaying native content, respectively.
 
The only way for something like this to be definitive would be to (1) Get a gigantic sample size, (2) demonstrate no difference in distinguishing 8k and 4k beyond chance (50%) outside of standard deviation, (3) repeat study.
It depends on what we're trying to test:
1) Is it possible for a human to distinguish them? That's not disputed.
2) Is it a significant difference?

2 would not take many people - the bigger the effect, the smaller the test group you need.

but I'd take a $1000 bet to distinguish in an A B all day long and be a rich man at 8 feet from a 77 inch 8k and 77 inch 4k.
What was distinguished? Was it the exact same 4k content vs its 8k version (the 4k having been upscaled beforehand with a very simple 4k->8k conversion, keeping it at 4k resolution within an 8k signal) on the exact same monitor? From your short description I did not get the feeling that that is what you are talking about. And "distinguished" is not really the point, especially if you were actively trying to distinguish them - was it a significant difference, would you have quickly told them apart in a blind test?
 
What I saw was a 4k version of a nature scene on a Samsung S95c 77 inch, versus the same exact scene in an 8k version from a thumb drive on the LG z2. We can split hairs about me not knowing the bit rate of each, or any number of other factors - these forums are all about splitting hairs - but it was plainly visible to me. At a split-second dart of the eyes? Maybe not. Would it take me 5 seconds of staring at them? No. So somewhere in between.

Similarly, looking at the 8k content on an 85 inch QN900c - sand through the hand, clothing texture, facial pore detail etc. were all of obviously higher than 4k quality to my eye (and yes, they were playing a custom Samsung 8k demo). And some of the details in THAT demo actually were split-second recognition.

Yes, I'm an N of 1, but that's all I need for my purchase decision.
 
We can split hairs about me not knowing what the bit rate for each was or any number of factors,
You say this like it would not be far more important than the resolution here - a 2k IMAX movie theater projection has much higher quality texture definition than 4k trailers of the same movie on Youtube.

And this was on 2 different monitors as well? A $12-25k TV vs a $3.3k one? Why not at least watch the 4k version of the nature scene on the LG z2 - same bitrate, same monitor would already be a better test.

Those are 2 giant variables right there (usually 2 variables much bigger than the content resolution changing between A and B). A test where more than just the resolution changed is not informative about the value of that variable; it tells us, as far as we know, nothing about it.
 
252 FALD zones at 8K 55" is gonna bleed across the entire screen. This would have sold well if it was released in 2018-2019. Nowadays, the local dimming zone count of a high-end FALD should start at 512 at minimum and go up from there.
Fair enough. Simply figured I'd post given the title of this thread. It may not be a great 8k monitor. But it's an 8k monitor, and those are hard to come by so far. Lol
 
The current samsung 900c 8k tv has 1,920 zones for comparison.

Mfgs put 8k on ice for the most part for the last few years though. Samsung has a few, holding more or less a monopoly on it for now - I figure until other major mfgs start making competitive high end 8k gaming tvs in the next few years, hopefully, and start marketing them heavily. That should bring competition in pricing and better features. That, and the 5000 series gpus should be out in 2025.

While you can get a 65 inch 8k samsung now with decent performance, the wrinkles haven't all been ironed out yet. There should be better performing 8ks (e.g. 120hz 8k native on the horizon), more features, and maybe better AI scaling tech on gpus (and on tvs) in a near timeframe, once 8k goes into full swing from multiple mfgs. A large 8k gaming tv, e.g. 65 inch for a pc command center layout, is doable now at fairly high samsung pricing. While tempting, 8k still seems a little green on the vine imo, especially at the high asking price, so it's a tough sell for me currently.

Farther ahead, we'll probably get lightweight sunglasses form factor XR glasses capable of displaying a 4k, and later 8k, screen in real space too. Apple pushed their version back to 2027 (whatever rez it ends up being initially), so I'm hopeful for higher rez XR by 2027+. There are some intriguing micro oled models out already at 600 nits, but they are still only 1080p. There is also a 90 or 120hz model from a different mfg, but I don't think it's micro oled.
 
Sorry, what does the "mfgs" acronym people keep referring to mean? I know it has something to do with the companies selling TVs, but what specifically?
 
The current samsung 900c 8k tv has 1,920 zones for comparison.

Mfgs put 8k on ice for the most part for the last few years though. Samsung has a few but they hold monopoly on it more or less for now....
It should be noted that the zone count differs between sizes; I believe the 65" has something like 1,200 zones.

As mentioned above, I personally see 8K today mainly as a replacement for multiple monitors rather than feeling the need for 8K in general. Or rather, as you also mention, the requirements of 8K are still a bit too much for existing GPUs etc., so outside of very large / near screens, using it just as an "image enhancer" still seems a bit distant in the future. So my interest in having 8K on my desk, mainly for work, is much greater than having it in my living room ATM.

But I have no doubt we will one day consider 8K to be mainstream, just like we did with 4K - if for no other reason than that the manufacturers probably still want us to keep buying new stuff.
 