Where are the 8K Monitors?

Baasha (Limp Gawd · Joined Feb 23, 2014 · 249 messages)
Why are there no 8K monitors on the market (for PCs) other than the archaic Dell UP3218K from 2017? The GPUs we have now (4090) should be able to handle many games at that resolution, and for productivity 8K would be amazing, especially on an OLED display.

With DP 2.1 available, where the f00k are the 8K monitors? An 8K 120Hz OLED at 42", 48", or even 55" would be amazing. DP 2.1 should be able to handle that... right?

Does anyone else think it's strange that literally no 8K monitors have been announced even at CES this year? :rolleyes:
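
For what it's worth, here's the rough link-budget math, assuming DP 2.1 at UHBR20 (4 lanes x 20 Gbps with 128b/132b encoding) and 10-bit RGB, and ignoring blanking overhead; treat it as a back-of-the-envelope sketch, not a spec-exact calculation:

```python
# Back-of-the-envelope: can DP 2.1 (UHBR20) carry 8K at 120 Hz?
# Assumptions: 10-bit RGB (30 bits/pixel), blanking overhead ignored.

H, V, HZ, BPP = 7680, 4320, 120, 30

raw_gbps = H * V * HZ * BPP / 1e9        # uncompressed video rate
payload_gbps = 4 * 20 * (128 / 132)      # 4 lanes, after 128b/132b coding

print(f"8K120 10-bit raw:      {raw_gbps:.1f} Gb/s")      # ~119.4
print(f"UHBR20 usable payload: {payload_gbps:.1f} Gb/s")  # ~77.6
print(f"fits uncompressed: {raw_gbps <= payload_gbps}")       # False
print(f"fits with DSC 3:1: {raw_gbps / 3 <= payload_gbps}")   # True
```

So by this math, 8K 120 Hz doesn't fit uncompressed even on UHBR20, but it fits comfortably with DSC, which DisplayPort supports.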
 
Not at all surprising. Even 8K TVs are on the backburner, partly due to EU regulations; I don't think there was much new in that category this year either.

Meanwhile display manufacturers don't seem to understand that 8K for TVs is stupid but 8K for large desktop displays would be great, especially at 43-50" as a curved panel.

I think BOE is making a new 8K 32" panel but it's going to be the same old 60 Hz stuff.
 
Nothing in volume as there is zero market for 8k currently. But things can change. If "something" drives 8k, I'd expect products to follow.

I just can't see anything for the next 5 years or so.... but I could be wrong.
 
Meanwhile display manufacturers don't seem to understand that 8K for TVs is stupid
Whether it's stupid depends on the goals.

If your goal is to sell a 50 to 65 inch TV to someone who owns a 55" 4K TV, maybe an 8K tag can help, regardless of whether an 8K 40 Mbps AV1 stream actually looks better or worse than a 4K 40 Mbps AV1 stream.

Streamers could play the same game they did with 4K: it's easier for them to charge a premium for 8K and have it look better because it's a 65 Mbps stream instead of a 35 Mbps 4K one, without anyone ever asking whether a 65 Mbps 4K stream would look better or worse than the 8K one.

We already went through all of this with 4K monitors.
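
To put rough numbers on that (an illustrative bits-per-pixel comparison; the 40 Mbps and 24 fps figures are just examples):

```python
# Bits per pixel at a fixed stream bitrate: 8K gets 4x less data per pixel.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float) -> float:
    """Average encoded bits available per pixel per frame."""
    return bitrate_mbps * 1e6 / (width * height * fps)

for label, w, h in [("4K", 3840, 2160), ("8K", 7680, 4320)]:
    bpp = bits_per_pixel(40, w, h, fps=24)
    print(f"{label} @ 40 Mbps, 24 fps: {bpp:.3f} bits/pixel")
# 4K: ~0.201 bits/pixel, 8K: ~0.050 bits/pixel
```

At identical bitrates the 8K encode has a quarter of the bits per pixel to work with, so the encoder has to compress four times harder.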

As for the more obvious case, where a native uncompressed desktop signal could genuinely benefit and be great (and I'm not sure 60 Hz would necessarily be an issue there), there's the price tag...

Is there any reason, outside of marketing, not to have 5K-6K monitors before 8K? Is there a technical or manufacturing reason to multiply the pixel count by 4? At 8K the price would make it so niche that it would be an almost artisanal affair, which makes it even more costly; it might take Apple to kick it off.
 
Is there any reason, outside of marketing, not to have 5K-6K monitors before 8K? Is there a technical or manufacturing reason to multiply the pixel count by 4? At 8K the price would make it so niche that it would be an almost artisanal affair, which makes it even more costly; it might take Apple to kick it off.
4K got popular in various sizes because panel manufacturers were able to make multi-purpose panels: the same panel could go in a TV, a monitor, an info screen, etc. I would assume 8K would need similar synergies.

I do agree 5-6K would be a nice in-between, even if for gaming it leaves you with fewer options. But unfortunately that's also a complete wasteland. At least this year there are 5K and 6K 60 Hz models from someone who isn't Apple.
 
Maybe a partially defective 8K panel can be cut down into one, two, or even three smaller 4K panels more easily than a 6K panel could, to increase yield or something?
 
There's a few out there, but there ain't much to drive them at 8K; PC is definitely not there yet.
 
Lack of media content is probably one reason (it has to be popular in the TV market before enough panels are available to make it practical for the monitor market). And don't be too sure that a 4090 can handle 8K; mine barely produces workable fps in Hogwarts Legacy at 4K once you turn on ray tracing.
 
I think 4K will be the resolution of choice for TV/monitor manufacturers for a while, just like how 1080p was around for years. 4K still has a long way to mature, and most new media is filmed in 4K. Even just the data storage for 8K would require a major overhaul for content streamers.
According to some anecdotal reviews from people at CES, 8K just doesn't feel like the massive leap over 4K that going from 720p to 1080p was, and because it doesn't look vastly improved over 4K, TV manufacturers aren't going to jump at the opportunity to create 8K panels. And if there are no 8K TV panels, then there are no 8K monitor panels.
Don't hold your breath for 8K; it's going to be a long time coming.
 
Problem is, real investment in 8k isn't going to take off because a small niche of enthusiasts want to see extreme DPI for PC use. It needs a use case in TVs and console gaming. I'm as much of a nerd for displays as the next guy on here but I also just don't see the need for 8k in any personal use-case scenario.

An example of why it doesn't currently grab me at all: the Samsung 8K TVs available at the moment sacrifice image quality in other areas compared to the equivalent 4K panels, due to their pixel density and the heightened processing needed for the ~33M pixels (or 100M subpixels, if you like). Resolution isn't everything (just like brightness isn't everything... hi, that other thread).

8k at "sensible" PC sizes would 100% require asset scaling and while that might be a DPI enthusiast's dream, it's far from everyone's bag. Certainly isn't me. I find the idea a complete waste of GPU grunt as things stand.
 
I think 4K will be the resolution of choice for TV/monitor manufacturers for a while, just like how 1080p was around for years. 4K still has a long way to mature, and most new media is filmed in 4K. Even just the data storage for 8K would require a major overhaul for content streamers.
According to some anecdotal reviews from people at CES, 8K just doesn't feel like the massive leap over 4K that going from 720p to 1080p was, and because it doesn't look vastly improved over 4K, TV manufacturers aren't going to jump at the opportunity to create 8K panels. And if there are no 8K TV panels, then there are no 8K monitor panels.
Don't hold your breath for 8K; it's going to be a long time coming.
I can tell you that 8K makes a noticeable difference given the right content. Every time I'm in Japan, I go to one of those giant electronics stores and sit there watching their 8K demo until it loops. This is the current Sharp Aquos DX1 demo, shot on my iPhone last month. I know you won't really see the detail from a 1080p clip, but it's impressive in person.

 
I don't know how some people can use 4K, never mind 8K. I'm on 1440p on my center monitor, and I browse at 1080p. The print is so small, and videos look... weird on it.
 
I can tell you that 8K makes a noticeable difference given the right content.
At what Mbps does it start to be noticeable?

We would need a blind test using monitors and content that are equal in every other way.

Test 100 Mbps 4K vs. 100 Mbps 8K,
200 Mbps 4K vs. 200 Mbps 8K,
etc.

That's what it would take to start to have any idea, and it's usually not what's done when people talk about something being noticeable or not.

They look at a newer, better monitor than they're used to, showing better content than they're used to (in how it was filmed, mastered, and its bitrate); it's almost never a single variable, resolution, that changed.
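
If someone actually wanted to run that test, here is a sketch of how the matched pairs could be produced; it assumes an 8K master clip (the `master_8k.mp4` name is a placeholder) and an ffmpeg build with the libaom-av1 encoder:

```python
# Encode one 8K master at identical bitrates but different resolutions,
# so resolution is the only variable left in a blind comparison.
import subprocess

MASTER = "master_8k.mp4"   # hypothetical 8K source clip (placeholder name)

def encode(width: int, height: int, bitrate_mbps: int) -> str:
    """Encode the master at a given resolution and target bitrate."""
    out = f"test_{height}p_{bitrate_mbps}mbps.mp4"
    subprocess.run([
        "ffmpeg", "-y", "-i", MASTER,
        "-vf", f"scale={width}:{height}",
        "-c:v", "libaom-av1", "-b:v", f"{bitrate_mbps}M",
        "-an",                         # drop audio; it's not under test
        out,
    ], check=True)
    return out

for mbps in (100, 200):
    encode(3840, 2160, mbps)   # 4K at this bitrate
    encode(7680, 4320, mbps)   # 8K at the exact same bitrate
```

Then show each pair blind, on the same display, and ask which looks better.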
 
I can tell you that 8K makes a noticeable difference given the right contents. Every time I'm in Japan. I would just go to one of those giant electronics store and sit there and watch their 8K demo until it loops. This is the current Sharp Aquos DX1 demo shot on my iPhone from last month. I know you won't really see the detail from a 1080P shot but it's impressive in person.


There's a reason why it's a "demo" on a loop.

So I guess one could argue that 8K is required for a (specific) 8K demo on a loop(?). But that's probably still not enough to drive up volume in the space.
 
At what Mbps does it start to be noticeable?

We would need a blind test using monitors and content that are equal in every other way.

Test 100 Mbps 4K vs. 100 Mbps 8K,
200 Mbps 4K vs. 200 Mbps 8K,
etc.

That's what it would take to start to have any idea, and it's usually not what's done when people talk about something being noticeable or not.

They look at a newer, better monitor than they're used to, showing better content than they're used to (in how it was filmed, mastered, and its bitrate); it's almost never a single variable, resolution, that changed.
A better question would be at what distance would it become noticeable versus a same-size 4k panel?
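
For a ballpark answer, a quick sketch using the common (and simplified) 20/20 benchmark of one arcminute per resolvable pixel, with a 55" 16:9 panel as an arbitrary example:

```python
# Distance beyond which a person with 20/20 vision can no longer separate
# individual pixels (i.e., the pixel pitch subtends less than 1 arcminute).
import math

ARCMIN = math.radians(1 / 60)          # 1 arcminute in radians

def resolve_limit_ft(diagonal_in: float, horizontal_px: int) -> float:
    """Distance (feet) beyond which 20/20 vision can't separate pixels."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 panel width
    pitch_in = width_in / horizontal_px               # pixel pitch
    return pitch_in / ARCMIN / 12                     # small-angle approx.

for label, px in [("4K", 3840), ("8K", 7680)]:
    print(f'55" {label}: pixels stop being resolvable beyond '
          f'~{resolve_limit_ft(55, px):.1f} ft')
# 4K: ~3.6 ft, 8K: ~1.8 ft
```

By that model, a 55" 8K panel only pays off inside roughly 3.6 ft, which is desk distance rather than couch distance; real vision is more complicated than the one-arcminute rule, so treat these as ballpark figures.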

I can tell you that 8K makes a noticeable difference given the right content.
Ignoring the fact that an iPhone recording of a display uploaded to YouTube can't tell anyone anything, the only thing that looks good about that demo is the framerate. It looks oversaturated for one, giving the "wow" factor. I doubt the vast majority of people would be able to tell a difference between that display and a 4k panel showing the same loop with the same colours/range/contrast etc. even if they were standing in front of it.
 
A better question would be at what distance would it become noticeable versus a same-size 4k panel?
And the 8K stream could look worse from, say, 1 to 80 Mbps, be hard to tell apart between 80 and 130 Mbps, and only look better above 130 Mbps. It's not just that you wouldn't necessarily see a better image; it could be a worse one.

It's obvious enough that at a fixed 150 KB file size, a less compressed 400x300 JPEG can look better than a more compressed 8000x6000 one; the same is true for video.
 
And the 8K stream could look worse from, say, 1 to 80 Mbps, be hard to tell apart between 80 and 130 Mbps, and only look better above 130 Mbps.
Depends on the codec, but yeah. Bitrates will have to increase considerably for high-quality 8K content, which rules out streaming it (and for better or worse, streaming is what counts these days) for the many people around the world still on slow connections.
 
I don't know how some people can use 4K, never mind 8K. I'm on 1440p on my center monitor, and I browse at 1080p. The print is so small, and videos look... weird on it.
It all depends on your visual acuity: seeing everything clearly and effortlessly, from a distance at which your eyes alone can take in every part of the screen.
Visual acuity itself is the result of having working eyes, proper muscle control, and then being able to do all the necessary visual processing well enough.

A screen with higher resolution than your visual acuity immediately allows you to resolve can lead to one of two outcomes. If the eyes are used incorrectly, the user may strain them even harder to focus and see clearly, and that effort will accelerate eyesight deterioration. Alternatively, a high-PPI screen can stimulate the nervous system to refine eye control and the way it processes impulses from the eyes, leading to the user seeing the details from a distance they find physically comfortable.

Because it is, of course, possible for eyesight to simply improve in response to a high-PPI screen, to the point where it can be comfortably used.
 
There's a reason why it's a "demo" on a loop.

So I guess one could argue that 8K is required for a (specific) 8K demo on a loop(?). But that's probably still not enough to drive up volume in the space.

That's why I said earlier that it's the lack of content that's holding 8K back. I was also in Japan when they were doing live broadcast tests in prep for the Tokyo Olympics, and it blows away anything I've seen on 4K, as seen in the clip:

 
A better question would be at what distance would it become noticeable versus a same-size 4k panel?


Ignoring the fact that an iPhone recording of a display uploaded to YouTube can't tell anyone anything, the only thing that looks good about that demo is the framerate. It looks oversaturated for one, giving the "wow" factor. I doubt the vast majority of people would be able to tell a difference between that display and a 4k panel showing the same loop with the same colours/range/contrast etc. even if they were standing in front of it.

I shot the video just out of habit; otherwise I would have at least pulled out my Osmo Pocket 2 and shot it in 4K@60 HDR. Oh, and they had their top 4K models set up next to this one with the same loop, and there is a noticeable difference.
 
it blows away anything I've seen on 4K
What's the highest bitrate of similar content, with the same quality of capture, that you've ever seen? I think this goes into 90 Mbps territory, which is more than twice, say, Netflix/Disney+ 4K (which usually tops out at 35 Mbps or so), and looking at the size of the cameras they use, it would look much better than pretty much anything 4K out there, even on a 4K TV (assuming it could send, and the TV accept, a 90 Mbps 4K signal).
 
It all depends on your visual acuity: seeing everything clearly and effortlessly, from a distance at which your eyes alone can take in every part of the screen.
Visual acuity itself is the result of having working eyes, proper muscle control, and then being able to do all the necessary visual processing well enough.

A screen with higher resolution than your visual acuity immediately allows you to resolve can lead to one of two outcomes. If the eyes are used incorrectly, the user may strain them even harder to focus and see clearly, and that effort will accelerate eyesight deterioration. Alternatively, a high-PPI screen can stimulate the nervous system to refine eye control and the way it processes impulses from the eyes, leading to the user seeing the details from a distance they find physically comfortable.

Because it is, of course, possible for eyesight to simply improve in response to a high-PPI screen, to the point where it can be comfortably used.


The way our eyes work is very complicated, and people dismiss higher resolutions when there really is still a benefit.

Sharp did an experiment way back when Apple started calling things "retina displays" and essentially disproved the term. They showed images at set distances and resolutions with much higher PPI than what's considered retina, and people with normal eyesight could still tell that the highest-resolution one was sharper.

People try to say you can only see X resolution based on rod density in the eyes, but there is much more to it. We have two eyes looking from slightly different angles, our eyes can move, our heads can move; our brains extract more detail this way.

Yeah, we are at the point of diminishing returns with 4K, where other improvements have a much bigger impact, but even higher resolutions are still noticeable improvements.
 
Even on the desktop 120+ Hz is just more responsive and pleasing to use. If I had my way, 120 Hz would be the minimum spec for all monitors made today.
Agreed - I'd sooner see this happen than 8k become the next big selling point.
 
What's the highest bitrate of similar content, with the same quality of capture, that you've ever seen? I think this goes into 90 Mbps territory, which is more than twice, say, Netflix/Disney+ 4K (which usually tops out at 35 Mbps or so), and looking at the size of the cameras they use, it would look much better than pretty much anything 4K out there, even on a 4K TV (assuming it could send, and the TV accept, a 90 Mbps 4K signal).
The Japanese NHK BS8K satellite system uses transponders at 34.5 MHz with a transmission rate of 100 Mbps each, using H.265 encoding. That blows away anything available in the U.S.

It's kinda sad nowadays that most people get their so-called HD content from streaming and think 35 Mbps is enough for a 4K stream. An OTA HD broadcast is already ~8 GB/hour, or roughly 2.2 MB/s (17.6 Mbps), and a proper OTA 4K broadcast will run ~30 GB/hour, 8.8 MB/s, 70 Mbps, which is 2x more than you'll see from streaming services.

Yes, compression like H.264/H.265 helps, but I can tell you that an OTA HD broadcast will be superior to any HD stream over the net. A recent example: switching the Oscars red carpet show between OTA and cable, I would actually go so far as to say the OTA broadcast looked better than some of the 4K streaming content from various streaming services (I have subs to most of them). Blame that on the poor state and high cost of our data infrastructure. It's sad that there are still areas in Silicon Valley where you're stuck with 50 Mbps speeds unless you're willing to shell out for cable, so the only way to get HD/4K content is with heavy compression. That's also why I still have an OTA antenna, a TiVo, and an Ultra HD Blu-ray player, and why I still buy movies and TV series in HD and UHD. I know those also use compression, but they usually have a higher bitrate than anything streamed over the web.

So yes. 8K in Japan is amazing and immediately noticeable over 4K if you have the proper content and equipment to handle the bandwidth.
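
For anyone who wants to sanity-check those conversions (decimal units assumed; the broadcast sizes are approximate):

```python
# Sanity check: convert broadcast sizes in GB/hour to average Mbps.

def gb_per_hour_to_mbps(gb_per_hour: float) -> float:
    """Average bitrate in Mbps for a given stream size in GB per hour."""
    return gb_per_hour * 8e9 / 3600 / 1e6   # GB -> bits, / seconds, -> Mbps

for label, gbh in [("OTA HD  (~8 GB/h)", 8), ("OTA 4K (~30 GB/h)", 30)]:
    print(f"{label}: ~{gb_per_hour_to_mbps(gbh):.0f} Mbps")
# OTA HD: ~18 Mbps, OTA 4K: ~67 Mbps -- in line with the 17.6 / 70 figures above
```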
 
So yes. 8K in Japan is amazing and immediately noticeable over 4K if you have the proper content and equipment to handle the bandwidth.
Which doesn't tell us whether being 8K helps at all; obviously a 100 Mbps signal will look better than a lower-bitrate one, and that would be just as true for 100 Mbps 4K versus regular 4K. Maybe 100 Mbps 4K looks better than 100 Mbps 8K with current compression algorithms.

We very rarely (about never) get a chance to test whether higher resolution is worth it at the compression levels we actually have access to. Warner Bros. tested the value of 8K content years ago:
https://www.techhive.com/article/578376/8k-vs-4k-tvs-most-consumers-cannot-tell-the-difference.html
They used ~3,000 Mbps content (superb 8K scans of recent Nolan 70mm movies, native 8K renders of animated movies, 8K camera footage, etc.) shown to people with 20/20, 20/10, or better vision, sitting only 5 feet from a very large 88-inch TV.

Compared with upscaled 4K (mapping each source pixel to exactly 4 screen pixels, to preserve a true 4K resolution), the results were quite limited:

[Figure: 8K vs. 4K viewer ratings, fig. 5 from the TechHive article]


I doubt that with compressed media it would be worth it in 2023; a lot of people preferred the 4K to the 8K (they probably didn't see a difference and just picked one).

And for one, the industry makes it hard for anyone to ever get a chance to check whether there is a difference, as with every previous resolution increase.
 
Which doesn't tell us whether being 8K helps at all; obviously a 100 Mbps signal will look better than a lower-bitrate one, and that would be just as true for 100 Mbps 4K versus regular 4K. Maybe 100 Mbps 4K looks better than 100 Mbps 8K with current compression algorithms.

We very rarely (about never) get a chance to test whether higher resolution is worth it at the compression levels we actually have access to. Warner Bros. tested the value of 8K content years ago:
https://www.techhive.com/article/578376/8k-vs-4k-tvs-most-consumers-cannot-tell-the-difference.html
They used ~3,000 Mbps content (superb 8K scans of recent Nolan 70mm movies, native 8K renders of animated movies, 8K camera footage, etc.) shown to people with 20/20, 20/10, or better vision, sitting only 5 feet from a very large 88-inch TV.

Compared with upscaled 4K (mapping each source pixel to exactly 4 screen pixels, to preserve a true 4K resolution), the results were quite limited:

[Figure: 8K vs. 4K viewer ratings, fig. 5 from the TechHive article]

I doubt that with compressed media it would be worth it in 2023; a lot of people preferred the 4K to the 8K (they probably didn't see a difference and just picked one).

And for one, the industry makes it hard for anyone to ever get a chance to check whether there is a difference, as with every previous resolution increase.

That's why I said it's too bad the U.S. does not have the infrastructure to handle the increased bandwidth requirements. Upscaling algorithms, good as they are, will never be as good as a native image; same with compression. The difference may be small, but it'll be there. At least at the current bandwidth allocation used by BS8K, the difference between a 4K and an 8K TV showing the same content is noticeable.
 
Even on the desktop 120+ Hz is just more responsive and pleasing to use. If I had my way, 120 Hz would be the minimum spec for all monitors made today.
Okay, that's a personal preference, not a needed use case. Expect to pay a lot for a luxury item, is all.
The mainstream is only slowly reaching mass adoption of 4K 60 Hz, or even 1440p 120 Hz, atm.
 
That's why I said it's too bad the U.S. does not have the infrastructure to handle the increased bandwidth requirements. Upscaling algorithms, good as they are, will never be as good as a native image; same with compression. The difference may be small, but it'll be there
But if there is a difference, it won't necessarily be a positive one; we all agree that at the same bitrate, higher resolution (and thus heavier compression) does not always equal better. The Hollywood studio test used an unrealistically high bitrate to give a theoretical far-future 8K the best possible chance to shine, and it did not. With hundreds of times less bandwidth, it would be another game entirely.
 