Will CES 2023 Have 8K 60Hz+ Monitors? OLED etc.?

Baasha
Limp Gawd · Joined Feb 23, 2014 · Messages: 247
As the title states, is there any news on whether CES in January 2023 will showcase any 8K monitors? Preferably OLED etc.?

The last 8K display I had was the Dell UP3218K back in 2017. With the 4090, it looks like 8K 60fps might be doable, especially on many slightly older games (FPS titles, etc.).

Imagine an Alienware QD-OLED 8K 120Hz display?! o_0
 
Unlikely. BOE is making another 32" 8K panel which IMO is entirely pointless for anything but medical imaging at that size.

I don't think we will see 8K in any sensible size like 40-50" anytime soon, let alone with e.g. support for higher refresh rates at lower resolutions.
 
BOE is making another 32" 8K panel which IMO is entirely pointless for anything but medical imaging at that size.
I once used a 4k monitor that was 24". It was glorious how high ppi that was but the monitor was too small. I would love a 32" 8k :D. It was like looking through a window... In a way 32" and 28" 4k can't capture.
 
let alone with e.g. support for higher refresh rates at lower resolutions.
All they have to do is let the monitor take a single video signal over two inputs, like two DP 1.4 connections, to double the bandwidth.

High bandwidth monitors have allowed this in the past but it seems like a thing nobody is interested in doing anymore.
 
I once used a 4k monitor that was 24". It was glorious how high ppi that was but the monitor was too small. I would love a 32" 8k :D. It was like looking through a window... In a way 32" and 28" 4k can't capture.
6K is more than enough for that at 32".
 
Hot take: 4k 1000hz is the final pinnacle of pc monitor tech. We don't need more than that. 8k for anything less than a 100" screen is unnecessary, and only freaks would want a 100" PC monitor.
4k 1000hz everyone, demand it, yell it from the rooftops!
 
Hot take: 4k 1000hz is the final pinnacle of pc monitor tech. We don't need more than that. 8k for anything less than a 100" screen is unnecessary, and only freaks would want a 100" PC monitor.
4k 1000hz everyone, demand it, yell it from the rooftops!

Lol. You should try gaming on a projector sometime. For some types of games it's awesome. And I completely disagree that we don't need 8k except at those sizes. Higher hz is a must, but don't rule out higher res either!
 
We don't need more than that. 8k for anything less than a 100" screen is unnecessary,
Play something at 4k without anti-aliasing and you'll still see subpixel shimmer. And any anti aliasing that completely removes subpixel shimmer will blur the image to some degree.

So if you could use 4x the pixels to do the anti-aliasing, by doing TAA upsampling or DLSS or something similar, you could eliminate that shimmer and still have a sharper image. Even if the internal resolution is still 4k.
 
Play something at 4k without anti-aliasing and you'll still see subpixel shimmer. And any anti aliasing that completely removes subpixel shimmer will blur the image to some degree.

So if you could use 4x the pixels to do the anti-aliasing, by doing TAA upsampling or DLSS or something similar, you could eliminate that shimmer and still have a sharper image. Even if the internal resolution is still 4k.

One argument that could be made for prioritizing resolution over refresh rate is that lower resolutions can at least be made to look a lot closer to higher resolutions. Thanks to better upscaling techniques like DLSS and FSR 2.1, 1080p upscaled to 4k today looks way better than trying to upscale 1080p to 4k like 10 years ago. So if running native 8k is too much you could at least do some upscaling with decent results. As for trying to turn low frame rates into higher frame rates, nvidia is trying that out with DLSS 3 but it's still in the early stages and needs per-game implementation, and it also results in increased input latency. So for now faking pixels is a lot more refined than faking frames.
 
Personally, I think 40-50 inch 8K would be perfect. It would be roughly 200 PPI, so you could scale Windows up 200% and it would look amazing, while still giving you a huge 40+ inch desktop for productivity.

That said, if you offered me a 40 inch 8K screen at 120Hz, or a 40 inch 4K screen at 240Hz, I'd choose the 240Hz option.

Motion clarity is really a HUGE plus outside of gaming as well. Anybody who says "the human eye can't see more than 60FPS" has never actually used a desktop running at a high refresh rate. You'd be surprised how much less your eyes are strained when using a desktop environment with fluid motion. Personally, I think 60Hz should die in a fire and no longer be considered standard. Bring on 120Hz as standard: 20, 24, 30 and 60 all divide into it evenly. A perfect starting point for everyone.
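For anyone who wants to sanity-check that ~200 PPI figure, here's a rough sketch (assumes flat 16:9 panels measured by diagonal; the ppi() helper and the list of sizes are just illustrative):

```python
import math

def ppi(diagonal_in, horiz_px, vert_px):
    """Pixels per inch of a flat panel from its diagonal size and resolution."""
    return math.hypot(horiz_px, vert_px) / diagonal_in

for size in (32, 40, 48, 55):
    print(f'{size}": 8K ~{ppi(size, 7680, 4320):.0f} PPI, 4K ~{ppi(size, 3840, 2160):.0f} PPI')

# Prints roughly:
# 32": 8K ~275 PPI, 4K ~138 PPI
# 40": 8K ~220 PPI, 4K ~110 PPI
# 48": 8K ~184 PPI, 4K ~92 PPI
# 55": 8K ~160 PPI, 4K ~80 PPI
```

So 40-50" 8K really does land in the roughly 175-220 PPI range, about double the density of a same-size 4K panel, which is what makes clean 200% scaling work out.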
 
That said, if you offered me a 40 inch 8K screen at 120Hz, or a 40 inch 4K screen at 240Hz, I'd choose the 240Hz option.
What games made in the past 5 years can you actually run at 200+ FPS all the time?

My CRT running at 60hz has better motion clarity than anybody's LCD running at 144, because of strobing.

So really, if you're interested in motion clarity, you need to be asking monitor makers for strobing support, so we can have motion clarity at any refresh rate 60 and above.
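That lines up with the usual persistence math: on a sample-and-hold panel, perceived blur is roughly your eye-tracking speed times how long each frame stays lit, and strobing shrinks that time regardless of refresh rate. A rough sketch (the 960 px/s pan speed and the ~1 ms pulse width are illustrative numbers, not measurements of any particular monitor):

```python
def motion_blur_px(speed_px_per_s, persistence_ms):
    """Approximate eye-tracked smear width: speed multiplied by time each frame is lit."""
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960  # px/s panning speed

# Sample-and-hold: each frame stays lit for the whole refresh period (1000 ms / Hz)
print(motion_blur_px(speed, 1000 / 60))    # ~16 px of smear at 60 Hz
print(motion_blur_px(speed, 1000 / 144))   # ~6.7 px at 144 Hz
print(motion_blur_px(speed, 1000 / 500))   # ~1.9 px at 500 Hz

# Strobed / impulse-driven (CRT-like): persistence is the pulse width, not the frame time
print(motion_blur_px(speed, 1.0))          # ~1 px even at 60 Hz with a ~1 ms pulse
```

Which is the point above: a strobed 60 Hz image with a ~1 ms pulse can look cleaner in motion than a 144 Hz sample-and-hold panel, at the cost of flicker and brightness.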
 
Hot take: 4k 1000hz is the final pinnacle of pc monitor tech. We don't need more than that. 8k for anything less than a 100" screen is unnecessary, and only freaks would want a 100" PC monitor.
4k 1000hz everyone, demand it, yell it from the rooftops!
With that AI frame generation we're starting to get, even 1000 fps doesn't sound that impossible anymore... They'll have to improve the tech for sure, but since doing it natively is truly impossible for modern games at the moment, I guess that would be the way.
 
Hot take: 4k 1000hz is the final pinnacle of pc monitor tech. We don't need more than that. 8k for anything less than a 100" screen is unnecessary, and only freaks would want a 100" PC monitor.
4k 1000hz everyone, demand it, yell it from the rooftops!
Hot take indeed.

8K would actually be perfect for desktop displays.
  • 8K would be sharp for desktop displays sized above 32". 4K becomes too low res when you go past about 40".
  • You get way more usable scaling levels to size your UI/text the way you like it for your monitor size. Atm, only 125% and 150% are useful settings on 4K displays unless you are using it at TV viewing distances.
  • You can use integer scaling to play at 4K, 1440p or 1080p. Ideally an 8K display would support higher refresh rates at these lower resolutions.
  • 4090 has moved things up to proper 4K 120 fps and with DLSS and frame generation both high resolutions and framerates would be possible.
Where display manufacturers go wrong is by making 8K displays in sizes that make the least sense for them: 32" and 70+ inch TVs. 40-50" would be spot on.
 
Yup, a 48" 8K 120fps OLED display would be nearly perfect (though I'm not sure 8K 120hz OLED even exists at this point).
 
With that AI frame generation we're starting to get, even 1000 fps doesn't sound that impossible anymore... They'll have to improve the tech for sure, but since doing it natively is truly impossible for modern games at the moment, I guess that would be the way.
Exactly, DLSS and the 4090 have pushed the envelope to the point of making 500fps a reality. For the first time in years, GPU hardware is surpassing what monitors can do.
 
Hot take: 4k 1000hz is the final pinnacle of pc monitor tech. We don't need more than that. 8k for anything less than a 100" screen is unnecessary, and only freaks would want a 100" PC monitor.
4k 1000hz everyone, demand it, yell it from the rooftops!

LMAO

Go use a Dell UP3218K


It's like looking through a window. There is a NOTICEABLE difference at 8k even @ 32"
 
LMAO

Go use a Dell UP3218K


It's like looking through a window. There is a NOTICEABLE difference at 8k even @ 32"
Yep. 32" 4k is noticeably pixelated to me compared to 24 and 27". It's like 1080p is on a 15.6" laptop, ppi wise. The 24" I had was amazing ppi in 2014. I can only imagine 8k on a 32", it sounds perfect.
 
Hot take: 4k 1000hz is the final pinnacle of pc monitor tech. We don't need more than that. 8k for anything less than a 100" screen is unnecessary, and only freaks would want a 100" PC monitor.
4k 1000hz everyone, demand it, yell it from the rooftops!
I'm a 4k 16:9 fanatic but I must say I could not resist a high refresh rate ultrawide at 5k2k. It's just 1440p is a no go for me so I've stayed far from ultrawides.

5k2k 45" 144hz+ ultrawide OLED or Mini-led is the dream for me.
 
I once used a 4k monitor that was 24". It was glorious how high ppi that was but the monitor was too small. I would love a 32" 8k :D. It was like looking through a window... In a way 32" and 28" 4k can't capture.

I've got an equally high-DPI laptop screen (13.3" 3200x1800) and it's wonderful; the increased DPI makes everything sharp enough that I can run it at a lower scaling factor. At the same distance as my main screen I can run it comfortably at 2:1 (~140 logical DPI), while with a quarter as many pixels per square inch, small fonts on my 32" 4K fall just short of easy legibility, forcing me to run at 120% scaling. I'd love to have a 32" 8K as my next main display, although lack of availability and presumably even higher cost mean it will probably be a late-2020s replacement for whatever 4K 144Hz screen I upgrade to in a year or two.
 
I would be happy with 32” 6k 2:1, 100hz variable or so, QD-OLED, or MicroLED with 5000+ zones. For HDR grading and productivity.

However. That is basically an absurd wish that isn’t coming anytime soon if ever. Unless Apple decides to make a new XDR display that runs at higher than 60Hz. No other manufacturer would even bother.
 
I once used a 4k monitor that was 24". It was glorious how high ppi that was but the monitor was too small. I would love a 32" 8k :D. It was like looking through a window... In a way 32" and 28" 4k can't capture.

Sure, but you have to lean in to really tell the difference. With normal human vision, the pixels-per-degree math says you'd need to be around 12 inches or closer to an 8K 32" to really see any difference. Sure it's cool to lean in to see fine details, but I don't think many people are sitting that close to really see a difference under normal use.
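For reference, here's the pixels-per-degree arithmetic behind that sort of claim, as a rough sketch (flat-panel approximation measured at the screen centre; the ppd() helper is just illustrative, and the common "~60 PPD equals 20/20 acuity" rule of thumb is only a rule of thumb):

```python
import math

def ppd(diagonal_in, horiz_px, vert_px, distance_in):
    """Approximate pixels per degree at the centre of a flat panel."""
    ppi = math.hypot(horiz_px, vert_px) / diagonal_in
    pixel_pitch_in = 1.0 / ppi
    deg_per_pixel = math.degrees(2 * math.atan(pixel_pitch_in / (2 * distance_in)))
    return 1.0 / deg_per_pixel

for d in (12, 18, 24, 36):
    print(f'{d}" away: 32" 4K ~{ppd(32, 3840, 2160, d):.0f} PPD, '
          f'32" 8K ~{ppd(32, 7680, 4320, d):.0f} PPD')

# Prints roughly:
# 12" away: 32" 4K ~29 PPD, 32" 8K ~58 PPD
# 18" away: 32" 4K ~43 PPD, 32" 8K ~87 PPD
# 24" away: 32" 4K ~58 PPD, 32" 8K ~115 PPD
# 36" away: 32" 4K ~87 PPD, 32" 8K ~173 PPD
```

Whether the 4K-to-8K jump is visible at your actual desk distance is exactly what's being argued here; the math only tells you where the 60 PPD rule of thumb puts the crossover.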
 
Sure, but you have to lean in to really tell the difference. With normal human vision, the pixels-per-degree math says you'd need to be around 12 inches or closer to an 8K 32" to really see any difference. Sure it's cool to lean in to see fine details, but I don't think many people are sitting that close to really see a difference under normal use.

While it wasn't obvious at a glance, the way the fail of a 32" 720p TV used as a computer screen at arm's length would be, the difference in sharpness between text at 140 and 280 DPI is obvious, in a "the laptop screen looks better than the big desktop monitor next to it" way.
 
Sure, but you have to lean in to really tell the difference. With normal human vision, the pixels-per-degree math says you'd need to be around 12 inches or closer to an 8K 32" to really see any difference. Sure it's cool to lean in to see fine details, but I don't think many people are sitting that close to really see a difference under normal use.

I have both a 32" 4K and a 27" 5K, and at a normal viewing distance there is a significant difference in text clarity on the 5K. I can even see some aliasing around some characters when reading text on the iMac, so I imagine the higher pixel density of 8K at 32" would look better. An 8K 32" would be the ideal desktop display. However, I don't know if I could go back to 60hz after using 144, even for the higher pixel density. Every time I use my iMac it feels like something is off because of the lesser refresh rate. 10 years from now 8K will probably be as ubiquitous as 4K is today.
 
10 years from now 8K will probably be as ubiquitous as 4K is today.
One issue is that, outside of reading text and some other work-related applications, 8K over 4K seems on the whole to be a net negative as a format.

Warner Bros. tested this with the best 8K content that could be generated at the time (a super-high-resolution scan of the 70mm film Dunkirk, an animated movie re-rendered in native 8K, native 8K RED digital footage, and natively shot HDR clips as well):

https://www.techhive.com/article/578376/8k-vs-4k-tvs-most-consumers-cannot-tell-the-difference.html
Unlike tests that compare different displays that differ in more than just resolution, this one was well done. It used the exact same TV, a simple upscale (pixel doubling) of the 4K version to 8K, an 88-inch display, and source material better than anything consumers will ever get: giant uncompressed files played at 3 GB/s (about 2,500 times the Disney+ 4K UHD bitrate).

They mostly used people with 20/20 or better vision, some sitting as close as 5 feet to the 88-inch screen showing an uncompressed signal, to give 8K every chance in the world. With the regular (even very good) compression that will actually exist, realistic seating distances, and people with realistic vision, 8K would end up either worse than 4K or a pure marketing gimmick.

Not that this has ever stopped the audio-video business before, obviously, but this time the push could come purely from the work/computer side (Apple will do it, for example), with what that means for mainstream popularity. Unlike 4K, which is fully mainstream (nearly 50% of American households have at least one), 8K could in 10 years be as rare and niche as 4K computer monitors are now (which is maybe what you meant), not something you can buy for $275 USD on Black Friday and that even lower-than-average-income people have in their living room.

If we are lucky it will stay fully niche, or they will do what they did to sell 4K: simply use a better, larger signal for the 8K version and let people think the resolution was the big difference. Much easier to market and sell.
 
Will DP2.1 be able to support 8K 120hz? With AMD's GPU having DP 2.1 and I'm sure Nvidia's next iteration (4090 Ti?) also having it, DP 2.1 monitors will be a reality in 2023. Imagine an 8K 120hz OLED monitor! o_0
 
Will DP2.1 be able to support 8K 120hz? With AMD's GPU having DP 2.1 and I'm sure Nvidia's next iteration (4090 Ti?) also having it, DP 2.1 monitors will be a reality in 2023. Imagine an 8K 120hz OLED monitor! o_0

165Hz according to AMD. But the 7900 XTX will not have anywhere near the power needed to do 8K 165Hz so it's kind of a moot point.

 
Will DP2.1 be able to support 8K 120hz? With AMD's GPU having DP 2.1 and I'm sure Nvidia's next iteration (4090 Ti?) also having it, DP 2.1 monitors will be a reality in 2023. Imagine an 8K 120hz OLED monitor! o_0
See https://linustechtips.com/topic/729232-guide-to-display-cables-adapters-v2/?section=calc

You would need Display Stream Compression even with DP 2.1. Even HDMI 2.1 with DSC could reach 8K 100 Hz.

It would be interesting if someone was able to develop the DLSS equivalent tech for desktop use where 8K would be most relevant, considering the main issue is how sharp text and UI rendering looks on screen and how naive upscaling like MacOS does cannot work well if the display is not at least 4K or equivalent PPI.

I was watching a 75" or something like that 8K OLED TV at a store a few days ago and for video content it seemed so irrelevant. At an appropriate viewing distance I would not be able to tell it apart from 4K. 8K should be pushed for desktop use above all.
 
I just hope to see actual 32" OLEDs at this one. Hopefully that LG one hinted at earlier.

I've got a size limit and maybe this will at last be my year to upgrade the old Dell 30".
 
I have to say, I don't really care.
I would much rather have an affordable 4K desktop OLED: 32", DCI 4K, 120fps, 1000 nits peak brightness (HDR10), DP 2.1, HDMI 2.1.

8K is so far out, and a 4x resolution increase would net me far fewer gains than what I listed above. In order to have 8K plus the above I'd need an nVidia 40-series card and be willing to spend $6000+ to even be in the ballpark. So even if this did come out at CES, it would be far out of my price range, and likely out of 99.9% of consumers' price range too.
The benefits of 8K are too low for me to consider it a first-priority upgrade. Would 8K be nice? Yes. But a 4K 32" OLED would give me many more tangible benefits.
 
One argument that could be made for prioritizing resolution over refresh rate is that lower resolutions can at least be made to look a lot closer to higher resolutions. Thanks to better upscaling techniques like DLSS and FSR 2.1, 1080p upscaled to 4k today looks way better than trying to upscale 1080p to 4k like 10 years ago. So if running native 8k is too much you could at least do some upscaling with decent results. As for trying to turn low frame rates into higher frame rates, nvidia is trying that out with DLSS 3 but it's still in the early stages and needs per-game implementation, and it also results in increased input latency. So for now faking pixels is a lot more refined than faking frames.

. . .

Hot take indeed.

8K would actually be perfect for desktop displays.
  • 8K would be sharp for desktop displays sized above 32". 4K becomes too low res when you go past about 40".
  • You get way more usable scaling levels to size your UI/text the way you like it for your monitor size. Atm, only 125% and 150% are useful settings on 4K displays unless you are using it at TV viewing distances.
  • You can use integer scaling to play at 4K, 1440p or 1080p. Ideally an 8K display would support higher refresh rates at these lower resolutions.
  • 4090 has moved things up to proper 4K 120 fps and with DLSS and frame generation both high resolutions and framerates would be possible.
Where display manufacturers go wrong is by making 8K displays in sizes that make the least sense for them: 32" and 70+ inch TVs. 40-50" would be spot on.

I agree with most of those replies.

A fairly large 8k display would allow a lot of desktop real estate, and you'd get more vertical resolution than 1080 or 1200 pixels when running a 32:9 or 32:10 resolution on one, even if using 4k + AI upscaling to 8k. That, or running 21:10 at a much higher rez than 3840x1600.

With a big 8k display, you wouldn't need to use the entire display for your game all of the time. So you could theoretically run 21:10 or 32:10 resolution, or even a 4k or so 16:9 resolution with higher Hz when in a demanding game yet still get 4 quadrants of 4k screen real-estate on the desktop otherwise.

There is also a lot of room for PC tech to adopt VR space warp / pixel shifting~reprojection forms of frame amplification tech. These involve the os, drivers, user inputs, and in game software supplying their vectors to the frame amplification technology. In this method - the reprojection is done with the supplied vector information as opposed to the way nvidia is doing it right now which is by paging ahead to the next frame and guessing what the vectors might be. The nvidia method is uninformed by comparison. VR tech adopted by pc gaming could theoretically even allow for different refresh rate objects within a scene which would save a lot of processing power. VR hardware/display and dev tech is way ahead of pc in some facets.


. . . . . . . . .

VR's space warp style frame amplification tech <....> has a z-axis grid with <...> some of the vectors <...>. VR system's apps can use actual vector data incl. rates, arcs, player inputs, etc. and then plot and draw measured results on the middle page going forward. So it's not completely guessing what the vectors are for everything in the scene, including the player's inputs and the effects of the virtual camera (or the player's head motion and hand motions in VR).

Unfortunately nvidia went with the uninformed version, at least in the near term. Hopefully the whole industry will switch to the VR method at some point. That would require PC game devs to write their games using the same kind of VR tools and do the work to code for that. They have a lot of room to advance. Unfortunately, progress is a lot slower on the PC and PC-display front than on the VR front, though in aspects like PPD, VR is way "behind" or inferior and will continue to be for some time yet.

. . .


from blurbusters.com's forum replies (Mark R.):


reprojection could in theory be created as a parallel-layer API running between the game and the graphics drivers, much like how VR APIs do it.

One interesting innovation of reprojection that has not yet been done is making sure the pre-reprojected framerate is above flicker fusion threshold. That causes stutters in reprojection to disappear!

For VR, We Already Have Hybrid Frame Rates In The Same Scene
----------------------------------------------------------------------------------------------------


For example on Oculus Rift 45fps to 90fps, sometimes certain things stutter (hand tracking 45fps) while the background scrolls smooth (90fps) via head turns.

But if we had 100fps reprojected to 500fps, then even physics objects like enemy movements would still be smooth looking, just simply more motionblurred (due to frametime persistence) than turns (due to reprojection-based frame rate amplification).

Not everything in the game world *needs* to run at the same framerate; if motion blur is acceptable for such movements.

Different things running at different frame rates on the same screen is very common with reprojection (Oculus Rift), which is ugly when some of the framerates are below stutter/flicker detection threshold.

But if all framerates could be guaranteed perfect framepaced triple-digit, then no stuttering is visible at all! Just different amounts of persistence motion blur (if using reprojection on a non-strobed display). This will be something I will write about in my sequel to the Frame Rate Amplification Article.

Hybrid Frame Rates Stop Being Ugly if 100fps Minimum + Well Framepaced + Sample And Hold
------------------------------------------------------------------------------------------------------------------------------------------


Hybrid frame rates will probably be common in future frame rate amplification technologies, and should no longer be verboten, as long as best practices are done:

(A) Low frame rates are acceptable for slow enemy movements, but keep it triple-digit to prevent stutter
(B) High frame rates are mandatory for fast movements (flick turns, pans, scrolls, fast flying objects, etc)
(C) If not possible, then add GPU motion blur effect selectively (e.g. fast flying rocket running at only 100 frames per second, it's acceptable to motionblur its trajectory to prevent stroboscopic stepping)

The frame rates of things like explosions could continue at 100fps to keep GPU load manageable, but things like turns (left/right) would use reprojection technology. The RTX 4090 should easily be capable of >500fps reprojection in Cyberpunk 2077 at the current 1 terabyte/sec memory bandwidth -- and this is a low lying apple just waiting to be milked by game developers!

In other words, don't use 45fps. Instead of 45fps-reproject-90fps, use min-100fps-reproject-anything-higher. Then frame rate amplification can be hybridized at different frame rates for different objects on the screen -- that becomes visually comfortable once every single object is running at triple-digit frame rates!

Technically, reprojection could in theory be created as a parallel-layer API running between the game and the graphics drivers, much like how VR APIs do it. Except it's overlaid on top of non-VR games.

One major problem occurs when you're doing this on strobed displays -- sudden double/multi-image effects -- and requires GPU motion blur effect (mandatory) to fix the image duplicating (akin to CRT 30fps at 60Hz). However, this isn't as big a problem for sample-and-hold displays.




. . . . . . . . . . . . . .


From using the LTT resolution/bandwidth calculator, these UW resolutions look like they'd be nice to run on a 16:9 4k or 8k screen if they upped the Hz on OLED TVs, even with 4k upscaled to 8k and sent as an 8k signal as far as the display is concerned. They fit within HDMI 2.1's bandwidth, at least when using a DSC 3:1 compression ratio. DP 2.0 would be nice, but realistically I'd probably stick with a TV model rather than paying up to triple or more in some cases for a comparable desktop gaming monitor version just to get DP 2.0 someday, and potentially end up suffering AG coating to boot, which would really bother me, especially on an OLED.

8k at 32:10 ultrawide 7680 × 2400 rez @ 200Hz, 10bit signal at 3:1 compression: 41.03 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

8k at 24:10 ultrawide 7680 × 3200 rez @ 150Hz, 10bit signal at 3:1 compression: 40.02 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

4k at 24:10 ultrawide 3840 × 1600 rez @ 500Hz, 10bit signal at 3:1 compression: 40.73 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

. . .

8k at 32:10 ultrawide rez 345Hz, 10bit signal at 3:1 compression: 76.38 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

8k at 24:10 ultrawide rez 270Hz, 10bit signal at 3:1 compression: 76.56 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

4k at 24:10 ultrawide rez 780Hz, 10bit signal at 3:1 compression: 76.32 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)


Yep. I'd be happy with those kinds of ultrawide resolutions that, say, a large 55-inch 1000R 8k screen would be capable of, though. Theoretically we might at some point get some gaming upscaling tech on the display end of the pipeline to avoid the bottleneck, sending 4k over the ports and cable to be (AI?) upscaled to 8k on the display itself. At least I hope so.

Here is the full scroll from a few days ago:


Useful reference:

https://linustechtips.com/topic/729...ssion=dsc2.0x&calculations=show&formulas=show

Max. Data Rate Reference Table:
  • DisplayPort 2.0 77.37 Gbit/s
  • DisplayPort 1.3–1.4 25.92 Gbit/s
  • DisplayPort 1.2 17.28 Gbit/s
  • DisplayPort 1.0–1.1 8.64 Gbit/s
  • HDMI 2.1 41.92 Gbit/s
  • HDMI 2.0 14.40 Gbit/s
  • HDMI 1.3–1.4 8.16 Gbit/s
  • HDMI 1.0–1.2 3.96 Gbit/s
  • DVI 7.92 Gbit/s
  • Thunderbolt 3 34.56 Gbit/s
  • Thunderbolt 2 17.28 Gbit/s
  • Thunderbolt 8.64 Gbit/s

According to that calculator, I calculated a bunch of these below to get a general idea of limitations:


. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
4k 240hz 10 bit DSC 2:1 = 40.61 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

4k 240hz 12 bit DSC 2:1 = 48.75 Gbit/s

. . . .

4k 240hz 12 bit DSC 3:1 = 32.49 Gbit/s

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

4k 400Hz 10bit DSC 3:1 = 41.52 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

Video Format:
3840 × 2160 (16∶9 ratio) at 400 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)
CVT-R2 timing format

Data Rate Required: 41.52 Gbit/s


. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

4k 470Hz 10bit DSC 2:1 = 76.82 Gbit/s (DP 2.0 ~> 77.37 Gbit/s) (HDMI 2.1 ~> 41.92 Gbit/s)

Video Format:
3840 × 2160 (16∶9 ratio) at 470 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (2.0× ratio)
CVT-R2 timing format

Data Rate Required: 76.16 Gbit/s

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

4k 640Hz 10bit DSC 3:1 = 76.82 Gbit/s (DP 2.0 ~> 77.37 Gbit/s) (HDMI 2.1 ~> 41.92 Gbit/s)

Data Rate Required: 76.82 Gbit/s

Video Format:
3840 × 2160 (16∶9 ratio) at 640 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)
CVT-R2 timing format

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

8K

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

8k 240Hz 10bit signal (even upscaled from 4k on the pc end to 8k) at 3:1 compression: 107.20 Gbit/s (out of range of even dp 2.0)

Video Format:
7680 × 5120 (3∶2 ratio) at 240 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)
CVT-R2 timing format

Data Rate Required: 107.20 Gbit/s 8k 240 Hz (DP 2.0 ~> 77.37 Gbit/s)
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

8k 165Hz 10bit signal (even upscaled from 4k on the pc end to 8k) at 3:1 compression: 70.95 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

Video Format:
7680 × 5120 (3∶2 ratio) at 165 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)

Data Rate Required: 70.95 Gbit/s 8k 165 Hz (DP 2.0 ~> 77.37 Gbit/s)


. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

8k 175Hz 10bit signal (even upscaled from 4k on the pc end to 8k) at 3:1 compression: 75.63 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

Video Format:
7680 × 5120 (3∶2 ratio) at 175 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)
CVT-R2 timing format


Data Rate Required: 75.63 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

8k 100Hz 10bit signal (even upscaled from 4k on the pc end to 8k) at 3:1 compression: 41.65 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

Video Format:
7680 × 5120 (3∶2 ratio) at 100 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)
CVT-R2 timing format

Data Rate Required: 41.65 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)



. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Ultrawide

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .



8k at 32:10 ultrawide rez 200Hz, 10bit signal at 3:1 compression: 41.03 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

Video Format:
7680 × 2400 (16∶5 ratio) at 200 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)
CVT-R2 timing format

Data Rate Required: 41.03 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)


. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

8k at 32:10 ultrawide rez 345Hz, 10bit signal at 3:1 compression: 76.38 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

Data Rate Required: 76.38 Gbit/s

Video Format:
7680 × 2400 (16∶5 ratio) at 345 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)
CVT-R2 timing format


Data Rate Required: 76.38 Gbit/s


. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

8k at 24:10 ultrawide rez 150Hz, 10bit signal at 3:1 compression: 40.02 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

Video Format:
7680 × 3200 (12∶5 ratio) at 150 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)
CVT-R2 timing format

Data Rate Required: 40.02 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)


. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

8k at 24:10 ultrawide rez 270Hz, 10bit signal at 3:1 compression: 76.56 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

Video Format:
7680 × 3200 (12∶5 ratio) at 270 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)
CVT-R2 timing format


Data Rate Required: 76.56 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

4k at 24:10 ultrawide rez 500Hz, 10bit signal at 3:1 compression: 40.73 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

Video Format:
3840 × 1600 (12∶5 ratio) at 500 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)
CVT-R2 timing format


Data Rate Required: 40.73 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

4k at 24:10 ultrawide rez 780Hz, 10bit signal at 3:1 compression: 76.32 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

Video Format:
3840 × 1600 (12∶5 ratio) at 780 Hz
10 bpc (30 bit/px) RGB color
Display Stream Compression (3.0× ratio)
CVT-R2 timing format

Data Rate Required: 76.32 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
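If anyone wants to poke at these combinations without the web calculator, here's a small sketch that inverts the same sort of approximation: given a resolution, bit depth and DSC ratio, find the highest whole refresh rate that fits a link budget. It only approximates CVT-R2 blanking, so it lands within a few Hz of the LTT figures above rather than matching them exactly; the helper names are just illustrative.

```python
def data_rate_gbps(h, v, hz, bpc=10, dsc_ratio=3.0, vblank_us=460, hblank_px=80):
    """Approximate data rate with CVT-R2-style reduced blanking and DSC."""
    v_total = v / (1 - vblank_us * 1e-6 * hz)
    return (h + hblank_px) * v_total * hz * 3 * bpc / dsc_ratio / 1e9

def max_refresh(h, v, budget_gbps, **kwargs):
    """Highest whole-number refresh rate whose data rate stays within the link budget."""
    hz = 1
    while data_rate_gbps(h, v, hz + 1, **kwargs) <= budget_gbps:
        hz += 1
    return hz

HDMI_2_1, DP_2_0 = 41.92, 77.37  # max data rates from the reference table above
for h, v, label in [(7680, 2400, '32:10 "8K-wide"'),
                    (7680, 3200, '24:10 "8K-wide"'),
                    (3840, 1600, '24:10 "4K-wide"')]:
    print(f'{label}: ~{max_refresh(h, v, HDMI_2_1)} Hz on HDMI 2.1, '
          f'~{max_refresh(h, v, DP_2_0)} Hz on DP 2.0')
```

Run as-is it gives numbers in the same neighborhood as the 200/345, 150/270 and 500/780 Hz entries above, which is all it's meant to show: the letterboxed ultrawide modes are port-limited, not panel-limited.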
 
Hot take indeed.

8K would actually be perfect for desktop displays.
  • 8K would be sharp for desktop displays sized above 32". 4K becomes too low res when you go past about 40".
  • You get way more usable scaling levels to size your UI/text the way you like it for your monitor size. Atm, only 125% and 150% are useful settings on 4K displays unless you are using it at TV viewing distances.
  • You can use integer scaling to play at 4K, 1440p or 1080p. Ideally an 8K display would support higher refresh rates at these lower resolutions.
  • 4090 has moved things up to proper 4K 120 fps and with DLSS and frame generation both high resolutions and framerates would be possible.
Where display manufacturers go wrong is by making 8K displays in sizes that make the least sense for them: 32" and 70+ inch TVs. 40-50" would be spot on.



I'd definitely be interested in a 55" 1000R 8k OLED especially if it allowed higher Hz uw resolutions on it. That would be a dream screen for me personally. I'd settle for a 48" one however. Smaller than that wouldn't be as appealing to me.

.....................................

[Attached image: human viewpoint at a 50-60 degree viewing angle on a 55-inch 1000R curved screen]
. . .
 
One issue is that, outside of reading text and some other work-related applications, 8K over 4K seems on the whole to be a net negative as a format.

Warner Bros. tested this with the best 8K content that could be generated at the time (a super-high-resolution scan of the 70mm film Dunkirk, an animated movie re-rendered in native 8K, native 8K RED digital footage, and natively shot HDR clips as well):

https://www.techhive.com/article/578376/8k-vs-4k-tvs-most-consumers-cannot-tell-the-difference.html
Unlike tests that compare different displays that differ in more than just resolution, this one was well done. It used the exact same TV, a simple upscale (pixel doubling) of the 4K version to 8K, an 88-inch display, and source material better than anything consumers will ever get: giant uncompressed files played at 3 GB/s (about 2,500 times the Disney+ 4K UHD bitrate).

They mostly used people with 20/20 or better vision, some sitting as close as 5 feet to the 88-inch screen showing an uncompressed signal, to give 8K every chance in the world. With the regular (even very good) compression that will actually exist, realistic seating distances, and people with realistic vision, 8K would end up either worse than 4K or a pure marketing gimmick.

Ah the Warner test! It was a good test, but IMO it's not definitive.

First, they didn't do a true doubling of pixels for the 4K content -- they actually used a Nuke cubic filter, which has a reasonable amount of edge smoothing (but no sharpening). You can see examples here: https://learn.foundry.com/nuke/cont...sforming_elements/filtering_algorithm_2d.html . So in some sense, the experiment compares 8K native to 4K upscaled to 8K, not 4K native (though getting 4K native with the same TV size and characteristics is probably unrealistic as well).

Second, the quality of the source material really matters at 8K. So they used: 8K scans of 70mm film (Dunkirk), and some footage filmed on an 8K Red camera (The Tick and a nature clip). But the camera itself doesn't guarantee good quality and sharpness. What lenses did they use? Can the lenses even resolve 8K resolution? Unclear. Was there a lot of cropping done in post-processing? Probably for some clips!

I will say that a lot of the 8K content on Youtube is not up to snuff. I currently use a 65" 8K TV sitting about 4 ft away (this is roughly equivalent to viewing a 24" 4K monitor from 3 ft away). At least for my eyes, I can definitely tell the difference between 4K and 8K... but only if the content is right. I've probably found only a handful of creators where there's a material resolution difference between 4K and 8K. The timelapses from Timestorm Films and Morten Rustad are amazing (I know those guys use really sharp camera lenses), as is the content by Eugene Belsky. The videos uploaded by "8K Videos Ultra HD" are absolute garbage--they probably take 4K or worse footage and do some cheap upscale to claim it's 8K. I bet most of the subscribers of that channel who think 8K is great are viewing it in 4K or on their phones. Even the 8K content from Jacob + Katie Schwarz, which IMO looks really good in 4K, is only so so in 8K (some clips are good, others are not).

And yeah I run most of my games with a smaller window on my screen.

I would love to have 8K@120hz at 40-50". I currently use a keyboard tray to extend my viewing distance, but I'd love to go back to just a simple keyboard-on-desk setup.
 
Yes, once you hit something like a 55" 1000R 8k screen you could have quads of 4k windows or any tiling you want on that wall of screen. Notably, a 32:9 or 32:10 UW belt wouldn't be limited to 1080px or 1200px tall anymore. You could also run 4k - 6k screen spaces 1:1, etc.

I think the best case would be if they started integrating some kind of AI upscaling tech on the screens themselves in order to bypass the port and cable bandwidth. Then you could send (perhaps more advanced frame amplification tech amplified) 4k or 4k based uw resolutions in games and upscale them on the screen end of the equation.

As for 8k fidelity, we are already forced to use anti-aliasing and subpixel text rendering to compensate for aggressive pixelization on screens as it is. *Aggressive* anti-aliasing (at a performance hit) and heavily massaged and/or alternate forms of text rendering only start to compensate enough around 60 PPD or so. Notably, there is typically no AA outside of text rendering on the 2D desktop for desktop graphics and imagery (outside of the viewports of some select CGI/digital-imagery authoring apps), so the 2D desktop is even worse off, as it's not compensated outside of text. We can definitely benefit from higher PPD.

For example, the Samsung Ark is around 61-62 PPD when sitting at the focal point of the curve, which is still pretty badly pixelated but can be compensated for with aggressive AA and heavily massaged subpixel text rendering (which are really fogged-edge hacks). The 2D desktop remains uncompensated. We won't be able to get away without AA and text-rendering tricks for the most part until we get to around 160 PPD. Not that we shouldn't use them to some degree in the meantime; I'm just saying we really have room for, and can benefit a lot from, higher PPD.

A 55" 1000R curved 8k screen would have a 1000R, 1000mm, ~ 40" focal point. Sitting at that focal point you'd get around 122PPD instead of ~ 61 PPD. That would be a huge benefit in overall display quality.
 