Where are the 8K Monitors?

elvn Have to ask, since you are obviously interested in 8K, have you ever actually owned one? At least I have bought, owned and returned a few but it seems like few actually have (for good reasons) :) At least in the EU we have no restocking fees which makes it kind of risk free to buy and try, not sure if that is always a thing in the US.

With the QN900C being on sale again and me being doubtful if the QN900D actually comes with some real advantages, I am finding myself considering trying out the QN900C again. I am obviously sick and need help :D
 

No I haven't. I have a PLP setup: a 43" 4k Samsung VA + 48" CX + 43" 4k Samsung VA, so 3x 4k in some configuration with bezels, managed with window-management software (a Stream Deck with its own window-management plugins, plus DisplayFusion window-management scripts hotkeyed to the Stream Deck) rather than the 2x2 quad of 4k resolutions that an 8k screen gives you.

I break the side screens into three tiers, or two "shelves": a bottom shelf 66% high, more aligned to the central OLED, plus one shelf at the top. Either way, the side screens' bottom two shelves hold either two different windows or one big window of the same height. I put lower-priority stuff up on the top shelf, but I can shuffle everything around, teleporting windows anywhere easily with my Stream Deck's buttons. I never really have to drag or resize windows anymore. It also has a saved window-positions profile, so with a single button I can move all of my most-used apps back to their home positions and sizes at any time, no matter how much teleporting around I've done.

[Image: Stream Deck main menu with DisplayFusion window-management function buttons]



I'm interested in the 900D's AI chip for the long haul and for all usage scenarios of the screen if I were to get one, based heavily on reviews of its real-world performance, obviously - but it might be a long wait for it to hit the right price point, plus I have to bank ~$1500 USD for rig component upgrades at some point, and later probably a 5090 (which will be a big price) if all goes well. It depends heavily on reviews, weighing it vs. my current type of PLP setup with an OLED, and the timing of all three of those factors.

Really, I'd love a DP 2.1, 120Hz, 8k, 55"-65", 1000R-curvature PHOLED with MLA that could do 240Hz at 4k, and also run 5k, 4k+4k super-ultrawide, etc. as unscaled 1:1 resolutions - but I don't see that happening.

Otherwise, if the 900D doesn't pan out, I'd keep an eye open for large 4k gaming TVs ever getting 240Hz native, to swap out my 48CX's 120Hz and keep using the PLP array.

. . . .
 
Thanks - the 1st vid has a little in it, but most of them so far are just people "visiting" the 900D at CES or at some Samsung demo location. Very few details other than impressions. Hopefully more people will get their hands on one in their own studio or house in the coming weeks, and then legit review sites like RTings etc.
 
. .


https://www.techradar.com/televisions/samsung-qn900d-review

. .

https://www.pcmag.com/reviews/samsung-85-inch-qn900d-8k-qled-tv


The TV has a 120Hz native refresh rate, but it can boost to 240Hz with a 4K picture and supports variable refresh rate and AMD FreeSync Premium Pro. And, while the Tizen OS has some frustrating elements, its Game Bar feature in game mode is very helpful because it lets you confirm you’re getting the right refresh rate from your console or PC and offers useful game-specific settings like genre-based picture mode. It also helpfully lets you configure features and display an ultrawide 21:9 or 32:9 picture across the screen, zoom in on a mini map, or toggle on-screen crosshairs.
.
More importantly, the QN900D is very fast. Using an HDFury Diva HDMI matrix, we measured an input lag of just 2.0 milliseconds in Game Mode. This is fantastic, and well below the 10ms threshold we use to consider a TV to be good for gaming (though not as fast as the sub-1ms S90C OLED).
. .
 
Nice discussions here, but until content is readily available it's a no-sell for me. As screens get larger the PPI drops, and they can look low-res depending on size.
Quality pixels over quantity is the factor: for example, a $100K Christie 4K cinema projector is amazing - just go to your local theater, no one complains about PPI or fringing etc.
Go to a RED cinema to see 8K video and what a $100K telecine lens kit can do.

View: https://youtu.be/bmIObGZDjew
 
With how good upscaling can be given a good 4k input signal, I'm not sure waiting for content to exist will be as much of an issue as in the past (and obviously for computer text, applications, desktop, vector graphics, 3D games, etc., much content doesn't come with a preset resolution and can take advantage right away). I'm also not sure whether bandwidth/physical-media sizes will, within a realistic lifetime of these TVs, offer native 8k that looks better than 4k upscaled to 8k anyway, even if they wanted to - when would native 8k beat upscaled 4k at ~150 Mbps AV1?

There's also eyes good enough / screen big enough / sitting distance (to take your example, a lot of people still watch 2k movies in cinemas on giant screens measured in feet; in blind 2k vs. 4k tests, people with 20/20 vision have a hard time telling the difference past the very first rows of seats). IMAX digital projection was 2k.
 
Nice discussions here, but until content is readily available it's a no-sell for me. As screens get larger the PPI drops, and they can look low-res depending on size.
Quality pixels over quantity is the factor: for example, a $100K Christie 4K cinema projector is amazing - just go to your local theater, no one complains about PPI or fringing etc.
Go to a RED cinema to see 8K video and what a $100K telecine lens kit can do.

View: https://youtu.be/bmIObGZDjew


Movies are still mostly 24fps and soft-looking. That's not comparing apples to apples with high-detail, high-fps/Hz PC gaming.

In a theater, the THX and other standards' optimal viewing is still a 45-50 deg viewing angle, so the pixels are smaller to your perspective than you're suggesting, even though the screens are large.

No matter what size the screen, at the same viewing angle the same resolution will look like essentially the same pixel density (PPD, pixels per degree) to your perspective. A lot of the discussion in this thread went over that on multiple occasions.


[Image: field-of-view comparison diagram showing the human binocular field of vision]



https://qasimk.io/screen-ppd/

For reference:

At the human central viewing angle of 60 to 50 degrees, every 8k screen of any size gets around 127 to 154 PPD.

At the human central viewing angle of 60 to 50 degrees, every 4k screen of any size gets around 64 to 77 PPD.

At the human central viewing angle of 60 to 50 degrees, every 2560x1440 screen of any size gets only 43 to 51 PPD.

At the human central viewing angle of 60 to 50 degrees, every 1920x1080 screen of any size gets only 32 to 39 PPD.

=================================



The PC command-center style setup, where a larger gaming monitor is set back a few feet on its own stand (or wall mount, etc.), fills your central 60-50 degree viewing angle just like a monitor on a desk would.

For the smallest of the 900D screens available, that would be the 65" 8k screen.

You can get 120 PPD on a 65" 8k at a 64 degree viewing angle, which is only a few degrees into your peripheral vision on each side. That viewing angle requires a 45 inch viewing distance, which means leaving around a 21" gap between the screen and the back of a 24" deep desk - maybe even less, depending on where your head ends up relative to the edge of the desk you're sitting at (subtract from that gap if your head sits farther back from the desk's near edge).

65" 8k TV at a 60 deg viewing angle = ~129 PPD

65" 8k TV at a 50 deg viewing angle = ~154 PPD


PPI is not a descriptive enough way to measure how we see pixel density. PPD is pixels per degree - what you could consider the perceived pixel density at any given distance.

120 or 129 PPD is not low PPD at all. That's twice the perceived pixel density you get from a 4k screen at the same viewing angles.

For example:
32" 4k screen at 23" view distance = 64 deg viewing angle = 60 PPD
65" 4k screen at 45" view distance = 64 deg viewing angle = 60 PPD
65" 8k screen at 45" view distance = 64 deg viewing angle = 120 PPD
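Those example numbers can be reproduced with a little trigonometry. Here's a rough sketch of the kind of math a PPD calculator like the one linked above does (the function names are my own, not the site's):

```python
import math

def viewing_angle_deg(diag_in, distance_in, aspect=(16, 9)):
    """Horizontal viewing angle subtended by a flat screen at a given distance."""
    w, h = aspect
    width = diag_in * w / math.hypot(w, h)  # horizontal width from the diagonal
    return math.degrees(2 * math.atan2(width / 2, distance_in))

def ppd(h_pixels, diag_in, distance_in, aspect=(16, 9)):
    """Average pixels per degree across the horizontal viewing angle."""
    return h_pixels / viewing_angle_deg(diag_in, distance_in, aspect)

# 65" 4k vs 65" 8k at a 45" viewing distance (~64 deg angle):
print(ppd(3840, 65, 45))  # roughly 60 PPD, matching the examples above
print(ppd(7680, 65, 45))  # roughly 120 PPD
```

Note this is the average over the whole width; per-pixel PPD is a bit higher at screen center and lower at the edges, which is why different calculators can disagree by a PPD or two.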

. . .

Content wise:

- PC desktop: you get 4x the desktop/app real estate of a 4k desktop, without suffering the bezels of multiple monitors in your central field of view.
- PC gaming: in this case you'd probably run 4k and let the TV scale it so you could get higher Hz (4k 240Hz, supposedly) - but you could run some easy-to-render games at native 8k 120Hz, or run easier-to-render games windowed on the desktop while you have other stuff open.
- Movies and shows: from most reports so far, where people view the screen at nearer PC-gaming distances rather than 30-35 deg living-room distances, the 3rd-gen AI upscaler chip's 4k content upscaled to 8k looks more detailed than native 4k. It also reportedly upscales 1080p media nicely.


If you were looking at one of these just for watching movies from far away at a 30-35 deg viewing angle, then no, it wouldn't be worth it, since 4k already gets twice the PPD at those distances compared to 4k at the 60-50 deg central human viewing angle you'd typically use for a PC.
 
Yes, this could be the next best entertainment/gaming display at the right distance, on a stand behind the desk if you don't want to wall mount it. Also, if the 5090 can push double the horsepower of a 4090, 8K high-refresh gaming could be incredible.
This is properly exciting 😃 I can't wait for it to be released and for prices to decrease a bit. It would also be nice if they made somewhat smaller sizes, like 50", for extremely high resolution.
 

"it would also be nice if they made a bit smaller sizes like 50 for extreme high resolution" - i.e., 8k resolution at extremely high PPD.

I knew what you meant, but 65" seems to be the standard smallest, since home theater people complain that 8k isn't meaningful at smaller screen sizes - they usually sit almost twice as far from their screens.

The size is an issue if you can't set the screen back 18" - 21" behind a 24" deep desk, or refuse to based on the room environment, other factors, personal choice, etc. The PPD isn't a strike against the larger 65" size if you have the space, since at, for example, 45" from your eyeballs at a 64 deg viewing angle it would still get a whopping 120 PPD. A 55" flat 8k could get the same 64 deg / 120 PPD a little closer, at a 38 inch viewing distance - so 64 deg / 120 PPD at 3.75 feet vs. 3.16 feet between the two screens, and more distance if you want a 60 to 50 degree viewing angle, though 64 deg isn't a bad fit as a minimum. It would be very cool if they released a 55" 8k version of the 1000R-curvature Ark (or even a 65" 1000R one). That would make it even better, since a 1000R screen keeps all pixels on-axis to you: 1000(R)adius = a 1,000 mm radius = ~39.4" (~3.3 feet) from your eyes to every point on the screen when you sit at the center of curvature.
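Those viewing distances fall out of simple trigonometry. A minimal sketch (my own helper, assuming flat 16:9 screens):

```python
import math

def distance_for_angle(diag_in, target_deg, aspect=(16, 9)):
    """Viewing distance (inches) that makes a flat screen's width span target_deg."""
    w, h = aspect
    width = diag_in * w / math.hypot(w, h)  # horizontal width from the diagonal
    return (width / 2) / math.tan(math.radians(target_deg / 2))

print(round(distance_for_angle(65, 64)))  # 45 (inches) for a 65" screen at 64 deg
print(round(distance_for_angle(55, 64)))  # 38 (inches) for a 55" screen
```

For a curved screen sat at the center of curvature, the distance is just the radius (1000 mm for 1000R), which is why the Ark geometry works out so neatly.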

I'm definitely keeping my eye on this one for later, closer to the 5000-series release. The Ark's $3,600 release price dropped considerably after 8-12 months, and so have some of their flagship ultrawides historically. Samsung charges a high early-adopter fee, especially in form factors and features where they have no real competition.


. . .

From the responses in that Tech with KG thread, KG is saying that it is true 4k 240Hz, but that 8k can only do 60Hz. I haven't seen the true 4k 240Hz verified elsewhere yet, though. If it's true, I'm assuming the 8k-60Hz-rather-than-120Hz limitation is down to the lanes on the GPU ports; Nvidia 4000-series GPUs similarly can't do multiple 4k 240Hz screens. So perhaps that might change in the 5000 series.


from nvidia's 4090 page's spec sheet:

1 - Up to 4K 12-bit HDR at 240Hz with DP 1.4a + DSC or HDMI 2.1a + DSC. Up to 8K 12-bit HDR at 60Hz with DP 1.4a + DSC or HDMI 2.1a + DSC.

2 - As specified in HDMI 2.1a: up to 4K 240Hz or 8K 60Hz with DSC, Gaming VRR, HDR

. . . .

From some of my other replies:

"NVIDIA's specs for the GeForce RTX 4090 list the maximum capabilities as "4 independent displays at 4K 120Hz using DP or HDMI, 2 independent displays at 4K 240Hz or 8K 60Hz with DSC using DP or HDMI." Could support be added as part of a driver update? That remains to be seen."


Reddit user reply from a g95nc thread:
"I want to clarify how DSC works, since I have yet to see anyone actually understand what is going on.
DSC uses display pipelines within the GPU silicon itself to compress the image down. Ever notice how one or more display output ports get disabled when using DSC at X resolution and Y frequency? That is because the GPU is stealing those display lanes to process and compress the image.
So what does this mean? It means that if the configuration, in silicon, does not allow enough display output pipelines to be used by a single output port, THAT is where the bottleneck occurs."

"Nvidia's own spec notes that only 8k 60Hz is feasible using DSC over HDMI 2.1 on their cards, by disabling at least one port (it will just disable the one that isn't plugged in), so it's clear all the display pipelines are interconnected for use together. I suppose it may be possible to forcibly disable 2 ports to achieve a high enough internal bandwidth to deal with 240Hz at half of 8k resolution, but again, that is also determined by the slicing and compression capabilities."
 
. . .


https://www.whathifi.com/reviews/samsung-qe75qn900d

According to that review it's 4k 144Hz, so still no confirmation of true 4k 240Hz other than the YouTuber's take on it.

it’s also got plenty going on for gamers. All four of its HDMIs are equipped to handle the latest cutting-edge features of 4K/120Hz, 144Hz refresh rates if your PC supports them, VRR (including AMD Freesync Premium Pro) and ALLM switching. Plus, of course, 8K games at up to 60Hz refresh rates should you be lucky enough to have a PC rig capable of rendering so many pixels.

This isn’t just true with video sources, either. Good 4K-level gaming graphics also look dazzlingly great after passing through the QN900D’s brainbox, enjoying new levels of sharpness, detail and depth that breathe new life into old favourites. The way Samsung’s set does this without making the resulting pictures look noisy or full of processing side effects genuinely makes you feel more immersed in what you’re playing.

The QN900D manages to achieve this upscaling feat, too, while only taking 11ms when using its fastest game mode setting to render graphics received at its inputs. We can’t even start to imagine the level of processing power required to calculate the look of so many pixels in so little time.


===================================


. . .
 
Nice discussions here, but until content is readily available it's a no-sell for me. As screens get larger the PPI drops, and they can look low-res depending on size.
Quality pixels over quantity is the factor: for example, a $100K Christie 4K cinema projector is amazing - just go to your local theater, no one complains about PPI or fringing etc.
Go to a RED cinema to see 8K video and what a $100K telecine lens kit can do.

View: https://youtu.be/bmIObGZDjew

Content is ready as soon as you plug it into a PC. At least for me, that is what is driving my interest in 8K.
 
. . .


https://www.whathifi.com/reviews/samsung-qe75qn900d

According to that review it's 4k 144Hz, so still no confirmation of true 4k 240Hz other than the YouTuber's take on it.

Can WhatHiFi really be trusted? I have the feeling that for years they have been giving out glowing reviews to anything from the day it was released - it seems more like a sponsored video at LTT. Or am I wrong?
 
I am still a bit "worried" about these "AI is fantastic" updates. As I recall, exactly the same was said the last couple of years, and it basically turned out to be the same TV as the year before with only marginal changes. If these were first-gen products, it would seem reasonable that the second gen might improve a lot, but for something like fourth gen to fifth gen there needs to be an actual breakthrough for things to change for the better (unless the price doubles or something like that).

With 8K still being such a small market, I also can't really see a massive amount of money being spent by Samsung's R&D. I mean, the second-best 8K TV is probably last gen's Samsung 8K, so they are basically competing with themselves. Like how LG's OLEDs kind of stagnated until QD-OLED, and then all of a sudden they just happened to have MLA available.
 

It's pretty wild how QD-OLED has pushed LG to make much better TVs in a short amount of time, at least when it comes to the G series. We went from almost complete stagnation to the G2 adding a heatsink and pushing brightness up, then the G3 adding MLA, and now the G4 with MLA+, all within 2 years. Next year we may even see PHOLED with MLA++.
 
As somewhat of a cynic, I am pretty convinced that we could have had MLA a few years earlier if Samsung had been a few years earlier with their OLEDs.
 

I agree. In fact, I'd say that's the only reason we still don't have MLA on the 42" and 48" WOLEDs: there isn't a 42/48" QD-OLED. If Samsung were to release a 42/48" QD-OLED, LG would immediately put MLA on those sizes and push the brightness up.
 
Agree with most of those comments, except the one about the AI. While I can understand your reservations/suspicions/concerns - cynicism, however you want to describe it - Samsung, and for the most part Nvidia, "compete" against their own previous gens all the time and put a lot of money into making better tech generations (and certain companies specifically are investing a lot into AI). Kind of like how automobiles come out with a newer model every year. I do agree that competition pushes them forward more, but I don't agree that they aren't going to try to push higher features in their more expensive flagship showpieces.

I don't necessarily trust WhatHiFi; I just posted that because there is still some discrepancy in reports of whether native 4k 240Hz is a thing or not. Looking forward to a more fleshed-out review or video where it's really shown as fact (or not).


By the way, according to the small handful of current first-look, hands-on-at-a-Samsung-facility "reviews", the 900D is pretty glossy "and suffers from reflections" - a complaint from some of the early-look reviewers in the showroom environment, but imo that's great. I prefer glossy 100%, so that's good news to me, especially in the controlled lighting environments I'd use for HDR gaming and media.

. .
 
Content is ready as soon as you plug it into a PC. At least for me, that is what is driving my interest in 8K.
Exactly what the RED ecosystem is for PC editing and video production, and with Nikon acquiring RED they should open up the rental market, lowering costs. Canon has the rental market for DSLR lenses.
 
. .


View: https://www.youtube.com/watch?v=U0CuaaNkJ88

It's too bad that none of the people invited to that Samsung location (who have posted videos so far) brought along a Legion Pro or another laptop with a 3000/4000-series Nvidia GPU to test the 4k 240Hz. They could probably even test it with a Legion Go handheld + an eGPU and an easy-to-render game.
 
Agree with most of those comments, except the one about the AI. While I can understand your reservations/suspicions/concerns - cynicism, however you want to describe it - Samsung, and for the most part Nvidia, "compete" against their own previous gens all the time and put a lot of money into making better tech generations (and certain companies specifically are investing a lot into AI). Kind of like how automobiles come out with a newer model every year. I do agree that competition pushes them forward more, but I don't agree that they aren't going to try to push higher features in their more expensive flagship showpieces.

I don't necessarily trust WhatHiFi; I just posted that because there is still some discrepancy in reports of whether native 4k 240Hz is a thing or not. Looking forward to a more fleshed-out review or video where it's really shown as fact (or not).


By the way, according to the small handful of current first-look, hands-on-at-a-Samsung-facility "reviews", the 900D is pretty glossy "and suffers from reflections" - a complaint from some of the early-look reviewers in the showroom environment, but imo that's great. I prefer glossy 100%, so that's good news to me, especially in the controlled lighting environments I'd use for HDR gaming and media.

. .
I am not necessarily saying that AI is bad, more that there might not be any reason to assume it has become that much better compared to the last gen (which was also marketed as having much better AI, but turned out to be basically the same). I would imagine that the recent developments around ChatGPT etc. might not immediately translate to built-in chipsets, as hardware is usually a slower process than software, and unlike ChatGPT etc. it is probably mostly executed on the device. I could be wrong here, but then we need some news supporting anything but an evolution.

The 900C was also what I would describe as glossy, albeit with a MUCH better coating/filter than the 57", despite lacking the curve and likewise using a VA panel. Even being flat, its viewing angles were better, as I recall.

All in all, my impression until proven wrong is that the 900D is basically the 900C with a newer chipset.
 
https://www.expertreviews.co.uk/tvs/samsung-qn900d-review

Sadly, this is another "review" that has me a bit concerned, as some of it reads right out of Samsung's marketing. Also, claiming the TV has many more zones than last year's, when it in fact has exactly the same number of zones as last year's 75", does not exactly inspire confidence.
 

Yeah, these are all at-a-glance type "looks" at it. We might have to wait a little while for someone to actually get one in-house and at least test PC gaming on it more thoroughly. They are up for order at their big-markup release price at Best Buy and on Samsung's site, so it'll happen in a few weeks, hopefully. I wouldn't be ordering for quite a while yet, but I'm very curious about it ahead of time.

. .
 
The 900D has a 3rd-gen AI chip (the NQ8 AI Gen3 processor) with a ton more processing power and speed. The 800A, 900A, and even the 800D do not have the new flagship AI chip.

The main improvements, at least from the marketing so far: its AI upscaling of 4k is way better, faster, and cleaner, and it also upscales 1080p well, which previous 8k sets didn't. The picture quality is reportedly better (potentially better FALD handling), with the chip's 4k-to-8k AI upscaling very noticeably making 4k content look more detailed than on previous generations. There's better motion handling/clarity in media (e.g. fast-moving sports balls) too, and it has some AI depth feature that supposedly makes things look a little more "3D" if you want to enable that. It's also supposedly low-input-lag gaming at 240Hz - still only 11ms when 4k is upscaled by the chip to 8k, which is decently low for gaming and remarkably low considering the processing involved. It might have half that input lag at higher fps/Hz, though; I don't think the reviewers all used 240fps/Hz, and some don't even give 120fps/Hz input-lag numbers.

. .


The FALD backlight zone count on the 800A and 900A is 1,344, and the 900D reportedly has 1,920 zones. We'll have to wait for detailed reviews to see the performance of the FALD algorithm, and the blooming, which might also be improved by the new chip. The 900A had noticeably slower FALD zone transitions in game mode than out of it, with a larger number of zones affected (less detailed lighting management) during gaming, so I'm curious to see whether that has changed or improved.

This is an excerpt from the Backlight: Media section of RTings' 900A review:

Update 02/11/2022: After updating the firmware version to 1903, the local dimming received a score increase from 6 to 7. It slightly improves overall performance from the previous firmware version, but it's not a significant difference. There's less blooming, and it's less distracting, but there's still black crush, and the zone transition speed is still okay. The videos are also updated.

Update 08/02/2021: We retested the local dimming feature with the latest firmware, version 1511 and updated the local dimming videos. Local dimming is slightly better than before, as the algorithms seem to spread bright objects across more dimming zones, reducing the appearance of blooming slightly. Overall zone performance hasn't changed, though.

The Samsung QN900A has decent local dimming once you update it to its latest firmware. It's better than the Samsung QN800A 8k QLED but still produces some blooming around anything bright. It's even distracting with subtitles, where the blooming bleeds into the black bars. There's also some black crush, with smaller stars in our star field demo being crushed out, but it's not too bad overall. Zone transitions are a bit better than the QN800A but are still quite noticeable, depending on the scene. Like the QN800A, it uses Mini LED backlighting with 1344 dimming zones.

We tested local dimming on 'High'.

This is an excerpt from the Backlight: gaming section from the 900A review from RTings:
Update 02/11/2022: After updating the firmware version to 1903, the local dimming received a score increase from 6 to 7. It slightly improves overall performance from the previous firmware version, but it's not a significant difference. There's less blooming, and it's less distracting, but there's still black crush, and the zone transition speed is still okay. The videos are also updated.

Update 08/02/2021: We retested the local dimming feature with the latest firmware, version 1511 and updated the local dimming videos. There's a slight improvement in local dimming performance, as there's a bit less noticeable blooming. It appears that the algorithms are spreading blooming out across more zones than before.

In Game Mode, the local dimming performs about the same as it does with Game Mode disabled. The biggest difference is that the blooming is a bit less aggressive with actual content, but the zone transitions are slower. The local dimming makes everything a bit brighter overall than it does outside of Game Mode, so there's a bit less black crush, but you may still lose some small highlights. The differences, however, are minimal, which is why they have the same score.





. . .

The 8K mode on the 900D can only do 60Hz right now, but I suspect that might be a limitation of the current gen of GPUs, since the panel itself is 120Hz at 8K / 240Hz at 4K. So maybe if GPUs had enough bandwidth assigned to a single HDMI port, it could do 8K 120Hz using DSC. The 57" G95NC can do 120Hz 7680x2160 with DSC, or 240Hz off of a DP 2.1 AMD GPU - - but again, that's probably because of the way Nvidia allotted the ports on the GPU. HDMI 2.1 should be able to go higher with DSC if a card were designed for it.

From the LTT calculator: it's 42.58 vs 41.92 Gbps at 10-bit RGB (4:4:4), so it could probably do 8K 120Hz at 8-bit color, or if they did 3.25x DSC compression on 10-bit or something. Alternatively, it could just run 115Hz and fit HDMI 2.1 with DSC 3.0x, RGB/4:4:4, 10-bit, or run 99Hz at that with DSC 2.5x. Not that native 8K gaming will get high fps. It would be cool if it could do 7680x2160 super-ultrawide rez at relatively high Hz for gaming too.

firefox_hChB6GTDAc.png
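If anyone wants to sanity-check numbers like these without the calculator, here's a rough Python sketch. It counts active pixels only and ignores blanking intervals, which is why it lands a bit under the LTT figures above; 42.67 Gbps is HDMI 2.1's effective max data rate (48 Gbps raw FRL with 16b/18b encoding).

```python
# Back-of-envelope link bandwidth check for 8K 120Hz, 10-bit RGB with DSC.
# Active pixels only -- real signals include blanking, so treat these as
# lower bounds; the LTT calculator accounts for actual timings.
HDMI_2_1_GBPS = 48 * 16 / 18  # 48 Gbps raw FRL, 16b/18b encoding ~= 42.67 effective

def data_rate_gbps(w, h, hz, bpc, dsc_ratio=1.0):
    bits_per_pixel = bpc * 3  # RGB / 4:4:4
    return w * h * hz * bits_per_pixel / dsc_ratio / 1e9

rate = data_rate_gbps(7680, 4320, 120, bpc=10, dsc_ratio=3.0)
print(f"{rate:.2f} Gbps needed vs {HDMI_2_1_GBPS:.2f} Gbps available")
# 39.81 Gbps needed vs 42.67 Gbps available
print("fits" if rate <= HDMI_2_1_GBPS else "does not fit")
```

Swap in dsc_ratio=1.0 to see the uncompressed rate (~119 Gbps), which is why DSC is mandatory here.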


. . . .

RTings comparison of the 800a vs 900a. They haven't reviewed the 900D yet.

https://www.rtings.com/tv/tools/com...0a-8k-qled/21550/21554?usage=1&threshold=0.10

. . .

Edit:

I forgot to mention that the 900D is also supposed to draw 70W - 90W, since they are trying to stay within the EU power regulations. It slipped my mind because it's not a feature that's that important to me personally as a selling point.
 
Such a shame TVs are still so much further ahead of monitors. Finally get QDOLED/WOLED monitors only to have 1/3 of the HDR brightness of what we all know the panel tech itself could do.

If they made this even 10” smaller…8k res, 2400nits in 10% window. I’d be all over it. There is still this gap between the best monitors and the best TVs. Make the monitors more like the TVs or make the TVs more like monitors. Pick your poison.
 
Such a shame TVs are still so much further ahead of monitors. Finally get QDOLED/WOLED monitors only to have 1/3 of the HDR brightness of what we all know the panel tech itself could do.

If they made this even 10” smaller…8k res, 2400nits in 10% window. I’d be all over it. There is still this gap between the best monitors and the best TVs. Make the monitors more like the TVs or make the TVs more like monitors. Pick your poison.

I agree, but why not both. 🍻

.

tenor.gif


. . .

I dedicated a room to being a PC room, so I don't mind decoupling a larger gaming TV from my desk. A 65" screen would technically break my former peripherals-on-a-desk usage record of a single 48" 4K, but my 48" OLED has a 43" 4K VA in portrait on each side of it, so overall that array is already a huge screen area. I already view that array from around 40" away at my island desk on caster wheels, so another 5" back to get a 64-degree viewing angle on a 65" wouldn't be much of a problem. Around 4 feet including the depth of the desk means the screen might sit after about a 2' gap behind the desk.
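For anyone wanting to check the viewing-angle math, it's just trigonometry on a flat 16:9 screen (curved screens and my exact distances will differ a bit):

```python
import math

# Horizontal viewing angle of a flat 16:9 screen from a given distance.
def viewing_angle_deg(diagonal_in: float, distance_in: float) -> float:
    width = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 width from diagonal
    return math.degrees(2 * math.atan((width / 2) / distance_in))

print(f'65" at 45 in: {viewing_angle_deg(65, 45):.0f} deg')  # roughly 64 deg
print(f'48" at 40 in: {viewing_angle_deg(48, 40):.0f} deg')  # roughly 55 deg
```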

The bigger issue separating gaming TVs from monitors going forward is probably going to be that "monitors" will likely get DisplayPort 2.1 ports with higher bandwidth than current DP 2.1 devices, eventually (obviously requiring a matching port on the GPU to drive it in some GPU generation as well). The spec goes up to 80Gbps, though current DP 2.1 devices are all 54Gbps vs HDMI 2.1's 48Gbps. Until HDMI 2.1 hit, port bandwidth (and so peak rez+Hz capability, G-sync/VRR capability, etc.) was a major dividing factor between TVs and monitors (other than size, which I can manage, and, if you go back far enough, TVs being stuck at 1080p). Someday the HDMI spec will increase, but that might be years from now.

I suspect that if/when consumer-priced micro-LED screens come out, they will be larger-than-desktop-monitor TVs for a while too. The larger the TV, the larger the emitters/micro-LEDs can be, which is probably why most of the micro-LEDs showcased at CES so far have been over 100" screens. Samsung has made a 110", a 99", and an 88" one though, and now they have a 76" one, which is the smallest to date. It probably costs tens of thousands of dollars though. The only one for sale on their site is still the 110" one, for $150,000 USD.
🌪️💸💸💸


8k res, 2400nits in 10% window.

Here is the HDR nits of the 900A from the 900A RTings Review. I haven't seen reliable info about the 900D's HDR ranges yet.

firefox_tNs4iy3NSL.png
 
Not sure why we would want to look at the 900A these days, when the QN900B was much better (and very similar to the QN900C according to RTings etc.). The QN900C currently retails for 50% of the listed price of the QN900D, and I would be surprised if the actual differences turned out to match that. The funny thing about all these raving reviews of Samsung's 8K TVs is that none of the reviewers actually seem to buy them, despite claiming they're the best thing since sliced bread. Of course, the pricing could be a part of that...
 
The one reply was to

xDiVolatilX

because he specifically asked about the 900A and 800A, probably because he's seen them on sale relatively cheap.

You are right though, I should have answered tigger with the 900C's HDR brightness from RTings.

firefox_UNQ6mDJrnT.png



The 900C has 1344 zones; for comparison, I think the 900D has 1920. More zones help, but how well the FALD works also comes down to the implementation . . hardware design, algorithm, and now probably the AI.


. . .

The AI upscaling could be that great for media upscaled to "8K" detail, and especially for motion handling, but also for gaming upscaling. Most first looks are saying Samsung is leveraging this chip to finally make an 8K screen worth it for 4K content, and from early reports the "reviewers" seem to agree. It's also the first 8K to upscale 1080p well, which they are touting. AI processing is the way of the future; Nvidia has said as much. Upgrades in AI will probably be as important as upgrades in Hz and GPU power at times in the future (especially on Nvidia GPUs). If the 900D does native 240Hz 4K, that alone is way better for gaming than the 900C's 144Hz (which was only capable of 120Hz off of Nvidia GPUs), and would be one less trade-off vs the current crop of 4K 240Hz desktop gaming monitors. The AI could also potentially shape the brightness/color output and FALD zone management better.

I'm eagerly waiting on some more detailed reviews where they at least put it through its paces on a powerful PC gaming rig, but I'm in no hurry at that price tag, plus the 5000 series GPUs won't be out until 2025.

Since they always ask a bigger early-adopter price, it's worth waiting it out, kind of like buying a car at the end of the model year, because by then they'll have dropped a lot. It usually doesn't even take Samsung that long to drop the price. Plus some people qualify for a Samsung discount on top of that (though that removes the ability to add a Best Buy warranty by buying from them, even though, ironically, you can pick up the Samsung purchase at Best Buy).

https://pangoly.com/en/browse/monitor/samsung

firefox_OQpHrQLAV1.png


. .

firefox_jyZXsVmDtp.png
 
The 900C has 1344 zones, for comparison I think the 900D has 1920. It really comes down to the implementation . . hardware design, algorithm and now probably the AI as to how well the FALD works though.
Actually, the zone count varies with the size, so AFAIK there is no change there. It's even more surprising then that Stephen Withers claimed there to be a massive difference compared with the QN900C, when he himself was the one who reviewed both in the same size. Maybe I'm paranoid, but there seems to be something about Samsung reviews in general that bothers me, and it is known that they have been caught cheating a few times as well. Not blaming you, of course :D

https://www.expertreviews.co.uk/samsung/1418276/samsung-qn900c-review

https://www.expertreviews.co.uk/tvs/samsung-qn900d-review


My point about the QN900A was more that, from what I know, it is actually a noticeable downgrade from the QN900B, so unless it was really cheap, I would go for the 900B instead.
 
I am quite happy I noticed this thread !!
There is a plethora of very good information here.
------
I am ~3 months from fully pulling the trigger on 8k, but for photo/video, rather than gaming.
Starting with the 65" QN900D.
Followed by one of these ;
https://www.canon.ca/en/product?name=EOS_R5_C
------
GoPros are not 8K yet, but soon; they are currently at 5.3K.
--------
---------
I have a concern.
I saw a QN900 series review (not sure which version) that said the screen was so fragile you can't clean it.
This is a deal breaker for me. My media workshop is like a recording studio: smoking is allowed, and so is soldering smoke, etc. I must be able to wash my TV screen. My current TV is a 4K Sony Bravia; I can just Windex it.

I've read the QN900D user manual, it's vague at best about the screen and or coatings on the screen.

Any thoughts ?
I think I'll go to the place I normally shop at and ask them to Windex the 900D demo model, see what happens, LOL.

(y)
 
I am quite happy I noticed this thread !!
There is a plethora of very good information here.
------
I am ~3 months from fully pulling the trigger on 8k, but for photo/video, rather than gaming.
Starting with the 65" QN900D.
Followed by one of these ;
https://www.canon.ca/en/product?name=EOS_R5_C
------
GoPros are not 8K yet, but soon; they are currently at 5.3K.
--------
---------
I have a concern.
I saw a QN900 series review (not sure which version) that said the screen was so fragile you can't clean it.
This is a deal breaker for me. My media workshop is like a recording studio: smoking is allowed, and so is soldering smoke, etc. I must be able to wash my TV screen. My current TV is a 4K Sony Bravia; I can just Windex it.

I've read the QN900D user manual, it's vague at best about the screen and or coatings on the screen.

Any thoughts ?
I think I'll go to the place I normally shop at and ask them to Windex the 900D demo model, see what happens, LOL.

(y)
This is true for Samsung's QD-OLEDs, but I'm not sure I have ever seen it mentioned for their LCDs, nor did I think of it as a problem when I had one of them at home.
 
Now that 4K 240Hz OLEDs are out, is 8K 120Hz OLED going to be a thing (relatively soon) or is that fluff?
 
If the QN900D is 8K 240Hz, it will be the best display ever. The only issue is you will need a 5090, so the total for both is 8K 🤬🤯😵‍💫😵😫
 
Too bad it only seems to have a 120Hz panel from what I have read. There must be a way to find this out for sure.

They are saying it supposedly can do 8K 120Hz, with 3.x DSC I'd imagine (115Hz fits HDMI 2.1 bandwidth-wise at 10-bit, RGB/4:4:4, using DSC 3.0) . . . except GPUs aren't designed for that yet, much like current 3000/4000 series Nvidia GPUs can't do 240Hz 7680x2160 on the G95NC s-uw even though they should be able to if the outputs/lanes were set up for it. Current GPUs also can't drive more than two 4K 240Hz displays in an array at 240Hz (and I'm not sure they even do two if using three screens overall, i.e. another 60Hz or 120Hz screen besides the two at 240Hz; I'd have to look it up).


from nvidia's 4090 page's spec sheet:
1 - Up to 4K 12-bit HDR at 240Hz with DP 1.4a + DSC or HDMI 2.1a + DSC. Up to 8K 12-bit HDR at 60Hz with DP 1.4a + DSC or HDMI 2.1a + DSC.

2 - As specified in HDMI 2.1a: up to 4K 240Hz or 8K 60Hz with DSC, Gaming VRR, HDR

. . . .

From some of my other replies:
"NVIDIA's specs for the GeForce RTX 4090 list the maximum capabilities as "4 independent displays at 4K 120Hz using DP or HDMI, 2 independent displays at 4K 240Hz or 8K 60Hz with DSC using DP or HDMI." Could support be added as part of a driver update? That remains to be seen."

Reddit user reply from a g95nc thread:
"I want to clarify how DSC works since I have yet to see anyone actually understand what is going on.
DSC uses display pipelines within the GPU silicon itself to compress the image down. Ever notice how one or more display output ports will be disabled when using DSC at X resolution and Y frequency? That is because the GPU is stealing those display lanes to process and compress the image.
So what does this mean? It means that if the configuration, in silicon, does not allow for enough display output pipelines to be used by a single output port, THAT is where the bottleneck occurs."

"Nvidia's own spec notes that only 8K 60Hz is feasible using DSC over HDMI 2.1 on their cards, by disabling at least one port (it will just disable one that isn't plugged in), so it's clear all the display pipelines are interconnected for use together. I suppose it may be possible to forcibly disable 2 ports to achieve a high enough internal bandwidth to deal with 240Hz at 1/2 of 8K resolution, but again, that is also determined by the slicing and compression capabilities."

. . . . . . . .


Like some of the older 4K 60Hz screens that could do 1440p or 1080p at 120Hz, this 8K 120Hz (60Hz GPU limitation?) screen supposedly can do 240Hz 4K and have it upconverted to 8K, presumably by that high-performing 3rd-gen AI upscaling chip.

The input lag in those lightweight reviews was 9ms or 11ms, but I don't think any of them tested 240fpsHz, where it might be less than that. In fact, I'm not sure they even tested input lag at a solid 120fpsHz.


The goal for gaming on this is 240Hz 4K, upscaled better than anything has ever managed by the 3rd-gen AI chip, if the marketing pans out to reality.

8K at 115Hz or 120Hz on a 5090 might be fun to mess with in some simple-to-render games though. 8K is four times the resolution of 4K; it would be like running an array of four 4K screens, so fps would be pitiful in most things, well below 240fpsHz lol. I doubt it'd go past 120fpsHz in any kind of demanding game.
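The four-to-one pixel math is easy to verify:

```python
# 8K is exactly four times the pixels of 4K -- rendering natively at 8K is
# like driving a 2x2 grid of 4K screens every frame, so expect roughly a
# quarter of your 4K fps when GPU-limited.
pixels_4k = 3840 * 2160   # = 8,294,400
pixels_8k = 7680 * 4320   # = 33,177,600
print(pixels_8k // pixels_4k)  # prints 4
```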


. .

Diablo 4, while it has some pretty graphics and FX, is still an isometric game, so it shouldn't be that demanding compared to a high-detail 1st/3rd-person adventure game with large world areas (rather than corridor shooters and small arena games). He got almost 60fps in it, but frame gen wasn't helping for some reason. If frame gen were able to fully double it you'd get 120, and a 5090 would be more powerful. So for some games, maybe viable for 120fpsHz - 165fpsHz, but probably simpler stuff and isometric games.

https://www.techradar.com/features/diablo-4-at-8k-is-a-beastly-good-time-but-16k-breaks-it
The results were pretty damn impressive, with Diablo 4 running at an average of 50.8fps at 8K with all graphical settings set to max. I recorded the frame rate during a series of battles in both closed spaces and in the open world, and the game felt reasonably smooth and solid while looking great.

However, that wasn't the 60fps I consider to be the ideal frame rate for 8K gaming, and I recorded drops to 34fps at points.

Such big drops in frame rate result in a rather unsatisfying gaming experience (as the game essentially runs at half the speed for a moment or two). In fact, I'd argue that it's better to run at a lower frame rate that's more consistent, than a higher frame rate with bigger drops.

It shows that while the RTX 4090 remains an absolute beast, Diablo 4's busy combat, environmental effects (such as snow and lighting), and other graphical touches do stress the mighty GPU at resolutions of 7,680 × 4,320.

In a bid to smooth out the frame rate, I kept all graphical settings to max but turned on DLSS to its 'Quality' setting. This minimizes the amount of upscaling involved, so image quality is prioritized, but performance gains are reduced.

With that turned on, the average framerate jumped to 59.5fps, pretty much hitting the 60fps goal. The minimum frame rate rose to 45.9fps - still a drop, but a less severe one, which made it less noticeable, and led to a much smoother gaming experience.

In a bid to see if I could get a rock-solid 60fps at 8K, I turned on frame generation, a feature exclusive to DLSS 3, which is only supported by Nvidia's current RTX 4000 series of GPUs. This tech inserts AI-generated frames between 'real' frames to help boost framerates.

However, this actually dropped the average frame rate to 55.4 fps, which seems odd, but this isn't the first time frame generation has apparently had a negative impact at 8K. I put this down to the fact that the tech is new, the effort to generate frames at 8K is likely more intense, and the fact that the game itself is brand-new (I played it before the global launch). Updates to both the game and GPU drivers could iron this out.

. .


. .

. . .

The 8K mode on the 900D can only do 60Hz right now, but I suspect that might be a limitation of the current gen of GPUs, since the panel itself is 120Hz at 8K / 240Hz at 4K. So maybe if GPUs had enough bandwidth assigned to a single HDMI port, it could do 8K 120Hz using DSC. The 57" G95NC can do 120Hz 7680x2160 with DSC, or 240Hz off of a DP 2.1 AMD GPU - - but again, that's probably because of the way Nvidia allotted the ports on the GPU. HDMI 2.1 should be able to go higher with DSC if a card were designed for it.

From the LTT calculator: it's 42.58 vs 41.92 Gbps at 10-bit RGB (4:4:4), so it could probably do 8K 120Hz at 8-bit color, or if they did 3.25x DSC compression on 10-bit or something. Alternatively, it could just run 115Hz and fit HDMI 2.1 with DSC 3.0x, RGB/4:4:4, 10-bit, or run 99Hz at that with DSC 2.5x. Not that native 8K gaming will get high fps. It would be cool if it could do 7680x2160 super-ultrawide rez at relatively high Hz for gaming too.

firefox_hChB6GTDAc.png
 
I don't think HDMI 2.1 has enough bandwidth to do 8K 120Hz no matter how much DSC is used, but I could be wrong.
DSC compression is 3:1. 10-bit 8K @ 120 Hz is 111.24 Gbps, so with DSC it's 37.08 Gbps. That easily fits into HDMI 2.1's max data rate of 42.6 Gbps. I don't know about HDR, though. Common wisdom seems to be that HDR's overhead is 15%, which would take it up to 42.64 Gbps.
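Spelling out that arithmetic (the 15% HDR overhead is the rule of thumb mentioned above, not a spec figure):

```python
# Verify the numbers quoted above.
hdmi_max = 48 * 16 / 18          # FRL: 48 Gbps raw, 16b/18b encoding -> ~42.67 Gbps
after_dsc = 111.24 / 3           # quoted uncompressed rate / 3:1 DSC
with_hdr = after_dsc * 1.15      # +15% "common wisdom" HDR overhead

print(f"{hdmi_max:.2f} {after_dsc:.2f} {with_hdr:.2f}")
# 42.67 37.08 42.64
```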
 
DSC compression is 3:1. 10-bit 8K @ 120 Hz is 111.24 Gbps, so with DSC it's 37.08 Gbps. That easily fits into HDMI 2.1's max data rate of 42.6 Gbps. I don't know about HDR, though. Common wisdom seems to be that HDR's overhead is 15%, which would take it up to 42.64 Gbps.

I was going by this LTT calculator in my comment. Even if it couldn't get 120fpsHz 8K at 10-bit RGB 4:4:4 for some reason with DSC 3.x, you could run 115fpsHz.


https://linustechtips.com/topic/729232-guide-to-display-cables-adapters-v2/?section=calc

From the LTT calculator: it's 42.58 vs 41.92 Gbps at 10-bit RGB (4:4:4), so it could probably do 8K 120Hz at 8-bit color, or if they did 3.25x DSC compression on 10-bit or something. Alternatively, it could just run 115Hz and fit HDMI 2.1 with DSC 3.0x, RGB/4:4:4, 10-bit, or run 99Hz at that with DSC 2.5x. Not that native 8K gaming will get high fps. It would be cool if it could do 7680x2160 super-ultrawide rez at relatively high Hz for gaming too.

firefox_hChB6GTDAc.png
 