Where are the 8K Monitors?

125% would work for me on that size :D. Yeah, people used to claim 4k was impossible to tell apart from 1440p at 27". Same old argument, new year and number...

And here I am pushing close to 50 and still doing 27" 4K without scaling on my triple monitor setup, and I still feel that I need more screen area :D This is also my main reason for being interested in 8K monitors.
 

Corrective lenses are a thing, so age shouldn't matter unless you develop a more serious eyesight condition than nearsightedness/farsightedness as you age.
 
I understand the use of 8K for productivity, but what the hell is the use of 8K for gaming when even the 4090 can struggle to deliver playable frame rates at native 4K in the most intensive titles? Take path-traced Cyberpunk, for example. We need to render at 1440p internally and then upscale with DLSS to get playable frame rates. On a 4K display that's DLSS Quality; on an 8K display it would be DLSS Ultra Performance. Is upscaling to the higher output resolution of 8K vs 4K really going to look that much better? And when future GPUs are strong enough, would 8K DLSS Performance mode, rendering internally at 4K, look any better than just native 4K with DLAA instead?
 
8K is mainly useful for desktop use. But most people want multipurpose displays rather than having separate ones for different tasks.
 

That doesn't answer the question of the benefit to gaming lol. I get 8k is good for desktop and productivity.
 
I would have been a well paid eSport professional if I only could run the games in 8K, does that answer your question? :)

The questions that I'm asking are:

1. How much of a noticeable difference is there when upscaling to 4K vs 8K? With DLSS Quality 1440p -> 4K on a 4K screen vs DLSS Ultra Performance 1440p -> 8K on an 8K screen, is the higher output resolution going to be noticeably better when both start from the same internal res?
2. How about native 4K + DLAA on a 4K screen vs 4K + DLSS Performance -> 8K on an 8K screen?

Obviously I don't own an 8K screen so I can't test this out myself but it's something that would be worth exploring in the future to see if 8K is really worth it for gaming.
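For what it's worth, the arithmetic behind those mode names is simple. The per-axis scale factors below are NVIDIA's published defaults (Quality ~2/3, Balanced ~0.58, Performance 1/2, Ultra Performance 1/3); the rest is just multiplication, so this is only a sketch of why both scenarios start from the same internal res:

```python
# Rough sketch: internal render resolution for each DLSS mode,
# using NVIDIA's published per-axis scale factors.
DLSS_SCALE = {
    "Quality": 2 / 3,          # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(output_w: int, output_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(output_w * s), round(output_h * s)

# 4K output, Quality mode -> 1440p internal
print(internal_res(3840, 2160, "Quality"))            # (2560, 1440)
# 8K output, Ultra Performance -> also 1440p internal
print(internal_res(7680, 4320, "Ultra Performance"))  # (2560, 1440)
```

So question 1 really is a comparison of the same 1440p internal image pushed through two different output grids.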
 
I honestly don't think we are at the detail level to truly take advantage of 1440p... 4K is great for large screens and high FOV, so you can have a more immersive whole-view experience, but the assets and rendering tech aren't there to make a 4K image look 4x better than a 1080p one. It's just 4x sharper.

Like rendering an old N64 game in HD: Link's face goes from 'cute blurry blob with blinking eyes' to 'literally 10 jagged triangles and low-res eye billboards'. So by rendering at a higher resolution, you get a sharper image revealing more imperfections. 4K does that with modern games, unless their fidelity is truly phenomenal.
 
The questions that I'm asking are:

1. How much of a noticeable difference is there when upscaling to 4K vs 8K? With DLSS Quality 1440p -> 4K on a 4K screen vs DLSS Ultra Performance 1440p -> 8K on an 8K screen, is the higher output resolution going to be noticeably better when both start from the same internal res?
2. How about native 4K + DLAA on a 4K screen vs 4K + DLSS Performance -> 8K on an 8K screen?

Obviously I don't own an 8K screen so I can't test this out myself but it's something that would be worth exploring in the future to see if 8K is really worth it for gaming.

1. Can't really say, as it can't be disabled and I have nothing to compare with. With an 8K screen, I guess it is more about filling the entire screen even when the content is 4K or lower. Please note that I am talking about the TV's built-in upscaling.

2. No idea.
 
8K is mainly useful for desktop use. But most people want multipurpose displays rather than having separate ones for different tasks.
That doesn't answer the question of the benefit to gaming lol. I get 8k is good for desktop and productivity.

He's saying he believes (and I agree) that people who would love a large screen full of high-PPD desktop real estate would also like to be able to game on that single wall of screen without having to buy a separate screen to get higher gaming functionality. However, it's been a set of screen tradeoffs like that forever, so I've been using a separate screen for desktop/app use alongside one that is better at gaming since at least 2006. It "always" has been that way, at least if you want the better tradeoffs of two or more screen technologies.

e.g. I had an FW900 CRT next to an LCD, one of the first 120 Hz gaming LCDs (1080p, no VRR) next to a 1440p 60 Hz glossy IPS, a 144 Hz 1440p G-Sync AG TN next to a 1440p 60 Hz glossy IPS, a 165 Hz 1600p G-Sync VA next to a 1440p glossy IPS, then larger 4K gaming TVs that are VA (60 Hz) and OLED (120 Hz, VRR).

I don't think requiring at least two different screens to get the better tradeoffs of each (for me) is going to change for a long time in relation to 8K screens, so I've accepted that I'll keep running two or more screens as I have for years. If I had to guess, I'll still be using more than one monitor type for the next 5 years or so, until 8K screen tech (Hz, scaling, backlights, and GPUs/frame-amplification tech for that matter) advances a lot. But even then, 4K (or 4K+4K doublewide) might be at some even higher spec of extreme Hz, or have the first MicroLEDs or whatever, so there could still be a split. Eventually I'd probably end up trading off between some future AR gear and physical flat screens some year too.

The questions that I'm asking are:

1. How much of a noticeable difference is there when upscaling to 4K vs 8K? With DLSS Quality 1440p -> 4K on a 4K screen vs DLSS Ultra Performance 1440p -> 8K on an 8K screen, is the higher output resolution going to be noticeably better when both start from the same internal res?
2. How about native 4K + DLAA on a 4K screen vs 4K + DLSS Performance -> 8K on an 8K screen?

Obviously I don't own an 8K screen so I can't test this out myself but it's something that would be worth exploring in the future to see if 8K is really worth it for gaming.

I don't know the answer to that exactly, but I'll reply/reiterate that view distance vs PPI and screen dimensions matter too. An 8K screen can have very tiny pixels from your viewing perspective compared to 4K. The tinier the pixels, the smaller the occasional edge artifacts from frame-amplification techs like DLSS + Frame Generation will be. On the other hand, the higher the fpsHz, the less difference there is between two compared frames, because less has changed in that shorter time. So an 8K screen might be better in theory for smaller, less obvious fringe/edge artifacts, but it would also arguably run at much lower fpsHz, so artifacting could happen more often. You also have to consider that you'd most likely get more FoV movement blur compared to much higher fpsHz. Every time you double the fpsHz, you cut the blur in half again (e.g. 60 fpsHz vs 120 fpsHz, 120 fpsHz vs 240 fpsHz). That blur covers the entire viewport/game world during mouse-looking, movement-keying, controller panning, etc. Even with DLSS and frame amplification, you should be able to get higher fpsHz on a 4K than an 8K due to GPU power limitations. 8K screens are also always behind the peak manufactured Hz that 4K screens can do, just like 4K is/was behind 1440p and 1080p screens' peak Hz in the fastest models.
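The "double the fpsHz, cut the blur in half" rule follows from simple sample-and-hold math: perceived smear while eye-tracking a pan is roughly pan speed times frame persistence. A minimal sketch (the 1920 px/s pan speed is an arbitrary example value, not from the thread):

```python
def sample_and_hold_blur_px(pan_speed_px_per_s: float, fps_hz: float) -> float:
    """Approximate perceived blur width (pixels) when eye-tracking a pan
    on a sample-and-hold display: pan speed times frame persistence."""
    return pan_speed_px_per_s / fps_hz

speed = 1920  # a one-1080p-screen-width-per-second pan, as an example
for hz in (60, 120, 240):
    print(hz, sample_and_hold_blur_px(speed, hz))
# 60 Hz -> 32 px of smear, 120 Hz -> 16 px, 240 Hz -> 8 px
```

Each doubling of refresh (at matching fps) halves the smear width, which is why the Hz deficit of 8K panels matters for motion clarity.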

To revisit what improvise said again though: it would be great to have a large bezel-free wall of very high PPD for desktop/apps that could still do fairly high Hz (even if not the highest Hz a high-end 4K could do at the time) in a letterboxed section of the screen, 1:1 at a lower resolution like 4K, 5K, 6K, or an ultrawide or super-ultrawide res, where you could get higher fps from the GPU and higher Hz from the screen than at 8K native.
 
Exactly. With 8K mostly making sense for a desktop display in the 40-55" range, we are also talking about realistic viewing distances of around 80-100+ cm, based on how I found my LG CX 48" to feel comfortable. Curved may change that a bit.

At that distance you aren't going to be able to tell individual pixels on 8K no matter what, but on 4K you are heavily limited by desktop scaling options: basically 100% and 125% are the only sensible ones. On an 8K screen, you would have options from probably 175-300% to tailor it more granularly for desktop use.

I wish Windows offered more granularity at the low end of the scale, because 115-120% is pretty much spot on for 48" 4K. The old custom scaling option doesn't work correctly, resulting in UI scaling being based on a fixed scale.

For gaming, even if there is no real fidelity advantage, you gain a lot of options too: more integer-scalable resolutions, better options for running games in a smaller window or at ultrawide aspect ratios, etc. Even for DLSS you have more options when combined with using a lower res.
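On the integer-scaling point, it's easy to enumerate which common resolutions divide evenly into the 7680x4320 grid (a quick sketch):

```python
def integer_scalable(native=(7680, 4320), max_factor=6):
    """Resolutions that map onto the native grid with a whole-number scale factor."""
    w, h = native
    return [(w // k, h // k, k) for k in range(2, max_factor + 1)
            if w % k == 0 and h % k == 0]

for res_w, res_h, k in integer_scalable():
    print(f"{res_w}x{res_h} at {k}x{k} scaling")
# 3840x2160 (2x), 2560x1440 (3x), 1920x1080 (4x), 1536x864 (5x), 1280x720 (6x)
```

Compare that with 4K, which only integer-scales 1080p (2x) and 720p (3x) among common resolutions.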
 

Agree and sympathize with your reply in general, but I just want to clarify that we'd probably still need some level of AA for highly contrasted edges even at 8K at high PPD/optimal viewing distances. If you have to use any AA or text subpixel smoothing at all, then you are still masking how large the pixel granularity/visible edges really are. (The 2D desktop's graphics and imagery typically get no AA at all, so they stay aliased even when 2D text/fonts and gaming's pixel granularity are masked.)

. . . .

..At the human central viewing angle of 60 to 50 degrees, every 8k screen of any size gets around 127 to 154 PPD

..At the human central viewing angle of 60 to 50 degrees, every 4k screen of any size gets around 64 to 77 PPD

..At the human central viewing angle of 60 to 50 degrees, every 2560x1440 screen of any size gets only 43 PPD to 51 PPD

..At the human central viewing angle of 60 to 50 degrees, every 1920x1080 screen of any size gets only 32 PPD to 38 PPD

. . . . . .

While I got what you meant by "not being able to tell pixels", we probably can't get away from aliasing being visible on highly contrasted edges (without pixel-masking compensations) until approaching 280-300 PPD, i.e. 16K (or higher). Double the normal optimal distance you'd sit from a single 8K (move from around a 60-degree viewing angle to 30 degrees) and that's the perceived pixel size you'd be seeing. Not that it will be a thing, but I think it's worth mentioning as a measurement/scale showing where you might be able to zero out AA/text smoothing and get negligible to no aliasing on highly contrasted edges.
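For anyone who wants to check those figures: average PPD over the central field is just horizontal pixels divided by the horizontal viewing angle in degrees. This is a simplification that ignores the slight center-to-edge variation, which is why published figures can wobble by a pixel or two:

```python
def avg_ppd(h_pixels: int, viewing_angle_deg: float) -> float:
    """Average pixels per degree across the horizontal field of view."""
    return h_pixels / viewing_angle_deg

for name, px in (("8K", 7680), ("4K", 3840), ("1440p", 2560), ("1080p", 1920)):
    print(f"{name}: {avg_ppd(px, 60):.0f} PPD at 60 deg, "
          f"{avg_ppd(px, 50):.0f} PPD at 50 deg")
# 8K: 128/154, 4K: 64/77, 1440p: 43/51, 1080p: 32/38
```

Note the same one-liner puts the 280-300 PPD "no AA needed" target at roughly a 16K-wide grid over a 60-degree view (15360/60 = 256 PPD, and higher still at narrower angles).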

I think AI, if implemented, has great potential for patterning a high-PPI, high-PPD screen's (e.g. 8K) pixels and subpixels to optimal results. That is probably the best way to go. It could theoretically get much better results than the scaling and text subpixel smoothing we use now (perhaps even rebuilding versions of the font libraries themselves per PPD as it optimizes over time). AI scaling could theoretically affect 2D graphics and imagery as well, everything displayed on screen, via full-screen AA. I would welcome that on the desktop if/when AI did it optimally. AI systems are capable of learning, so they could learn to render regularly used apps more optimally, as well as operating on the fly on random images generically. They could theoretically even calibrate using a camera at your view distance to optimize specifically for your layout, much like we now use a mic to automatically optimize a surround speaker array's levels to the listening position. That, or just a view-distance slider in the OS/OSD similar to the Windows scaling one. The native text size could then be automatically calibrated, and the higher the PPD, the less aggressively it would have to mask pixels, so the optimal anti-aliasing/pixel masking via subpixel patterning could differ (slightly less subpixel "fog" applied to the edges at higher PPD, i.e. only as needed).
 
So in other words, for GAMING, it's not much of an upgrade to play at 8K upscaled from a lower resolution like 4K or even 1440p over just playing at native 4K + DLAA. That's all I needed to know. Obviously native 8K + DLAA would look the best by far, but we are not close to achieving that with playable frame rates in the most demanding titles.
 
Must say, having struggled with different multi-monitor setups ever since I returned my QN900B, I am almost at the point of considering getting it (or a QN900C) again and just setting up another desk with a "normal monitor" for stuff that won't work well with 8K (like laptops, etc). So sad that Samsung killed the party by not offering a way to disable upscaling; it should really be an easy thing to implement, and it would probably remove a bit of input lag at the same time (although I honestly could not feel much input lag in game mode to begin with).
 
I think we are at a stalemate where we have to wait 6 months or longer for more brand names to produce 8K monitors. I have no faith in Samsung, but I would like to take a leap of faith on LG. This whole thing reminds me of the year when only Philips had their BDM4035UC and we all jumped on the bandwagon, and every owner lost out, as that thing only lasted 4 years. But since then, other brand names jumped in, and theirs last. 8K is a desktop productivity tool for me.

If the screen is very large, then I would hope the industry comes out with the option of a retractable 8K screen, so I can have a transparent screen I can see through, then retract it when I'm done for the day. I just don't understand why the screen industry is so slow going from 4K to 8K, or from 4K to 4K retractable. Look at the other parts of the PC industry and how fast they move from one generation to another.

By the way, speaking of 8K screens, who are these people? Has anyone else heard of them? C SEED? Their screens look like they came out of a James Bond movie.

 


Probably some oddball company catering to the ultra-rich. They made that render video 2 years ago, and 10 months ago they actually showed their real display (you can find the videos on their YT channel). It's probably just the Samsung Micro-LED panels, which it has already been possible to connect together separately and such.

Retractable stuff makes no sense to me. You can't put anything else in that place without having to clear it all over again when you want to use the display, so why have it retractable at all? Plus I'd hate to wait for some rolling OLED or folding Micro-LED to come out of its housing.
 
Must say, having struggled with different multi-monitor setups ever since I returned my QN900B, I am almost at the point of considering getting it (or a QN900C) again and just setting up another desk with a "normal monitor" for stuff that won't work well with 8K (like laptops, etc). So sad that Samsung killed the party by not offering a way to disable upscaling; it should really be an easy thing to implement, and it would probably remove a bit of input lag at the same time (although I honestly could not feel much input lag in game mode to begin with).

I can completely understand you wanting that bezel-free full field of 8K, essentially quads of 4K, high-PPD real estate. I would love it. I have a feeling you won't see much to satisfy on the facets they dropped the ball on until multiple manufacturers come back to the 8K table in 2025. Until then there's also no pricing competition.

I'd like an 8K for desktop/apps above a 4K doublewide if I could swing it, or to get an 8K on liquidation sale or something. However, the price premium of the 8Ks for what they offer quality-wise isn't very appealing to me currently, considering the money I'm planning to spend on a 240 Hz 4K doublewide for gaming as my primary purchase in the first place. Also, by the time 2025 comes around, I'll most likely be dropping money on a 5000-series GPU.

I looked at the reviews, specs, pricing, etc. and considered the 8Ks several times, but they just aren't ripe. I wouldn't blame you for getting one; I just can't justify it for what they offer as 8K screens right now, price/performance-wise, at $3000 (800C) or $4500 (900C) after tax, especially considering my future purchasing map. At those prices it would have to have picture quality equivalent to a 4K version (they have been somewhat worse so far) and no cons like the 1:1 scaling thing, i.e. forced upscaling. And then it would probably be either a 4K doublewide or an 8K purchase at those prices (for me). Maybe once there are other 8K TVs on the market again from TCL, etc., I can get one for an over/under setup though. Not worth the price tag for a Samsung "flagship" considering I'd primarily use it for text/apps, where 60 Hz and fewer bells and whistles would be fine. If they made a 55" model it would be a little lower in price too; 65" is the smallest option, at least on the newer ones.

. . . . . .

Even considering all of that, I did find the 55" 8K 700Bs at Best Buy on sale for $1k, which was a little tempting. They definitely bloom, especially on dark backgrounds, but they could be a "cheap" (at least relative to the others) 8K option for desktop/app real estate.

At a 28" view distance, where I'd probably sit for a 4K doublewide, a 55" 8K would be 95 PPD, and a 65" 8K (which most of the available ones are) would be 85 PPD. However, at that view distance a 55" would result in an 81-degree viewing angle (and a 65" screen at 28" would be 91 degrees). It could still work as a "bezel-free multi-monitor array" type of scenario with some slight head turning. It would really only have around 10-15 degrees on each end outside of your human central viewing angle for a 55" (15-20 degrees on each end for a 65") when used like that, which probably isn't bad for desktop/app use. For comparison, a 57" super-ultrawide at 1000R has a base width straight across between the ends of about 51", a 55" flat screen is ~48" across, and a 65" flat screen is 57" across. So it's probably more like a curved 65", but a flat 55" 16:9 would align better with it in an over/under setup.

If ever viewed at the human central viewing angle starting around 60 degrees (42" view distance), a 55" 8K would be ~128 PPD. That's close to the focal point/radius of a 1000R curve as well, but something like the 57" 4K doublewide would turn into a short belt from your perspective at that distance, so it wouldn't be good to game on from there. You'd have to move closer for gaming, but with the right setup, changing view distance like that depending on what you're doing wouldn't be a problem.
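Those viewing angles and PPD figures come straight from flat-screen trigonometry: a screen of width w at distance d subtends 2*atan(w/2d), and average PPD is horizontal pixels over that angle. A sketch using the approximate 16:9 widths from the post (48" for a 55" panel, 57" for a 65"):

```python
import math

def viewing_angle_deg(screen_width_in: float, distance_in: float) -> float:
    """Horizontal angle (degrees) subtended by a flat screen at a given distance."""
    return math.degrees(2 * math.atan(screen_width_in / (2 * distance_in)))

def ppd(h_pixels: int, screen_width_in: float, distance_in: float) -> float:
    """Average pixels per degree at that distance."""
    return h_pixels / viewing_angle_deg(screen_width_in, distance_in)

print(round(viewing_angle_deg(48, 28)))  # 81 deg -> the 55" at 28" figure
print(round(viewing_angle_deg(57, 28)))  # 91 deg -> the 65" at 28" figure
print(round(ppd(7680, 48, 28)))          # 95 PPD for a 55" 8K at 28"
print(round(ppd(7680, 48, 42)))          # ~129 PPD at 42" (the post's ~128 rounds the angle to 60 deg)
```

The same two functions cover any of the screen-size/distance combinations discussed in the thread.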


The 700B had a dithering issue, but apparently there is a workaround using game mode where it disappears.

From Reddit replies in a thread:

"if you set the input to PC mode and turn on game mode, the picture is great and the dithering completely disappears. The picture looks entirely different and how you’d expect it to.
I’m running 8K 60Hz over HDMI 2.1 from a 2023 MacBook Pro M2 Max."

"To get good results I had to set the input to be a PC, my laptop to use variable refresh rate, and set Game Mode on. Game Mode is a standalone feature that can be used regardless of input, and there isn’t a PC Mode or any other modes."

"
Just wanted to comment for anyone trying this. You can get this to work properly.
You can get the checkerboard and dithering to go away. Your dreams can come true.
I tortured myself for hours getting it to work, and it was completely worth it.
You need:
- HDMI 2.1 GPU
- 48gbps certified HDMI cable (I'm serious)
- G-sync
You must do the following:
- Enable InputSignalPlus, then Game mode in the TV menu
- Then enable Gsync in Nvidia control panel.
I run at 60hz, very rarely I get some localized jitter (once a month at most) I simply switch between 59hz and back to 60hz to make it go away (again in Nvidia control panel).
This monitor is an absolute gamechanger. Excellent brightness, HDR, and most of all
4x4K with the same pixel density as a 4K 27" monitor.
FYI, running GTX 3080
"


700b with dithering
https://i.postimg.cc/QCJMn5X7/IMG-9047.jpg


700B without dithering
https://i.postimg.cc/T20GFTVn/IMG-9046.jpg
 
I think we are at a stalemate where we have to wait 6 months or longer for more brand names to produce 8K monitors. I have no faith in Samsung, but I would like to take a leap of faith on LG. This whole thing reminds me of the year when only Philips had their BDM4035UC and we all jumped on the bandwagon, and every owner lost out, as that thing only lasted 4 years. But since then, other brand names jumped in, and theirs last. 8K is a desktop productivity tool for me.

If the screen is very large, then I would hope the industry comes out with the option of a retractable 8K screen, so I can have a transparent screen I can see through, then retract it when I'm done for the day. I just don't understand why the screen industry is so slow going from 4K to 8K, or from 4K to 4K retractable. Look at the other parts of the PC industry and how fast they move from one generation to another.

By the way, speaking of 8K screens, who are these people? Has anyone else heard of them? C SEED? Their screens look like they came out of a James Bond movie.



From what I read, 2025 will probably be the year that 8K comes back with multiple manufacturers. Hopefully with smaller FALD zones or something by then, and some good 8K options, but those 8K price tags are kind of crazy with only one major manufacturer having flagship 8K TVs, and Samsung has a history of overpricing things to begin with. The NVIDIA '5000' series will probably hit in 2025 too, presumably with DP 2.1, and hopefully DLSS and especially frame generation/insertion will mature some. On the OLED side of things, phosphorescent blue emitters will arrive in 2025 and should be a considerable improvement in brightness, and perhaps sustained durations vs. burn-in mitigation/ABL. That goes along with AI wear-evening detection/measuring methods replacing heat- and power-using wear-evening circuitry, and potentially micro-lens arrays on more screen models.

So we are in a bit of a lull right now, with manufacturers in low gear. Maybe due to post-pandemic effects, economy/purchasing power, supply issues, etc., and the regulations on 8K.

. . . .

Regarding the stow-away screen tech, there are already hydraulic arms able to stow screens behind furnishings or a masking object/wall, or to retract the screen inside a hutch, etc.





If I had a sit/stand desk, I'd think about getting another standing desk, as small as I could find, just for the screen. I once saw one that had USB/Bluetooth control so you could operate it via a Stream Deck, a voice assistant, or an app. Theoretically, if you had USB/BT control of two standing desks, you could mount a screen/arm on a second smaller one farther away and trigger a multi-action on a Stream Deck, or a multi-action routine in a voice assistant, to raise/lower both at the same time, with separate buttons/actions to fine-tune them individually. According to this video, you can also set timers to automatically remind you to stand every so often, etc., if you want.



There are also some screw-down (to the floor, or you could probably install them on a heavy block/foot panel) motorized hydraulic pillar mounts for TVs that raise/lower the set. People use them to stealth the TV behind a hutch, where a remote raises the TV up for viewing (up to 65" vertical on the 'Mount-It!' brand one). They wouldn't have portrait mode unless you bought a VESA adapter/plate that adds portrait rotation. Theoretically you could mount one to a heavy caster-wheel dolly base/cart too, if you wanted to move it around, rotate it, etc.

 
Why would there be regulation on 8K?

Power use. I don't know what the original 8K specs were supposed to be, but to me it's really silly considering how much power people use for other things. That's the EU regulation, which apparently had some effect on the rollout or development of 8K for the whole world. Here in the USA people might have central AC, pool filters, a hot tub with its pumps and heating, workshops running high-power tools, multiple screens, gaming and mining GPUs, etc. A TV's energy use is a drop in the bucket, not to mention the energy waste and pollution generated by industry in general (i.e. "it's OK to burn tons of energy as long as you are profiting off of it"), unnecessary commuting, etc. Besides, the market for 8K screens is pretty small and will be for years, so there probably wouldn't be that many in operation compared to 4K screens in the first place.

Samsung may have adjusted their output on their 8k's to comply with EU regs, but here is what I found as peak usage:

Samsung 900C 8K FALD LCD: rated at 475 W
Samsung 900B 8K FALD LCD: rated at 375 W
Samsung S95B 4K OLED: rated at 350 W
LG 65" C3 4K OLED: rated at 235 W
Samsung QN90B QLED 4K FALD LCD: rated at 175 W
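To put those peak ratings in money terms, annual running cost is easy to estimate. The 4 hours/day of use and $0.15/kWh rate below are my own illustrative assumptions, not figures from the thread:

```python
def annual_cost_usd(watts: float, hours_per_day: float = 4,
                    usd_per_kwh: float = 0.15) -> float:
    """Estimated yearly electricity cost for a device at a given average draw."""
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh

for name, w in (("900C 8K", 475), ("C3 4K OLED", 235)):
    print(f"{name}: ${annual_cost_usd(w):.2f}/yr")
# 900C 8K: $104.03/yr, C3 4K OLED: $51.47/yr at those assumptions
```

Even at the worst-case peak draw, the 8K flagship only costs about $50/year more than the 4K OLED under these assumptions, which is the "drop in the bucket" argument in numbers.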

However, it's more about the typical power draw in watts:

https://www.avinteractive.com/news/...t-use-of-8k-and-microled-displays-01-03-2023/

According to an invidis analysis, the energy consumption of 8K displays is more than twice as high as 4K (190 W vs. 83 W), but the EU has imposed a limit of 90 W. This would force manufacturers selling 8K devices to apply energy-saving presets that prevent the use of 8K.


However, Samsung, which has championed 8K TVs, is reported by Techspot and Forbes to be planning to ship 8K TVs with a brightness-limiting eco mode as default, but with the option for consumers to switch to another mode with higher brightness settings. While the regulations require energy-saving presets to be in place out of the box, they do not prohibit switching to other modes.

. . . . . .

That was just another barrier dropped on 8K though. Probably also post-COVID supply/chip shortages, inflation's effect on people's buying power to spring for 8K screens, people seeing no need for 8K without 8K content, 8K price tags having to compete with very affordable high-performance 4K screens, manufacturers profiting off of 4K longer, etc.
 
I am not sure a TV's energy use is a drop in the bucket, as everyone has one of these big-screen TVs. Many years ago, I recall a study showing that the overall use of CPUs worldwide takes up 10% of the world's electricity. So collectively they add up.
 
I can see some kind of limit on some things, but while 400-500 W is a good draw, it isn't that crazy for a high-performing electronic device like a high-end gaming PC, surround system, audio setup, etc. I understand what you are saying from a cumulative management and regulations perspective, but I don't consider a 400-500 W device a lot for a flagship big media device. TVs are moving into being higher-performance HDR devices with higher and higher outputs, just like bigger audio setups have more powerful amps, bigger water features and devices (pools, hot tubs, etc.) use larger pumps/filters, larger houses use higher-powered heating/cooling systems, and higher-powered and/or heavier vehicles use more gas (or electricity). While my lower-HDR-range 77" C1 OLED pulls up to 200 W, the 2000-nit Samsung S95C 4K QD-OLED pulls up to about 400 W. We should eventually get even higher HDR output as we move up to HDR 4000 and eventually even HDR 10,000 some year. I don't know what the power requirements for those will be, but maybe they'll be more like a 750-1000 W PC rig's power use, with vented housing, heatsinks, and active fan cooling to match. We'll have to see what they can manage with tricks like micro-lens arrays, etc. to help optimize things.

Power regulation, budget waste and overall greed/vampirism, the lack of massive infrastructure modernization for more efficient transmission vs. loss, not funding solar cells on more people's houses, not building more power plants (incl. modern nuclear), the huge waste of power (and pollution) in unnecessary work commutes and travel (incl. the power required to run and occupy the buildings), and the number of children people choose to have vs. their footprint, among other things, are highly political issues. I have some strong opinions on it, but I won't go into it more than that in this reply.

. .



 
What about the next new gadget, MicroLED I believe? Does that suck up more electricity, or less?
 

Any self-emissive display varies a lot in how much energy it uses. The brighter the image it's displaying, the more it will use.

I would guess that without needing the light to filter through an LCD layer, and without having to drive that layer, a micro LED would be more efficient displaying similar images. But you probably need to look at specific models, because even LCDs vary quite a bit.
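As a rough illustration of why brightness dominates power draw, you can back-of-envelope a self-emissive panel's electrical power from its area, full-screen luminance, and luminous efficacy. This is only a sketch: the `panel_watts` helper and the efficacy figures (roughly 5 lm/W for OLED vs 15 lm/W for microLED) are illustrative assumptions, not measured specs, and it assumes an idealized Lambertian emitter.

```python
import math

def panel_watts(diag_in, nits, efficacy_lm_per_w, aspect=(16, 9)):
    """Rough electrical power for a self-emissive panel showing a
    full-screen field at the given luminance (assumes Lambertian emission)."""
    w, h = aspect
    diag_m = diag_in * 0.0254
    width_m = diag_m * w / math.hypot(w, h)
    height_m = diag_m * h / math.hypot(w, h)
    area_m2 = width_m * height_m
    # Lambertian emitter: luminous flux (lm) = luminance (cd/m^2) * area * pi
    flux_lm = nits * area_m2 * math.pi
    return flux_lm / efficacy_lm_per_w

# Hypothetical efficacies, for illustration only:
print(round(panel_watts(65, 1000, 5)))   # 65" OLED at full-screen 1000 nits (~730 W here)
print(round(panel_watts(65, 1000, 15)))  # same image on a microLED at 3x the efficacy
```

The point is just that at the same full-screen brightness, power scales inversely with efficacy, so a 3x more efficient emitter cuts the wattage to a third; real panels limit full-screen brightness (ABL) precisely to keep these numbers in check.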
 
What about the next new gadget? Micro LED, I believe. Does that suck up more electricity or less?

Sony demoed a 10,000-nit 8K HDR display back at CES 2018 (and a 4,000-nit one back in 2016), but I didn't see anything about its power consumption, how hot it got, or its heat mitigation methods.

https://www.anandtech.com/show/1227...spec-hdr-8k-display-with-10000-nits-luminance

I know Dolby had a small 10,000-nit HDR display a long time ago that they built just for testing purposes, and it had liquid cooling b/c it ran so hot.

Sharknice is right that brighter usually sucks up more juice. With a super dense array for 4,000-nit and later 10,000-nit micro LEDs, you not only have those peaks but you'll have a lot more emitters (one per pixel, rather than something like the 45x25 "lighting rez" of FALD backlights we have now in some models). Plus you'd probably also require active cooling, which uses a little power itself. So I doubt very high HDR output (4,000 to 10,000 nits) would be low-watt even with micro LED, but micro LED is way more efficient than LCD.

The article below is from 2020. OLEDs have already made further efficiency advances since then: AI-calculated wear-evening routines no longer require mfgs to put power/heat-using circuitry for that on the displays, micro lens arrays add efficiency, and in 2025 phosphorescent blue OLED emitters will be a fairly big upgrade. The article still has some good info though:

=========================================

https://www.microled-info.com/discussion-microled-efficiency-microled-displays

This article will look at the entire microLED display, and also compare it to current LCD and OLED displays. After all one of the main advantages of microLED displays is the increased efficiency (and brightness) compared to current displays. Most people assume that indeed microLEDs are much more efficient than OLEDs and LCDs.


As we have seen in the previous article, the efficiency of a blue microLED (blue is the most efficient LED color) is around 35-40% (EQE). MicroLED chips are indeed more efficient than OLED emitters. For red and green OLED emitters, the EQE is around 30% in theory, but in practice at high voltages (brightness) the EQE drops to around 15-20%. The efficiency of blue OLEDs is much lower - around 5-10% (although companies are developing next-generation high-efficiency blue OLEDs which will hopefully solve this issue).
When we look at the entire display, however, things are more complicated. It turns out that quite a bit of the energy is lost in the driving of the display rather than in the emitter in some cases.
For microdisplays, the silicon backplane is highly efficient and there's minimal power loss, and in these displays microLED will indeed be much more efficient and bright than OLED microdisplays.

Moving over to small passive-matrix displays, it is estimated that the power loss in the driver electronics of such displays is around 25%. With active matrix LTPS TFTs, it is estimated that the power loss in the TFTs reaches almost 70%. In such displays, only 30% of the power goes to the emitter material or device. Even if microLEDs are twice as efficient as OLEDs, then the whole display will only get to be around 15% more efficient.
In fact we discussed this with industry experts, and it is expected that at least in the near future, microLED displays will only be marginally more efficient than OLEDs. It is estimated that a microLED LTPS display will have a display efficiency of around 10%.
If we look at LCDs, by the way, the situation is much worse - it is estimated that in large-area LCD TVs, the display efficiency is around 3% - even though there's minimal loss at the TFT because it is voltage driven. But LCDs suffer from low efficiency because of the energy loss in the color filters, polarizers and LC materials. Both OLEDs and microLEDs will outperform LCDs, but it seems as if microLEDs will not be able to achieve a meaningful step up in display efficiency - especially if/when an efficient blue emitter is finally developed which could dramatically increase the efficiency of OLEDs.
Of course it is hopeful that microLED displays, which are still at a very early stage of development, will offer much faster advances in technology - and maybe in efficiency too. Perhaps new backplane technologies (or maybe microIC-based displays) could prove to be the key for high efficiency microLED displays.
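The "display efficiency" figures in the article come from multiplying stage efficiencies: the fraction of input power that survives the backplane times the emitter's own EQE. A quick sketch using the article's approximate numbers (the variable names are mine):

```python
def display_efficiency(emitter_eqe, driver_share):
    # Fraction of electrical input converted to light:
    # power reaching the emitter (driver_share) times the emitter's EQE.
    return emitter_eqe * driver_share

ltps_share = 0.30     # LTPS TFT backplane: ~70% lost, ~30% reaches the emitter
microled_eqe = 0.35   # blue microLED EQE (35-40% per the article)
oled_eqe = 0.175      # practical OLED EQE at high brightness (15-20%)

print(display_efficiency(microled_eqe, ltps_share))  # ~0.105, the ~10% the article cites
print(display_efficiency(oled_eqe, ltps_share))      # ~0.05 for the OLED equivalent
```

This also shows why the backplane matters so much: with 70% lost in the TFTs, even doubling emitter efficiency moves the whole-display number far less than the chip-level comparison suggests.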
 
Found this on a macsetups sub on reddit. 55" QN700B 8k.

"confirmed BetterDisplay can enable HiDPI modes from 4K to 8K. I'm currently running 5K (5120x2880@60Hz) UI scaled 1.5 to 8K and the text is very clear/legible and the image quality is the same as in the native 4K HiDPI scaled to 8K mode."

I think if he was sitting closer he could use a higher rez without the interfaces being too small, or if his screen were one of the more modern 65" 8Ks instead of a 55" at the pictured distance. Either way the setup looks pretty great to me.
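The ~160 PPI figure quoted for this setup checks out from the panel geometry alone; a quick sketch (the `ppi` helper is just Pythagoras on the pixel grid):

```python
import math

def ppi(h_px, v_px, diag_in):
    """Pixels per inch from pixel resolution and diagonal size in inches."""
    return math.hypot(h_px, v_px) / diag_in

print(round(ppi(7680, 4320, 55)))  # 8K at 55" -> 160 PPI, matching the quote
print(round(ppi(7680, 4320, 65)))  # 8K at 65" -> 136 PPI
```

So the 55" 8K really does hit the density of a 27" 4K panel (163 PPI), while going to 65" trades some of that density back for size.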

 
Found this on a macsetups sub on reddit. 55" QN700B 8k.

"confirmed BetterDisplay can enable HiDPI modes from 4K to 8K. I'm currently running 5K (5120x2880@60Hz) UI scaled 1.5 to 8K and the text is very clear/legible and the image quality is the same as in the native 4K HiDPI scaled to 8K mode."

I think if he was sitting closer he could use a higher rez without the interfaces being too small, or if his screen were one of the more modern 65" 8Ks instead of a 55" at the pictured distance. Either way the setup looks pretty great to me.

https://www.reddit.com/r/macsetups/...p_v3_8k_samsung_55_qn700b_at_4k_hidpi_160ppi/

Could be remembering wrong, but I think the QN700B is only 60 Hz?
 
so tempting but I'm not set up for the 700b just yet. I'll probably look at the 65" ones that bloom less if they drop in price end of 2023 -> into 2024. I don't want to pay 4k for one though.
 
so tempting but I'm not set up for the 700b just yet. I'll probably look at the 65" ones that bloom less if they drop in price end of 2023 -> into 2024. I don't want to pay 4k for one though.
Samsung is back, at least in Sweden, to offering the QN900B for something like $2200 including VAT (really hard to translate prices; Sweden is a high-cost nation with a low-valued currency at the moment). And that includes a new Samsung Tab S9 or whatever it is called, which I believe retails for around $1000. Tempted to get one again :D

QN700B does not really do it for me, it basically only has 8K going for it in my book.
 
Found this on a macsetups sub on reddit. 55" QN700B 8k.

"confirmed BetterDisplay can enable HiDPI modes from 4K to 8K. I'm currently running 5K (5120x2880@60Hz) UI scaled 1.5 to 8K and the text is very clear/legible and the image quality is the same as in the native 4K HiDPI scaled to 8K mode."

I think if he was sitting closer he could use a higher rez without the interfaces being too small, or if his screen were one of the more modern 65" 8Ks instead of a 55" at the pictured distance. Either way the setup looks pretty great to me.

and what are those two silver-looking things on his desk?
 
I already posted that question earlier; the Samsung QN700B doesn't do 4:4:4 chroma. Do a Google search and you'll get something like this:

https://twitter.com/andrewculver/status/1618054483833155586


From that Mac reddit sub thread on the QN700B:

"Edit: Looks like it has Chroma 4:4:4 support using rtings test image with belkin HDMI2.1 48Gbps cable. Haven't been able to get into service menu yet to verify connection details."


I posted earlier in reply to improvise a few replies back:



The 700B had a dithering issue but I guess there is a workaround using game mode where it disappears.

From Reddit replies in a thread:


"if you set the input to PC mode and turn on game mode, the picture is great and the dithering completely disappears. The picture looks entirely different and how you’d expect it to.
I’m running 8K 60Hz over HDMI 2.1 from a 2023 MacBook Pro M2 Max."

"To get good results I had to set the input to be a PC, my laptop to use variable refresh rate, and set Game Mode on. Game Mode is a standalone feature that can be used regardless of input, and there isn’t a PC Mode or any other modes."

"
Just wanted to comment for anyone trying this. You can get this to work properly.
You can get the checkerboard and dithering to go away. Your dreams can come true.
I tortured myself for hours getting it to work, and it was completely worth it.
You need:
- HDMI 2.1 GPU
- 48gbps certified HDMI cable (I'm serious)
- G-sync
You must do the following:
- Enable InputSignalPlus, then Game mode in the TV menu
- Then enable Gsync in Nvidia control panel.
I run at 60hz, very rarely I get some localized jitter (once a month at most) I simply switch between 59hz and back to 60hz to make it go away (again in Nvidia control panel).
This monitor is an absolute gamechanger. Excellent brightness, HDR, and most of all
4x4K with the same pixel density as a 4K 27" monitor.
FYI, running GTX 3080
"


700b with dithering
https://i.postimg.cc/QCJMn5X7/IMG-9047.jpg


700B without dithering
https://i.postimg.cc/T20GFTVn/IMG-9046.jpg



. .


and what are those two silver-looking things on his desk?

From that Mac reddit sub thread on the QN700B:

"2x Trackballs: this is a new project I haven't released much on yet, but they're solid aluminum 55mm trackballs on low-friction ptfe bearings with dual optical sensors for smooth mouse movement and twist to scroll. My favorite feature is capacitive touch to allow for single-button features and support for click & drag. The left trackball is dedicated to pan and scroll as well as a few gestures like forward/back functionality. Work in progress trackball repo "

. . . .


Samsung is back at, at least in Sweden, offering the QN900B for something like $2200 including VAT (really hard to translate prices, Sweden is a high cost nation with a low valued currency at the moment). And that is including a new Samsung Tab S9 or whatever it is called that I believe retails for around $1000. Tempted to get one again :D

QN700B does not really do it for me, it basically only has 8K going for it in my book.

Yeah, 8K and $1000, heh. Since I am considering eventually dropping money on a 2x4K, 32"-class 4K "doublewide" 57" 7680x2160 gaming screen, an 8K would be mostly for desktop/apps, so I don't need to spend $2k to $3k extra for features I'm not using as much. I'll probably just throw one of my existing screens above the doublewide at first and wait out deals on a better 8K later, b/c I agree with you performance-wise: the 700B isn't as great even for media, as the FALD haloing/glow is bad. It's not horrible for all that desktop/app real estate and high PPD for $1k though, considering.
 
Why bother with Reddit? I can't find a Samsung spec that shows whether their chroma is 4:4:4 or not.

Samsung should have made that clear.
 
Why bother with Reddit? I can't find a Samsung spec that shows whether their chroma is 4:4:4 or not.

Samsung should have made that clear.

The result he posted from the reddit post:



https://www.reddit.com/r/macsetups/...p_v3_8k_samsung_55_qn700b_at_4k_hidpi_160ppi/

---------------------------
.. QUOTEs ....

TL;DR I'm loving the QN700B 55" running 4K HiDPI at 8K Native VRR ~60Hz with ~160PPI and big improvement in text/image clarity all on a 2023 MBP M2 Pro.

Edit: I have 5K HiDPI UI resolution scaling up to 8K working with the help of BetterDisplay Dummy Mirroring. Text/Image clarity is impressive and I've never had this much screen real estate on a single monitor that's actually usable before. I'm impressed.

Edit: Couple of sample images comparing chrome-rendered text and part of an image in photos.app, in either 4K HiDPI 8K Native vs 4K native on the QN700B.

Edit: Looks like it has Chroma 4:4:4 support using rtings test image with belkin HDMI2.1 48Gbps cable. Haven't been able to get into service menu yet to verify connection details.

Edit: I did a quick comparison on 4K Native 120Hz which works fine in Game Mode on the TV. While I appreciate faster response times, ~60Hz is fine for my use-case and having the clearer text and images is far more important to my use case.

I sold off my 2x 43" IPS monitors 2 years ago and have been happily running an M1 Mac with the 2020 Samsung Q90T 55" in 4K@60Hz until now. I've definitely enjoyed the 55" footprint, as obscenely large as it sounds, given my usual >4' viewing distance. It's very comfortable overall, minimal neck movements and surprisingly good ergonomics (for me). Almost all of my workflows involve referencing a lot of static content for work/play, and having more screen real estate has long been a priority for me while keeping it ergonomic/comfortable. I've been long interested in getting more pixel density for text & images for the sake of increased clarity and reduced eye strain.

While this is still a very new setup to me, I've been impressed with the 2022 Samsung QN700B, which appears to be an obscure model only really available at BestBuy in the US, running at a discount price as of now. Right out of the box with the 2023 M2 Pro Mac it defaults to 4K HiDPI 8K Native @ 60Hz and the clarity of text/images was immediately noticeable. There were a few caveats along the way, but enabling Game Mode and using VRR ~60Hz, a few tweaks to the picture config and using Samsung's smart calibration (via iPhone), it's pretty well dialed in for me right now. Using it for some long work sessions has already proven its value to me just in the improved clarity of text (osx font smoothing default override enabled as well).

"Edit: Wow... confirmed BetterDisplay can enable HiDPI modes from 4K to 8K. I'm currently running 5K (5120x2880@60Hz) UI scaled 1.5 to 8K and the text is very clear/legible and the image quality is the same as in the native 4K HiDPI scaled to 8K mode. Not likely to try a bigger UI resolution since the size of the UI and text might be just too small. Will try a few more resolutions between 4k and 5k for the UI."

.... end QUOTEs....



Edit (by elvn): I think HiDPI might be incompatible with HDR, not sure. His examples are all best using HiDPI, which might be some kind of supersampling. He did say 5K worked really well on text. I think his Mac's HiDPI resolutions might be partly due to output limitations on current Macs, but there was supposed to be an 8K Mac at some point. Idk, I'm not up on the Mac stuff, but non-Mac external screens on one seem to be problematic.

Pretty sure, considering what he showed, that 444 works and that text would look pretty great. It's like four 4K monitors, so scaling, if any, would depend on how far away you sat. The newer 65" 8K models are better: more zones, a lot less blooming, better viewing angles (vs. shift/uniformity, even sidelong viewing to the sides when sitting up close to the screen), probably a little higher HDR peaks, and better for gaming if not just using the 8K for desktop/apps, etc. But they are also 3x to 4x more expensive, $3k to $4k+ USD. Samsung charges high premiums and there is little competition in the 8K market for now, maybe until 2025, idk. Maybe they'll discount the 900B/C by end of 2024 though, if some 2025 flagships are going to come out. Would be great if there was more competition from LG and TCL, etc.
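Since PPD (pixels per degree) keeps coming up in these distance/scaling arguments, here's a minimal sketch of how it falls out of resolution, screen width, and viewing distance. The `ppd` helper and the 4-foot example are my own illustration, using an averaged-over-the-FOV approximation rather than the exact center-of-screen figure:

```python
import math

def ppd(h_px, width_in, distance_in):
    """Approximate pixels per degree, averaged over the horizontal FOV."""
    half_fov_deg = math.degrees(math.atan((width_in / 2) / distance_in))
    return h_px / (2 * half_fov_deg)

# A 65" 16:9 panel is ~56.7" wide; viewed from 4 ft (48"):
print(round(ppd(7680, 56.7, 48)))  # 8K: double the 4K figure below
print(round(ppd(3840, 56.7, 48)))  # 4K at the same size and distance
```

At a fixed size and distance, doubling the horizontal resolution exactly doubles the PPD, which is the whole pitch for 8K at desk-like distances.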
 
I understand. But by the same token, if you do a search on the model name and the word "chroma", there are other websites that say it doesn't have 444.

So surely this shouldn't be a guessing game. But when I read the spec of this model, it doesn't say. Some Samsung page should state one way or the other what the chroma support is.

Imagine you and I go to buy a car, and the manufacturer won't tell you how much HP the car has, so you and I have to guess. Would you buy that car?
 
That rtings 444 test image isn't being shown at 1:1. It's got some scaling on it for sure, and even if it's integer scaling there are suggestions in areas that it's not playing ball with 444.

Diagonal pixels in particular look mangled.

Edit: There's a real lack of understanding that this test image must be at 1:1 on any and all displays it's being used on for a valid result.
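To make the 1:1 requirement concrete, here's a minimal sketch that generates its own chroma test pattern (my own illustration, not the rtings image): 1-pixel-wide alternating red/blue columns written as a binary PPM. Viewed strictly pixel-for-pixel over a true 4:4:4 chain, the columns stay crisp; chroma subsampling (4:2:2/4:2:0) or any display scaling smears them toward purple, which is exactly the mangling described above.

```python
# Generate a 256x64 pattern of 1px-wide alternating red/blue columns
# as a binary PPM (P6). It must be shown at 100% zoom, unscaled, for
# a valid chroma test.
W, H = 256, 64
RED, BLUE = bytes((255, 0, 0)), bytes((0, 0, 255))
row = b"".join(RED if x % 2 == 0 else BLUE for x in range(W))
with open("chroma444_test.ppm", "wb") as f:
    f.write(b"P6\n%d %d\n255\n" % (W, H))
    f.write(row * H)
```

Open it in an image viewer at exactly 100% zoom; if adjacent columns visibly blend into purple, something in the chain (signal format, TV processing, or viewer scaling) is not delivering 4:4:4.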
 

That rtings 444 test image isn't being shown at 1:1. It's got some scaling on it for sure, and even if it's integer scaling there are suggestions in areas that it's not playing ball with 444.

Diagonal pixels in particular look mangled.

Edit: There's a real lack of understanding that this test image must be at 1:1 on any and all displays it's being used on for a valid result.

Yeah, his example might be using that Mac supersampling/scaling HiDPI at 4K or 5K. Still, he got good results overall I think, and for $1k it's probably not a bad deal. I'll wait out drops on the 800B/900B-C though. Not in that big of a rush, but looking forward to higher-than-4K real estate someday without bezels. Currently I have two 4K screens for desktop/apps plus a 4K OLED for media and games.
 