Samsung’s premium 2020 TVs to support 8K at 60Hz and 4K at 120Hz

Eh? All 2019 and newer LG OLEDs have full support for HDMI 2.1. They will be able to do 4K/120.
There is just an issue with eARC not sending uncompressed audio.

Do they? It has been a while since I was looking (early 2019), but at the time OEMs were only supporting what they felt like. Some had high refresh rates and/or variable refresh while missing the other. Are they all full-spec other than eARC now?
 
Do they? It has been a while since I was looking (early 2019), but at the time OEMs were only supporting what they felt like. Some had high refresh rates and/or variable refresh while missing the other. Are they all full-spec other than eARC now?
From what I understand, yes. It is untested since there are no HDMI 2.1 devices out atm. It's just what LG claims.
 
Source? I've heard OLED input lag was terrible, but the last time I looked into it was a few years ago.
It's damn good - 6ms, less than a frame at 120Hz. Then you have the pixel transitions themselves, which make most LCDs look like shit, and the infinite black depth. My next screen will be a CX for these reasons and more.
 
Do they? It has been a while since I was looking (early 2019), but at the time OEMs were only supporting what they felt like. Some had high refresh rates and/or variable refresh while missing the other. Are they all full-spec other than eARC now?

Yeah, LG's OLEDs are pretty much the only TVs out now that have HDMI 2.1 (on all ports too, not just one like a lot of TVs do with HDMI 2.0/2.1). It was one of the main reasons I got my B9, along with full VRR/G-Sync support for next gen consoles and my PC. Though as I said earlier, the best you can do right now is 1440p/120 Hz or 4K/60 Hz on the HDMI 2.0 devices and graphics cards out right now. I'm sure the RTX 3000 series cards (and next gen consoles) will have HDMI 2.1 on them to fully support VRR and 4K/120 Hz.
 
I'm probably buying a 75-inch Vizio PQX 2020 model. They'll support almost all HDR formats and have 120Hz screens with HDMI 2.1.
 
At this point I'm probably going to hold off for Ampere since I can't really take advantage of the tech yet anyway. Whatever I get will be at least 70" with quick response times. I'm brand-agnostic for the most part. I was all about Sony 10-15 years ago, but they've disappointed me so much that I'm open to almost anything now.
 
I don't understand the value of 8k.

I can't even see pixels right now on my 32" Dell 2560x1440 work monitor at about a 2.5-foot viewing distance. 4K should be plenty good enough for anything for my old (still 20/20) eyes.

So I do understand 4K, 120Hz as an upgrade, but 8K? Immaterial on a traditionally sized PC monitor or TV.

How close do you have to be to a 65" TV to see pixels at 8K? 1 foot? Who would ever sit that close? In that case...who cares.
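Rough numbers, for anyone curious. This is just a back-of-the-envelope sketch assuming the usual ~1 arcminute per pixel rule of thumb for 20/20 acuity and a 16:9 panel; the function name is purely illustrative:

```python
import math

def max_resolvable_distance(diagonal_in, horiz_px, arcmin_per_px=1.0):
    """Distance (inches) inside which individual pixels become resolvable,
    assuming ~1 arcminute per pixel for 20/20 vision (a rule of thumb)."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # width of a 16:9 panel
    pixel_pitch_in = width_in / horiz_px
    return pixel_pitch_in / math.tan(math.radians(arcmin_per_px / 60))

for label, px in [("4K", 3840), ("8K", 7680)]:
    d = max_resolvable_distance(65, px)
    print(f'65" {label}: pixels resolvable only closer than ~{d:.0f} in ({d/12:.1f} ft)')

# 65" 4K: pixels resolvable only closer than ~51 in (4.2 ft)
# 65" 8K: pixels resolvable only closer than ~25 in (2.1 ft)
```

So by this rule of thumb you'd have to sit closer than about two feet to a 65" set before 8K buys you anything over 4K, which is basically the point above.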

 
I don't understand the value of 8k.

Marketing. TV makers are always desperate to find the next big thing to make people buy a new TV. That was why 3D TVs were a thing and then weren't: they hoped 3D would make people want a new TV, it didn't, so it went away. Well, 4K has been successful marketing, so they are hoping 8K will be too.
 
No Dolby Vision on Samsung TVs 👎

We got a Sony with Dolby Vision this year since we are an Apple and Xbox household. Holy cow, the effect is very noticeable....borderline uncomfortable in some scenes....not blinding, but your eyes do protest.

I've also noticed the difference playing co-op with my kids while on lockdown. We have two Xboxes in the same room, one on the Sony with Dolby Vision and one on a standard IPS monitor, and seeing the same cutscenes side by side you can see the Dolby Vision can do "neon" colors. Whereas the monitor will show just normal yellows or oranges, the Dolby Vision can do neon yellow or orange, which looks great on fire or sunsets. Also the glowing blue effects on things like power suits and weapons are far more pronounced.

This was our first TV with Dolby Vision/HDR and it's a very cool trick. I imagine HDR10+ is a similar effect, just with different backing.
 
We got a Sony with Dolby Vision this year since we are an Apple and Xbox household. Holy cow, the effect is very noticeable....borderline uncomfortable in some scenes....not blinding, but your eyes do protest.

I've also noticed the difference playing co-op with my kids while on lockdown. We have two Xboxes in the same room, one on the Sony with Dolby Vision and one on a standard IPS monitor, and seeing the same cutscenes side by side you can see the Dolby Vision can do "neon" colors. Whereas the monitor will show just normal yellows or oranges, the Dolby Vision can do neon yellow or orange, which looks great on fire or sunsets. Also the glowing blue effects on things like power suits and weapons are far more pronounced.

This was our first TV with Dolby Vision/HDR and it's a very cool trick. I imagine HDR10+ is a similar effect, just with different backing.

I read somewhere that all TVs are internally 10-bit and not 12-bit. Dolby Vision uses 12-bit color (unlike Samsung's HDR10+, which sticks to 10-bit color). 4K + 12-bit color requires more bandwidth, so no TV can do Dolby Vision and 4K 120Hz right now. They can only do normal HDR at 4K 120Hz. Is that true?
 
At this point I'm probably going to hold off for Ampere since I can't really take advantage of the tech yet anyway. Whatever I get will be at least 70" with quick response times. I'm brand-agnostic for the most part. I was all about Sony 10-15 years ago, but they've disappointed me so much that I'm open to almost anything now.
Sony still makes great TVs, just not at those prices. You can get an LG OLED for like half the price of the Sony OLED, which uses LG's panel lol. From reviews, the Sony OLED has slightly better PQ due to better processing, but it lacked HDMI 2.1.
 
I read somewhere that all TVs are internally 10-bit and not 12-bit. Dolby Vision uses 12-bit color (unlike Samsung's HDR10+, which sticks to 10-bit color). 4K + 12-bit color requires more bandwidth, so no TV can do Dolby Vision and 4K 120Hz right now. They can only do normal HDR at 4K 120Hz. Is that true?
Actually, there are a few TVs with full HDMI 2.1 bandwidth inputs that support 12-bit color, 4K, 120Hz, and 4:4:4 chroma all at the same time. I just read an article on it this morning.
Whether the TV takes that 12-bit input and downsamples it to 10-bit, IDK. I would hope not, given that the TVs are very high end and expensive, and why support 12-bit with all the other stuff if the TV can't even do it?
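For anyone wondering whether the numbers even add up, here's a rough link-budget sketch. It counts active pixel data only (real HDMI timings add blanking overhead on top), and the payload figures in the comment assume HDMI 2.0's 8b/10b and HDMI 2.1 FRL's 16b/18b line coding:

```python
# Active video data rate only; HDMI blanking intervals add more on top.
def active_video_gbps(width, height, hz, bits_per_channel, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

for bpc in (8, 10, 12):
    rate = active_video_gbps(3840, 2160, 120, bpc)
    print(f"4K 120Hz 4:4:4 @ {bpc}-bit: ~{rate:.1f} Gbps of active video")

# Prints roughly 23.9, 29.9, and 35.8 Gbps for 8/10/12-bit.
# For reference: HDMI 2.0 carries ~14.4 Gbps of video payload (18 Gbps line
# rate, 8b/10b); HDMI 2.1 FRL up to ~42.7 Gbps (48 Gbps line rate, 16b/18b).
```

So 12-bit 4K/120 4:4:4 is nowhere near possible on HDMI 2.0, but it does fit inside a full 48 Gbps HDMI 2.1 port, which is presumably what those TVs are advertising.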
 
Sony still makes great TVs, just not at those prices. You can get an LG OLED for like half the price of the Sony OLED, which uses LG's panel lol. From reviews, the Sony OLED has slightly better PQ due to better processing, but it lacked HDMI 2.1.

Yup, that's the problem I ran into. I bought models that were priced at what basically everyone else was charging, and they were always missing the latest tech standards and underperforming, in spite of being brand new. My buddy who does home theater buildouts for new homes would always have Sony models that were amazing...but they cost like 2x what I was willing to pay.
 
Actually, there are a few TVs with full HDMI 2.1 bandwidth inputs that support 12-bit color, 4K, 120Hz, and 4:4:4 chroma all at the same time. I just read an article on it this morning.
Whether the TV takes that 12-bit input and downsamples it to 10-bit, IDK. I would hope not, given that the TVs are very high end and expensive, and why support 12-bit with all the other stuff if the TV can't even do it?

They do use the 12-bit internally. Not for the display, those are all 10-bit (I'm not aware of any 12-bit panels), but for processing. That's the real reason for DV and 12-bit. With HDR in particular, it isn't as simple as "just output the data on the disc." There is processing to decide how to map it to the screen, and the higher precision allows for better image quality after that processing.
 
I'm probably buying a 75-inch Vizio PQX 2020 model. They'll support almost all HDR formats and have 120Hz screens with HDMI 2.1.
I'm probably buying whatever 2019 model Costco tries to purge from their inventory, just due to cost. Seriously, the low-end 2020 model will probably cost more than the high-end 2019 model.
 
Makes me sick that the best-rated $400-$750 USD gaming monitors have garbage blacks. Matte IPS is just plain gross to look at when playing movies, watching cutscenes or YouTube videos, or playing in dark game areas.

I'm using a Sony X900E overclocked to 200Hz at 1080p, but the scaling looks terrible like that. Blacks are great though, and local dimming works well.

The perfect solution for me is:

a 27", 200Hz, 1440p glass-panel VA with hardware G-Sync and local dimming.

But that doesn't exist at any price.

And I have no idea why people complain about glare. Blacks and colors on smartphones look almost as good as OLED because of the glass display. Glare is a small price to pay for that.
 
Makes me sick that the best-rated $400-$750 USD gaming monitors have garbage blacks. Matte IPS is just plain gross to look at when playing movies, watching cutscenes or YouTube videos, or playing in dark game areas.

I'm using a Sony X900E overclocked to 200Hz at 1080p, but the scaling looks terrible like that. Blacks are great though, and local dimming works well.

The perfect solution for me is:

a 27", 200Hz, 1440p glass-panel VA with hardware G-Sync and local dimming.

But that doesn't exist at any price.

And I have no idea why people complain about glare. Blacks and colors on smartphones look almost as good as OLED because of the glass display. Glare is a small price to pay for that.

I mean, you can have something close to that: https://www.bhphotovideo.com/c/product/1511615-REG/asus_pg35vq_35_rog_swift_ultra_wide.html. It's not glass, but it is VA, 200Hz, 1440p (ultrawide in this case), with FALD.
 
I have two VA panels. I don't want any more. I'll take IPS blacks if OLED isn't available for the use case.
 
I'm probably buying whatever 2019 model Costco tries to purge from their inventory, just due to cost. Seriously, the low-end 2020 model will probably cost more than the high-end 2019 model.

HDMI 2.1 is important to me
 
Just a +1 to all the others (a) hoping for options in the 40" range and (b) comfortable waiting for video cards that can drive these displays. I've been using a Samsung 40" JU7100 for a few years now and it's been a deeply satisfying experience... but it also convinced me that 40" is at the upper end of what I am comfortable with. Something like a 38" 16:9 with a light curve would be a hell of a display at 4k 120Hz... if OLED, even better.

I also am frustrated by Samsung's tendency to remove key features from their smallest size panels within a given model line... and this has me skeptical of their claims.

As for 8K... fine by me as something to play with, so long as I am not giving up significant brightness in order to get it.
 
HDMI 2.1 is important to me

Same here, which is why I got my B9, since it was basically the only TV with HDMI 2.1 at the time, other than maybe the Samsung Q80 or whatever else I was cross-shopping it against. I wanted it just for proper VRR support for next gen consoles, but it's also nice that it supports G-Sync for my PC if I want to drag that into the living room. I wish Steam Link or game streaming on my Shield TV supported 4K60 over LAN, but that may be hard to do even over gigabit ethernet.
 
Apparently the PS5 is not able to make use of the full bandwidth of HDMI 2.1, so color quality is impacted slightly by chroma subsampling, it seems.


PS5 HDMI 2.1 Bandwidth of 32Gbps is Slower than Xbox Series X’s 40Gbps Number

https://t.co/Jsv5oUpSLB

Ever since the launch of both consoles, there has been confusion about the PS5's HDMI 2.1 output and its bandwidth. To figure this out, popular tech reviewer Vincent Teoh of HDTVTest tested the console with an LG 48-inch CX TV and found that the Sony PS5's HDMI 2.1 bandwidth is capped at 32Gbps.

Using the same LG 48-inch CX TV, Teoh concluded that the Xbox Series X is capable of delivering an output bandwidth of up to 40Gbps via HDMI 2.1.

The Sony PlayStation 5, on the other hand, is restricted to a bandwidth of 32Gbps. As per the report, this means that if a user is playing a game at a 120Hz refresh rate, the console will automatically switch to 4:2:2 chroma subsampling, whereas the Xbox Series X is capable of sticking to the 4:4:4 format.
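The arithmetic behind that 4:2:2 switch, as a rough sketch. This counts active video only for a 10-bit HDR signal, and the "usable payload" figures assume HDMI 2.1 FRL's 16b/18b coding; real signals also carry blanking and audio on top:

```python
# Why a 32 Gbps link forces 4:2:2 at 4K/120 while 40 Gbps can keep 4:4:4.
# Active video only; real signals also carry blanking and audio on top.
BPC = 10                                                # 10-bit HDR signal
CHANNELS = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}   # samples per pixel
PAYLOAD = {32: 32 * 16 / 18, 40: 40 * 16 / 18}          # usable after 16b/18b

for fmt, ch in CHANNELS.items():
    gbps = 3840 * 2160 * 120 * BPC * ch / 1e9
    verdict = ", ".join(
        f"{cap} Gbps link: {'fits' if gbps < usable else 'too big'}"
        for cap, usable in PAYLOAD.items())
    print(f"4K/120 10-bit {fmt}: ~{gbps:.1f} Gbps ({verdict})")

# 4K/120 10-bit 4:4:4: ~29.9 Gbps (32 Gbps link: too big, 40 Gbps link: fits)
# 4K/120 10-bit 4:2:2: ~19.9 Gbps (32 Gbps link: fits, 40 Gbps link: fits)
# 4K/120 10-bit 4:2:0: ~14.9 Gbps (32 Gbps link: fits, 40 Gbps link: fits)
```

In other words, 4K/120 at 10-bit 4:4:4 slightly overshoots what a 32 Gbps link can carry, so the PS5 drops to 4:2:2, while a 40 Gbps link has room to spare.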
 
Apparently the PS5 is not able to make use of the full bandwidth of HDMI 2.1, so color quality is impacted slightly by chroma subsampling, it seems.


PS5 HDMI 2.1 Bandwidth of 32Gbps is Slower than Xbox Series X’s 40Gbps Number

https://t.co/Jsv5oUpSLB

Ever since the launch of both consoles, there has been confusion about the PS5's HDMI 2.1 output and its bandwidth. To figure this out, popular tech reviewer Vincent Teoh of HDTVTest tested the console with an LG 48-inch CX TV and found that the Sony PS5's HDMI 2.1 bandwidth is capped at 32Gbps.

Using the same LG 48-inch CX TV, Teoh concluded that the Xbox Series X is capable of delivering an output bandwidth of up to 40Gbps via HDMI 2.1.

The Sony PlayStation 5, on the other hand, is restricted to a bandwidth of 32Gbps. As per the report, this means that if a user is playing a game at a 120Hz refresh rate, the console will automatically switch to 4:2:2 chroma subsampling, whereas the Xbox Series X is capable of sticking to the 4:4:4 format.

Interesting. I wonder if Sony plans to fix this in an update soon, along with adding the VRR support that should have been available at launch. As of right now though, I don't see this impacting the vast majority of PS5 owners who don't have HDMI 2.1/120 Hz TVs. It annoyingly does impact me though, as 120 Hz and VRR were among the main reasons I got my B9 OLED TV last year, in anticipation of these new consoles fully supporting those features.
 
Interesting. I wonder if Sony plans to fix this in an update soon, along with adding the VRR support that should have been available at launch. As of right now though, I don't see this impacting the vast majority of PS5 owners who don't have HDMI 2.1/120 Hz TVs. It annoyingly does impact me though, as 120 Hz and VRR were among the main reasons I got my B9 OLED TV last year, in anticipation of these new consoles fully supporting those features.
The PS5 forcing everything to be in HDR is the biggest issue right now IMO.
 
No Dolby Vision on Samsung TVs 👎
This is what it came down to for us; it was the primary factor when we chose a new TV. Since we have Apple TV and Xbox, we had to go with Dolby Vision, which meant (primarily) LG, Sony, and Vizio. We ended up going Sony.

Samsung was completely out of the picture for us due to the HDR Wars™

And Dolby Vision is awesome. Easily the most noticeable difference on our new TV. HDR10+ might be just as noticeable.....but it's one of those features you don't want to ignore just because of price or brand loyalty. Get whatever format your content supports....because it's great. Our TV gets almost annoyingly bright in some scenes.
 
Some of this is me on the outside looking in, but I feel like Sony sets have gotten better (again) in the last few years. For a while they were marked up over similar brands and they tended to have issues with response time, missing features, etc. I was ride-or-die for Sony sets going back to the 90's, but I swore them off 5-6 years ago. I'll actually be looking at Sony again next spring.
 
Some of this is me on the outside looking in, but I feel like Sony sets have gotten better (again) in the last few years. For a while they were marked up over similar brands and they tended to have issues with response time, missing features, etc. I was ride-or-die for Sony sets going back to the 90's, but I swore them off 5-6 years ago. I'll actually be looking at Sony again next spring.
They're very good this gen. Not a deal/steal, but not overly expensive; you get what you pay for. Not in a negative way, but more in an 'as you would expect' way.
 
For gaming, I wonder if 8k will be the point where the unaliased moire patterns go away. The golden goose with higher resolutions always seems to be the “don’t need antialiasing due to the sheer number of lines” point, but annoying moire patterns still develop even at aliased 4K, particularly with oblique views of parallel lines on the screen, like stairs. They are less strong than at lower resolutions though. That said, you still get moire even with antialiasing on but it’s usually much less pronounced.
 
For gaming, I wonder if 8k will be the point where the unaliased moire patterns go away. The golden goose with higher resolutions always seems to be the “don’t need antialiasing due to the sheer number of lines” point, but annoying moire patterns still develop even at aliased 4K, particularly with oblique views of parallel lines on the screen, like stairs. They are less strong than at lower resolutions though. That said, you still get moire even with antialiasing on but it’s usually much less pronounced.
Depends on how far away from it you are and the pixel density. It should happen somewhere around 300 PPI at a normal, arm's-length monitor distance, though I don't know of any testing to verify that (this is based off what we see in print media). So at 30" or under, an 8K panel should probably do the trick to make pixels invisible enough that further resolution increases don't matter.

Likely to be quite some time, though. Right now we are just getting to the point of being able to handle 4K at good speeds, and that's only when you don't start doing things like ray tracing. It'll be some time yet before GPUs are beefy enough to handle nice-looking games at 8K with good FPS, particularly since, really, we want to target above 60. Spatial aliasing isn't the only thing to think about; low framerate is temporal aliasing. High FPS really does make everything look more fluid and makes it easier to follow fast-moving objects with your eye.
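That ~300 PPI figure roughly checks out against the common ~60 pixels-per-degree rule of thumb for 20/20 acuity. A quick sketch; the 30-inch panel and 24-inch viewing distance are just illustrative assumptions:

```python
import math

def ppi(diagonal_in, horiz_px, aspect=(16, 9)):
    width_in = diagonal_in * aspect[0] / math.hypot(*aspect)
    return horiz_px / width_in

def pixels_per_degree(ppi_val, distance_in):
    # how many pixels fit inside one degree of visual angle at this distance
    return ppi_val * distance_in * math.tan(math.radians(1))

density = ppi(30, 7680)                # hypothetical 30" 16:9 panel at 8K
ppd = pixels_per_degree(density, 24)   # roughly arm's length (24 inches)
print(f'30" 8K: ~{density:.0f} PPI, ~{ppd:.0f} pixels per degree at 24 in')
# -> ~294 PPI and ~123 px/degree, comfortably past the ~60 PPD 20/20 mark
```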
 
But this guy has such a non-punchable face

Damn that guy is annoying.

He is mostly right though.

Didn't help his argument that his face was blurry most of the video though. I'm guessing poor focus was the cause.

4k is really only useful if you are up close and personal. Essentially filling your entire FOV.

8k is never useful, unless you are using it in such a way that you disregard most of what is on screen, like standing super close and only looking at small portions of the screen at a time.
 
For gaming, I wonder if 8k will be the point where the unaliased moire patterns go away. The golden goose with higher resolutions always seems to be the “don’t need antialiasing due to the sheer number of lines” point, but annoying moire patterns still develop even at aliased 4K, particularly with oblique views of parallel lines on the screen, like stairs. They are less strong than at lower resolutions though. That said, you still get moire even with antialiasing on but it’s usually much less pronounced.

Nah, increasing resolution (or pixel density) is not an effective way to get rid of aliasing. At some point it will happen, but it will be at such an excruciatingly high resolution that it will be completely impractical.

Do a little experiment: view aliased rendered content on a screen, then keep stepping back until the aliasing is no longer perceptible. Do the math on the relative FOV coverage and determine the resolution you'd need for the same effect at your normal viewing distance.

It's pretty damn high.
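To put a number on "pretty damn high" (purely illustrative: the 3x step-back factor below is a made-up example, not a measurement):

```python
# Illustrative only: if aliasing on a 4K image stops being visible once you
# step back to 3x your normal viewing distance, then at the normal distance
# you would need 3x the linear resolution for the same result.
base_w, base_h, distance_factor = 3840, 2160, 3
req_w, req_h = base_w * distance_factor, base_h * distance_factor
print(f"{req_w}x{req_h}, i.e. {distance_factor**2}x the pixels of 4K")
# -> 11520x6480, i.e. 9x the pixels of 4K
```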

I think the future of antialiasing will be DLSS-like technologies that recreate on screen imagery using AI in such a way that it is not aliased.

It should be a hell of a lot less computationally expensive than the level of super high resolution required to do so.
 
I know there are LG 4K TVs out now that already do 120Hz and have adaptive sync.

This isn't "news", it's advertising...
 