3080 - 1440p 120fps HDR problem

mike6289

n00b
Joined
Oct 2, 2016
Messages
28
So I have an Asus Tuf Gaming edition 3080 hooked up to my OLED TV, a Sony A8G.
The TV doesn't support HDMI 2.1, but that's not a dealbreaker for me: I want 120fps gaming, and the card is really only capable of that at 1440p with the nicer graphics settings in modern games.

We know about the bandwidth limitations of HDMI 2.0 - the spec is 18.0 Gbit/s total, of which 14.4 Gbit/s is available for video data (because of the 8b/10b encoding overhead).

I don't know the exact math behind the published figures, but for 4k60: 3840x2160 is 8,294,400 pixels per frame, 60 times a second is 497,664,000 pixels per second, and at 24 bits per pixel (8 bits each for R, G and B) that's 11,943,936,000 bps, or about 11.94 Gbit/s - reasonably close to the 12.54 Gbit/s stated on Wikipedia for 4k60 8 bit colour (the difference being mostly the blanking intervals the real signal also has to carry).
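If anyone wants to sanity check the arithmetic, here's a quick back-of-the-envelope calculator. It only counts active pixels, so real HDMI timings (which also carry blanking) come out somewhat higher - which is roughly the gap to Wikipedia's numbers:

Code:
# Back-of-the-envelope data rate, counting active pixels only.
# Real HDMI modes also transmit blanking, so the true link requirement is a bit higher.

def active_rate_gbps(width, height, refresh_hz, bits_per_component, components=3):
    return width * height * refresh_hz * bits_per_component * components / 1e9

print(active_rate_gbps(3840, 2160, 60, 8))     # ~11.94 Gbit/s  (4k60, 8 bit RGB)
print(active_rate_gbps(2560, 1440, 120, 10))   # ~13.27 Gbit/s  (1440p120, 10 bit RGB)

HDMI_20_VIDEO_LIMIT_GBPS = 14.4  # usable for video after the 8b/10b overhead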

Now, Wikipedia states that 1440p 120hz at 10 bit colour (which I really want over 8 bit because of banding) requires 14.49 Gbit/s, and we only have 14.40. The page says you need to drop to 4:2:2 chroma to stay within the available bandwidth. I've tried to set that, but the colour depth isn't taking: I can set 4:2:2 successfully, but afterward the drop-down menu for colour depth only offers 8 bit.

I really want 4:4:4 for clear text, and I found there's a way to shave the bandwidth a bit with a timing mode called "CVT reduced blanking" - after some math I found it only takes 97.11% of the bandwidth of the default timing, which works out to 14.07 Gbit/s for 10 bit colour.
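Here's the same idea with blanking included, to show why the timing formula matters. The h_total/v_total values below are ones I plugged in for illustration, not necessarily what the driver actually generates for its "default" vs reduced blanking modes:

Code:
# Same math but from a complete timing (active + blanking), which is what the link
# actually carries. The totals below are illustrative - the real ones depend on which
# timing formula (CVT, CVT reduced blanking, etc.) the driver generates.

def link_rate_gbps(h_total, v_total, refresh_hz, bits_per_component, components=3):
    pixel_clock = h_total * v_total * refresh_hz
    return pixel_clock * bits_per_component * components / 1e9

# 2560x1440 @ 120hz, 10 bit, with two different amounts of horizontal blanking
print(link_rate_gbps(2720, 1525, 120, 10))  # ~14.93 Gbit/s (160-pixel h-blank)
print(link_rate_gbps(2640, 1525, 120, 10))  # ~14.49 Gbit/s (80-pixel h-blank, matches Wikipedia)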

I made a custom resolution of 2560x1440 @ 120hz with the reduced-blanking CVT timing, and it runs in 8 bit colour no problem (the standard timing works too), but when I go to change the bit depth to 10, the drop-down menu still only contains the 8 bit option. It's extremely frustrating, because even at 4:2:2 and 4:2:0, where the bandwidth requirement is WAY under what's available, I still can't select 10 bit. It'll do 10 bit and even 12 bit at 30hz, but part of the problem is the "active signal resolution" forcing itself to 4k when the selected "desktop resolution" is 1440p. I want the TV to scale the video signal because it does a much better job than the card (the higher end Sonys are known for having the best upscaling). When I feed it a 1080 or 1440 signal with the "active signal resolution" actually matching the desktop resolution, the TV honestly looks almost like a native 1080p or 1440p display - fonts are sharp and not overly bold. It's likely noticeably better than LG's TVs, which would make my A8G a better display for high frame rate gaming with a 3080 than even LG's CX, as long as I can get these colour settings to take. Otherwise the TV is just a bit sharper, but with more banding.

I should add, I can't even select 10 or 12 bit colour at 1080p 120hz, so I have a feeling this isn't bandwidth related. The TV supports 1080p 60hz at both 10 and 12 bit colour 4:4:4 (SDR and HDR), but I can't get 1440p at 60hz to do 10 or 12 bit colour because Windows wants the "active signal resolution" to be 3840x2160, and HDMI 2.0 can't carry 4k60 at 4:4:4 in anything more than 8 bit colour.

So, does anyone know why I don't have the menu options available and how to go about enabling them? Is this a driver issue? I really, really don't want 8 bit HDR.
And how do I make the active signal resolution in Windows 10 always match the chosen desktop resolution? On all my other systems the two have always been the same. It matters a lot because of the poor built-in scaling the 3080 seems to have (at least compared to my TV), and because it makes 10/12 bit colour unavailable even at 1440p 60hz, since the signal resolution ends up at 3840x2160 (beyond what HDMI 2.0 can carry at that depth).
 
If you don't get a response here, Sony support might be your only option.

Rtings only shows 10 bit for one measurement (4K 60hz); none of the other resolutions they list show a 10 bit option: https://www.rtings.com/tv/reviews/sony/a8g-oled
Please don't reply without reading what the actual issue is. When things are long, people sometimes skim and check the first response for the gist of a problem, which confuses the issue. Forcing the "active signal resolution" to match the "desktop resolution" needs to happen (which I can't figure out how to do - the nvidia control panel display scaling page doesn't do it) before the TV can even be considered as the culprit. I'd also like to know if anyone has this mode (1440p 120hz 4:4:4 10 bit HDR) working on any HDMI 2.0 monitor with an NVIDIA (or AMD) card, as proof it works with anything, and what steps they had to take, if anything extra was needed.
 
Please don't reply without reading what the actual issue is. People sometimes skim and check the first response for the gist of a problem when things are long. Forcing the "active signal resolution" to match "desktop resolution" needs to happen before the tv can even be considered as the culprit. I'd also like to know if anyone has the resolution (1440p 120hz 4:4:4 10 bit HDR) working on any HDMI 2.0 monitor with an NVIDIA (or AMD) for proof it works with anything, and what steps they may have needed to take, if they had to do anything extra

If you don't get a response here, Sony support might be your only option.
 
If you don't get a response here, Sony support might be your only option.
Sony doesn't have anything to do with nvidia or Microsoft, or with matching the signal resolution to the desktop resolution. Nvidia and Microsoft are hard to get a hold of, so I'm here looking for users' experience - both to fix my problem and to leave a record of the fix for others who hit the same thing in the future. If you have nothing to contribute...
 
Sony doesn't have anything to do with nvidia or Microsoft, or matching signal resolution to desktop resolution. Nvidia and Microsoft are hard to get a hold of, so I'm here looking for users' experience, so I can fix my problem and create a record of how it was fixed for others with the same problem in the future to reference. If you have nothing to contribute...
I contributed more than anyone else so far. That make you mad?
 
Sony doesn't have anything to do with nvidia or Microsoft, or matching signal resolution to desktop resolution. Nvidia and Microsoft are hard to get a hold of, so I'm here looking for users' experience, so I can fix my problem and create a record of how it was fixed for others with the same problem in the future to reference. If you have nothing to contribute...
You might find more help over in the "Display" section of the forum, since your question is about a specific display.
 
You might find more help over in the "Display" section of the forum, since your question is about a specific display.
I need Windows and the nvidia driver to output my chosen desktop resolution as the signal resolution. That has nothing to do with the display yet. The display is capable, and if it's not, I should still be able to force an output and have the display tell me it's not supported.
 
I need Windows and the nvidia driver to output my chosen desktop resolution as the signal resolution. Has nothing to do with the display yet. The display is capable, and if it's not I should still be able to force an output and have the display tell me it's not supported
It's a question about a specific display and the settings needed to get 1440p120. The video card section is mostly about GPU settings and tweaking. I think you're in the wrong section.
 
It's likely noticeably better than LG's TVs, making my A8G a better display for high frame rate gaming with a 3080 than even LG's CX. As long as I can get these colour settings to take. Otherwise the TV is just a bit sharper, but with more banding.

Apparently not, since the TV doesn't have HDMI 2.1 and thus doesn't support any sort of VRR tech or 4k/120Hz, and otherwise the two are at parity in supported resolutions, HDR features, 4:4:4 chroma support, response times, etc. Also, Rtings themselves reported an issue with text clarity on the A8G and rated it lower as a monitor than LG's recent OLEDs (8.0 vs 8.6):

We encountered an issue when using the A8G connected to a PC. When sending full 4:4:4 chroma or RGB, there appears to be a processing issue affecting the white sub-pixel, which results in non-uniform text. The affected pixels also appeared to move around as we moved our test image around the screen.

So probably not the answer you're looking for, but it's likely your TV is being restricted, artificially or otherwise, by the HDMI 2.0 standard for 10 bit HDR even at 4:2:2/4:2:0 chroma. As others have said here, you may be better off contacting Sony or going to a forum more specific to this TV, or to displays in general, for an answer. Otherwise you may have to trade in this TV for one with proper HDMI 2.1 and 1440p/120Hz w/ 4:4:4 support. Rtings also pointed out a related issue there:

Chroma 4:4:4 is displayed properly in most formats, except when sending 1440p @ 120Hz, as long as the Picture Mode is set to 'Game' or 'Graphics'. Even though chroma 4:4:4 is displayed properly, there is a processing issue on the A8G that causes a rendering issue when sending a 4k 4:4:4/RGB signal, as described here.
 
So I have an Asus Tuf Gaming edition 3080 hooked up to my OLED TV, a Sony A8G.
The TV doesn't support HDMI 2.1, but that's not a dealbreaker for me: I want 120fps gaming, and the card is really only capable of that at 1440p with the nicer graphics settings in modern games.

We know about the bandwidth limitations of HDMI 2.0 - the spec is 18.0 Gbit/s total, of which 14.4 Gbit/s is available for video data (because of the 8b/10b encoding overhead).

I don't know the exact math behind the published figures, but for 4k60: 3840x2160 is 8,294,400 pixels per frame, 60 times a second is 497,664,000 pixels per second, and at 24 bits per pixel (8 bits each for R, G and B) that's 11,943,936,000 bps, or about 11.94 Gbit/s - reasonably close to the 12.54 Gbit/s stated on Wikipedia for 4k60 8 bit colour (the difference being mostly the blanking intervals the real signal also has to carry).

Now, Wikipedia states that 1440p 120hz at 10 bit colour (which I really want over 8 bit because of banding) requires 14.49 Gbit/s, and we only have 14.40. The page says you need to drop to 4:2:2 chroma to stay within the available bandwidth. I've tried to set that, but the colour depth isn't taking: I can set 4:2:2 successfully, but afterward the drop-down menu for colour depth only offers 8 bit.

I really want 4:4:4 for clear text, and I found there's a way to shave the bandwidth a bit with a timing mode called "CVT reduced blanking" - after some math I found it only takes 97.11% of the bandwidth of the default timing, which works out to 14.07 Gbit/s for 10 bit colour.

I made a custom resolution of 2560x1440 @ 120hz with the reduced-blanking CVT timing, and it runs in 8 bit colour no problem (the standard timing works too), but when I go to change the bit depth to 10, the drop-down menu still only contains the 8 bit option. It's extremely frustrating, because even at 4:2:2 and 4:2:0, where the bandwidth requirement is WAY under what's available, I still can't select 10 bit. It'll do 10 bit and even 12 bit at 30hz, but part of the problem is the "active signal resolution" forcing itself to 4k when the selected "desktop resolution" is 1440p. I want the TV to scale the video signal because it does a much better job than the card (the higher end Sonys are known for having the best upscaling). When I feed it a 1080 or 1440 signal with the "active signal resolution" actually matching the desktop resolution, the TV honestly looks almost like a native 1080p or 1440p display - fonts are sharp and not overly bold. It's likely noticeably better than LG's TVs, which would make my A8G a better display for high frame rate gaming with a 3080 than even LG's CX, as long as I can get these colour settings to take. Otherwise the TV is just a bit sharper, but with more banding.

I should add, I can't even select 10 or 12 bit colour at 1080p 120hz, so I have a feeling this isn't bandwidth related. The TV supports 1080p 60hz at both 10 and 12 bit colour 4:4:4, but I can't get 1440p at 60hz to do 10 or 12 bit colour because Windows wants the "active signal resolution" to be 3840x2160, and HDMI 2.0 can't carry 4k60 at 4:4:4 in anything more than 8 bit colour.

So, does anyone know why I don't have the menu options available and how to go about enabling them? Is this a driver issue? I really, really don't want 8 bit HDR.
And how do I make the active signal resolution in Windows 10 always match the chosen desktop resolution? On all my other systems the two have always been the same. It matters a lot because of the poor built-in scaling the 3080 seems to have (at least compared to my TV), and because it makes 10/12 bit colour unavailable even at 1440p 60hz, since the signal resolution ends up at 3840x2160 (beyond what HDMI 2.0 can carry at that depth).
So I finally forced myself to read completely through your post to see what the exact issue is. My takeaway here is that you're trying to get the A8G to do something that, put simply, it is incapable of doing.

I have to say that you sound quite arrogant in this post, especially saying that the 3080 is only a 1440p120 card. So many people here (myself included) would call bs on this, but I suppose that's up to someone's point of view, so I'll begrudgingly accept it.

It's apparent you've done quite a bit of research on this topic, and I applaud you for that. However, you're forgetting one major item: is the TV capable of processing what you're asking for? Sure, the port is HDMI 2.0b, but is it capable of accepting the full bandwidth of HDMI 2.0b? If it were, it would be able to accept a 4K120 4:2:0 8-bit signal, but it can't do that, which throws a wrench into your theory of what the TV "should" be able to do. Your issue is not HDMI... it's the TV.

Which leads me to my last point about you saying that the A8G is "a better display for high frame rate gaming with a 3080 than even LG's CX." There is a small army of CX owners on this board that would 100% disagree with this statement, especially considering that the CX can handle 4K120 4:4:4 10-bit HDR on the 3080 without issue.

With that being said, the A8G is superior at picture processing if you're watching movies and other consumable content. Sony has always had excellent picture processing. But for gaming, the LG C9 and CX are the best large format displays for gaming, period. Full stop.

So instead of trying to justify your purchase with hilariously inaccurate statements like this, try asking "Is there a way to do what I want my A8G to do?"
 
I have a problem that appears to be down to a similar issue.
My TV can do 1440p120 and I used to play racing games like this a lot with older drivers.
But since NVidia started bundling all standard resolutions under the 4Kx2K section in the driver, I have not been able to select more than 60Hz.
Even 1080p 120Hz is no longer allowed.
It looks like lower-than-native resolutions under the 4Kx2K section are messed with in some way so they don't leave enough bandwidth for higher refresh rates. Or they are limited unnecessarily by the driver.
Either way, it's not a satisfactory situation, because the only resolutions I can use at 120Hz are those not under the 4Kx2K section, the max being a 16:10 res near 1080p. Very annoying.
(Windows 7, could manifest a bit different in Windows 10)

Summing up
The NVidia driver is fucking with us.
 
IIRC I had to scroll down and select 2560x1440 in the PC resolution section instead of the 1440 setting in the Ultra HD, HD, SD section that it defaulted to. Refresh above 60Hz was not available until I did. This is probably not applicable to the OP's problem since it appears they're getting 120Hz, but it might be worth a try, since selecting the resolution under the PC section may offer more options. The Nvidia control panel > display > change resolution section.
 
IIRC I had to scroll down and select 2560x1440 in the PC resolution section instead of the 1440 setting in the Ultra HD, HD, SD section that it defaulted to. Refresh above 60Hz was not available until I did. This is probably not applicable to the OP's problem since it appears they're getting 120Hz, but it might be worth a try, since selecting the resolution under the PC section may offer more options. The Nvidia control panel > display > change resolution section.

For a while this was true and it worked, but 1440p is now solely in the 4Kx2K section and doesn't work above 60Hz.
Just being in that section sounds the 120Hz death knell with my native-res 60Hz display.
 
Just because the bandwidth is theoretically enough doesn't mean Sony configured the TV to support it.
 
For a while this was true and it worked, but 1440p is now solely in the 4Kx2K section and doesn't work above 60Hz.
Just being in that section sounds the 120Hz death knell with my native-res 60Hz display.
This seems to be the issue. I created a custom resolution of 2560x1440 at 120hz, and to get it I either have to choose it from the custom resolutions in the nvidia control panel, or use Windows. Once the custom resolution is added, the 120hz refresh rate becomes available in advanced display settings > display adapter properties > monitor tab.
 
I have a problem that appears to be down to a similar issue.
My TV can do 1440p120 and I used to play racing games like this a lot with older drivers.
But since NVidia started bundling all standard resolutions under the 4Kx2K section in the driver, I have not been able to select more than 60Hz.
Even 1080p 120Hz is no longer allowed.
It looks like lower-than-native resolutions under the 4Kx2K section are messed with in some way so they don't leave enough bandwidth for higher refresh rates. Or they are limited unnecessarily by the driver.
Either way, it's not a satisfactory situation, because the only resolutions I can use at 120Hz are those not under the 4Kx2K section, the max being a 16:10 res near 1080p. Very annoying.
(Windows 7, could manifest a bit different in Windows 10)

Summing up
The NVidia driver is fucking with us.
If the game you're playing has "windowed full screen" mode, you might be able to add the custom resolution in the nvidia control panel and get 120hz. COD black ops 3 had the option for 120hz when I first got the card, but one day it disappeared and the only way I could get it working again was to set my desktop resolution first and use this mode. I don't think I did a driver update and don't know if there was a game update - reinstalling had no effect. There might be a slight performance penalty. I can't say for sure because both ways, 99.x% of the time I kept a minimum of 120 fps (played with vsync)
edit: with it off it was between 130-170fps. I only had it off briefly when 120hz was still available in game to see what the card did when I got it
edit2: when things were working best, for games to have the 120hz option in their menus, the desktop resolution needed to be set to either 1080p or 1440p at 120hz. Whichever resolution the desktop was set to was the one that had the 120hz option in the drop-down.
 
Apparently not, since the TV doesn't have HDMI 2.1 and thus doesn't support any sort of VRR tech or 4k/120Hz, and otherwise the two are at parity in supported resolutions, HDR features, 4:4:4 chroma support, response times, etc. Also, Rtings themselves reported an issue with text clarity on the A8G and rated it lower as a monitor than LG's recent OLEDs (8.0 vs 8.6):



So probably not the answer you're looking for, but it's likely your TV is being restricted, artificially or otherwise, by the HDMI 2.0 standard for 10 bit HDR even at 4:2:2/4:2:0 chroma. As others have said here, you may be better off contacting Sony or going to a forum more specific to this TV, or to displays in general, for an answer. Otherwise you may have to trade in this TV for one with proper HDMI 2.1 and 1440p/120Hz w/ 4:4:4 support. Rtings also pointed out a related issue there:
I know about Rtings' reported issue with clarity on the A8G. When I first got the TV I experienced it myself, but since an update the issue seems to be resolved. Maybe not 100%, but indistinguishable from perfect, I'd say. Though I don't know what the OLED panel looks like when it's perfectly driven, so there's that.

If you read what they say about the problem: "Chroma 4:4:4 is displayed properly in most formats, except when sending 1440p @ 120Hz, as long as the Picture Mode is set to 'Game' or 'Graphics'.", it's not really a sentence. From my experience with the TV at 1440p120, they seem to mean there's an issue in all picture modes except Game and Graphics. But maybe not. There's a slight reduction in the quality of small fonts at 120hz vs 60hz and below. I sit 9 feet from the 65" TV with 20:15-ish vision, and I used it for months without noticing.

There's obviously a problem with the nvidia cards/drivers. For example, my Apple TV and Panasonic Blu-ray player both send 4k 4:2:2 10/12 bit HDR @ 60hz on a regular basis without issue. Can I get my 3080 to? No. And that's the most common 4k HDR signal sent to 4k TVs. This is not a small problem. There are a lot of 4k HDR HDMI 2.0 displays out there - pretty much all of them, actually.
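For context on why those source devices can do it: 4:2:2 halves the chroma samples, so 4k60 4:2:2 at 12 bit needs no more raw data than 4k60 8 bit RGB. A rough sketch of that math (active pixels only; HDMI actually carries 4:2:2 in a fixed-width container, but the conclusion is the same):

Code:
# Average bits per pixel by chroma format (4:2:2 halves the chroma samples).
# HDMI really packs 4:2:2 into a fixed-width container, but the takeaway is the same:
# 4k60 4:2:2 12 bit needs no more raw data than 4k60 8 bit RGB/4:4:4.

SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def active_rate_gbps(w, h, hz, bits_per_component, chroma="4:4:4"):
    return w * h * hz * bits_per_component * SAMPLES_PER_PIXEL[chroma] / 1e9

print(active_rate_gbps(3840, 2160, 60, 12, "4:2:2"))  # ~11.94 Gbit/s
print(active_rate_gbps(3840, 2160, 60, 8,  "4:4:4"))  # ~11.94 Gbit/s - identical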
 
I think it's the way that the Nvidia control panel reacts to modes. On my Windows 10 / Sony X900f combo, the RGB/YCbCr mode drop down populates correctly for standard 4k modes after it's in that resolution and in HDR mode (set in the Windows display settings). So if I'm in a standard res (e.g. 4k60 HDR) I can then switch to 4:2:2 10 bit mode. In fact at that point, I can pick other impossible combinations of bit depth / resolution that cause my TV to go black. It's clearly a bug. What I think it's trying to do is query the display after that mode is selected, as to what other color modes are available.
When in a custom res like 2560x1440p120 RGB 8bit SDR, I cannot pick any other color mode. I assume that's because the TV is not providing any other supported color modes (no wonder, it's a custom res). But I bet a different tool could do it (at the risk of causing your TV to black out until you hard reboot the TV). I used to use an older tool to set custom resolutions with my CRT which I kept long after I got my first Gsync 144hz monitor, but it clearly pre-dated W10.
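On the "query the display" point, the capability info a sink exposes lives in its EDID - the HDMI Vendor-Specific Data Block advertises which deep colour depths the TV accepts. Here's a rough sketch of reading those flags from a raw EDID dump (the file name is hypothetical, and this says nothing about what the NVIDIA driver actually does with the data):

Code:
# Rough sketch: what a display advertises about deep colour in its EDID
# (CTA-861 extension -> HDMI 1.4 Vendor-Specific Data Block). "edid.bin" is a
# hypothetical raw EDID dump exported with whatever tool you have. HDMI 2.0-specific
# capabilities live in a separate HDMI Forum VSDB that isn't parsed here.

def hdmi_deep_colour_flags(edid: bytes):
    for i in range(1, edid[126] + 1):                 # walk the extension blocks
        block = edid[128 * i: 128 * (i + 1)]
        if len(block) < 128 or block[0] != 0x02:      # 0x02 = CTA-861 extension
            continue
        dtd_start = block[2]
        pos = 4
        while pos < dtd_start:                        # data block collection
            tag, length = block[pos] >> 5, block[pos] & 0x1F
            payload = block[pos + 1: pos + 1 + length]
            # Vendor-specific block carrying the HDMI 1.4 OUI (00-0C-03, LSB first)
            if tag == 3 and length >= 6 and payload[:3] == bytes([0x03, 0x0C, 0x00]):
                flags = payload[5]                    # byte 6 of the HDMI VSDB
                return {
                    "DC_30bit (10 bpc RGB)": bool(flags & 0x10),
                    "DC_36bit (12 bpc RGB)": bool(flags & 0x20),
                    "DC_48bit (16 bpc RGB)": bool(flags & 0x40),
                    "DC_Y444 (deep colour for YCbCr 4:4:4 too)": bool(flags & 0x08),
                }
            pos += 1 + length
    return None

with open("edid.bin", "rb") as f:                     # hypothetical dump file
    print(hdmi_deep_colour_flags(f.read()))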
 
So I finally forced myself to read completely through your post to see what the exact issue is. My takeaway here is that you're trying to get the A8G to do something that, put simply, it is incapable of doing.

I have to say that you sound quite arrogant in this post, especially saying that the 3080 is only a 1440p120 card. So many people here (myself included) would call bs on this, but I suppose that's up to someone's point of view, so I'll begrudgingly accept it.

It's apparent you've done quite a bit of research on this topic, and I applaud you for that. However, you're forgetting one major item: is the TV capable of processing what you're asking for? Sure, the port is HDMI 2.0b, but is it capable of accepting the full bandwidth of HDMI 2.0b? If it were, it would be able to accept a 4K120 4:2:0 8-bit signal, but it can't do that, which throws a wrench into your theory of what the TV "should" be able to do. Your issue is not HDMI... it's the TV.

Which leads me to my last point about you saying that the A8G is "a better display for high frame rate gaming with a 3080 than even LG's CX." There is a small army of CX owners on this board that would 100% disagree with this statement, especially considering that the CX can handle 4K120 4:4:4 10-bit HDR on the 3080 without issue.

With that being said, the A8G is superior at picture processing if you're watching movies and other consumable content. Sony has always had excellent picture processing. But for gaming, the LG C9 and CX are the best large format displays for gaming, period. Full stop.

So instead of trying to justify your purchase with hilariously inaccurate statements like this, try asking "Is there a way to do what I want my A8G to do?"
Thanks for reading - you should do it with all threads. I'm not trying to justify my purchase; I'm frustrated with my purchase not doing what I thought it would when I bought the TV - whether it's down to the card or the TV is yet to be determined. It's for movies/shows first, then gaming, probably an 85/15 split. Got it last Boxing Day and did absolutely no gaming on it for about three quarters of its life.

edit: I mostly play fast-paced games (fps, racing), and with OLED, the extremely fast pixel response time of ~0.1ms for mostly complete pixel transitions causes a strobe-like effect when bright things on dark backgrounds move more than roughly 5cm per frame on TV-sized displays. Quick movements in fps games cause this a lot at 60hz, and it's unpleasant and distracting, especially in a dark room. VRRing between 60 and 120 but mostly sitting between 70 and 90 is not my idea of ideal, so in my opinion, for the games I play and for this display type, the 3080 is a 1440p120 card.
 
I think it's the way that the Nvidia control panel reacts to modes. On my Windows 10 / Sony X900f combo, the RGB/YCbCr mode drop down populates correctly for standard 4k modes after it's in that resolution and in HDR mode (set in the Windows display settings). So if I'm in a standard res (e.g. 4k60 HDR) I can then switch to 4:2:2 10 bit mode. In fact at that point, I can pick other impossible combinations of bit depth / resolution that cause my TV to go black. It's clearly a bug. What I think it's trying to do is query the display after that mode is selected, as to what other color modes are available.
When in a custom res like 2560x1440p120 RGB 8bit SDR, I cannot pick any other color mode. I assume that's because the TV is not providing any other supported color modes (no wonder, it's a custom res). But I bet a different tool could do it (at the risk of causing your TV to black out until you hard reboot the TV). I used to use an older tool to set custom resolutions with my CRT which I kept long after I got my first Gsync 144hz monitor, but it clearly pre-dated W10.
For me, when I switch to 4k60 I get 8 bit colour. HDR gives me "8 bit dithered". Changing the chroma resolution to 4:2:2 doesn't make the 10 or 12 bit colour options available. The only thing that does is a refresh rate of 30hz.
Actually, I once made a custom resolution at 48hz and 10 bit colour was available there. I believe 10 bit 4k at 48hz 4:4:4 is the exact same bit rate as 8 bit 4k 60hz 4:4:4, about 87% of HDMI 2.0.
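Quick sanity check on that claim (active pixels only; the "87%" lines up if you use Wikipedia's blanking-inclusive figure of 12.54 Gbit/s against the 14.4 Gbit/s video limit):

Code:
# 4k 48hz 10 bit vs 4k 60hz 8 bit, 4:4:4, active pixels only
pixels = 3840 * 2160
print(pixels * 48 * 30 == pixels * 60 * 24)  # True - exactly the same bit rate
print(pixels * 60 * 24 / 1e9)                # ~11.94 Gbit/s active; Wikipedia's
                                             # blanking-inclusive 12.54 is ~87% of 14.4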

I'm definitely looking for a tool like that. My TV will display non-standard modes, and the only way to know which ones is to send them.
 
Well we have different TVs (but both Sony HDMI 2.0) so I can see that. My exact steps:
1. Start in W10 display settings, enable HDR
2. Launch Nvidia control panel, click on correct monitor > Change resolution
3. Choose 4:2:2 instead of RGB
4. Apply
5. Now I can pick 10 bit (or 12 bit) > Apply

Note step 4 - it doesn't work without it.

Edit: After doing all of the above, I can now try to set 1440p120 or 1080p120 keeping the color settings from the previous mode (again, probably a bug that it lets me try), but it causes the Sony TV to crash back to the android screen after a ~20sec black screen.

Trying to replicate these steps from scratch without first going to a standard 4K HDR mode doesn't work for 1920x1080p120, which is 1/2 the bandwidth of the above, so it's clearly not bandwidth related.
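For what it's worth, the "half the bandwidth" figure checks out on pixel rate alone (the bits per pixel then depend on whichever chroma format and depth is chosen):

Code:
# pixel rate only - bits per pixel then depend on chroma format and depth
print(1920 * 1080 * 120)  # 248,832,000 pixels/s for 1080p120
print(3840 * 2160 * 60)   # 497,664,000 pixels/s for 4k60 - exactly double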
 
Well we have different TVs (but both Sony HDMI 2.0) so I can see that. My exact steps:
1. Start in W10 display settings, enable HDR
2. Launch Nvidia control panel, click on correct monitor > Change resolution
3. Choose 4:2:2 instead of RGB
4. Apply
5. Now I can pick 10 bit (or 12 bit) > Apply

Note step 4 - it doesn't work without it.

Edit: After doing all of the above, I can now try to set 1440p120 or 1080p120 keeping the color settings from the previous mode (again, probably a bug that it lets me try), but it causes the Sony TV to crash back to the android screen after a ~20sec black screen.

Trying to replicate these steps from scratch without first going to a standard 4K HDR mode doesn't work for 1920x1080p120, which is 1/2 the bandwidth of the above, so it's clearly not bandwidth related.

I have had those options available before - I'm not sure of the exact steps I took to get there - but when I tried to apply the 10 or 12 bit colour, the screen flickered (off for ~1/10th of a second) and then the menu options were gone, along with the apply button. It takes the TV 2-3 seconds to start displaying any new resolution, which led me to believe there was no attempt at a mode change.

Edit: it looks like the X900F has the X1 Extreme video processor, same as the A8G. Our TVs should have the same picture mode compatibility.
According to reviews by competent people (including Vincent Teoh of the YouTube channel HDTVTest), it's even better than the X1 Ultimate in Sony's current Master Series TVs in some ways - I think better scaling and motion interpolation (selectable higher frame rates without making things look like a cheap production, aka the "soap opera effect"). It has fewer interpolation errors and locks onto the video better than pretty much all other TVs, dropping the fewest intermittent frames while maintaining realism. It doesn't beat the X1 Ultimate at upscaling to 8k, though, for obvious reasons. Its object-based HDR enhancement is supposedly a bit better, but that's not something that really needs enabling with quality content.
 