Gsync & FreeSync TVs (ever a possibility?)

mr_zen256

We all know monitors are all the rage with that technology; it seems a shame for it to be limited to such small-scale displays.

Obviously demand is a major factor, and not a lot of PC gamers use TVs as their main displays. I'd like to see this happen, though.

Plausible?
 
The Xbox One X has FreeSync in its specs, so I think that'll get the party started.

Freesync doesn’t require DisplayPort.
 
The HDMI 2.1 spec includes variable refresh rate, and that's the most likely thing to be implemented in TVs. It's unlikely anyone is going to put DisplayPort on TVs.

How about the opposite? Could this be the development that ends the stupid split market between the two -sync technologies by leapfrogging both of them with a standard variable refresh rate?
 
We all know monitors are all the rage with that technology; it seems a shame for it to be limited to such small-scale displays.

Obviously demand is a major factor, and not a lot of PC gamers use TVs as their main displays. I'd like to see this happen, though.

Plausible?
Technically, easily.
Under license, nope.
 
Considering OLED TVs are the best-quality displays right now, aside from the two fairly major drawbacks of input lag and no variable refresh rate, and HDMI 2.1 provides both variable refresh and lower input lag according to the specifications, this could provide an incentive for Nvidia to support FreeSync.
As it looks now, the top-tier near-future setup is an AMD GPU plus an HDMI 2.1 FreeSync OLED TV, assuming they can/will implement FreeSync on OLED controllers, which differ from normal LCD controllers.
So assuming it can be done, does Nvidia want to be #2 for top-tier setups, or is the market share of OLED TV PC gaming so small that they just don't care?
(The marketing value of being discussed as #1 among forum fanatics and tech news sites is probably not zero, and Nvidia has remained open to supporting FreeSync in the future.)
 
How about the opposite? Could this be the development that ends the stupid split market between the two -sync technologies by leapfrogging both of them with a standard variable refresh rate?

Sort of... it's a misconception that G-Sync is just variable refresh rate and that FreeSync is the same thing. At their core they're both variable refresh rate, but each includes different features and implementation details. For example, G-Sync includes variable overdrive for LCDs, which decreases ghosting, while FreeSync does not. All G-Sync displays also have what AMD calls low framerate compensation, while most FreeSync monitors don't. The upcoming HDR version of G-Sync may include new stuff that we haven't been told about, too.
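
(For anyone unfamiliar with low framerate compensation, here's a rough sketch of the idea in Python; the panel limits are made-up example numbers, not any particular display's spec, and real drivers are obviously far more sophisticated:)

import math

PANEL_MIN_HZ = 40   # hypothetical VRR floor
PANEL_MAX_HZ = 144  # hypothetical VRR ceiling

def refresh_for_frame(game_fps):
    """Refresh rate the panel would be driven at for a given game frame rate."""
    if game_fps >= PANEL_MIN_HZ:
        # Normal VRR: the panel refreshes once per rendered frame.
        return min(game_fps, PANEL_MAX_HZ)
    # LFC: repeat each frame enough times to stay above the panel's floor.
    multiplier = math.ceil(PANEL_MIN_HZ / game_fps)
    return min(game_fps * multiplier, PANEL_MAX_HZ)

print(refresh_for_frame(90))   # 90 -> panel runs at 90 Hz
print(refresh_for_frame(25))   # 25 -> each frame shown twice, panel runs at 50 Hz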

FreeSync over HDMI was an AMD custom protocol invention, not part of the spec, and I think HDMI 2.1 VRR is different (it might be based on AMD's protocol, though, I don't know). It won't include support for all the tricks G-Sync uses, and I doubt Nvidia wants to be restricted by that, so I don't see G-Sync going away any time soon. G-Sync lets Nvidia put their own hardware in displays and use that hardware to do just about anything they think would improve the gaming experience. This is more powerful and flexible and just works better. It's also inherently more expensive and proprietary, obviously.

AMD has already announced that they will be supporting HDMI 2.1 VRR. There is absolutely nothing preventing Nvidia from supporting it separately from G-Sync, and I will be really disappointed in them if they refuse to, as it's unlikely there will be any other way to do VRR with 2018 TVs, which is something I really want on LG OLEDs.
 
As it looks now, the top-tier near-future setup is an AMD GPU plus an HDMI 2.1 FreeSync OLED TV, assuming they can/will implement FreeSync on OLED controllers, which differ from normal LCD controllers.
You forgot the most major consideration here: GPU speed.
If you need adaptive sync, there's already G-Sync, expensive though it is.
AMD don't have cards to match the performance of Nvidia, though.
Your top-tier prediction isn't top tier.
 
You forgot the most major consideration here: GPU speed.
If you need adaptive sync, there's already G-Sync, expensive though it is.
AMD don't have cards to match the performance of Nvidia, though.
Your top-tier prediction isn't top tier.
If you feel GPU speed is the most important thing, OK, but that requires a very specific scenario and an extremely optimized game with very even rendering times.
Nvidia and AMD GPUs are close enough that I personally don't care about the difference between the top cards if it means giving up variable refresh. If the best monitor in existence were G-Sync, I would run Nvidia graphics (which I do); if the best monitor were FreeSync exclusive, I would change to AMD in a heartbeat.

The user experience of a theoretical OLED TV with HDMI 2.1 and FreeSync would blow an Nvidia + non-sync OLED TV experience away, since G-Sync/FreeSync does so much more than the performance advantage of a 1080 Ti over a Vega 64.
 
If you feel GPU speed is the most important thing, OK, but that requires a very specific scenario and an extremely optimized game with very even rendering times.
Nvidia and AMD GPUs are close enough that I personally don't care about the difference between the top cards if it means giving up variable refresh. If the best monitor in existence were G-Sync, I would run Nvidia graphics (which I do); if the best monitor were FreeSync exclusive, I would change to AMD in a heartbeat.

The user experience of a theoretical OLED TV with HDMI 2.1 and FreeSync would blow an Nvidia + non-sync OLED TV experience away, since G-Sync/FreeSync does so much more than the performance advantage of a 1080 Ti over a Vega 64.
Reading comprehension FTW: I didn't say most important.

There are adaptive sync options for both manufacturers, so that leaves GPU speed.
GPU speed is incredibly important at 4K if you want a top-tier experience.
AMD doesn't have a top-tier card, so a compromise is needed to use FreeSync.
 
Reading comprehension FTW: I didn't say most important.

There are adaptive sync options for both manufacturers, so that leaves GPU speed.
GPU speed is incredibly important at 4K if you want a top-tier experience.
AMD doesn't have a top-tier card, so a compromise is needed to use FreeSync.

In your mind, what is the difference between most important and most major? Most people would probably interpret them the same way in this context.

Only AMD currently supports FreeSync, which it looks like HDMI 2.1 will use (for example, some current FreeSync products like the new Xbox claim they already support the VRR in HDMI 2.1). So as it stands, with HDMI 2.1 it's AMD with VRR versus Nvidia without VRR but with slightly higher performance.

The difference in current top GPU speed is quite small, and again, it's a matter of choice, but I think most people would prefer variable refresh over a small performance boost.
 
In your mind, what is the difference between most important and most major? Most people would probably interpret them the same way in this context.
"Most major consideration" was the term I used.
We know there are adaptive sync solutions on both sides, so the major consideration becomes GPU speed.
4K displays demand the best for top-tier performance.

Only AMD currently supports FreeSync, which it looks like HDMI 2.1 will use (for example, some current FreeSync products like the new Xbox claim they already support the VRR in HDMI 2.1). So as it stands, with HDMI 2.1 it's AMD with VRR versus Nvidia without VRR but with slightly higher performance.

The difference in current top GPU speed is quite small, and again, it's a matter of choice, but I think most people would prefer variable refresh over a small performance boost.
The difference in GPU speeds is substantial.
The best AMD card is just under GTX 1080 performance.
Nvidia has the 1080 Ti, Titan X, and Titan Xp above that by a considerable margin.

Look at the average difference between the Vega 64 and just the stock 1080 Ti FE.
https://www.computerbase.de/thema/grafikkarte/rangliste/

4K performance index:
Vega 64: 66.9%
1080 Ti FE: 89.9%

Nowhere near close.
Add another ~10% for the faster 1080 Ti models.
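
(Taking those index numbers at face value, the gap works out to roughly a third; a quick sanity check:)

vega64 = 66.9        # 4K index from the linked chart
gtx1080ti_fe = 89.9  # 4K index from the linked chart
print(f"{(gtx1080ti_fe / vega64 - 1) * 100:.0f}% faster")  # -> ~34% faster at 4K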
 
"most major 'consideration'" was the term I used.
We know there are async solutions on both sides so the major consideration becomes GPU speed.
4K displays demand the best for top tier performance.


The difference in GPU speeds is substantial.
The best AMD card is just under GTX 1080 performance.
Nvidia has the 1080 Ti, Titan X, and Titan Xp above that by a considerable margin.

Look at the average difference between the Vega 64 and just the stock 1080 Ti FE.
https://www.computerbase.de/thema/grafikkarte/rangliste/

4K performance index:
Vega 64: 66.9%
1080 Ti FE: 89.9%

Nowhere near close.
Add another ~10% for the faster 1080 Ti models.
Just wait for Navi. It will blow away Nvidia's best by at least 83%!
 
We can expect Nvidia to continue to support only G-Sync over DisplayPort, and I give it 50/50 odds that they'll support HDMI VRR.

Not that I'd actually be interested in HDMI VRR; TVs remain a poorer solution for combined computer and gaming use, and OLEDs are the poorest of those. LG (or Samsung or whoever in China gets a top-tier large-panel line going) needs to get QC up and image retention down first.
 
We can expect Nvidia to continue to support only G-Sync over DisplayPort, and I give it 50/50 odds that they'll support HDMI VRR.

Not that I'd actually be interested in HDMI VRR; TVs remain a poorer solution for combined computer and gaming use, and OLEDs are the poorest of those. LG (or Samsung or whoever in China gets a top-tier large-panel line going) needs to get QC up and image retention down first.

I'm going to assume HDMI VRR will be similar to FreeSync in that it will be implementable for minimal extra cost by commodity panel controllers. If that's the case, Nvidia supporting it will mean cheap VRR monitors will be able to compete against G-Sync for people with NV GPUs. Then either Nvidia will have to make a much cheaper controller (theoretically it should be doable, since IIRC G-Sync is being done with an FPGA, great for fast time to market but horrible for cost, so moving to an ASIC would slash costs) or let G-Sync wither away and die.
 
I'm going to assume HDMI VRR will be similar to FreeSync in that it will be implementable for minimal extra cost by commodity panel controllers. If that's the case, Nvidia supporting it will mean cheap VRR monitors will be able to compete against G-Sync for people with NV GPUs. Then either Nvidia will have to make a much cheaper controller (theoretically it should be doable, since IIRC G-Sync is being done with an FPGA, great for fast time to market but horrible for cost, so moving to an ASIC would slash costs) or let G-Sync wither away and die.

That's assuming HDMI VRR provides a similar experience to G-Sync, which we already know it won't, since it certainly isn't going to include variable overdrive, and I'm sure there's a bunch of stuff they're working on for the new version of G-Sync in the 4K 144 Hz HDR displays. FreeSync doesn't provide an equivalent experience either; in fact it's pretty far from it on 99% of FreeSync displays.
 
We all know monitors are all the rage with that technology; it seems a shame for it to be limited to such small-scale displays.

Obviously demand is a major factor, and not a lot of PC gamers use TVs as their main displays. I'd like to see this happen, though.

Plausible?


The issue has been that it was a DisplayPort tech, and I have talked to various companies about why they won't put DP on a receiver or TV, and the answer is always HDCP support.
So now that variable refresh is a supported (though optional) part of HDMI 2.1, we *may* see some TVs with it. My guess is that yes, we will see some, but it will be in the flagship models for the first couple of years.
 
HDMI VRR is likely to resemble FreeSync at launch: better than nothing, but vastly inferior to a G-Sync implementation.

Only FreeSync 2 approaches technological parity with the original G-Sync release.
 
Okay, thanks. I just couldn't think of what VRR meant. Too many abbreviations in my life.
 
I'm sure you've probably heard, but Nvidia is coming out with Big Format Gaming Displays (65" with G-Sync). Announced at CES.
 
The issue has been that it was a DisplayPort tech, and I have talked to various companies about why they won't put DP on a receiver or TV, and the answer is always HDCP support.
So now that variable refresh is a supported (though optional) part of HDMI 2.1, we *may* see some TVs with it. My guess is that yes, we will see some, but it will be in the flagship models for the first couple of years.
That makes no sense. DisplayPort 1.3 has HDCP 2.2 support just like HDMI 2.0.
I'm sure you've probably heard, but Nvidia is coming out with Big Format Gaming Displays (65" with G-Sync). Announced at CES.
BFGDs are monitors, not televisions.
 
Nvidia has a series of 65" 120Hz 4K HDR displays with G-Sync, but it sounds like they're basically just large monitors, not TVs.
 
Considering most people only watch streaming video on their TVs, the "monitor vs TV" distinction is meaningless.

And every TV source besides an antenna bypasses the TV tuner and connects over HDMI anyway.

I wouldn't be surprised if they include a tuner on these, though. They never said they wouldn't.
 
Sorry if this was mentioned, but I had to stop reading the comments halfway through as things turned into your typical AMD vs nV pissing match *sigh*

That being said, from what I've read, FreeSync is exactly what VESA adopted, and so all "FreeSync" is now is the AMD branding of that specification. They pioneered it, but it was accepted as a standard. (I'm saying they pioneered the standard that was accepted, not the technology as a whole, as I'm well aware G-Sync came long before, so don't get your panties in a twist, fanboys lol)

That being said, as mentioned, it's now part of the new standards, so HDMI now has support, and it was mentioned a few months back that sets coming out in 2018 will have adopted those standards; therefore, we will see some that are equipped with "FreeSync". (And I put it in quotes because who knows if that's what it'll be called, whether displays will specifically list it in their features, or whether AMD has a teeny tiny licensing fee tied to using the term [one never knows].)

NOW
... The question becomes "To what end will it be implemented?", and I feel like my laptop is a very relevant data point here... I bought the HP ENVY X360 last year, when they finally decided to release a decent laptop with an AMD APU, in a convertible, with a touchscreen, and most importantly 1080p. What was never mentioned anywhere in the data or specs is that this thing's LG panel has FreeSync! Granted, it's basic, only supporting 40Hz to 60Hz, but I really don't care because it's an added bonus that I didn't see coming. Which comes back to my point: even if some of the TVs only had 40-60Hz, I'd still be happy. I'd very much welcome higher-end models with a 25-120Hz range to compete with the BFGDs in that arena, but for there to be just a basic level of "FreeSync" included in the majority of TVs that get released in 2018, well, that's going to be a huge selling point for AMD no matter which way you slice it!
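
(One caveat on a narrow range like 40-60Hz: low framerate compensation generally needs the maximum refresh to be at least roughly twice the minimum, so a 40-60Hz window can't frame-double below 40 fps, while a 25-120Hz range can. A quick check, treating the 2x threshold as an assumption:)

def supports_lfc(min_hz, max_hz):
    # Rule-of-thumb check: frame doubling only works if the ceiling is at least 2x the floor.
    return max_hz >= 2 * min_hz

print(supports_lfc(40, 60))    # False - below 40 fps you fall back to v-sync or tearing
print(supports_lfc(25, 120))   # True  - frames can be repeated to stay in range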

If I'm honest, I wouldn't be surprised at all if the BFGD is a direct answer to the new HDMI standard, since it gave AMD a free (no pun intended) bit of market advantage over G-Sync. If I were on AMD's marketing team, I'd definitely be throwing together a campaign stating "Now game, tearing-free and with less input lag, on all 2018 TVs!" [NOTE: that's just an example, I'm not claiming all 2018 sets will be capable of that -.-] A quick bit of Googling seems to imply that "FreeSync" is going to be called "Game Mode VRR". Doesn't roll off the tongue as nicely, but... meh lol
"And AMD is backing another open variant called Game Mode VRR, which has actually been included as part of the new HDMI 2.1 spec — so we should soon see several conventional TVs ship with variable refresh rate support."
- The Verge

I'm still on a 2013? model TV for my monitor, and I'm quite happy with it, but if we're going to get FreeSync-capable TVs this year, then hot damn, I'll be keeping my eyes open!
 
Sorry if this was mentioned, but I had to stop reading the comments halfway through as things turned into your typical AMD vs nV pissing match *sigh*

That being said, from what I've read, FreeSync is exactly what VESA adopted, and so all "FreeSync" is now is the AMD branding of that specification. They pioneered it, but it was accepted as a standard. (I'm saying they pioneered the standard that was accepted, not the technology as a whole, as I'm well aware G-Sync came long before, so don't get your panties in a twist, fanboys lol)

That being said, as mentioned, it's now part of the new standards, so HDMI now has support, and it was mentioned a few months back that sets coming out in 2018 will have adopted those standards; therefore, we will see some that are equipped with "FreeSync". (And I put it in quotes because who knows if that's what it'll be called, whether displays will specifically list it in their features, or whether AMD has a teeny tiny licensing fee tied to using the term [one never knows].)

NOW
... The question becomes "To what end will it be implemented?", and I feel like my laptop is a very relevant data point here... I bought the HP ENVY X360 last year, when they finally decided to release a decent laptop with an AMD APU, in a convertible, with a touchscreen, and most importantly 1080p. What was never mentioned anywhere in the data or specs is that this thing's LG panel has FreeSync! Granted, it's basic, only supporting 40Hz to 60Hz, but I really don't care because it's an added bonus that I didn't see coming. Which comes back to my point: even if some of the TVs only had 40-60Hz, I'd still be happy. I'd very much welcome higher-end models with a 25-120Hz range to compete with the BFGDs in that arena, but for there to be just a basic level of "FreeSync" included in the majority of TVs that get released in 2018, well, that's going to be a huge selling point for AMD no matter which way you slice it!

If I'm honest, I wouldn't be surprised at all if the BFGD is a direct answer to the new HDMI standard, since it gave AMD a free (no pun intended) bit of market advantage over G-Sync. If I were on AMD's marketing team, I'd definitely be throwing together a campaign stating "Now game, tearing-free and with less input lag, on all 2018 TVs!" [NOTE: that's just an example, I'm not claiming all 2018 sets will be capable of that -.-] A quick bit of Googling seems to imply that "FreeSync" is going to be called "Game Mode VRR". Doesn't roll off the tongue as nicely, but... meh lol
"And AMD is backing another open variant called Game Mode VRR, which has actually been included as part of the new HDMI 2.1 spec — so we should soon see several conventional TVs ship with variable refresh rate support."
- The Verge

I'm still on a 2013? model TV for my monitor, and I'm quite happy with it, but if we're going to get FreeSync-capable TVs this year, then hot damn, I'll be keeping my eyes open!


I agree. This is huge for AMD.

I'd love to vote with my wallet and reward them for sticking with the open standard, instead of Nvidia's proprietary nonsense, but the truth is they just don't have a GPU capable of taking advantage of 4k resolutions properly yet.

Heck, my Pascal Titan overclocked on water is only about 80% of the way there. AMD doesn't have any GPU competitive at that performance level, and there is no way in hell I'd ever go back to crossfire again.

I guess I'll be shopping for a BFGD screen in 2018, when a 4k 120+hz model in the 42-44" size range comes out.

*Sigh* I hate rewarding bad companies that come out with proprietary standards instead of open ones, but I just don't have a choice unless AMD pulls a surprise and actually comes out with a GPU 20+% faster than my overclocked Pascal Titan on water in 2018.

In the end, it all comes down to raw pixel-pushing performance, at least if you are an early adopter of new high resolutions.
 
I agree. This is huge for AMD.

I'd love to vote with my wallet and reward them for sticking with the open standard, instead of Nvidia's proprietary nonsense, but the truth is they just don't have a GPU capable of taking advantage of 4k resolutions properly yet.

Heck, my Pascal Titan overclocked on water is only about 80% of the way there. AMD doesn't have any GPU competitive at that performance level, and there is no way in hell I'd ever go back to crossfire again.

I guess I'll be shopping for a BFGD screen in 2018, when a 4k 120+hz model in the 42-44" size range comes out.

*Sigh* I hate rewarding bad companies that come out with proprietary standards instead of open ones, but I just don't have a choice unless AMD pulls a surprise and actually comes out with a GPU 20+% faster than my overclocked Pascal Titan on water in 2018.

In the end, it all comes down to raw pixel-pushing performance, at least if you are an early adopter of new high resolutions.

I agree. The reason I dislike nV is not because I'm an AMD fanboy (though I do love AMD, and have since my Athlon Classic 550), but because of their business practices. Same reason I dislike Intel. I openly admit that both companies smack AMD around in their respective categories (thankfully Ryzen has leveled that playing field a bit), but I'm still not going to buy from either, even if it means a sacrifice in performance or game settings. I just can't, personally, invest in either of them due to their immoral business tactics. When it's not my own money and when money is no object, I definitely recommend Intel, but given most folk DO have a budget, I generally suggest AMD instead.

Nevertheless, something that's been kicked around in the past, after Ryzen's release when more Vega details came to light, and something that I think we all very much hope will become a reality, is the Infinity Fabric becoming the new bridge for AMD systems. I loved the concept of HSA and was rather upset to see it never materialize (mostly because nobody wanted to adopt it, given how much of an underdog AMD was), so I'm hoping that this "HSA 2.0" of sorts will get a better chance now that Ryzen has proven itself capable. It might require Ryzen+ and the X470 platform, but I'm hoping we're going to see Vega able to tap into the Infinity Fabric, where we'll perhaps see a dual... quad?... chip card that won't require a PCIe bridge chip. All of the Vega dies would communicate across the Infinity Fabric with each other, with Ryzen, and with system RAM. I think that's what Ryzen has shown us is AMD's new path: combining products in a modular way to achieve what they've been unable to otherwise, in order to compete with nV again. Granted, I think Navi will bring good things, but that's the optimist in me :p

The problem then will obviously be the exact same problem we have currently... how to get these products into consumers' hands and not into mining racks locked away in a datacenter? Sure, they've got the right idea with mining-centric cards, but that only works when you have the chips to supply both product lines, and I think we all know that if there were enough, we'd see those mining cards available. :S

Either way, fingers crossed for 2018 :)
 
I'm not that hopeful we'll see a 42"-44" 4K @ 120Hz model in 2018, but we'll have to wait and see.
 
Might be of interest...
"When FlatpanelsHD met HDMI Forum’s CEO Rob Tobias and Marketing Director Brad Bramy at CES 2018, the team highlighted VRR (Variable Refresh Rate), QMS (Quick Media Switching) and eARC (enhanced Audio Return Channel) as examples of features that can be added via a firmware update."

That'd be really great if manufacturers would add it to 2017 models! I mean, we all know that VRR (FreeSync) doesn't need HDMI 2.1 in order to function, given it works on pre-2.0 devices. Hopefully someone from those makers is reading this thread, because here's some more free sales advice: Take your 2017 premium models, toss on the aforementioned 2.1 goodies *coughVRR*, and sell them in 2018 alongside your brand new lineup (which will also have these specs, naturally), but at a lower price. Now you'll have fleshed out your lineup with far more high-end models, be able to keep reaping the benefits of the manufacturing lines already in place, and by now it's probably all profit! Furthermore, you'll have a leg up on others by offering this, and an influx of people who'll want your product for their gaming interests.

OR

Just take your TVs, drop this firmware on them, remove all the TV post-processing garbage, and sell them as HDMI gaming screens for less, aimed at console gamers and/or budget PC gamers (like me! :D) who don't need DisplayPort!

Win-Win-Win if you ask me :cool:
 
I think it's cute that anyone believes the scum sucking monsters that run the display manufacturing industry will actually add VRR in a firmware patch when they can simply charge you twice for another monitor in 2019. What a fucking joke this crappy industry is. The 2018 monitors are all trash other than the Nvidia big format ones. None of them have a single feature that last year's models didn't have. Laughing stock.
 
I think it's cute that anyone believes the scum sucking monsters that run the display manufacturing industry will actually add VRR in a firmware patch when they can simply charge you twice for another monitor in 2019. What a fucking joke this crappy industry is. The 2018 monitors are all trash other than the Nvidia big format ones. None of them have a single feature that last year's models didn't have. Laughing stock.
2018 is easily the best year for monitors since the 2015 introduction of 1440p 144Hz IPS, because the LG 32GK850G now offers the same formula perfected: without IPS glow or BLB (horrible QC problems), and with 3x the contrast.
 
2018 is easily the best year for monitors since the 2015 introduction of 1440p 144Hz IPS, because the LG 32GK850G now offers the same formula perfected: without IPS glow or BLB (horrible QC problems), and with 3x the contrast.

While I agree that the LG monitor is probably the best gaming monitor as of today, the fact is that this is only the case because the entire market is a wasteland. Really, 1440p isn't enough for 32". It should have been a 4k screen. That's the problem with all this shit. It ALL has some fatal flaw that seriously detracts from it.

I fully expect us to have to wait until 2019 to get 4k, 32", 120hz, variable refresh, acceptable input lag, and image quality that isn't butt. That's flat out depressing. 2019. What a SLOW, shitty industry.
 
While I agree that the LG monitor is probably the best gaming monitor as of today, the fact is that this is only the case because the entire market is a wasteland. Really, 1440p isn't enough for 32". It should have been a 4k screen. That's the problem with all this shit. It ALL has some fatal flaw that seriously detracts from it.

I fully expect us to have to wait until 2019 to get 4k, 32", 120hz, variable refresh, acceptable input lag, and image quality that isn't butt. That's flat out depressing. 2019. What a SLOW, shitty industry.

2019 is being too optimistic; I would say 2020 for that. And I don't even want to imagine how long we will be waiting for a self-emissive microLED display that can deliver true blacks and high peak brightness without any burn-in, all while offering a 120Hz+ refresh rate with VRR.
 
2019 is being too optimistic; I would say 2020 for that. And I don't even want to imagine how long we will be waiting for a self-emissive microLED display that can deliver true blacks and high peak brightness without any burn-in, all while offering a 120Hz+ refresh rate with VRR.

The good thing is that once a technology is rolled out, it's hard for these scumbags to take it away going forward. People aren't happy with 60Hz anymore, they're not happy without variable refresh anymore, etc.

If the first MicroLED monitors don't have variable refresh, 120hz, etc., I won't buy them, and I expect many others won't, either. I don't think they can get away with it. They pretty much have to include those features if they want to sell them.

The only reason they got away with 60Hz screens without variable refresh on OLED is because they're TVs, and the TV market isn't accustomed to having those features. The PC gaming monitor market basically expects those features now. Look at that Dell OLED: dead in the water without those features. No one wants monitors that can't do that shit now.
 
The good thing is that once a technology is rolled out, it's hard for these scumbags to take it away going forward. People aren't happy with 60Hz anymore, they're not happy without variable refresh anymore, etc.

If the first MicroLED monitors don't have variable refresh, 120hz, etc., I won't buy them, and I expect many others won't, either. I don't think they can get away with it. They pretty much have to include those features if they want to sell them.

The only reason they got away with 60Hz screens without variable refresh on OLED is because they're TVs, and the TV market isn't accustomed to having those features. The PC gaming monitor market basically expects those features now. Look at that Dell OLED: dead in the water without those features. No one wants monitors that can't do that shit now.

The main issue has been HDMI bandwidth; there's just not enough for 4K @ 120 Hz. That said, not supporting 120 Hz at 1080p-1440p on many TVs is a crappy move. The average buyer won't care, though, as many just buy based on the TV being 4K and cheap, other specs be damned.

The display industry is an annoyingly slow-moving behemoth, though. I can't believe that a year after announcing the high-refresh-rate 4K desktop monitors, those are still not available.
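
(As a rough back-of-the-envelope check, assuming plain 8-bit RGB with no chroma subsampling and ignoring blanking/encoding overhead, so real link requirements are somewhat higher:)

def gbps(width, height, hz, bits_per_pixel=24):
    # Uncompressed pixel data rate in gigabits per second.
    return width * height * hz * bits_per_pixel / 1e9

print(f"4K @ 60 Hz : {gbps(3840, 2160, 60):.1f} Gbit/s")    # ~11.9 - fits HDMI 2.0's ~14.4 Gbit/s of usable bandwidth
print(f"4K @ 120 Hz: {gbps(3840, 2160, 120):.1f} Gbit/s")   # ~23.9 - needs HDMI 2.1's 48 Gbit/s link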
 
The main issue has been HDMI bandwidth; there's just not enough for 4K @ 120 Hz. That said, not supporting 120 Hz at 1080p-1440p on many TVs is a crappy move. The average buyer won't care, though, as many just buy based on the TV being 4K and cheap, other specs be damned.

The display industry is an annoyingly slow-moving behemoth, though. I can't believe that a year after announcing the high-refresh-rate 4K desktop monitors, those are still not available.

That's not deliberate foot-dragging. Asus, Acer, etc. all announced 4K 120 Hz HDR monitors last spring built around prototype panels from AUO, with a mid-summer ETA based on when AUO said they'd be transitioning to volume manufacture of the panel. Based on the latest slip of the release date, AUO still hasn't been able to solve whatever showstopper has been preventing them from actually starting volume production of the panel. Some part of doing so (I'm guessing the 384-zone HDR) has proven difficult enough industry-wide that none of the other panel makers (e.g. Samsung, LG, etc.) have spoken publicly about such a design in their product lineups.
 