OLED HDMI 2.1 VRR LG 2019 models!!!!

You can use ARC with an appropriate receiver to play audio from smart TV sources or F-connector (antenna/cable) sources through your receiver's speakers instead of the TV's. If your receiver doesn't have enough HDMI ports for all of your equipment, you can plug devices into the TV instead and use eARC without the penalty ARC carried, which was roughly 2-channel Toslink-class bandwidth instead of eARC's ~37-40 Mbps, plus ethernet, in addition to the rest of the HDMI 2.1 specs.
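Back-of-envelope math on that bandwidth gap, for anyone curious (my own sketch; the channel counts, bit depths, and sample rates below are illustrative assumptions, not quotes from the spec):

```python
# Back-of-envelope audio bandwidth math (illustrative assumptions only).

def lpcm_mbps(channels, bits_per_sample, sample_rate_hz):
    """Raw LPCM payload bandwidth in megabits per second."""
    return channels * bits_per_sample * sample_rate_hz / 1_000_000

# Old ARC is roughly Toslink-class: stereo 16-bit/48 kHz LPCM (or lossy 5.1 bitstreams).
arc_class = lpcm_mbps(channels=2, bits_per_sample=16, sample_rate_hz=48_000)

# eARC has headroom for 8-channel 24-bit/192 kHz LPCM, which is where ~37 Mbps comes from.
earc_class = lpcm_mbps(channels=8, bits_per_sample=24, sample_rate_hz=192_000)

print(f"ARC-class payload:  ~{arc_class:.1f} Mbps")   # ~1.5 Mbps
print(f"eARC-class payload: ~{earc_class:.1f} Mbps")  # ~36.9 Mbps
```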

That will actually be very nice. I have a PC connected to my B7 and have to do the stupid phantom-screen workaround and mess with the receiver and TV settings to get surround sound. Being able to get all that over eARC while connecting straight to the TV is so much nicer.
 
Both LG's 2018 4K OLEDs and Samsung's 480-zone FALD "QLED" LCDs had panels capable of 4K 120 Hz in their 2018 sets; the HDMI 2.1 spec just wasn't finalized in time for the fabs to add the inputs, so this shouldn't be too surprising for 2019. It's still great news.

Yes, even the 2018 Q8 and Q9FN Samsung HDR FALD displays could do FreeSync and VRR over HDMI 2.0 via AMD GPUs or an Xbox too. That is "some HDMI 2.1 features on HDMI 2.0b" rather than a full-bandwidth HDMI 2.1 input or output, which I'd expect in both model lines in 2019.

The fact that nvidia released another whole GPU generation a few months early lacking HDMI 2.1 reminds me of when the 780 series lacked HDMI 2.0 and could only do 4K at 30 Hz, painting a bunch of consumers into a corner. A year later they released the 900 series that did 4K at 60 Hz. Currently, HDMI 2.0b and the lack of HDMI 2.1 are also complicated by nvidia profiting off of G-Sync and refusing to support FreeSync and now VRR, given their monopoly on the highest-tier gaming GPUs.

It's OLED so the picture quality blows LCD away.

OLED will never be able to do high HDR color volume as it stands now.

While 480-zone+ FALD displays like the Q9FN won't get "true black" like an OLED, and they err slightly toward dimming rather than blooming in their FALD dynamics, they still hit around a 19,000:1 contrast ratio, which is extremely dark. Most decent non-FALD VA TVs have 5,000:1 to 6,100:1 contrast, VA computer monitors around 3,000:1, and non-FALD IPS and TN computer monitors usually around 860:1 to 980:1.
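To put those ratios in absolute terms, a quick sketch (the white levels I plug in are assumptions for illustration, not measurements of any particular panel):

```python
# What those contrast ratios mean in absolute terms (illustrative white levels).

def black_level_nits(white_nits, contrast_ratio):
    """Static contrast: black level = white level / contrast ratio."""
    return white_nits / contrast_ratio

panels = {
    "FALD VA (Q9FN-class), 19,000:1": (600, 19_000),
    "non-FALD VA TV, ~6,000:1":       (300, 6_000),
    "VA monitor, ~3,000:1":           (300, 3_000),
    "IPS/TN monitor, ~1,000:1":       (300, 1_000),
}
for name, (white, ratio) in panels.items():
    print(f"{name}: ~{black_level_nits(white, ratio):.4f} nit black at {white} nit white")
```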

On the WRGB OLEDs, the introduction of the 'white' sub-pixel distorts the standard RGB color channel relationship, excessively so at HDR brightness levels. If you sum the Y of a 100% patch of the R+G+B primaries you get 400 nits, while a 100% white patch measures 800 nits, so the color gamut is effectively limited to 400 nits. This means WRGB OLEDs can never be calibrated accurately for HDR.

The current Samsung Q9FNs already do ~1,800 nits; more advanced displays, especially mini-LED, should do 4,000 to 10,000 nits. Movies are mastered at up to 10,000 nits, and UHD HDR discs are currently mastered at 10,000, 4,000, and 1,000 nits.

They need to set an HDR color volume brightness standard so OLED's trick of adding an extra white sub-pixel to every RGB pixel won't score as high. The idea of HDR color volume is raising the whole range of colors much, much higher than SDR without crushing to white until that ceiling.


" 384 - 480 zone FALD and mini LED FALD will be better because it will go over 2000 nit in HDR highlights and color volume, perhaps even 10,000 nit eventually like the article I quoted.

https://www.forbes.com/sites/johnar...ch-tv-is-the-best-ive-ever-seen/#7bd8c7312705


contrary to what I know many people reading this might expect, even the brightest parts of the image on this claimed 10,000 nits screen didn’t make me feel uncomfortable. They didn’t hurt my eyes. They didn’t make me worry for the safety of children, or send me rushing off to smother myself in suntan lotion. Even though the area of Sony’s stand where the screen was appearing was blacked out

These shots showed, too, that it’s only when you’ve got this sort of real-life brightness at your disposal that colors truly start to look representative of reality rather than a construct limited by the screen technology of the day.


OLED also obviously still has burn in risk on displays that cost thousands of dollars and are not covered for it by the manufacturer.

OLED will be rolling down from its already low color-volume brightness ceiling and/or clipping its color volume to white at that ceiling as a safety feature against burn-in. The brightness it is capable of before that point also comes from impure color, using a white subpixel (meaning lots of added white subpixels across the screen) to cheat higher brightness measurements, which throws color accuracy way off and lowers the color volume, so it can't even be calibrated in HDR.


-------https://www.resetera.com/threads/hdr-games-analysed.23587/

-----Incomplete HDR UHD movie list including Mastered peak luminances per movie:

https://tinyurl.com/yardqzu5

-------

https://www.techhive.com/article/3239350/smart-tv/will-hdr-kill-your-oled-tv.html

"manufactured by LG) don’t actually use true blue, red, and green OLED subpixels the way OLED smartphone displays do. Instead, they use white (usually, a combination of blue and yellow) subpixels with red, green, and blue filters on top. This has many advantages when it comes to large panel manufacturing and wear leveling, but filters also block light and reduce brightness.

To compensate, a fourth, white sub-pixel is added to every pixel to increase brightness. But when you add white to any color, it gets lighter as well as brighter, which de-emphasizes the desired color. "
 
Last edited:
Tech's just a moving target - continually chasing it, you'll never win. :) Glad that they are rolling out more improvements and adopting HDMI 2.1. I'm really very happy with my 65" C8 OLED purchase though - picked it up around Black Friday for $2.5K from Costco with a 5 year warranty. They go for closer to $3K now pretty much everywhere - supply and demand I guess.

OLED just completely blows LCD displays out of the water. Samsung's top tier quantum dot LCD's are a close second, but they still just can't match the blacks that OLED delivers.

So, even if this new 2019 LG line-up is going to be capable of beginning to support 120hz 4k, I bet they'll need some time to get it dialed in and all the kinks worked out - not much out there can deliver that kind of signal yet. We'll also need to wait at least another gen of PC video cards to be able to run 4K at that high a refresh and still be able to maintain 60+ fps in-game with all the eye candy cranked to 11. While a 2080Ti can pretty much deliver 4K @ 60hz with good fps (~60+), there is no way it can deliver 4K 60+ fps @ 120hz with the eye candy maxed. Will be glorious once the GPU to deliver that kind of performance arrives though! Hopefully they put more effort into OLED PC monitor tech as well - just not seeing a lot of OLED PC monitors... burn in still seems to be a pretty big issue to contend with on that front.
 
OLED blacks are definitely phenomenal and they do SDR well (with burn-in risk), but a 400-nit HDR color volume faked to 800 nits via white subpixels, rolling off even those peak brightnesses with safety features against burn-in, is going to be left in the dust by real HDR color volumes unless there are some major OLED tech improvements.

avsforum post reference:
On the WRGB OLEDs, the introduction of the 'white' sub-pixel distorts the standard RGB color channel relationship, excessively so at HDR brightness levels. If you sum the Y of a 100% patch of the R+G+B primaries you get 400 nits, while a 100% white patch measures 800 nits, so the color gamut is effectively limited to 400 nits. This means WRGB OLEDs can never be calibrated accurately for HDR.

A 400-nit color volume (even if the white subpixels can blow out white to 800) is very pseudo-HDR, really still in SDR color volume range, like many other low-nit TVs stamped with an HDR capability badge.

----------------------------------------------
HDR movies and many HDR-capable games are mastered for up to a 10,000-nit peak color volume; most UHD 4K discs are currently mastered at 4,000 or 1,000 nits (with a few exceptions at 10k like Blade Runner 2049), and a few games like Tomb Raider cap their peak at 4,000 nits.

A 400-nit-color OLED ("800 nit brightness" via the white subpixel) is showing 1/25th of a full 10k HDR movie's or game's color volume, which is a tiny fraction.
A 2,000-nit LED FALD LCD shows 1/5th.

Comparing 4000nit movies (4000 being 2/5 of Full HDR's vision to start with):
OLED is 1/10th of the HDR 4000 color volume, 2000nit LED FALD LCD is 1/2.

Comparing 1000nit movies (1000 being 1/10th of Full HDR's vision to start with):
OLED is less than half (2/5) of the HDR color volume; a 2000-nit FALD LCD surpasses it.

That means that a high end LED FALD LCD has five times greater ("taller") true HDR color volume than an OLED already.
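For anyone who wants to check the arithmetic, it is just linear peak-nit division (a perceptual curve like PQ would weight this differently, but this is the calculation the comparison above is doing):

```python
# Checking the ratios above: straight linear peak-nit division.

display_peaks = {"WRGB OLED (color volume)": 400, "2000-nit FALD LCD": 2000}
master_peaks = [10_000, 4_000, 1_000]

for name, display_nits in display_peaks.items():
    for master_nits in master_peaks:
        fraction = min(display_nits / master_nits, 1.0)  # capped once the display exceeds the master
        print(f"{name} vs {master_nits:>6}-nit master: {fraction:.2f} "
              f"({display_nits}/{master_nits})")
```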

LED FALD LCD and mini LED FALD LCD are only going to increase their HDR color volume more toward the full HDR vision as time goes on too.
... OLED will never be able to do high color volume HDR in its current tech form
(which is already relying on white subpixel tricks and brightness rolldown safety features).

----------------------------

https://www.resetera.com/threads/hdr-games-analysed.23587/

"Star Wars : Battlefront 2 .. all of the HDR compatible Frostbite games I looked at (Battlefield 1,Mass Effect) all use the same setup.
Metadata output will be 100000 and the actual tone mapping is performed by the game via the HDR slider, with 0 nits being the very most left value and 10k nits being the most right value."

Tomb Raider has been capped at 4000nits
AC:Origins is another game that really does HDR well, like Tomb Raider is also capped at 4000nits

Forza: Horizon 3: "You can see the sky reaching 4000 or so nits, the tires and grille being suitably unlit and dark, and then the headlights pumping out a full 10k nit output."

Gears of War 4: 10k nit color volume

Insects: 10k nit color volume

Shadow of Mordor: SoM offers a super simplified approach, the game is always outputting 10k nit maximum and the tone mapping is left entirely to the TV.

Agents of Mayhem: Similar approach here, except the game is capped at 1000nits and has obviously been graded/toned with this in mind,

Final Fantasy XV: 1000 nit fixed max output and a simple brightness slider to drop black levels.

.. HDR suffers design bugs in these two
Monster Hunter World: Much like Deus Ex, Monster Hunter World appears to operate within 4000nits, however much like DEUS EX, when HDR is enabled, the game appears to have severe black level problems.
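Reading the Frostbite/slider description above, the game-side tone mapping amounts to "light the scene in absolute nits, then compress toward whatever peak the user's slider picks." A tiny illustrative sketch of that idea (not engine code; the Reinhard-style curve is just my stand-in for whatever curve each game actually uses):

```python
# Illustrative game-side HDR tone mapping: scene values in absolute nits get
# compressed toward a user-selected "peak nits" slider value.

def tonemap_to_peak(scene_nits, slider_peak_nits):
    """Simple Reinhard-style roll-off that asymptotically approaches the slider peak."""
    x = scene_nits / slider_peak_nits
    return slider_peak_nits * x / (1.0 + x)

for scene in (100, 1_000, 4_000, 10_000):        # scene-referred nits
    row = [f"slider {peak:>6}: {tonemap_to_peak(scene, peak):7.1f} nit"
           for peak in (1_000, 4_000, 10_000)]   # HDR slider positions
    print(f"scene {scene:>6} nit -> " + " | ".join(row))
```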
 
Last edited:
OLED blacks are definitely phenomenal and they do SDR well (with burn-in risk), but a 400-nit HDR color volume faked to 800 nits via white subpixels, rolling off even those peak brightnesses with safety features against burn-in, is going to be left in the dust by real HDR color volumes unless there are some major OLED tech improvements.

avsforum post reference:


A 400-nit color volume (even if the white subpixels can blow out white to 800) is very pseudo-HDR, really still in SDR color volume range, like many other low-nit TVs stamped with an HDR capability badge.

----------------------------------------------
HDR movies and many HDR-capable games are mastered for up to a 10,000-nit peak color volume; most UHD 4K discs are currently mastered at 4,000 or 1,000 nits (with a few exceptions at 10k like Blade Runner 2049), and a few games like Tomb Raider cap their peak at 4,000 nits.

A 400-nit-color OLED ("800 nit brightness" via the white subpixel) is showing 1/25th of a full 10k HDR movie's or game's color volume, which is a tiny fraction.
A 2,000-nit LED FALD LCD shows 1/5th.

Comparing 4000nit movies (4000 being 2/5 of Full HDR's vision to start with):
OLED is 1/10th of the HDR 4000 color volume, 2000nit LED FALD LCD is 1/2.

Comparing 1000nit movies (1000 being 1/10th of Full HDR's vision to start with):
OLED is less than half (2/5) of the HDR color volume; a 2000-nit FALD LCD surpasses it.

That means that a high end LED FALD LCD has five times greater ("taller") true HDR color volume than an OLED already.

LED FALD LCD and mini LED FALD LCD are only going to increase their HDR color volume more toward the full HDR vision as time goes on too.
... OLED will never be able to do high color volume HDR in its current tech form
(which is already relying on white subpixel tricks and brightness rolldown safety features).

----------------------------

https://www.resetera.com/threads/hdr-games-analysed.23587/









Samsung exaggerates and oversaturates colors at high brightness. See this video for a measured comparison with OLED.

Yeah, FALD LCDs are pretty good during the daytime or with the lights on, but OLED destroys LCDs in a dark room.
 
SDR and HDR color volume are very different things. OLED is ~400-nit color, which is essentially SDR color volume. OLEDs can't do HDR color volumes because of burn-in risk; they have to resort to white subpixels on every pixel just to fake whites up to 800 nits, washing out colors, and they roll off even those peaks as a burn-in safety feature, instead of doing a real HDR1000 full color volume (and the ~2,000 nits Samsungs get out of HDR 4000-10,000 color volume capable content, with full 4,000 and 10,000 nit displays eventually, so the color volume gap is only going to get larger versus OLED's limitations).
 
Last edited:
SDR and HDR color volume are very different things. OLED is ~400-nit color, which is essentially SDR color volume. OLEDs can't do HDR color volumes because of burn-in risk; they have to resort to white subpixels on every pixel just to fake whites up to 800 nits, washing out colors, and they roll off even those peaks as a burn-in safety feature, instead of doing a real HDR1000 full color volume (and 4,000, 10,000 eventually).

Being the owner of a FALD LCD myself (Acer X27), yes, I guess the LCD could beat out the OLED when viewing very bright and colorful HDR scenes, especially in a well-lit room. But when it comes to sRGB/SDR content, which is 99% of what you'll be viewing on a PC at this point in time, OLED just flat out wins, because those extra nits of brightness and color volume are a moot point for SDR. By the time HDR content becomes mainstream enough we might even have MicroLED displays to migrate to anyway, which are also superior to FALD LCD.
 
People said similar things about 3D pass-through GPUs when only a few games supported Glide and OpenGL, about 16:9 vs 4:3, 1080p "HD" support, 120 Hz vs 60 Hz, 1440p 144 Hz being unnecessary/too demanding (at first), 4K only being capable of 30 Hz (then a 60 Hz limit, soon 120 Hz but with frame rates limited by GPU power), etc. How many games supported each right away? How many movies and streams supported 4K resolution right away? Yet people eager to experience the upgrades adopted some or all of them earlier than others.

HDR is considered by many to be a more important and striking advancement than 4k resolution.
OLED will be left behind by HDR color volume more and more unless there are some big advancements in the tech.

There are already a lot of UHD/4k HDR movies that are HDR 1000, 4000, 10000. Several games are 4000 and 10000 already too. 400nit color volume isn't going to cut it for real HDR enthusiasts. Also, spending thousand(s) on a considerable burn in risk for static content like a pc, obviously not covered by the manufacturer, is not for me either.
 
Last edited:
Maybe not on a monitor in a bright room - but for the most part, I prefer a darkened room to play games and/or watch TV. While I can respect your wanting max nits as to performance when it comes to the HDR color volume, I don't necessarily want or need them, especially in the environment I typically engage viewing in. I'd actually be annoyed at the eye searing levels you are complaining that OLED doesn't provide - my C8 OLED is plenty bright enough in my man cave - and caves are dark... :D
 
Last edited:
If you viewed HDR content on a 1000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.
For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower brightness display (everything above 500 nits turns white) which would not be present on the higher brightness display. <---- Which will show full color gradations across the spectrum in bright highlights instead of crushing to white after hitting the sdr ceiling.
HDR enables far more vivid, saturated, and realistic images than SDR ever could.
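A minimal sketch of what that quoted explanation means in practice, using the PQ (SMPTE ST 2084) transfer function that HDR10 content is encoded with (the hard clip at the display peak is a simplification; real TVs usually tone-map or roll off instead):

```python
# PQ code values map to absolute nits; anything above the panel's peak clips.

def pq_eotf_nits(code01):
    """ST 2084 PQ EOTF: normalized code value (0..1) -> absolute luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = code01 ** (1 / m2)
    return 10_000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def shown(nits, display_peak):
    """Naive display behavior: everything above the panel's peak clips."""
    return min(nits, display_peak)

for code in (0.50, 0.60, 0.75, 0.90):
    mastered = pq_eotf_nits(code)
    print(f"PQ code {code:.2f} -> {mastered:7.1f} nit mastered | "
          f"500-nit panel: {shown(mastered, 500):6.1f} | "
          f"1000-nit panel: {shown(mastered, 1000):6.1f}")
```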

Burn-in is the deal breaker for me, but it's more than burn-in. OLED colors can't hit higher HDR color volume. They hit a color luminance/brightness cap much earlier, where they clip the colors to white or roll off away from the riskier burn-in levels. They also, even when saturated within their allowed levels, can't be calibrated in HDR. This untrue color is due to their safety margin against burn-in, clipping to white early or rolling back from even the OLED peak color luminances, and because they "cheat" higher brightness levels than the first gens of OLED by adding a white subpixel.

I know OLEDs look good. I love black levels and have written off non-FALD IPS and TN panels because of them. However, I also love HDR color volumes and where HDR is going, and OLED, while darker, is figuratively stuck in the mud on that.

I realize this is just one reviewer's impression but from everything I've seen including Q9FN in person at near 2000nit peak HDR color volume I'd tend to believe him:

-----Quote------ https://www.forbes.com/sites/johnar...ch-tv-is-the-best-ive-ever-seen/#7bd8c7312705

contrary to what I know many people reading this might expect, even the brightest parts of the image on this claimed 10,000 nits screen didn’t make me feel uncomfortable. They didn’t hurt my eyes. They didn’t make me worry for the safety of children, or send me rushing off to smother myself in suntan lotion. Even though the area of Sony’s stand where the screen was appearing was blacked out (apart from the light being cast from both the 10,000-nit screen and five other more ‘normal’ screens sharing the space).

What the boldness of the highlights DID do, on the other hand, was make the pictures look more stunningly realistic and dynamic. More like real life. More direct. More intense.

Some of the demo content Sony is showing has extreme brightest peaks appearing within the context of generally much higher light levels than we’ve seen on an LCD TV before, so that they don’t stand out too brazenly. But actually, as noted earlier, even when the brightest images elements are ‘blaring’ out of a dark setting, they still just look compellingly realistic rather than something you should only watch with sunglasses on.

------
this underlines the importance of brightness to delivering truly realistic images. Shots of villagers embroidering on a bright sunny day looked incredibly life-like with 10,000 nits (or so) of light output to define them.

These shots showed, too, that it’s only when you’ve got this sort of real-life brightness at your disposal that colors truly start to look representative of reality rather than a construct limited by the screen technology of the day.
------------------
 
Well sure, if you are absolutely all about the HDR experience then I guess OLED might not be to your taste. But right now I own zero 4K HDR Blu-rays, never watch Netflix/Amazon/YouTube HDR, and don't play many AAA games, which are usually the ones that have HDR support (the only HDR game I played in all of 2018 was God of War). I am still mostly viewing/playing SDR content, and by the time HDR content does become my thing, so might MicroLED. I just see no reason to buy another FALD LCD screen, but that is just me personally.
 
Well sure, if you are absolutely all about the HDR experience then I guess OLED might not be to your taste. But right now I own zero 4K HDR Blu-rays, never watch Netflix/Amazon/YouTube HDR, and don't play many AAA games, which are usually the ones that have HDR support (the only HDR game I played in all of 2018 was God of War). I am still mostly viewing/playing SDR content, and by the time HDR content does become my thing, so might MicroLED. I just see no reason to buy another FALD LCD screen, but that is just me personally.

I disagree with that first part. Contrast is still more important than peak brightness for HDR. When one pixel is supposed to be black and the next pixel over is supposed to be 10,000 nits of white, it looks better on an OLED.
 
That's the thing though. HDR isn't supposed to be whites; it's supposed to be full colors throughout the brightness range instead of clipping to white at low, SDR-ish color brightness ceilings. We have seen screens clip bright peaks within narrow SDR limits to white for so long that we associate bright with white on screens. Reality isn't like that, and high HDR color volume has much more realism, with full-color highlights and bright colored light sources.

OLED can only do true color to ~400 nits and fakes 800 nits via white subpixels, while a Q9FN can hit a true HDR color volume surpassing HDR1000, through nearly 2,000 nits. Future screens will get closer to the full vision of HDR color volume at 4,000 and 10,000 nits too, so OLED is really pseudo-HDR color, if that. SDR is another matter, of course. FALD has trade-offs; the 480-zone Q9FN still favors a dim offset over bloom in its FALD implementation, but the array has to favor one or the other at that zone density. I'm not saying either tech is perfect; I'm saying OLED is never going to get HDR color volume (as opposed to white-subpixel brightness) with its current tech: ~400-nit color blasted with white subpixels and subject to brightness roll-down burn-in safety features from already low peak color levels, compared to an HDR1000-color-capable display, or an ~2,000-nit HDR color volume display on HDR4000 and HDR10000 color capable sources, and eventually displays that do 4,000 and even 10,000.
 
Samsung exaggerates and oversaturates colors at high brightness. See this video for a measured comparison with OLED.

Yeah, FALD LCDs are pretty good during the daytime or with the lights on, but OLED destroys LCDs in a dark room.

What really gets me in this video is the space scene at 4:20. It's just horrible on the FALD LCD.

50% of stars are gone, and when the bright object moves on the scene, the stars suddenly start glaring around it because of the FALD halo engaging.
 
Last edited:
I see the anti-OLED / burn in brigade is back, yet again.

Don't you people ever get tired of repeating the same old arguments in every thread? We get it, you don't like OLED. You think it's crap because it can't go beyond some pie-in-the-sky nits, and it can't quite match the color volume of QD (yet).
 
The fact that nvidia released another whole GPU generation a few months early lacking HDMI 2.1 reminds me of when the 780 series lacked HDMI 2.0 and could only do 4K at 30 Hz, painting a bunch of consumers into a corner. A year later they released the 900 series that did 4K at 60 Hz. Currently, HDMI 2.0b and the lack of HDMI 2.1 are also complicated by nvidia profiting off of G-Sync and refusing to support FreeSync and now VRR, given their monopoly on the highest-tier gaming GPUs.

It will be interesting to see how Nvidia reacts to this in a few years as HDMI 2.1 becomes more established, there is most likely support for VRR in the next gen PlayStation too. I don't think they can afford to just avoid supporting these things on their GPUs forever. While I feel G-Sync is the better tech, being vendor locked for variable refresh rate support is just bullshit.
 
You think it's crap because it can't go beyond some pie-in-the-sky nits, and it can't quite match the color volume of QD (yet).
The color volume is the nits... at least in displays that aren't using white subpixel cheats.

The colors continue to be shown through the 600 nits above that, up to HDR1000, instead of being clipped to white after hitting the 350-400 nit low ceiling of non-HDR and pseudo-HDR screens.

On HDR 4000 and HDR 10,000 content with an 1800-2000 nit HDR display, the color volume rises to that 1800-2000 nit limit, which is nearly 5 times the color of 400-nit SDR and of OLED's non-white-subpixel-washed, "screen-saving safety function vs burn-in" roll-down range. That color gap is only going to get bigger as more colors are shown toward HDR 4000 and the true HDR 10,000 that movies are mastered at and some games are already capable of.

--------

OLED, other than the static-usage burn-in risk not covered by the manufacturer on a $1k+ display, is great for SDR and it looks amazing. Per-pixel emissive by design seems like the best way to go. However, the organic nature of the display tech not only carries some risk but requires lower intrinsic brightness limits, white subpixels to cheat the numbers a little higher while HDR color is already going well beyond OLED's penned-in color and brightness levels, and brightness roll-downs that kick in so your organics don't die... those make it a poor HDR color tech unless they pull off some huge advancements in the future somehow.
-------

ever get tired of repeating the same old arguments
I am not only arguing points, I'm relaying information that not everyone sees. Not everyone reads every thread, and people can fall prey to marketing and thread titles (myself included). I also get more information from discussions about the specs of different monitors, TVs, and technologies in general, which helps give me a more accurate picture (no pun intended). In my mind OLED had a lot going for it until I learned a lot more about what HDR is and how it works, and became more aware of OLED's shortfalls/trade-offs going forward, especially in regard to high color, which goes hand in hand with its burn-in risk limitations. The information flows both ways and I really appreciate all of the countering replies and everyone who adds to these discussions.

I also think that confirmation of HDMI 2.1 coming out on high end displays in 2019 is great news in general.
 
Last edited:
I also think that confirmation of HDMI 2.1 coming out on high end displays in 2019 is great news in general.

Then stick to the on-topic discussion rather than coming into seemingly every single thread and steering the discussion towards how OLED peak nits and color volume is inferior. Request a sticky thread or something if you feel so passionately about making long-winded armchair criticisms of technology to inform the general public.

I'm sorry, but it's seriously aggravating, not only because you've done it here (yet again), but because you've done it in my own threads where you (again) went off topic.
 
It's long-winded because it's complicated. I wish I had known more details about each tech earlier. In these threads people often say things that are wrong about, or just misunderstand, HDR (and OLED and high-end FALD). Through these discussions and counterpoints, the linked articles and arguments everyone posts (I found a lot of good info in responses to my replies), and actually getting to see the displays in person, sometime in 2018 I actually changed my purchasing outlook away from buying a 2019 HDMI 2.1 OLED. I know at least a few other people in these threads got some eye-opening info out of this, or at least a better understanding of the technologies and their limitations, especially in regard to HDR color going forward.
 
Regardless, this just tells you that 2019 is going to see 4k high framerate displays with variable refresh that are both LCD and OLED. It's not like LG is going to be the only one. Samsung and others are going to have to respond to this.

elvn's right, though. OLED is basically the contemporary plasma. It's not going to win in the long run. It's inherently a deeply flawed technology. MicroLED will completely replace it as soon as it's on the market. It's basically better in every way. The burn in and brightness issues are real.

Personally, though, I don't even care about HDR.
 
While I agree with the points being made, I can't help but think some "cut off your nose to spite your face" syndrome is at play here... folks scrambling, red-faced and out of breath, to fall on their swords just to point out how OLED tech is so deeply flawed.

Right now, OLED is amazing at what it does and delivers a pretty incredible viewing experience compared to EVERYTHING else out there. Sure, tech evolves... better thing will come. But it’s not like enjoying OLED right now and reveling in its inky blacks and amazing performance is a bad thing. Sure in a couple years something much better will roll out, but that’s just life in the tech world. Hell, stop vomiting just long enough to stop and enjoy the view. :D
 
elvn - your contributions over the years for displays and tv are appreciated. Don't stop on account of people being offended.

:)
 
I wonder if they'll support refresh rates over 120hz. 1080@480hz would be amazing and oled has the response time to handle it.
 
Hell yeah! Dude since your last post was doubting the spec I thought you were going to post something to confirm your doubts and give me the biggest case of blueballs ever.

Ya, it really seems that LG kept this under wraps and actually pulled off the real deal. Looks like passive cables at 48 Gbps are going to be pretty short (3-6 feet), so plan on having your computer pretty close!
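Back-of-envelope on why 48 Gbps is the magic number (blanking figures assume the common 4400 x 2250 total timing for 4K; treat these as ballpark payload numbers before FRL coding overhead):

```python
# Rough uncompressed video payload math for HDMI 2.1's 48 Gbps ceiling.

def video_payload_gbps(h_total, v_total, refresh_hz, bits_per_component, components=3):
    """Uncompressed RGB/4:4:4 payload bandwidth in Gbps."""
    return h_total * v_total * refresh_hz * bits_per_component * components / 1e9

gbps_4k120_10bit = video_payload_gbps(4400, 2250, 120, 10)  # ~35.6 Gbps
gbps_4k60_8bit = video_payload_gbps(4400, 2250, 60, 8)      # ~14.3 Gbps (HDMI 2.0 territory)

print(f"4K120 10-bit 4:4:4: ~{gbps_4k120_10bit:.1f} Gbps payload")
print(f"4K60   8-bit 4:4:4: ~{gbps_4k60_8bit:.1f} Gbps payload")
```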

So does that mean you're all Radeon from here on out?

No; even in 2019 I don't think AMD will have anything fast enough to do 4K/120 justice. They are just so far behind. I will most likely buy 7nm NVIDIA and run the display locked at 4K/120 Hz at 25% BFI. Since that is very demanding, you should still have the option for 2560x1440 at 120 Hz.

I suspect AMD Navi could be a popular option for those that don't need to run 4K much faster than 60 FPS (while running VRR) and also have the option for more performance using 1440p VRR.

It could be a moot point though if NVIDIA 7nm isn't going to arrive until 2020 as this article would indicate:

https://www.pcgamesn.com/nvidia/nvidia-7nm-euv-graphics-card-samsung

But could little old Turing really take NVIDIA all the way to 2020?
 
Last edited:
You could crossfire amd. I'm not saying it's for everyone but..


http://amdcrossfire.wikia.com/wiki/Crossfire_Game_Compatibility_List

Still works on a lot of games and Unreal Engine supports it if the devs do the work.

From the reddit post tests:

  • WORKING EASY: 60% of games // 60% of apps tested

  • WORKING EASY or with minimal config: 71% of games // 75% of apps tested

  • UNSUPPORTED: 29% of games // 25% of apps tested

  • SCALING: Varies wildly from game to game, but usually averages in the 50% range.

  • ADDED DATA: Frame time analysis - fc5 doesn't look so great here, but all else looks great.
"Excepting things that are set to a specific resolution, the below were set to 4k res, highest detail levels.
Vsync always off, and I disabled AA. Some exceptions out there, which I noted. Also note that the actual display is 1080p, so this is scaling via VSR on driver 18.5.1.

[image: CrossFire 4K benchmark results from the reddit post]


It also depends how demanding your games are: you need roughly a 100 fps average or better at 4K to get appreciable gains out of 120 Hz.
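Rough sketch of what ~50% average scaling means for that ~100 fps target (illustrative numbers, not benchmarks):

```python
# What ~50% average CrossFire scaling means for a ~100 fps target at 4K.

def crossfire_fps(single_gpu_fps, scaling=0.50):
    """Second GPU contributes `scaling` x one GPU's worth of frames on average."""
    return single_gpu_fps * (1 + scaling)

for single in (50, 60, 70, 80):
    total = crossfire_fps(single)
    verdict = "worth it for 120 Hz" if total >= 100 else "still short of ~100 fps"
    print(f"{single} fps on one GPU -> ~{total:.0f} fps in CrossFire ({verdict})")
```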

There is also this that I haven't messed with yet..



Lossless Scaling scales any game up to your native display resolution correctly without blur.

When you run the game at a resolution less than the native display resolution, the graphics driver uses bilinear or bicubic interpolation of the image resulting in blurring and loss of quality. For example, when you play old games on a 1080p display or modern games at a 1080p resolution on a 4k display, the result will be a blurred image.

Lossless Scaling solves this problem using integer scaling. In the case of scaling 1080p images to 4K, it simply doubles the pixels horizontally and vertically so the output image maintains its original clarity and integrity. Additionally, you can enable fullscreen anti-aliasing in the app even if the game has no such option. Unlike anti-aliasing in the game itself, the app smooths the already upscaled image.

App requirements are very simple. The game should be able to run in windowed mode and your OS must be Windows 8 or higher. Detailed instructions are in the app.
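For anyone wondering what "integer scaling" actually does, here's a tiny pure-Python illustration of the pixel-doubling idea (not the app's code, just the concept):

```python
# Integer ("pixel doubling") scaling: each source pixel becomes a factor x factor block.

def integer_upscale(image, factor=2):
    """Repeat every pixel `factor` times horizontally and every row `factor` times vertically."""
    scaled = []
    for row in image:
        wide_row = [px for px in row for _ in range(factor)]
        scaled.extend([list(wide_row) for _ in range(factor)])
    return scaled

tiny = [
    [0, 255],
    [255, 0],
]
for row in integer_upscale(tiny, factor=2):
    print(row)
# Bilinear/bicubic upscaling would instead blend 0 and 255 into in-between
# values, which is the blur being described above.
```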
 
No; even in 2019 I don't think AMD will have anything fast enough to do 4K/120 justice

There's no GPU that can do 4k120 on any modern game.

That's why you go AMD, use resolution scaling on a per-game basis (or maybe dynamic resolution if available) and enable Freesync.

If you're going all-out on your display, you don't want frame rate-refresh rate mismatches.
 
Saying it again:

HDMI VRR is not Freesync.

I'm pretty sure Navi is going to support HDMI VRR. It is literally the only(?) reason to go with an AMD GPU over an NVIDIA GPU besides cost.

There's no GPU that can do 4k120 on any modern game.

That's why you go AMD, use resolution scaling on a per-game basis (or maybe dynamic resolution if available) and enable Freesync.

If you're going all-out on your display, you don't want frame rate-refresh rate mismatches.

Well, when NVIDIA GPUs are almost literally twice as fast as AMD's this generation, obviously NVIDIA is way closer.
 
I think they'll keep the price the same as last year's models despite the upgrades. I'm glad I waited and didn't buy a 2018 model on black Friday.
Correct me if I am wrong, but you can use a DisplayPort to HDMI cable. Someone at AVSForum a while back brought this up and shared a link at Amazon or some place where you can buy a DisplayPort to HDMI cable and get the benefits of the higher frame rates, etc. that come with HDMI 2.1.

Here is an example of current DP to HDMI converter https://www.amazon.com/CAC-1080-DisplayPort-Adapter-displays-4096x2160/dp/B077JB28KM

So using DisplayPort from the GPU to the HDTV with one of these can get you higher res at 60 fps, which is good for those that want 4K at 60 fps with a good GPU. I suppose converting will only get more popular, with more products available, as HDMI 2.1 HDTVs and receivers become available, which will be this year.
 
Last edited:
Correct me if I am wrong, but you can use a DisplayPort to HDMI cable. Someone at AVSForum a while back brought this up and shared a link at Amazon or some place where you can buy a DisplayPort to HDMI cable and get the benefits of the higher frame rates, etc. that come with HDMI 2.1.

Here is an example of current DP to HDMI converter https://www.amazon.com/CAC-1080-DisplayPort-Adapter-displays-4096x2160/dp/B077JB28KM

So using DisplayPort from the GPU to the HDTV with one of these can get you higher res at 60 fps, which is good for those that want 4K at 60 fps with a good GPU. I suppose converting will only get more popular, with more products available, as HDMI 2.1 HDTVs and receivers become available, which will be this year.

I have the CAC-1080 DP to HDMI cable and it doesn't support HDR to my Samsung Q9 TV (from a 1080 Ti), and when connected to my AVR instead, it reports a maximum (and only) audio format of 16-bit/48 kHz, although it still allows bitstreaming of DTS-HD, TrueHD, Atmos, etc. But I cannot play any high-bitrate multichannel PCM; all my multichannel music must be downsampled or it errors.
A direct HDMI connection to the AVR allows up to 24-bit/192 kHz, but my AVR won't pass 1440p 120 Hz, so I need to use the HDMI output directly to the TV.
It pissed me right off, because I need it to support either HDR or full-bitrate audio and it won't do either.
 
Last edited:
SDR and HDR color volume are very different things. OLED is ~400-nit color, which is essentially SDR color volume. OLEDs can't do HDR color volumes because of burn-in risk; they have to resort to white subpixels on every pixel just to fake whites up to 800 nits, washing out colors, and they roll off even those peaks as a burn-in safety feature, instead of doing a real HDR1000 full color volume (and the ~2,000 nits Samsungs get out of HDR 4000-10,000 color volume capable content, with full 4,000 and 10,000 nit displays eventually, so the color volume gap is only going to get larger versus OLED's limitations).
I prefer to buy a display that's right for the content I watch/play regularly rather than a better display that I then have to go searching for content for.
Sure, Gears of War has HDR, and so do Mass Effect: Andromeda and AC: Origins, but none of those is actually a good game, nor anything I'm particularly interested in.
Besides, this being a computer display, watching stuff at night and seeing the black parts/background disappear is definitely of higher value to me than artificially high HDR levels that would cause eye discomfort. 10,000 nits, man, are you insane? Even if it's a small area of the screen, that's like looking directly at a lightbulb.
 
Last edited:
Dunno, my 55" Samsung NU8000 does 120hz ( native ) @ 1440p .... very very happy with my Samsung ... great picture, excellent performance ... doubt I will need anything for 2 or 3 years.

Any 2080 ti can actually push 1440p at ultra / high settings ... here .. today ..... at a very affordable price. $750

There is literally nothing on the market now, or for another 2, 3, or 4 years to come, that can push 120 Hz native at 4K.

Now, if the 2019 Samsungs can do VRR and latency improves beyond 9 ms (which is still better than most gaming monitors), I might be interested. And no, pixel response time has nothing to do with anything these days. When is the last time anyone saw screen smearing? Like, forever. You guys need to stop .... S T O P ... throwing "response time" around .... port latency is the important number here.

Sounds cool but ... "Nah brah, I'm good ... I'm good." lol
 
Correct me if I am wrong, but you can use a DisplayPort to HDMI cable.

You are wrong. To turn DP 1.4 into HDMI 2.1 requires an active adapter with an HDMI 2.1 converter chip that doesn't even exist yet. Probably won't exist for years due to high cost.

And no, pixel response time has nothing to do with anything these days. When is the last time anyone saw screen smearing? Like, forever.

I must have missed the memo that LCD pixels have become instant and motion blur from slow LCD pixel transitions disappeared overnight. Darn it! And that TV uses PWM even at 100% brightness; Samsung is still up to their old cheap tricks.
 
The most recent firmware on the Q9FN reduced input lag in game mode at 1440p 120 Hz to ~10 ms. I think that is even with interpolation on for consoles (or for any low frame rate source). I'd assume that 4K 120 Hz over HDMI on a 2019 model would be similar. The LG C8 OLED, from what I can find, is 21 ms and has no 1440p mode, which is still pretty low for a TV. I'm not sure about its interpolation capability, but I've heard it stutters a bit on 30 Hz games and other low frame rate content. Who knows what the 2019 models will achieve, but the HDMI 2.1 capabilities should be great, including QFT (Quick Frame Transport), which could improve input lag on all HDMI 2.1 displays and will likely open up a 1440p high-Hz mode along with 4K 120 Hz across the board.

Sample and hold blur is different from response time blur.
https://www.blurbusters.com/faq/oled-motion-blur/

https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/

Sample-and-hold blur aside, a modern gaming VA monitor, fps vs response time wise, is tight to about 120 fps before it would start black ghosting/trailing/smearing due to outpacing the response time. If a TV can achieve similar response times or better, a 120 fps/Hz ceiling would be fine, even preferred. However, the higher-end TVs also have up to 480 Hz flicker (a good thing for motion) and optional BFI (black frame insertion), as well as interpolation available in game mode at low input lag.
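Quick persistence math behind that (the Blur Busters idea linked above), assuming instant pixel response so only sample-and-hold blur is counted; the scroll speed is an arbitrary example:

```python
# Eye-tracked motion blur is roughly scroll speed x visible frame time.

def blur_px(scroll_px_per_s, refresh_hz, persistence=1.0):
    """Approximate eye-tracked motion blur in pixels.

    persistence = fraction of the frame the image is actually lit
    (1.0 = plain sample-and-hold, 0.25 = aggressive BFI/strobing).
    """
    return scroll_px_per_s * (1.0 / refresh_hz) * persistence

speed = 960  # pixels per second of panning motion
for hz, duty in ((60, 1.0), (120, 1.0), (120, 0.25)):
    print(f"{hz} Hz at {int(duty * 100)}% persistence: ~{blur_px(speed, hz, duty):.0f} px of blur")
```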
 
Last edited: