HDR? Why is everyone going completely bonkers for this?

RogueTadhg
I've watched a TV at the local blue store and I wasn't impressed. Is there some way I can really see the difference between HDR and SDR?
 
What was the TV? Because not all HDR implementations are made equal. Most televisions are "HDR Ready," in that they can process the metadata, but they dim the whole frame to give the appearance of increased contrast. Of course, you also need to make sure the content you're looking at was mastered properly for HDR. OLED and LCD with FALD will give the proper effect if peak brightness is 600 nits or higher.

With a proper setup, the effect will be obvious. Look at things like the sun, stars, lightning, and other objects or effects that you would expect to give off a lot of bright light in real life. On my PG27UQ, which is LCD FALD with 384 zones and 1000 nit peak brightness, the lightning bolts in the opening mission of Mass Effect Andromeda literally blinded me for an instant in a dimly lit room.
 
Local stores also don't calibrate right; demo mode is usually brightness cranked to a stupid point. I spend most of my day staring at dark screens like Visual Studio's dark theme, so for me OLED is magical. OLED usually doesn't have the greatest peak brightness, but HDR is heavily dependent on local dimming zones, which is why I mainly only see a difference on OLED. Most TVs just aren't at the point where they can actually do HDR yet.
 
Watch Dunkirk in Dolby Vision and then try watching the non-HDR version...everything is so much more detailed...HDR is not all about blinding you with bright images, it's about enhancing the overall contrast by making the blacks inky and the brights more vivid to produce an image that pops off the screen...that's why OLEDs do an amazing job, because they can produce 0 blacks, which is more important than having higher peak brightness (which is why OLEDs are considered the holy grail of displays)...
 
I'll third the above guys. I know a guy who works at Best Buy (hey, they sell TVs and have them on display ;)) and he says they hook all the TVs up to some no-name splitter in the back. So image quality probably isn't there at that point.

I would think, anyway.
 
On my PG27UQ, which is LCD FALD with 384 zones and 1000 nit peak brightness, the lightning bolts in the opening mission of Mass Effect Andromeda literally blinded me for an instant in a dimly lit room.

Still happy with it?
 
HDR is a dud. At best it looks "different," and at worst it looks worse than a normal display. I'm not usually someone to poo poo on display advances, but honestly, this just wasn't a big deal.
 
Honestly I just want a 120/144 Hz 4K G-Sync screen. Drop the HDR, I don't need it, and I don't need local dimming zones either. There is no reason this screen shouldn't already exist. So far HDR appears to be a $2000 buzzword to me.
 
Honestly I just want a 120/144 Hz 4K G-Sync screen. Drop the HDR, I don't need it, and I don't need local dimming zones either. There is no reason this screen shouldn't already exist. So far HDR appears to be a $2000 buzzword to me.

Yup. Monitors are something that should just work and not get in your way. Their features aren't SUPPOSED to be noticeable or stand out. When you see people mentioning things like feeling that they got blinded by an explosion in a game, you know HDR is a gimmick and a dud. People don't WANT to be blinded. They want something they can comfortably look at that doesn't get in their way. HDR effects are actually a bad thing that diminishes visual quality in most cases.
 
Honestly I just want a 120/144 Hz 4K G-Sync screen. Drop the HDR, I don't need it, and I don't need local dimming zones either. There is no reason this screen shouldn't already exist. So far HDR appears to be a $2000 buzzword to me.

HDR is the single largest graphical benefit since we jumped from 480p to 1080p.

As for 144 Hz/4k, that's coming in HDMI 2.1, which has the bandwidth to push 8k 120Hz HDR, and then some. HDMI 2.1 offers almost stupid levels of bandwidth.
 
I went from 1080p straight to 4K HDR so I had a significant WOW factor. Everyone in the house can tell when I do find a good HDR source, but there are so few good sources so far. Without HDR content, there is almost no difference. The few games I have that support it look amazing though.
 
HDR is a dud. At best it looks "different," and at worst it looks worse than a normal display. I'm not usually someone to poo poo on display advances, but honestly, this just wasn't a big deal.
First ask yourself what HDR is. I'm not talking about half-assed HDR; I'm talking about HDR10 and Dolby Vision. Once you research what they do, get back to us. The problem is that it has been very poorly implemented so far. On a proper display it is something great. It isn't a gimmick to have things like proper lighting and having blacks be black.
 
Yup. Monitors are something that should just work and not get in your way. Their features aren't SUPPOSED to be noticeable or stand out. When you see people mentioning things like feeling that they got blinded by an explosion in a game, you know HDR is a gimmick and a dud. People don't WANT to be blinded. They want something they can comfortably look at that doesn't get in their way. HDR effects are actually a bad thing that diminishes visual quality in most cases.
He said that in a dark room he was kinda blinded. What that means is the lightning flash was super bright and everything else was super dark. A small amount of ambient light from a source other than the screen has always been advised for as long as I can remember. That is why many people have things like LED strips giving their room some dim lighting.
 
Still happy with it?
You bet! The monitor is great for all content, including SDR, due to the color and the increased contrast from the FALD. So even though there aren't many games yet that take advantage of HDR, I am still very happy and confident this monitor is futureproof for at least the next few years. I think the only thing that would be a replacement for it would be a G-Sync HDR OLED.
 
HDR is meant to be watched in a dim room. You do not get much benefit from a HIGH DYNAMIC RANGE if you watch it in a bright environment where your eyes are locked to the bright end of the spectrum only. Both SDR and HDR can be bright, big deal, and since showroom TVs are in torch mode they really are bright. The difference is when you watch it in a dark room on calibrated screens: SDR is locked into a certain brightness range, and when you try to turn it brighter everything gets bright, from daylight scenes to shadow details, and the screen just torches your eyes like a giant flashlight. HDR can be dark and bright depending on the scene without losing details in either, and there can be a clear difference in brightness between a guy's white shirt and sunlight sparkling on water in the same scene.
 
I have a TCL 55in 6 series TV and (when you can find it) HDR 4K looks crazy good coming from SDR 1080P. FALD (or OLED) is basically required to get the best out of HDR. Regular 4K is nice, but 4K HDR blows it out of the water.
 
OLED's perfect blacks and contrast give far better image quality, even in SDR, than super-bright HDR flashes on an LCD.

I wouldn't even want HDR, because having 1000 nits blasting at my face is just not something I want.
 
OLED's perfect blacks and contrast give far better image quality, even in SDR, than super-bright HDR flashes on an LCD.

I wouldn't even want HDR, because having 1000 nits blasting at my face is just not something I want.
I use my monitor at about 100-120 nits, so imagining 1000+ nits is not up my alley, but I've never seen it.
 
I think with the right material, it can look amazing. My old Samsung (KU6300) had HDR and it was (I think) mostly a marketing bullet point. I mean, I thought it looked a little nicer, but most of the time I could not even tell the difference. Then I got a new set, the Samsung Q7F, and this has a much improved HDR experience. But it really depends on content. I've found that the TV demo material (what they show in the stores) actually looks the best. I do have a small collection of 4K Blu-ray discs, but I feel like it's hit or miss. Some movies look amazing, or at least have a few amazing scenes, while others barely look any better than a normal 1080p Blu-ray. So content plays a big part.
 
If you have the opportunity, try to see God of War on the PS4 on an HDR TV vs a regular TV (and no, YouTube videos don't count). The difference pretty much sold me on HDR.
 
OLED's perfect blacks and contrast give far better image quality, even in SDR, than super-bright HDR flashes on an LCD.

I wouldn't even want HDR, because having 1000 nits blasting at my face is just not something I want.

Personally I still find the whole 1000-nit thing BS. I've looked at high-end 1000-nit OLED screens with HDR and I've looked at HDR on my 55R617, and personally I'd rather watch HDR content on my R617. All the OLED did was hurt my eyes every time there was a bright scene. That being said, I also hate IPS monitors for the same reason.
 
Because of what Vega said.

But it's a bit early and there is a serious lack of displays that can do it properly, as well as software issues (the Win 10 implementation is far from perfect, for example), and there is also a lack of content. Well, to be fair there is quite an impressive number of HDR Blu-rays already, and even if many of them are upscaled Blu-rays the colours are greatly improved.
 
HDR on OLED or on full-array local dimming LCD is pretty good, but this is where knowing what a TV can or cannot do makes a difference.
The drawback is that how the panels are driven by the content differs per panel and per manufacturer. The limits of the panel and of the implementation are not universal, which would otherwise make the effect more obvious (every HDR effect is handled separately per title).

I don't think you need 1000 nits; I've seen some people dismissing computer monitors which do 400 nits. All in all, the way the effect is used and how the panel translates it is more important than the raw nits number. There is the obvious case where a bright scene is made even brighter, so an explosion will seem more appealing on a higher-nit panel, but that certainly doesn't mean the lesser effect on a lower-nit panel will seem bland if the drivers (both panel and software) do their job.

And that is why HDR is not felt as a great improvement. It certainly can have its moments, but overall the hardware used for HDR (and how it handles it) must get a lot better to impress people.
 
I understood 1000 nits was the requirement for true HDR? Armenius

I have some experience with HDR in the home theater hobby with projectors, and it's a faux label on nearly all projectors of $10k or less, because they just don't have the brightness to truly do it justice. They typically dim the content to make up the delta for the highlights and it just looks like trash. I've not seen it implemented correctly yet in the half dozen home theaters of friends I know who have tried to implement it.
 
HDR is the most promising display tech since adaptive sync / g-sync IMO. We just need more content which will come once it is more widely adopted.
 
OLED's perfect blacks and contrast give far better image quality, even in SDR, than super-bright HDR flashes on an LCD.

I wouldn't even want HDR, because having 1000 nits blasting at my face is just not something I want.
I use my monitor at about 100-120 nits, so imagining 1000+ nits is not up my alley, but I've never seen it.

Again, 1000 nits in HDR is a different thing from 1000 nits blasting in your face in SDR. 1000 nits is only for the highlights, like a sparkle or a really bright explosion. It will not blind you; it won't even feel uncomfortable, since it is not the whole screen that is bright but a small object or detail. If you tuned your SDR screen to peak at 1000 nits then of course it would feel uncomfortable, because then even the darkest parts are something like 500 nits; it is the whole screen that is bright, from all the shadow details to all of the whites. If the HDR picture feels bright all the time, then blame the movie's creator for doing a poor job of mastering the film, because that is not what should happen. Average brightness for an HDR movie should be around 100-200 nits. As I said, HDR is meant for dark-room viewing to maximise its benefits.

People who do not know or understand HDR and SDR and the difference between them should keep their opinions to themselves, honestly.

*edit* The last part was not for you HiCZoK
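If it helps to see the relative-vs-absolute point in code, here's a rough Python sketch. The gamma 2.4 and the PQ constants are the standard BT.1886 / SMPTE ST 2084 values; the helper names (sdr_nits, pq_nits), display peaks, and code values are just made-up examples for illustration, not anyone's actual implementation.

```python
# Rough sketch: SDR brightness is relative to whatever the display's peak is,
# while an HDR PQ (SMPTE ST 2084) code value maps to an absolute luminance.

def sdr_nits(code, display_peak_nits, gamma=2.4):
    """SDR: a code value in [0,1] is just a fraction of the display's peak."""
    return display_peak_nits * (code ** gamma)

def pq_nits(code):
    """PQ EOTF: a code value in [0,1] maps to 0-10,000 nits, display-independent."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = code ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

# The same mid-grey SDR code gets brighter as you crank the display...
for peak in (100, 500, 1000):
    print(f"SDR code 0.5 on a {peak}-nit display: {sdr_nits(0.5, peak):.0f} nits")

# ...while a PQ code value always requests the same absolute luminance
# (0.5 comes out around 92 nits, 0.75 around 1000 nits).
for code in (0.5, 0.75):
    print(f"PQ code {code}: {pq_nits(code):.0f} nits on any compliant display")
```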
 
Again, 1000 nits in HDR is a different thing from 1000 nits blasting in your face in SDR. 1000 nits is only for the highlights, like a sparkle or a really bright explosion. It will not blind you; it won't even feel uncomfortable, since it is not the whole screen that is bright but a small object or detail. If you tuned your SDR screen to peak at 1000 nits then of course it would feel uncomfortable, because then even the darkest parts are something like 500 nits; it is the whole screen that is bright, from all the shadow details to all of the whites. If the HDR picture feels bright all the time, then blame the movie's creator for doing a poor job of mastering the film, because that is not what should happen. Average brightness for an HDR movie should be around 100-200 nits. As I said, HDR is meant for dark-room viewing to maximise its benefits.

People who do not know or understand HDR and SDR and the difference between them should keep their opinions to themselves, honestly.

*edit* The last part was not for you HiCZoK

I guess for a FALD LCD that can reach 1000 nits it's pretty tricky not to blind you (sometimes), because none of them have nearly enough zones yet. That would explain the experience people are having with the PG27UQ, for example. But again, this is a flaw in the implementation rather than a weakness of HDR. (I'm sure some content creators are messing things up as well from time to time.)
 
HDR is great because of the increased contrast ratio due to the increased range, but you pretty much need an LG OLED for that. HDR for monitors is a GIMMICK. The $2000 HDR monitor has a contrast ratio BELOW an average VA panel. My 2009 TV has better contrast than that excessively over-priced HDR monitor from ASUS... And that thing sucks even though it has those 384 zones!
 
I'll third the above guys. I know a guy who works at Best Buy (hey, they sell TVs and have them on display ;)) and he says they hook all the TVs up to some no-name splitter in the back. So image quality probably isn't there at that point.

I would think, anyway.

Unless they're running the TVs on VGA instead of HDMI, it's a digital signal, so running it through a splitter should make no difference at all. The only question is whether it supports a new enough version of HDMI for 4K/HDR signals or is feeding everything 1080p SDR.
 
SDR is a limited (narrow) band; having a 1000-nit HDR display doesn't push that same rendered band 2x to 3x higher the way ramping up the brightness on an SDR screen would. SDR brightness is relative; HDR uses a different system for brightness which uses absolute values. It opens up the spectrum so content's highlights, shadows, etc. can go into a much broader band (1000-nit peak to .05 black depth). When SDR goes "outside" of its narrow band, it crushes colors to white and muddies dark detail to black. HDR will show the actual colors in higher-brightness highlights (gleaming reflections, edges) without crushing to white. HDR shows the same content at the same brightness when that content falls within a calibrated SDR range; it does not scale up the brightness of the whole scene like turning the brightness of an SDR screen up would.
"HDR video increases the range of brightness in an image to boost the contrast between the lightest lights and the darkest darks. If you’re having difficulty grasping how that translates into a more realistic image on your screen, think of the subtle tonal gradations a fine artist creates in a charcoal drawing to build the illusion of volume, mass, and texture, and you should begin to get the picture. But HDR doesn’t just improve grayscale; its greater luminance range opens up a video’s color palette as well. “Basically, it’s blacker blacks, whiter whites, and higher brightness and contrast levels for colors across the spectrum,” says Glenn Hower, a research analyst at Parks Associates.
The result is richer, more lifelike video images. Rather than washing out to white, as it would in conventional video, a ray of sunlight reflecting off a lake in HDR will gleam, and a bright cloud will appear soft and cottony. Basically any image your current TV would render shadowed, dull, muddy, or bleached out will look nuanced, vibrant, and strikingly realistic in HDR."
--------------------------------------------------------
"Realistically, most content will be mastered at 1,200-2,000 nits.
But that is the absolute peak brightness that highlights will reach. Maximum scene brightness is to remain below 400 nits."
--- This is what the uninformed think happens:
"If you viewed SDR content on a 1,000 nit display, everything would be displayed 10x brighter than intended.
If you viewed that same SDR content on a 500 nit display, everything would be half the brightness of the 1,000 nit display."
---This is what really happens:
If you viewed HDR content on a 1000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.
For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower-brightness display (everything above 500 nits turns white) which would not be present on the higher-brightness display. <---- Which will show full color gradations across the spectrum in bright highlights instead of crushing to white after hitting the SDR ceiling.
HDR enables far more vivid, saturated, and realistic images than SDR ever could.
High-brightness SDR displays are not at all the same thing as HDR displays."
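To spell out the clipping behaviour quoted above, here's a minimal Python sketch. The scene values, display peaks, and the displayed_nits helper are all made up for illustration, and a real TV would tone-map highlights with a roll-off rather than hard-clip like this.

```python
# Minimal sketch of the clipping behaviour quoted above: HDR code values request
# absolute nit levels, and a panel simply cannot show anything above its peak.
# (Real TVs tone-map with a roll-off instead of hard-clipping like this.)

def displayed_nits(requested_nits, display_peak_nits):
    """Naive display model: reproduce the requested luminance up to the panel's
    peak, clip everything above it."""
    return min(requested_nits, display_peak_nits)

# A made-up HDR scene: dark shadows, a face, a white shirt, sun glinting on water.
scene = {"shadows": 5, "face": 80, "white shirt": 300, "sun on water": 1500}

for peak in (500, 1000):
    print(f"\n{peak}-nit display:")
    for name, nits in scene.items():
        out = displayed_nits(nits, peak)
        note = "  <-- clipped" if out < nits else ""
        print(f"  {name:>13}: {out:>5} nits{note}")
```

Everything below 500 nits comes out identical on both displays; only the 1,500-nit highlight gets clipped on the 500-nit panel, which is the whole point of the quote.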
------------------------------------------------------
http://www.creativeplanetnetwork.com/news/shoot/understanding-gamma-and-high-dynamic-range/615702
"To achieve HDR, it is necessary to use bit-depth more efficiently while keeping average brightness about the same as an SDR image. This means more bit-levels for low light levels where the eye is more sensitive and fewer bit-levels for the high brightness areas where the eye cannot see the contouring. In others words, we need a Perceptual Quantizer or PQ that does a better job than the PQ of the current BT.709 gamma."
"Modern cameras are capable of capturing a wide dynamic range. But unfortunately SDR displays will either clip or blow out highlights in images."
"All it takes is one look at a true HDR video on an HDR-capable screen, and consumers are convinced it’s the way to go. But having that impact requires a good understanding of gamma and its role in capturing and monitoring HDR content."
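The bit-depth point in that quote is easy to see if you invert the PQ curve and count how many of a 10-bit signal's 1024 code values land in each brightness band. Here's a quick Python sketch: the constants are the published SMPTE ST 2084 values, while the pq_code helper name and the band boundaries are just picked for illustration.

```python
# Quick sketch: where a 10-bit PQ (SMPTE ST 2084) signal spends its code values.
# The inverse EOTF below maps an absolute luminance in nits to a [0,1] code value.

def pq_code(nits):
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# Count 10-bit code values per brightness band (boundaries chosen for illustration).
for lo, hi in [(0, 100), (100, 1000), (1000, 10000)]:
    codes = round((pq_code(hi) - pq_code(lo)) * 1023)
    print(f"{lo:>5}-{hi:<5} nits: ~{codes} of 1024 code values")
```

Roughly half of the code values end up below 100 nits, which is the "more bit-levels for low light levels" the article is getting at.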
 
The eyes, or rather the brain, already 'compress' the dynamic range of an image.
With a proper implementation of HDR and a big display, it will be nice to have my brain do it for displayed images instead of having it artificially compressed... but I am far from going 'bonkers' for HDR.
A good SDR display is what I need today for my PC usage.
 
HDR = richer, more lifelike images? Or hyperprocessed, artificial-looking images?

I prefer something that can replicate what I see in real life with my own eyes. Not hyperjacked up colors
 
Fanboys go bananas when you talk bad about HDR. They bust out all these fancy numbers and explanations, etc. Then they call you wrong or, my favorite, tell you you're not doing it right.

Dunno, I have a 49" KS8000 with HDR. I've spent a great deal of time with it and ... yeah, they need to work on the out of box experience for sure. It just needs to be baked in and work without any jacking around by the end user.

If you have a high-end set and it's properly set up, it's very, very difficult for me to see any difference with HDR properly configured. I have maybe 12 Ultra Blu-rays with HDR.

Also I think I read the Samsung KS8000 can do 1400 nits of brightness
 
HDR = richer, more lifelike images? Or hyperprocessed, artificial-looking images?

I prefer something that can replicate what I see in real life with my own eyes. Not hyperjacked up colors
There is a demo out by Philips which shows older 2016 OLED panels with a newer processing engine producing better colours than a recent Philips OLED which uses the 2017 panels.

The contrast and how you address the hardware are important, and what you are saying is true: the need to drive the colour saturation and contrast up to the point where the difference is obvious can make it a rather silly race of who can do the most with their framebuffer effects.

The OLED panels these days are not improving that much. That is why Philips opted for a new processing engine to get the most out of them; supposedly at the end of the year another demo will arrive.
 
Fanboys go bananas when you talk bad about HDR. They bust out all these fancy numbers and explanations, etc. Then they call you wrong or, my favorite, tell you you're not doing it right.

Dunno, I have a 49" KS8000 with HDR. I've spent a great deal of time with it and ... yeah, they need to work on the out of box experience for sure. It just needs to be baked in and work without any jacking around by the end user.

If you have a high-end set and it's properly set up, it's very, very difficult for me to see any difference with HDR properly configured. I have maybe 12 Ultra Blu-rays with HDR.

Also I think I read the Samsung KS8000 can do 1400 nits of brightness
Even that TV sucks for local dimming. The only TVs I've seen HDR really work well on are OLEDs presently. I haven't checked the higher-end 2018 models, but I think we're still a couple of years from HDR really being here. OLED still has its own things to improve upon. HDR as a tech can improve quality a lot; marketing buzzwords and half-assed implementations are where this thread comes from. That KS8000 looks fantastic BTW, but HDR just isn't really here yet.
 
I've had HDR for the longest time. I go outside and I look at the trees, and I watch the sunset, and I get out of my house and go for a hike...

The best HDR and I get it for free.

Enjoy fake HDR for thousands of dollars.
 

The same types of things were said about 16:9 screens by 4:3 users, about 1080p HD, Blu-rays, surround sound, 1440p, 4K, 120Hz, G-Sync, FALD, soon HDMI 2.1 w/VRR, OLED, HDR, etc., etc. If you don't want to adopt newer tech right away, don't. Many of us will pick our battles and buy in in time (and as content becomes more ubiquitous) even if we aren't all day-1 early adopters. Some of us have been waiting for big display advancements for years and aren't going to wait forever until a display is cheaper than an Xbox. I'm personally very interested in HDMI 2.1 VRR HDR 120Hz 4K LGs in 2019.
 