RogueTadhg
[H]ard|Gawd
- Joined
- Dec 14, 2011
- Messages
- 1,527
I've watched a TV at the local blue store and I wasn't impressed. Is there some way I can really see the difference between HDR and SDR?
On my PG27UQ, which is LCD FALD with 384 zones and 1000 nit peak brightness, the lightning bolts in the opening mission of Mass Effect Andromeda literally blinded me for an instant in a dimly lit room.
Honestly, I just want a 120/144 Hz 4K G-Sync screen; drop the HDR, I don't need it, and I don't need local dimming zones either. There is no reason this screen shouldn't already exist. So far HDR appears to be a $2,000 buzzword to me.
First ask yourself: what is HDR? I'm not talking about the half-assed HDR; I'm talking about HDR10 and Dolby Vision. Once you research what they do, get back to us. The problem is that it has been very poorly implemented so far. On a proper display it is something great. It isn't a gimmick to have things like proper lighting and blacks that are actually black.

HDR is a dud. At best it looks "different," and at worst it looks worse than a normal display. I'm not usually someone to poo-poo display advances, but honestly, this just wasn't a big deal.
He said that in a dark room he was kind of blinded. What that means is that the lightning flash was super bright and everything else was super dark. A small amount of ambient light from a source other than the screen has always been advised for as long as I can remember; that is why many people have things like LED strips giving their room some dim lighting.

Yup. Monitors are something that should just work and not get in your way. Their features aren't SUPPOSED to be noticeable or stand out. When you see people mentioning things like feeling they got blinded by an explosion in a game, you know HDR is a gimmick and a dud. People don't WANT to be blinded. They want something they can comfortably look at that doesn't get in their way. HDR effects are actually a bad thing that diminishes visual quality in most cases.
Still happy with it?

You bet! The monitor is great for all content, including SDR, due to the color and the increased contrast from the FALD. So even though there isn't that much gaming yet to take advantage of HDR, I am still very happy and confident this monitor is futureproof for at least the next few years. I think the only thing that would be a replacement for it would be a G-Sync HDR OLED.
OLED's perfect blacks and contrast gives far better image quality even in SDR, than super bright HDR flashes on an LCD.
I wouldn't even want HDR, because having 1000 nits blasting at my face is just not something I want.
I use my monitor at about 100-120 nits, so imagining 1000+ nits is not up my alley, but I've never seen it.
Again, 1000 nits in HDR is a different thing from 1000 nits blasting in your face in SDR. 1000 nits is only for the highlights, like a sparkle or a really bright explosion. It will not blind you; it won't even feel uncomfortable, since it is not the whole screen that is bright but a small object or detail. If you tuned your SDR screen to peak at 1000 nits then of course it will feel uncomfortable, because then even the darkest parts are something like 500 nits; it is the whole screen that is bright, from all the shadow details to all of the whites. If the HDR picture feels bright all the time, then blame the movie's creator for doing a poor job of mastering the film, because that is not what should happen. Average brightness for an HDR movie should be around 100-200 nits. As I said, HDR is meant for dark-room viewing to maximise its benefits.
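The point above is that SDR code values are relative to whatever peak the panel has, while HDR (PQ, standardised as SMPTE ST 2084) code values map to absolute luminance. A rough Python sketch of the difference (function names are mine, not from the thread; the PQ constants are the published ST 2084 values, and the 2.4 gamma is a common SDR assumption, purely for illustration):

```python
def sdr_nits(signal, display_peak, gamma=2.4):
    """SDR is relative: the same code value scales with the panel's peak."""
    return display_peak * signal ** gamma

def pq_nits(signal):
    """HDR PQ (SMPTE ST 2084) EOTF: a [0,1] code value maps to absolute nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

# The same SDR signal is 10x brighter on a 1000-nit panel than on a 100-nit one:
print(sdr_nits(0.5, 100), sdr_nits(0.5, 1000))
# The same PQ code value means the same nits on any panel that can reach them:
print(round(pq_nits(0.5)))   # ~92 nits, regardless of the display's peak
print(round(pq_nits(1.0)))   # 10000 nits, the format's absolute ceiling
```

This is why a bright highlight in HDR does not imply the whole frame is bright: the shadows and mid-tones are coded at their own absolute levels.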
People who do not know or understand HDR and SDR and what the difference between them is should keep their opinions to themselves, honestly.
*edit* The last part was not for you HiCZoK
I'll third the above guys. I know a guy who works at Best Buy (hey, they sell TVs and have them on display) and he says they hook all the TVs up to some no-name splitter in the back. So image quality probably isn't there at that point.
I would think, anyway.
HDR video increases the range of brightness in an image to boost the contrast between the lightest lights and the darkest darks. If you’re having difficulty grasping how that translates into a more realistic image on your screen, think of the subtle tonal gradations a fine artist creates in a charcoal drawing to build the illusion of volume, mass, and texture, and you should begin to get the picture. But HDR doesn’t just improve grayscale; its greater luminance range opens up a video’s color palette as well. “Basically, it’s blacker blacks, whiter whites, and higher brightness and contrast levels for colors across the spectrum,” says Glenn Hower, a research analyst at Parks Associates.
The result is richer, more lifelike video images. Rather than washing out to white, as it would in conventional video, a ray of sunlight reflecting off a lake in HDR will gleam, and a bright cloud will appear soft and cottony. Basically any image your current TV would render shadowed, dull, muddy, or bleached out will look nuanced, vibrant, and strikingly realistic in HDR.
This is what the uninformed think happens: "If you viewed SDR content on a 1,000 nit display, everything would be displayed 10x brighter than intended. If you viewed that same SDR content on a 500 nit display, everything would be half the brightness of the 1,000 nit display."

This is what really happens: "Realistically, most content will be mastered at 1,200-2,000 nits. But that is the absolute peak brightness that highlights will reach. Maximum scene brightness is to remain below 400 nits."
If you viewed HDR content on a 1,000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded: it represents an exact value, not a relative value. For scenes with highlights exceeding 500 nits, you will have highlight clipping on the lower-brightness display (everything above 500 nits turns white) which would not be present on the higher-brightness display, which will show full color gradations across the spectrum in bright highlights instead of crushing to white after hitting its ceiling.
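That clipping behaviour can be sketched in a few lines of Python. The hard clip at the panel's peak is an assumption for illustration; real displays usually tone-map highlights with a softer roll-off rather than clipping abruptly:

```python
def render_hdr(coded_nits, display_peak_nits):
    """An absolute-coded HDR value shown on a panel, hard-clipped at its peak."""
    return min(coded_nits, display_peak_nits)

# Shadows and mid-tones look identical on a 500-nit and a 1000-nit panel;
# only highlights above 500 nits differ (they clip on the dimmer panel).
for nits in (0.05, 120, 450, 800, 1500):
    print(nits, "->", render_hdr(nits, 500), "vs", render_hdr(nits, 1000))
```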
HDR enables far more vivid, saturated, and realistic images than SDR ever could.
High-brightness SDR displays are not at all the same thing as HDR displays.
All it takes is one look at a true HDR video on an HDR-capable screen, and consumers are convinced it's the way to go. But having that impact requires a good understanding of gamma and its role in capturing and monitoring HDR content.
There is a demo out by Philips which shows older 2016 OLED panels with a newer processing engine producing better colours than a recent Philips OLED which uses the 2017 panels.

HDR = richer, more lifelike images? Or hyperprocessed, artificial-looking images? I prefer something that can replicate what I see in real life with my own eyes, not hyperjacked-up colors.
Even that TV sucks at local dimming. The only TVs I've seen HDR really work well on are OLEDs, presently. I haven't checked the higher-end 2018 models, but I think we're still a couple of years from HDR really being here. OLED still has its own things to improve upon. HDR as a tech can improve quality a lot; marketing buzzwords and half-assed implementations are why this thread happens. That KS8000 looks fantastic, BTW, but HDR just isn't really here yet.

Fanboys go bananas when you talk bad about HDR. They bust out all these fancy numbers and explanations, etc. Then they call you wrong or, my favorite, say you're not doing it right.
Dunno, I have a 49" KS8000 with HDR. I've spent a great deal of time with it and ... yeah, they need to work on the out of box experience for sure. It just needs to be baked in and work without any jacking around by the end user.
If you have a high-end set and it's properly set up, it's very, very difficult for me to see any difference with HDR properly configured. I have maybe 12 Ultra Blu-rays with HDR.

Also, I think I read the Samsung KS8000 can do 1400 nits of brightness.