HDR 1000 nits vs 400 nits

insoc123
Limp Gawd · Joined Apr 14, 2010 · Messages: 300
I've made up my mind about the new gaming monitor I want, and its specs look quite good, with HDR at 400 nits. There are two monitors (4K, 144 Hz) that offer more nits: 1000. My question is: to really experience good HDR gaming, do I really need those 1000 nits, or would I be fine with 400? This 1000 vs. 400 nits difference is actually the only thing stopping me from buying the monitor.

Thanks a lot,
 

I don't think so. The more the better, but the contrast matters a lot more than the nits.
 
400-nit HDR isn't really HDR. HDR includes all the colors (and brightness is part of color) of SDR, plus a bunch more, and DisplayHDR 400 monitors are for the most part just regular SDR LCDs with the ability to decode HDR metadata. To actually see the intense colors and bright highlights of HDR you need high brightness, and to see the shadow detail you need high contrast (low black levels).

Most DisplayHDR 400 monitors don't have either.
 
High HDR brightness adds a lot to the cost. HDR 400 isn't much, but it gives you something until backlights improve and the cost of that high brightness comes down to more reasonable levels.
 

400 nits is basically fake HDR, and it's really just a marketing ploy by the monitor manufacturers to sell low-end products with the "HDR" badge. AFAIK, 400-nit monitors don't come with FALD; they use a standard edge-lit LED backlight, which makes them more or less no different from most other non-HDR monitors. They're just a little brighter.

Everyone has a different opinion on what makes for a good HDR experience. Some people (like myself) prefer dark blacks. Others prefer high whites. With current tech, there is no perfect option out there. If you primarily game in a dark room, I would go with a VA monitor, which will give much better contrast than IPS. Contrast will be great, and if that’s combined with a wide gamut, that’s even better. If you game in a bright room, then I would suggest going for higher nits.
 
With OLED, 500 nits is enough because of their pure blacks. With LCD, yes, 1000 is the minimum to really start noticing HDR; frankly, 2000 nits would make it obvious, and 1000 is a bare minimum. That's only because LCD blacks are garbage in general. Highlights are important, but if your darks aren't also quite dark, it doesn't really matter. You need great contrast to really appreciate HDR.
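Since the posts here keep coming back to contrast rather than raw brightness, a trivial Python sketch of static contrast ratio makes the point. The panel figures below are made-up ballpark numbers for illustration, not measurements of any specific monitor:

```python
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Static contrast ratio; effectively infinite for a true-black display like OLED."""
    return float("inf") if black_nits == 0 else white_nits / black_nits

# Illustrative, assumed figures only (not any real monitor's measurements):
ips_lcd = contrast_ratio(400, 0.40)   # ~1000:1, typical IPS ballpark
va_lcd  = contrast_ratio(400, 0.13)   # ~3000:1, typical VA ballpark
oled    = contrast_ratio(600, 0.0)    # pixels turn fully off
```

This is why a dimmer OLED can still look more "HDR" than a brighter LCD: raising the white level alone barely moves the ratio if the black level rises along with it.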
 
HDR on the PC is kind of iffy anyway. Unless you plan on watching a lot of movies on it, there aren't many games that implement it properly.
 
HDR is not like regular SDR brightness. It adds full color into a much higher range on light sources and bright highlights, dynamically, throughout a scene, while the rest of the scene falls more in the SDR range. When a regular display reaches its brightness limit, it clips to white at its peak luminance instead of showing that color through the higher-brightness color range. So the HDR color-volume increase is nothing like turning the brightness up on an SDR monitor. In fact, true HDR uses absolute values, which you don't change the brightness of manually at all, while SDR uses relative values, where you can move the entire contents of the narrow color-brightness band up and down in the OSD.

SDR color gamut:
https://i.imgur.com/VR6gxX2.png

HDR color volume:
https://i.imgur.com/3r1M8aR.png
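To see what "absolute values" means concretely: HDR10 encodes pixels with the PQ (SMPTE ST 2084) curve, where a given code value maps to a fixed luminance in nits regardless of the display. Here's a minimal Python sketch; the constants are the exact rationals from the published standard:

```python
# PQ (SMPTE ST 2084) transfer function constants, from the standard.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875
PEAK = 10000.0             # PQ is defined against a 10,000-nit ceiling

def pq_to_nits(code: float) -> float:
    """Decode a normalized PQ code value (0..1) to absolute luminance in nits."""
    p = code ** (1 / M2)
    return PEAK * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def nits_to_pq(nits: float) -> float:
    """Encode absolute luminance in nits to a normalized PQ code value (0..1)."""
    y = (nits / PEAK) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2
```

The same code value always means the same luminance, so there's no OSD brightness knob to slide the signal around: a display that can't reach the encoded luminance has to tone-map or clip, which is exactly where peak-nits specs start to matter.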

OLEDs are great, but in order to avoid burn-in they are locked down to lower color brightness levels in HDR. An OLED can be calibrated at 400-nit color; above that, its use of an added white subpixel (WRGB instead of just RGB) pollutes the color space, "cheating" higher brightness readings using the white subpixels. In addition, OLEDs use ABL (auto brightness limiter) as a safety reflex to avoid burn-in. ABL cuts the HDR color brightness down to 600 nits. Since SDR is around 350 nits before it clips to white at its limit, that means an OLED showing HDR 1000, 4000, or 10,000 content will show something like a 350-nit SDR range of a scene plus 250 nits of white-polluted HDR color highlights and sources.

======

With HDR not being ubiquitous right now, I'd say OLED is a great picture for the moment, since most content, gaming content in particular, is still SDR. I just wouldn't get one expecting much of even fractional HDR 1000, 4000, or 10,000 color volume. This is because OLED is like 350-nit SDR + 250-nit white-pixel-mixed color volume at a 600-nit ABL limit to avoid burn-in.

So a 600nit OLED is something like
350nit SDR color + 250 higher-nit color* capability
(*white-pixeled and varying - reflexively toned down from higher than 600 peak back to 600 via ABL)

1000nit is SDR + 650 greater color volume capability

1800nit is SDR + 1450 greater color volume capability

2900nit is SDR + 2550 greater color volume capability
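The numbers above are just peak nits minus the ~350-nit SDR reference this post uses; as a quick sketch (the 350 figure is the post's assumption, not a standard value):

```python
SDR_REFERENCE_NITS = 350  # this post's ballpark SDR ceiling, not a standard value

def hdr_headroom(peak_nits: float) -> float:
    """Color-volume headroom above the assumed SDR band, per the arithmetic above."""
    return max(peak_nits - SDR_REFERENCE_NITS, 0)

for peak in (600, 1000, 1800, 2900):
    print(f"{peak} nit -> SDR + {hdr_headroom(peak)} nit headroom")
```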

------------------------------------------

Some TVs, like the Samsung Q9FN, can do 1800 to 2000 nit color volume with a 480-zone FALD backlight array, but you get the bloom/dimming offset of FALD, and while they are 19,000:1 contrast, they aren't like OLED, especially considering the FALD offsets.

So you get one of these trade-offs on an OLED or FALD display in regard to HDR:

https://imgur.com/NVsBTV1

----------------
 
The main thing here isn't the brightness, it's the black level and color quality. HDR 400 doesn't require anything special regarding black level/backlight/color at all. They can just take any old trash LCD panel that can display 95% BT.709, put a stronger edge-lit backlight on it, and call it a day.
https://displayhdr.org/performance-criteria/

HDR 500 and up actually have requirements that mean something and ensure pretty solid SDR and HDR quality.
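For reference, the tiers differ on more than just peak nits. Here's a rough sketch from memory of the VESA criteria; the gamut and local-dimming flags below are my approximations, so check displayhdr.org for the authoritative table:

```python
# tier -> (min peak nits, wide gamut (~90% DCI-P3) expected, local dimming expected)
# Approximate values from memory of the VESA DisplayHDR criteria, not the spec itself.
TIERS = {
    "DisplayHDR 400":  (400,  False, False),
    "DisplayHDR 600":  (600,  True,  True),
    "DisplayHDR 1000": (1000, True,  True),
}

def meets_tier(tier: str, peak_nits: float,
               wide_gamut: bool, local_dimming: bool) -> bool:
    """Crude check of a monitor's specs against a (simplified) tier."""
    min_peak, needs_gamut, needs_dimming = TIERS[tier]
    return (peak_nits >= min_peak
            and (wide_gamut or not needs_gamut)
            and (local_dimming or not needs_dimming))
```

The point being: an edge-lit sRGB panel can pass the 400 row, while the higher tiers effectively force wide gamut and some form of local dimming, which is what actually changes the picture.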
 