So some general HDR info to try to help people out. This is oversimplified and there are all kinds of things to consider, but there are roughly four basic "tiers" of HDR you'll find out there:
--The first tier is stuff with no local dimming and low brightness. "HDAren't," I've heard it called, and that's accurate. Displays like this will carry the DisplayHDR 400 certification, if they carry one at all. They have no real ability to do high dynamic range; they just know how to accept an HDR signal. Results tend to be pretty trash. Sometimes a game may look a little better, depending on how it handles color grading and such, but in general you shouldn't view displays like this as HDR and really shouldn't count on using it at all.
--The next tier is stuff with edge-lit local dimming and medium brightness. Anything that is DisplayHDR 600 certified or above is guaranteed to be at least this. Here you get some real HDR ability. It can get reasonably bright, competitive with OLED TVs, and can actually dim areas of the display to be darker than others. However, since all the backlights are along the edge, the dimming is imprecise and you don't get many zones; some only have 4 or 8. This means you aren't going to get small bright spots: if part of the screen has a bright thing on it, the whole zone will be bright, which can lead to washed-out areas and bloom. Not an ideal HDR experience, but usable, at least if the local dimming is done well.
--The tier above that is stuff with full-array local dimming (FALD) and high brightness. If something has DisplayHDR 1000 certification it is probably like this, though it doesn't strictly have to be, and it is almost certainly like this if it has DisplayHDR 1400 certification. With this, the backlights are no longer along the edge but in an array covering the back of the panel. This gives not only better brightness (from having more lights) but, more importantly, lets them dim in smaller zones. TVs with this usually start at 70 or so zones, and you can find monitors with over 1000. This allows for pretty precise HDR: you can have a bright light with darkness near it and not a lot of blooming. Still not pinpoint accuracy, but it can be pretty darn good. How precise it is varies a lot with the number of zones, though.
--The top tier is emissive displays, like OLEDs, that can turn individual pixels on and off. No monitors do that yet, but when they arrive they will carry the DisplayHDR True Black 400/500 certifications. Since the individual pixels themselves are lit, you truly can have pinpoint precision. You can get bright stars on inky black space, or the shimmer of just one strand of hair. It's the HDR we all really want, but for now there are no monitors that do it, and there are concerns about using OLED TVs for desktop usage. Also, there is a slight downside in that it can't get as bright: the VESA certifications only call for 400-500 nits, and the best OLED TVs manage maybe 800ish in the real world. Good FALD TVs are easily above 1000 nits, with some above 2000 nits (Vizio PQX).
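To get a feel for why zone count matters so much in those middle tiers, here's a toy simulation of local dimming. It's a sketch under made-up assumptions (the leak ratio, zone counts, and nit values are illustrative, not measurements of any real display): each zone's backlight is driven to the brightest pixel it covers, and dark pixels in that zone can only get as dark as the backlight divided by the panel's native contrast.

```python
def apply_local_dimming(image, zone_h, zone_w, native_contrast=3000):
    """Toy model: drive each backlight zone to its brightest pixel.

    Dark pixels in a lit zone leak up to backlight / native_contrast.
    The 3000:1 native contrast is an assumed, VA-panel-ish number.
    """
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for zy in range(0, h, zone_h):
        for zx in range(0, w, zone_w):
            zone = [image[y][x]
                    for y in range(zy, min(zy + zone_h, h))
                    for x in range(zx, min(zx + zone_w, w))]
            backlight = max(zone)          # zone lights up for its brightest pixel
            floor = backlight / native_contrast  # black level everywhere in the zone
            for y in range(zy, min(zy + zone_h, h)):
                for x in range(zx, min(zx + zone_w, w)):
                    out[y][x] = max(image[y][x], floor)
    return out

# A single 1000-nit "star" on a perfectly black 16x16 sky.
W = H = 16
sky = [[0.0] * W for _ in range(H)]
sky[8][8] = 1000.0

edge_lit = apply_local_dimming(sky, H, W // 4)       # 4 full-height strips
fald     = apply_local_dimming(sky, H // 4, W // 4)  # 4x4 grid, 16 zones

# Black level far above the star, same column:
print(edge_lit[0][8])  # whole strip glows
print(fald[0][8])      # different zone, stays fully black
```

With the edge-lit strips, the entire column containing the star is raised to about 0.33 nits; with the FALD grid, only the star's own small zone glows, which is exactly the bloom difference described above. An OLED would be the limiting case of one "zone" per pixel.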
Also of note is that HDR isn't entirely about brightness and contrast, though those are the big selling points, hence the name. HDR also uses a wide-gamut color space. Technically HDR content is encoded in Rec.2020, but since nothing can actually display that yet, current content is mastered for DCI-P3. What this means is that HDR content can be much more colorful than SDR content, when it is done right and you have a proper display.
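You can put rough numbers on that "more colorful" claim by comparing the triangles the red/green/blue primaries of each standard span on the CIE 1931 xy chromaticity diagram. The primary coordinates below are from the sRGB, DCI-P3, and Rec.2020 specs; the shoelace formula gives each triangle's area. (Area in xy space is a crude proxy, since it ignores perceptual uniformity, but the ordering holds.)

```python
# CIE 1931 xy chromaticity coordinates of the R, G, B primaries,
# per each standard's spec.
PRIMARIES = {
    "sRGB":     [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def gamut_area(prims):
    """Area of the triangle spanned by three primaries (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = prims
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

areas = {name: gamut_area(p) for name, p in PRIMARIES.items()}
for name, a in areas.items():
    print(f"{name:8s} area={a:.4f}  ({a / areas['sRGB']:.2f}x sRGB)")
```

Run it and you'll see DCI-P3 covers noticeably more of the diagram than sRGB (which is roughly what SDR content targets), and Rec.2020 is larger still, which is why it's the container format nothing fully reaches yet.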