Philips’ Momentum 436M6VBPAB Will Support DisplayHDR 1000 and Adaptive Sync

Megalith
Philips is releasing a new 43” MVA-panel display that flaunts VESA’s highest HDR certification, DisplayHDR 1000, which demands local dimming and a peak luminance of 1000 cd/m². But despite this certification, it is arguably still “fake” HDR, as the panel is 8-bit (16 million colors) only and relies on FRC (frame rate control) to achieve more colors. A true HDR panel is 10-bit and capable of over 1 billion colors.

With DisplayHDR 1000 and UHDA certifications, the Momentum 436M6VBPAB promises to deliver a great HDR experience with 97.6% coverage of the DCI-P3 color space, a peak brightness of over 1000 nits, local dimming for deeper blacks and support for 10-bit color (8-bit + FRC). The display will support HDMI 2.0, DisplayPort 1.2, mini DisplayPort and USB Type-C (DP alt mode) inputs, though at this time the monitor's variable refresh rate (VRR) range is unknown.
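The gap between the "16 million" and "1 billion" color figures above is simple arithmetic; here is a quick Python sketch of it:

```python
# Quick arithmetic behind the "16 million" vs. "1 billion" color figures:
# an RGB panel displays (2^bits)^3 distinct colors.
def panel_colors(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(panel_colors(8))   # 16,777,216 (8-bit: "16 million colors")
print(panel_colors(10))  # 1,073,741,824 (10-bit: "over 1 billion colors")
```

FRC dithers between adjacent 8-bit levels over time to approximate the in-between 10-bit values, which is why an 8-bit+FRC panel can accept a 10-bit signal without being a native 10-bit panel.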
 
Getting warmer.

43" is the perfect size for a 4k screen on a desktop.

I need to be able to use adaptive sync on Nvidia GPUs before I jump in though, as AMD just doesn't make anything fast enough for 4k.
 
VRR range unknown: I really wish AMD hadn't half-assed FreeSync. They knew they were going to be late copying Nvidia, at least they could have copied G-Sync right lol.
 
Exciting.
43 actually IS the perfect size for near field 4K gaming - I’ve got 3 of them.
Hopefully they don’t screw this up.
Like putting 120hz PWM dimming on it.
 
The G-Sync monitors with HDR were announced earlier this year. While a little out of my purchasing range at $2,500, I'm sure that people will be lining up to get those.
https://www.anandtech.com/show/12637/acer-and-asus-gsync-hdr-displays-listed-and-priced

[H]ardforum thread is here.
https://hardforum.com/threads/gsync-4k-144hz-hdr-monitor-prices-released.1958362/#post-1043578001

Yeah, but 27"... Way too small for 4k. One of these in 43" would be absolutely perfect.
 
Exactly - HDMI 2.1 isn’t a thing yet anywhere. Doubt even Volta will have it, let alone any displays, TVs, monitors or otherwise.
 
Yeah, but 27"... Way too small for 4k. One of these in 43" would be absolutely perfect.

Why? You people keep saying this stuff but it makes no sense. My last 4K monitor was 28 inches and it was awesome. I'm not a bat, so I could see my desktop just fine (do all of you need glasses?) and games, due to the PPI, surely looked way better than they would have on a 40+ inch monitor.

No, 4K on 27-32 inches is just fine, YOU PREFER IT on a larger monitor.
 
Why? You people keep saying this stuff but it makes no sense. My last 4K monitor was 28 inches and it was awesome. I'm not a bat, so I could see my desktop just fine (do all of you need glasses?) and games, due to the PPI, surely looked way better than they would have on a 40+ inch monitor.

No, 4K on 27-32 inches is just fine, YOU PREFER IT on a larger monitor.

Bullshit.

Anything over 100ppi is a complete waste at normal desktop viewing distances.

I gain much more from having an immersive large screen and more desktop real estate than I'll ever do by wasting pixels on high PPI that will force me to lean in and sit stupidly close to get any benefit from it.

Phones are great at higher PPI, but we also hold them A LOT closer to our faces.
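(The PPI figures being thrown around in this thread all come straight from resolution and diagonal size; a quick Python sketch of the calculation, for reference:)

```python
import math

def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

for size in (27, 28, 32, 43):
    print(f'4K (3840x2160) at {size}": {ppi(3840, 2160, size):.0f} PPI')
```

4K at 43" lands right around the ~100 PPI mark being argued over here, while 4K at 28" is roughly 157 PPI.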
 
Bullshit.

Anything over 100ppi is a complete waste at normal desktop viewing distances.

I gain much more from having an immersive large screen and more desktop real estate than I'll ever do by wasting pixels on high PPI that will force me to lean in and sit stupidly close to get any benefit from it.

Phones are great at higher PPI, but we also hold them A LOT closer to our faces.

I agree with you 100%. While it might look fine on a 28" monitor, at 40-43" you are at the same DPI as a 1440p 28". Everything is just as sharp, you just have a shitload more real estate. The big issue really is getting higher refresh rates on displays that big. HDR is a step in the right direction though.
 
Samsung's HDR+ TVs are maybe cheaper, around $500. I can see the gamma is strange, no good for all images, or the colors are too saturated. HDR+ vs. HDR1000?
 
Getting warmer.

43" is the perfect size for a 4k screen on a desktop.

I need to be able to use adaptive sync on Nvidia GPUs before I jump in though, as AMD just doesn't make anything fast enough for 4k.

And here I feel like my 27" 4k monitor is way too big!
 
Really just depends on your desk setup. 43 would be way too big for my current desk setup as it's pretty damn close to where I sit to view it.
 
Bullshit.

See, I can do that too...doesn't make you any more right. 4K at 28 inches is dope, I know because I had it. I'd go back to it over a larger size actually.


Yeah, sorry, that was a little harsh. I was in a foul mood last night.

I still stand by my argument, that anything over 100dpi is completely wasted in a desktop application where monitors are typically at arm's length away (I know I never sit closer than ~2.5ft to my screens), but I could have phrased that nicer.
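(The ~100 PPI / arm's-length claim can be roughly sanity-checked against the usual 1-arcminute-per-pixel rule of thumb for 20/20 vision; a back-of-the-envelope Python sketch, with the caveat that real visual acuity varies and 1 arcminute is a simplification:)

```python
import math

def ppi_at_acuity_limit(viewing_distance_in: float, arcmin: float = 1.0) -> float:
    """PPI at which a single pixel subtends the given visual angle.
    ~1 arcminute per pixel is the common rule of thumb for 20/20 vision."""
    pixel_pitch_in = viewing_distance_in * math.tan(math.radians(arcmin / 60))
    return 1.0 / pixel_pitch_in

print(f"{ppi_at_acuity_limit(30):.0f} PPI")  # at ~2.5 ft (30 inches)
```

At a 2.5 ft viewing distance this works out to roughly 115 PPI, which is in the same ballpark as the ~100 PPI figure being argued.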
 
And here I feel like my 27" 4k monitor is way too big!

Really just depends on your desk setup. 43 would be way too big for my current desk setup as it's pretty damn close to where I sit to view it.

I completely understand not having the stomach for a 43" screen. My 48" is quite a behemoth until you get used to it, and it is not for everyone.

I'm not talking about absolute size, but rather pixel density.

Not wanting a screen larger than 24" or 27" is completely understandable, but in that case, stick to 1080p or 1440p, respectively. There is no point in wasting GPU power on a small high resolution screen.
 
anything over 100dpi is completely wasted

I don't know...Maybe for straight desktop use I can see this...my 32 in 1440p now is 91ppi and for every day use it's nice.

However, for GAMING 28 inches at 4K was great. It was almost like looking through a window since it was all so clear. Yeah, desktop use over 100ppi I won't disagree with (though, higher never hurts certainly) but for gaming, yeah, I'd still prefer the sharpness and detail a higher ppi brings. 157ppi for a 28 inch 4K...looked amazing.
 
I had a 4k 43" Samsung and I upgraded to a curved 43" 4k Samsung, and that was a great decision. Curved for TV is stupid. Curved for a 4K monitor that is a couple feet away from a single viewer/user is just so much more immersive.

My dream monitor would be a 4K, 43" to 45" ,curved OLED, with 144hz refresh and <10ms response time.
 
I completely understand not having the stomach for a 43" screen. My 48" is quite a behemoth until you get used to it, and it is not for everyone.

I'm not talking about absolute size, but rather pixel density.

Not wanting a screen larger than 24" or 27" is completely understandable, but in that case, stick to 1080p or 1440p, respectively. There is no point in wasting GPU power on a small high resolution screen.

Try 4k at 27"! It looks gorgeous compared to my old 1440p144hz which shows a ton of aliasing.

I hate aliasing and 27" comes close. I would prefer a 24" 144hz 4k monitor to be honest.

I love my First person shooters and smaller is better for me.

I personally want every 4k144hz size available as we all have different preferences. That isn't a bad thing!
 
Try 4k at 27"! It looks gorgeous compared to my old 1440p144hz which shows a ton of aliasing.

I hate aliasing and 27" comes close. I would prefer a 24" 144hz 4k monitor to be honest.

I love my First person shooters and smaller is better for me.

I personally want every 4k144hz size available as we all have different preferences. That isn't a bad thing!

If aliasing is your major concern, you can get better results by forcing higher anti-aliasing settings with less GPU load than increasing the resolution to 4k...
 
If aliasing is your major concern, you can get better results by forcing higher anti-aliasing settings with less GPU load than increasing the resolution to 4k...

Oh come on man...do you even game? Everyone knows AA takes a loss on detail. 4K native with no AA looks way better than 1080p/1440p with AA. Hell, even when the game ran fine with it on I STILL turned AA off just because it served no purpose other than making everything blurrier.

We can tell what I say is a fact as well. Go downsample 4K on a 1440p monitor and tell me it doesn't look better than 1440p with 4-8xAA. Especially, again, in the details. Native resolution even kind of sucks. 4K downsampled on my 1440p gave me way more detail (especially on things like leaves and grass and power cables/wires, etc).
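(The "downsample 4K onto a 1440p monitor" trick described above is effectively supersampling; a minimal box-filter sketch of the idea, using a hypothetical toy grayscale "image" as nested lists:)

```python
def box_downsample(img, factor=2):
    """Average each factor-by-factor block of a grayscale image
    (nested lists of 0..1 values) into one output pixel. This is the
    box-filter form of rendering high and scaling down (ordered-grid SSAA)."""
    out = []
    for y in range(0, len(img), factor):
        row = []
        for x in range(0, len(img[0]), factor):
            block = [img[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard black/white edge rendered at 2x resolution: downsampling produces
# intermediate gray values along the edge, which is the anti-aliasing effect.
hi_res = [[0, 0, 1, 1],
          [0, 0, 1, 1],
          [0, 1, 1, 1],
          [1, 1, 1, 1]]
print(box_downsample(hi_res))  # [[0.0, 1.0], [0.75, 1.0]]
```

Each output pixel is the average of four rendered samples, which is why a 4K-to-1440p (or 4K-to-1080p) downsample behaves like a high-quality SSAA pass.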
 
Gaming at 4K 27" is a great experience, and I too don't have much issue seeing the desktop etc. on a 4K 27". Also, Win10 scaling, if needed, does a fairly decent job on most stuff.
 
Oh come on man...do you even game? Everyone knows AA takes a loss on detail. 4K native with no AA looks way better than 1080p/1440p with AA. Hell, even when the game ran fine with it on I STILL turned AA off just because it served no purpose other than making everything blurrier.

We can tell what I say is a fact as well. Go downsample 4K on a 1440p monitor and tell me it doesn't look better than 1440p with 4-8xAA. Especially, again, in the details. Native resolution even kind of sucks. 4K downsampled on my 1440p gave me way more detail (especially on things like leaves and grass and power cables/wires, etc).


I have no idea what you are talking about, man.

I suspect you are having some real placebo issues.

Some minor loss in clarity? Maybe, but even with still frames, zoomed in, there is no apparent clarity loss between No AA, FXAA, 2x or 4X, as illustrated below. In fact, apart from the reduction in aliasing, they look damned near identical.

[Image: FXAA / MSAA comparison screenshots]


Reference the 2011 FXAA Review for more examples.

Of course downsampling from 4k to 1440p is going to give you good results, as this is essentially an extreme example of FSAA/SSAA. However, you can get results that are indistinguishable at much lower GPU load.

Silly high pixel densities are not the solution to aliasing, though, as aliasing is still apparent at very high PPIs if no anti-aliasing technique is used. It is an unfortunate side effect of raster graphics.
 
I have no idea what you are talking about, man.

I suspect you are having some real placebo issues.

Some minor loss in clarity? Maybe, but even with still frames, zoomed in, there is no apparent clarity loss between No AA, FXAA, 2x or 4X, as illustrated below. In fact, apart from the reduction in aliasing, they look damned near identical.

View attachment 68907

Reference the 2011 FXAA Review for more examples.

Of course downsampling from 4k to 1440p is going to give you good results, as this is essentially an extreme example of FSAA/SSAA. However, you can get results that are indistinguishable at much lower GPU load.

Silly high pixel densities are not the solution to aliasing, though, as aliasing is still apparent at very high PPIs if no anti-aliasing technique is used. It is an unfortunate side effect of raster graphics.

This is very true, though I would take a downsampled (to native resolution) image over other forms of AA if the performance was the same, but it never is. That's why FXAA and other post AA are awesome.
 
I have no idea what you are talking about, man.

Well go test it out then. Go get a 28 inch 4K monitor and play games on it at native resolution with and without AA and tell me that AA on looks better...looks the same detail wise. It doesn't. Again, I would know because I've actually done it lol. I don't know why you must argue with facts. AA in effect is a blurring technique and blurring always = less detail. Have you ever seen something that's blurry that's more detailed than it not being blurry?
 
4K sub 30” sucks. Pointless, actually. Once you’ve gamed on a 43 you won’t settle for less.
Nothing immersive about gaming on a tiny 27” screen. Get with the times (er 4 years ago) folks.
 
Well go test it out then. Go get a 28 inch 4K monitor and play games on it at native resolution with and without AA and tell me that AA on looks better...looks the same detail wise. It doesn't. Again, I would know because I've actually done it lol. I don't know why you must argue with facts. AA in effect is a blurring technique and blurring always = less detail. Have you ever seen something that's blurry that's more detailed than it not being blurry?

I have run both with and without (not by choice, but due to my Pascal Titan X not being able to keep up with it at 4k) many times across many titles going back literally decades. (and before we begin to debate eyesight, yes, my vision is corrected to better than 20/20)

1.) MLAA/FXAA can result in a slight blur. It is subtle. If you do side by sides on still frames it can be quite noticeable (but not always in all titles), but in moving scenes it's pretty damned subtle, to the point of not being noticeable during gameplay. This is to be expected though, as these technologies are compromise technologies trying to gain much of the benefit seen in FSAA/SSAA, but at lower computational cost.

2.) With FSAA/SSAA, apart from the varying degrees of effectiveness at dealing with aliasing (2x is not very effective, 4x is more so, 8x is usually perfect, but rather computationally intensive) I have never noticed any degradation in clarity either in motion or on still frames. Just like the screenshots from the [H] FSAA review above there is no difference at all.



I'd argue one or two things are going on here, either:

a.) You have some weird problem with your setup; or

b.) Rather than "facts" as you put it, image quality is an inherently subjective measure, and you are suffering from some serious placebo effect, seeing problems where there are none with AA, and seeing benefits where there are none from higher PPI based solely on what you expect to see, not on what is actually there. (this is - after all - how the human brain works, it's not your fault)
 
2.) With FSAA/SSAA, apart from the varying degrees of effectiveness at dealing with aliasing (2x is not very effective, 4x is more so, 8x is usually perfect, but rather computationally intensive) I have never noticed any degradation in clarity either in motion or on still frames. Just like the screenshots from the [H] FSAA review above there is no difference at all.

FXAA is effectively a fullscreen smearing, never cared for it myself. SSAA is a bit better, and of course MSAA and its variants are better, but at the end of the day they're still blur filters. It's the inherent point of them. They "blur" neighboring pixels to reduce aliasing.

Maybe it's the larger PPI, but to me on my 28 inch no AA always looked crisper. Especially playing The Division at the time. Things like power lines and tree limbs lost detail using any AA method, because the smaller, thinner pieces would blur (I'm sorry, anti-alias) out, basically. Hell, there were some things that would practically disappear with AA on because they were small/thin enough that they were basically eliminated once blurred slightly.

So, again, 28 inches for 4K is fine...43+ inches isn't needed or wanted by most PC users. AA blurs and lessens detail.

I mean, think about it, we know that even running native resolution doesn't really show ALL the detail. Downsampling proves this, as running something like 4K on 1440p does give a lot more detail. OK...that's ADDING rendered pixels. So now if we run native resolution plus AA (blur), it makes no sense that there WOULDN'T be detail loss.

Look man, all I can say is that I know what I see. AA reduced detail in the games I played on it, all of them. I don't know what else to tell ya? With my contacts my vision is 16/20 and I used an arm so the monitor was closer to me. Outside that, what I say IS true.
 
FXAA is effectively a fullscreen smearing, never cared for it myself. SSAA is a bit better, and of course MSAA and its variants are better, but at the end of the day they're still blur filters. It's the inherent point of them. They "blur" neighboring pixels to reduce aliasing.

Maybe it's the larger PPI, but to me on my 28 inch no AA always looked crisper. Especially playing The Division at the time. Things like power lines and tree limbs lost detail using any AA method, because the smaller, thinner pieces would blur (I'm sorry, anti-alias) out, basically. Hell, there were some things that would practically disappear with AA on because they were small/thin enough that they were basically eliminated once blurred slightly.

So, again, 28 inches for 4K is fine...43+ inches isn't needed or wanted by most PC users. AA blurs and lessens detail.

I mean, think about it, we know that even running native resolution doesn't really show ALL the detail. Downsampling proves this, as running something like 4K on 1440p does give a lot more detail. OK...that's ADDING rendered pixels. So now if we run native resolution plus AA (blur), it makes no sense that there WOULDN'T be detail loss.

Look man, all I can say is that I know what I see. AA reduced detail in the games I played on it, all of them. I don't know what else to tell ya? With my contacts my vision is 16/20 and I used an arm so the monitor was closer to me. Outside that, what I say IS true.


This is some good theoretical discussion, but what really matters is what happens in practice.

Since I'm lacking any other good examples right now, lets refer back to the FSAA review.

In any one of those image quality side by sides, can you point out an example of the loss of detail you experience with AA?

If I have the time later, maybe I'll even do some screenshots and post for comparison, but for right now, the above is what we have.
 
If I have the time later, maybe I'll even do some screenshots and post for comparison

Don't bother...I did.

Game is The Division.

[Image: The Division, AA on vs. AA off comparison]


As you can see, clearly I'd hope, while AA might not make some DRASTIC difference you can see even here the lack of the finer details when AA is applied. The AA picture just looks blurrier over all on top of the more distinct differences I marked. Now, imagine this on a monitor whose PPI is nearly twice what you see above (91ppi 1440p 32" vs. 157ppi 2160p 28") and the differences are even more drastic.

Yes, AA blurs. Here's the proof. What else do ya want man?!
 

Thanks for the clarification. Leave it to Philips to make it SO easy to differentiate between models... :/
It showed up today under the certified HDR1000 models on DisplayHDR's website. This now looks like an interesting option.
 
Don't bother...I did.

Game is The Division.

View attachment 68980

As you can see, clearly I'd hope, while AA might not make some DRASTIC difference you can see even here the lack of the finer details when AA is applied. The AA picture just looks blurrier over all on top of the more distinct differences I marked. Now, imagine this on a monitor whose PPI is nearly twice what you see above (91ppi 1440p 32" vs. 157ppi 2160p 28") and the differences are even more drastic.

Yes, AA blurs. Here's the proof. What else do ya want man?!
Zarathustra[H]

Still plan on a response? I'm interested to see what you have to say honestly.

SMAA uses the same type of blur filter as MLAA and FXAA. I usually avoid these, and honestly don't even consider them to be true AA.

I'm more of a proponent of various types of FSAA/SSAA or MSAA, these don't blur textures.

Maybe I confused the issue by linking back to the FSAA review. I only linked that as it was the first thing I found with screenshot samples. (it also included 2x and 4x MSAA which is what I was really interested in. Could have been clearer I guess)

Still, even if we use an inferior tech like SMAA, MLAA or FXAA it's tough to tell in your pictures what they are representative of. (are those cellphone camera pictures of a screen? Why?) How zoomed in are they?

If that tree is something dead center and important or is it something off to a corner that one barely notices? On the rare occasion I have used SMAA/MLAA/FXAA I've found that while there are some effects like these, they are usually barely noticeable when actually playing a game, and not combing through screenshots.

If you are hellbent on judging AA by the likes of SMAA/MLAA/FXAA, then sure, it will suck, be my guest, but these have always been low end hacks, and not representative of true AA quality.

In your example earlier of doing a 4k downsample, of course it looks good, because what you are in essence doing is a very high setting of FSAA. You are also going to pay for it in the GPU hit as much as you would by rendering in 4k native (but that won't do shit about the aliasing).

MSAA is a very good balance, but has some shortcomings, in particular with certain games implementations of grass using transparencies, but there are workarounds.

You'll always wind up with a more manageable solution GPU-wise with a good AA application than by trying to brute force it with resolution and pixel density, and - as has been mentioned before - even if you go way up in pixel density, without some sort of AA processing the aliasing will still be there. It may be smaller, but it will still be there.
 