Your 4K TV Is Probably the Only 4K Converter You Need

However, I will say that when watching the same 1080p Blu-ray movie off my PS4 (which relies on the TV's built-in upscaling) vs. via my computer using madVR, the madVR output is clearly better. I do have some of madVR's enhancement features enabled, so that's probably why.

IMO MadVR is Bullshit.

It's not scaling/detail. It's simple gamma and contrast.


Every MadVR comparison I have looked at ends up like this one. This is a pro-MadVR link, and he claims the MadVR images are clearly sharper:
https://wiki.mikejung.biz/MadVR_Chroma_Upscaling_1080p_Image_Quality

But what I see is that the brightness/contrast is altered. There is no actual difference in fine detail.

All the MadVR images are noticeably darker, which can make some midtone details stand out, but it also creates more black crush.

You can take any source, bump the contrast a bit, and maybe darken it a bit for the same effect.
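To show what I mean, here's a minimal sketch (pure Python on 8-bit luma values, not madVR's actual pipeline, and the offset/factor numbers are just made up for illustration): darken slightly and stretch contrast around mid-gray, and you get punchier midtones plus black crush for free.

```python
# Illustrative only: the "darken + bump contrast" look from simple tone
# changes alone. Operates on plain 8-bit luma values, 0-255.

def darken_and_contrast(pixels, brightness=-10, contrast=1.15):
    """Apply a brightness offset, then a contrast stretch around mid-gray."""
    out = []
    for p in pixels:
        v = p + brightness              # global darkening
        v = (v - 128) * contrast + 128  # push values away from mid-gray
        out.append(max(0, min(255, round(v))))  # clipping = black crush at the low end
    return out

ramp = [0, 16, 64, 128, 192, 240, 255]
print(darken_and_contrast(ramp))  # note 16 -> 0: near-black shadow detail is crushed
```

Midtones come out darker, highlights get stretched, and anything near black clips to 0, which is exactly the look those comparison shots have.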
 
There is no "Good" upscaling. Only "Acceptable". MadVR is by far the best however.

Scaling up by 2x2 doesn't make the scaling better, sorry.
It still has to interpolate and fill in the gaps.

2x2 upscaling is only going to benefit you if you are doing point integer scaling, which is really only useful for old low-res stuff. Most TVs and scalers do NOT do point scaling.
http://www.screenshotcomparison.com/comparison.php?id=203848
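Here's the difference in a toy 1-D example (illustrative pure Python, not any TV's actual algorithm): 2x point/integer scaling just repeats each pixel, so a hard edge stays hard; a naive linear filter fills the gaps with in-between values, softening it.

```python
# Sketch: 2x point (nearest-neighbor) scaling vs naive 2x linear scaling
# on a single row of 8-bit pixel values.

def point_2x(row):
    """Nearest-neighbor 2x: duplicate every pixel. Edges stay hard."""
    out = []
    for p in row:
        out.extend([p, p])
    return out

def linear_2x(row):
    """Naive linear 2x: keep each pixel, insert the average with its neighbor."""
    out = []
    for i, p in enumerate(row):
        out.append(p)
        nxt = row[i + 1] if i + 1 < len(row) else p
        out.append((p + nxt) // 2)  # in-between value softens the edge
    return out

edge = [0, 0, 255, 255]  # a hard black-to-white edge
print(point_2x(edge))    # [0, 0, 0, 0, 255, 255, 255, 255] -> edge preserved
print(linear_2x(edge))   # [0, 0, 0, 127, 255, 255, 255, 255] -> a gray step appears
```

That inserted 127 is the "soft edge" you see on interpolated text and UI elements.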

The gap from 1080p to 4K is nearly twice as big as the gap from 720p to 1080p (4x vs. 2.25x the pixel count). It's only partly masked by the fact that there are so many pixels to begin with. But in general, the bigger the upsampling ratio, the worse it looks.

Take a picture at 960x540 (a PS Vita game, basically) and then do a generic, typical linear upsample to 1080p.

Does this look "good" to you?
https://u.cubeupload.com/MrBonk/386960x5441920x1088line.jpg (2x2 upsampling to 1080p)

Nothing will generally look better than native, even playing a game with a huge amount of SSAA at a lower res vs. playing at native with a lower level of SSAA:
http://www.screenshotcomparison.com/comparison.php?id=199372
http://www.screenshotcomparison.com/comparison.php?id=199370
http://www.screenshotcomparison.com/comparison.php?id=199371


There are exceptions though, like if you are trying to emulate a CRT with a shader: the higher the resolution you do this at (typically with 240p material), the better it will look.
I just want my 4K TV to show my 1080p images the exact same way as my 1080p TV, not a soft-edged interpolated version that makes UI elements and text look fuzzy. I'll take my original source material the way it came, thank you!
 

MPC-BE lets you choose "Nearest Neighbor" as your scaling method. No softening. For playing back 1080p on a 4K set you would just get four identical screen pixels for each source pixel, effectively turning your 4K set into a 1080p set.

But there is a reason we don't use nearest-neighbor scaling. If you are sitting close enough to resolve 4K, you will see pixelation, and that is worse than slight image softening. If you are sitting far enough away that you don't see the pixelation, then you won't be able to tell the difference with normal scaling either.
 
What's wrong with softening? I feel it's DLP's strongest point. You can't really see the defined individual pixels... just like you couldn't on (TV) CRTs. Defined pixels are fine on computers, but I do not like them on video.
 