Unfortunately not. It's the coating coming off. Be very gentle if you clean these areas to avoid this getting worse.

Alright, will do.
Something weird... in Final Fantasy VII Intergrade, I can set the resolution up to 3840x2160, despite my Delock not being able to go above 340-350 MHz.

The game runs in borderless windowed mode, I'm pretty sure, so it would just be downsampling.

You mean upscaling? If so, then it's working very well (looks great)... I wonder what method they're using, because, as far as I can tell, Nvidia's DSR doesn't look as good unless it's at 4x.
Internal is 2304x1440 and output is 3840x2160 in this case, so it's upscaling.

Upscaling is low internal resolution > high output resolution. You are doing high internal resolution > low output resolution, so that's downsampling/downscaling. Just bring up your monitor's OSD. It'll say the same kHz as your desktop resolution, most likely.

Ah, I thought internal was the native resolution. Good to know.
Your GPU is sending a 2304x1440 signal to your monitor. That is the output resolution. 3840x2160 is the resolution the game is rendering at internally, before being downsampled to 2304x1440 for output to your monitor. Like I said, bring up your monitor's menu and tell me what the horizontal sync kHz reading is. I guarantee it's the same as 1440p.
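The kHz argument above can be sketched numerically. This is a rough illustration only: it assumes a CVT-like ~3% vertical blanking, and real modelines differ, so treat the figures as ballpark.

```python
# Why the monitor's OSD reads the same kHz whether the game renders at 4K
# internally or not: the horizontal sync rate depends only on the signal the
# GPU actually outputs. Assumes ~3% vertical blanking (CVT-like, illustrative).

def hsync_khz(v_active: int, refresh_hz: float, blanking: float = 0.03) -> float:
    """Horizontal scan rate = total lines per frame x frames per second."""
    v_total = v_active * (1 + blanking)
    return v_total * refresh_hz / 1000.0

# The GPU outputs 2304x1440 regardless of the game's internal resolution:
print(round(hsync_khz(1440, 60), 1))   # roughly 89 kHz
# A true 4K output signal would need a much faster scan rate:
print(round(hsync_khz(2160, 60), 1))   # roughly 133 kHz
```

So a 4K-internal, 1440p-output game never asks the monitor (or a DAC/converter) to scan any faster than the desktop already does.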
Nevermind, I just learned that that's what DSR does.

Came around full circle on this. I used to think upscaling meant supersampling, so this helped clear up a lot of confusion. Thanks.
Downsampling seems paradoxical to me... how does 3840x2160 sampled down to 2304x1440 look better than native? And is there a program I could use to run games with downsampling? Since I'm losing refresh rate with the bug-free converter, it would be nice to run a game at 2560x1440 downsampled to whatever I can run at at least 80 Hz.

Nevermind, for 3D rendering it makes sense that it'd look better.
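A toy example of why supersampling beats native rendering: each output pixel averages several rendered samples, so hard edges come out as intermediate shades (antialiasing) instead of stair-steps. Purely illustrative:

```python
# Toy illustration of why downsampling (supersampling) looks better than
# native: each output pixel averages several rendered samples.

def box_downsample(img, factor):
    """Average non-overlapping factor x factor blocks of a 2D grayscale image."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [img[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard diagonal edge rendered at 4x4 (1 = white, 0 = black):
hi_res = [
    [1, 1, 1, 1],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
]
# Rendered natively at 2x2 the edge would be a harsh stair-step; downsampled
# 2x, the edge pixels become intermediate grays:
print(box_downsample(hi_res, 2))  # [[0.75, 1.0], [0.0, 0.75]]
```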
Looks like setting a custom resolution in Nvidia's control panel, while keeping a lower resolution in the timing options, works well. 2560x1600 and 3840x2400 downsampled to 1920x1200 look great in game. In 2D content there is a difference, though, but not so much.

Do you know about DLDSR yet? It basically saves a bunch of GPU resources by making 2.25x look like 4x DSR. And for games that support DLSS, you can get even better performance by using DLSS for the internal resolution.
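The DSR/DLDSR factors are total-pixel multipliers, so the per-axis scale is the square root of the factor. A quick sketch of the arithmetic:

```python
# DSR/DLDSR factors multiply the total pixel count, so each axis scales by
# the square root of the factor. DLDSR 2.25x therefore renders 1.5x per axis.
import math

def dsr_render_res(width, height, factor):
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_render_res(2560, 1440, 2.25))  # (3840, 2160): 4K's pixel cost
print(dsr_render_res(2560, 1440, 4.0))   # (5120, 2880): 4x the pixels
```

That's the whole appeal: 2.25x renders 2.25x the pixels but (per Nvidia's claim) resembles 4x DSR, which renders nearly twice as many again.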
Did anyone encounter an issue where, after calibrating the monitor and setting up G2, the black level eventually turns grey instead? Even if you set G2 to pitch black, where you can barely see anything, at the end of the calibration everything goes up and is way too bright relative to the value I set. I also don't have the third step (green gun adjustment), but the model is different, so maybe that's fine.

When you set G2, the monitor might cycle and not show you the final consequence of that change. Try setting G2 even lower, then wait and see what happens after you get out of the menus.
I use:
1600 by 1024 at 100 Hz
1880 by 1200 at 85 Hz
No reason not to tweak the resolutions to accord with the screen's actual dimensions. Somewhat uniquely, these devices are true multi-scan. Why not live it up?

Well, my reason is that I don't want to change the resolution to something other than 16:10 or 16:9, because I want to use the native resolution of whatever video is playing on screen. Maybe I'm being nitpicky, nevermind.
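For what it's worth, the arithmetic behind the disagreement: the custom modes listed above sit near 1.56:1 (presumably matching the tube's visible-area proportions) rather than a standard aspect ratio.

```python
# Aspect ratios of the custom modes vs. a standard 16:10 mode for comparison.
for w, h in [(1600, 1024), (1880, 1200), (1920, 1200)]:
    print(f"{w}x{h}: {w / h:.4f}")
# 1600x1024: 1.5625
# 1880x1200: 1.5667
# 1920x1200: 1.6000  (standard 16:10)
```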
There is no such thing as "pitch black" on a CRT. You need to raise the black level to a certain value to get a proper gamma of 2.4.

That is, unless you want to play with ICC profiles, which I personally do not recommend, as there are too many issues with them. Realistically, all you gain is a visible difference in black level between a fully black screen and a screen with anything displayed on it. It is, imho, better to have a low-level glow and a more consistent black level and gamma, without having to deal with gamma sliders and especially ICC profiles. It is pretty much how CRTs were always calibrated: if you configured brightness/contrast by eye until the image looked correct (note: correct != best black level on a fully black screen), you would end up with a higher black level and a gamma close to 2.4.
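A worked example of what the gamma target means for shadows, assuming a pure power law and an illustrative 80-nit white point (both simplifications; real CRT tracking isn't a clean power law):

```python
# What the gamma target means numerically for shadows. A 2.4 power law
# renders dark signals noticeably deeper than 2.2, which is why black level
# and gamma have to be set together. 80-nit white is an illustrative value.

def luminance(signal, gamma, white_nits=80.0):
    """Output luminance for a normalized 0..1 video signal, pure power law."""
    return white_nits * (signal ** gamma)

for s in (0.1, 0.25, 0.5):
    print(f"signal {s}: {luminance(s, 2.2):.3f} nits at 2.2, "
          f"{luminance(s, 2.4):.3f} nits at 2.4")
```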
The Gamma Crush problem can be corrected with a 3DLut for the majority of PC games and movie playback. Raising the black level to compensate for wonky gamma tracking is really bad, because you're clipping the CRT's main super power: great dark-scene contrast. The only time you need to compromise is with game consoles, and if you're made of money you can fix that too, with an inline LUT box on digital consoles. NOT using a 3DLut with a CRT is being a filthy casual.

I mean… 2.4 gamma on a GDM monitor, assuming no issues, should be at least 5000:1 contrast. Can someone confirm? My Viewsonic (low to mid range) gets about 3500:1 with 2.3 gamma.
There is no _AT LEAST_ when it comes to CRTs; their contrast fluctuation is similar to that of projectors. VA LCD monitors have an ANSI-checkerboard contrast of 3000:1, and VA televisions have 5000:1 to 7000:1. IPS of all types tops out at around 1500:1. CRTs on an ANSI checkerboard go as low as 75:1, which is just trash; a CRT does not produce a very vivid-looking bright scene in general. CRT will only beat LCD in contrast on very dark and very sparse scenes. The color purity also drops off when the screen gets bright and busy because of light pollution: reds turn brown. It's also not typically possible to achieve balanced high-contrast performance on a CRT without 3DLut compensation. If you run a CRT's default gamma tracking, it typically follows sRGB, which gives you greyish blacks and pale-looking dark colors.

I wasn't talking ANSI contrast. We all know that CRT sucks in this realm due to the glass reflections. I was talking full on / full off. Sorry if I wasn't clear.
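The full on / full off vs. ANSI distinction is just a ratio of two luminances. The numbers below are illustrative only: the 0.023-nit floor is picked to land near the ~3500:1 on/off figure discussed in this thread, and the 1-nit checkerboard glow stands in for glass reflections.

```python
# Contrast ratio is white luminance divided by black luminance. The same
# display scores wildly differently depending on what's on screen.

def contrast(white_nits, black_nits):
    return white_nits / black_nits

# Full on / full off: tiny black floor, huge ratio (illustrative values):
print(round(contrast(80, 0.023)))  # 3478
# ANSI checkerboard: internal reflections lift blacks to ~1 nit (assumed):
print(round(contrast(80, 1.0)))    # 80
```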
I edited my post to remark on that. CRTs can go well below 0.003 nits by using the crush trick: turn contrast (sometimes brightness) all the way down. The dark colors will be crushed, but then you use the VCGT (gamma table) to uncrush them while preserving the deep blacks.

I see. So to reiterate: when I calibrate my Viewsonic monitor, I think I have it at 80 nits white. I lower the blacks to be inky, but I make sure I don't crush them. Usually I settle on a measured gamma of 2.3; I don't like 2.4 unless I'm in a blacked-out room, which in my case means my projector. I don't remember the exact black level, but I think it brought me to 3400:1 contrast or slightly above.
Most games will render at whatever resolution you select. They don't have a native res.

I know. Videos are my concern. I found the right stretch now, though, so it's all good.
For CRT, just turn contrast (or sometimes brightness) all the way down. Let it crush as long as it improves black level. The software can uncrush it; it's analog, so it's fine.
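A minimal sketch of the crush-then-uncrush idea as a VCGT-style 1D gamma table: the monitor control has been turned down so shadows land near true black, and the video card's gamma table boosts the signal sent for dark inputs to restore the target response. The 2.8 "measured after crushing" gamma and 2.4 target are made-up illustration values, not a real calibration.

```python
# Sketch of a per-channel 1D lookup table that pre-distorts the signal so a
# display measuring gamma 2.8 (crushed) ends up tracking a 2.4 target:
# send v = x ** (target/measured), then display output = v ** measured = x ** target.

def vcgt_curve(measured_gamma=2.8, target_gamma=2.4, entries=256):
    """Lookup table mapping input level -> corrected output level (0..255)."""
    table = []
    for i in range(entries):
        x = i / (entries - 1)
        corrected = x ** (target_gamma / measured_gamma)
        table.append(round(corrected * (entries - 1)))
    return table

lut = vcgt_curve()
# Dark inputs are lifted, black and white stay pinned:
print(lut[0], lut[16], lut[255])
```

The real thing (e.g. what calibration software loads into the GPU) does this per channel from actual measurements; this only shows the shape of the correction.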
My personal opinion is that above 5000:1 contrast, for most viewing environments, it's very much diminishing returns. For me it's pointless to chase those contrast levels or above unless I'm in a totally blacked-out room, because unless I lower the white level to compensate, it hurts my eyes.
I've always returned to CRT. In real life the ANSI seems fine, and the difference in dynamic range and the ability to resolve black is very noticeable between the two technologies to me. I do like LCD with FALD, though. I probably need to replace my old TV; it will be between OLED and FALD LCD, and I'm a bit concerned about the burn-in aspect of the former. I want a TV I can just leave on sometimes and not have to worry about station logos and such. With my monitors I've been pretty aggressive with a screen saver, but that's not what I want for a TV.

Plasma is a good alternative. Decent used TVs can still be had, and they have phosphors, so they look like CRT.

There's no perfect display device.