Search results

  1.

    Airflow case suggestions?

    I bought the R5 a few days ago and found it to be an acoustic nightmare. The only quiet fan config I could find is with the two 140mm fans blowing downwards into the floor. Any other fan config and it's hum city. Tried Noctua S12A FLX, actually makes it worse lol. Not to mention the HDD...
  2.

    nvidia or amd for color accuracy and calibration features

    Eventually got ReShade working with Source engine and 2 other games that weren't working before. Just needed a bit of fudging with paths and an older version of the ReShade .dll. So it looks like I may be keeping this 1070 after all...
  3.

    nvidia or amd for color accuracy and calibration features

    The digital output is locked at 8-bit, all of our consumer grade LED monitors are 8-bit only, that's all they can accept and display. Only a high end 10-bit capable monitor would benefit from the new 10-bit support being referred to in that article. In that case the video card would be...
  4.

    nvidia or amd for color accuracy and calibration features

    I don't know, but I'd imagine it would be spatial. The shader route isn't panning out for me though, too many games are not compatible with ReShade. All the Source engine games for example are not compatible with it, so that pretty much rules it out for me.
  5.

    nvidia or amd for color accuracy and calibration features

    The 3D LUT is implemented in a shader which I believe uses high precision tone mapping with dithering. If it wasn't, I think there would be so much banding in games where shaders are doing extreme tone mapping effects like dynamic HDR and bloom, you would get severe banding on those tones if...
  6.

    nvidia or amd for color accuracy and calibration features

    Well I got a 1070 and it's still 8-bit. Even worse, none of my LUT enforcing tricks work on it. Full screen 3d games always steal the LUT, and none of dispwin.exe, Powerstrip, Color Clutch, Color Sustainer or Monitor Calibration Wizard can override it. On my previous AMD card (R9 270)...
  7.

    nvidia or amd for color accuracy and calibration features

    Thanks, but that's a different kind of 10-bit support than the one we're talking about here (still an 8-bit signal, but with dithering by the GPU to produce 10-bit effective precision). It's something that's been lacking from NVidias and is the main thing holding me back. spacediver's post...
  8.

    nvidia or amd for color accuracy and calibration features

    Does anyone know if the new Pascal GPUs from Nvidia support the 10 bit dithering? I'm guessing the answer is "no" but just wanted to make sure before I choose 480 over 1060 :D Thanks in advance.
  9.

    Is G-Sync worth it?

    Blur Busters UFO Motion Tests
  10.

    My solution to tearing was to buy a powerful rig.

    Thank you for saying this, and it drives me mad that people think vsync off has smoother frame rate pacing than vsync on (if it does, something is really screwed up in the game engine or driver...try playing with frame queue size and triple buffering to fix it). The only reason to choose...
  11.

    It's happening, affordable 4k OLED TV's are finally becoming a reality

    Most mid-high tier TV's do have 3D LUTs (independent hue/saturation/luminance controls for RGBCMY). The problem is they are not always implemented properly, and in LG's case really don't work properly at all, producing very bad colour artefacts. A lot of the reviews advise to leave these...
  12.

    How to identify input lag?

    displaylag.com/display-database
    tftcentral.co.uk/reviews.htm
  13.

    My solution to tearing was to buy a powerful rig.

    It's possible to get a type of weird vsyncless mode where there is almost no tearing. I can achieve this with radeon pro in some games when radeon pro is limiting my framerate to the refresh rate (in my case 60hz) and vsync is turned off. Something about the way that radeon pro is able to...
  14.

    nvidia or amd for color accuracy and calibration features

    I'm trying to do this in Photoshop but can't get it to work. I've set the bit depth to 32-bits per channel, and the colour picker is still limited to only 0-255, and creating a gradient only makes 0-255. Are there some other options I need to set up as well?
  15.

    nvidia or amd for color accuracy and calibration features

    Ah that explains it :) You are getting a true 1024 step 10-bit signal being displayed on your CRT, whereas I am getting no more than 8-bit 256 steps on my LCD.
  16.

    nvidia or amd for color accuracy and calibration features

    I don't understand why reducing the precision of each entry from 65536 to 1024 would result in more flexibility. But without dithering, every shade will still have to snap back to the nearest 256th shade when the final 8-bit (256 steps) output is being derived from it, which will still cause...
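The snapping described in this post can be illustrated with a short sketch (the gamma value here is a made-up example, not from the thread): applying even a mild gamma tweak to an 8-bit ramp and rounding back to 8 bits collapses some input shades onto the same output value, which is the banding being discussed.

```python
# Illustrative sketch: a mild gamma adjustment applied to a 256-step
# ramp, then rounded back to 8-bit. Some distinct inputs land on the
# same output value, so the ramp ends up with fewer than 256 shades.

adjusted = [round((i / 255) ** 1.1 * 255) for i in range(256)]  # hypothetical gamma tweak
distinct = len(set(adjusted))
print(distinct)  # fewer than 256 -> some shades have collapsed together
```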
  17.

    nvidia or amd for color accuracy and calibration features

    Well, it's just not logically possible to change a value in the table without reducing the total number of shades, which causes banding. Dithering can solve this issue by creating more intermediate shades. Here is a ramp of only 4 shades. Now here is also 4 shades, but dithered. Dithering is...
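The dithering idea in this post can be sketched in code. This is an illustrative 1-D error-diffusion dither, not any GPU's actual algorithm: with only 4 shades available, alternating between two of them reproduces an intermediate shade on average.

```python
# Sketch only: quantize each target to the nearest of 4 allowed shades,
# diffusing the rounding error into the next pixel. The output mixes
# two shades so that the local average matches the in-between target.

def dither_row(targets, levels):
    out, err = [], 0.0
    for t in targets:
        want = t + err                               # carry the leftover error
        q = min(levels, key=lambda l: abs(l - want)) # snap to nearest shade
        err = want - q                               # remember what we lost
        out.append(q)
    return out

levels = [0, 85, 170, 255]               # only 4 shades available
row = dither_row([128] * 1000, levels)   # target 128 sits between 85 and 170
avg = sum(row) / len(row)
print(sorted(set(row)), round(avg, 1))   # mixes 85 and 170; average ~ 128
```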
  18.

    nvidia or amd for color accuracy and calibration features

    Yep, makes sense :) I'm guessing AMD is sampling a value for each chunk of 63 to create its 10-bit table, and then dither that spatially to create the final 8-bit output.
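The pipeline being guessed at here (16-bit LUT entry, reduced to a 10-bit table, spatially dithered down to 8-bit output) can be shown as a toy sketch. The shift amounts and the 2x2 Bayer threshold pattern below are assumptions for illustration, not AMD's documented behaviour.

```python
# Hypothetical sketch of the speculated pipeline: drop a 16-bit LUT
# entry to 10 bits, then use a 2x2 ordered (Bayer) dither so the
# 8-bit output's spatial average keeps the extra 2 bits of precision.

def to_10bit(value16):
    return value16 >> 6                        # 65536 levels -> 1024 levels

def dither_to_8bit(value10, x, y):
    base, frac = value10 >> 2, value10 & 0b11  # 8-bit base + 2-bit remainder
    threshold = [[0, 2], [3, 1]][y % 2][x % 2] # 2x2 Bayer threshold pattern
    return min(base + (1 if frac > threshold else 0), 255)

# A 10-bit value of 513 sits a quarter step above 8-bit 128:
vals = [dither_to_8bit(513, x, y) for y in range(2) for x in range(2)]
print(vals, sum(vals) / 4)  # one pixel in four bumps to 129; average 128.25
```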
  19.

    nvidia or amd for color accuracy and calibration features

    So I did the dispcal -R test, and it says my effective bit depth is 10-bit :)
  20.

    nvidia or amd for color accuracy and calibration features

    I notice at the row for digital value 1, its output for green is 191 and blue is 193, so it seems to be specifying a luminance correction to within only 2 points out of 65 thousand! Whether dithering at 8-bit is sufficient to create such a small step in luminance is something I simply...
  21.

    nvidia or amd for color accuracy and calibration features

    This is my lut according to Xrite It goes up to 65535 so it would be a 16-bit lut from which to create the final 8-bit output that goes to the monitor. How the GPU handles the conversion is a mystery to me. Maybe it downsizes it first to a 10-bit table, then reduces that to 8-bit by using...
  22.

    nvidia or amd for color accuracy and calibration features

    How sure are you? Some people on doom9 are saying if an application such as MPC is set to 10-bit output that nvidia does the same thing as AMD, and on evga forums one guy got an email from a German tech support manager saying "the GTX 970 do also work with 10 BIT LUT". Maybe it depends on...
  23.

    nvidia or amd for color accuracy and calibration features

    I'm using an AMD R9 270 and gamma calibration with LUT = no banding, none at all. Whatever 16-bit LUT + dither that AMD uses, it's magic, and I am not sure I could live without it. There are no negative artefacts introduced, so I don't know what Frode is referring to, but you get a perfectly...
  24.

    Official Acer [XB270HU] 27" 1440p 144Hz G-Sync IPS ULMB Monitor Thread

    G-Sync should be alleviating that massively, as you now only have to render ~45fps to have a really smooth frame rate. Unless you want to render 100fps for the increase in motion clarity. This monitor is so freaking expensive. Right now you can get an LG 55LF6300 IPS TV for $1250 AUD, and this...
  25.

    Philips BDM4065UC - 40" 4K 60Hz monitor thread

    Nothing I said contradicts anything in the tftcentral review. When they say the backlight operates at a fixed 240hz, that doesn't mean it is flashing on and off at 240hz at all brightness settings. Otherwise you wouldn't have any change in brightness. What it means is that the backlight...
  26.

    Philips BDM4065UC - 40" 4K 60Hz monitor thread

    If the backlight is fixed at 240hz strobing, then the brightness is the same at all levels. At 100% backlight, it doesn't strobe at all. It's only on lower backlight settings that it has to strobe at lower frequencies to produce lower duty cycles. afaik the 240hz figure means that its...
  27.

    Philips BDM4065UC - 40" 4K 60Hz monitor thread

    The problem with the Philips seems to be that the frequency of the backlight is too slow (240hz) to be properly synchronised with 60hz at all brightness levels. According to the tftcentral review, when the brightness is set to 120cd/m2 (34% backlight), the backlight will be flickering on and...
  28.

    Philips BDM4065UC - 40" 4K 60Hz monitor thread

    Are you kidding me, 20ms is freaking excellent. 1 frame usually needs to be put in the frame buffer before it's sent to the monitor, which takes 16.7ms, so anything near that is great especially considering this thing has video processing, in particular RGB controls which is a much needed...
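The arithmetic behind the 16.7ms figure in this post, for reference:

```python
# At 60 Hz one frame occupies 1000 / 60 ~ 16.7 ms, so a measured 20 ms
# of lag is only a few ms beyond the one buffered frame that has to be
# scanned out to the monitor anyway.

refresh_hz = 60
frame_time_ms = 1000 / refresh_hz
measured_lag_ms = 20
extra_ms = measured_lag_ms - frame_time_ms
print(f"{frame_time_ms:.1f} ms per frame, {extra_ms:.1f} ms beyond one frame")
# -> 16.7 ms per frame, 3.3 ms beyond one frame
```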
  29.

    Philips BDM4065UC - 40" 4K 60Hz monitor thread

    The flickering I believe is due to the response time of the pixels being too slow. The horizontal lines extending past the window (referred to as "line bleeding" on plasmas) I believe is due to voltage instability at the row/column drivers.
  30.

    Philips BDM4065UC - 40" 4K 60Hz monitor thread

    Subtract the two numbers to get how far behind in time the Philips is. 546 − 464 = 82ms; 183 − 100 = 83ms; 084 − 015 = 69ms; 199 − 099 = 100ms. Last one not readable. Thank you for doing this test.
  31.

    4K 60Hz 4:4:4 HDMI 2.0 TV Database

    http://i3.minus.com/ibyJcwdIniHUEs.png
    http://cdn.avsforum.com/b/b4/b4a44044_vbattach208609.png