Gigabyte AORUS FO48U 48" 4K 120Hz OLED

I did; you re-read it. Then read the little (a) after "data rate required", and the little a. footnote at the bottom.
Yeah... that's 8-bit 4:4:4 color depth. We are talking about 4K 120Hz, 10-bit, 4:4:4, with HDR, so the HDMI 2.1 device needs to support at least 40 Gbit/s.
Please watch and learn.
 

Great. 4K 120Hz, 10-bit, 4:4:4 is still 32 Gbps. The PS5 has to support more than that for audio channels and command and control.

Back to this monitor, which doesn't support even 32 Gbps, doesn't properly support the 25 Gbps it's supposed to, and looks terrible in HDR mode anyway.
 
Great. 4K 120Hz, 10-bit, 4:4:4 is still 32 Gbps. The PS5 has to support more than that for audio channels and command and control.

Lol. Where the hell do you get this from? Btw, lossless audio does not impact the video lanes' bandwidth.
 


[Attached image: hdmi.PNG]

You're right, this includes the blanking intervals, at 32 Gbps.
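For anyone following along, here's a rough sanity check of both figures. It's only a sketch: the 3920 x 2222 totals are my assumption of CVT-RB2-style reduced blanking, and the 16b/18b FRL overhead is my reading of how HDMI 2.1 signaling works, so take the exact decimals with a grain of salt.

```python
# Rough back-of-the-envelope check of the numbers above. Assumes CVT-RB2-style
# reduced blanking totals for 3840x2160@120; exact figures vary slightly with
# the timing standard, so treat this as an estimate.

h_total, v_total = 3920, 2222      # active 3840x2160 plus reduced blanking
refresh_hz = 120
bits_per_pixel = 30                # 10 bits per channel, RGB / 4:4:4

pixel_clock_hz = h_total * v_total * refresh_hz           # ~1.045 GHz
data_rate_gbps = pixel_clock_hz * bits_per_pixel / 1e9    # ~31.4 Gbps

# HDMI 2.1 FRL adds 16b/18b encoding overhead on top of the video data, which
# is why a ~31 Gbps signal doesn't fit on the 32 Gbps FRL tier and the source
# ends up needing the 40 Gbps tier for 4K120 10-bit 4:4:4.
frl_needed_gbps = data_rate_gbps * 18 / 16                # ~35.3 Gbps

print(f"Video data rate incl. blanking: {data_rate_gbps:.1f} Gbps")
print(f"Minimum FRL link rate needed:   {frl_needed_gbps:.1f} Gbps -> 40 Gbps tier")
```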
 
Newegg was kind enough to send me another new monitor. Newegg's customer service has been surprisingly responsive and quick to process the RMAs. Both times the replacement arrived just 2 days after I sent the old one back. Fingers crossed this one doesn't fail on me again.

I had some issues trying to install the F05 beta firmware update using the OSD Sidekick software. It kept failing with a download-failure error message after 3-5 minutes. I was able to get it to work by plugging the USB cable directly into my PC rather than through my powered USB 3.0 hub. Apparently using a USB extension also causes it to fail. The provided USB cable is too short to reach my desktop since the ports are on the other side, so I literally had to move my PC onto the floor next to the monitor just to get the USB cable to reach for the firmware flashing to work.

But even after all that hassle, I'm disappointed the F05 beta firmware didn't address any of the annoying issues I've been having:
  • The auto brightness limiter still always kicks in regardless of the brightness setting on the OSD.
  • The auto brightness limiter still has weird partial screen dimming artifacts and flicker on cuts when watching videos (see my previous example earlier in the thread).
  • The auto input switching loop issue still happens (only between my work PC and home PC).
  • With auto input switching disabled, it is still frustratingly difficult to switch inputs: the display turns off within seconds if the current input is not active, and the input-switch OSD takes several seconds to appear, giving you only a split second to change to the correct input before it just shuts off again.
  • Resolution/display mode changes still blank out the screen for ages (7-10+ seconds). Just tabbing out of a full-screen game, which toggles G-Sync, will trigger this. God forbid I accidentally press the Windows key while playing a competitive online game.
 
I finally got one of these; I say "finally" because it took about 4 weeks for the first one to get delivered, turn out to be damaged, be shipped back to Newegg, be approved for re-shipment by Newegg, and then get re-delivered.

I'm pretty happy with it so far; I'm just having one issue where the subpixels are fucking with me, and I'm not sure if there's a way to fix it. When there is a yellow box around text (like when searching a webpage), there is a clearly defined line on the left and right of where the box should be. This Reddit post documents it pretty well:

https://www.reddit.com/r/OLED/comments/gfjeeh/is_this_normal_lg_c9_used_as_pc_monitor_red_and/

Is this just something I have to live with on this panel? I've tried adjusting every setting I can think of and nothing seems to be working.
 
  • The auto brightness limiter still always kicks in regardless of the brightness setting on the OSD.
You have to leave brightness at 100, but drop your contrast. This works to avoid ABL, and I don't know why.
 
I finally got one of these; I say "finally" because it took about 4 weeks for the first one to get delivered, turn out to be damaged, be shipped back to Newegg, be approved for re-shipment by Newegg, and then get re-delivered.

I'm pretty happy with it so far; I'm just having one issue where the subpixels are fucking with me, and I'm not sure if there's a way to fix it. When there is a yellow box around text (like when searching a webpage), there is a clearly defined line on the left and right of where the box should be. This Reddit post documents it pretty well:

https://www.reddit.com/r/OLED/comments/gfjeeh/is_this_normal_lg_c9_used_as_pc_monitor_red_and/

Is this just something I have to live with on this panel? I've tried adjusting every setting I can think of and nothing seems to be working.
I've begun to expect a line of red on the left and cyan on the right with yellow objects in real life.
 
You have to leave brightness at 100, but drop your contrast. This works to avoid ABL, and I don't know why.
That definitely helps reduce it, but I'm still seeing it occasionally with contrast set all the way down to 0. The only thing I've noticed that eliminates it completely is to turn both the brightness and contrast down by a ridiculous amount in the NVIDIA control panel to limit the peak brightness output to under 100 nits.
 
This has also been my experience. I've sent mine back to Newegg for a refund because turning down the contrast so low ruins the image quality and still doesn't 100% stop the flickering.
Side note: if you use the monitor to emulate a 32" 1440p panel, it seems to get rid of the flickering, the image is much brighter, and the picture looks great. I just don't want to do that because I paid for a 48" 4K monitor, not a 32" monitor with enormous bezels.
 
I don't get flickering or ABL activation at all with my contrast set way down, and image quality still holds up. You must be doing something wrong.
 
Are you using HDR or SDR? I'm definitely seeing it in SDR mode. Windows in HDR mode will dim anything in the SDR luminance range, which can also avoid the issue, the same as lowering the brightness output from the NVIDIA control panel.

As for image quality, I'm noticing colors are a tinge more washed out and grey-shifted with the contrast set to the lowest setting, in addition to black level crush. Using this test page, http://www.lagom.nl/lcd-test/black.php, levels 1-5 are wiped out completely for any contrast level below 25%.
 
Normally I keep HDR enabled at all times in Windows so that I'm not switching back and forth when I want to play an HDR game. I set the HDR/SDR brightness balance at 10. ABL kicks in just the slightest bit, and I don't notice it in most cases.
In native SDR, a contrast of 10 with brightness at 100 pretty much never activates ABL, at least on my particular unit. Panel variance could come into play, though I might be wrong.
As for colors washing out, I don't notice much of a difference. You could try playing with color vibrance in the OSD to bring it up to your liking.
On that black level test image, what gamma is yours set to? I stick with 2.2, and while boxes 1 and 2 are pretty dark, you're not supposed to be able to see them that easily, even in a dark room. Windows HDR applies an incorrect SDR gamma curve that raises near blacks too much.
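For anyone wondering why the HDR desktop lifts near blacks, here's a tiny sketch of the mismatch as I understand it: Windows appears to map SDR content with the piecewise sRGB curve instead of the pure 2.2 power curve most SDR content is mastered for. The exact processing Windows does is an assumption on my part; the numbers are just illustrative.

```python
# Compare the piecewise sRGB transfer function (linear segment near black)
# against a pure 2.2 power curve for a few near-black 8-bit codes. The sRGB
# values come out far higher near black, which is the "raised blacks" look.

def srgb_to_linear(v: float) -> float:      # piecewise sRGB transfer function
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v: float) -> float:   # pure power-law 2.2
    return v ** 2.2

for code in (2, 5, 10, 20):
    v = code / 255
    print(f"code {code:>2}: sRGB {srgb_to_linear(v):.6f} vs gamma 2.2 {gamma22_to_linear(v):.6f}")
```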

I would love to get a colorimeter at some point so I can try some very basic display calibration.
 
Does the FO48U have separate settings for brightness and OLED light like the LG TVs do? OLED light is basically the real brightness control, while brightness is something you never touch, as it just digitally adjusts the image, usually making it worse. OLED light would be something you only use for SDR content, though; leave it at 100 for HDR.

I have the LG CX 48", and on that my tactic was initially to run in HDR mode and adjust the Windows SDR slider to make it visually close to how I would run it in SDR mode. I set it at a very low 7% setting, and that measured as 120 nits on my screen using a Spyder 5 Pro I borrowed from work. I did go back to using SDR, though, because in HDR mode there is an effect on color accuracy. Not a huge one, but some, mostly noticeable in blue/violet hues.

On the LG TVs what works for avoiding ABL is reducing contrast to I think 80 or 79 or something like that. It was never a huge issue for me so I just left it at 85.
 
It has separate settings for brightness and contrast. But the way they work is something I still don't fully understand; it's not straightforward like on an LG OLED TV.
Digital black level is controlled with black equalizer. 10 is neutral and needed for true black.
Brightness seems to be the OLED light. But it needs fixing in a firmware update because ABL ignores what you set brightness to. So even if it's at 50 or lower, ABL will dim the screen on a full white field.
Contrast does indeed control contrast. 50 is the neutral default and any higher runs the risk of clipping the image. Lowering it doesn't seem to have a negative impact on being able to distinguish white levels in test patterns.
 
Through some testing of picture quality when lowering the "contrast" setting, it does indeed appear to be doing digital image processing, which includes black crushing, whereas the brightness setting seems to just dim the OLED pixels via circuitry, similar to the OLED Light setting on the LG CX. The brightness setting actually affects the OSD, whereas the contrast setting does not, which is the more obvious giveaway.

I've noticed the brightness setting does not crush the blacks at very low brightness levels but actually preserves the near-black colors; as a result, the signature LG "near-black quantization" artifacts become noticeably more visible. The contrast setting's black crush makes the near-black quantization less noticeable, but you lose some of the image detail in the process.
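A toy sketch of what I think is happening under the hood (purely my own mental model, not anything Gigabyte has confirmed):

```python
# Toy model of why the two controls behave differently: a digital "contrast"
# scale re-quantizes back to 8-bit and can collapse adjacent near-black steps,
# while dimming the pixel drive leaves the steps distinct, just darker.

near_black_steps = [1, 2, 3, 4, 5]           # lagom.nl-style 8-bit levels

def digital_contrast(level: int, scale: float = 0.10) -> int:
    return round(level * scale)              # re-quantized back to integers

def pixel_dimming(level: int, scale: float = 0.10) -> float:
    return level * scale                     # applied at the pixel drive

print([digital_contrast(lvl) for lvl in near_black_steps])  # [0, 0, 0, 0, 0] -> crushed
print([pixel_dimming(lvl) for lvl in near_black_steps])     # still distinct, just dimmer
```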

Another thing I tested was lowering all three R/G/B color temperature values. This has a similar effect to lowering the contrast, in that it both causes black crush and reduces ABL by reducing the peak nit output. However, I'm noticing ABL kicks in a lot less, even at higher nit values, than it did when I lowered the contrast. I'll need to do some more testing and measuring to find an optimal configuration, but before, I had to lower the contrast to about 10% to avoid ABL, whereas lowering the color temp by 50% has the same result while keeping peak output around 200 nits, versus the roughly 100 nits I was getting with the contrast lowered to 10%.
 
I had a feeling brightness was directly related to OLED light / pixel brightness. Gigabyte needs to rework how ABL comes into play depending on how you set it. It also seems like there's a bug where, if you set your SDR brightness lower, it affects brightness in HDR until you bring up the OSD, at which point full HDR brightness is restored.
 
I'm pretty happy with it so far; I'm just having one issue where the subpixels are fucking with me, and I'm not sure if there's a way to fix it. When there is a yellow box around text (like when searching a webpage), there is a clearly defined line on the left and right of where the box should be. This Reddit post documents it pretty well:

https://www.reddit.com/r/OLED/comments/gfjeeh/is_this_normal_lg_c9_used_as_pc_monitor_red_and/
I've begun to expect a line of red on the left and cyan on the right with yellow objects in real life.

If it's with text rather than raster images, then you may simply need to change the ClearType setting, such as setting it to greyscale or just disabling it outright. I find the program "BetterClearTypeTuner" to be the easiest way to do this.
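If you're curious what that actually flips under the hood, here's a hedged sketch of the per-user registry values involved; this is my assumption based on the standard Windows font-smoothing settings, so use the tool itself if in doubt:

```python
# Hypothetical sketch of the registry values a ClearType tuner toggles,
# assuming the standard per-user font-smoothing keys. Windows only, and the
# change may not take effect until you sign out and back in.
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                    0, winreg.KEY_SET_VALUE) as key:
    # "2" = font smoothing enabled; FontSmoothingType 1 = greyscale, 2 = ClearType (subpixel)
    winreg.SetValueEx(key, "FontSmoothing", 0, winreg.REG_SZ, "2")
    winreg.SetValueEx(key, "FontSmoothingType", 0, winreg.REG_DWORD, 1)
```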
 
One change I noticed with this latest firmware is that the VRR range seems to be 24-120Hz now (instead of 40-120Hz) per the EDID. However, the monitor doesn't seem to like that very much; I get some out-of-range errors if the framerate drops too low during a cutscene or the like. Bumping the minimum refresh rate back up to 40 seems to fix it (and since that range is wide enough for LFC to work, I'm not sure why Gigabyte changed it).

Edit: And I actually just noticed my video card isn't listing 120Hz as a valid mode out of the box anymore. But forcing it to 120Hz by creating a custom resolution still works. And now it allows 12bpc. Something is screwy with this update.

Actually, the missing 120Hz option seems to be some combination of the latest NVIDIA drivers and having my VR headset plugged in. But the EDID change does seem to be a thing.
 
I'll check what Radeon Software reports for my VRR range when I get home. 24Hz is now a selectable refresh rate with the latest firmware.

EDIT: Mine still reports 40 to 120Hz. I don't want to say it's a bug with NVIDIA per se, but so far I haven't seen any out-of-range errors myself. I don't have an NVIDIA card to test with.
 
!!!!! Wow!

What about at 80Hz or 90Hz? And does it look correct with the TestUFO.com test (no ghosting)?

UFOs in motion should look like the bottom one in this pic: https://blurbusters.com/wp-content/uploads/2017/12/strobed-display-image-duplicates.png
Both work.

As far as ghosting goes, I can't say there's zero ghosting, but it seems to be less. Capturing it with my iPhone set to 240 fps, there's certainly a flicker that's not there with Aim Stabilizer turned off.

It looks like your "framerate at 1/2 of Hz" image, except the ghosted part doesn't look as blurred? It seems almost like the monitor is internally frame doubling and then doing the BFI.
 
So I guess it looks different than it does with Aim Stabilizer at 100Hz? That's how you'd really be able to tell whether it's working correctly.

If it were working correctly, 80Hz and 100Hz would look VERY similar.

I remember another guy testing BFI on an LG CX had a similar issue. It turned on, it engaged, but it didn't look right compared to 60Hz, 100Hz, and 120Hz (the three rates the CX can do BFI at correctly).
 
I didn't actually test 100Hz, but it does not look like it does at 120Hz. I wish I could capture it in a photograph, but it looks perfect there.
 
Yeah, you have to use what they call a "pursuit camera". You put the camera on a rail and move it with the UFO, to simulate how it would look with your eyes tracking the UFO. Doing it by hand is kinda difficult.

The issue seems to be that the pixels aren't being pulsed at the same rate as the refresh coming from the GPU.
 
I'll check what Radeon Software reports for my VRR range when I get home. 24Hz is now a selectable refresh rate with the latest firmware.

EDIT: Mine still reports 40 to 120Hz. I don't want to say it's a bug with NVIDIA per se, but so far I haven't seen any out-of-range errors myself. I don't have an NVIDIA card to test with.
So I think that might be a difference in the data field AMD and NVIDIA use to determine the VRR range. In the CTA-861 extension block, there's a separate "FreeSync" data field that defines the range as 40-120Hz, but the actual range limits in the EDID are now set to 24-120Hz instead of 40-120 like before.

Not too big of an issue on a PC since I can just use CRU to edit the VRR range limits (and in fact I find that 60-120 with LFC doing its thing looks better anyway), but I do wonder if there will be issues with consoles. That said, I haven't tested the HDMI ports or even the USB-C port since updating to the F05 beta.
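If anyone wants to check what their unit is advertising without firing up CRU, here's a rough sketch that reads the base-block range-limits descriptor from an EDID dump. The file path is a placeholder and the parsing is simplified (it ignores the offset-flags byte), so treat it as illustrative; the FreeSync range lives in a separate AMD vendor block in the CTA-861 extension, which is presumably why the two can disagree.

```python
# Quick-and-dirty peek at the vertical refresh range a display advertises in
# its base EDID (the 0xFD "Display Range Limits" descriptor).

def range_limits(edid: bytes):
    for off in (54, 72, 90, 108):            # the four 18-byte descriptors in block 0
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]                # min / max vertical rate in Hz
    return None

# e.g. a binary exported from CRU, or the EDID exposed under /sys/class/drm on Linux
with open("edid.bin", "rb") as f:
    limits = range_limits(f.read())

print(f"Advertised vertical range: {limits[0]}-{limits[1]} Hz" if limits else
      "No range-limits descriptor in block 0")
```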
 
Do we know this panel's peak brightness? Windows reads it as 446 nits.

 
Not too big of an issue on a PC since I can just use CRU to edit the VRR range limits (and in fact I find that 60-120 with LFC doing its thing looks better anyway),

I don't have an adaptive sync display, so I've always wondered about this. Is the transition into LFC really that noticeable? I would think it wouldn't be, since we're talking about sample-and-hold displays.
 
Depends on the monitor. I find some monitors, including this one, tend to get a bit of a flicker as the refresh rate drops below 50. Apparently the gamma can shift a bit at different refresh rates. If I limit the VRR range to 60-120, that's still a wide enough range to allow LFC without there being a noticeable shift.
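For anyone who hasn't used adaptive sync, here's a toy sketch of why a 60-120 window is still wide enough: LFC just repeats frames to multiply low frame rates back into the range. The real driver heuristics are more involved, so this is only the basic idea:

```python
# Toy illustration of LFC (low framerate compensation): when the game's frame
# rate falls below the VRR minimum, the driver repeats each frame an integer
# number of times so the panel stays inside its refresh window.

def lfc_refresh(fps: float, vrr_min: float = 60.0, vrr_max: float = 120.0) -> float:
    multiple = 1
    while fps * multiple < vrr_min:
        multiple += 1
    return min(fps * multiple, vrr_max)

for fps in (30, 45, 55, 90):
    print(f"{fps} fps -> panel refreshes at {lfc_refresh(fps):g} Hz")
```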
 
Is that Windows 11? The VESA Display HDR app from the Windows Store also reports roughly the same peak brightness.
It is, yeah. I didn't even realize Windows captured that data until I started playing Deathloop (which automatically sets the peak brightness to what is detected). I've tested a couple of games (Mass Effect Legendary Edition comes to mind) that have the "get the box to disappear/match" HDR calibration screen, where the display seemed to hard clip somewhere around 500-550 nits. Not sure if anyone else here has seen or done any other tests in this regard; hoping we get that RTINGS review soon.
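Side note on where that suspiciously precise 446 probably comes from: my guess is it's decoded straight from the EDID's HDR static metadata block. The formula is from my reading of CTA-861.3, and the code value of 101 is an inference rather than something I've pulled from this panel's EDID:

```python
# The max luminance in the HDR static metadata block is stored as an 8-bit
# code value CV and decoded as 50 * 2^(CV/32) cd/m^2 (per CTA-861.3, if I've
# got it right). CV = 101 lands almost exactly on what Windows reports.

def cta_max_luminance(code_value: int) -> float:
    return 50 * 2 ** (code_value / 32)

print(f"{cta_max_luminance(101):.0f} nits")   # -> 446 nits
```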
 
Just FYI this monitor is now "Testing Started" at RTINGS.com so we should have some solid results in a day or two. I am also an "early access" subscriber so I'll know all the details about a week before the public. I won't simply copy & paste their early access review since I respect their work, but I'll give everyone some very important highlights, information, and direct comparisons to the 48" C1 until the full review becomes public.
 
The F05 beta firmware breaks G-Sync for me. G-Sync worked perfectly before upgrading to the F05 beta, but after upgrading, the monitor goes black whenever adaptive sync kicks in. Resetting the monitor back to the F02 firmware fixed G-Sync, but now I have all those other issues the new firmware was supposed to fix :/

Also, not being able to turn off ASBL (the auto dimming of the monitor when not enough is happening on screen) is almost enough for me to return this thing. It's dimming right now while I type this. I'm still here! I'm typing! This is just inexcusable for a computer monitor. My laptop's OLED screen doesn't do this. Gigabyte seriously needs to figure out their firmware issues, and fast.
 
A workaround for the G-Sync issue is to go into CRU and raise the lower limit of the VRR range to at least 40, which is what previous firmware versions had it set to. Hopefully Gigabyte can fix it in the non-beta firmware (either by raising it back to 40 or by changing whatever else is needed in the firmware to allow the range to go down to 24).
 
Windows 11 Dev Channel Build Updater auto-updated my NVIDIA driver to 510.10, which disabled RGB 4:4:4 (it selected 4:2:2 by default). I had to roll back to 471.96 to get RGB 4:4:4 back.
 
A workaround for the G-Sync issue is to go into CRU and raise the lower limit of the VRR range to at least 40, which is what previous firmware versions had it set to. Hopefully Gigabyte can fix it in the non-beta firmware (either by raising it back to 40 or by changing whatever else is needed in the firmware to allow the range to go down to 24).
Thanks, Terra. Hopefully they also give us the option to turn off ASBL in the next firmware update. That has no place on a computer monitor. Old CRT monitors had the same potential burn-in problem as OLEDs; that's why screen savers and power settings were invented, and they still exist. We don't need this nanny built into the firmware; the computer itself can protect the monitor.
 
Just FYI this monitor is now "Testing Started" at RTINGS.com so we should have some solid results in a day or two. I am also an "early access" subscriber so I'll know all the details about a week before the public. I won't simply copy & paste their early access review since I respect their work, but I'll give everyone some very important highlights, information, and direct comparisons to the 48" C1 until the full review becomes public.
Any info you're allowed to give out would be super appreciated.
 