NVIDIA Announces G-SYNC ULMB 2: Over 1000Hz Of Effective Motion Clarity

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,875
1000 Hz of emulated & interpolated “motion clarity” with whiz-bang voodoo AI similar to the underpinnings of DLSS2.5/3?

“With ULMB 2, the backlight is only turned on when each pixel is at its correct color value. The idea is to not show the pixels transitioning, and only show them when their color is accurate.

But this technique creates a challenge: backlights generally light up all pixels at the same time, whereas pixels are updated on a rolling scanout. At any given point in time, a portion of the screen will have double images (also known as crosstalk).

The solution to this problem is what sets G-SYNC's ULMB 2 apart from other backlight strobing techniques: with G-SYNC, we're able to control the response time depending on where the vertical scan is, such that the pixels throughout the panel are at the right level at precisely the right time for the backlight to be flashed. We call this "Vertical Dependent Overdrive".

With Vertical Dependent Overdrive, ULMB 2 delivers great image quality even at high refresh rates where the optimal window for backlight strobing is small.

ULMB 2 Is Available Now
For ULMB 2 capability, monitors must meet the following requirements:”

[Attached image: table of ULMB 2 monitor requirements]


Source: https://www.techpowerup.com/309298/...lmb-2-over-1000hz-of-effective-motion-clarity
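
For anyone curious what "Vertical Dependent Overdrive" could mean in practice, here's a rough back-of-napkin sketch of the idea only (not NVIDIA's actual algorithm; the refresh rate, scanout timing, strobe timing and gain model are all assumptions): rows near the bottom of the panel are written later in the rolling scanout, so they have less time to settle before the global backlight flash and need to be pushed harder.

```python
# Rough sketch of the "Vertical Dependent Overdrive" idea described above.
# NOT NVIDIA's actual algorithm -- refresh rate, scanout timing, strobe timing
# and the gain model are all illustrative assumptions.

REFRESH_HZ = 360
FRAME_TIME_MS = 1000.0 / REFRESH_HZ           # ~2.78 ms per refresh
SCANOUT_TIME_MS = FRAME_TIME_MS * 0.9         # assumed rolling-scanout duration
STROBE_TIME_MS = FRAME_TIME_MS * 0.95         # assumed backlight flash near end of refresh
PANEL_ROWS = 1440

def row_scanout_time_ms(row: int) -> float:
    """Time at which a given row receives its new pixel values during the rolling scanout."""
    return SCANOUT_TIME_MS * (row / (PANEL_ROWS - 1))

def overdrive_gain(row: int, min_gain: float = 1.0, max_gain: float = 2.0) -> float:
    """Overdrive strength for a row (assumed linear model).

    Rows written later in the scanout have less time to settle before the flash,
    so they are driven harder; rows written early get a gentler push.
    """
    time_left = STROBE_TIME_MS - row_scanout_time_ms(row)
    urgency = 1.0 - time_left / STROBE_TIME_MS   # 0.0 for the top row, ~1.0 for the bottom row
    return min_gain + urgency * (max_gain - min_gain)

for row in (0, PANEL_ROWS // 2, PANEL_ROWS - 1):
    print(f"row {row:4d}: written at {row_scanout_time_ms(row):.2f} ms, "
          f"overdrive gain ~{overdrive_gain(row):.2f}x")
```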
 
The 2 monitors they listed that support this are 27" 360Hz and about $1050. Pretty cool tech.
 
Is it relevant for OLED, or just for always-on backlight displays?

But it seems way inferior for work/text eye strain; the image staying on screen until the next frame renders is not all negative.
Eye strain with CRTs is mostly caused by two things: they are insanely bright by modern standards, and their low image density makes your brain work to fill in the gaps. A good CRT with a dot pitch in the .28 range (though most CRTs were closer to .21) is about the equivalent of a 90 DPI LCD display.

I mean, if you made a 19" LCD running 1280x1024, gave it a 90 DPI panel, and cranked the backlight up to the equivalent brightness, your eyes would be burning too.
It's made even worse by how the lighting in office spaces has changed over the past 20 years; it's headache-inducing.
LCDs are way, way better for productivity and low-FPS or essentially static images, but engineers are still finding ways to make LCDs more CRT-like when dealing with high-refresh situations.
 
You're making that up.

Get out your credit cards, Nvidia fans, it's time to buy another monitor.
I think you might be mistaken; there are currently two monitors with updates to support ULMB 2: the Acer XB273UF and the ASUS PG27AQN. No need to buy another monitor if you already own one of these two. Support for future monitors is also a fantastic thing, since Nvidia is requiring very strict quality guidelines before a monitor can even claim ULMB 2 support. That forces manufacturers to up their game and avoids another FreeSync 1.0 monitor mess, where more than half of the monitors had terrible flickering issues and couldn't lower their refresh rate enough.
 
I think you might be mistaken; there are currently two monitors with updates to support ULMB 2: the Acer XB273UF and the ASUS PG27AQN. No need to buy another monitor if you already own one of these two. Support for future monitors is also a fantastic thing, since Nvidia is requiring very strict quality guidelines before a monitor can even claim ULMB 2 support. That forces manufacturers to up their game and avoids another FreeSync 1.0 monitor mess, where more than half of the monitors had terrible flickering issues and couldn't lower their refresh rate enough.
And I am sure in another 2, maybe 3 years, when AMD announces their version as FreeSync 3 Pro, it will be seen as an awesome feature that NGredia hoarded for itself instead of making open...
 
Eye strain with CRTs is mostly caused by two things: they are insanely bright by modern standards, and their low image density makes your brain work to fill in the gaps. A good CRT with a dot pitch in the .28 range (though most CRTs were closer to .21) is about the equivalent of a 90 DPI LCD display.

I mean, if you made a 19" LCD running 1280x1024, gave it a 90 DPI panel, and cranked the backlight up to the equivalent brightness, your eyes would be burning too.
It's made even worse by how the lighting in office spaces has changed over the past 20 years; it's headache-inducing.
LCDs are way, way better for productivity and low-FPS or essentially static images, but engineers are still finding ways to make LCDs more CRT-like when dealing with high-refresh situations.
CRT displays did not get "insanely bright." Most of them did 200 nits, at best. The amount of power needed to drive a CRT severely limited how bright they could get. Modern LCD displays can do double that without the help of HDR, and most are calibrated with the brightness and saturation turned up to insane levels out of the box.

Eye strain with CRT is from a combination of the display being additive light and having a low refresh rate. Your eyes adjusting to the frequent changing light exposure is what causes your eyes to hurt. You can help this by running the screen at a higher refresh rate. 85 Hz or higher is generally where strain is minimal on my eyes.

LCD and OLED are subtractive light with higher image persistence, which is why they're easier on the eyes. LCDs that use PWM backlighting with a low pulse width exhibit the same eye strain issues as a CRT. ULMB running at a low refresh rate will also have eye strain issues, which is why it's generally only used at 120 Hz or higher. Strobing is great for motion clarity, but it's bad for your eyes.
 
CRT displays did not get "insanely bright." Most of them did 200 nits, at best. The amount of power needed to drive a CRT severely limited how bright they could get. Modern LCD displays can do double that without the help of HDR, and most are calibrated with the brightness and saturation turned up to insane levels out of the box.

Eye strain with CRT is from a combination of the display being additive light and having a low refresh rate. Your eyes adjusting to the frequent changing light exposure is what causes your eyes to hurt. You can help this by running the screen at a higher refresh rate. 85 Hz or higher is generally where strain is minimal on my eyes.

LCD and OLED are subtractive light with higher image persistence, which is why they're easier on the eyes. LCDs that use PWM backlighting with a low pulse width exhibit the same eye strain issues as a CRT. ULMB running at a low refresh rate will also have eye strain issues, which is why it's generally only used at 120 Hz or higher. Strobing is great for motion clarity, but it's bad for your eyes.
Phrasing it as additive light is a better way of putting it, and yeah, refresh rate matters. For the office I don't put in anything less than 85 Hz, and nothing evenly divisible by 60: the refresh rate can't be in sync with the LED lighting (120 Hz), because then the flicker of the lights and the monitors syncs up and we get increased complaints. I try to prevent things from getting to the point where a full ergonomic assessment is needed, because those always end poorly for everybody.
 
Had ULMB 1 on an old Asus monitor. The quality of motion with ULMB enabled was insane, but the monitor also had to cut the brightness by about half, which left the overall image a tad dark for my liking. Hopefully ULMB 2 makes it into more monitors; it'll be great if they've figured out the brightness issue.
 
CRT displays did not get "insanely bright." Most of them did 200 nits, at best. The amount of power needed to drive a CRT severely limited how bright they could get. Modern LCD displays can do double that without the help of HDR, and most are calibrated with the brightness and saturation turned up to insane levels out of the box.
Blooming too. If you turned up the power to the guns, they'd start to miss their target more and excite surrounding phosphors. On TVs they'd often take that trade because NTSC was such shit rez, but on computer monitors it could be an issue. Crank the brightness much past 100-120 nits and the blooming would make things fuzzy. They just really only did dim well. The spec for print monitoring was 80 nits in a light-controlled (aka dim) environment, with a hood over the monitor to minimize reflections.
 
To even consider it I will wait for it to be:
1) on OLED monitors
2) compatible with G-Sync enabled simultaneously
3) either not exclusive to Nvidia or when Nvidia isn't making all their sub-90 class cards either insanely overpriced or crippled with low VRAM

Think I will be waiting a long time.
 
And I am sure in another 2, maybe 3 years, when AMD announces their version as FreeSync 3 Pro, it will be seen as an awesome feature that NGredia hoarded for itself instead of making open...
Well considering motion blur reduction has been around for quite some time I don't think so.
 
To even consider it I will wait for it to be:

2) compatible with G-Sync enabled simultaneously
That may never happen. I don't know that variable length frames are compatible with strobing. What may need to happen is something more along the lines of DLSS 3 combined with Gsync where instead of having variable frame output rate, we have variable frame rate rendering, and then it is upsampled to the output rate.
 
That may never happen. I don't know that variable length frames are compatible with strobing. What may need to happen is something more along the lines of DLSS 3 combined with Gsync where instead of having variable frame output rate, we have variable frame rate rendering, and then it is upsampled to the output rate.

Nah just skip to 'PlayStation 9' where G-Sync is a pill you snort 👍
 
That may never happen. I don't know that variable length frames are compatible with strobing. What may need to happen is something more along the lines of DLSS 3 combined with Gsync where instead of having variable frame output rate, we have variable frame rate rendering, and then it is upsampled to the output rate.
I'm confident it can be done with OLED. The only real problem to solve with variable strobe lengths for OLED is the brightness fluctuation, which is easy to fix.

The problem with OLED for strobing in general is getting the brightness high with bursts that short. DLSS3-style frame generation could help with that a lot, like you said. Even without that, I think the current technology is already good enough to do a VRR strobing OLED, even though it may not be "1000 Hz motion clarity" at normal brightness. It's just that no one has tried to do simultaneous strobing and VRR with OLED. But I think we may see it soon now that OLED is becoming mainstream and there are competing OLED technologies.
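
For intuition on why simultaneous VRR + strobing is tricky in the first place (my framing, not something anyone above spelled out): with a fixed-length flash per refresh, perceived brightness is roughly the duty cycle, i.e. pulse width divided by frame interval, so a fluctuating frame interval means fluctuating brightness unless the pulse is compensated. A minimal sketch with assumed numbers:

```python
# Why a fixed-length strobe pulse fluctuates in brightness under VRR, and one naive
# compensation. Illustrative numbers only; not how any particular monitor does it.

PULSE_MS = 1.0                          # assumed strobe pulse width

def duty_cycle(frame_interval_ms: float, pulse_ms: float = PULSE_MS) -> float:
    """Fraction of the frame interval the emitter is lit, i.e. perceived brightness."""
    return pulse_ms / frame_interval_ms

def compensated_pulse_ms(frame_interval_ms: float, target_duty: float = 0.10) -> float:
    """Stretch the pulse so duty cycle (brightness) stays constant as frame time varies.
    Trade-off: a longer pulse also means more persistence blur on that frame."""
    return target_duty * frame_interval_ms

for fps in (144, 100, 60):
    interval = 1000.0 / fps
    print(f"{fps:3d} fps: fixed pulse -> {duty_cycle(interval):5.1%} brightness, "
          f"compensated pulse = {compensated_pulse_ms(interval):.2f} ms")
```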
 
Pretty cool tech; sadly it doesn't support G-Sync and strobing simultaneously yet.

Also I could never go back to LCD after using OLED anyways.
I love ULMB2. I just posted a "Blur Busters Special Sauce" article on ULMB2 on the cover page of Blur Busters. (I'll let someone else post the link; I'm bound by a promise to [H] not to be the one to post links, except to TestUFO educational animations, but they say anybody can post Blur Busters links.)

But there's something better than ULMB2. Especially by 2030:

Strobing can eventually become obsolete for non-retro use cases with lagless frame generation on ultra-high-Hz 0ms-GtG displays (e.g. OLEDs)

It's like GSYNC+ULMB+DLSS+EyeCare+FlickerFree rolled into one -- simultaneously.

- DLSS because reprojection is a frame generation tech (even if much more lagless)
- ULMB because reprojection also reduces motion blur
- GSYNC because reprojection also can de-stutter a variable input framerate to a constant output framerate
- FlickerFree EyeCare because this is flickerless, strobeless, BFI-less, PWM-less

All of it at the same time. Maybe NVIDIA could call it ULMB 3 or GSYNC 3 or DLSS 5. But it serves all the purposes simultaneously WHILE also being PWM-free!

And when it is done properly, it has fewer artifacts than enabling strobing or reducing game detail level (to get the high frame rates necessary in an alternative way). Especially when the starting frame rate of reprojection BEGINS above flicker fusion threshold (e.g. 100fps -> 1000fps). And reprojecting sample-and-hold avoids the double-image effects of VR reprojection. Tests here show 100fps->360fps reprojection is pretty fantastic.

100fps->1000fps reprojection reduces display motion blur by 90%, just like turning on a strobe setting.
100fps->1000fps reprojection has far fewer artifacts than 45fps->90fps reprojection.
100fps->1000fps reprojection can have less lag than enabling strobing on a lower Hz display (if outputting to 1000Hz display)

Large-ratio (10:1) reprojection is a motion blur reduction technology of the future (2030s), even for esports.
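
For the numbers above: on an idealized 0ms-GtG sample-and-hold display, MPRT is roughly one frame of persistence, so blur scales with 1/framerate. A quick sanity check under that assumption:

```python
# Sample-and-hold persistence math behind the "90% blur reduction" figure.
# Idealized: assumes a 0ms-GtG display where MPRT is roughly one frame of persistence.

def mprt_ms(fps: float) -> float:
    """Approximate persistence (MPRT) of a sample-and-hold display at framerate=Hz."""
    return 1000.0 / fps

base_fps, target_fps = 100, 1000
reduction = 1 - mprt_ms(target_fps) / mprt_ms(base_fps)
print(f"{base_fps} fps  -> {mprt_ms(base_fps):.1f} ms persistence")
print(f"{target_fps} fps -> {mprt_ms(target_fps):.1f} ms persistence")
print(f"Motion blur reduction: {reduction:.0%}")   # 90%
```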

Advanced reprojection can even rewind local input lag, and make reprojection esports-friendly (correcting the outdated rendertime-delayed geometry, by reprojecting the geometry in 1ms to gametime-current geometry). This requires a 2-thread GPU that is simultaneously doing original renders and reprojections, integrated into the game engine. You can even rewind 100fps 10ms rendertime lag by realtime morphing (in just 1ms) the most recent full-render frame to current gametime/inputtime. So you've backward-reprojected rather than forward-reprojected. So newer reprojection algorithms (converting rendertime lag into fresher gametime/inputreadtime lag) are a frame generation technology capable of reducing latency. But that has to be integrated into the game engine, much like the DLSS partnerships that NVIDIA does with studios; so reprojection will be a collaboration between game engines (e.g. Epic Megagames) and GPU vendors (e.g. NVIDIA), to make lag-reducing frame generation technology possible.

With the algorithmic engineering feats, there are ways to modify a reprojection algorithm to prioritize the gametime/inputtime independently of the underlying framerate, allowing you to convert a fluctuating feedstock framerate (original frame renders) into a perfect reprojected framerate=Hz.
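
A conceptual sketch of that feedstock-to-fixed-rate idea (not any shipping engine or driver API; warp_frame, latest_render, read_input and present are hypothetical stand-ins): the renderer produces frames at whatever rate it can, while a separate presentation loop warps the newest rendered frame to the freshest input sample at a fixed output cadence.

```python
# Conceptual sketch of converting a fluctuating render framerate into a fixed
# reprojected framerate=Hz. Not any shipping engine's API: warp_frame(),
# latest_render(), read_input() and present() are hypothetical stand-ins.
import time

OUTPUT_HZ = 240
OUTPUT_INTERVAL = 1.0 / OUTPUT_HZ

def warp_frame(frame, render_pose, current_pose):
    """Hypothetical reprojection pass: re-render `frame` (with its depth/geometry)
    as if it had been drawn from `current_pose` instead of `render_pose`."""
    return frame  # placeholder

def present_loop(latest_render, read_input, present):
    """Output frames at a fixed cadence, warping the newest rendered frame to the
    freshest input sample, regardless of how fast the renderer is running."""
    next_deadline = time.perf_counter()
    while True:
        next_deadline += OUTPUT_INTERVAL
        frame, render_pose = latest_render()   # newest fully-rendered frame (fluctuating rate)
        current_pose = read_input()            # freshest camera/input sample (rewinds render lag)
        present(warp_frame(frame, render_pose, current_pose))
        time.sleep(max(0.0, next_deadline - time.perf_counter()))
```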

Imagine ALL your games running at framerate=Hz. 240fps or 1000fps. Even UE6 or UE7 with path tracing. This can be a reality. As long as the feedstock framerate remains somewhat above the flicker fusion thresholds, the artifacts can be kept shockingly low.

Remember, Netflix (video compression) is still 23 fake frames and 1 real frame per second. But it's done so well now. The fake-frame talk is understandable, like old-fashioned MPEG-1, but that was yesterday. HEVC is pretty good at faking things using mathematical compression/interpolation algorithms. When reprojection becomes as artifactless as original frames, who cares? Triangles and textures are still, technically, fake representations of the real world, fabricated into a framebuffer by a piece of silicon called a GPU. However the frame is created does not really matter, as long as it's lagless and artifactless, no? That's the bottom line, right? It's still less fake-looking than the artifacts of strobing, or the artifacts of intentionally reduced detail level.

People don't see Hz differences unless it's a dramatic upgrade (60 -> 144 -> 360 -> 1000 -> 4000) geometrically up the diminishing curve, while GtG=0 and framerate=Hz. The problem is 240-vs-360Hz is a mere 1.5x blur difference, throttled to ~1.1x due to slow LCD GtG (and high-frequency jitters that blend into extra persistence blur). OLED and MicroLED skip the majority of the GtG bottleneck, and that's why reprojection excels on those displays as a blur reduction tech. GPUs can't reduce motion blur strobelessly (via quadruple-digit frame rates) without the help of lagless frame generation tech. It's going to be forced to happen (eventually) in the refresh rate race.

The main problem is reprojection (with rendertime-lag-rewind capability) needs to be built into the game engine, since good-quality artifact-free reprojection needs the ground truth of inputreads/geometry/zbuffer. But 75% of a GPU can be spent rendering UE5+ frames, and 25% of a GPU can be spent reprojecting them by 5x-10x framerate for 80-90% motion blur reduction, strobelessly. So NVIDIA and Epic Megagames need to team up to create this new technology that is the "GSYNC+ULMB+DLSS+EyeCare+FlickerFree" practically-lagless frame generation Holy Grail.

OLED benefits fantastically from brute framerate-based motion blur reduction (large-ratio practically-lagless frame generation). 240fps 240Hz on OLED looks great, almost as clear as yesteryear LightBoost (2.4ms persistence) but with none of the strobe disadvantages. In fact, most people can add a rough biasing factor of 2x for strobeless motion blur (e.g. 2ms MPRT strobeless looks better than 1ms MPRT strobed), due to the reduced quality loss and other line items.

Also, in fact -- it's cheaper to re-engineer for reprojection than to engineer VRR BFI OLED brightly enough at sub-1ms MPRT.

NVIDIA knows this. They'll pounce when ready (eventually). Probably years later. Long wait, but I want to see if reprojection can happen sooner.

Let's chase the now-easier path, as 1000Hz displays are no longer far away (~2025 to ~2027).

I'm going to be publishing some major reprojection-related articles, to help incubate reprojection on the desktop. It's the only easy way to do 1000fps UE5.2 graphics quality, with fewer artifacts than strobing, and less lag than strobing.

Strobeless motion blur reduction is the Holy Grail.

If you want LinusTechTips' simplified take, you should watch www.youtube.com/watch?v=IvqrlgKuowE. He talks about it from the angle of reducing framerate costs for budget gamers, but I talk about it from the angle of a strobeless motion blur reduction technology and a UE5/UE6/UE7 pathtraced 1000fps enabling technology.
 
Too bad it doesn't work simultaneously with VRR, so it's moot for me. It's only really for low-demand competitive FPSes in which you can keep 360+ FPS.
 
Too bad it doesn't work simultaneously with VRR, so it's moot for me. It's only really for low-demand competitive FPSes in which you can keep 360+ FPS.
You don't need that much FPS. On my cheap 120 Hz Asus, ULMB1 was already amazing, and I would assume that it's less noticeable as you go up in refresh rate (in a similar way that the jump from 60 to 120 is a lot more noticeable than 120 to 240). The best way I can describe it is that movement was 'as smooth as glass'.
 
I'm confident it can be done with OLED. The only real problem to solve with variable strobe lengths for OLED is the brightness fluctuation, which is easy to fix.

The problem with OLED for strobing in general is getting the brightness high with bursts that short. DLSS3-style frame generation could help with that a lot, like you said. Even without that, I think the current technology is already good enough to do a VRR strobing OLED, even though it may not be "1000 Hz motion clarity" at normal brightness. It's just that no one has tried to do simultaneous strobing and VRR with OLED. But I think we may see it soon now that OLED is becoming mainstream and there are competing OLED technologies.
OLED displays don't use backlighting, so you can't do strobing on them. OLED already has a solution for motion clarity called black frame insertion (BFI). It takes advantage of the short pixel response time inherent in OLED by inserting a black frame between frames to reset the color of the pixels before presenting the next frame.
 
OLED displays don't use backlighting, so you can't do strobing on them. OLED already has a solution for motion clarity called black frame insertion (BFI). It takes advantage of the short pixel response time inherent in OLED by inserting a black frame between frames to reset the color of the pixels before presenting the next frame.
Uh....what? You strobe the OLEDs on and off, that's the same thing.

Definition of strobe
1. flash intermittently
 
I’ll wait for you maniacs to buy the new monitors; that’s a lot of scratch for a 27” monitor.
 
I already blew my wad on a G-Sync compatible monitor, and if my previous gaming monitor is any indication, I’ll have this one until at least 2031.
 
Uh....what? You strobe the OLEDs on and off, that's the same thing.

Definition of strobe
1. flash intermittently
When you are referring to display technology it's an important distinction to make between a strobing backlight array and pixels turning off from the black color information.
 
When you are referring to display technology it's an important distinction to make between a strobing backlight array and pixels turning off from the black color information.
I think you're trying to sound smart by correcting someone but aren't.

You can strobe the OLEDs without "black color information". "Black frame insertion" is just one way of doing it and isn't even exclusive to OLED.
 
I'd like to see how it looks when the source fps is 100fps or lower. I'm hesitant to drop that kind of money on something when the games I do play don't hit 360fps.
 
I'd like to see how it looks when the source fps is 100fps or lower. I'm hesitant to drop that kind of money on something when the games I do play don't hit 360fps.
Let them use the esports people as guinea pigs and then we'll have some cheap all encompassing 'GSYNC ULMB compatible' on all monitors in 6 years 👍
 
ULMB 2 seems to be targeted towards competitive gamers and e-sports since it only works at max refresh and doesn't support G-Sync VRR, so I guess it's not for me.

The updated RGB Color Space Variable Overdrive for G-sync looks interesting though. In theory, it should further reduce ghosting and inverse ghosting across the VRR range.
 
You're making that up.

Get out your credit cards, Nvidia fans, it's time to buy another monitor.
Nvidia: Yes, we sell a 43" Nvidia-branded TCL TV Series 2 where we hacked in the feature. It has 3 local dimming zones and only costs $3850 USD. $4500 if you need simulated HDR. For $5000 it comes with a cool leather jacket; we call it the Founders Edition version.
 
So cool!

I use a 240 Hz OLED, so I don't need it apparently. But still really neat.

Nvidia, love em or hate em, you gotta admit they keep pushing the edge and making new amazing tech.
 
So cool!

I use a 240 Hz OLED, so I don't need it apparently. But still really neat.

Nvidia, love em or hate em, you gotta admit they keep pushing the edge and making new amazing tech.
"The more you buy, the more you save."

Wives of amateur guitarists probably disagree with that though.
 
You don't need that much FPS. On my cheap 120 Hz Asus, ULMB1 was already amazing, and I would assume that it's less noticeable as you go up in refresh rate (in a similar way that the jump from 60 to 120 is a lot more noticeable than 120 to 240). The best way I can describe it is that movement was 'as smooth as glass'.
He already knows -- he's the guy who did the triple-monitor LightBoost via hacked EDIDs.

Here's a photo of Vega's LightBoost strobing rig, powered by a quad Titan SLI back at the time, almost a decade ago:

[Attached image: Vega's triple-monitor LightBoost rig]


That said, strobing has its pros/cons -- and some people do choose to move away from the strobing method, towards the brute-framerate method of motion blur reduction. The early-bird canary of the brute method is already here today: 240fps 240Hz OLED motion clarity (4ms blur, strobeless) now comes pretty close to 2012's LightBoost strobing (2.4ms blur, strobed).

There happens to be an MPRT pleasingness biasing factor where a human accepts a slightly higher MPRT if the blur reduction is achieved strobelessly. This is because 4ms MPRT OLED looks way nicer than the 2.4ms MPRT LightBoost, and the minor blur differential (less than 2x blur differential) isn't noticed, due to factors such as massively brighter colors, fewer stroboscopic effects, and complete lack of strobe crosstalk.

And 240fps 240Hz OLED is currently the best strobeless motion blur reduction money can buy today, without going to a laboratory to see the 1000fps 1000Hz experiments.

You can now do the brute framerate-based motion blur reduction technique (strobing OFF) with 360fps 360Hz LCD or 500fps 500Hz LCD, but 240fps 240Hz OLED still looks slightly clearer than most of them, due to lack of major GtG. And 240fps is more achievable territory for a bigger number of games.
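
For reference, the approximate persistence numbers behind that comparison (assuming full sample-and-hold persistence for the OLED and the ~2.4 ms LightBoost pulse mentioned above):

```python
# Approximate persistence numbers behind the 240Hz OLED vs. LightBoost comparison.
# Assumes full-persistence sample-and-hold for the OLED and a ~2.4 ms strobe pulse
# for 2012-era LightBoost.

oled_mprt_ms = 1000.0 / 240      # ~4.2 ms of persistence at 240fps 240Hz, strobeless
lightboost_mprt_ms = 2.4         # ms, strobed

print(f"240fps/240Hz OLED:   {oled_mprt_ms:.1f} ms MPRT (strobeless)")
print(f"LightBoost strobing: {lightboost_mprt_ms:.1f} ms MPRT (strobed)")
print(f"Blur ratio: {oled_mprt_ms / lightboost_mprt_ms:.2f}x "
      f"(within the ~2x strobeless 'pleasingness' bias mentioned above)")
```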
 