360 Hz ASUS ROG Swift

https://rog.asus.com/articles/gamin...is-the-worlds-fastest-esports-gaming-monitor/



Overkill or necessary?
 
It's headroom. It doesn't matter whether you actually reach those framerates; you're still getting more than a 144 Hz or 240 Hz monitor offers if you can run games at, say, 280-300 fps. Assuming the panel response time can keep up, there's no reason not to want as high a refresh rate as possible.

In reality, though, these will most likely cost more than they're worth.
 
I don't normally like to cite Linus, but he did a pretty good test of high refresh rates, and it turns out 240 Hz provides a significant performance increase over 144 Hz, especially for casual gamers, so more refresh rate can't hurt.

Would this still be the case on non-sample-and-hold displays? Who knows, but strobing has proven pretty difficult to implement in conjunction with variable refresh rates; every attempt seems to have major flaws. So maybe more refresh rate is the best way to go.
 
I don't normally like to cite Linus, but he did a pretty good test of high refresh rates, and it turns out 240 Hz provides a significant performance increase over 144 Hz, especially for casual gamers, so more refresh rate can't hurt.

Would this still be the case on non-sample-and-hold displays? Who knows, but strobing has proven pretty difficult to implement in conjunction with variable refresh rates; every attempt seems to have major flaws. So maybe more refresh rate is the best way to go.

Strobing is basically PWM, which is eye-fatiguing, and it requires very high minimum frame rates to do properly, which means lower graphics settings or simpler games. It also mutes and dulls screen brightness and vibrance, so it will likely always be incompatible with HDR and HDR color volume ranges.

Long term, we need much more advanced interpolation (without significant lag and without artifacts) to fill frames - eventually something like 100 fps x10 interpolated at 1000 Hz, I hope, which would give the same pixel persistence/sample-and-hold blur as a professional CRT, ~1 ms, which is essentially "zero" blur.
https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/

Trying to fill very high Hz ceilings with raw frame rates, while at the same time increasing screen resolutions and playing very demanding games at ever higher graphics settings, is not going to work. So better interpolation multiplying a good base frame rate looks like the way to go in the long run. The same goes for the even more extreme combined per-eye resolutions of VR and AR displays going forward.
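For concreteness, here's the arithmetic behind that 100 fps x10 example as a tiny Python sketch (illustrative numbers only, not from any shipping product):

```python
# Sample-and-hold persistence: how long each unique frame stays on screen.
def persistence_ms(effective_fps):
    return 1000.0 / effective_fps

base_fps = 100        # raw GPU frame rate
interp_factor = 10    # hypothetical 10x frame-rate amplification
effective_fps = base_fps * interp_factor

print(f"{base_fps} fps x{interp_factor} -> {effective_fps} fps at {effective_fps} Hz")
print(f"Persistence: {persistence_ms(effective_fps):.1f} ms")  # 1.0 ms, CRT-like
```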
 
I'd like to know how much this will cost. I doubt it will be less than $700.
 
Strobing is basically PWM, which is eye-fatiguing, and it requires very high minimum frame rates to do properly, which means lower graphics settings or simpler games. It also mutes and dulls screen brightness and vibrance, so it will likely always be incompatible with HDR and HDR color volume ranges.

Long term, we need much more advanced interpolation (without significant lag and without artifacts) to fill frames - eventually something like 100 fps x10 interpolated at 1000 Hz, I hope, which would give the same pixel persistence/sample-and-hold blur as a professional CRT, ~1 ms, which is essentially "zero" blur.
https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/

Trying to fill very high Hz ceilings with raw frame rates, while at the same time increasing screen resolutions and playing very demanding games at ever higher graphics settings, is not going to work. So better interpolation multiplying a good base frame rate looks like the way to go in the long run. The same goes for the even more extreme combined per-eye resolutions of VR and AR displays going forward.
That said, higher-quality strobing is arriving: brighter color strobing in 240 Hz 1ms IPS, especially when you have sufficient refresh rate headroom. (120Hz strobing at 240Hz is higher quality than 120Hz strobing at 144Hz.)
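A rough back-of-envelope for why refresh headroom helps strobing quality, assuming (simplified) that the panel scans each refresh at its max-Hz speed, leaving the rest of the strobe period as blanking time in which GtG can settle before the flash:

```python
def gtg_settle_budget_ms(strobe_hz, max_hz):
    """Blanking time left over per strobe cycle for LCD GtG to finish unseen."""
    frame_period = 1000.0 / strobe_hz  # time between strobe flashes
    scanout_time = 1000.0 / max_hz     # time to scan one refresh at max speed
    return frame_period - scanout_time

for max_hz in (144, 240, 360):
    budget = gtg_settle_budget_ms(120, max_hz)
    print(f"120 Hz strobing on a {max_hz} Hz panel: {budget:.2f} ms GtG budget")
# 144 Hz -> 1.39 ms, 240 Hz -> 4.17 ms, 360 Hz -> 5.56 ms of settling time
```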

An Amazon reviewer already wrote that the XG270's strobing is superior to Sony FW900 CRT, thanks to the new Blur Busters Approved program that I worked with ViewSonic on.

This will have to do until 1000 Hz monitors arrive, with frame rate amplification technologies built into GPUs. :)
 
It's headroom. It doesn't matter whether you actually reach those framerates; you're still getting more than a 144 Hz or 240 Hz monitor offers if you can run games at, say, 280-300 fps. Assuming the panel response time can keep up, there's no reason not to want as high a refresh rate as possible.

In reality, though, these will most likely cost more than they're worth.
Exactly. Headroom.

You don't need 360fps to benefit from 360Hz.

1. Quick Frame Transport. Even at 100fps, your frames are transmitted to the monitor in 1/360sec. 100fps at 360Hz GSYNC is way, way, way lower lag than 100fps at 144Hz GSYNC. (See the numbers sketched after this list.)
2. Much less visible tearing if you use VSYNC OFF. Tearlines are visible for only 1/360sec.
3. More headroom for good-quality strobing. The best strobing occurs when you strobe at roughly half the max refresh rate. (Technical: cramming LCD GtG into a large VBI.)
4. A humongous VRR range you can drive a truck through. You want your VRR range wider than your framerate range so you don't have to worry about the side effects (lag, stutter) of going outside your VRR range.

And also, 360fps provides what is essentially strobeless ULMB. 360fps = 1/360sec = 2.8ms persistence. That's getting pretty close to LightBoost (2ms persistence), but without strobing. Blurless sample-and-hold is slowly arriving FTW!
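A quick back-of-envelope for points 1-2 and the persistence math above (assuming full-speed scanout and ignoring GtG and processing overhead):

```python
def scanout_ms(max_hz):
    """Time to transmit/scan one frame when the panel runs at its max rate."""
    return 1000.0 / max_hz

def persistence_ms(fps):
    """Sample-and-hold persistence: how long each unique frame stays visible."""
    return 1000.0 / fps

# Quick Frame Transport: the same 100 fps feed arrives faster at higher Hz.
for hz in (144, 240, 360):
    print(f"100 fps at {hz} Hz: each frame delivered in {scanout_ms(hz):.2f} ms")

# Strobeless persistence at 360 fps vs LightBoost's ~2 ms strobe length:
print(f"360 fps sample-and-hold persistence: {persistence_ms(360):.2f} ms")  # ~2.78
```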

VSYNC OFF will become far less necessary once ultra-Hz equalizes the latency of GSYNC, FreeSync, VSYNC ON, and VSYNC OFF. The higher the Hz, the less differential between sync technologies. VSYNC OFF is simply a popular band-aid that continues to be used because of slow-scanning refresh cycles. Ultra-Hz gives the user the power to choose without worrying about lag.

And to readers dismissing 360 Hz over cost: remember, 4K used to cost five figures in IBM T221 days. Today, 4K is a $299 Walmart Boxing Day sale. Please at least thank 360 Hz for eventually commoditizing 120Hz, 144Hz and 240Hz. Even 120Hz is coming to your favourite smartphone this decade, for example!

This is simply a refresh rate race to retina refresh rates. Intermediate Hz gradually becomes more affordable as higher new Hz slowly commoditizes previous Hz.
 
You guys are defending 360hz as if someone was downing it. The only complaints in this thread so far are it's a little small and needs something more than 1080p.
 
And also, 360fps provides what is essentially strobeless ULMB. 360fps = 1/360sec = 2.8ms persistence. That's getting pretty close to LightBoost (2ms persistence), but without strobing. Blurless sample-and-hold is slowly arriving FTW!

This is where it's at, imo. But with the graphics settings and resolutions most people crave in more demanding games (where most crank the graphics up a bit and rely on VRR ~ G-Sync/FreeSync to ride a roller coaster of frame rates that dips into lower fps ranges), a more advanced interpolation would be a lot more useful for filling those frames, starting from a raw fps with decent motion definition. For example, 90 fps x4 interpolated = 360 fps. This would allow higher graphics settings in more demanding games while eliminating the need for strobing, and it should work with HDR modes and color volumes going forward. Interpolation technologies are already important for VR (and AR) resolutions, and for their more extreme resolutions in the future (along with other demand-reducing techniques, like foveated rendering, among others).
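To make the x4 cadence concrete, here's a deliberately naive Python/numpy sketch. Real frame-rate amplification would use motion vectors or optical flow (a plain cross-fade just looks like blur); this only shows how each rendered frame pair becomes four output frames:

```python
import numpy as np

def amplify(frame_a, frame_b, factor):
    """Yield `factor` output frames per source frame: the real frame_a plus
    factor - 1 synthetic in-betweens toward frame_b (naive cross-fade)."""
    yield frame_a
    for i in range(1, factor):
        t = i / factor
        yield (1.0 - t) * frame_a + t * frame_b

a = np.zeros((1080, 1920, 3), dtype=np.float32)  # rendered frame N
b = np.ones((1080, 1920, 3), dtype=np.float32)   # rendered frame N+1
out = list(amplify(a, b, factor=4))
print(f"{len(out)} output frames per source frame: 90 fps x4 = 360 fps cadence")
```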

On a side note, I obviously read and use information from the Blur Busters site a lot. It's a great resource. Very much appreciated, thanks.
 
Exactly. Headroom.

You don't need 360fps to benefit from 360Hz.

1. Quick Frame Transport. Even at 100fps, your frames are transmitted to the monitor in 1/360sec. 100fps at 360Hz GSYNC is way, way, way lower lag than 100fps at 144Hz GSYNC. (See the numbers sketched after this list.)
2. Much less visible tearing if you use VSYNC OFF. Tearlines are visible for only 1/360sec.
3. More headroom for good-quality strobing. The best strobing occurs when you strobe at roughly half the max refresh rate. (Technical: cramming LCD GtG into a large VBI.)
4. A humongous VRR range you can drive a truck through. You want your VRR range wider than your framerate range so you don't have to worry about the side effects (lag, stutter) of going outside your VRR range.

And also, 360fps provides what is essentially strobeless ULMB. 360fps = 1/360sec = 2.8ms persistence. That's getting pretty close to LightBoost (2ms persistence), but without strobing. Blurless sample-and-hold is slowly arriving FTW!

VSYNC OFF will become far less necessary once ultra-Hz equalizes the latency of GSYNC, FreeSync, VSYNC ON, and VSYNC OFF. The higher the Hz, the less differential between sync technologies. VSYNC OFF is simply a popular band-aid that continues to be used because of slow-scanning refresh cycles. Ultra-Hz gives the user the power to choose without worrying about lag.

And to readers dismissing 360 Hz over cost: remember, 4K used to cost five figures in IBM T221 days. Today, 4K is a $299 Walmart Boxing Day sale. Please at least thank 360 Hz for eventually commoditizing 120Hz, 144Hz and 240Hz. Even 120Hz is coming to your favourite smartphone this decade, for example!

This is simply a refresh rate race to retina refresh rates. Intermediate Hz gradually becomes more affordable as higher new Hz slowly commoditizes previous Hz.

Great post and good points, some of which I hadn't considered. I look forward to the inevitable 1000 Hz monitors, which OLED would be capable of today if the driving electronics supported it.
 
Great post and good points, some of which I hadn't considered. I look forward to the inevitable 1000 Hz monitors, which OLED would be capable of today if the driving electronics supported it.
LCD too, surprisingly. Remember, 144Hz LCD was a pipe dream back in the era of the old 33ms 60Hz LCDs of the early-to-mid 1990s.

I have information that LCD will be able to go into the kilohertz territory within this human generation (aka 2030s).

However, OLED and microLED can join the party too. Just saying LCD's going to be a horse in this race for decades. Surprisingly so.

Fortunately, LCD blacks are solvable
I've even seen a cheap million-zone local-dimming "backlight" (HiSense Dual-Cell LCD at CES 2020), so don't dismiss LCD's ability to do halo-free blacks either. It looked better than some of the OLEDs I saw. I saw lots of impressive new display technologies at CES 2020, in person.

Yes, GtG needs to keep getting faster
The problem is that GtG (for the whole GtG heatmap of all color combos) needs to be reliably well under half a refresh cycle in order not to dilute or interfere with the high Hz. But we've fallen from 33ms-50ms all the way down to 0.2ms-0.5ms GtG (for the 10%->90% segment). While progress is slowing, it is not stopping there either -- there are already engineering paths to speed this up (and make it more consistent across all colors, too).
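That half-a-refresh-cycle rule of thumb is easy to sanity-check (illustrative GtG figures, not measurements of any specific panel):

```python
def gtg_fits(gtg_ms, hz, margin=0.5):
    """True if GtG completes within `margin` of one refresh cycle."""
    return gtg_ms < margin * (1000.0 / hz)

for hz in (240, 360, 1000):
    for gtg in (2.0, 0.5, 0.2):
        verdict = "OK" if gtg_fits(gtg, hz) else "too slow"
        print(f"{gtg} ms GtG at {hz} Hz: {verdict}")
# At 1000 Hz the half-cycle budget is only 0.5 ms, so ~0.2 ms GtG is needed.
```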
 
I can't wait for 500hz OLEDs, so I can finally stop missing my CRTs!
Nice to see the Chief Blur Buster on here!
 
An Amazon reviewer already wrote that the XG270's strobing is superior to Sony FW900 CRT, thanks to the new Blur Busters Approved program that I worked with ViewSonic on.

Any idea if there are other sources of latency in a display like the XG270 that the specs aren't accounting for?
 
This is one monitor I will be looking to get once a few reviews are out on it. Currently using the Asus PR348Q, and it's good for me.
 
Honestly, I'm not holding my breath. It's ASUS and it's a monitor. I don't care. If they actually build a good screen that doesn't have stupid quality-control issues, then maybe I'll take notice. But until then, every ASUS monitor that comes out should be suspect.

EDIT: While I'm at it, if you're reading this, ASUS: learn to build a monitor without QC issues first. Because from my perspective, it looks like you put more money into your marketing department than your actual R&D and manufacturing.
 
The human eye can detect light at 1/2000th of a second. Of course higher refresh rates will help, but there are significant diminishing returns after ~90 Hz. It gets even more important with VR.
 
There are diminishing returns on motion definition, but not on sample-and-hold blur reduction. That is, blur reduction only appears to hit diminishing returns because GPUs are incapable of providing sufficient frame rates in most games at better graphics settings. That's where much better interpolation technologies (and other frame-saving techniques, like foveated rendering perhaps) become a necessity. The ~1 ms ~ 1 px of motion blur is where you get down to essentially the "zero" sample-and-hold blur (image persistence) equivalent of an FW900 professional graphics CRT. That is a huge difference, resulting in full clarity of the entire game world while mouse-looking, movement-keying, and panning the viewport, rather than smearing at low fps/Hz or "soft blurring" at other ranges.

https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/
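The math behind that link is simple enough to sketch (using the classic TestUFO-style 960 px/s pan as an illustrative speed):

```python
def blur_px(speed_px_per_sec, effective_hz):
    """Sample-and-hold blur width: pixels moved during one frame's visibility."""
    return speed_px_per_sec / effective_hz

for hz in (60, 120, 240, 360, 1000):
    print(f"{hz:>4} Hz: ~{blur_px(960, hz):.1f} px of motion blur at 960 px/s")
# 60 Hz -> 16 px of smear; 1000 Hz -> ~1 px, the essentially-"zero" CRT-like point.
```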
 
I recently got a 144 Hz monitor. Then I saw a 240 Hz one, and the legibility of moving text was way better, but not perfect. I imagine 360 Hz is even better.
 
I recently got a 144 Hz monitor. Then I saw a 240 Hz one, and the legibility of moving text was way better, but not perfect. I imagine 360 Hz is even better.

Yep! I don't know what is up with people's faculties... As if they don't notice or care and then rant about it online. So annoying! It is like socialism for the senses! "Everyone use shit hardware because my senses are damaged."
 
Yep! I don't know what is up with people's faculties... As if they don't notice or care and then rant about it online. So annoying! It is like socialism for the senses! "Everyone use shit hardware because my senses are damaged."

Lol, that is because many people on this forum are so obsessed with 4K maximum settings; some even claim they see "absolutely no point" in reducing graphics settings to get higher fps in competitive games. It's quite ridiculous. Anyway, while I appreciate the boundaries being pushed on monitor refresh rates, we also need CPUs/GPUs to make the same progress to support it.
 
Yes, not to beat the point to death, but like you indicated: you have to be filling the new refreshes per second (Hz) with enough frames per second (fps) to match, or at least living somewhere in that new ceiling's heights, which is impossible for a lot of things outside of desktop use and 2D-perspective games. That's why I keep mentioning the need for more advanced, high-quality interpolation, at least for first- and third-person games with any kind of GPU demand, especially at 4K resolution or higher, and as monitor Hz maximums keep increasing (and on VR headsets' per-eye resolutions, no less). Other tech like foveated rendering, checkerboarding, and dynamic resolution can help, but they can't hit the straight multiples that interpolation can. For example: 90 fps interpolated x3 for ~270 fps, capped lower for a 240 Hz display, or 90 fps interpolated x4 for 360 fps, capped lower as needed on a 360 Hz display. Or, with lower motion definition, a solid 60 fps x4 -> 240 fps at 240 Hz, or a solid 60 fps x6 -> 360 fps at 360 Hz.
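A hypothetical helper for those multiples (my own illustration, not anything from a real driver): instead of rendering 90 x3 = 270 fps and capping down to 240, this variant picks the largest whole factor whose output already fits under the display's ceiling:

```python
def pick_interp_factor(base_fps, max_hz):
    """Largest whole multiplier n such that base_fps * n <= max_hz."""
    factor = max(1, max_hz // base_fps)
    return factor, base_fps * factor

for base, hz in [(90, 240), (90, 360), (60, 240), (60, 360)]:
    factor, out_fps = pick_interp_factor(base, hz)
    print(f"{base} fps x{factor} -> {out_fps} fps on a {hz} Hz display")
# 90x2 -> 180 @ 240 Hz, 90x4 -> 360 @ 360 Hz, 60x4 -> 240 Hz, 60x6 -> 360 Hz
```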

Of course, there are easier-to-render games like Half-Life 2, Counter-Strike, Left 4 Dead 2, Portal, and Dishonored, along with similarly light graphics in a lot of other stylized games and most VR games, as well as isometric titles (MOBAs, ARPGs, etc.).
 
Yes, not to beat the point to death, but like you indicated: you have to be filling the new refreshes per second (Hz) with enough frames per second (fps) to match, or at least living somewhere in that new ceiling's heights, which is impossible for a lot of things outside of desktop use and 2D-perspective games. That's why I keep mentioning the need for more advanced, high-quality interpolation, at least for first- and third-person games with any kind of GPU demand, especially at 4K resolution or higher, and as monitor Hz maximums keep increasing (and on VR headsets' per-eye resolutions, no less). Other tech like foveated rendering, checkerboarding, and dynamic resolution can help, but they can't hit the straight multiples that interpolation can. For example: 90 fps interpolated x3 for ~270 fps, capped lower for a 240 Hz display, or 90 fps interpolated x4 for 360 fps, capped lower as needed on a 360 Hz display. Or, with lower motion definition, a solid 60 fps x4 -> 240 fps at 240 Hz, or a solid 60 fps x6 -> 360 fps at 360 Hz.

Of course, there are easier-to-render games like Half-Life 2, Counter-Strike, Left 4 Dead 2, Portal, and Dishonored, along with similarly light graphics in a lot of other stylized games and most VR games, as well as isometric titles (MOBAs, ARPGs, etc.).

So here's a dumb question. When I had my CRT monitors, there were situations where my older graphics cards (like the GTX 560 and GTX 770) were nowhere near hitting the 85 fps refresh at higher resolutions, and yet the motion still looked pretty clear. Is it because of how the monitors scanned out the image? I.e., would rolling-scan monitors lower this threshold?
 
So here's a dumb question. When I had my CRT monitors, there were situations where my older graphics cards (like the GTX 560 and GTX 770) were nowhere near hitting the 85 fps refresh at higher resolutions, and yet the motion still looked pretty clear. Is it because of how the monitors scanned out the image? I.e., would rolling-scan monitors lower this threshold?

It was better because the CRT didn't have GtG latency. There have been other forms of latency in LCDs since then, which seem to finally be getting reduced enough to compete with a CRT. Not to mention the quality of blacks on CRTs...
 
So here's a dumb question. When I had my CRT monitors, there were situations where my older graphics cards (like the GTX 560 and GTX 770) were nowhere near hitting the 85 fps refresh at higher resolutions, and yet the motion still looked pretty clear. Is it because of how the monitors scanned out the image? I.e., would rolling-scan monitors lower this threshold?

Due to the way CRTs render, they avoid sample-and-hold motion blur. This is what LCDs with strobing backlights attempt to replicate, and the new LG OLEDs have a rolling-scan black frame insertion that does a similar thing, though its goal is more to avoid the drop in brightness that occurs with full black frame insertion and strobing. CRTs didn't go very bright in the first place, so it wasn't an issue for them.

Without BFI, even an OLED with instant response times has motion blur, because of the way our brains interpret the images.
 
So here's a dumb question. When I had my CRT monitors, there were situations where my older graphics cards (like the GTX 560 and GTX 770) were nowhere near hitting the 85 fps refresh at higher resolutions, and yet the motion still looked pretty clear. Is it because of how the monitors scanned out the image? I.e., would rolling-scan monitors lower this threshold?


This article answers a lot of those questions

https://www.blurbusters.com/faq/oled-motion-blur/

Due to the way CRTs render, they avoid sample-and-hold motion blur. This is what LCDs with strobing backlights attempt to replicate, and the new LG OLEDs have a rolling-scan black frame insertion that does a similar thing, though its goal is more to avoid the drop in brightness that occurs with full black frame insertion and strobing. CRTs didn't go very bright in the first place, so it wasn't an issue for them.

Without BFI, even an OLED with instant response times has motion blur, because of the way our brains interpret the images.

Yes, and BFI is essentially PWM, which is eye-fatiguing, especially at lower refresh rates - i.e., the VRR/G-Sync/FreeSync roller coaster of Hz+fps rates on monitors that attempt to combine BFI with VRR so people can still use VRR to push their graphics settings and resolutions higher. BFI and strobing also dim the screen and mute color vibrancy, which makes them incompatible with HDR color heights.

Using a more advanced form of interpolation on very high Hz monitors in the future would reduce blur (at something like 360fps at 360Hz, or 480fps at 480Hz) and eventually essentially eliminate it at 1000fps at 1000Hz - without having to resort to BFI/strobing/rolling scan.


BFI and strobing are often quoted as lowering brightness by around the same percentage as they reduce blur. For example, a 20% blur reduction -> a 20% reduction in brightness; a 75% blur reduction -> a 75% reduction in brightness. That is a huge hit to screen brightness capability and will not work with HDR color brightness ranges. OLEDs are already limited in brightness and sustained brightness per %-window.

However, BFI/strobing could also benefit from very high frame rates via interpolation combined with very high Hz, if the black frames/strobes could keep up - with OLED response times, for example. The lower the strobe rate, the more aggravating the PWM effect.

https://forums.blurbusters.com/viewtopic.php?t=3213
OLED rolling scan can work with HDR, but the chief problem is OLED light output. Rolling scan will reduce peak lumen output, and many HDR specifications have minimum light output specs -- IIRC, one of the specs (Dolby Vision, I think) specifies something like 500 nits for OLEDs and 1000 nits for LCDs.

The more refresh rate, the easier it will be to keep brightness during rolling scan, due to more rolling scan passes (of the same persistence -- "ON" duty cycle).
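The duty-cycle arithmetic behind both points, as a simple illustrative model (assuming light output scales linearly with the "ON" fraction of each refresh):

```python
def duty_cycle(persistence_ms, hz):
    """Fraction of each refresh cycle the pixel (or backlight) is lit."""
    return persistence_ms * hz / 1000.0

# The same 1 ms persistence costs less brightness at higher refresh rates,
# because there are more (equally long) lit passes per second.
for hz in (120, 240, 360):
    d = duty_cycle(1.0, hz)
    print(f"1 ms persistence at {hz} Hz: {d:.0%} duty -> {1 - d:.0%} brightness loss")
# 120 Hz -> 88% loss, 240 Hz -> 76% loss, 360 Hz -> 64% loss. Blur reduction vs.
# full sample-and-hold tracks the same percentage, matching the 1:1 trade above.
```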


Either way, a much better form of interpolation combined with very high refresh rates seems to be the only way forward, given GPU limitations. https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/
 
This article answers a lot of those questions

https://www.blurbusters.com/faq/oled-motion-blur/



Yes, and BFI is essentially PWM, which is eye-fatiguing, especially at lower refresh rates - i.e., the VRR/G-Sync/FreeSync roller coaster of Hz+fps rates on monitors that attempt to combine BFI with VRR so people can still use VRR to push their graphics settings and resolutions higher. BFI and strobing also dim the screen and mute color vibrancy, which makes them incompatible with HDR color heights.

Using a more advanced form of interpolation on very high Hz monitors in the future would reduce blur (at something like 360fps at 360Hz, or 480fps at 480Hz) and eventually essentially eliminate it at 1000fps at 1000Hz - without having to resort to BFI/strobing/rolling scan.


BFI and strobing are often quoted as lowering brightness by around the same percentage as they reduce blur. For example, a 20% blur reduction -> a 20% reduction in brightness; a 75% blur reduction -> a 75% reduction in brightness. That is a huge hit to screen brightness capability and will not work with HDR color brightness ranges. OLEDs are already limited in brightness and sustained brightness per %-window.

However, BFI/strobing could also benefit from very high frame rates via interpolation combined with very high Hz, if the black frames/strobes could keep up - with OLED response times, for example. The lower the strobe rate, the more aggravating the PWM effect.

https://forums.blurbusters.com/viewtopic.php?t=3213



Either way, a much better form of interpolation combined with very high refresh rates seems to be the only way forward, given GPU limitations. https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/

Totally... Hopefully this information becomes popular enough for customers & manufacturers to get it!
 
LCD too, surprisingly. Remember, 144Hz LCD was a pipe dream back in the era of the old 33ms 60Hz LCDs of the early-to-mid 1990s.

I have information that LCD will be able to go into the kilohertz territory within this human generation (aka 2030s).

However, OLED and microLED can join the party too. Just saying LCD's going to be a horse in this race for decades. Surprisingly so.

Fortunately, LCD blacks are solvable
I've even seen a cheap million-zone local-dimming "backlight" (HiSense Dual-Cell LCD at CES 2020), so don't dismiss LCD's ability to do halo-free blacks either. It looked better than some of the OLEDs I saw. I saw lots of impressive new display technologies at CES 2020, in person.

Yes, GtG needs to keep getting faster
The problem is that GtG (for the whole GtG heatmap of all color combos) needs to be reliably well under half a refresh cycle in order not to dilute or interfere with the high Hz. But we've fallen from 33ms-50ms all the way down to 0.2ms-0.5ms GtG (for the 10%->90% segment). While progress is slowing, it is not stopping there either -- there are already engineering paths to speed this up (and make it more consistent across all colors, too).
I'm always disappointed in display news coming out of CES, as everyone reports the same "top 3" stories. If you can, please tell us more about some of these future directions you saw; I, for one, am most intrigued.
Dual-layer LCDs and high-zone-count backlight solutions could definitely address some of today's LCD problems. What I wonder about is LCD crystal twisting/switching speed, which has improved very, very slowly over the last decade; most of the gains seem to have come through clever compensation schemes, not actual performance improvements via new materials or manufacturing ideas. Anything new on that front?
 
Dual-layer LCDs have been shown at something like the past three CESes. Syncing two LCDs while maintaining good input latency and response times - especially at the level required for gaming, which demands much tighter timings than TVs - seems to have been too challenging for manufacturers so far. Don't forget that just getting one LCD's backlight to respond fast enough (PG27UQ and successors) took Nvidia and Asus something like two and a half years.
 