ASUS TUF Gaming VG32VQ: World’s 1st display with concurrent motion blur reduction & Adaptive-Sync

Necere

Specs
  • 32"
  • VA panel
  • 2560x1440
  • 144Hz
  • ELMB-SYNC (Adaptive Sync + motion blur reduction)

ASUS press release
TFT Central newspost




Interesting display in that it supports variable refresh rate and strobing active at the same time. This has been done experimentally before with varying degrees of success, but this is the first monitor to officially support it.
 
I hope the MBR doesn't significantly reduce the brightness. That's usually why I don't use it.

Logically, it must always reduce brightness. On an HDR display, though, depending on the power of the HDR-capable backlight, it could still end up quite bright compared to previous SDR monitors doing strobing.
 
Yes, finally! Any mention of a release date?

I guess I will have to choose between this and OLED once HDMI 2.1 cards come out. But if this comes out before HDMI 2.1 cards, I will probably just buy it.
 
This is wonderful news. I am already happy with the ULMB on my ASUS PG278Q. I really hope this tech becomes available on other ASUS screens like the upcoming XG438Q.
 
Motion blur is such a catastrophic hit to image quality that games will still look better on a screen like this than they would on a good OLED.
 
Now THAT is exciting news. Not sure I want 1440p + 32" though so I may wait for other models. Really glad it's happening and officially supported this time.
 
Definitely going to buy this and try it out. I also wonder whether the microcontroller that pulses the backlight will adjust each pulse's light intensity. With a large swing in VRR, the faster refresh rates will make the screen appear brighter, and the screen will get dimmer as the FPS drops. Will be interesting to see.
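For anyone curious, here is a minimal sketch of the relationship being speculated about: perceived strobe brightness scales with duty cycle (pulse width times refresh rate), so with a fixed pulse width the picture dims as VRR drops the refresh rate. The pulse width and nit figures are made-up assumptions, not ASUS specs:

```python
# Perceived brightness of a strobed backlight is roughly proportional to its
# duty cycle: (pulse width) x (refresh rate). With one pulse per refresh and a
# fixed pulse width, the screen dims as the VRR refresh rate drops, which is
# what a compensating controller would have to correct for.
# All numbers below are hypothetical.

PULSE_WIDTH_MS = 1.5   # assumed strobe pulse length per refresh
PEAK_NITS = 400        # assumed backlight output while the pulse is lit

def perceived_nits(refresh_hz, pulse_width_ms=PULSE_WIDTH_MS, peak_nits=PEAK_NITS):
    duty_cycle = (pulse_width_ms / 1000.0) * refresh_hz  # fraction of time lit
    return peak_nits * duty_cycle

target = perceived_nits(144)  # brightness at the top of the VRR range

for hz in (144, 120, 100, 80, 60):
    base = perceived_nits(hz)
    # Pulse intensity needed to hold the 144 Hz brightness as the rate drops:
    boost = target / ((PULSE_WIDTH_MS / 1000.0) * hz)
    print(f"{hz:3d} Hz: ~{base:5.1f} nits perceived, needs ~{boost:4.0f} nit pulses to match 144 Hz")
```

If those assumptions are even roughly right, the controller would have to widen or intensify the pulse at lower refresh rates to keep brightness steady across the VRR range.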
 
https://edgeup.asus.com/2019/tuf-ga...n-and-adaptive-sync-in-a-variety-of-monitors/

They are all listed at 350 - 400 nit "HDR 10". An HDR color brightness range of 350 - 400 nits isn't even HDR 1000 (which itself is fractional HDR next to HDR 4000 or HDR 10000 color-ceiling mastered content). That is an SDR peak brightness range to start with, and then you knock the brightness down further by turning on strobing. Also consider that unless you are maintaining 100fps at the low end with VRR, anything dipping under 100fps-Hz is going to show the strobing flash. For some people even 100Hz strobing is eye-fatiguing after a while.

IMO these would have to start out much brighter for SDR content and many times brighter for HDR content. You'd also have to be willing to pay for the GPU power and stomach lowering your game settings enough to get 100fps-Hz or better on the low end of your game's frame rate graph (not a 100fps-Hz average). For some people even 100 would induce PWM-like eyestrain after a while.
 
For some people even 100 would induce PWM-like eyestrain after a while.

This really depends on how well the backlight strobing is implemented.

For me, 60Hz on a CRT was headache-inducing. 75Hz was tolerable, 85Hz was usable all day, and 100Hz+ was golden.

But this isn't CRT strobing; given that the effects that sharp (square-wave shaped) strobes can have on people are well known, it's possible that the strobing may be massaged to not be offensive.
 
I've heard people's reports of strobing cutting their peak brightness by 2/3. If that were the case, for this to do HDR 1000 + strobing it would have to do 3000 nit peak color brightness with strobing off. For 350 nit SDR it would have to do 1050 nit peak brightness with strobing off. As it is, according to that site these are 350 - 400 nit with strobing off depending on the model, so HDR is just a fake label.
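To spell out that arithmetic (the 2/3 loss is hearsay, not a measured figure), a minimal sketch:

```python
# If strobing cuts brightness by some fraction, the panel's non-strobed output
# has to be high enough that what's left still hits the target.
# The 2/3 loss figure below is the poster's assumption, not a measurement.

def required_unstrobed_nits(target_nits, strobe_loss=2/3):
    return target_nits / (1 - strobe_loss)

print(required_unstrobed_nits(1000))  # HDR 1000 target -> ~3000 nits unstrobed
print(required_unstrobed_nits(350))   # 350 nit SDR target -> ~1050 nits unstrobed
```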

Most people avoid PWM like the plague.
With variable refresh rate, you'd have to have a typical LOW of 100fps to maintain 100fps-Hz strobing, not an average of 100fps.
Since typical frame rate graphs tend to be +/- 30fps from the average for the most part, that could mean a 130fps average for a mostly 100 - 130 - 160 fps graph. Even then, 100Hz strobing and variable-rate strobing over hours of game time could fatigue people's eyes, and some people are more sensitive to flicker than others.

People use G-sync/VRR in order to push graphics settings higher while riding a frame rate range without hiccups. The desire for very high to ultra game settings, relying on VRR to smooth things out, is in direct opposition to the very high frame rates required for high-Hz strobing, and on top of that comes the very high peak color brightness required for even fractional HDR, so this kind of strobing seems very situational usage-wise. At least 1440p makes higher frame rate lows more within reach.

I could see this tech being much more useful if someone developed very high quality interpolation to go along with it, doubling or tripling the typical frame rates, combined with a very high peak brightness to start with, like a Q9FN's 1800 - 2000 nit peak color brightness or higher. (Those Q9FNs also have interpolation and black frame insertion, I believe, but I'm not sure of the quality vs. artifacts. Those flagship TVs are still only a 60Hz 4K line without HDMI 2.1 for now, but they can do 1440p non-native at 120Hz with VRR/FreeSync from AMD GPU PCs and VRR from the Xbox One.)

With theoretical very high quality interpolation that avoided input lag and artifacts, you could run a 70fps graph of something like 40 - 70 - 100 fps interpolated x3 to 120 - 210 - 300 fps, or multiplied even more. That way the strobing could be very fast. Incidentally, if we ever get 1000Hz, super-high-response-time displays with interpolation of something like 100fps x 10 for 1000fps at 1000Hz, we wouldn't even need strobing, since there would only be 1 pixel of sample-and-hold blur, just like a CRT... essentially "zero" blur.
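A rough sketch of the persistence math behind that "1 pixel of blur" figure, assuming ideal sample-and-hold behavior and steady eye tracking; the panning speed is an arbitrary example:

```python
# Perceived motion blur on a sample-and-hold display is roughly the eye-tracking
# speed multiplied by how long each frame stays lit (persistence).
# Strobing shortens persistence to the pulse width; higher refresh rates
# shorten it too.

def blur_px(tracking_speed_px_per_s, persistence_ms):
    return tracking_speed_px_per_s * (persistence_ms / 1000.0)

speed = 1000  # example panning speed in pixels per second

print(blur_px(speed, 1000 / 60))   # 60 Hz sample-and-hold  -> ~16.7 px of blur
print(blur_px(speed, 1000 / 144))  # 144 Hz sample-and-hold -> ~6.9 px
print(blur_px(speed, 1.0))         # 1 ms strobe, or 1000 Hz -> ~1 px, the near-CRT case
```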



The material below predates combined VRR + strobing, but most of the tradeoffs still apply.


==================================
Easier-to-render games with very high fps work pretty well with ULMB.
Running 1440p or higher res with any kind of high to ultra settings in the most demanding games won't let you sustain high fps, only average it.

As per blurbusters.com's Q&A:

-----------------------------------------------------
Q: Which is better? LightBoost or G-SYNC?
It depends on the game or framerate. As a general rule of thumb:
LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
G-SYNC: Better for games that have lower/fluctuating variable framerates.

This is because G-SYNC eliminates stutters, while LightBoost eliminates motion blur. LightBoost can make stutters easier to see, because there is no motion blur to hide stutters. However, LightBoost looks better when you’re able to do perfect full framerates without variable frame rates.

G-SYNC monitors allow you to choose between G-SYNC and backlight strobing. Currently, it is not possible to do both at the same time, though it is technically feasible in the future.
......
Main Pros:
+ Elimination of motion blur. CRT perfect clarity motion.
+ Improved competitive advantage by faster human reaction times.
+ Far more fluid than regular 120Hz or 144Hz.
+ Fast motion is more immersive.

Main Cons:
– Reduced brightness.
– Degradation of color quality.
– Flicker, if you are flicker sensitive.
– Requires a powerful GPU to get full benefits. <edit by elvn: and turning down settings a lot more at higher resolutions>

--------------------------------------------------------
During regular 2D use, LightBoost is essentially equivalent to PWM dimming (Pulse-Width Modulation), and the 2D LightBoost picture is darker than non-LightBoost Brightness 100%.
--------------------------------------------------------
Once you run at frame rates above half the refresh rate, you will begin to get noticeable benefits from LightBoost. However, LightBoost benefits only become major when frame rates run near the refresh rate (or exceeding it).
-------------------------------------------------------
If you have a sufficiently powerful GPU, it is best to run at a frame rate massively exceeding your refresh rate. This can reduce the tearing effect significantly. Otherwise, there may be more visible tearing if you run at a frame rate too close to your refresh rate, during VSYNC OFF operation. There can also be harmonic effects (beat-frequency stutters) between frame rate and refresh rate. For example, 119fps @ 120Hz can cause 1 stutter per second.
Therefore, during VSYNC OFF, it is usually best to let the frame rate run far in excess of the refresh rate. This can produce smoother motion (fewer harmonic stutter effects) and less visible tearing.
Alternatively, use Adaptive VSYNC as a compromise.
-------------------------------------------------------------
Pre-requisites
Frame rate matches or exceeds refresh rate (e.g. 120fps @ 120Hz).

  1. LightBoost motion blur elimination is not noticeable at 60 frames per second.

--------------------------------------------------------------

Sensitivity to input lag, flicker, etc. (You benefit more if you don’t feel any effects from input lag or flicker)

Computer Factors That Hurt LightBoost

  • Inability to run frame rate equalling Hz for best LightBoost benefit. (e.g. 120fps@120Hz).
  • Judder/stutter control. Too much judder can kill LightBoost motion clarity benefits.
  • Framerate limits. Some games cap to 60fps, this needs to be uncapped (e.g. fps_max)
  • Faster motion benefits more. Not as noticeable during slow motion.
  • Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
  • Some games stutter more with VSYNC ON, while others stutter more with VSYNC OFF. Test the opposite setting.
-----------------------------------end-of-blurbuster's-quotes--------------------------
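As a side note on the "119fps @ 120Hz can cause 1 stutter per second" line quoted above: with VSYNC OFF, the visible hitch repeats at roughly the beat frequency between frame rate and refresh rate. A minimal illustration:

```python
# When the frame rate sits just off the refresh rate, frames slowly drift
# against the refresh cycle, and a duplicated or torn frame appears at roughly
# the beat frequency |refresh rate - frame rate|.

def stutters_per_second(refresh_hz, framerate_fps):
    return abs(refresh_hz - framerate_fps)

print(stutters_per_second(120, 119))  # -> 1 visible hitch per second
print(stutters_per_second(120, 115))  # -> ~5 per second
```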
 
I've heard people's reports of strobing cutting their peak brightness by 2/3. ...

It's an arcade cabinet monitor. Nothing more.
 
I think I would go with the 27" IPS or maybe the TN. The quicker response time could make a big difference for the ELMB.

No release date yet. If HDMI 2.1 cards come before them I'll probably just get an OLED instead.
 
I'm in the market for a 144Hz 1440p monitor. I've been looking at the MAG321CQR (32" VA) from MSI but am still a bit worried about how bad the PPI drop will look compared to my 27" 4K monitor. Not sure I want another 27" monitor though. Definitely don't want any more 60Hz.

Also, since my gaming monitor is a 27" 144Hz 1080p TN panel, I'm worried about how much "less smooth" the VA panel will look.

How much do you guys think these monitors will be? I'm aiming for $450 or less, hopefully. Also assuming they aren't way off into the distant future, because I'd like to get a new monitor with my upcoming PC overhaul in ~July.

Are there any existing monitors that you guys would rate "really good" in the 144Hz 32" area? I'm not into ultrawide; the lack of vertical height just doesn't suit me.
 
93 PPI... :vomit:


I'm in the market for a 144Hz 1440p monitor. ... Are there any existing monitors that you guys would rate "really good" in the 144Hz 32" area?



Having owned both, I much prefer 1440p at 32" over 27" for gaming. The LG 32GK850G was a godsend in late 2017. No way I'd go back to a glowing, low-contrast IPS or a 27" panel. Sure the text on web pages looks a little soft but I got used to it in about a month. For gaming, though, it is a perfect size/res combination IMO. The immersion factor is just better with the larger panel.
 
Sure the text on web pages looks a little soft but I got used to it in about a month.

You can buy a cheap IPS to set off to the side; I paid US$100/each for a pair of Acer 24" 1080p IPS monitors that I put on a vertically stacked stand. They're perfect for web pages.
 
At that price?

Wow. I want one, but I'll wait for more thorough impressions.

Missing from the specs are the sync range for VRR and confirmation that VRR is available over both HDMI and DP. Neither should be a problem, but it'd be nice to see it in writing.

It's only $440. That's half the price of the first 2560x1440 144Hz G-Sync monitor.
 
It's only $440. That's half the price of the first 2560x1440 144Hz G-Sync monitor.

I did mean to express surprise at the low price, but I can see how my post doesn't succinctly translate to that, so yes, I am surprised that it's only US$440. For the feature set, I was also expecting it to be TN at that price, and I'm surprised that it's a VA panel.

Of course, I have a '165Hz' VA G-Sync 32" 1440p panel right now, which I only paid slightly more for. And if these panels are related, well, I'm not hopeful for ASUS' new creation.

However, if it's a new development, I'm interested to see what the community thinks of these all around.
 
By the way, it's not like this monitor has a low price. It's more that monitor prices have been absolutely INSANE for the past few years and everyone just has Stockholm syndrome.

It's completely retarded that tiny computer monitors are the same price IF NOT MORE than 55"+ TVs. Give me a fucking break.

At least AU Optronics is on record more or less saying that gamers will pay anything, so it's okay to release overpriced products targeting them. People need to stop lapping this shit up.

"Considering the production cost of Mini LED is still relatively high, AUO will first launch the Mini LED-lit gaming monitor, according to Tsai. The company expects customers in the gaming monitor segment to have a higher price tolerance."

https://www.ledinside.com/news/2018...roduce_mini_led_to_automotive_and_vr_displays
 
It's completely retarded that tiny computer monitors are the same price IF NOT MORE than 55"+ TVs. Give me a fucking break.

Economies of scale are massively different, in favor of televisions.

Example: I picked up a pair of IPS 1080p60 monitors for US$100 / each a few years back. These are generally representative more or less of your typical consumer desktop monitor. And they're good monitors!

Now consider how many people actually buy desktop monitors, and how many of those are buying 'nicer' monitors (larger / better color / higher refresh rates / VRR), and compare that to your typical 55" TV buyer.

Yeah, in absolute per-unit materials cost, the monitor should be cheaper, but so many more TVs are made that the cost savings are extremely broad for so many things.
 
Yeah, in absolute per-unit materials cost

This is usually the disconnect with all pricing. Materials cost is a small percentage of the product. Human labor, whether it be actual manufacturing, the *setup costs* of manufacturing, R&D, marketing, company administration, etc., is >80% of the cost of most products. And a lot of labor costs are more or less directly spread over the number of units, so when you're selling few units the cost per unit is much higher, not just a little bit higher.

A 55" TV panel and a 27" IPS LCD are both one unit, the fact that the actual TV panel is just cut larger than the IPS LCD doesn't mean that much cost difference. All the costs listed above, and the actual assembly process take the same amount of time and money, but when you're selling 20-100x fewer units of the monitor, well...
 
Economies of scale are massively different, in favor of televisions. ...

I don't buy that argument given how vanilla and cookie-cutter PC monitors are. They don't have unique and exotic panels or sizes. They're all the same shit. Every monitor released is based on like 4 different panels. Even the first batch of 32:9 shit was essentially just 4K panels cut in half.

Economies of scale should be in full effect on these monitors. I could see it if there were 24", 25", 26", 27", 28", 29", 30", 31", 32", 33", etc. monitors, but no, not really. 99% of them basically come in 4 different sizes, and of those, most of them use the same panels. They're basically all the same crap.
 
They are charging high premiums for early adoption of new features in some cases, but mostly it's the exclusivity of DisplayPort 4K 120Hz ahead of the HDMI 2.1 roadmap, plus offering smaller, desk-sized displays packed with features.

The first slim-style VA TVs, OLEDs, HDR color brightness capability... the first 120Hz monitors and G-Sync monitors were a little pricey too by comparison to others at the time (and aside from that, a good FALD implementation has always been expensive in TVs).
Back when I bought a 2560x1440 glossy Apple IPS, before the Korean B-grade knockoffs came out, they were $1100 - $1300, and the 2560x1600 30" Dells were expensive back then too. And that was in that year's dollars, so it would be a bit more adjusted for inflation, roughly +$250 to $300 in today's dollars. I got my Samsung S27A750D 120Hz 1080p on sale for $400 in 2011 (it was going for $550+ normally), which is about $450 in today's dollars, or $640+ non-sale in today's dollars. My ROG Swift PG278Q was $808 in 2014 (~$872 in today's dollars).

I will agree that they are charging a lot for this latest crop of monitors, though. Personally I think that a 65" full-featured 120Hz+ 4K gaming display with VRR and 1000 nit FALD for real HDR capability (with trade-offs) should be up to $2500, a 43" maybe $1000 - 1500 with 1000 nit FALD + HDR + VRR, and 27 - 32" ones with VRR and FALD + HDR 1000 maybe $800 - $900 with full features. Without 1000 nit or higher 384-, 512-, or 1000-zone FALD (or OLED) and a G-Sync module, they shouldn't cost as much as that IMO.

They are essentially cashing in on a whole other generation or so of GPUs and monitors before HDMI 2.1. HDMI 2.1 will have VRR, 120Hz 4K, and QFT (quick frame transport) for low input lag. I'm not saying that HDMI 2.1 hardware will be cheap at release, but I feel like they are cashing in on the exclusivity of DisplayPort high-Hz 4K for now, as has been said a lot of times in these threads.
The only other thing they are providing vs. later HDMI 2.1 displays is smaller sizes at 27" 16:9 and 35" ultrawide. I don't know if any 43" TV has ever had FALD (let alone HDR 1000 FALD) either, and OLED doesn't come that small, so it will most likely be 55" - 65" for the FALD or OLED full-featured HDR-capable HDMI 2.1 TVs. So there will probably still be a size exclusivity for gaming monitors, which they will charge a premium for, but we'll have to see what becomes available within the next few years. Unless they put 1000 nit+ FALD arrays in similarly sized gaming monitors, I wouldn't buy one over an HDMI 2.1 TV though. For example, a quality 43" FALD VA TV with 120Hz 4K HDMI 2.1 VRR vs. a similar 43" FALD VA 120Hz 4K VRR gaming monitor (once HDMI 2.1 output GPUs are out).

I'll consider buying a 43" non-FALD 4K 120Hz gaming monitor in the meantime though, especially if I catch a sale price, and run that for a few years until HDMI 2.1 is in full swing and I get on GPUs with HDMI 2.1 and a die shrink. The 43" "local dimming"-lit, non-FALD models will have mediocre or worse pseudo-HDR, but otherwise they should be good, especially for SDR content.

I'm hoping for dual-layer LCD tech (a second, monochrome LCD acting as a pixel-level backlight and second light filter/blocking layer) to become a thing in gaming monitors at some point, until micro-LED ultra-high-density FALD arrays come out much later, but that's a ways off yet.
 
Interested, and a great price point to boot. These non-OLED $2K+ monitors have lost their minds.
 
I wonder when the 24" IPS and TN versions come out. I'm not sure which one I want. Someone grab this one and let us know how it is.
 
I wonder when the 24" IPS and TN versions come out. I'm not sure which one I want. Someone grab this one and let us know how it is.

27" not 24". TFTC has gotten their 27" IPS version for review. Personally I'm eyeing the TN version if it can improve crosstalk over the VA/IPS options.
 
When you finally get a comfortable chair and game in the right position, you can easily use a 32" - 42" or bigger monitor, because you are no longer leaning over your desk but reclined back in a Capt. Kirk-like position. Looking at the many systems in the pic threads, you can see those who use big monitors always have a big desk, a full-on ergo chair, etc.

That said, I do have a few 27" I keep around for pick-up lans and weekenders.
 
This seems like a really awesome arcade cabinet monitor to me. That's where its feature set will really shine. Those games have a very low motion blur tolerance.
 
Amazon still says ships within 1 to 2 months, which I hope is bullshit.

I'd never buy anything from B&H. They've got a crap return policy. It's more likely that you'll get a defective monitor--especially coming from Asus--than not. Makes no sense to buy stuff like this from B&H.
 