Why OLED for PC use?

We're never going to agree on this. I have no idea how you generated the 4th image, nor do I care about images you grade yourself - I care about playing games in HDR, and those look great so far. Most games will tone-map to the monitor's capabilities, making this a non-issue. As has been said several times, yes, higher brightness capabilities would be even better, but given that's not a reality, and given that I prefer the OLED over FALD for 80%+ of what I do since I work mostly in SDR, I think the HDR capabilities it does have are great. Do they match FALD? No. Are they good enough to play games and have an enjoyable HDR experience? Absolutely, and that'll only get better as technology advances.

I view sRGB as sRGB because it's the most natural/accurate way and looks better to me because of those factors (I've tried wider gamut and I've tried AutoHDR; both look less natural to me, regardless of monitor - others may like it and that's cool, but it's not for me). Your "better" is simply NOT better to me, my preferences, and my valuations. What looks better to me is better to me, and what looks better to you is better to you - unless we're comparing objective things like accuracy, that's all subjective.
It's not my business if you always want to see worse sRGB.

What's the 4th image?

I'll give you a hint. It's another screenshot from the same game, except this time I turned on HDR.
 
It's not my business if you always want to see worse sRGB.

What's the 4th image?

I'll give you a hint. It's another screenshot from the same game, except this time I turned on HDR.

"Worse" is only your opinion. Many, may colorists and people in the industry would disagree with you.

Okay, so is it native HDR? I don't get your point.
 
Windows may know today but I'm talking about BACK THEN. Look at the date of the original post, this was a known issue that Windows WILL default to a really high peak brightness if it detects no HDR certification. You literally say it yourself, AutoHDR works with the dynamic range of the monitor, so what if AutoHDR has no idea what the dynamic range of the monitor is? Then it goes to 1500 nits. I thought it was 2000 but it seems to be 1500 as the default value.

https://www.techpowerup.com/forums/threads/hdr-peak-brightness.288568/

And again, the only way to fix this back then was to use CRU:

"Use custom resolution utility, open the CTA-861 extension block and look for an HDR Static Metadata block. If there isn't one then your monitor isn't telling Windows through EDID what the max/avg/min luminance is in which case it will default to 1499 nits. You can actually add metadata manually to trick windows into showing custom luminance values. I believe it uses this to scale auto-hdr, though HDR isn't really worth using on that monitor."

So yes it IS/WAS an AutoHDR problem and not an X27 ABL issue.
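For anyone wondering what Windows is actually looking for in the EDID, here is a rough sketch (my own illustration, not from the linked thread) of how the luminance fields in a CTA-861.3 HDR Static Metadata Data Block decode; the code values below are made-up examples. If the block is missing entirely, there is nothing to decode, which lines up with the ~1499-nit fallback described above.

```python
# Rough sketch (not from the linked thread) of the CTA-861.3 luminance coding
# found in an EDID's HDR Static Metadata Data Block. Code values are examples.

def max_luminance_nits(cv: int) -> float:
    """Desired Content Max / Max Frame-Average Luminance: 50 * 2^(CV/32) cd/m^2."""
    return 50.0 * 2.0 ** (cv / 32.0)

def min_luminance_nits(cv: int, max_nits: float) -> float:
    """Desired Content Min Luminance: max * (CV/255)^2 / 100 cd/m^2."""
    return max_nits * (cv / 255.0) ** 2 / 100.0

max_cv, min_cv = 115, 80                      # hypothetical code values, not from a real EDID
peak = max_luminance_nits(max_cv)
print(f"declared peak ~{peak:.0f} nits, min ~{min_luminance_nits(min_cv, peak):.3f} nits")
```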
Windows doesn't report wrong for your X27. It knows your max CLL; it just doesn't know whether the max CLL is overridden or not.

You can get into the same game, same scene. Your X27 will still clip regardless at that 20% max brightness.
 
"Worse" is only your opinion. Many, may colorists and people in the industry would disagree with you.

Okay, so is it native HDR? I don't get your point.
So you know how grading works? These colorspaces are transitional. They are not drama.

What's the 4th image lol.
 
You don't know how grading works. These colorspaces are transitional. They are not drama lol.

What's the 4th image?

I know what you're doing is custom grading according to your preferences, but beyond that, no.

LOL. I have no idea as you seem to be enjoying not making a decipherable point, so you tell me.
 
I know what you're doing is custom grading according to your preferences, but beyond that, no.

LOL. I have no idea as you seem to be enjoying not making a decipherable point, so you tell me.
I like how you keep lying to yourself to just defend sRGB 80nits.

I always said very specifically that I turned on HDR in the 4th image. There is an option in the game that says "enable HDR" lol.

What's the 4th image?



[attached image: 52751179067_7a9a569622_o_d.png]
 
As far as calibrating your display to sRGB, I would say it's impossible to say with 100% certainty what 100% of people will do. However, if you've bought, as an example, an X-Rite colorimeter, it leads you down the "correct" path without having to have much knowledge.

I find it unlikely that there are people who have an interest in display technology or film, who spend $200+ on a colorimeter, actively choose the wrong profiles, and have never read or figured out even the most basic information about the color spaces of what they're watching. This is especially true in light of the fact that every site such as RTINGS constantly talks about the color space accuracy of TVs/displays and names the color spaces they're striving for accuracy in.
People might care about accuracy and buy the devices but not care enough to learn much about any of it. Then they think their colors are accurate, but whether they actually are matters far less to them than believing they have accurate colors and being able to boast that they're in the elite club of glorious accurate colors, unlike those dirty uncalibrated peasants.

My post's purpose was to highlight the issue that sRGB is not Rec.709, with the intention that someone using sRGB might start thinking about whether they're using the right calibration profile.
If I were to guess at the purpose of your posts, it would be to boast about being a professional colorist and to appear as someone that people who precisely do not want to learn anything might want to listen to.
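To make the "sRGB is not Rec.709" point concrete: the two share the same primaries and white point, but the display-side tone curves differ, and that is exactly what the choice of calibration profile affects. A minimal sketch (my own illustration) comparing the sRGB EOTF with a plain gamma-2.4 curve, which is roughly what BT.1886 reduces to on a display with a very low black level:

```python
# Minimal sketch (not from this thread): sRGB EOTF vs a gamma-2.4 curve.
# Shows why an sRGB-calibrated monitor renders Rec.709-graded video with
# slightly lifted shadows compared to a gamma-2.4 / BT.1886 calibration.

def srgb_eotf(v: float) -> float:
    """sRGB electro-optical transfer function (IEC 61966-2-1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma24_eotf(v: float) -> float:
    """Simple gamma 2.4, approximating BT.1886 at zero black level."""
    return v ** 2.4

for code in (16, 32, 64, 128, 235):
    v = code / 255.0
    print(f"code {code:3d}: sRGB {srgb_eotf(v):.4f}  gamma2.4 {gamma24_eotf(v):.4f}")
```

The difference is largest in the shadows, which is where the two calibration targets visibly diverge.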

I also disagree with the concept that because it's possible for people to do things incorrectly, they should then simply do whatever they want rather than strive for accuracy. Though I covered that above. To retread: people of course can choose to do whatever they want, like using an EQ. But I would say that's suboptimal.
Yeah sure, not looking for signs of wrong calibration, just assuming everything is super accurate, and assuming that a $200 toy automatically gives you all the required knowledge is the way to go and will always get you the most accurate colors.
And if something looks off, then it's always glorious "artistic intent"
 
People might care about accuracy and buy the devices but not care enough to learn much about any of it. Then they think their colors are accurate, but whether they actually are matters far less to them than believing they have accurate colors and being able to boast that they're in the elite club of glorious accurate colors, unlike those dirty uncalibrated peasants.

My post's purpose was to highlight the issue that sRGB is not Rec.709, with the intention that someone using sRGB might start thinking about whether they're using the right calibration profile.
If I were to guess at the purpose of your posts, it would be to boast about being a professional colorist and to appear as someone that people who precisely do not want to learn anything might want to listen to.
Most off-the-shelf calibration devices more or less default to the correct "everything". Users more or less only have to click "next" a bunch of times, put the calibrator on the screen, and then save the profile at the end. It only gets slightly more complicated if they have to open menus to properly balance brightness, primaries, and contrast to meet the target profile.

Even if the user can't figure out how to do any of those things, the profiler will still do the best it can via software only. Sub-optimal, but will at least get the user closer.
Yeah sure, not looking for signs of wrong calibration, just assuming everything is super accurate, and assuming that a $200 toy automatically gives you all the required knowledge is the way to go and will always get you the most accurate colors.
And if something looks off, then it's always glorious "artistic intent"
Okay. So the solution is to not care, not learn anything, and do whatever you want.

I'm really not sure what your point is other than: it's too complicated, so therefore no one should care.
 
I like how you keep lying to yourself to just defend sRGB 80nits.

I always said very specifically that I turned on HDR in the 4th image. There is an option in the game that says "enable HDR" lol.

What's the 4th image?

I'm not lying to myself. I prefer using sRGB for SDR. It's my preference because it looks best *to me*, and that's a fact. Your unwillingness to accept it isn't really my issue.

The 4th image, as I said above, appears to be native HDR. What's your point? BTW, I just downloaded the game since I'm on Gamepass and it's included. Looks spectacular in native HDR on my monitor. Again, I have no idea what point you're even trying to make.
 
Personally, I enjoy seeing a colorist's perspective. I'm definitely no professional, but I did spend some money to get a colorimeter and calibrate my TV via CalMan back when I got it, and it was definitely worth it to me.

That said, most TVs and many monitors have at least decent calibrations out of the box, and some better than decent (some Asus monitors, for example, come with a calibration sheet). But of course you have to know the right mode to put it in to get that. I tend to run into two kinds of people. The first just tend to be happy with the default picture, or might run through the presets and look for the one that looks good to them, and that's good enough. And then the others (I'm in this latter group) go through the settings and try to find the best accuracy, or might even get the equipment to do a calibration/hire a calibrator. There's no definitive right or wrong here IMO. It's just what matters to an individual. If accuracy matters, you're probably going to work to learn what you don't know to achieve at least a moderate degree of that. It's sort of like cooking. Some people are fine with making sandwiches and simple dishes to just get by, while other people really want to spend the time/effort to perfect complicated recipes they really love making. Nothing wrong with either - just different people having different priorities and choosing what to spend their time and effort on.

If we're judging by simplicity alone, there might as well not be things like HDR either, as that also adds complexity in the form of a bunch more settings, and in my experience, people get almost as confused by that stuff as by calibration (not quite, and it's getting better, but close). I know people who don't want to dip their toes into HDR because they feel lost by it and they'd rather just set-it-and-forget-it with good old SDR. Personally, I'm definitely a fan of HDR, as I am of calibration as well, but I can understand the other perspectives too.
 
Most off-the-shelf calibration devices more or less default to the correct "everything". Users more or less only have to click "next" a bunch of times, put the calibrator on the screen, and then save the profile at the end. It only gets slightly more complicated if they have to open menus to properly balance brightness, primaries, and contrast to meet the target profile.
That will work for some programs, but it can introduce banding in some hardware configurations, can reduce contrast ratio, and is generally not the right way to approach calibration.
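For what it's worth, the banding risk is easy to illustrate. A toy sketch (my own, with made-up panel/target numbers): push a software-only correction through an 8-bit video-card gamma table and count how many distinct grey levels survive the round trip.

```python
# Toy illustration (made-up gamma/gain numbers): a software-only correction
# requantised to an 8-bit 1D LUT collapses some input codes onto the same
# output code, which is where banding in gradients comes from.

def corrected(code: int, measured_gamma=2.4, target_gamma=2.2, channel_gain=0.88) -> int:
    """Re-shape the tone curve and pull the channel down for white point, then requantise."""
    v = (code / 255.0) ** (target_gamma / measured_gamma)
    return round(v * channel_gain * 255.0)

unique_levels = len({corrected(c) for c in range(256)})
print(f"{unique_levels} distinct output codes remain out of 256 inputs")
```

A hardware LUT inside the monitor, or a higher-bit-depth pipeline, avoids most of that loss, which is one reason hardware calibration is preferred when it's available.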

Even if the user can't figure out how to do any of those things, the profiler will still do the best it can via software only. Sub-optimal, but will at least get the user closer.
It can also bring the user to worse results than they started with, as nothing about these devices or the software side of things is perfect.

Okay. So the solution is to not care, not learn anything, and do whatever you want.
Never said that.
I already said it - you do not read posts you reply to!

I'm really not sure what your point is other than: it's too complicated, so therefore no one should care.
Again, not what I said
 
Then what's the 4th image? Your OLED cannot even show native HDR where it shoots up to 2000+ nits. You can see a 300-nit sun.

It's not like I cannot see sRGB. I can see much better than that in SDR, so why settle for just sRGB?

The lava is searing my eyes; what nits is that in?
 
That will work for some programs, but it can introduce banding in some hardware configurations, can reduce contrast ratio, and is generally not the right way to approach calibration.
Not sure what software you're using; the software I'm talking about does everything for you, including checking your monitor, selecting the correct display type (e.g., just a few: CCFL, WLED, etc.), gamma/gamut, defaulting to a 6500K white point, defaulting to 120 cd/m², etc.

If you're calibrating a laptop display, or anything that has automatic display control (ADC), then it will do all of the brightness/RGB/contrast adjustments for you. If you're on any Mac using any Mac display (whether desktop or laptop), all of which have ADC, then it covers all those displays as well.
It can also bring the user to worse results than they started with, as nothing about these devices or the software side of things is perfect.
In terms of all hardware display technologies, I won't claim to know the most on these boards - that title probably goes to CBB - but anyway, I don't really think this is possible. It's always going to bend your targets closer to more accurate in a mathematical way. And at the end you can also retest the profiles to show what your monitor is actually reproducing. I just say that to say: the reason why it goes through all the color patches and does a recheck is to ensure the placement of colors relative to one another is the "closest your monitor can reproduce", and if things fall outside of the targets, it also does its best to handle that in a visually "okay" way.

Hardware-wise, TVs are often calibrated before shipping these days. So as long as contrast/RGB isn't messed with, at the very minimum colors will be accurate after that calibration even if the brightness level is off (often the control that is messed with the most, even by lay people). However, if someone knows where that stuff is in the first place, when prompted by the calibration software all they have to do is "obey what it says". The colorimeter works in real time. As you turn brightness up, it displays the cd/m² in real time, tells you the target, and then shows an up/down arrow to make it pretty idiot-proof. And the same for both contrast and RGB.
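As a concrete example of that recheck step: the profiler measures a set of patches back and reports the colour error against the targets as delta E. A bare-bones sketch (my own; the targets are roughly the sRGB primaries in CIELAB, the "measured" numbers are invented):

```python
# Minimal sketch of a verification pass: compare target CIELAB patch values
# with what the probe measured, using plain CIE76 delta E. Values are examples.

import math

def delta_e76(lab1, lab2):
    """Euclidean distance in CIELAB (CIE76 delta E)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

targets  = [(53.2, 80.1, 67.2), (87.7, -86.2, 83.2), (32.3, 79.2, -107.9)]  # ~sRGB R, G, B
measured = [(53.9, 78.5, 66.0), (87.1, -84.9, 81.8), (33.0, 77.6, -105.4)]  # hypothetical readings

for t, m in zip(targets, measured):
    print(f"target {t} -> measured {m}: dE76 = {delta_e76(t, m):.2f}")
```

Real packages report more sophisticated metrics (dE2000 and per-patch statistics), but the idea of the validation pass is the same.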
 
I can attest that with my Sony TV, there was a learning curve to do a calibration and I did it a couple times before I was happy with the result, but it wasn't too bad and I was very pleased with the results - it was an investment because of the hardware and software.

The LG calibration software with the OLED monitor was very simple, and apart from getting ahold of a colorimeter (for whatever reason, the CalMan one I still had for my TV wasn't compatible and only works with their stuff), it would have been pretty difficult to mess up - certainly a lot simpler than the Sony TV solution and I can't think of many steps where mistakes could be made; it's also super quick. (That said, sRGB mode was pretty darn close (judging by eye) out of the box - I think the calibration just got me a bit closer as well as allowing me to set a nit level and target gamma).
 
The LG calibration software with the OLED monitor was very simple, and apart from getting ahold of a colorimeter (for whatever reason, the CalMan one I still had for my TV wasn't compatible and only works with their stuff), it would have been pretty difficult to mess up - certainly a lot simpler than the Sony TV solution and I can't think of many steps where mistakes could be made; it's also super quick. (That said, sRGB mode was pretty darn close (judging by eye) out of the box - I think the calibration just got me a bit closer as well as allowing me to set a nit level and target gamma).
Monitors that have hardware calibration capability usually work only with the few probes the manufacturer bothered supporting - and out of all the supported ones, only those that were actually tested on the given monitor, and for which the software has correction matrices, will actually provide good results.

The LG 48GQ900 calibrated very nicely using an i1 Display Pro Plus, but on the other hand I also calibrated an LG 27GP950 with the same software and probe (which should be good for this monitor!) and got terrible results - wrong white temperature, and colors were generally off. Only gamma tracking was correct, but it was also correct in sRGB mode, so no improvement there.
If I didn't know better, I would assume all went well, with delta E showing <1, and I could assume I have accurate colors. This is exactly the kind of issue that someone sold on the idea that they will always get more accuracy might not be aware of: they think they improved colors when in reality they made them worse. Especially if the inaccuracy is small and not very obvious, it's possible to end up with a worse sRGB than the monitor's factory sRGB mode. It's only obvious if you have correctly calibrated monitors as a reference.

Devices such as the i1 Display Pro are a poor attempt at creating a cheap device that can be used for calibration; in reality they have little to do with how our eyes see and can get confused by a light spectrum different from the one they were calibrated for. Getting accurate results can in some cases require a lot more than even these devices/software allow users to do, and while it can be very simple, it is never guaranteed to work as expected. So "get a calibration probe and you will be closer to accurate" is IMHO not quite true without lots of asterisks.

It's always going to bend your targets closer to more accurate in a mathematical way
For sure, mathematically more correct. Too bad that visually, for human eyes, it can get worse - at times way worse - than just using the sRGB mode calibrated by the manufacturer, because of some incompatibility between probe and monitor or some other hardware/software issue.

Not to mention the issues with software support, especially in the case where software calibration is used rather than hardware calibration. A person gets a probe, calibrates, and gets correct (hopefully) colors in some programs, all the while in other programs the calibration applies only partially or not at all - which can make things worse than just using the sRGB mode everywhere for SDR. Really great experience, and a great reason to ignore all the issues with CMS just to sell people on the idea that they need something they might not actually need for their use case.

I would rather avoid making it sound super easy and issue-free, and I would not tell everyone they can benefit from it, because neither is true in practice.
I would rather say it is only a $200 toy that can (and often does, if used correctly and its limitations are known) improve accuracy - and at the same time say not to expect it to always work correctly or without the need for troubleshooting.

Professionals of course do not need to be sold on such tools.
This only concerns people who have no idea whether they need such things or not - please do not oversell it, do not oversimplify it, and do not pretend there are no issues and everything always works.

Anyways, no point in continuing this discussion, especially here. Also, BTW, this whole post, even the parts in reply to sgupta, was for you. Please stop responding, this is going absolutely nowhere, your glass is already full as is mine :)
 
Oh god, he's back. Can't help but call out this complete nonsense though;

Monitors are much more advanced than TVs. TVs are not accurate. The world's first HDR1000 monitor with a QD layer came out in 2018. Back then no TV could do those specs. The QD layer is already on monitors. You cannot do anything serious on TVs.
I had a Sony TV, the W900A from 2012 that has a QD layer. It was marketed as Triluminos.

https://www.soundandvision.com/content/sony-kdl-55w900a-3d-lcd-hdtv-triluminos-new-breakthrough

HDR? No, it was 2012 and a 1080p TV. But it was bright for the time and capable of greater than sRGB for sure.

OLEDs will be capable of more than this old thing was, in both colour range and brightness.

Also I don't know why you seem to think 1000+ nits and HDR are needed to see anything more than sRGB. Sounds to me like you've got no idea how colour spaces work.
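That last point is easy to demonstrate numerically: gamut is a chromaticity question, independent of peak brightness, and a fully saturated Display-P3 red has no in-range sRGB representation even at ordinary SDR levels. A small sketch (my own illustration; the matrices are derived from the published primaries rather than copied from a table):

```python
# Minimal sketch (not from this thread): "wider than sRGB" needs no HDR.
# A saturated Display-P3 red falls outside the sRGB gamut at any brightness.

import numpy as np

def rgb_to_xyz_matrix(prim, white):
    """Build the linear RGB->XYZ matrix from xy primaries and the white point."""
    def xyz(xy):
        x, y = xy
        return np.array([x / y, 1.0, (1 - x - y) / y])
    m = np.column_stack([xyz(p) for p in prim])   # unscaled primary columns
    s = np.linalg.solve(m, xyz(white))            # channel scales so white maps to white
    return m * s

D65   = (0.3127, 0.3290)
SRGB  = rgb_to_xyz_matrix([(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)], D65)
P3D65 = rgb_to_xyz_matrix([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)], D65)

p3_red_linear = np.array([1.0, 0.0, 0.0])         # saturated P3 red at SDR level
srgb_linear = np.linalg.solve(SRGB, P3D65 @ p3_red_linear)
print("P3 red expressed in linear sRGB:", np.round(srgb_linear, 3))
# Negative components => outside the sRGB gamut, with no HDR brightness involved.
```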
 
Monitors that have hardware calibration capability usually work only with the few probes the manufacturer bothered supporting - and out of all the supported ones, only those that were actually tested on the given monitor, and for which the software has correction matrices, will actually provide good results.

The LG 48GQ900 calibrated very nicely using an i1 Display Pro Plus, but on the other hand I also calibrated an LG 27GP950 with the same software and probe (which should be good for this monitor!) and got terrible results - wrong white temperature, and colors were generally off. Only gamma tracking was correct, but it was also correct in sRGB mode, so no improvement there.
If I didn't know better, I would assume all went well, with delta E showing <1, and I could assume I have accurate colors. This is exactly the kind of issue that someone sold on the idea that they will always get more accuracy might not be aware of: they think they improved colors when in reality they made them worse. Especially if the inaccuracy is small and not very obvious, it's possible to end up with a worse sRGB than the monitor's factory sRGB mode. It's only obvious if you have correctly calibrated monitors as a reference.

Devices such as the i1 Display Pro are a poor attempt at creating a cheap device that can be used for calibration; in reality they have little to do with how our eyes see and can get confused by a light spectrum different from the one they were calibrated for. Getting accurate results can in some cases require a lot more than even these devices/software allow users to do, and while it can be very simple, it is never guaranteed to work as expected. So "get a calibration probe and you will be closer to accurate" is IMHO not quite true without lots of asterisks.

Fair enough - I certainly can't explain the poor results you got on the LG; it seems odd that the same colorimeter would give you results that much worse on a different display and good results on the first. I'd be curious if you tried LG's simpler software (if that model supports it) if your results would be any different, but I digress - I too would have been frustrated if after calibration a display looked worse. I'm certainly no calibration expert and my only experiences with calibrations were these two displays; that said, they were both good experiences and the results seem consistent. I think we used the same probe, but in my case the results on the LG 27GR95QE-B seem great (both dE values and to my eye as far as being consistent with other calibrated displays). As for the Sony TV, I wasn't happy with the picture before calibration and it made all the difference (especially skin tones); I did have to use a custom Sony-recommended white point, which I figured out from another hardware forum, before it looked its best.
 
Fair enough - I certainly can't explain the poor results you got on the LG; it seems odd that the same colorimeter would give you results that much worse on a different display and good results on the first. I'd be curious if you tried LG's simpler software (if that model supports it) if your results would be any different, but I digress - I too would have been frustrated if after calibration a display looked worse. I'm certainly no calibration expert and my only experiences with calibrations were these two displays; that said, they were both good experiences and the results seem consistent. I think we used the same probe, but in my case the results on the LG 27GR95QE-B seem great (both dE values and to my eye as far as being consistent with other calibrated displays). As for the Sony TV, I wasn't happy with the picture before calibration and it made all the difference (especially skin tones); I did have to use a custom Sony-recommended white point, which I figured out from another hardware forum, before it looked its best.
LG's fault, most probably. I will play with software versions and whatnot some more before sending a complaint to them though.

Results can be different because, to get properly "mathematically" correct results, something better than a colorimeter is needed - something which can actually analyze the whole light spectrum: a spectrometer. Those have issues too with low-brightness precision, and ideally a spectrometer + colorimeter should be put into one device; then, in theory at least, any display could be supported. Otherwise what the colorimeter sees does not match what the eyes see, and you need corrections for the given probe, which manufacturers make with spectrometers. It's enough for the light spectrum of the backlight to differ a little from the one the correction matrix was made with, and colors will be read incorrectly.
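For reference, the corrections being talked about here often amount to a small matrix per backlight type, fitted so the colorimeter's raw readings line up with a spectroradiometer's readings of the same patches. A toy sketch of that idea (my own illustration; all numbers are invented):

```python
# Toy sketch (invented numbers): fit a 3x3 correction so a colorimeter's raw
# XYZ readings match reference XYZ measured by a spectroradiometer on the
# same patches, then apply it to the raw readings.

import numpy as np

raw_xyz = np.array([     # hypothetical colorimeter readings (one row per patch)
    [41.0, 21.5,  1.9],
    [35.2, 71.0, 11.6],
    [18.4,  7.4, 94.5],
    [94.6, 99.8, 107.0],
])
ref_xyz = np.array([     # hypothetical spectro readings of the same patches
    [41.2, 21.3,  1.8],
    [35.8, 71.5, 11.9],
    [18.0,  7.2, 95.1],
    [95.0, 100.0, 108.9],
])

# Least-squares fit: find M so that raw_xyz @ M ~= ref_xyz.
M, *_ = np.linalg.lstsq(raw_xyz, ref_xyz, rcond=None)
corrected = raw_xyz @ M
print("fitted correction matrix:\n", np.round(M, 4))
print("corrected readings:\n", np.round(corrected, 2))
```

If the panel's actual spectrum differs from the one the matrix was built for, the fit no longer applies, which is exactly the failure mode described above.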
 
Other than a few naysayers, we all know HDR on an OLED's infinite-black-depth contrast is appreciable; even level-headed people who prefer FALDs know this. On top review sites, OLEDs make up most of the top 4 HDR gaming TVs, with an OLED on top in every one I've ever seen. The top HDR gaming monitors are typically OLED as well - in the top 3, if not #1. That's not changing in 2023.

For monitors, the G8 has local dimming though, and it ranks high, but still among OLEDs as the top choices:

"It features Mini LED backlighting with a 46x26 array for 1,196 zones"

"The Samsung Odyssey Neo G8 has an excellent response time at its max refresh rate of 240Hz. There's minimal blur trail behind fast-moving objects, but there's significant overshoot with dark transitions that leads to inverse ghosting. Enabling VRR locks you out of any overdrive setting, and the response time is quick with it, and if you don't use VRR the best overdrive setting is 'Standard' because it performs similarly. 'Faster' and 'Extreme' have too much overshoot.

Like with the Samsung Odyssey Neo G7 S32BG75, there are reports that enabling local dimming worsens the response times. While the local dimming may cause some extra blur trail with fast-moving objects, the overall motion handling looks the same with local dimming on and off."

[attached image: ZMx5YwE.png]


Still seems like a decent screen. The response time isn't quite as much of an issue if you are running very high fps/Hz, but it is bad compared to OLEDs and even non-OLEDs, where people aim for a 1 ms response time if they can get it. Notably, its 240 Hz doesn't necessitate it being 1440p.

There are valid choices in the FALD sphere for an HDR gaming display if you are OK with FALD's tradeoffs: non-uniformity, lifted or dimmed zones in a Tetris brickwork, some outright seepage and blooming, Samsung QD-LED FALD's gaming mode widening the zones and slowing the transitions, for example, and the tradeoffs highlighted above, including poor response times on displays like the Odyssey.

Point is, they all have tradeoffs, and even with OLED's tradeoffs they are ranked among the top HDR TVs and HDR gaming TVs due to their infinite blacks contrasting against the color nits they are capable of showing, down to the razor's edge, pixel by pixel - even if the range is compressed and can't be held for long durations on a large percentage of the screen, though that's often changing anyway in dynamic scenes in media and in rotating viewports in games. Both technologies have major tradeoffs.
 
Oh god, he's back. Can't help but call out this complete nonsense though;


I had a Sony TV, the W900A from 2012 that has a QD layer. It was marketed as Triluminos.

https://www.soundandvision.com/content/sony-kdl-55w900a-3d-lcd-hdtv-triluminos-new-breakthrough

HDR? No, it was 2012 and a 1080p TV. But it was bright for the time and capable of greater than sRGB for sure.

OLEDs will be capable of more than this old thing was, in both colour range and brightness.

Also I don't know why you seem to think 1000+ nits and HDR are needed to see anything more than sRGB. Sounds to me like you've got no idea how colour spaces work.
You want to trash talk again just to bang on about a 2012 TV with no HDR1000? You cannot do anything serious on an inaccurate TV. Maybe you can do your little CAD from home in 2023.

Even in 2023, the TVs are still inaccurate. Look at them. A bunch of guys watching a bunch of TVs: the C2, C3, S95B, A95K.
[attached image: ScreenShot.PNG]


The fortunate thing is the camera captured enough data, so I can rescale and recalculate the brightness levels in this picture. Take the C2's full-field 170 nits as the baseline. Then the A95K is calculated at 203 nits and the S95B at 229 nits. They are not far from the actual testing. The newest C3 is calculated to have 242 nits; I'll give the C3 another 40 nits, up to 280 nits, for good measure due to rescaling error in the original picture.
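The post doesn't show its arithmetic, so the following is only a guess at it: assuming the screenshot was linearised and the C2's full-field white is pinned at 170 nits, the other panels scale by the ratio of their linearised pixel values. All pixel values below are hypothetical, chosen only to reproduce the ratios quoted above.

```python
# Guess at the rescaling arithmetic (not stated in the post): anchor one known
# panel at a known luminance and scale the others by linearised pixel ratio.

baseline_nits  = 170.0   # assumed C2 full-field white in the photo
baseline_value = 0.52    # hypothetical linearised pixel value on the C2's screen

def estimate_nits(linear_value: float) -> float:
    """Scale relative to the anchor: nits ~= baseline_nits * (value / baseline_value)."""
    return baseline_nits * (linear_value / baseline_value)

for name, v in [("A95K", 0.62), ("S95B", 0.70), ("C3", 0.74)]:   # hypothetical values
    print(f"{name}: ~{estimate_nits(v):.0f} nits")
```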

The funniest part is a bunch of guys watching a bunch of sub-300-nit screens lol.
 
I'm not lying to myself. I prefer using sRGB for SDR. It's my preference because it looks best *to me*, and that's a fact. Your unwillingness to accept it isn't really my issue.

The 4th image, as I said above, appears to be native HDR. What's your point? BTW, I just downloaded the game since I'm on Gamepass and it's included. Looks spectacular in native HDR on my monitor. Again, I have no idea what point you're even trying to make.
Wide Gamut SDR can be your OLED HDR. That's the point.
 
That's only at most 150k a year once you account for weekends and vacation. A nice income for sure, but nothing extraordinary.
Much better than whatever he earns from home. If he earned anything, he would have used a proper monitor to actually see much better instead of imagining it.
 
Anyways, no point in continuing this discussion, and especially here. Also BTW this whole post, even parts in reply to sgupta were for you. Please stop responding, this is going absolutely nowhere, your glass is already full as is mine :)
If you have something to say to me that's what PM's are for. If you don't want to have a discussion on a public forum, then don't post. It literally defeats the entire purpose of forums in the first place.
 
This makes no sense whatsoever. Why would I use Wide Gamut SDR when the game supports native HDR (which looks beautiful)?

I think I lost a few brain cells reading that. WTF is even "Wide Gamut" SDR??? If it's just forcing a wide color gamut onto the SDR colorspace, then I don't see how that suddenly makes it "OLED HDR" lol. If I could take a monitor with a wide color gamut, let it run its full P3 colorspace or whatever instead of clamping it down to sRGB, and now suddenly have OLED HDR, I wonder what witchcraft is being done.
 
I think I lost a few brain cells reading that. WTF is even "Wide Gamut" SDR??? If it's just forcing a wide color gamut onto the SDR colorspace, then I don't see how that suddenly makes it "OLED HDR" lol. If I could take a monitor with a wide color gamut, let it run its full P3 colorspace or whatever instead of clamping it down to sRGB, and now suddenly have OLED HDR, I wonder what witchcraft is being done.

LOL. Your guess is as good as mine. It doesn't make any sense.
 
Oh, and as to all the implications that people can't afford "a proper" monitor, that's a juvenile argument as well as being hogwash. I had budgeted quite a bit more for a monitor (which is why I tried the ProArt after all), but nothing in the 32" size ended up working out as well as I'd hoped, nor nearly as well as this 27" OLED happens to for me. Even if you disregard price, I'm happier with this monitor than the other two I had tried, but considering price as well, it was a no-brainer; it's simply my best option for now (and it will leave room in the future budget if something comes along I do want to upgrade to in the next few years). Your mileage may vary depending on what you do. FALD makes sense for a lot of uses (high-nit HDR, for example), but the benefits would be used minimally by me and the drawbacks visible constantly, whereas OLED is the inverse (the stuff I use most has almost nothing but pros, and the stuff it can't display quite as bright still looks good enough with tone mapping while not being used nearly as much).
 