LG Announces World’s First 21:9 Freesync Monitor

If you are an ardent gamer and want nothing but the most up-to-date and cutting-edge equipment, heads up on the new LG 21:9 FreeSync monitor, which the company will be debuting at CES next week in Las Vegas.

It also has a feature called Black Stabilizer that “illuminates dark scenes” to reveal hidden details. LG thinks this could give players a competitive edge by ensuring opponents can’t ambush from the shadows.
 
Freesync offers more than just gaming advantages; it could also extend the life of the monitor.
 
Two more monitors and freesync will have matched the number of gsync monitors.
 
This is stupid. If it can't do better than 60Hz, no one will care.

I'm still looking for a 120Hz or better 21:9 aspect ratio monitor.
 
This is stupid. If it can't do better than 60Hz, no one will care.

I'm still looking for a 120Hz or better 21:9 aspect ratio monitor.

This! How long have LCD monitors been out now? The first LCDs came out sometime in the mid-1990s; can we make some progress, FFS? It's been about 20 years and most are still capped at 60Hz. I want something like 4K resolution on a 32-36 inch monitor with a 120Hz or higher refresh rate. Is it rocket science?
 
The real benefit of freesync/gsync is when the fps is between 30 and 50. A 60Hz monitor is fine for that.
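
If you want the napkin math behind that, here's an illustrative sketch (assuming the panel's variable refresh range actually covers those framerates, and plain double-buffered vsync on the fixed-refresh side; the 40 fps figure is just an example):

```python
# Illustrative comparison: a GPU delivering a steady 40 fps on a fixed 60Hz
# display with vsync vs. on an adaptive-sync display.
import math

refresh_hz = 60.0
fps = 40.0

frame_ms = 1000.0 / fps          # 25.0 ms to render each frame
scan_ms = 1000.0 / refresh_hz    # 16.7 ms per refresh

# Fixed 60Hz + double-buffered vsync: the frame waits for the next refresh
# boundary, so a 25 ms frame ends up on screen for two refreshes (33.3 ms).
shown_fixed = math.ceil(frame_ms / scan_ms) * scan_ms

# Adaptive sync: the display simply refreshes when the frame is ready.
shown_adaptive = frame_ms

print(f"render time        : {frame_ms:.1f} ms ({fps:.0f} fps)")
print(f"fixed 60Hz + vsync : {shown_fixed:.1f} ms per frame (~{1000/shown_fixed:.0f} fps shown)")
print(f"adaptive sync      : {shown_adaptive:.1f} ms per frame (~{1000/shown_adaptive:.0f} fps shown)")
```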
 
"It also has a feature called Black Stabilizer that “illuminates dark scenes” to reveal hidden details. LG thinks this could give players a competitive edge by ensuring opponents can’t ambush from the shadows."

Marketing tip: when crushing white so that black is more visible, call it a Black Stabilizer.
 
Maybe it won't be too much longer until we can finally see an actual legit review of Freesync. It feels like it's been a year now.
 
Here I am just hoping for a 27" 1440p 120hz monitor that actually has decent QC and doesn't cost $900
 
This is stupid. If it can't do better than 60Hz, no one will care.

I'm still looking for a 120Hz or better 21:9 aspect ratio monitor.

You're neglecting the reason behind needing the 120Hz monitors in the first place. Freesync actually defeats the need for higher refresh rates, because syncing the refresh rate to the screen draws eliminates tearing entirely.
 
Something tells me if I buy one, it won't be that easy to enjoy BF4 or any of my normal games on it.
 
You're neglecting the reason behind needing the 120Hz monitors in the first place. Freesync actually defeats the need for higher refresh rates, because syncing the refresh rate to the screen draws eliminates tearing entirely.

He's probably stuck with nVidia
 
You're neglecting the reason behind needing the 120Hz monitors in the first place. Freesync actually defeats the need for higher refresh rates, because syncing the refresh rate to the screen draws eliminates tearing entirely.

Huh? People's interest in 120Hz and 144Hz monitors for gaming is about much more than just eliminating tearing. Have you ever actually used a monitor above 60Hz?
 
They have yet to post a story on it, but Acer is gonna be showing off two new 1440p monitors at CES with G-Sync. The kicker: both are IPS at 144Hz. Yes, IPS at 144Hz, that wasn't a typo.
 
You're neglecting the reason behind needing the 120Hz monitors in the first place. Freesync actually defeats the need for higher refresh rates, because syncing the refresh rate to the screen draws eliminates tearing entirely.

Yeah, that is not the reason I have a 120Hz monitor... tearing was never on my radar when I bought one. It was always about the higher refresh rate, because 60Hz feels like a slide show in comparison when playing an FPS with fast viewpoint changes.

Tearing has always been minimal and not much of an issue when your focus is on other people. Sure, not having tearing at a low-fps point in other games would be nice, but it's not worth giving up 120Hz for.
 
You're neglecting the reason behind needing the 120Hz monitors in the first place. Freesync actually defeats the need for higher refresh rates, because syncing the refresh rate to the screen draws eliminates tearing entirely.

Talk about clueless. The reason we're buying 120 Hz monitors is because we don't like slide shows.
 
Is it the actual frame rate that you guys are noticing or is it the transition from different frame rates that you are noticing?

If a monitor is able to display at a flat 60Hz, will it be more "clear" than a monitor that is constantly swinging back and forth between 120 and 60Hz?

I believe it is the change in frame rate that most of you are noticing, not the frame rate itself.
 
Is it the actual frame rate that you guys are noticing or is it the transition from different frame rates that you are noticing?

If a monitor is able to display at a flat 60Hz, will it be more "clear" than a monitor that is constantly swinging back and forth between 120 and 60Hz?

I believe it is the change in frame rate that most of you are noticing, not the frame rate itself.

What you have in mind is easily fixed with atrocities such as vsync which I avoid no matter what.

I personally have no problem with tearing or anything like that. I just want my eyes to be hit with the most recent information. Try playing anything but RPGs and you'll probably understand. It's also a matter of latency. We lost a lot when we had to ditch our CRTs. And I could tell the difference between a desktop being refreshed at 100 Hz and 120 Hz any day of the week.
 
Is it the actual frame rate that you guys are noticing or is it the transition from different frame rates that you are noticing?

If a monitor is able to display at a flat 60Hz, will it be more "clear" than a monitor that is constantly swinging back and forth between 120 and 60Hz?

I believe it is the change in frame rate that most of you are noticing, not the frame rate itself.

No, you can definitely see the difference between a smooth 60 and a smooth 120Hz. Just booting into Windows you can tell the difference in Explorer functions and mouse movement, let alone when you're playing games.
 
What you have in mind is easily fixed with atrocities such as vsync which I avoid no matter what.

Vsync is a cap. It keeps the monitor at 60Hz, so long as the video card can feed it a 60Hz signal.

What I am talking about is when the video card is having a problem keeping it at 60Hz and is swinging between 50 and 60 constantly (as an example). The transition of frame rates is very noticeable.

The reason film is able to get away with such a low frame rate (24fps) is because it is a constant frame rate. You do notice it on certain panning shots. But the constant locked frame rate is easy on the eyes. Having frame rates that are all over the place is distracting to the eye.
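
To put rough numbers on that "swinging between 50 and 60" case, here's a toy simulation (assuming a fixed 60Hz display with double-buffered vsync and a GPU rendering back-to-back; not any particular game or driver):

```python
# Toy model: the GPU renders frames back-to-back at 50-60 fps, but a fixed 60Hz
# display with vsync can only swap frames at refresh boundaries. Some frames end
# up shown for one refresh (16.7 ms) and some for two (33.3 ms) -- that uneven
# pacing is the "transition" you notice.
import math
import random

random.seed(0)
scan_ms = 1000.0 / 60.0

finish = 0.0        # when each frame finishes rendering (ms)
vblank = 0          # index of the refresh on which the previous frame was shown
flips = []
for _ in range(60):
    finish += 1000.0 / random.uniform(50, 60)               # 50-60 fps render pace
    vblank = max(vblank + 1, math.ceil(finish / scan_ms))   # wait for the next free refresh
    flips.append(vblank)

held = [b - a for a, b in zip(flips, flips[1:])]            # refreshes each frame stayed up
print("frames shown for 16.7 ms:", held.count(1))
print("frames shown for 33.3 ms:", held.count(2))
```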
 
I like that AMD is taking a more mass market approach rather than Nvidia's proprietary BS. As someone who's got no loyalty to either AMD or Nvidia (I've always bought straight up on price/performance), I like to be able to buy a monitor without having to be tied to one manufacturer over the other. Too bad Nvidia won't support it, because I don't want to pay the massive premium for a G-Sync monitor. Implementing a system that comes standard can massively reduce the cost, and I'm sure we'll see a big price difference between Freesync and G-Sync capable monitors.
 
Vsync is a cap. It keeps the monitor at 60Hz, so long as the video card can feed it a 60Hz signal.

What I am talking about is when the video card is having a problem keeping it at 60Hz and is swinging between 50 and 60 constantly (as an example). The transition of frame rates is very noticeable.

The reason film is able to get away with such a low frame rate (24fps) is because it is a constant frame rate. You do notice it on certain panning shots. But the constant locked frame rate is easy on the eyes. Having frame rates that are all over the place is distracting to the eye.

First of all, vsync doesn't work the way you think it does. Here's a short wiki reference. This means that the GPU withholds otherwise complete frames until the monitor is able to display them in all their glory. This leads to massive latency.

While you may disagree, what you're describing in your second paragraph is exactly what vsync fixes. Running without it also leads to tearing, which, of course, is fixed the same way (or with a high-frequency monitor, which makes the effect almost completely invisible).

But that's not why I, and most people with 120 Hz monitors I know, bought them. Some of us play shooters, and some of us play Quake or other games where precision matters a lot. We tend to manually cap our games at 120 FPS so that we can play at reasonable framerates, as opposed to slideshows, without having to deal with massive latency (vsync).

As for your film argument, I completely stopped going to the movies because it's not only panning, it's everything. This is why they pile on MASSIVE amounts of motion blur and shaky cam these days. 24 FPS is nothing. It may be easy on your eyes, but not mine. I hated it even as a kid, though back then I wasn't sure what exactly was driving me crazy.

I would take a dynamic frame rate over a slide show any day of the week.
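
For what it's worth, the "massive latency" part is easy to put a ceiling on with a crude model (double-buffered vsync, ignoring any render-ahead queue, so this only counts the wait for the next refresh):

```python
# Extra wait added by vsync in the simplest case: a finished frame sits in the
# back buffer until the next refresh. Average is half a refresh, worst case a
# full one -- and doubling the refresh rate halves it.
for hz in (60, 120, 144):
    scan_ms = 1000.0 / hz
    print(f"{hz:3d} Hz: average wait {scan_ms / 2:4.1f} ms, worst case {scan_ms:4.1f} ms")
```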
 
We tend to manually cap our games at 120 FPS so that we can play at reasonable framerates, as opposed to slideshows, without having to deal with massive latency (vsync).

Capping your framerate doesn't do anything to fix screen tearing though. Even if your framerate is 120 and your monitor is 120Hz, you'll still get tearing without vsync.
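
A toy model of why, not tied to any particular game or driver: with vsync off, the buffer flip happens whenever the frame finishes, and the tear shows up at whatever scanline the monitor happens to be drawing at that instant. Small frame-time jitter is enough to make that line wander every frame (the 1440-line panel here is just an example):

```python
# Toy model of tearing with vsync off on a 120Hz panel at ~120 fps: the flip is
# not phase-locked to scanout, so the tear lands on a different row each frame.
# (Scanout is treated as a simple top-to-bottom sweep; vblank is ignored.)
import random

random.seed(1)
scan_ms = 1000.0 / 120.0     # one 120Hz refresh
height = 1440                # example vertical resolution

t = 0.0
for frame in range(8):
    t += scan_ms + random.uniform(-0.5, 0.5)    # ~120 fps with a little jitter
    phase = (t % scan_ms) / scan_ms             # how far down the screen scanout is at flip time
    print(f"frame {frame}: tear near row {int(phase * height)}")
```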
 
I want a 120hz monitor for the no motion blur, screen tearing was never much of a concern for me.
 
I like that AMD is taking a more mass market approach rather than Nvidia's proprietary BS. As someone who's got no loyalty to either AMD or Nvidia (I've always bought straight up on price/performance), I like to be able to buy a monitor without having to be tied to one manufacturer over the other. Too bad Nvidia won't support it, because I don't want to pay the massive premium for a G-Sync monitor. Implementing a system that comes standard can massively reduce the cost, and I'm sure we'll see a big price difference between Freesync and G-Sync capable monitors.

This is actually supported by AMD and is part of the DisplayPort spec. If Nvidia wanted to, they could support it in their drivers. But I think G-Sync supports more than just the DP port.
 
This is actually supported by AMD and is part of the DisplayPort spec. If Nvidia wanted to, they could support it in their drivers. But I think G-Sync supports more than just the DP port.

Nvidia will not support it; they've said so. The reason is this: by opting OUT of supporting the tech, they artificially create a divide between the two products and turn it into an 'AMD vs Nvidia' battle, which Nvidia has far more market share to fight. If Nvidia supported FreeSync, it would become a universal standard. By opting out, they make it an AMD standard. Since this is mostly a gaming technology on expensive monitors, Intel isn't even a player.
 
Is it the actual frame rate that you guys are noticing or is it the transition from different frame rates that you are noticing?

If a monitor is able to display at a flat 60Hz, will it be more "clear" than a monitor that is constantly swinging back and forth between 120 and 60Hz?

I believe it is the change in frame rate that most of you are noticing, not the frame rate itself.

No, you can definitely see the difference between a smooth 60 and a smooth 120Hz. Just booting into Windows you can tell the difference in Explorer functions and mouse movement, let alone when you're playing games.

Yes, you definitely can see a difference between a smooth 60 and 120. However, you can also see a difference between a smooth 60 and a smooth 75. For me personally, 60Hz was too slow a refresh and would tire my eyes and hurt after even an hour. But 75 - well, that was perfectly fine and didn't give me a headache.

But the problem several years ago was that there was a lack of 75Hz LCDs. Additionally, 75Hz does nothing to solve the screen tearing problem when not using the dreaded vsync. Turning on vsync would still cap you at 60Hz - although it is arguable that with a 75Hz LCD and a video card capable of pushing it, the vsync algorithm will always push frames at 60 instead of dropping to 45 or 30 (which is what it did when the hardware couldn't keep up). For the purpose of my point, don't take 75Hz literally as the only option; there were 80Hz monitors as well as 70Hz and some other odd ones in small quantities. For me though, it had to be 75 or higher.

Now comes 120Hz. My reasons for using this are simple:
* 120 is the next magical number for Vsync (Vsync is not in fact capped at 60fps)
* I have hardware capable of running my FPS/RTS/whatever games at well over 120, therefore with Vsync ON, I am getting a smooth 120 fps.
* Vsync has improved in implementation and with triple buffering has cut down on the 'massive lag' that used to mar its name.
* I continue to have no screen tearing, which I should mention would also give me a headache, independently of a less-than-75Hz refresh rate.

I suspect that some people have a similar problem with viewing low refresh rates, and some also don't like tearing and therefore also use Vsync. This certainly is not the only use case, but it is a valid one.
 
Yes, you definitely can see a difference between a smooth 60 and 120. However, you can also see a difference between a smooth 60 and a smooth 75. For me personally, 60Hz was too slow a refresh and would tire my eyes and hurt after even an hour. But 75 - well, that was perfectly fine and didn't give me a headache.

But the problem several years ago was that there was a lack of 75Hz LCDs. Additionally, 75Hz does nothing to solve the screen tearing problem when not using the dreaded vsync. Turning on vsync would still cap you at 60 fps - although it is arguable that with a 75Hz LCD and a video card capable of pushing it, the vsync algorithm will always push frames at 60 instead of dropping to 45 or 30 (which is what it did when the hardware couldn't keep up). For the purpose of my point, don't take 75Hz literally as the only option; there were 80Hz monitors as well as 70Hz and some other odd ones in small quantities. For me though, it had to be 75 or higher.

Now comes 120Hz. My reasons for using this are simple:
* 120 is the next magical number for Vsync (Vsync is not in fact capped at 60fps)
* I have hardware capable of running my FPS/RTS/whatever games at well over 120, therefore with Vsync ON, I am getting a smooth 120 fps.
* Vsync has improved in implementation and with triple buffering has cut down on the 'massive lag' that used to mar its name.
* I continue to have no screen tearing, which I should mention would also give me a headache, independently of a less-than-75Hz refresh rate.

I suspect that some people have a similar problem with viewing low refresh rates, and some also don't like tearing and therefore also use Vsync. This certainly is not the only use case, but it is a valid one.

Fixed a mistake: vsync caps you at 60 fps, not 60Hz.
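
Since the "dropping to 45 or 30" behaviour keeps coming up, here's the textbook double-buffered case as a quick sketch (triple buffering and newer adaptive-vsync modes behave differently, so treat the exact numbers as illustrative):

```python
# With plain double-buffered vsync, a frame that misses a refresh waits for the
# next one, so the displayed rate snaps to refresh/1, refresh/2, refresh/3, ...
# This is also part of why a 120Hz panel is more forgiving with vsync on.
import math

def displayed_fps(render_fps, refresh_hz):
    frame_ms = 1000.0 / render_fps
    scan_ms = 1000.0 / refresh_hz
    refreshes = math.ceil(frame_ms / scan_ms)   # whole refreshes each frame occupies
    return refresh_hz / refreshes

for fps in (100, 59, 45, 29):
    print(f"GPU at {fps:3d} fps ->  60Hz panel shows ~{displayed_fps(fps, 60):.0f} fps, "
          f"120Hz panel shows ~{displayed_fps(fps, 120):.0f} fps")
```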
 
Nvidia will not support it; they've said so. The reason is this: by opting OUT of supporting the tech, they artificially create a divide between the two products and turn it into an 'AMD vs Nvidia' battle, which Nvidia has far more market share to fight. If Nvidia supported FreeSync, it would become a universal standard. By opting out, they make it an AMD standard. Since this is mostly a gaming technology on expensive monitors, Intel isn't even a player.

I know they won't do it and it's retarded. While G-Sync is more capable, it limits consumers to only certain brands and increases prices.

But Intel can support FreeSync as well.
 
Isn't the point of 21:9 for watching movies? Is this really that good for gaming? I wonder how many games support those resolutions.
 
Isn't the point of 21:9 for watching movies? Is this really that good for gaming? I wonder how many games support those resolutions.

Yes, and most. For games that don't support the 21:9 aspect ratio, it would be equivalent to a 27" 1440p display. I went from a 2x 24" 1080p setup to a single 34" 21:9 (3440x1440) and wouldn't go back. It depends on the game, but especially for games with a first-person perspective you get a wider horizontal FOV and a much more immersive experience. At 3440x1440 (5MP) it's easier to run than a 3x 1080p (6MP) or a 4K (8MP) setup, and you get a good chunk of the benefits of both: the wider aspect ratio of a multi-monitor setup and the higher resolution of a 4K setup.

For general multitasking, it's perfect... it's like having two 5:4 monitors with no bezel.
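
Rough pixel math behind those MP figures, for anyone curious (just multiplying the resolutions out):

```python
# Total pixels pushed per frame for the setups mentioned above.
setups = {
    '34" 21:9 (3440x1440)': 3440 * 1440,
    '3x 1080p (5760x1080)': 3 * 1920 * 1080,
    '4K UHD (3840x2160)':   3840 * 2160,
}
for name, pixels in setups.items():
    print(f"{name:22s} {pixels / 1e6:.2f} MP")
```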
 
It also has a feature called Black Stabilizer that “illuminates dark scenes” to reveal hidden details. LG thinks this could give players a competitive edge by ensuring opponents can’t ambush from the shadows.

Sounds like cheating to me...
 
This is great news! Now I can display even fewer lines of text on the same screen! All we need now is one more generation of stupidity so we can have nice things like 45,238:9 screens. Thanks, LG, for being the world's first in a race to make screens even more useless.
 