Acer Unveils the 28-Inch XB280HK Display with NVIDIA G-SYNC

CommanderFrank
Serious about your gaming and need a cutting-edge monitor to do the rest of your setup justice? Well, how about the Acer 28" 4K Ultra HD display paired with NVIDIA's G-SYNC technology? The only drawbacks are that it isn't available yet and the price has yet to be announced.
 
Soo... what's the damn refresh rate? 30Hz? G-Sync for low refresh rates seems counterintuitive...
 
Soo... what's the damn refresh rate? 30Hz? G-Sync for low refresh rates seems counterintuitive...

G-Sync doesn't work very well at 30Hz and doesn't work at all below 30Hz. No one is going to release a 4K panel with G-Sync and run it only at 30Hz; what would be the point of that? G-Sync only really becomes useful past 30 FPS.

So count on them being 60Hz.
 
It doesn't have the refresh rate listed anywhere. If it is running at 30Hz, then it's a serious failure.
 
G-Sync seems to be marketed mostly as a gaming feature. I'm positive it will be 60Hz.

Moving forward, fewer and fewer 4K panels will be 30Hz, as bandwidth issues won't be as common once standards gain HDMI 2.0(?) and whatever DisplayPort rolls out.

At 4K I can't see it being a TN panel, as the ASUS Swift is the first 1440p TN, and I doubt any manufacturer would jump straight to 4K with TN. It takes a lot of power to run at full resolution, and that's what makes G-Sync/FreeSync perfect for super high resolutions.
 
Surely the panel type would have been proudly mentioned in the press release if it were anything other than TN.
 
G-Sync seems to be marketed mostly as a gaming feature. I'm positive it will be 60Hz.

Moving forward, fewer and fewer 4K panels will be 30Hz, as bandwidth issues won't be as common once standards gain HDMI 2.0(?) and whatever DisplayPort rolls out.

At 4K I can't see it being a TN panel, as the ASUS Swift is the first 1440p TN, and I doubt any manufacturer would jump straight to 4K with TN. It takes a lot of power to run at full resolution, and that's what makes G-Sync/FreeSync perfect for super high resolutions.

Umm, my Sammy U28D590 is UHD, TN, and runs at 60Hz with DP 1.2. Great looking for a TN as well.
 
Then those guys go on to compare it to Adaptive VSync, as if it's the same thing...

Well, the idea is that VESA has had it implemented since 2007, and in 2011 Nvidia claimed to support it. Then Nvidia came out with G-Sync, which requires their graphics cards and special monitors. Meanwhile, the VESA standard just wasn't very well supported. Then AMD came out with FreeSync, which is essentially adaptive sync from 2007. But now VESA does officially support adaptive sync in DisplayPort 1.2a. So if your monitor and graphics card support 1.2a, then you have adaptive sync.

The point is, why get G-Sync when DisplayPort 1.2a is the same thing and universal? With G-Sync you can only use Nvidia hardware.
 
Umm, my Sammy U28D590 is UHD, TN, and runs at 60Hz with DP 1.2. Great looking for a TN as well.

Well, don't I look silly about now. I had no idea that was out, let alone from a top-tier manufacturer like Samsung. My bad! :p The rest of the statement stands, though. If it is a higher-quality panel, then get ready for an (imo) $1400 price tag, which seems about right compared to similar Dell screens.

I think that is great to see. It's a lower-cost solution for those that want 4K. That solves it for me: Acer 4K TN with G-Sync. If they somehow beat the Swift out the door, I bet ASUS could lose a lot of sales to those that want the new and shiny... ME! :D
 
The point is, why get G-Sync when DisplayPort 1.2a is the same thing and universal? With G-Sync you can only use Nvidia hardware.

They aren't the same thing - adaptive vsync requires your computer to guess when your next frame is going to be ready, whereas G-Sync can respond to the frame becoming available.
 
They aren't the same thing - adaptive vsync requires your computer to guess when your next frame is going to be ready, whereas G-Sync can respond to the frame becoming available.

Are the results different? Don't say yes without proof. I personally don't care, because I would need to buy a new monitor to even use either. Nvidia's solution is just likely to cost more.
 
Gsync is dead. Long live free sync, I mean adaptive sync.

Not quite. It will depend on how fast the standard is adopted and how well it works without dedicated hardware like G-Sync has. It's the same old technology battle, and even though FreeSync is a superior option because it could work on both vendors' hardware, that doesn't mean it will win for sure.
 
Are the results different? Don't say yes without proof. I personally don't care, because I would need to buy a new monitor to even use either. Nvidia's solution is just likely to cost more.

Until we get our hands on FreeSync, no one knows. I'm willing to bet, however, that a solution that knows the frame in advance is superior to one that guesses.

If, however, it turns out that they visually appear the same, then G-Sync will be dead.
 
Ashbringer said:
Meanwhile, the VESA standard just wasn't very well supported. Then AMD came out with FreeSync, which is essentially adaptive sync from 2007.
Are you serious? I thought FreeSync was similar to G-Sync, just in a more open format without needing the chip. If all it is is adaptive vsync, that sucks.

To be clear for everyone:

G-Sync = The refresh rate of your monitor changes with your framerate, so in theory you should never have any tearing and your picture will be very smooth. If you're running at 43fps, so is your monitor. Everything is in sync.

Adaptive VSync = In the event your framerate can't keep up with your refresh rate, VSync turns itself off temporarily, then back on once it can keep up again. So if your refresh rate is 60Hz but you're running at 43fps, it's the same as if you had no VSync at all: you get tearing. Normal VSync would bump this down to the next even divisor of the refresh rate, which is 30fps. So while Adaptive VSync is less choppy than VSync in a game that can't hold a full 60fps, it's basically nonexistent if you can't keep your framerate above 60 (or whatever your refresh rate is) most of the time, and it won't prevent tearing.

BIG difference between the two.
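To put numbers on the 43fps example, here's a toy Python simulation (my own illustration, not anything from Nvidia or VESA) of when frames actually hit the screen. It assumes a constant 23.3ms render time and a double-buffered pipeline, which is a simplification:

```python
import math

REFRESH = 1 / 60   # fixed 60Hz monitor: a new scanout every ~16.7ms
FRAME = 1 / 43     # GPU takes ~23.3ms to render each frame (the 43fps case)

def vsync_present_times(n):
    """Plain VSync: a finished frame waits for the NEXT refresh tick."""
    times, t = [], 0.0
    for _ in range(n):
        t += FRAME                        # frame finishes rendering
        tick = math.ceil(t / REFRESH)     # first vblank after it's done
        t = tick * REFRESH                # double-buffered: next frame starts here
        times.append(t)
    return times

def gsync_present_times(n):
    """Variable refresh: the monitor scans out the moment a frame is ready."""
    return [(i + 1) * FRAME for i in range(n)]

n = 120
v = vsync_present_times(n)
g = gsync_present_times(n)
print("VSync effective fps:  ", round(n / v[-1], 1))   # drops to 30.0
print("Variable-refresh fps: ", round(n / g[-1], 1))   # stays at 43.0
```

Because every 23.3ms frame misses one 16.7ms vblank, plain VSync ends up presenting on every second refresh, which is exactly the "bump down to 30fps" behavior described above, while the variable-refresh case just shows all 43 frames per second.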
 
Are you serious? I thought FreeSync was similar to G-Sync, just in a more open format without needing the chip. If all it is is adaptive vsync, that sucks.

To be clear for everyone:

G-Sync = The refresh rate of your monitor changes with your framerate, so in theory you should never have any tearing and your picture will be very smooth. If you're running at 43fps, so is your monitor. Everything is in sync.

Adaptive VSync = In the event your framerate can't keep up with your refresh rate, VSync turns itself off temporarily, then back on once it can keep up again. So if your refresh rate is 60Hz but you're running at 43fps, it's the same as if you had no VSync at all: you get tearing. Normal VSync would bump this down to the next even divisor of the refresh rate, which is 30fps. So while Adaptive VSync is less choppy than VSync in a game that can't hold a full 60fps, it's basically nonexistent if you can't keep your framerate above 60 (or whatever your refresh rate is) most of the time, and it won't prevent tearing.

BIG difference between the two.

Your definitions are correct, but FreeSync is not equivalent to Nvidia's Adaptive VSync. It is called "adaptive sync", which is where the confusion comes in... but technically it has much in common with G-Sync, not Adaptive VSync. See here http://www.anandtech.com/show/8008/...andard-variable-refresh-monitors-move-forward
 
Your definitions are correct, but FreeSync is not equivalent to Nvidia's Adaptive VSync. It is called "adaptive sync", which is where the confusion comes in... but technically it has much in common with G-Sync, not Adaptive VSync. See here http://www.anandtech.com/show/8008/...andard-variable-refresh-monitors-move-forward

I thought it was called Variable VSync to differentiate it from Nvidia's Adaptive VSync (which just turns VSync on and off as needed).

For those saying that G-Sync is the same as FreeSync and the VESA variable-sync option... I thought G-Sync actually had the ability to fully control the sync from the graphics card rather than the monitor, to ensure frames are drawn exactly as they are ready. That eliminates the need for triple buffering, which both reduces the memory needed for frame buffers and eliminates the input lag from always being a frame behind the game.

Doesn't FreeSync just have the ability to adjust the sync time at the end of each frame for the upcoming frame? That would still need triple buffering and still have waits in cases where the frame takes longer than the estimate.

Maybe I'm mistaken because I haven't read about it recently, but I thought that was the main difference between the two when the news hit a while back.
 
skiboysteve said:
Your definitions are correct, but FreeSync is not equivalent to Nvidia's Adaptive VSync. It is called "adaptive sync", which is where the confusion comes in... but technically it has much in common with G-Sync, not Adaptive VSync. See here http://www.anandtech.com/show/8008/...andard-variable-refresh-monitors-move-forward
My mistake. I just wish we'd get SOME sync option for IPS monitors. I mean, G-Sync was announced 7 months ago and this is, what, the SECOND monitor supporting G-Sync, and another TN panel to boot? At this rate, maybe we'll have an IPS one a year from now.
 
I suspect I'll be using my 23in Samsung 1080p LED monitor for a long time still. I'm OK with it as it still looks good.
 
Umm, my Sammy U28D590 is UHD, TN, and runs at 60Hz with DP 1.2. Great looking for a TN as well.
If I were in the market for a 4K monitor, I'd get the Samsung U28D590 as well.

I have no issue with TN panels, and until a 28-inch or 30-inch IPS 4K panel at 60Hz with DP 1.2 or DP 1.2a support is at the same price as the Samsung (around $599 to $699), the Samsung is the better deal.

Not everyone has $2000 to $3000 to spend on a 4K IPS panel from Dell or ASUS.
 
I hate that the Asian manufacturers are always so far off regarding their monitor announcements. When is this monitor coming out, Acer?
 