ASUS Republic of Gamers (ROG) Announces Swift PG27UQ

Idiots. We want a 30-inch 4K. 27" is too small, especially for $1,200? Come on now. Vote with your wallets and brains, people. #2stepsforward5stepsback
 
Is it going to have the same horrible quality as their current IPS monitors?
 
27" in my opinion is not big enough for 4k

The monitor I'm waiting for (and I might be waiting 5 years or more for it to be available at less than $1,000) is a 40" OLED at 4K and 144Hz. I think by then we'll have more than enough GPU power for it.
 
My PC monitor is a 52" LG TV sitting two feet in front of me and the immersion is great. It does take a little getting used to with more head movement but I wouldn't trade it in for a 27". I might bite on a high-spec'd G-Sync in the mid 30 inch range though.

BTW, I'm impressed this thing is an IPS being driven at 144Hz. Not too long ago, all the 120Hz-and-above displays were TN panels. Not looking forward to the $300 G-Sync tax though.


Not sure why people keep calling this a G-Sync tax?

I mean, I get that G-Sync licensing adds $300 or so to the price, but G-Sync doesn't work with ATI (AMD) cards; it's an alternative. FreeSync costs more as well.

Let me explain it this way.

I have a favorite game, and a GTX 1070 with a 27" 144Hz G-Sync monitor runs it well, but the two together cost me, ballparking it, about $1,100.

Now a 27" freesync monitor of similar quality is going to cost me about $450+ for a 144hz panel and the best performing ATI card is what? An R9 Fury or RX480 each running around $260 ? But are you not going to need two of these cards and a much more powerful and costly PSU to drive them and match the performance of a GTX 1070 card? $450 + $520 is $970 for the monitor and two cards in X-Fire. You still need some more power and with it you'll get more heat and noise.

I just realistically don't see any savings here, not at the same performance point. Now if I have it wrong please explain, happy to listen.
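
To lay my ballpark numbers side by side (every price here is a rough guess on my part, not a quote):

# Ballpark prices only; none of these are real quotes.
nvidia_route = {
    "GTX 1070": 400,
    '27" 144Hz G-Sync monitor': 700,
}
amd_route = {
    '27" 144Hz FreeSync monitor': 450,
    "R9 Fury / RX 480 (card #1)": 260,
    "R9 Fury / RX 480 (card #2)": 260,   # CrossFire to approach a single GTX 1070
    "bigger PSU for two cards": 60,
}

print("NVIDIA route:", sum(nvidia_route.values()))   # roughly 1100
print("AMD route:   ", sum(amd_route.values()))      # roughly 1030, before the extra heat and noise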
 
Not sure why people keep calling this a G-Sync tax?

I mean, I get that G-Sync licensing adds $300 or so to the price, but G-Sync doesn't work with ATI (AMD) cards; it's an alternative. FreeSync costs more as well.

Now if I have it wrong please explain, happy to listen.

It's not just "an alternative". Nvidia could simply support FreeSync as well. There's literally nothing stopping them except profit. Consequently, you pay the "G-Sync tax" for the "privilege" of having an NV card and being able to adaptively sync the refresh on the display you bought. Granted, NV's tech is a little different and involves more technical boopityboops, but the point is that they could support FreeSync and G-Sync and choose not to do so for strictly selfish business reasons. Call it a tax or whatever, it still amounts to paying more "because we (NV) can do this and people still pay us". *shrug life*
 
It's not just "an alternative". Nvidia could simply support FreeSync as well. There's literally nothing stopping them except profit.

It's inferior to G-Sync in every way. They're not comparable.
 
32"+ or GTFO. I am looking to eventually replace a 30" 2560x1600 monitor I got 6 years ago.
 
Not so much if it has G-Sync. If it didn't have G-Sync, the cards would have to try to match the refresh rate or else stutter or tear, but since this is a G-Sync monitor, it can instead adjust its refresh rate to the system's ability to render the content. If the workload is easy it will fly; if it's demanding it will slow down and do what it can. For adaptive sync monitors, the max refresh rate is no longer a target to be reached but instead an upper limit that can be realized.

Or so that's how I see it at the moment.
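
If it helps to picture it, here's a toy sketch in Python (the frame times are made up, and it ignores how V-Sync can stall the card while it waits); it only shows that with adaptive sync a finished frame goes straight to the screen instead of waiting for the next fixed tick:

import math

frame_times_ms = [6.9, 7.5, 11.0, 9.2, 14.3, 8.0]   # made-up render times for six frames
TICK_MS = 1000 / 144                                 # fixed 144Hz scanout interval, ~6.94 ms

ready = 0.0
for n, ft in enumerate(frame_times_ms, 1):
    ready += ft                                      # the moment the card finishes this frame
    fixed = math.ceil(ready / TICK_MS) * TICK_MS     # fixed refresh: wait for the next tick
    adaptive = ready                                 # G-Sync style: the panel refreshes now
    print(f"frame {n}: ready {ready:6.2f} ms | fixed refresh shows it at {fixed:6.2f} ms | adaptive at {adaptive:6.2f} ms")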

But won't you get the best performance when you're running close to 144 frames with g sync? I always assumed you needed a ton of frames for the best experience

And is g sync noticeable? Is it a game changer? I don't really notice tearing because my frames don't dip, but I read you can still get tearing even if you're running games at a consistent 144 fps. Really curious about this
 
It's inferior to G-Sync in every way. They're not comparable.

How is g sync noticeable? What difference does it make as opposed to a non g sync monitor? If I run games at 120 fps with no dips, would I notice a difference?

From what I understand, it is great for people who get dips in fps. I run everything low so my frames stay high

Thanks
 
But won't you get the best performance when you're running close to 144 frames with g sync? I always assumed you needed a ton of frames for the best experience

And is g sync noticeable? Is it a game changer? I don't really notice tearing because my frames don't dip, but I read you can still get tearing even if you're running games at a consistent 144 fps. Really curious about this
GSync will prevent tearing at any frame rate, as the monitor refresh rate will follow whatever fps the game is running at.
Something isn't right if you see tearing with gsync on. Maybe vsync isn't working with the game or something.

For me it is noticeable. I no longer need to target performance above 60 FPS and cap it at 60 FPS to ensure a consistently smooth experience. In games like Witcher 3 and modded Skyrim, even my current card cannot consistently stay above 60 FPS; it often fluctuates down to 50 FPS, but I can't tell the difference: no screen tearing or stuttering when the FPS drops.
 
But won't you get the best performance when you're running close to 144 frames with g sync? I always assumed you needed a ton of frames for the best experience

And is g sync noticeable? Is it a game changer? I don't really notice tearing because my frames don't dip, but I read you can still get tearing even if you're running games at a consistent 144 fps. Really curious about this
It's been the biggest game changer for me in gaming for a long time. You don't get any tearing or judder at all no matter the framerate, as the refresh rate changes to match. Unless there is a performance issue with the game (G-Sync doesn't compensate for stutter), I can't notice when the framerate jumps between 50 and 100 FPS.

It's the exact opposite with G-Sync vs. V-Sync. You want to avoid approaching the max refresh rate of the monitor, as it acts more like V-Sync the closer you get. If you read around, a lot of people set a framerate cap of 135 FPS to avoid getting close to 144. I don't sweat over specific framerate targets, so I'll just turn the game settings up or down until it runs 80-100 FPS most of the time. With my PC that usually means just picking the highest quality preset and forgetting about it.
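
The arithmetic behind that cap is just frame-time headroom, roughly:

max_refresh_hz = 144
cap_fps = 135                                   # the cap a lot of people settle on

panel_min_interval_ms = 1000 / max_refresh_hz   # ~6.94 ms, the fastest the panel can refresh
capped_frame_time_ms = 1000 / cap_fps           # ~7.41 ms per frame at the cap

print(f"panel's shortest refresh interval: {panel_min_interval_ms:.2f} ms")
print(f"frame time at a {cap_fps} FPS cap:  {capped_frame_time_ms:.2f} ms "
      f"(~{capped_frame_time_ms - panel_min_interval_ms:.2f} ms of slack, so you never bump into the V-Sync ceiling)")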
 
How is g sync noticeable? What difference does it make as opposed to a non g sync monitor? If I run games at 120 fps with no dips, would I notice a difference?

From what I understand, it is great for people who get dips in fps. I run everything low so my frames stay high

Thanks

FreeSync is normally 45-60Hz; sometimes they can go up to 75Hz. These are the "cheap" ones people talk about. This is why people say FreeSync is not as good as G-Sync. Once you are into the 144Hz panels, the only advantage G-Sync has is that it can dip down to even 20Hz on some models.

So yeah, FreeSync can suck when you have a model where all it does is sync from 45Hz to 60Hz and that's it.
 
But won't you get the best performance when you're running close to 144 frames with g sync? I always assumed you needed a ton of frames for the best experience

And is g sync noticeable? Is it a game changer? I don't really notice tearing because my frames don't dip, but I read you can still get tearing even if you're running games at a consistent 144 fps. Really curious about this

Your video card has frame buffers;
In the earliest days, all cards used Vertical Synchronization, meaning the card would fill the primary display buffer and send that completed image to the monitor in sync with the monitor's vertical refresh rate, typically 60Hz, which equals 60 frames a second.
But then came video games that were harder on video cards, and the cards couldn't produce 60 new frames every second, so if a new frame wasn't fully ready yet, the last frame would be repeated. This is what made things look choppy, stuttering, chugging along.
Another part of the V-Sync method was that once an image was sent to the monitor, the video card dumped all the data from the buffer and started building the entire next image from scratch. Well, someone got smart and said, "Why build the whole thing from scratch? Instead of dumping all the data, just over-write the old data with new data, and when the monitor needs a new frame, send it what you have: part of an old frame and part of a new one." This is what's now called "V-Sync Disabled," or turning V-Sync off. They also started adding more buffers so that strong cards could render several frames at a time, sort of like getting a jump on things. Disabling V-Sync got rid of the stuttering, but it created a visual problem with image tearing because of the partial images.

Then new smart guys asked a question, "Why make the video card have to keep up with the monitor's refresh rate? Why not let the video card run as fast as it can, and tell the monitor to refresh when the card is ready to send the next good image?" and adaptive synchronization was born.
AMD's version is called FreeSync, and Nvidia calls theirs G-Sync.

Now you are correct that in a perfect world, perfect frames at 144 Hz (144 FPS) would be ideal. Even better is doing it at 4K instead of 1080P. But the reality is that the game can still make even the best cards run slower at 4K. So if you want 4K images you will have to settle for the fact that you are not going to get perfect images at 144 FPS. Without adaptive sync you will still have the two old choices of V-Sync ON or OFF. But with adaptive sync your card can simply run as fast as it can make perfect images and tell the monitor to stay in step.

Most will agree that G-Sync is superior to FreeSync, but everyone has a budget and everyone has choices to make. Part of the issue here is that at 4K and 144 FPS, the bandwidth needed is more than a DisplayPort cable can handle. My thought is that few systems are capable of producing that much data at 4K, so the card won't be running at 144 FPS anyway. I figure, since the card can control the refresh rate, if it is going to saturate the DisplayPort bandwidth it will simply slow down. Another thing, as I understand it, is that DisplayPort works differently than older solutions: it tells each pixel what color to display and how long to show it, which lessens the data that must be sent, instead of telling every pixel what to do on every refresh pass. You don't have to tell the sky to be blue 144 times a second; just tell it to stay blue until you say otherwise.

OK, that's the short book, I hope I didn't bore you, and that you or someone else is helped by it.
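
To put a rough number on the bandwidth point, counting active pixel data only (blanking and protocol overhead would add more):

width, height, refresh = 3840, 2160, 144   # 4K at this panel's top refresh

for name, bits_per_pixel in (("8-bit color (24 bpp)", 24), ("10-bit HDR (30 bpp)", 30)):
    gbps = width * height * refresh * bits_per_pixel / 1e9
    print(f"{name}: {gbps:.1f} Gbit/s of raw pixel data")

# DisplayPort 1.4 (HBR3) carries about 25.9 Gbit/s of payload after encoding overhead,
# so neither mode above fits uncompressed at the full 144 Hz.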
 
Idiots. We want a 30-inch 4K. 27" is too small, especially for $1,200? Come on now. Vote with your wallets and brains, people. #2stepsforward5stepsback

Not me, I want a $1,200 34" 21:9 curved display at 100hz with G-Sync for $900.00

Actually I want two, one mounted above the other, and when they are displaying content that isn't 21:9 I want it to carve the unused display area into additional virtual display desktop space. And I want to be able to combine both and display 16:9 content full-screen across both as well.
 
Not me, I want a $1,200 34" 21:9 curved display at 100hz with G-Sync for $900.00

Actually I want two, one mounted above the other, and when they are displaying content that isn't 21:9 I want it to carve the unused display area into additional virtual display desktop space. And I want to be able to combine both and display 16:9 content full-screen across both as well.
4K ultrawide? 5040x2160 :hungry:. HDMI 2.1 has enough bandwidth for it at 8 or 10 bpc at 144 Hz.
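
Back-of-the-envelope on that, active pixels only (my arithmetic, not a spec sheet):

width, height, refresh = 5040, 2160, 144   # the hypothetical 4K ultrawide above

for name, bits_per_pixel in (("8 bpc (24 bpp)", 24), ("10 bpc (30 bpp)", 30)):
    gbps = width * height * refresh * bits_per_pixel / 1e9
    print(f"{name}: {gbps:.1f} Gbit/s of raw pixel data")

# HDMI 2.1 signals at 48 Gbit/s and carries roughly 42.7 Gbit/s of payload,
# so 8 bpc fits uncompressed while 10 bpc would lean on the DSC compression 2.1 also adds.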
 
As I posted in another thread, as a user that's currently seeking a 4K monitor, this one - at least on paper - hits most of the specs I've been desiring. 4K, IPS w/ 10-bit + wide gamut, 144Hz, slim bezel, latest-gen DisplayPort + HDMI multiple inputs, HDR, Quantum Dot, ultra-high brightness and localized dimming regions, a form of adaptive sync, etc... all sound pretty great. There are only a couple of downsides (or potential ones). First is the size. While 27" is viable, it would be nice if someone offered a similar model with the same general specs but at 28-34", with 30-32" probably being the sweet spot. I'm sure 27" will work (I'm generally happy with it now), but larger would be okay as well given the resolution, and if it came with a really slim bezel that would be even better. The other issues go hand in hand - G-Sync and cost.

Overall, I prefer FreeSync to G-Sync for a number of reasons. I approve of how AMD has allowed a standardized FreeSync implementation on DisplayPort (and HDMI, if I am correct) and its general open ethic in terms of both tech and (lack of significant) licensing fees. G-Sync, on the other hand, requires an additional, special board added to the monitor hardware, which of course increases the cost; but worse than that are the business decisions and love of the proprietary in the Nvidia mindset. FreeSync has become a success in the market, widely implemented on many monitors thanks to its openness. Now Nvidia could, if they so chose, implement FreeSync support on their video cards/drivers so that those who bought a monitor that just so happened to be FreeSync-ready could enjoy it fully on their Nvidia card. However, Nvidia refuses to do so, not because of a technical issue but simply because they refuse to validate a "competitor's" standard. This is even more irritating when, as many G-Sync supporters suggest, its experience is "better" than FreeSync. If G-Sync is truly better, then why not support both while extolling the virtues of your preferred tech, since the alternative is "no real threat" to you? Doing otherwise gives Nvidia card owners with a FreeSync monitor a reason to be angry with the company for blocking access to a monitor feature that could easily be supported from a technical standpoint. Hell, even from "Nvidia short-term greed PR land" it could benefit them, because AMD literally could not do it the other way around, not just for licensing but because of the add-in board necessary. Thus, Nvidia could say, "We support all adaptive sync monitor standards, from FreeSync to the superior G-Sync experience," if just making the best experience for their GPU owners wasn't enough and they needed further justification. Now, I admit that my experience with both G-Sync and FreeSync has been somewhat limited, with friends' machines and demo systems using each tech, but I did not notice one being clearly superior, so it is frustrating there isn't one universal standard, much less being asked to pay a significant premium for one option... which sadly seems in line with Nvidia's ethos throughout their business.

So ultimately this Asus monitor is looking good, but it would be nice if they also offered two versions otherwise identical in spec and design - this one with G-Sync HDR and one with FreeSync 2.0 certification. Offering that option, as well as perhaps a 30-32" panel, would definitely push said family to the top of my list / "must buy" territory, at least on paper. Now to see how well these perform, when they come to market, at what cost, and what competitors may bring out. Still, very exciting.
 
32"+ or GTFO. I am looking to eventually replace a 30" 2560x1600 monitor I got 6 years ago.

I have three 30" monitors I would love to replace. The PG279Q was a big consideration, but I held off. I don't want smaller, and I don't want the panel lottery. My HP ZR30w's were all flawless, and that was a guarantee.
 
It's not just "an alternative". Nvidia could simply support FreeSync as well. There's literally nothing stopping them except profit. Consequently, you pay the "G-Sync tax" for the "privilege" of having an NV card and being able to adaptively sync the refresh on the display you bought. Granted, NV's tech is a little different and involves more technical boopityboops, but the point is that they could support FreeSync and G-Sync and choose not to do so for strictly selfish business reasons. Call it a tax or whatever, it still amounts to paying more "because we (NV) can do this and people still pay us". *shrug life*

Wait a second, wouldn't they need to license some tech from AMD for that, and are you sure AMD would sell it?
 
I'd love this monitor, but the ROG Swift monitors are absurdly expensive. I mean, seriously, it's like 800 bucks for the current 27" one.

Compare that to a 4K HDR TV right now, only you can get one of those at like 60" instead of 27", the drawback being the input lag though.
 
GSync will prevent tearing at any frame rate, as the monitor refresh rate will follow whatever fps the game is running at.
Something isn't right if you see tearing with gsync on. Maybe vsync isn't working with the game or something.

For me it is noticeable. I no longer need to target performance above 60 FPS and cap it at 60 FPS to ensure a consistently smooth experience. In games like Witcher 3 and modded Skyrim, even my current card cannot consistently stay above 60 FPS; it often fluctuates down to 50 FPS, but I can't tell the difference: no screen tearing or stuttering when the FPS drops.

But what if you are running 120 FPS consistently with no V-Sync? Will I notice a difference with G-Sync running my games at 120 FPS, no dips? I am playing BF1 right now and have the FPS counter in my corner, and so far I haven't seen any dips.

Is G-Sync for people who like to run their games at high settings? I run my games on low so my frames never go below what I set.

It's been the biggest game changer for me in gaming for a long time. You don't get any tearing or judder at all no matter the framerate, as the refresh rate changes to match. Unless there is a performance issue with the game (G-Sync doesn't compensate for stutter), I can't notice when the framerate jumps between 50 and 100 FPS.

It's the exact opposite with G-Sync vs. V-Sync. You want to avoid approaching the max refresh rate of the monitor, as it acts more like V-Sync the closer you get. If you read around, a lot of people set a framerate cap of 135 FPS to avoid getting close to 144. I don't sweat over specific framerate targets, so I'll just turn the game settings up or down until it runs 80-100 FPS most of the time. With my PC that usually means just picking the highest quality preset and forgetting about it.


I am so confused by this. I have been playing BF4 for a while, and I read that you should always have your frames go above the refresh rate to prevent tearing. Some like it 1 frame above the refresh rate, and some like it 5 frames above the refresh rate. Like, I would play BF4 at 144Hz and have my frames set to 150. I would set my frames to 121 for 120 because it seemed like a lot of people did that.

I don't have G-Sync right now. I am playing BF1 and I never go below 120 frames. Will I notice a difference in gaming with G-Sync? I don't use V-Sync. From what I understand, it is great for those who cannot get consistent frames. I am getting a consistent 120 frames, so I am not sure if I will see any added benefit with G-Sync.

FreeSync is normally 45-60Hz; sometimes they can go up to 75Hz. These are the "cheap" ones people talk about. This is why people say FreeSync is not as good as G-Sync. Once you are into the 144Hz panels, the only advantage G-Sync has is that it can dip down to even 20Hz on some models.

So yeah, FreeSync can suck when you have a model where all it does is sync from 45Hz to 60Hz and that's it.

What if I am getting 120 frames consistently, will I see any added benefit with G-Sync? I do not use V-Sync.

Your video card has frame buffers;
In the earliest days, all cards used Vertical Synchronization, meaning the card would fill the primary display buffer and send that completed image to the monitor in sync with the monitor's vertical refresh rate, typically 60Hz, which equals 60 frames a second.
But then came video games that were harder on video cards, and the cards couldn't produce 60 new frames every second, so if a new frame wasn't fully ready yet, the last frame would be repeated. This is what made things look choppy, stuttering, chugging along.
Another part of the V-Sync method was that once an image was sent to the monitor, the video card dumped all the data from the buffer and started building the entire next image from scratch. Well, someone got smart and said, "Why build the whole thing from scratch? Instead of dumping all the data, just over-write the old data with new data, and when the monitor needs a new frame, send it what you have: part of an old frame and part of a new one." This is what's now called "V-Sync Disabled," or turning V-Sync off. They also started adding more buffers so that strong cards could render several frames at a time, sort of like getting a jump on things. Disabling V-Sync got rid of the stuttering, but it created a visual problem with image tearing because of the partial images.

Then new smart guys asked a question, "Why make the video card have to keep up with the monitor's refresh rate? Why not let the video card run as fast as it can, and tell the monitor to refresh when the card is ready to send the next good image?" and adaptive synchronization was born.
AMD's version is called FreeSync, and Nvidia calls theirs G-Sync.

Now you are correct that in a perfect world, perfect frames at 144 Hz (144 FPS) would be ideal. Even better is doing it at 4K instead of 1080P. But the reality is that the game can still make even the best cards run slower at 4K. So if you want 4K images you will have to settle for the fact that you are not going to get perfect images at 144 FPS. Without adaptive sync you will still have the two old choices of V-Sync ON or OFF. But with adaptive sync your card can simply run as fast as it can make perfect images and tell the monitor to stay in step.

Most will agree that G-Sync is superior to FreeSync, but everyone has a budget and everyone has choices to make. Part of the issue here is that at 4K and 144 FPS, the bandwidth needed is more than a DisplayPort cable can handle. My thought is that few systems are capable of producing that much data at 4K, so the card won't be running at 144 FPS anyway. I figure, since the card can control the refresh rate, if it is going to saturate the DisplayPort bandwidth it will simply slow down. Another thing, as I understand it, is that DisplayPort works differently than older solutions: it tells each pixel what color to display and how long to show it, which lessens the data that must be sent, instead of telling every pixel what to do on every refresh pass. You don't have to tell the sky to be blue 144 times a second; just tell it to stay blue until you say otherwise.

OK, that's the short book, I hope I didn't bore you, and that you or someone else is helped by it.

You helped a lot ;)

Right now I am not worried about 4K. I just like FPS games and don't want a low frame rate, because I play at 120Hz with no V-Sync.

What I do not understand is, if I am getting 120 frames consistently with no drops, will I see any added benefit with G-Sync? That is where my confusion lies. I do not use V-Sync. I read there can be tearing even if you are getting 120 frames and you're running at 120Hz with no V-Sync. I am getting 120 FPS with no drops, so I am not sure if I should go for G-Sync. Will I notice a difference?

I was going to go for G-Sync a few years back when it first came out. I decided to hold out until the technology matured. I later bought an ATI card and decided to use Mantle. Mantle was incredible, and I loved every second of it. Great technology.

So yeah, should I go for G-Sync if my frames do not dip?

Thanks
 
But what if you are running 120 FPS consistently with no V-Sync? Will I notice a difference with G-Sync running my games at 120 FPS, no dips? I am playing BF1 right now and have the FPS counter in my corner, and so far I haven't seen any dips.

Is G-Sync for people who like to run their games at high settings? I run my games on low so my frames never go below what I set.




I am so confused by this. I have been playing BF4 for a while, and I read that you should always have your frames go above the refresh rate to prevent tearing. Some like it 1 frame above the refresh rate, and some like it 5 frames above the refresh rate. Like, I would play BF4 at 144Hz and have my frames set to 150. I would set my frames to 121 for 120 because it seemed like a lot of people did that.

I don't have G-Sync right now. I am playing BF1 and I never go below 120 frames. Will I notice a difference in gaming with G-Sync? I don't use V-Sync. From what I understand, it is great for those who cannot get consistent frames. I am getting a consistent 120 frames, so I am not sure if I will see any added benefit with G-Sync.



What if I am getting 120 frames consistently, will I see any added benefit with G-Sync? I do not use V-Sync.



You helped a lot ;)

Right now I am not worried about 4K. I just like FPS games and don't want a low frame rate, because I play at 120Hz with no V-Sync.

What I do not understand is, if I am getting 120 frames consistently with no drops, will I see any added benefit with G-Sync? That is where my confusion lies. I do not use V-Sync. I read there can be tearing even if you are getting 120 frames and you're running at 120Hz with no V-Sync. I am getting 120 FPS with no drops, so I am not sure if I should go for G-Sync. Will I notice a difference?

I was going to go for G-Sync a few years back when it first came out. I decided to hold out until the technology matured. I later bought an ATI card and decided to use Mantle. Mantle was incredible, and I loved every second of it. Great technology.

So yeah, should I go for G-Sync if my frames do not dip?

Thanks

You must have some magic setup if you are running 120fps with ZERO dips all the time.
 
You must have some magic setup if you are running 120fps with ZERO dips all the time.

He said he runs at low settings. In other words, he's not taxing his card at all.

etegv, it seems to me you are denying yourself some goodness. But if you are happy with what you have, I wouldn't change a thing.

Most people will not lower their settings unless it's too much for their system and it's running too slow for their liking; then they will start backing off the settings, or go and buy better equipment.

But you just are not one of these people.

As for your question, would you notice a difference, I would say not really. I say this because you probably still wouldn't sacrifice visual quality and effects for frames per second. You would still leave your settings on low, the video card still wouldn't be working that hard, and the only benefit you would get out of G-Sync is that it would force perfect frames and eliminate any artifacts or tearing related to V-Sync being OFF like you run now. You could also see G-Sync force your frame rate a little lower, even with low settings.

So, if I understand you correctly, there are better places for you to put your money. Putting it into a monitor like this with G-Sync isn't going to do it for you, I think.
 
You must have some magic setup if you are running 120fps with ZERO dips all the time.

R9 290
4670k

I play bf1 low settings

I used the perfoverlay command and get 120 fps consistently

Would I see any added benefit with g sync gameplay wise?
 
R9 290
4670k

I play bf1 low settings

I used the perfoverlay command and get 120 fps consistently

Would I see any added benefit with g sync gameplay wise?

Probably not. G-Sync is for those that like playing at higher resolutions with graphics turned up higher than low.
 
What would be the logical next size up from 27 for this? 32 or 34?

Have a 30 I want to replace that I got in 2008.
 
R9 290
4670k

I play bf1 low settings

I used the perfoverlay command and get 120 fps consistently

Would I see any added benefit with g sync gameplay wise?
BTW, the perfoverlay command has been reported as being a little buggy and not always working properly. I also wonder how the command works when you have V-Sync ON vs. OFF.

Lastly, it's always sobering to remember that no matter how many FPS are being reported, what you see cannot exceed the monitor's refresh rate. If a monitor is 120Hz, that is a maximum of 120 frames shown per second, even if the software says more frames are being produced.
 
BTW, the perfoverlay command has been reported as being a little buggy and not always working properly. I also wonder how the command works when you have V-Sync ON vs. OFF.

Lastly, it's always sobering to remember that no matter how many FPS are being reported, what you see cannot exceed the monitor's refresh rate. If a monitor is 120Hz, that is a maximum of 120 frames shown per second, even if the software says more frames are being produced.

Not entirely; FPS and Hz do not directly correlate. 60 FPS vs. 120 FPS is definitely noticeable on a 60Hz monitor.
 
Not entirely; FPS and Hz do not directly correlate. 60 FPS vs. 120 FPS is definitely noticeable on a 60Hz monitor.

How?

If you have V-Sync enabled or ON, the monitor sends a signal to the video card to keep everything sync'd up. The video card is building the frame in the buffer and once the frame is completed it listens for the signal and sends the new frame for the monitor refresh. Then the buffer is dumped and completely rebuilt.

If V-Sync is OFF, once a frame is sent, instead of dumping and completely rebuilding the buffer, the video card over-writes data in the buffer, and at whatever point in this process it gets the refresh signal, it sends what it has and continues. There are additional details like added buffers and such, but they don't change the basics.

So what is it you are noticing between 60 and 120 FPS on a 60Hz monitor?

I think I need a better understanding of what you mean when you say they don't directly correlate, because when a monitor has a 60Hz refresh rate, the monitor will not display more than 60 frames; no matter what the quality of those frames is, you will not get more than 60 of them at the monitor.
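
To make the over-writing part concrete, here's a crude toy model (rows standing in for scanlines, nothing like real driver code):

ROWS = 8
old_frame = ["old"] * ROWS
new_frame = ["new"] * ROWS

# V-Sync ON: whole frames get swapped at the refresh signal, so the monitor reads one complete image.
vsync_on_scanout = list(new_frame)

# V-Sync OFF: the card had only over-written 5 of the 8 rows when the refresh signal arrived.
rows_written = 5
vsync_off_scanout = new_frame[:rows_written] + old_frame[rows_written:]   # top is new, bottom is old: a tear

print("V-Sync ON :", vsync_on_scanout)
print("V-Sync OFF:", vsync_off_scanout)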
 
But what if you are running 120 FPS consistently with no V-Sync? Will I notice a difference with G-Sync running my games at 120 FPS, no dips? I am playing BF1 right now and have the FPS counter in my corner, and so far I haven't seen any dips.

Is G-Sync for people who like to run their games at high settings? I run my games on low so my frames never go below what I set.




I am so confused by this. I have been playing BF4 for a while, and I read that you should always have your frames go above the refresh rate to prevent tearing. Some like it 1 frame above the refresh rate, and some like it 5 frames above the refresh rate. Like, I would play BF4 at 144Hz and have my frames set to 150. I would set my frames to 121 for 120 because it seemed like a lot of people did that.

I don't have G-Sync right now. I am playing BF1 and I never go below 120 frames. Will I notice a difference in gaming with G-Sync? I don't use V-Sync. From what I understand, it is great for those who cannot get consistent frames. I am getting a consistent 120 frames, so I am not sure if I will see any added benefit with G-Sync.



What if I am getting 120 frames consistently, will I see any added benefit with G-Sync? I do not use V-Sync.



You helped a lot ;)

Right now I am not worried about 4K. I just like FPS games and don't want a low frame rate, because I play at 120Hz with no V-Sync.

What I do not understand is, if I am getting 120 frames consistently with no drops, will I see any added benefit with G-Sync? That is where my confusion lies. I do not use V-Sync. I read there can be tearing even if you are getting 120 frames and you're running at 120Hz with no V-Sync. I am getting 120 FPS with no drops, so I am not sure if I should go for G-Sync. Will I notice a difference?

I was going to go for G-Sync a few years back when it first came out. I decided to hold out until the technology matured. I later bought an ATI card and decided to use Mantle. Mantle was incredible, and I loved every second of it. Great technology.

So yeah, should I go for G-Sync if my frames do not dip?

Thanks
Sounds like you'd be more interested in Fast Sync, which doesn't require any additional monitor tech. Just go into the NVIDIA control panel and globally set Vertical sync to Fast. This allows the game to still run as fast as possible while the display still only shows frames as fast as its maximum refresh rate. This gives you the benefit of minimal input lag and virtually no tearing, though judder is still possible (which is one of the artifacts G-Sync addresses).
 
How?

If you have V-Sync enabled or ON, the monitor sends a signal to the video card to keep everything sync'd up. The video card is building the frame in the buffer and once the frame is completed it listens for the signal and sends the new frame for the monitor refresh. Then the buffer is dumped and completely rebuilt.

If V-Sync is OFF, once a frame is sent, instead of dumping and completely rebuilding the buffer, the video card over-writes data in the buffer, and at whatever point in this process it gets the refresh signal, it sends what it has and continues. There are additional details like added buffers and such, but they don't change the basics.

So what is it you are noticing between 60 and 120 FPS on a 60Hz monitor?

I think I need a better understanding of what you mean when you say they don't directly correlate, because when a monitor has a 60Hz refresh rate, the monitor will not display more than 60 frames; no matter what the quality of those frames is, you will not get more than 60 of them at the monitor.

I never used V-Sync on my 60Hz monitors. So while there was more tearing, there were fewer FPS spikes and gameplay just had a much smoother feel to it. Hard to explain, so I'll just quote this post.


"This is always told wrong. If you do not have V-sync on, and you have 70 FPS, as an example, with a 60hz screen, you will indeed see 70 FPS, but there will be many partial frames shown. While the monitor updates its image and a new frame comes in, as a result of rendering frames faster than the monitor refresh rate, the monitor will shift to a new frame before it has a chance to display the whole image.

The frame is shown, just not a complete frame. A 60hz monitor can only show 60 complete frames, but if you have 70 FPS, you'll end up seeing 70 frames at an average of 60/70 of each of those frames.

At 120 FPS, you'll see 120 half frames on average.

Obviously a 120hz monitor is better.

NOTE: if you do not use V-sync, no matter what your FPS are, you get screen tearing."
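
Just to make the arithmetic in that quote explicit (idealized, assuming perfectly even frame times, which real games never deliver):

refresh_hz = 60

for fps in (70, 120):
    frames_per_refresh = fps / refresh_hz   # new frames arriving during each 1/60 s scanout
    fraction_shown = refresh_hz / fps       # average portion of each frame that reaches the screen
    print(f"{fps} FPS on a {refresh_hz} Hz panel: ~{frames_per_refresh:.2f} frames per refresh, "
          f"about {fraction_shown:.2f} of each frame actually shown")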
 
I never used V-Sync on my 60Hz monitors. So while there was more tearing, there were fewer FPS spikes and gameplay just had a much smoother feel to it. Hard to explain, so I'll just quote this post.


"This is always told wrong. If you do not have V-sync on, and you have 70 FPS, as an example, with a 60hz screen, you will indeed see 70 FPS, but there will be many partial frames shown. While the monitor updates its image and a new frame comes in, as a result of rendering frames faster than the monitor refresh rate, the monitor will shift to a new frame before it has a chance to display the whole image.

The frame is shown, just not a complete frame. A 60hz monitor can only show 60 complete frames, but if you have 70 FPS, you'll end up seeing 70 frames at an average of 60/70 of each of those frames.

At 120 FPS, you'll see 120 half frames on average.

Obviously a 120hz monitor is better.

NOTE: if you do not use V-sync, no matter what your FPS are, you get screen tearing."

I think what we have going is a difference in how the technical aspect is being described. It's like two people saying the same thing, but differently. The real point I was trying to make, this other guy makes as well. What he is also saying is that with V-Sync disabled, all those partial frames are going to "add up" to the 70 FPS that the card is producing. I do wonder what he is trying to say here, though, because it makes no sense to me at all.

While the monitor updates its image and a new frame comes in, as a result of rendering frames faster than the monitor refresh rate, the monitor will shift to a new frame before it has a chance to display the whole image.
I think this is just a convoluted way of saying that the displayed frame is part of one frame and part of another, because the new frame didn't fully over-write the buffer and the old frame's data.

If you asked me whether being able to render faster than your refresh rate is better, I would say yes, it's definitely better. In that situation, is it better to have V-Sync ON or OFF? I would say that depends on preference and what kind of app you are running. OFF for FPS gaming, hell yes; Photoshop Pro, keep V-Sync ON.
 