Geforcepat
[H]ard|Gawd
- Joined: Jun 2, 2012
- Messages: 1,170
Idiots. We want a 30-inch 4K; 27 is too small, especially for $1200? Come on now. Vote with your wallets and brains, people. #2stepsforward5stepsback
27" in my opinion is not big enough for 4k
My PC monitor is a 52" LG TV sitting two feet in front of me and the immersion is great. It does take a little getting used to with more head movement but I wouldn't trade it in for a 27". I might bite on a high-spec'd G-Sync in the mid 30 inch range though.
BTW, I'm impressed this thing is an IPS being driven at 144 Hz. Not too long ago, all the 120 Hz-and-above displays were TN panels. Not looking forward to the $300 G-Sync tax though.
Not sure why people keep calling this a G-Sync tax.
I mean, I get that G-Sync licensing adds $300 or so to the price, but G-Sync doesn't work with ATI (AMD) cards; it's an alternative. FreeSync adds cost as well.
Now if I have it wrong, please explain; happy to listen.
It's not just "an alternative". Nvidia could simply support FreeSync as well. There's literally nothing stopping them except profit.
VA panels are too slow for 144 Hz.
I'll take a 120Hz VA. I don't want the IPS Glow and backlight bleed.
It's inferior to G-Sync in every way. They're not comparable.
Not so much if it has G-Sync. Without G-Sync, the card would have to try to match the refresh rate, or stutter, or tear; but since this is a G-Sync monitor, it can instead adjust its refresh rate to the system's ability to render the content. If the workload is easy it will fly; if it's demanding it will slow down and do what it can. For adaptive-sync monitors, the max refresh rate is no longer a target to be reached but an upper limit that can be realized.
Or at least that's how I see it at the moment.
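To make that "upper limit, not a target" idea concrete, here's a toy Python timing sketch. The frame times are invented for illustration, and the model ignores real-world details like buffering and scanout time; it only compares when a finished frame can reach the screen under fixed-refresh V-Sync versus adaptive sync.

```python
# Toy timing model of fixed-refresh V-Sync vs. adaptive sync (G-Sync/FreeSync).
# All numbers are invented for illustration; real pipelines are more complex.
import math

MAX_HZ = 144
MIN_INTERVAL = 1000 / MAX_HZ  # ms between refreshes at the panel's max rate

def vsync_display_times(frame_times_ms):
    """With fixed-refresh V-Sync, a finished frame waits for the next tick."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        t += ft                                   # GPU finishes frame at time t
        shown.append(math.ceil(t / MIN_INTERVAL) * MIN_INTERVAL)
    return shown

def adaptive_display_times(frame_times_ms):
    """With adaptive sync, the panel refreshes as soon as the frame is ready,
    but never faster than its max rate: the max is a ceiling, not a target."""
    shown, t, last = [], 0.0, 0.0
    for ft in frame_times_ms:
        t += ft
        last = max(t, last + MIN_INTERVAL)
        shown.append(last)
    return shown

frames = [10, 12, 20, 9]  # ms per frame: a fluctuating 50-110 FPS workload
print(vsync_display_times(frames))     # frames wait for fixed refresh ticks
print(adaptive_display_times(frames))  # frames appear the moment they're done
```

With adaptive sync every display time equals the frame-ready time; with fixed V-Sync each frame is delayed to the next tick.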
GSync will prevent tearing at any frame rate, as the monitor refresh rate will follow whatever fps the game is running at.

But won't you get the best performance when you're running close to 144 frames with g sync? I always assumed you needed a ton of frames for the best experience
And is g sync noticeable? Is it a game changer? I don't really notice tearing because my frames don't dip, but I read you can still get tearing even if you're running games at a consistent 144 fps. Really curious about this
It's been the biggest game changer for me in gaming for a long time. You don't get any tearing or judder at all no matter the framerate, as the refresh rate changes to match. Unless there is a performance issue with the game (G-Sync doesn't compensate for stutter), I can't notice when the framerate jumps between 50 and 100 FPS.

But won't you get the best performance when you're running close to 144 frames with g sync? I always assumed you needed a ton of frames for the best experience
And is g sync noticeable? Is it a game changer? I don't really notice tearing because my frames don't dip, but I read you can still get tearing even if you're running games at a consistent 144 fps. Really curious about this
How is g sync noticeable? What difference does it make as opposed to a non g sync monitor? If I run games at 120 fps with no dips, would I notice a difference?
From what I understand, it is great for people who get dips in fps. I run everything low so my frames stay high
Thanks
4K ultrawide? 5040x2160. HDMI 2.1 has enough bandwidth for it at 8 or 10 bpc at 144 Hz.

Not me, I want that $1,200 34" 21:9 curved display at 100 Hz with G-Sync, for $900.00.
Actually I want two, one mounted above the other, and when they are displaying content that isn't 21:9 I want it to carve the unused display area into additional virtual display desktop space. And I want to be able to combine both and display 16:9 content full-screen across both as well.
32"+ or GTFO. I am looking to eventually replace a 30" 2560x1600 monitor I got 6 years ago.
Since Nvidia could simply support FreeSync but chooses not to, you pay the "G-Sync tax" for the "privilege" of having an NV card and being able to adaptively sync the refresh on the display you bought. Granted, NV's tech is a little different and involves more technical boopityboops, but the point is that they could support both FreeSync and G-Sync and choose not to for strictly selfish business reasons. Call it a tax or whatever; it still amounts to paying more "because we (NV) can do this and people still pay us". *shrug life*
GSync will prevent tearing at any frame rate, as the monitor refresh rate will follow whatever fps the game is running at.
Something isn't right if you see tearing with gsync on. Maybe vsync isn't working with the game or something.
For me it is noticeable. I no longer need to target performance above 60 fps and cap it at 60 fps to ensure a constantly smooth experience. In games like The Witcher 3 and modded Skyrim, even my current card cannot stay above 60 fps; it often fluctuates down to 50 fps, but I can't tell the difference: no screen tearing or stuttering when the fps drops.
It's been the biggest game changer for me in gaming for a long time. You don't get any tearing or judder at all no matter the framerate, as the refresh rate changes to match. Unless there is a performance issue with the game (G-Sync doesn't compensate for stutter), I can't notice when the framerate jumps between 50 and 100 FPS.
It's the exact opposite with G-Sync vs. V-Sync: you want to avoid approaching the max refresh rate of the monitor, as it acts more like V-Sync the closer you get. If you read around, a lot of people set a framerate cap of 135 FPS to avoid getting close to 144. I don't sweat over specific framerate targets, so I just turn the game settings up or down until it runs at 80-100 FPS most of the time. With my PC that usually means picking the highest quality preset and forgetting about it.
FreeSync ranges are often only 45-60 Hz, sometimes up to 75 Hz. These are the "cheap" monitors people talk about, and they're why people say FreeSync is not as good as G-Sync. Once you're into 144 Hz panels, the main advantage G-Sync keeps is that it can go down to very low refresh rates, reportedly as low as 20 Hz on some models.
So yeah, FreeSync can suck when your monitor only syncs from 45 Hz to 60 Hz and that's it.
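A toy sketch of why the range matters. AMD's Low Framerate Compensation (LFC) can repeat frames to stay inside the variable range, but as a rule of thumb it requires the panel's max refresh to be at least about twice its minimum; the specific monitors and numbers below are invented for illustration.

```python
# Why a narrow adaptive-sync range (e.g. 45-60 Hz) is limiting.
# LFC repeats frames to keep a low fps inside the panel's variable range,
# but (rule of thumb) only works when max_hz >= ~2x min_hz.

def lfc_capable(min_hz, max_hz):
    return max_hz >= 2 * min_hz

def effective_refresh(fps, min_hz, max_hz):
    """Refresh rate the panel would run at for a given frame rate."""
    if fps > max_hz:
        return max_hz              # capped at the panel's maximum
    if fps >= min_hz:
        return fps                 # inside the variable range: perfect match
    if lfc_capable(min_hz, max_hz):
        mult = 2
        while fps * mult < min_hz:
            mult += 1
        return fps * mult          # each frame shown 'mult' times
    return None                    # out of range: back to plain V-Sync on/off

print(effective_refresh(50, 45, 60))    # in range: panel matches 50 Hz
print(effective_refresh(40, 45, 60))    # narrow range, no LFC: None
print(effective_refresh(20, 30, 144))   # wide range: LFC doubles to 40 Hz
```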
Your video card has frame buffers.
In the earliest days, all cards used vertical synchronization: the card would fill the primary display buffer and send that completed image to the monitor in sync with the monitor's vertical refresh rate, typically 60 Hz, which equals 60 frames per second.
But then came video games that were harder on video cards, and cards couldn't always produce 60 new frames every second. So if a new frame wasn't fully ready yet, the last frame was repeated. This is what made things look choppy: stuttering, chugging along.
Another part of the V-Sync method was that once an image was sent to the monitor, the video card dumped all the data from the buffer and started building the entire next image from scratch. Well, someone got smart and said, "Why build the whole thing from scratch? Instead of dumping all the data, just overwrite the old data with new data, and when the monitor needs a new frame, send it what you have: part of an old frame and part of a new one." This is what's now called "V-Sync disabled," or turning V-Sync off. They also started adding more buffers so that strong cards could render several frames ahead, sort of like getting a jump on things. Disabling V-Sync got rid of the stuttering, but it created a visual problem, image tearing, because of the partial images.
Then new smart guys asked a question, "Why make the video card have to keep up with the monitor's refresh rate? Why not let the video card run as fast as it can, and tell the monitor to refresh when the card is ready to send the next good image?" and adaptive synchronization was born.
AMD's version is called FreeSync, and Nvidia calls theirs G-Sync.
Now you are correct that in a perfect world, perfect frames at 144 Hz (144 FPS) would be ideal. Even better is doing it at 4K instead of 1080P. But the reality is that the game can still make even the best cards run slower at 4K. So if you want 4K images you will have to settle for the fact that you are not going to get perfect images at 144 FPS. Without adaptive sync you will still have the two old choices of V-Sync ON or OFF. But with adaptive sync your card can simply run as fast as it can make perfect images and tell the monitor to stay in step.
Most will agree that G-Sync is superior to FreeSync, but everyone has a budget and choices to make. Part of the issue here is that at 4K and 144 FPS, the bandwidth needed is more than a DisplayPort cable can comfortably handle. My thought is that few systems are capable of producing that much data at 4K anyway, so the card won't be running at 144 FPS in practice. And since with adaptive sync the card controls the refresh rate, if it were going to saturate the DisplayPort bandwidth it would simply slow down.
OK, that's the short book, I hope I didn't bore you, and that you or someone else is helped by it.
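The "repeated frame" stutter from the post above can be sketched in a few lines of Python. The numbers are invented for illustration: a 60 Hz panel with V-Sync on, and a GPU that needs 25 ms per frame (only 40 FPS).

```python
# V-Sync ON with a slow GPU: at some refresh ticks no new frame is ready,
# so the panel shows the previous frame again. Numbers invented to illustrate.

def frames_finished_per_tick(frame_time_ms, refresh_hz=60, refreshes=8):
    """How many frames the GPU has completed by each refresh tick.
    A repeated count means the panel had to show the same frame twice."""
    interval = 1000 / refresh_hz
    return [int(i * interval // frame_time_ms) for i in range(1, refreshes + 1)]

# 25 ms per frame (40 FPS) on a 60 Hz panel:
print(frames_finished_per_tick(25))
```

The repeated counts in the output are the choppiness described above: the panel refreshed, but no new frame was ready, so the old one was sent again.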
But what if you are running 120 fps consistently with no V-Sync? Will I notice a difference with G-Sync running my games at 120 fps, no dips? I am playing BF1 right now and have the fps counter in my corner, and so far I haven't seen any dips.
Is G-Sync for people who like to run their games at high settings? I run my games on low so my frames never go below what I set.
I am so confused by this. I have been playing BF4 for a while, and I read that you should always have your frames go above the refresh rate to prevent tearing. Some like it 1 frame above the refresh rate, and some like it 5 frames above. Like I would play BF4 at 144 Hz and have my frames set to 150. I would set my frames to 121 for 120 Hz because it seemed like a lot of people did that.
I dont have g sync right now. I am playing bf1 and I never go below 120 frames. Will I notice a difference in gaming with g sync? I don't use v sync. From what I understand, it is great for those who cannot get consistent frames. I am getting a consistent 120 frames, so I am not sure if I will see any added benefit with g sync.
what if I am getting 120 frames consistently, will I see any added benefit with g sync? I do not use v sync.
You helped a lot
Right now I am not worried about 4k. I just like fps games and don't want a low amount of frames because I play at 120 hz with no v sync
What I do not understand is: if I am getting 120 frames consistently with no drops, will I see any added benefit with G-Sync? That is where my confusion lies. I do not use V-Sync. I read there can be tearing even if you are getting 120 frames and you're running at 120 Hz with no V-Sync. I am getting 120 fps with no drops, so I am not sure if I should go for G-Sync. Will I notice a difference?
I was going to go for g sync a few years back when it first came out. I decided to hold out until the technology matured. I later bought an ATI card and decided to use mantle. Mantle was incredible, and I loved every second of it. Great technology.
so yea, should I go for g sync if my frames do not dip?
thanks
You must have some magic setup if you are running 120fps with ZERO dips all the time.
R9 290
4670k
I play bf1 low settings
I used the perfoverlay command and get 120 fps consistently
Would I see any added benefit with g sync gameplay wise?
BTW, the perfoverlay command has been reported as being a little buggy and not always working properly. I also wonder how the command behaves when you have V-Sync ON vs. OFF.
Lastly, it's always sobering to remember that no matter how many FPS are being reported, the monitor cannot display more frames than its refresh rate allows. If a monitor is 120 Hz, that is a maximum of 120 displayed frames per second, even if the software says more are being produced.
Not entirely; FPS and Hz do not directly correlate. 60 FPS vs. 120 FPS is definitely noticeable on a 60 Hz monitor.
Sounds like you'd be more interested in Fast Sync, which doesn't require any additional monitor tech. Just go into the NVIDIA control panel and globally set Vertical sync to Fast. This allows the game to still run as fast as possible while the display still only shows frames as fast as its maximum refresh rate. This gives you the benefit of minimal input lag and virtually no tearing, though judder is still possible (which is one of the artifacts G-Sync addresses).
How?
If you have V-Sync enabled, the monitor sends a signal to the video card to keep everything synced up. The video card builds the frame in the buffer, and once the frame is completed it waits for the signal and sends the new frame for the monitor refresh. Then the buffer is dumped and completely rebuilt.
If V-Sync is off, once a frame is sent, instead of dumping and completely rebuilding the buffer, the video card overwrites data in the buffer, and at whatever point in this process it gets the refresh signal, it sends what it has and continues. There are additional details like added buffers and such, but they don't change the basics.
So what is it you are noticing between 60 and 120 FPS on a 60 Hz monitor?
I think I need a better understanding of what you mean when you say they don't directly correlate, because when a monitor has a 60 Hz refresh rate it will not display more than 60 frames; no matter the quality of those frames, you will not get more than 60 of them at the monitor.
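The V-Sync OFF case above can also be sketched: if the buffer gets overwritten partway through the panel's scanout, rows above that point come from the old frame and rows below from the new one, and that boundary is the visible tear. The timings and row count below are invented for illustration.

```python
# Where a tear line lands with V-Sync off: the buffer swap happens at some
# point during the panel's top-to-bottom scanout, splitting old/new frames.

def tear_row(scanout_start_ms, swap_ms, refresh_hz=144, rows=1080):
    """Screen row where old and new frame content meet, or None if the
    buffer swap happened outside this scanout pass."""
    scan_time = 1000 / refresh_hz            # time to draw one full refresh
    frac = (swap_ms - scanout_start_ms) / scan_time
    return int(frac * rows) if 0 <= frac < 1 else None

print(tear_row(0.0, 3.5))    # swap mid-scan: tear about halfway down
print(tear_row(0.0, 10.0))   # swap after this scanout finished: no tear here
```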
I never used V-Sync on my 60 Hz monitors. So while there was more tearing, there were fewer FPS spikes and gameplay just had a much smoother feel to it. Hard to explain, so I'll just quote this post.
"This is always told wrong. If you do not have V-sync on, and you have 70 FPS, as an example, with a 60hz screen, you will indeed see 70 FPS, but there will be many partial frames shown. While the monitor updates its image and a new frame comes in, as a result of rendering frames faster than the monitor refresh rate, the monitor will shift to a new frame before it has a chance to display the whole image.
The frame is shown, just not a complete frame. A 60hz monitor can only show 60 complete frames, but if you have 70 FPS, you'll end up seeing 70 frames at an average of 60/70 of each of those frames.
At 120 FPS, you'll see 120 half frames on average.
Obviously a 120hz monitor is better.
NOTE: if you do not use V-sync, no matter what your FPS are, you get screen tearing."
I think the quoted sentence about the monitor shifting to a new frame before it has displayed the whole image is just a convoluted way of saying that the displayed frame is part of one frame and part of another, because the new frame didn't fully overwrite the buffer and the old frame's data.
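The arithmetic in the quoted post checks out, and it's simple enough to verify. With V-Sync off, a 60 Hz panel still draws 60 full screens per second, so at 70 FPS each rendered frame occupies about 60/70 of a screen on average, and at 120 FPS about half a screen:

```python
# Checking the quoted numbers: average portion of each rendered frame
# that actually reaches the screen when FPS exceeds the refresh rate.

def avg_fraction_shown(fps, refresh_hz=60):
    """Average portion of each rendered frame that gets displayed."""
    return min(1.0, refresh_hz / fps)

print(avg_fraction_shown(70))    # ~0.857, i.e. 60/70 of each frame
print(avg_fraction_shown(120))   # 0.5, the "120 half frames" from the quote
```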