G-SYNC Do-It-Yourself Kit Now Available

I don't know why people are having sticker shock.

If you look at the DIY kit, it consists of a custom PCB designed only for the ASUS VG248QE plus an FPGA (Altera Arria V GX), which is more expensive than a purpose-built ASIC would be.

So you are paying a premium for the ability to mod a monitor now, as opposed to waiting until Q2 or Q3.

I would imagine that once mass-produced screens with a custom ASIC hit the market, the overall surcharge to add G-Sync will be reduced.

As with any technology, the price will decrease over time.
 
Didn't even know what 'gsync' was so I read up on it.
Great tech, but there's no doubt that for this to work out in a big way it has to become an industry standard, with AMD, Intel, and Nvidia all in on it. Otherwise another good technology will die.
Maybe Nvidia can settle for a few cents per device and open this thing up to everybody?
I know I know just talking crazy here.
 
Personally I'd rather have G-Sync as the standard than FreeSync.

With G-Sync the GPU drives the VBI directly; with FreeSync the driver has to speculate about the VBI. Similar approaches, but one is superior to the other. FreeSync only becomes more accurate as you render more frames ahead and buffer them, which sort of makes it a moot point.
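
Roughly, in toy form (this is purely my own illustration with made-up numbers and a naive prediction rule, not real driver behavior on either side), the difference looks like this:

```python
import random

# Toy comparison: a G-Sync-style scheme refreshes exactly when each frame
# finishes (the GPU holds the VBI until the frame is done), while a
# speculative scheme has to guess the next frame time in advance.
random.seed(0)
frame_times = [random.uniform(14.0, 40.0) for _ in range(10)]  # ms per frame

gsync_mismatch = 0.0
speculative_mismatch = 0.0
prediction = 16.7  # start by assuming ~60 fps

for t in frame_times:
    gsync_interval = t                               # VBI held until frame is ready
    gsync_mismatch += abs(t - gsync_interval)        # always zero
    speculative_mismatch += abs(t - prediction)      # guess error -> judder/lag
    prediction = 0.5 * prediction + 0.5 * t          # crude running estimate

print(f"G-Sync-style mismatch:  {gsync_mismatch:.1f} ms total")
print(f"Speculative mismatch:   {speculative_mismatch:.1f} ms total")
```

Buffering more frames ahead would shrink that guess error, but only by adding the kind of latency you were trying to avoid in the first place.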
 
I haven't noticed any lag. Is that the same kind of lag I'm getting with my wireless mouse, i.e. an awful no-no for "hardcore gamers" but in reality pretty much imperceptible?

My wife also can't tell the difference... most of the time.
 
[image: justin-timberlake-reportedly-reuniting-with-nsync-at-2013-vmas.jpg]
 
I hope that Nvidia gives a discount voucher for this (à la the Shield vouchers) with the 800 series of cards.

Seems like it would be an effective way to try to scoop people up.
 
Yeah, a buddy of mine did the mod, and the gameplay that comes out of it is smooth, all without the lag of things like triple buffering. I'm picking up one of those monitors since I have a compatible GPU for it, but probably not until a couple of months down the road, when the price falls and it comes integrated into the monitors.

I've seen some supposed examples of G-Sync here and there, and two things bother me about the ones I've seen, so far:

*The purported non-Gsync display examples are *much* rougher and jerkier than I see at home with my non-G-sync display while running the same games.

*The purported Gsync examples I've seen look very, very close to what I'm already seeing without it, running the same games.

I'm immediately suspicious of any product which has to make itself appear better by exaggerating the flaws of other products. Still waiting to be sold on this. (Disclaimer: I don't own an Nvidia GPU, so in all honesty things without G-Sync on Nvidia GPUs may actually be as bad as these examples make them look. I would not think so, but I couldn't say for sure.)
 
I can understand the price. Besides being a low-volume item, this completely replaces the monitor's PCB and adds a new external power supply in place of the one that used to be internal to the monitor. It also removes the DVI input.
 
As someone who put up his own funds with no preconceptions or bias regarding this technology, I can say the results are clear-cut. Take any poorly optimized game, and I mean across both Nvidia and AMD, and G-Sync truly shines above and beyond.

If it were as simple as Nvidia using its own worst-case examples to make G-Sync seem better than it is, then I would pack up this kit and return it without question.

My go-to gaming monitor was a 1440p Korean PLS that overclocks to 120 Hz easily (I've actually gotten it beyond 130 Hz, but I need a beefier GPU setup), so there is no reason for me to personally gush about G-Sync when I'm having to use it on an inferior monitor. But V-Sync problems have always bothered me. I hate screen tearing; it's a major distraction from the gaming experience and forces you back out of the immersion, right when you're trying to make your experience as involving and fun as you want it.

Now that I've had a taste of what will become commonplace for gamers, PC gamers to be specific, during the next few years, I refuse to go back.

All of the doubt you may be feeling honestly should be cast away. This is the real deal. It's not another empty promise from an overhyped, advertising-saturated industry; it's a fix for something that has plagued gaming across all platforms in one way or another for decades, and finally someone is dealing with it.

I wouldn't recommend buying the kit and monitor just for the early-adopter G-Sync experience, but I will absolutely tell anyone who wants to hear it that this isn't some kind of fake hype that will never be fully realized. It's happening right now, and it's going to be a must-have feature for those who are bothered by screen tearing.
 
People get so used to screen tearing that they probably don't notice it when they play normally, and so they think the demos are just fake setups. Some games, like BF4, Dragon Age: Origins, and Skyrim, give you shit tons of tearing if you set V-Sync off, which is why I played Skyrim and DA:O with V-Sync on. But in fast-paced multiplayer games like BF4 I leave it off, not because there is no tearing or because I'm fine with it, just because the lag is a difference-maker.
 
$200 to basically enable DP1.3 before it's finalized...

Just so everyone knows, every monitor that has DisplayPort 1.3 will support G-Sync and FreeSync (AMD), and the spec is being finalized in Q2. Right now we only have DP 1.2. What 1.3 will bring us is variable refresh, just like in eDP. Seriously, guys, save your money for an IPS monitor with DP 1.3.
 
Oh, it's real. TVs are always worse in terms of input latency, and while it varies from TV to TV, the best I've ever seen is around 8 ms, and I happen to own that TV (a 40-inch Samsung A650). But the vast majority of recent TV releases have heavy post-processing, and the lowest I've seen on recent models is around 25-30 ms, which is right at the limit of what most of us can perceive in terms of input latency.

Then again, we all process information at different speeds, so while I'm sensitive to input latency, you may not be.


Eek, seems like you would be much better served spending an additional $200 on a graphics card, if a mere single extra frame of delay is distracting for you.

Screen tearing and stuttering are things that can happen every several frames, but the time it takes to draw a frame happens every single frame!

----
Personally, I am more concerned about motion being represented properly on screen than I am about occasional screen tearing. I can't comprehend how G-Sync can solve actual microstuttering (i.e., frames not being delivered at an even pace), which to me is the main complaint. Say an object should ideally take 100 ms to travel across the screen, but the first 15 percent of the path takes 15 ms to render while the next 15 percent takes 45 ms; the monitor just displays that image a bit longer (see the sketch below). Sure, this eliminates tearing at higher framerates than V-Sync would allow, but I would just rather cut that damn 45 ms time down to 15 ms with additional graphics card power.
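
To put those toy numbers into a quick script (the frame times below are the hypothetical ones from the example above, nothing measured):

```python
# Hypothetical numbers from the example above: an object should cross the
# screen in 100 ms at constant speed, but individual frames take uneven
# times to render. Even on a variable-refresh display (no tearing, each
# image held exactly as long as the next frame needs), the per-frame jump
# in position tracks the render time, so the motion still looks uneven.
frame_times_ms = [15, 15, 45, 15, 10]   # made-up render times, sum = 100 ms
speed = 1.0 / 100.0                     # fraction of screen width per ms

t = 0.0
for ft in frame_times_ms:
    t += ft
    jump = speed * ft                   # how far the object moves this frame
    print(f"t={t:5.0f} ms  position={t * speed:.2f}  jump={jump:.2f} of screen")

# The 45 ms frame produces a jump three times larger than the 15 ms frames.
# That hitch is the part G-Sync alone can't remove; only a faster GPU can.
```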
 