Nvidia G-Sync - A module to alleviate screen tearing

RamonGTP

Supreme [H]ardness
Joined
Nov 9, 2005
Messages
8,150
I have a 144 Hz monitor... I've had Dell IPS surround also and I loved it. I'd rather have my single ASUS TN monitor. Once you've seen the difference in input lag and ultra-smoothness... I wouldn't give it up.

I think as gamers upgrade their hardware, if this tech is good, it will be very consequential.

And? That does nothing to rebut what I said. The number is extremely small, even with you included.
 

rudy

[H]F Junkie
Joined
Apr 4, 2004
Messages
8,704
Most people are not going to "upgrade" their monitors; they are going to buy new ones already compatible with G-Sync.
My guess is that at some point soon, every major brand that focuses a product line at gamers will include G-Sync in these high-refresh-rate and 3D monitors. And gamers will end up getting it, intentionally or not.
 

RamonGTP

Supreme [H]ardness
Joined
Nov 9, 2005
Messages
8,150
Most people are not going to "upgrade" their monitors; they are going to buy new ones already compatible with G-Sync.
My guess is that at some point soon, every major brand that focuses a product line at gamers will include G-Sync in these high-refresh-rate and 3D monitors. And gamers will end up getting it, intentionally or not.

That's still an upgrade bro. Since when is the term "upgrade" mutually exclusive to talking about the exact same unit? I have a cheap 24" TN. If I buy a 30" IPS, that's an upgrade.
 

ccityinstaller

Supreme [H]ardness
Joined
Feb 23, 2007
Messages
4,241
You can't be stating facts if you have no facts to state. Insinuating that AMD can launch legal action because Nvidia shipped a product and AMD hasn't is beyond ridiculous.

Way to take an expression and run with it :rolleyes:.. Did you happen to notice I typed that @ 4:24 in the morning? Excuse me for not being more clear.. I did NOT ever state AMD could or WOULD launch any sort of legal action. I stated that AMD had gone on the record back in Feb of 2012 in regards to this type of tech, and that IT COULD end up being messy if one side tried to patent it instead of offering it up as a standard, but you conveniently left that part out, didn't you?

In regards to post #197 .... well, you don't see that often.

Indeed... I'm too afraid to do that myself :p..
 

wonderfield

Supreme [H]ardness
Joined
Dec 11, 2011
Messages
7,396
I doubt it. The number of people who have 144 Hz displays is very small, the number of those who will upgrade to a G-Sync monitor is smaller still, and the number who will upgrade to G-Sync AND upgrade their GPUs is, quite literally, inconsequential.
I don't follow your point. The initial batch of G-Sync displays will be 144 Hz displays. Higher refresh rates mean a greater motivation for users to satisfy those refresh rates: it's a "taste for blood" effect. Thus, spending more money on GPUs to achieve that.

On what point do you disagree, exactly? I never claimed that any great number of users would do that, only that it would be a motivator.
 

rudy

[H]F Junkie
Joined
Apr 4, 2004
Messages
8,704
That's still an upgrade bro. Since when is the term "upgrade" mutually exclusive to talking about the exact same unit? I have a cheap 24" TN. If I buy a 30" IPS, that's an upgrade.

In the context of G-Sync and rare upgrades, the only rare case that fit was people who would rip apart their VG248QE monitors and install the chip; that is the upgrade I was discussing. If that is not the original context, then G-Sync upgrades will not be some small percentage. Lots of people who game are forking out the bucks for 120 Hz and faster monitors.
 

deasnutz

Limp Gawd
Joined
Feb 9, 2012
Messages
377
So the VG248QE was released quite a while ago and pre-dates Gsync. I'm not really understanding the connectivity.

My thinking is that any DP monitor with similar specs could be altered for DIY GSync. Does the Asus just have a removable cable?

Anyone ripped open a VG248QE?
 

RamonGTP

Supreme [H]ardness
Joined
Nov 9, 2005
Messages
8,150
I don't follow your point. The initial batch of G-Sync displays will be 144 Hz displays. Higher refresh rates mean a greater motivation for users to satisfy those refresh rates: it's a "taste for blood" effect. Thus, spending more money on GPUs to achieve that.

On what point do you disagree, exactly? I never claimed that any great number of users would do that, only that it would be a motivator.

I disagree that nVidia elected to limit G-Sync to 144 Hz displays to sell more GPUs.
 

Nenu

[H]ardened
Joined
Apr 28, 2007
Messages
20,075
So the VG248QE was released quite a while ago and pre-dates Gsync. I'm not really understanding the connectivity.

My thinking is that any DP monitor with similar specs could be altered for DIY GSync. Does the Asus just have a removable cable?

Anyone ripped open a VG248QE?

It will be a custom DisplayPort connection hooked into the monitor's internals.
The computer signals that a frame is ready; it is passed over DisplayPort into the G-Sync adapter and routed through to the monitor's syncing circuits.

If the connections to use it in the monitor are not exposed, or the monitor cannot decouple its normal refresh signaling, it won't work.
The monitor will have a port for the G-Sync module to plug into.
The DIY version will need a monitor with this port.
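To illustrate the idea in Nenu's description (the monitor refreshing when the GPU says a frame is ready, instead of on a fixed clock), here is a toy timing model in Python. This is purely illustrative; the function names and numbers are made up and this is not the actual G-Sync protocol:

```python
# Toy model of frame delivery. With fixed-refresh v-sync, a finished frame
# must wait for the next scheduled scanout tick; with variable refresh
# (G-Sync style), the monitor scans out as soon as the frame is ready,
# limited only by its maximum refresh rate.
import math

def vsync_display_times(frame_ready_ms, refresh_hz):
    """Each frame is shown at the next fixed scanout tick after it is ready."""
    interval = 1000.0 / refresh_hz
    return [math.ceil(t / interval) * interval for t in frame_ready_ms]

def gsync_display_times(frame_ready_ms, max_refresh_hz):
    """The monitor refreshes on demand, no sooner than the panel's minimum
    time between refreshes (1 / max refresh rate)."""
    min_interval = 1000.0 / max_refresh_hz
    shown = []
    last = -min_interval
    for t in frame_ready_ms:
        show = max(t, last + min_interval)
        shown.append(show)
        last = show
    return shown

frames = [5.0, 20.0, 45.0]               # times (ms) at which the GPU finishes frames
print(vsync_display_times(frames, 60))   # frames wait for the 16.7 ms ticks
print(gsync_display_times(frames, 144))  # frames shown as soon as they are ready
```

The gap between each frame's finish time and its display time is the waiting latency that on-demand refresh removes.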
 

Creig

Gawd
Joined
Sep 24, 2004
Messages
785
Well, if Nvidia follows through with trying to convince a couple of manufacturers to produce some monitors with these modules pre-installed, they better ensure that there's a pass-thru path for non-Nvidia cards to bypass the G-Sync circuitry without issues. Because you know full well that Nvidia will include an EDID checker to ensure that it works only with their cards, even if it turns out to be fully compatible with AMD cards as well.

A better option would be for VESA to adopt an open standard equivalent to G-Sync. Then people who want these features wouldn't be locked into using only Nvidia's video cards.
 

BatJoe

Gawd
Joined
Apr 4, 2012
Messages
836
Well, if Nvidia follows through with trying to convince a couple of manufacturers to produce some monitors with these modules pre-installed, they better ensure that there's a pass-thru path for non-Nvidia cards to bypass the G-Sync circuitry without issues. Because you know full well that Nvidia will include an EDID checker to ensure that it works only with their cards, even if it turns out to be fully compatible with AMD cards as well.

NVidia have basically said supporting other video cards is not part of their position right now.

http://www.youtube.com/watch?v=KhLYYYvFp9A
 

Creig

Gawd
Joined
Sep 24, 2004
Messages
785
NVidia have basically said supporting other video cards is not part of their position right now.
Unfortunately, that seems to be Nvidia's response every time they attempt something new.
 

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
52,950
NVIDIA just told me this as I asked some very specific questions. I was on vacation last week during its event and Brent had AMD stuff to be working on for the upcoming review.

Surround support will follow the initial launch.
Yes, works with SLI now.
No products are announced other than Asus, who did the press release. The G-Sync module does support multiple panel sizes and max refresh rates.

So currently no surround support with GSYNC.

SLI support YES.

But only looking at a 1080p GSYNC display for sale NEXT YEAR.

I would suggest to you that this technology is not needed for single screen 1080p for the most part.

Get me 1440p 27" displays working and it will be something to look at.
 

Tamlin_WSGF

2[H]4U
Joined
Aug 1, 2006
Messages
2,988
NVidia have basically said supporting other video cards is not part of their position right now.

http://www.youtube.com/watch?v=KhLYYYvFp9A

It's a shame (since it's a tech with such great potential), but understandable. They haven't even got it out for Nvidia cards yet. Even if they were planning broader support at a later stage, it would be too early to implement it now.

I'm betting that more vendors, both GPU and monitor, will look into getting such capabilities for their screens. We saw it with 3D Vision, and now there are vendors with 3D screens for other GPUs. Samsung is a big player here with 3D screens for AMD. I'm guessing they won't let this slide. It's a big thing after all.

If anyone hasn't seen the video above yet, they should. Perfect sync, without being dependent on framerate or the game itself, that's big! :)

But they need to work on the surround support (and get it into IPS screens).
 

BatJoe

Gawd
Joined
Apr 4, 2012
Messages
836
It's a shame (since it's a tech with such great potential), but understandable. They haven't even got it out for Nvidia cards yet. Even if they were planning broader support at a later stage, it would be too early to implement it now.

I'm betting that more vendors, both GPU and monitor, will look into getting such capabilities for their screens. We saw it with 3D Vision, and now there are vendors with 3D screens for other GPUs. Samsung is a big player here with 3D screens for AMD. I'm guessing they won't let this slide. It's a big thing after all.

If anyone hasn't seen the video above yet, they should. Perfect sync, without being dependent on framerate or the game itself, that's big! :)

Ideally I would like to see the tech used broadly, but I can understand where NVidia is coming from. They put the R&D into this so they want to capitalize on it. I can understand that in the beginning, but hopefully down the road it opens up.
 

Tamlin_WSGF

2[H]4U
Joined
Aug 1, 2006
Messages
2,988
Ideally I would like to see the tech used broadly, but I can understand where NVidia is coming from. They put the R&D into this so they want to capitalize on it. I can understand that in the beginning, but hopefully down the road it opens up.

It will probably open up later one way or another, either by Nvidia licensing it (like they do with GPUs now) or by others like Samsung delivering their own version, as we see with 3D monitors. But this is something I would like to have in all screens.

Nvidia did something that should have been done years ago. Kudos!
 

Terpfen

Supreme [H]ardness
Joined
Oct 29, 2004
Messages
6,079
I don't quite follow. The temporary halving of frame rate due to a frame missing the swap is as jarring at 1080p as it is at any other resolution. If the argument is that it's easier to drive 1080p and that you're more likely to remain locked at the display's refresh rate than you would at a higher resolution, that's certainly true, but a locked 120/144 Hz is still non-trivially difficult (and in some cases completely impossible) to achieve at 1080p. Even at 120 Hz, missing a swap is still an offensive thing. At 144 Hz, it's in the realm of not being too bad, but you'd still greatly prefer to run at 130 Hz for a frame or two than 72 Hz.

The technology's a win for everyone who can't lock to their desired refresh rate at all times, which is...well, everyone.

Agreed. I'm not really sure where that stance is coming from. And I believe Nvidia has said that this will work all the way up to 4K, where it will really be needed.

I'm giving Arkham City a quick run-through now to refresh my memory and be able to compare it with Arkham Origins on Friday… and there's tearing on my 1200p display. So I'm not sure how the conclusion that G-Sync is basically unnecessary below a certain resolution was reached.
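The "halving" effect in the quote above can be worked out with a little arithmetic: under v-sync, a frame that takes even slightly longer than one refresh interval occupies two intervals, so the displayed rate drops to half. A rough sketch (the function name is made up, for illustration only):

```python
# Effective displayed rate under v-sync: a frame that just misses the swap
# occupies two refresh intervals, halving the displayed frame rate.
import math

def effective_hz(render_ms, refresh_hz):
    interval = 1000.0 / refresh_hz                          # ms per refresh
    intervals_per_frame = max(1, math.ceil(render_ms / interval))
    return refresh_hz / intervals_per_frame

print(effective_hz(8.0, 120))   # fits in one 8.33 ms interval -> 120.0
print(effective_hz(9.0, 120))   # just misses the swap         -> 60.0
print(effective_hz(7.5, 144))   # misses at 144 Hz (6.94 ms)   -> 72.0
```

The last case matches the 144 Hz example above: a frame slightly over budget displays at 72 Hz, whereas a variable-refresh display would simply show it a fraction of a millisecond late, at roughly 130 Hz.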
 

Lucifercy

n00b
Joined
May 20, 2013
Messages
25
I stated that AMD had gone on the record back in Feb of 2012 in regards to this type of tech, and that IT COULD end up being messy if one side tried to patent it instead of offering it up as a standard, but you conveniently left that part out, didn't you?

from tdhq
http://www.thedigitalhq.com/2013/10/22/nvidia-g-sync-patent-united-states-patent-8120621/

NVIDIA G-SYNC Patent? United States Patent: 8120621
A method and system are implemented to measure quantitative changes in display frame content for dynamically controlling a display refresh rate. Specifically, one embodiment of the present invention sets forth a method, which includes the steps of composing a first display frame from a first set of rendered image surfaces, composing a second display frame from a second set of rendered image surfaces, and dividing the first display frame and the second display frame into a same number of frame regions. For each of the frame regions, the method also includes the steps of calculating a first set of numerical codes and a second set of numerical codes representative of the content associated with the frame region in the first and second display frame, respectively, and determining an amount of change in content between the first display frame and the second display frame based on the results of comparing the first set of numerical codes against the second set of numerical codes.

Inventors: Ogrinc; Michael A. (San Francisco, CA), Hannigan; Brett T. (Philadelphia, PA), Wyatt; David (San Jose, CA)
Assignee: NVIDIA Corporation (Santa Clara, CA)
Appl. No.: 11/957,374
Filed: December 14, 2007

Update: Some readers have e-mailed us and pointed out that the above patent may apply to display power savings rather than G-SYNC itself. Hopefully as NVIDIA releases more information, we'll learn more about its core technology and be able to focus in on what patents (if any) cover it.
 

chiablo

Limp Gawd
Joined
Feb 6, 2005
Messages
387
I'd love to have G-Sync built into the Oculus Rift. With the current dev kits, any kind of screen tearing is really noticeable and destroys the immersion. You're supposed to play with V-Sync off in order to reduce latency to the absolute minimum; hopefully there's no latency introduced by G-Sync.
 

Nenu

[H]ardened
Joined
Apr 28, 2007
Messages
20,075
They have shown an interest in using it.
Read up on G-Sync and you will understand how it reduces latency.
 

PcZac

Limp Gawd
Joined
Aug 30, 2013
Messages
326
I was planning on buying a 2560x1600 within the next few months, but now I will wait until there is a decent IPS version with G-Sync that isn't overpriced.
 

BroHamBone

[H]ard|Gawd
Joined
Apr 6, 2013
Messages
2,031
Well, shit. I guess I'm buying a screen now!

Edit: actually, deals will soon be out for the holidays... I can wait till then!
 

Terpfen

Supreme [H]ardness
Joined
Oct 29, 2004
Messages
6,079
Well, shit. I guess I'm buying a screen now!

Edit: actually, deals will soon be out for the holidays... I can wait till then!

You can't buy a display with G-Sync built in until Q1 2014. You can buy the VG248QE now and the G-Sync kit late this year, but there's no word on exactly what kind of modification it entails.
 

spacin9

Limp Gawd
Joined
Jan 11, 2012
Messages
253
Cool... I have that exact Asus monitor. Time to get my hammer and wrench ready for the mod kit!

To be honest... I'm running pretty damn smooth right now. I'm not sure what the benefits of G-sync are going to be... if it would be worth it.
 

Terpfen

Supreme [H]ardness
Joined
Oct 29, 2004
Messages
6,079
To be honest... I'm running pretty damn smooth right now. I'm not sure what the benefits of G-sync are going to be... if it would be worth it.

The benefits have been explained in pretty good detail on multiple sites now. The ability to have the graphics card directly control the refresh rate of the display is huge, and totally separate from just increasing the refresh rate of the display.
 

spacin9

Limp Gawd
Joined
Jan 11, 2012
Messages
253
The benefits have been explained in pretty good detail on multiple sites now. The ability to have the graphics card directly control the refresh rate of the display is huge, and totally separate from just increasing the refresh rate of the display.

Oh awesome! I hope I won't need to solder the G-Sync thing in there. I'll have my oven ready in case I need to bake it in.
 

BroHamBone

[H]ard|Gawd
Joined
Apr 6, 2013
Messages
2,031
You can't buy a display with G-Sync built in until Q1 2014. You can buy the VG248QE now and the G-Sync kit late this year, but there's no word on exactly what kind of modification it entails.

Never said I was buying a screen w/ G-Sync right now.

I was responding to this:


"G-Sync could be exclusive to ASUS until late 2014. Fail."

Anyway, I'm referring to a BenQ, to be clear.
 