Is it time for Nvidia to support Freesync?

Discussion in 'Video Cards' started by H-street, Sep 24, 2017.

  1. dr.stevil

    dr.stevil [H]ardForum Junkie

    Messages:
    9,094
    Joined:
    Sep 26, 2008
    I was under the impression that you needed GFE to auto-update the drivers?

    Either way, the GFE shit is annoying. Perhaps I'll do a fresh install and get rid of that crap on my HTPC if I can still get automatic updates.
     
  2. noko

    noko [H]ardness Supreme

    Messages:
    4,130
    Joined:
    Apr 14, 2010
    Except that for Shadowplay one must use GeForce Experience and all the crap, spam, tracking, etc. that goes along with it. With RTG, ReLive is supported transparently in the drivers. Game profiles in the Nvidia drivers take forever to update every time they're opened, and they're limited too, whereas with RTG I can also adjust clock speeds, voltages, etc. Interface-wise, Nvidia's is clunky and somewhat ugly, amateurish from my point of view. I am beginning to hate GeForce Experience as much as I hated Raptr (which is being discontinued and its site shut down). I want one function, Shadowplay, and with Nvidia it's tied to a whole bunch of crap.

    Yes, for automatic updates you need the GFE crap loaded, with its spamming, advertising, etc.
     
    dr.stevil and razor1 like this.
  3. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,216
    Joined:
    Jul 14, 2005

    You won't get automatic updates without it, but you can set up your Nvidia account to email you about driver updates.
     
  4. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,357
    Joined:
    Jun 13, 2003
    Count me as one who doesn't mind GFE. I'm just not that picky about individual game settings in most games, and GFE generally picks good settings and makes it easy to bump them in a different direction if needed.
     
  5. reaper12

    reaper12 2[H]4U

    Messages:
    2,186
    Joined:
    Oct 21, 2006

    Sorry man, my timeline is perfect. Here it is.

    September 2013 - AMD announces new cards - R9 290, R9 290X, R7 260X and R7 260.
    Early October 2013 - R7 260X released.
    Mid October 2013 - Nvidia's first Gsync demo. R9 290 cards released.
    November 2013 - AMD sends final proposal to VESA.
    January 2014 - CES - AMD does Freesync demo on a laptop, Nvidia shows off Gsync monitors, DIY Gsync kit goes on sale.
    June 2014 - Computex - AMD demos Freesync on monitors.
    August 2014 - First Gsync monitor available for sale: the Asus ROG Swift.
    January 2015 - CES - 7 monitors with Freesync at the show.
    March 2015 - BenQ releases the first Freesync monitor. It was released early; the launch was not supposed to be until April with the other Freesync monitors.
    Late May 2015 - Ghosting issues fixed. (To me this is the actual proper release date of a working product.)
    November 2015 - AMD releases the LFC driver.

    Freesync for AMD wasn't a reactionary product. They were definitely working on it before the Gsync demo. They were probably caught well off guard by Nvidia's Gsync demo, though. Maybe they didn't think Nvidia was working on a VRR tech, or maybe they thought they had more time.

    Do you really think that AMD somehow managed to put together a complete VRR solution between October 2013 and November 2013? Not only come up with a solution, but also come up with a proposal to put to VESA with all the technical ins and outs? And remember, with VESA there are a couple of stages to go through before the final proposal. Is that possible for a company that moves as slowly as AMD?

    But let's say that they managed that. How did they manage to go back in time and make the cards already released compatible with the DisplayPort 1.2a Adaptive-Sync standard (an optional standard that wasn't yet available and wouldn't be until May 2014)?

    I know why it's called Free: no royalties have to be paid to AMD or VESA for monitor manufacturers to use it. Address that part of your post to those people (idiots) who keep rabbiting on about Freesync not really being free since you still have to buy the monitor.



    Well, actually, I am sort of cheating throughout this discussion, as I know I am right on this :) AMD did a question-and-answer session on the Overclockers UK forums shortly after the release of Freesync, and this was one of the questions asked. The guy from AMD said that they were working on Freesync throughout the development of their Hawaii cards. Of course he could be lying, but I'm not sure why he would; Nvidia still beat them to market. Who came up with the idea first? Who cares, really? It's quite possible that both companies began working on a VRR solution independently. That's the point I have been trying to get across here. There are just too many signs pointing to AMD working on a VRR solution before the Gsync demo.

    At the end of the day first to market is usually the most important thing.
     
    {NG}Fidel likes this.
  6. razor1

    razor1 [H]ardForum Junkie

    Messages:
    10,216
    Joined:
    Jul 14, 2005
    And why was nV able to get it out a full year before AMD if AMD was working on it before? Is AMD not as capable as nV at getting things done? You think they were waiting on the specs to be ratified? They didn't need to worry about that, and that's why the FreeSync monitors are all over the place. They could have started talking to the monitor manufacturers well before the ratification process even began. That is the way I would do it if I were running AMD; I have done it on numerous projects: just get it done first, then push the committees to bend.

    On a laptop, yeah. How long do you think drivers take to get up and going? This isn't as complex as a GPU; for a GPU it takes about 8 months to a year for final release drivers, not more than that.

    You think the GPU needs modification for this tech to work? Err, no it doesn't; it's just the DisplayPort spec they had used. It's all about the DisplayPort tech and version: as long as it has 1.2a it can do it. That is why the 280X and a few others can't do it. So yeah, they were working on it right around the same time nVidia released their GPU. 1.2a was ratified when, sometime in 2012? 1.3 was in 2014, so it had to be done well before then; 1.2 was 2009.

    It's not free; it's got extra cost. Not much, but it's still there. And that is what everyone else has been saying.

    They were working on the specifications. The GPU itself has nothing to do with it... it just needs DP 1.2a, that's it.

    Doesn't matter in the end, as you said: nV came out with it first and with an overall better solution with better specifications. AMD needs to do that with FreeSync 2. If they can do that, it will help them gain market share; if they can't, their cards aren't going to do anything by themselves right now.
     
    Armenius, IdiotInCharge and Shintai like this.
  7. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,335
    Joined:
    Sep 7, 2017
  8. chenw

    chenw 2[H]4U

    Messages:
    3,987
    Joined:
    Oct 26, 2014
    Fun part is, they are even giving us an option to manually enable VRR on monitors that failed their validation.
     
  9. KazeoHin

    KazeoHin [H]ardness Supreme

    Messages:
    7,794
    Joined:
    Sep 7, 2011
    Guaranteed? No, and I'll bet money on it. 50 bucks says that if a monitor passes Nvidia's test to get the "G-Sync Compatible" branding, it won't be allowed to have the AMD Freesync branding. Essentially, Nvidia is going to try to get monitor manufacturers to drop any Freesync branding because it was actually mildly successful for AMD.
     
    Maddness likes this.
  10. NKD

    NKD [H]ardness Supreme

    Messages:
    7,415
    Joined:
    Aug 26, 2007
    I honestly think they only validated some monitors because those brands seem closely aligned with Gsync. Anyone else notice that? Those seem to be the primary Gsync brands. Maybe Nvidia just pulled some kind of internal deal to push more of those monitors and make the top Gsync panel makers happy. On top of that, they just left the force option open. I really don't believe this claim that they actually tested 400 monitors lol and only 12 passed. Seems like another marketing stunt to push certain monitors.
     
    euskalzabe and Maddness like this.
  11. chenw

    chenw 2[H]4U

    Messages:
    3,987
    Joined:
    Oct 26, 2014
    I wouldn't say BenQ is a major G-Sync brand though.

    I actually noticed the number of TN monitors more than the manufacturers.
     
  12. Maddness

    Maddness [H]ard|Gawd

    Messages:
    1,159
    Joined:
    Oct 24, 2014
    It's good news for gamers with those screens and an Nvidia card, of whom I'm sure there are quite a few.
     
  13. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,335
    Joined:
    Sep 7, 2017
    You can manually activate it on all VRR monitors now, not just certified ones.
     
  14. NKD

    NKD [H]ardness Supreme

    Messages:
    7,415
    Joined:
    Aug 26, 2007
    I know that. I was just talking about their tested monitors specifically. It seems like they left the Samsung 32-inch HDR monitor out, likely because Samsung doesn't make any Gsync monitors.
     
  15. Sancus

    Sancus Gawd

    Messages:
    787
    Joined:
    Jun 1, 2013
    Why's that hard to believe? We know most Freesync monitors are straight trash already. They mostly don't support variable overdrive, for example. Most of the supported monitors are TN; it's entirely possible that without variable overdrive, Nvidia considers the response time out of spec for many panels at certain refresh rates. How many don't support low framerate compensation, or only support it over a poor range?

    There's lots of legitimate reasons to reject Freesync monitors as not good enough. Because most aren't.
     
    IdiotInCharge likes this.
  16. Tup3x

    Tup3x [H]ard|Gawd

    Messages:
    1,811
    Joined:
    Jun 8, 2011
    Most Adaptive Sync screens are trash. I wouldn't be surprised if only those few offer a comparable experience.
     
  17. Pieter3dnow

    Pieter3dnow [H]ardness Supreme

    Messages:
    6,362
    Joined:
    Jul 29, 2009
    You mean we will be getting more Nvidia propaganda about which brands are cool and which ones suck? Next thing you know, people will be paying extra because Nvidia-approved monitors are so cool ;)
    There is choice in that market, and nothing that will burn a hole in your wallet; the better stuff simply costs more. This is not a Freesync-only thing; it was the case before Freesync ever came to market.

    Freesync is not something that defines how much money you have to spend, unlike Gsync, where you need to spend money just to get anywhere. And guess what, there are terrible Gsync monitors too (blew your mind right there, didn't I?).

    But at least Nvidia's branding is subjective, maybe even good enough for them to raise prices on approved monitors. That is the thing we have all been waiting for....
     
  18. Derangel

    Derangel [H]ard as it Gets

    Messages:
    16,837
    Joined:
    Jan 31, 2008
    Saying most freesync monitors are trash or substandard isn't exactly a hot take. Most of them have serious limitations in one way or another, especially in terms of their supported range. What the validated monitors have in common isn't the brands, it's the quality. They are all high-quality, highly rated, popular freesync monitors with feature sets that match up to their G-Sync counterparts. Most likely, the only monitors that pass are going to be the best of the best freesync panels that can stack up with G-Sync panels.

    I doubt they spent hours and hours on each of the 400 monitors, but I wouldn't be surprised if they looked at them all in some way. A lot could probably be tossed out right away just based on specs, while many more would fail super quick due to simply being garbage despite decent-sounding specs.
     
  19. limitedaccess

    limitedaccess [H]ardness Supreme

    Messages:
    7,484
    Joined:
    May 10, 2010
    I'm going to speculate that the "400 monitors tested" number is inflated and out of context. I'm assuming a minimum validation requirement is likely going to be enough of a range to properly support frame doubling (or LFC in AMD terms). A huge chunk of existing VRR displays would not qualify for that (e.g. 48-60/75, 90-120/144, etc.).

    If you use AMD's Freesync display page as a source, that immediately filters out, I think, 350+ displays. All of those were likely "tested" in the bare-minimum sense and failed.
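
    As a rough illustration of what that first-pass filter might look like (a Python sketch of my own; the monitor names and ranges here are made up, and the exact ratio required - 2x vs. 2.5x - gets debated further down the thread):

    Code:
    # First-pass filter: keep only displays whose VRR window is wide enough
    # for frame doubling, i.e. max refresh >= ratio * min refresh.
    def wide_enough_for_lfc(min_hz, max_hz, ratio=2.0):
        return max_hz >= ratio * min_hz

    displays = [
        ("Hypothetical 48-75 Hz office panel",  48,  75),
        ("Hypothetical 90-144 Hz gaming panel", 90, 144),
        ("Hypothetical 48-144 Hz gaming panel", 48, 144),
    ]

    survivors = [name for name, lo, hi in displays if wide_enough_for_lfc(lo, hi)]
    print(survivors)  # only the 48-144 Hz panel survives the first pass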

    After that, I would guess the testing becomes more involved and likely more time/resource consuming depending on what they are looking for, as specific edge-case visual artifacts would be trickier to find than a simple range verification.

    Another aspect of this to consider is the branding battle for VRR and how that will play out. Technically speaking, Nvidia is not supporting Freesync as the thread title suggests; only AMD supports Freesync, and that looks set to stay the case going forward.

    Nvidia will likely want to push the association of G-Sync with VRR, or at least push the association toward neutral terms like Adaptive Sync and Variable Refresh Rate, whereas the AMD side will want to continue the existing association with Freesync. This will have implications for how displays market themselves and whether they get branding.
     
    Last edited: Jan 7, 2019
    Tup3x likes this.
  20. Verado

    Verado Limp Gawd

    Messages:
    142
    Joined:
    May 16, 2017
    I feel like I won in life when I opted to not pay the g-sync tax on black friday. Yay!
     
    euskalzabe, {NG}Fidel and Nightfire like this.
  21. MaZa

    MaZa 2[H]4U

    Messages:
    2,678
    Joined:
    Sep 21, 2008
    I think we all know why this happened. Freesync-style adaptive sync (VRR) is now part of the HDMI 2.1 standard and more and more televisions are adding VRR support. Nvidia does not really have a choice but to support it at some point if they want to keep adding the latest HDMI ports to their cards. Nvidia is just trying to spin things in their favor to make it look like they came up with the idea. Thank you, console and TV manufacturers, for this positive development.
     
  22. nightanole

    nightanole [H]ard|Gawd

    Messages:
    1,996
    Joined:
    Feb 16, 2003
    I'm going with "you need to freesync from zero to full speed," and most "non-gaming" monitors only freesync from 40Hz to 60Hz. I honestly can't think of any freesync monitor that costs less than a gsync one and can run at 20-30Hz so you don't tear in the dips.
     
  23. DanNeely

    DanNeely 2[H]4U

    Messages:
    3,446
    Joined:
    Aug 26, 2005
    Has anyone pulled together a speclist for the dozen passing monitors to attempt to determine what level of performance NVidia is requiring to pass its validation testing?
     
  24. Aluminum

    Aluminum Gawd

    Messages:
    571
    Joined:
    Sep 18, 2015
    Right now it looks like they are literally going down a list of monitors alphabetically, and just barely started.

    Acer acer acer acer acer acer acer benq

    LOL

    No way they tested "400" recent monitors, because good ones like the Nixeus are shoo-ins for sure.
     
    euskalzabe, {NG}Fidel and THUMPer like this.
  25. nightanole

    nightanole [H]ard|Gawd

    Messages:
    1,996
    Joined:
    Feb 16, 2003
    The Acer XFA240 FreeSync range is 48-144Hz/FPS (Frames Per Second) over DisplayPort and 48-120Hz over HDMI.

    LFC (Low Framerate Compensation) is also supported which means that even when your FPS rate drops below 48 and FreeSync stops working, the display’s refresh rate will double or triple the frame rate for a smoother performance.


    That is the key to being Nvidia certified: how it handles LFC. Most have no LFC. To be Nvidia certified for Gsync, the monitor must "do something" from 1 fps up to whatever the max frame rate is. Most freesync monitors do not have the chips for converting very low fps back up to the minimum native refresh the panel can handle.

    https://www.amd.com/en/products/freesync-monitors
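
    For what it's worth, the doubling/tripling idea itself is simple to sketch. This is just the concept in rough Python of my own (not AMD's or Nvidia's actual logic), using the XFA240's published 48-144Hz DisplayPort range:

    Code:
    # Pick the smallest whole-number multiple of the frame rate that lands
    # inside the panel's VRR window, so the panel never has to refresh
    # below its minimum.
    def lfc_refresh(fps, min_hz=48, max_hz=144):
        assert max_hz >= 2 * min_hz, "VRR range too narrow for LFC"
        if fps >= min_hz:
            return fps              # already inside the VRR window
        multiple = 2
        while fps * multiple < min_hz:
            multiple += 1
        return fps * multiple

    print(lfc_refresh(30))  # 60 -> each frame shown twice
    print(lfc_refresh(20))  # 60 -> each frame shown three times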
     
    Last edited: Jan 7, 2019
    sharknice likes this.
  26. DanNeely

    DanNeely 2[H]4U

    Messages:
    3,446
    Joined:
    Aug 26, 2005
    That also goes a long way toward showing why Nvidia had to do this. The XFA240 is only $200 on Amazon and offers a reasonable 24" TN 1080p/144Hz entry point to high-refresh/variable-refresh gaming. The cheapest similar Gsync monitor I could find, the Acer Predator XB241H, is nearly twice that. (I'm not 100% sure it is the cheapest, because Amazon's and Google's search results were heavily polluted with non-Gsync monitors.)
     
  27. limitedaccess

    limitedaccess [H]ardness Supreme

    Messages:
    7,484
    Joined:
    May 10, 2010
    https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

    You can see them listed at the bottom. A minimum spec requirement is likely going to be a >2.5x max-to-min refresh ratio in order to support frame multiplying. This already eliminates the majority of current non-G-Sync VRR displays, as they have ranges like 48-60Hz and so on.

    I assume those eliminations make up the bulk of the 400+ tested. It's doubtful they've actually done extensive testing on that many monitors by now.

    Their keynote also mentioned some visual artifacts with respect to VRR, which would require more extensive testing (e.g. flickering edge cases).
     
  28. noko

    noko [H]ardness Supreme

    Messages:
    4,130
    Joined:
    Apr 14, 2010
    So is Nvidia more limited by what is on their cards compared to AMD? I have a 144Hz FreeSync 2 Samsung HDR monitor and the Vegas run it perfectly - the smoothest gameplay I've ever experienced. So my 1080 Tis may have issues because the hardware is not as flexible or capable as AMD's?
     
  29. defaultluser

    defaultluser I B Smart

    Messages:
    12,118
    Joined:
    Jan 14, 2006

    It's the exact same hardware. Nvidia already calls the DP Adaptive Sync it uses on laptops GSYNC.

    The desktop version only differs because it uses a local framebuffer to guarantee no hitching (something Freesync ditches).

    Since it's the same hardware, it will be subject to the same driver bugs you have with *Sync displays.
     
    chithanh and Maddness like this.
  30. noko

    noko [H]ardness Supreme

    Messages:
    4,130
    Joined:
    Apr 14, 2010
    So it should be rather comparable in the end - that would be great news! Very much looking forward to this.
     
    Maddness likes this.
  31. nightanole

    nightanole [H]ard|Gawd

    Messages:
    1,996
    Joined:
    Feb 16, 2003

    No. AMD sucks at protecting "freesync," which is an AMD certification/marketing term that includes VRR. "Freesync" has a list of qualifications just like gsync, but AMD let everyone use "freesync" on the label regardless of performance.

    Your Samsung has all the real requirements: minimum lag rating, low framerate compensation, etc.

    Looking back, all the low-end "freesync" monitors should have just been labeled VRR or Adaptive Sync compatible, not "freesync certified."

    There is a huge difference between a 48-75Hz (no LFC chip), 100ms-lag "freesync" TV/monitor and your $$$ monitor that passed all of AMD's requirements.

    I'm pretty sure "most" name-brand freesync monitors that can do 144Hz will work just fine with Nvidia's new drivers.


    The next fun part is getting the monitor to work with the Nvidia card. I had a fun time getting my ViewSonic to recognize my AMD APU as freesync enabled, even though it was enabled on both the PC and in the monitor menu.
     
    IdiotInCharge and noko like this.
  32. reaper12

    reaper12 2[H]4U

    Messages:
    2,186
    Joined:
    Oct 21, 2006
    There are no Freesync monitors. There are Adaptive Sync monitors that support AMD's Freesync; it's just that people keep calling them Freesync monitors.

    There is no LFC chip either. LFC is done on the GPU, not the monitor. The requirement for LFC is that the maximum refresh rate is greater than or equal to 2 times the minimum refresh rate.
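
    To put my own numbers on why that 2x ratio matters: on a 48-75Hz panel, a game running at 40 fps has nowhere to go, since 40Hz is below the window and doubling it to 80Hz lands above the window. On a 48-144Hz panel, the same 40 fps simply gets shown twice at 80Hz and stays inside the range.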

    What are you saying? There are full-range Freesync monitors that are much cheaper than any Gsync monitor.
     
    euskalzabe likes this.
  33. nightanole

    nightanole [H]ard|Gawd

    Messages:
    1,996
    Joined:
    Feb 16, 2003

    You are correct:
    https://www.amd.com/Documents/freesync-lfc.pdf

    So it's enabled on the GPU side once the GPU knows the monitor's max refresh is at least 2.5x its min refresh.

    That makes me wonder if the same is true for Nvidia. Does the Gsync FPGA handle the LFC, or does it just report back to the GPU?

    On that note, what will happen on a freesync monitor when the fps drops below 48, using an Nvidia GPU? Will the new drivers use the GPU to do LFC like AMD does?

    As for my second comment, there is a lot of overlap in prices between gsync and freesync with 120-144Hz panels. Yes, you can get god-like IPS gsync monitors, but they are of much higher quality. Likewise, you can get 240Hz gsync monitors, for which you cannot get a freesync equivalent.

    But for $400-500 you can get yourself a nice 27-32" freesync/gsync 1080p/1440p 120/144Hz unit. I'm not comparing a $150 60Hz freesync to a $400 120Hz gsync.
     
    Last edited: Jan 7, 2019
  34. reaper12

    reaper12 2[H]4U

    Messages:
    2,186
    Joined:
    Oct 21, 2006

    In a Gsync monitor the FPGA handles everything; it's the frame buffer, timing controller, and scaler all in one.

    With Nvidia now supporting Adaptive Sync, they will have something similar to LFC on the GPU side to handle the low framerates.

    As for your last few statements, they don't match the statement in the post I quoted. You were trying to say that no cheap Adaptive Sync monitor could handle low frame rates like Gsync can, and that you would need an Adaptive Sync monitor at least as expensive as a Gsync monitor to do it. But that's not the case.
     
  35. nightanole

    nightanole [H]ard|Gawd

    Messages:
    1,996
    Joined:
    Feb 16, 2003

    Am I Newegging wrong?

    27" 1080p/1440p
    144Hz

    Looks like all the name-brand stuff is around $500 for gsync/freesync that can do LFC. I cannot find a name-brand 27"-32" freesync for $300-400 unless I really go off-brand or go for a 60Hz panel. Can you really find an Asus/Acer/BenQ? At best I can only find the really new ViewSonic units.
     
  36. {NG}Fidel

    {NG}Fidel [H]ardness Supreme

    Messages:
    5,817
    Joined:
    Jan 17, 2005
    Wow, looks like my monitor will get some Freesync/Gsync love. It doesn't have LFC and its range is very limited (40-60), but my system never drops below 40 at the settings I use anyway, so it will be a nice free upgrade.
    I had Gsync before and wasn't a huge fan or hater of it; I just didn't like the price I had to pay for the feature at the resolution I was buying at the time. I switched to a Samsung Freesync display because its other specs were what I wanted and could afford.
    Nvidia finally realizing this is the smart move is rather surprising.
    Maybe losing all that money shook the greed a bit.
     
    Maddness likes this.
  37. reaper12

    reaper12 2[H]4U

    Messages:
    2,186
    Joined:
    Oct 21, 2006
    You have made several wrong statements in this thread. There are plenty of full-range Freesync monitors from Samsung, Acer, Asus, AOC, and MSI that run at 144Hz in various screen sizes, including 27 inch. There are also 240Hz Freesync monitors. A 27-inch 240Hz monitor from Acer costs about the same as the cheapest Gsync monitor.
     
  38. nightanole

    nightanole [H]ard|Gawd

    Messages:
    1,996
    Joined:
    Feb 16, 2003
    Maybe I was misunderstood...

    You repeated everything I was agreeing with. I believe my original point was that you are not going to save much money trying to find a freesync monitor that supports LFC and has the same quality panel as a gsync one. I guess I was trying to make an apples-to-apples argument and failed.


    On a lighter note, I'm running a ViewSonic VX3258 via the iGPU trick with a Ryzen 2200G and a GTX 1070. It will be nice at the end of the month to not have to manually select each game exe and set it to "performance" in Win10 in order to run the GTX and have freesync at the same time.
     
  39. Sancus

    Sancus Gawd

    Messages:
    787
    Joined:
    Jun 1, 2013


    This is an excellent summary of what "meets standard" means. Basic Freesync with no LFC and no variable overdrive is largely worthless IMO and should not even be considered a feature.
     
    DanNeely and Maddness like this.