NVIDIA CES 2019 Press Event: Watch the Livestream Tonight at 8 PM Pacific

Discussion in 'HardForum Tech News' started by Megalith, Jan 6, 2019.

  1. JosiahBradley

    JosiahBradley [H]ard|Gawd

    Messages:
    1,720
    Joined:
    Mar 19, 2006
    ONE INTEGER difference. I have the MG279Q NIEN NIEN NIEN
     
  2. Nobu

    Nobu 2[H]4U

    Messages:
    3,276
    Joined:
    Jun 7, 2007
    I'm honestly not surprised. A lot of stuff gets hidden by the raster/render hacks modern games use, so when you switch from that to a different technique you have to go through everything over again to be sure it looks the way it's supposed to. And then, if you still want to support the old stuff (in this case just lighting, thankfully), you have to go back and make sure the changes you made didn't break that.
     
  3. Brent_Justice

    Brent_Justice Moderator

    Messages:
    17,755
    Joined:
    Apr 17, 2000
    Also, still nothing about DLSS or ray tracing in Rise of the Tomb Raider....

    Maybe next CES?

    i kid i kid
     
  4. Rvenger

    Rvenger [H]ard|Gawd

    Messages:
    1,788
    Joined:
    Sep 12, 2012

How close does it get to 11GB?
     
  5. Brent_Justice

    Brent_Justice Moderator

    Messages:
    17,755
    Joined:
    Apr 17, 2000
    You'll see when we publish the article :p

    I'm still writing it!

    /hint I see why the TITAN RTX exists now
     
    Rvenger likes this.
  6. illli

    illli [H]ard|Gawd

    Messages:
    1,244
    Joined:
    Oct 26, 2005
yeah, except last week you could buy an EVGA 1070 Ti for $359 with two free games thrown in.. so paying 1070 Ti price for ... 1070 Ti performance (also with less RAM)
     
    russnuck likes this.
  7. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    19,028
    Joined:
    Jan 28, 2014
    We were supposed to get HDR in Rise of the Tomb Raider, and that never happened. Starting to worry that the same thing will happen with RTX and Shadow of the Tomb Raider.
     
  8. polonyc2

    polonyc2 [H]ard as it Gets

    Messages:
    16,895
    Joined:
    Oct 25, 2004
they keep hyping up how the 2060 is more powerful than the 1070 Ti, but they're leaving out the fact that the 1070 Ti has 8GB VRAM vs 6GB on the 2060... and... "With Turing’s RT Cores and Tensor Cores, it can run Battlefield V with ray tracing at 60 frames per second," says NVIDIA's official press release... yeah, at 1080p
     
  9. Ski

    Ski Gawd

    Messages:
    974
    Joined:
    Jun 21, 2008
    Murphy's Law mother fucker!
     
  10. chenw

    chenw 2[H]4U

    Messages:
    3,977
    Joined:
    Oct 26, 2014
    https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/

Not sure if this was posted, but NVIDIA just posted a list of FreeSync monitors that will support VRR via drivers. The list is in one of the links in the blog, at the bottom of the table.

    Hell froze over, kinda

    EDIT: relevant part of the blog:

     
    Maddness and GhostCow like this.
  11. Derangel

    Derangel [H]ard as it Gets

    Messages:
    17,860
    Joined:
    Jan 31, 2008
    Here is a direct link to the table with the already certified monitors listed: https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

G-Sync Compatible models are posted all the way at the bottom. It'll be interesting to see what others get added and what the difference will be between G-Sync Compatible monitors and the uncertified ones.
     
    chenw likes this.
  12. Vercinaigh

    Vercinaigh Gawd

    Messages:
    849
    Joined:
    Jul 31, 2008
The problem with that, though, is that ray tracing has its own set of issues that rasterization does not. So really, all that's happening is trading one well-known set of hacks/tricks for a whole new set of hacks/tricks that probably -also- won't look quite right. So really, you're back where you started. Unless it stays hybrid forever, in which case the only thing anyone has succeeded in doing is compounding the complexity and consuming more dev time simply to make it "pretty".
     
  13. Usual_suspect

    Usual_suspect Limp Gawd

    Messages:
    156
    Joined:
    Apr 15, 2011
Them finally opening up to support adaptive sync just made me rethink my next GPU upgrade. Let’s see what AMD has to offer first; if their $250 1080 equivalent pans out I might just stick with team Red.
     
    Armenius likes this.
  14. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    21,565
    Joined:
    Sep 13, 2008
odds are it'll only be FreeSync 2 monitors, since the requirements are far stricter compared to standard FreeSync.

the 2070 can't even reliably run 60fps, there's no way in hell the 2060 is doing it with 6GB of RAM.. you'd have to be running low/medium settings on low DXR at 1080p.
     
  15. umeng2002

    umeng2002 Gawd

    Messages:
    923
    Joined:
    May 23, 2008
    Maybe it's time to upgrade my GTX 970...
     
    Armenius likes this.
  16. misterbobby

    misterbobby 2[H]4U

    Messages:
    3,814
    Joined:
    Mar 18, 2014
    If the rest of your sig is accurate then you will need to upgrade more than that if you get a new card...
     
    Armenius likes this.
  17. bwang

    bwang Gawd

    Messages:
    963
    Joined:
    Aug 6, 2011
One nice thing about the 2060 is that it lowers the barrier to entry for the Turing features, which I think will be important to get any sort of market adoption. If your workload fits in 6GB, it's also a cheaper way to get tensor and RT cores for machine learning or visualization tasks (and if your work scales out, it's cheaper per core than a 2070). It's not a particularly groundbreaking gaming card, but it's also not a step back - realistically, we're looking at 1070 Ti-ish performance for 1070 Ti prices, but with the added benefit of new features (doesn't hurt to have them) and a newer architecture (likely improved driver support down the line if you keep your card for a couple years).
     
    Armenius likes this.
  18. Brent_Justice

    Brent_Justice Moderator

    Messages:
    17,755
    Joined:
    Apr 17, 2000
    Not cheap enough, IMO. The sweet spot is still $200-$250 for the mainstream. Until we see video cards all the way down to that level capable of performing well with NVIDIA Ray Tracing, it will be continually hard to gain adoption, IN MY OPINION.

    This is why I think it won't happen until the next generation in 2020.
     
    N4CR, jeffj7, Armenius and 3 others like this.
  19. Dodge245

    Dodge245 Limp Gawd

    Messages:
    183
    Joined:
    Oct 8, 2018
I need to double check, but it looks like my AOC AGON 31.5 monitor is on the approved list: https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/ I'll have to check the model number, but I'm fairly sure that's it.

    also


You're worrying me; looking forward to the article though.

    Hopefully AMD announces something competitive soon to drive down prices.
     
  20. NWRMidnight

    NWRMidnight Limp Gawd

    Messages:
    350
    Joined:
    Oct 23, 2010
That list looks more like a sponsor's list, or a list of those who paid to have each monitor pass, which is most likely the "passing" factor, with no other differences. In fact, I looked up the BenQ XL2740 to see what the difference is between it and my FreeSync BenQ XL2730z... there are three differences. The XL2740 has a higher refresh rate (240Hz vs 144Hz), the XL2740 is a 1080p monitor vs my 1440p XL2730z, and the XL2740 does not have FreeSync, or any adaptive sync, listed in its specifications. So, it is interesting that a monitor that doesn't list adaptive sync or FreeSync in its specs at all can pass the test. It is possible it is unofficially supported, but it does make a person wonder what the "passing" factor is.
     
    Last edited: Jan 7, 2019
  21. Derangel

    Derangel [H]ard as it Gets

    Messages:
    17,860
    Joined:
    Jan 31, 2008
    According to AMD, the monitor supports freesync.
     
  22. NWRMidnight

    NWRMidnight Limp Gawd

    Messages:
    350
    Joined:
    Oct 23, 2010
I am going off the manufacturer's web site, which has NO mention of supporting FreeSync, not a GPU manufacturer's site that is trying to promote their technology to sell cards. (In fact, no outlet selling the monitor lists FreeSync or adaptive sync in its specifications.) It may be unofficially supported, though. Still makes a person wonder about the "passing" qualifications. It's kind of like how Comcast told me that my modem would fully support the 400Mbps speeds on their network... uh, it doesn't, per the manufacturer and actual usage after the upgrade; I had to buy a different modem to get the full speed.

    https://zowie.benq.com/en/product/monitor/xl/xl2740.html
     
    Last edited: Jan 7, 2019
  23. Derangel

    Derangel [H]ard as it Gets

    Messages:
    17,860
    Joined:
    Jan 31, 2008
I know what the site says. It's technically considered unofficial support. The monitor technically supports it, but it's disabled when using BenQ's blur reduction (which is on by default). It's the same with the XL2540, in fact.
     
    Armenius likes this.
  24. NWRMidnight

    NWRMidnight Limp Gawd

    Messages:
    350
    Joined:
    Oct 23, 2010
which is exactly how it works on my XL2730z: you can't run both motion blur reduction and FreeSync at the same time, it's one or the other, yet FreeSync is listed in the specifications. I just find it odd that a monitor that "unofficially" supports FreeSync is listed but one that fully supports it isn't, as I am pretty sure they both offer the same FreeSync experience using identical FreeSync technology, short of the refresh rate differences. As I said, the list is more like a sponsorship list.
     
    Last edited: Jan 7, 2019
  25. Derangel

    Derangel [H]ard as it Gets

    Messages:
    17,860
    Joined:
    Jan 31, 2008
    If it was a sponsored list then the BenQ wouldn't be on there as BenQ does not seem at all interested in acknowledging that monitor's adaptive sync abilities.
     
    Shadowed and Armenius like this.
  26. tybert7

    tybert7 2[H]4U

    Messages:
    2,634
    Joined:
    Aug 23, 2007
I feel strange; I'm mostly an AMD fanboy, but I enjoyed this presentation from NVIDIA. I thought they did a much more thorough presentation about the benefits of ray tracing, with demos. I liked that Chinese MMO with the reflections bouncing all around quite a bit. The effect where the light bounced off the water and showed fluctuating reflections on the stone arch looked great to me.

I still detect a strange visual on some of the reflections, almost as if they are fuzzier than ideal, but that is likely a function of this ray tracing not being as far along as it will eventually get. But that's fine.

    I take the point that performance might still not be ideal with more stuff going on, but a journey of a thousand miles begins with the first step. It is enough that they kickstarted the push for more ray tracing in games.

NVIDIA finally caving and supporting FreeSync is a huge win, since they are so dominant. That stuff about DLSS improving performance was interesting too. I wonder how detailed scenes with RTX off would compare to RTX on with a lot more stuff on screen? Would performance boosts from DLSS still be similar with ray tracing turned off? It sounded like DLSS was something that was improving over time, where they could keep chugging away to get more performance boosts. After all, there have already been performance boosts in Battlefield V, haven't there?

In any event, I thought it was a good presentation overall. Anthem is a game I might end up playing, and this kind of made me want to consider one of these cards. Though I'm probably going to stick with my 1080 for now and hope AMD comes out with something better next year.
     
  27. GSDragoon

    GSDragoon [H]Lite

    Messages:
    122
    Joined:
    Feb 24, 2004
    Adaptive sync support for 10 and 20 series cards only? :(
     
  28. big_aug

    big_aug 2[H]4U

    Messages:
    2,108
    Joined:
    Oct 13, 2006
    No mobile 20XX announcements?:cry:
     
  29. lucidrenegade

    lucidrenegade Limp Gawd

    Messages:
    385
    Joined:
    Nov 3, 2011
    Even if you have a non-certified Freesync monitor, you'll still have the option to enable support manually in the Nvidia control panel.
     
    Armenius likes this.
  30. The Cobra

    The Cobra 2[H]4U

    Messages:
    2,656
    Joined:
    Jun 19, 2003
    Let me know when the 2080ti can be had for under $600 per card.
     
    DejaWiz likes this.
  31. Vega

    Vega [H]ardness Supreme

    Messages:
    6,278
    Joined:
    Oct 12, 2004
    Eh? It was in there. Like 40 different laptops coming out with RTX 2080's.
     
    Armenius likes this.
  32. Verge

    Verge [H]ardness Supreme

    Messages:
    6,295
    Joined:
    May 27, 2001
    You meant one digit?


    One integer could be a huge difference.
     
  33. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    19,028
    Joined:
    Jan 28, 2014
  34. GT98

    GT98 [H]ard|Gawd

    Messages:
    1,250
    Joined:
    Aug 29, 2001
Glad I didn't dump $$$ on a new monitor with my new build; I was waiting till the summer to get a new 34" G-Sync monitor.
     
  35. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    19,028
    Joined:
    Jan 28, 2014
    Well, it's not Freesync. If NVIDIA were using Freesync they would have to call it that. Freesync is the software side of things that makes Adaptive-Sync work, and even though it is an open standard AMD themselves are still the only ones using it.
     
  36. NWRMidnight

    NWRMidnight Limp Gawd

    Messages:
    350
    Joined:
    Oct 23, 2010
I don't think that is accurate, as AMD's FreeSync and NVIDIA's new G-Sync Compatible are both the same thing: two pieces of software (drivers) that support the open-standard Adaptive-Sync that all FreeSync monitors use. Basically, all FreeSync is is a trademarked brand name for AMD. The "software" or code in the drivers that supports the open standard pretty much has to be the same to be fully compatible with it, and the standard itself can't be trademarked, as it is open. NVIDIA would be sued if they even mentioned FreeSync compatibility, because they would be using AMD's "FreeSync" branding without permission, even though in the end they are basically the same thing software-wise.
     
    KedsDead likes this.
  37. Nolan7689

    Nolan7689 [H]ard|Gawd

    Messages:
    1,427
    Joined:
    Jun 5, 2015
FreeSync is a royalty-free open standard. Anyone that wants to use it can use it. However, yes, it is AMD's brand name. I could only see NVIDIA being sued if they attempted to claim they created it. Nothing stops them, however, from saying they're using AMD FreeSync technology... but that doesn't sound good next to G-Sync.
     
  38. T_A

    T_A Limp Gawd

    Messages:
    405
    Joined:
    Aug 4, 2005
Oh crap, I was wrong, I have the MG279Q also (IPS, 144Hz). Oh well.
     
  39. NWRMidnight

    NWRMidnight Limp Gawd

    Messages:
    350
    Joined:
    Oct 23, 2010
I suspect there is some misinformation going on here. VESA's Adaptive-Sync is the open standard. FreeSync is AMD's brand name for it, which is trademarked BY AMD. Because AMD was the only manufacturer using the VESA open standard, people conflate FreeSync with the open standard, which it is not. It is just AMD's trademarked name for it, and it cannot be used without their permission.

Hence why it shows the TM after its use in many places on AMD's own site:

    https://www.amd.com/en/technologies/free-sync


The royalty-free part you are referring to is ONLY for monitor manufacturers.

Taken straight from their FAQ:



    AMD has undertaken efforts to encourage broad adoption for Radeon FreeSync technology, including:

    • Royalty-free licensing for monitor manufacturers;
    • Open and standardized monitor requirements (e.g. no non-standard display controllers or ASICs);
    • Industry-standard implementation via the DisplayPort Adaptive-Sync amendment to the DisplayPort 1.2a specification; and
    • interoperability with existing monitor technologies.


    https://www.amd.com/en/technologies/free-sync-faq


NVIDIA is free to use the open VESA Adaptive-Sync standard; they just can't use FreeSync in its name or description in any shape or form without AMD's approval/license.
     
    Last edited: Jan 7, 2019
    russnuck and FrgMstr like this.
  40. Nobu

    Nobu 2[H]4U

    Messages:
    3,276
    Joined:
    Jun 7, 2007
    Yep. http://tmsearch.uspto.gov/bin/showfield?f=doc&state=4803:n6oxle.2.2