NVIDIA CES 2019 Press Event: Watch the Livestream Tonight at 8 PM Pacific

I'm really excited about them finally supporting standard adaptive sync, even if it has caveats. Beyond that, eh. I've seen worse presentations from Nvidia, but this one is still pretty meh.

Also, we're how many months out from the launch of the RTX cards, and so far only a single game supports RT and two (unless BFV didn't add it yet) support DLSS. Outside of Anthem (and no guarantee that support will come at launch), Jensen couldn't even say when any of the other games announced to support either feature would get it, never mind when those features would actually be added.
I'm honestly not surprised. A lot of stuff gets hidden by the raster/render hacks modern games use, so when you switch from that to a different technique you have to go through everything all over again to be sure it looks the way it's supposed to. And then, if you still want to support the old stuff (in this case just lighting, thankfully), you have to go back and make sure the changes you made didn't break that.
 
Also, still nothing about DLSS or ray tracing in Rise of the Tomb Raider....

Maybe next CES?

i kid i kid
 
No one has talked about this yet, but in an upcoming article I'm going to show how limiting even 8GB of VRAM is for DXR in BFV. 6GB is going to constrain the 2060, a LOT.

I can't speak about other DXR games of course, but then again, there aren't any others right now, are there? :p
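If you want to keep an eye on your own VRAM usage while you play, here's a rough sketch of one way to log it alongside a run (just a Python loop around nvidia-smi, which needs to be on your PATH; the one-second interval and the vram_log.csv filename are arbitrary placeholders, and this isn't necessarily how the article's numbers are being gathered):

# Polls nvidia-smi once a second and appends total/used VRAM to a CSV.
# Stop it with Ctrl+C once your benchmark run is done.
import csv
import subprocess
import time

def query_vram():
    # Ask the driver for total and used memory (MiB) on the first GPU,
    # in machine-readable form.
    out = subprocess.check_output(
        ["nvidia-smi", "--id=0",
         "--query-gpu=memory.total,memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    total_mib, used_mib = (int(x) for x in out.strip().split(", "))
    return total_mib, used_mib

with open("vram_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "total_mib", "used_mib"])
    try:
        while True:
            total, used = query_vram()
            writer.writerow([time.time(), total, used])
            f.flush()
            time.sleep(1)
    except KeyboardInterrupt:
        pass

One caveat: memory.used is what the driver has committed, not necessarily what the game actively needs every frame, so treat it as an upper bound rather than a hard requirement.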


How close does it get to the full 11GB?
 
Well, the price is a little better than I expected: 1070 Ti performance for $100 less than what that launched at. Still nowhere near the price/performance increase that the 10xx gen offered over the 9xx, but better than what we got with the other Turing cards.

yeah, except last week you could buy an EVGA 1070 Ti for $359 with 2 free games thrown in... so you're paying 1070 Ti price for... 1070 Ti performance (also with less RAM)
 
Also, still nothing about DLSS or ray tracing in Rise of the Tomb Raider....

Maybe next CES?

i kid i kid
We were supposed to get HDR in Rise of the Tomb Raider, and that never happened. Starting to worry that the same thing will happen with RTX and Shadow of the Tomb Raider.
 
they keep hyping up how the 2060 is more powerful than the 1070 Ti, but they're leaving out the fact that the 1070 Ti has 8GB of VRAM vs 6GB on the 2060... and... "With Turing’s RT Cores and Tensor Cores, it can run Battlefield V with ray tracing at 60 frames per second," says Nvidia's official press release... yeah, at 1080p
 
We were supposed to get HDR in Rise of the Tomb Raider, and that never happened. Starting to worry that the same thing will happen with RTX and Shadow of the Tomb Raider.

Murphy's Law mother fucker!
 
https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/

Not sure if this was posted, but nVidia just posted a list of FreeSync monitors that will get VRR support through its drivers. The list is in one of the links in the blog, at the bottom of the table.

Hell froze over, kinda

EDIT: relevant part of the blog:

There are hundreds of monitor models available capable of variable refresh rates (VRR) using the VESA DisplayPort Adaptive-Sync protocol. However, the VRR gaming experience can vary widely.

To improve the experience for gamers, NVIDIA will test monitors. Those that pass our validation tests will be G-SYNC Compatible and enabled by default in the GeForce driver.

G-SYNC Compatible tests will identify monitors that deliver a baseline VRR experience on GeForce RTX 20-series and GeForce GTX 10-series graphics cards, and activate their VRR features automatically.

Support for G-SYNC Compatible monitors will begin Jan. 15 with the launch of our first 2019 Game Ready driver. Already, 12 monitors have been validated as G-SYNC Compatible (from the 400 we have tested so far). We’ll continue to test monitors and update our support list. For gamers who have monitors that we have not yet tested, or that have failed validation, we’ll give you an option to manually enable VRR, too.
 
https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/

Not sure if this was posted, but nVidia just posted a list of FreeSync monitors that will get VRR support through its drivers. The list is in one of the links in the blog, at the bottom of the table.

Hell froze over, kinda

EDIT: relevant part of the blog:

Here is a direct link to the table with the already certified monitors listed: https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

G-Sync Compatible models are posted all the way at the bottom. It'll be interesting to see what others get added and what the difference will be between G-Sync Compatible monitors and the uncertified ones.
 
I'm honestly not surprised. A lot of stuff gets hidden by the raster/render hacks modern games use, so when you switch from that to a different technique you have to go through everything all over again to be sure it looks the way it's supposed to. And then, if you still want to support the old stuff (in this case just lighting, thankfully), you have to go back and make sure the changes you made didn't break that.
The problem with that, though, is that ray tracing has its own set of issues that rasterization does not. So really, all that's happening is trading one well-known set of hacks/tricks for a whole new set of hacks/tricks that probably -also- won't really look quite right. You're back where you started. Unless it stays hybrid forever, in which case the only thing you've succeeded in doing is compounding the complexity and consuming more dev time simply to make it "pretty".
 
Them finally opening up to adaptive sync support just made me rethink my next GPU upgrade. Let's see what AMD has to offer first; if their $250 1080 equivalent pans out I might just stick with team Red.
 
https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/

Not sure if this was posted, but nVidia just posted a list of FreeSync monitors that will get VRR support through its drivers. The list is in one of the links in the blog, at the bottom of the table.

Hell froze over, kinda

EDIT: relevant part of the blog:

odds are it'll only be FreeSync 2 monitors, since the requirements there are far stricter than standard FreeSync.

they keep hyping up how the 2060 is more powerful than the 1070 Ti, but they're leaving out the fact that the 1070 Ti has 8GB of VRAM vs 6GB on the 2060... and... "With Turing’s RT Cores and Tensor Cores, it can run Battlefield V with ray tracing at 60 frames per second," says Nvidia's official press release... yeah, at 1080p

the 2070 can't even reliably hold 60fps; there's no way in hell the 2060 is doing it with 6GB of RAM... you'd have to be running low/medium settings with low DXR at 1080p.
 
One nice thing about the 2060 is that it lowers the barrier to entry for the Turing features, which I think will be important for getting any sort of market adoption. If your workload fits in 6GB, it's also a cheaper way to get Tensor and RT cores for machine learning or visualization tasks (and if your work scales out, it's cheaper per core than a 2070). It's not a particularly groundbreaking gaming card, but it's also not a step back: realistically, we're looking at 1070 Ti-ish performance for 1070 Ti prices, but with the added benefit of new features (doesn't hurt to have them) and a newer architecture (likely improved driver support down the line if you keep your card for a couple of years).
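On the ML angle, a quick sanity check like the one below (PyTorch; the tiny conv net and batch size are made-up placeholders, not any particular workload) is a decent way to see whether your model plus a training batch actually fits in 6GB before you commit to the card:

# Rough check of peak VRAM for a model plus one training step.
# Swap the placeholder network and batch for your real workload.
import torch
import torch.nn as nn

device = torch.device("cuda")
total_gb = torch.cuda.get_device_properties(device).total_memory / 1024**3

model = nn.Sequential(                      # stand-in for a real network
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(128, 1000),
).to(device)

batch = torch.randn(32, 3, 224, 224, device=device)
model(batch).sum().backward()               # forward + backward so gradients count too

peak_gb = torch.cuda.max_memory_allocated(device) / 1024**3
print(f"Peak allocated: {peak_gb:.2f} GB of {total_gb:.2f} GB")

Peak allocated is a lower bound on what you'd need in practice (optimizer state and the CUDA context add more on top), but if that number is already near 6GB, the 2060 is going to be a tight fit.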
 
One nice thing about the 2060 is that it lowers the barrier to entry for the Turing features, which I think will be important for getting any sort of market adoption. If your workload fits in 6GB, it's also a cheaper way to get Tensor and RT cores for machine learning or visualization tasks (and if your work scales out, it's cheaper per core than a 2070). It's not a particularly groundbreaking gaming card, but it's also not a step back: realistically, we're looking at 1070 Ti-ish performance for 1070 Ti prices, but with the added benefit of new features (doesn't hurt to have them) and a newer architecture (likely improved driver support down the line if you keep your card for a couple of years).

Not cheap enough, IMO. The sweet spot is still $200-$250 for the mainstream. Until we see video cards all the way down to that level capable of performing well with NVIDIA Ray Tracing, it will continue to be hard for it to gain adoption, IN MY OPINION.

This is why I think it won't happen until the next generation in 2020.
 
I need to double-check, but it looks like my AOC AGON 31.5 monitor is on the approved list: https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/ I'll have to check the model number, but I'm fairly sure that's it.

also

You'll see when we publish the article :p

I'm still writing it!

/hint I see why the TITAN RTX exists now


You're worrying me; looking forward to the article though.

Hopefully AMD announces something competitive soon to drive down prices.
 

That list looks more like a sponsor's list, or a list of those who paid to have each monitor pass, which is most likely the "passing" factor, with no other differences. In fact, I looked up the BenQ XL2740 to see what the difference is between it and my FreeSync BenQ XL2730Z... there are three differences. The XL2740 has a higher refresh rate (240Hz vs 144Hz), the XL2740 is a 1080p monitor vs my 1440p XL2730Z, and the XL2740 does not have FreeSync, or any adaptive sync, listed in its specifications. So it is interesting that a monitor that doesn't have adaptive sync or FreeSync listed in its specs at all can pass the test. It is possible it is unofficially supported, but it does make a person wonder what the "passing" factor is.
 
That list looks more like a sponsor's list, or a list of those who paid to have each monitor pass, which is most likely the "passing" factor, with no other differences. In fact, I looked up the BenQ XL2740 to see what the difference is between it and my FreeSync BenQ XL2730Z... there are three differences. The XL2740 has a higher refresh rate (240Hz vs 144Hz), the XL2740 is a 1080p monitor vs my 1440p XL2730Z, and the XL2740 does not have FreeSync, or any adaptive sync, listed in its specifications. So it is interesting that a monitor that doesn't have adaptive sync or FreeSync listed in its specs at all can pass the test.



According to AMD, the monitor supports freesync.
 

According to AMD, the monitor supports freesync.

I am going off the manufacturer's web site, which has NO mention of supporting FreeSync, not a GPU manufacturer's site that is trying to promote their technology to sell cards. (In fact, none of the outlets selling the monitor list FreeSync or adaptive sync in its specifications.) It may be unofficially supported, though. Still makes a person wonder about the "passing" qualifications. It's kind of like how Comcast told me that my modem would fully support the 400Mbps speeds on their network... uh, it doesn't, per the manufacturer and actual usage after the upgrade; I had to buy a different modem to get the full speed.

https://zowie.benq.com/en/product/monitor/xl/xl2740.html
 
I'm sorry, I am going off the manufacturer's web site, which has NO mention of supporting FreeSync, not a GPU manufacturer's site that is trying to promote their technology to sell cards. (In fact, none of the outlets selling the monitor list FreeSync in its specifications.)

https://zowie.benq.com/en/product/monitor/xl/xl2740.html

I know what the site says. It's technically considered unofficial support. The monitor technically supports it, but it's disabled when using BenQ's blur reduction (which is on by default). It's the same with the XL2540, in fact.
 
I know what the site says. It's technically considered unofficial support. The monitor technically supports it, but it's disabled when using BenQ's blur reduction (which is on by default). It's the same with the XL2540, in fact.

which is exactly how it works on my XL2730Z: you can't run motion blur reduction and FreeSync at the same time, it's one or the other, yet FreeSync is listed in the specifications. I just find it odd that a monitor that "unofficially" supports FreeSync is listed but one that fully supports it isn't, as I am pretty sure they both deliver the same FreeSync experience using identical FreeSync technology, short of the refresh rate differences. As I said, the list is more like a sponsorship list.
 
which is exactly how it works on my XL2730Z: you can't run motion blur reduction and FreeSync at the same time, it's one or the other, yet FreeSync is listed in the specifications. I just find it odd that a monitor that "unofficially" supports FreeSync is listed but one that fully supports it isn't, as I am pretty sure they both deliver the same FreeSync experience using identical FreeSync technology, short of the refresh rate differences. As I said, the list is more like a sponsorship list.

If it were a sponsored list, then the BenQ wouldn't be on there, as BenQ does not seem at all interested in acknowledging that monitor's adaptive sync abilities.
 
I feel strange: I'm mostly an AMD fanboy, but I enjoyed this presentation from Nvidia. I thought they did a much more thorough job of presenting the benefits of ray tracing with demos. I liked that Chinese MMO with the reflections bouncing all around quite a bit. The effect where the light bounced off the water and cast fluctuating reflections on the stone arch looked great to me.

I still detect something strange in some of the reflections, almost as if they are fuzzier than ideal, but that is likely a function of this ray tracing not being as far along as it will eventually get. But that's fine.

I take the point that performance might still not be ideal with more stuff going on, but a journey of a thousand miles begins with the first step. It is enough that they kickstarted the push for more ray tracing in games.

Nvidia finally caving and supporting FreeSync is a huge win, since they are so dominant. That stuff about DLSS improving performance was interesting too. I wonder how detailed scenes with RTX off would compare to RTX on with a lot more stuff on screen? Would the performance boosts from DLSS still be similar with ray tracing turned off? It sounded like DLSS is something that improves over time, where they can keep chugging away to get more performance boosts over time. After all, there have already been performance boosts in Battlefield V, haven't there?

In any event, I thought it was a good presentation overall. Anthem is a game I might end up playing, and this kind of made me want to consider one of these cards. Though I'm probably going to stick with my 1080 for now and hope AMD comes out with something better next year.
 
Even if you have a non-certified Freesync monitor, you'll still have the option to enable support manually in the Nvidia control panel.
 
Here is a direct link to the table with the already certified monitors listed: https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

G-Sync Compatible models are posted all the way at the bottom. It'll be interesting to see what others get added and what the difference will be between G-Sync Compatible monitors and the uncertified ones.

Glad I didn't dump $$$ on a new monitor with my new build; I was waiting till the summer to get a new 34" G-Sync monitor.
 
I feel strange: I'm mostly an AMD fanboy, but I enjoyed this presentation from Nvidia. I thought they did a much more thorough job of presenting the benefits of ray tracing with demos. I liked that Chinese MMO with the reflections bouncing all around quite a bit. The effect where the light bounced off the water and cast fluctuating reflections on the stone arch looked great to me.

I still detect something strange in some of the reflections, almost as if they are fuzzier than ideal, but that is likely a function of this ray tracing not being as far along as it will eventually get. But that's fine.

I take the point that performance might still not be ideal with more stuff going on, but a journey of a thousand miles begins with the first step. It is enough that they kickstarted the push for more ray tracing in games.

Nvidia finally caving and supporting FreeSync is a huge win, since they are so dominant. That stuff about DLSS improving performance was interesting too. I wonder how detailed scenes with RTX off would compare to RTX on with a lot more stuff on screen? Would the performance boosts from DLSS still be similar with ray tracing turned off? It sounded like DLSS is something that improves over time, where they can keep chugging away to get more performance boosts over time. After all, there have already been performance boosts in Battlefield V, haven't there?

In any event, I thought it was a good presentation overall. Anthem is a game I might end up playing, and this kind of made me want to consider one of these cards. Though I'm probably going to stick with my 1080 for now and hope AMD comes out with something better next year.
Well, it's not Freesync. If NVIDIA were using Freesync they would have to call it that. Freesync is the software side of things that makes Adaptive-Sync work, and even though it is an open standard AMD themselves are still the only ones using it.
 
Well, it's not Freesync. If NVIDIA were using Freesync they would have to call it that. Freesync is the software side of things that makes Adaptive-Sync work, and even though it is an open standard AMD themselves are still the only ones using it.

I don't think that is accurate, as AMD's FreeSync and Nvidia's new G-Sync Compatible are both the same thing: two pieces of software (drivers) that support the open standard Adaptive-Sync that all FreeSync monitors use. Basically, all FreeSync is, is a trademarked brand name for AMD. The "software" or code in the drivers that supports the open standard pretty much has to be the same to be fully compatible with that standard, which can't be owned by anyone, as it is part of the open standard. Nvidia would be sued if they even mentioned FreeSync compatibility, because they'd be using AMD's "FreeSync" brand name without permission, even though in the end they are basically the same thing software-wise.
 
I don't think that is accurate, as AMD's FreeSync and Nvidia's new G-Sync Compatible are both the same thing: two pieces of software (drivers) that support the open standard Adaptive-Sync that all FreeSync monitors use. Basically, all FreeSync is, is a trademarked brand name for AMD. The "software" or code in the drivers that supports the open standard pretty much has to be the same to be fully compatible with that standard, which can't be owned by anyone, as it is part of the open standard. Nvidia would be sued if they even mentioned FreeSync compatibility, because they'd be using AMD's "FreeSync" brand name without permission, even though in the end they are basically the same thing software-wise.

FreeSync is a royalty-free open standard. Anyone that wants to use it can. However, yes, it is AMD's brand name. I could only see Nvidia being sued if they attempted to claim they created it. Nothing stops them, however, from saying they're using AMD FreeSync technology... but that doesn't sound as good as G-Sync.
 
FreeSync is a royalty-free open standard. Anyone that wants to use it can. However, yes, it is AMD's brand name. I could only see Nvidia being sued if they attempted to claim they created it. Nothing stops them, however, from saying they're using AMD FreeSync technology... but that doesn't sound as good as G-Sync.
I suspect there is some misinformation going on here. VESA's Adaptive-Sync is the open standard. FreeSync is AMD's proprietary brand name for it, which is trademarked BY AMD. Because AMD was the only GPU maker using the VESA open standard, people equate FreeSync with the open standard, which it is not. It is just AMD's trademarked name for it, and cannot be used without their permission.

Hence why it shows the TM after its use in many places on AMD's own site:

https://www.amd.com/en/technologies/free-sync


The royalty-free part you are referring to is ONLY for monitor manufacturers.

Taken straight from their FAQ:



AMD has undertaken efforts to encourage broad adoption for Radeon FreeSync technology, including:

  • Royalty-free licensing for monitor manufacturers;
  • Open and standardized monitor requirements (e.g. no non-standard display controllers or ASICs);
  • Industry-standard implementation via the DisplayPort Adaptive-Sync amendment to the DisplayPort 1.2a specification; and
  • Interoperability with existing monitor technologies.


https://www.amd.com/en/technologies/free-sync-faq


Nvidia is free to use the open VESA Adaptive-Sync standard; they just can't use "FreeSync" in its name or description in any shape or form without AMD's approval/license.
 
I suspect there is some misinformation going on here. VESA's Adaptive-Sync is the open standard. FreeSync is AMD's proprietary brand name for it, which is trademarked BY AMD. Because AMD was the only GPU maker using the VESA open standard, people equate FreeSync with the open standard, which it is not. It is just AMD's trademarked name for it, and cannot be used without their permission.

Hence why it shows the TM after its use in many places on AMD's own site:

https://www.amd.com/en/technologies/free-sync


The royalty-free part you are referring to is ONLY for monitor manufacturers.

Taken straight from their FAQ:



AMD has undertaken efforts to encourage broad adoption for Radeon FreeSync technology, including:

  • Royalty-free licensing for monitor manufacturers;
  • Open and standardized monitor requirements (e.g. no non-standard display controllers or ASICs);
  • Industry-standard implementation via the DisplayPort Adaptive-Sync amendment to the DisplayPort 1.2a specification; and
  • Interoperability with existing monitor technologies.


https://www.amd.com/en/technologies/free-sync-faq


Nvidia is free to use the open VESA Adaptive-Sync standard; they just can't use "FreeSync" in its name or description in any shape or form without AMD's approval/license.
Yep. http://tmsearch.uspto.gov/bin/showfield?f=doc&state=4803:n6oxle.2.2
 