JosiahBradley ([H]ard|Gawd) - Joined Mar 19, 2006 - 1,791 messages
ONE INTEGER difference. I have the MG279Q NIEN NIEN NIEN

Thanks, so my monitor will finally be doing some adaptive sync... MG278Q
I'm honestly not surprised. A lot of stuff gets hidden by the raster/render hacks modern games use, so when you switch from that to a different technique you have to go through everything all over again to be sure it looks the way it's supposed to. And then, if you still want to support the old stuff (in this case just lighting, thankfully), you have to go back and make sure the changes you made didn't break that.

I'm really excited about them finally supporting traditional a-sync, even if it has caveats. Beyond that, eh. I've seen worse presentations from Nvidia, but this one is still pretty meh.
Also, we're how many months from the launch of the RTX cards, and so far only a single game supports RT and two (unless BFV didn't add it yet) support DLSS. Outside of Anthem (and no guarantee that support will come at launch), Jensen couldn't even say when any of the other games announced to support either feature would actually get it added.
No one has talked about this yet, but in an upcoming article I'm going to show how limiting even 8GB of VRAM is for DXR in BFV. 6GB is going to restrain the 2060, a LOT.
I can't speak to other DXR games, of course, but then again, there aren't any others right now, are there?
How close does it get to 11GB?
Well, the price is a little better than I expected: 1070 Ti performance for $100 less than what that card launched at. Still nowhere near the price/performance increase that the 10xx generation offered over the 9xx, but better than what we got with the other Turing cards.
We were supposed to get HDR in Rise of the Tomb Raider, and that never happened. Starting to worry that the same thing will happen with RTX and Shadow of the Tomb Raider.

Also, still nothing about DLSS or ray tracing in Rise of the Tomb Raider...
Maybe next CES?
i kid i kid
There are hundreds of monitor models available capable of variable refresh rates (VRR) using the VESA DisplayPort Adaptive-Sync protocol. However, the VRR gaming experience can vary widely.
To improve the experience for gamers, NVIDIA will test monitors. Those that pass our validation tests will be G-SYNC Compatible and enabled by default in the GeForce driver.
G-SYNC Compatible tests will identify monitors that deliver a baseline VRR experience on GeForce RTX 20-series and GeForce GTX 10-series graphics cards, and activate their VRR features automatically.
Support for G-SYNC Compatible monitors will begin Jan. 15 with the launch of our first 2019 Game Ready driver. Already, 12 monitors have been validated as G-SYNC Compatible (from the 400 we have tested so far). We’ll continue to test monitors and update our support list. For gamers who have monitors that we have not yet tested, or that have failed validation, we’ll give you an option to manually enable VRR, too.
https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/
Not sure if this was posted, but Nvidia just posted a list of FreeSync monitors that will support VRR through its drivers. The list is in one of the links in the blog, at the bottom of the table.
Hell froze over, kinda
EDIT: relevant part of the blog:
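The blog excerpt quoted above boils down to this: with a fixed refresh, a finished frame has to wait for the next scanout tick, while with VRR the monitor refreshes whenever a frame is ready, as long as the frame time falls inside the panel's refresh window. A toy sketch of that difference (the frame times and the 48-144Hz window are illustrative assumptions, nothing to do with NVIDIA's actual validation tests):

```python
# Toy model: fixed-refresh vsync vs. variable refresh rate (VRR).
# All numbers below are illustrative assumptions, not from any spec.
import math

def vsync_present(frame_times_ms, refresh_hz=60.0):
    """With a fixed refresh, a finished frame idles until the next scanout tick."""
    interval = 1000.0 / refresh_hz
    t, waits = 0.0, []
    for ft in frame_times_ms:
        t += ft                                  # frame finishes rendering at t
        next_tick = math.ceil(t / interval) * interval
        waits.append(next_tick - t)              # idle time until scanout
        t = next_tick
    return sum(waits) / len(waits)               # average added wait (ms)

def vrr_present(frame_times_ms, min_hz=48.0, max_hz=144.0):
    """With VRR, scanout starts when the frame is ready, clamped to the
    monitor's refresh window (an assumed 48-144 Hz range here)."""
    lo, hi = 1000.0 / max_hz, 1000.0 / min_hz    # shortest/longest scanout gap
    waits = []
    for ft in frame_times_ms:
        scanout_gap = min(max(ft, lo), hi)       # clamp pacing to the window
        waits.append(scanout_gap - min(ft, hi))  # extra wait only below min gap
    return sum(waits) / len(waits)

frames = [12.0, 19.0, 15.5, 23.0, 14.0]          # uneven render times (ms)
print(f"avg added wait, 60 Hz vsync:   {vsync_present(frames):.2f} ms")
print(f"avg added wait, 48-144 Hz VRR: {vrr_present(frames):.2f} ms")
```

With uneven frame times like these, the VRR path adds essentially no wait, which is the whole point; the quality differences NVIDIA is testing for (flicker, blanking, range handling) are on top of that basic behavior.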
Problem with that, though, is that ray tracing has its own set of issues that rasterization does not. So really, all that is happening is trading one well-known set of hacks/tricks for a whole new set of hacks/tricks that probably also won't look quite right. You're back where you started. Unless it stays hybrid forever, in which case the only thing you've succeeded in doing is compounding the complexity and consuming more dev time simply to make it "pretty".
They keep hyping up how the 2060 is more powerful than the 1070 Ti, but they're leaving out the fact that the 1070 Ti has 8GB of VRAM vs 6GB on the 2060. And... "With Turing's RT Cores and Tensor Cores, it can run Battlefield V with ray tracing at 60 frames per second," says Nvidia's official press release... yeah, at 1080p.
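For a rough sense of why that 6GB vs 8GB gap matters with ray tracing on, here's a back-of-envelope sketch. Every figure in it (buffer counts, bytes per pixel, texture pool, BVH size) is an assumption for illustration, not a measured Battlefield V allocation:

```python
# Back-of-envelope VRAM budget. All sizes below are illustrative
# assumptions, NOT measured Battlefield V numbers.

def render_target_mib(width, height, targets=8, bytes_per_pixel=8):
    """Approximate G-buffer/HDR render-target footprint: an assumed
    8 targets at 8 bytes per pixel (e.g. RGBA16F)."""
    return width * height * targets * bytes_per_pixel / 2**20

def budget_gib(width, height, textures_gib=3.5, bvh_gib=1.0, misc_gib=0.5):
    """Add an assumed texture pool, DXR acceleration structures (BVH),
    and misc driver/engine costs on top of the render targets."""
    return render_target_mib(width, height) / 1024 + textures_gib + bvh_gib + misc_gib

for label, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    need = budget_gib(w, h)
    print(f"{label}: ~{need:.1f} GiB assumed -> headroom on 6 GiB: "
          f"{6.0 - need:.1f} GiB, on 8 GiB: {8.0 - need:.1f} GiB")
```

The takeaway from the arithmetic: the resolution-scaled buffers are a small slice of the total, while the fixed costs (textures plus the BVH that DXR adds) eat most of the card, so a 6GB card is left with very little headroom even at 1080p under these assumptions.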
If the rest of your sig is accurate, then you will need to upgrade more than that if you get a new card...

Maybe it's time to upgrade my GTX 970...
One nice thing about the 2060 is that it lowers the barrier to entry for the Turing features, which I think will be important for any sort of market adoption. If your workload fits in 6GB, it's also a cheaper way to get Tensor and RT cores for machine learning or visualization tasks (and if your work scales out, it's cheaper per core than a 2070). It's not a particularly groundbreaking gaming card, but it's also not a step back. Realistically, we're looking at 1070 Ti-ish performance for 1070 Ti prices, but with the added benefit of new features (doesn't hurt to have them) and a newer architecture (likely improved driver support down the line if you keep your card for a couple of years).
You'll see when we publish the article
I'm still writing it!
/hint I see why the TITAN RTX exists now
That list looks more like a sponsor's list, or like someone paid to have each monitor pass, which is most likely the "passing" factor, with no other differences. In fact, I looked up the BenQ XL2740 to see how it differs from my FreeSync BenQ XL2730Z... there are three differences. The XL2740 has a higher refresh rate (240Hz vs 144Hz), the XL2740 is a 1080p monitor vs my 1440p XL2730Z, and the XL2740 does not have FreeSync, or any adaptive sync, listed in its specifications. So it is interesting that a monitor that doesn't list adaptive sync or FreeSync in its specs at all can pass the test.
I'm sorry, I am going off the manufacturer's website, which has NO mention of FreeSync support, not a GPU manufacturer's site that is trying to promote its technology to sell cards. (In fact, no outlet selling the monitor lists FreeSync in its specifications.)
https://zowie.benq.com/en/product/monitor/xl/xl2740.html
I know what the site says. It's technically considered unofficial support. The monitor technically supports it, but it's disabled when using BenQ's blur reduction (which is on by default). It's the same with the XL2540, in fact.
Which is exactly how it works on my XL2730Z: you can't run both motion blur reduction and FreeSync at the same time, it's one or the other, yet FreeSync is listed in the specification. I just find it odd that a monitor that "unofficially" supports FreeSync made the list, but one that fully supports it isn't listed, as I am pretty sure they both offer the same FreeSync experience using identical FreeSync technology, short of the refresh rate differences. As I said, the list is more like a sponsorship list.
No mobile 20XX announcements?
https://www.nvidia.com/en-us/geforce/gaming-laptops/20-series/
Here is a direct link to the table with the already certified monitors listed: https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
G-Sync Compatible models are posted all the way at the bottom. It will be interesting to see what others get added and what the difference will be between G-Sync Compatible monitors and the uncertified ones.
I feel strange; I'm mostly an AMD fanboy, but I enjoyed this presentation from Nvidia. I thought they did a much more thorough presentation about the benefits of ray tracing, with demos. I liked that Chinese MMO with the reflections bouncing all around quite a bit. The effect where the light bounced off the water and showed fluctuating reflections on the stone arch looked great to me.
I still detect something strange in some of the reflections, almost as if they are fuzzier than ideal, but that is likely a function of this ray tracing not being as far along as it will eventually get. But that's fine.
I take the point that performance might still not be ideal with more stuff going on, but a journey of a thousand miles begins with the first step. It is enough that they kickstarted the push for more ray tracing in games.
Nvidia finally caving and supporting FreeSync is a huge win, since they are so dominant. That stuff about DLSS improving performance was interesting too. I wonder how detailed scenes with RTX off would compare to RTX on with a lot more stuff on screen? Would the performance boosts from DLSS still be similar with the ray tracing turned off? It sounded like DLSS is something that improves over time, where they can keep chugging away to get more performance. After all, there have already been performance boosts in Battlefield V, haven't there?
In any event, I thought it was a good presentation overall. Anthem is a game I might end up playing, and this kind of made me want to consider one of these cards. Though I'm probably going to stick with my 1080 for now and hope AMD comes out with something better next year.
Well, it's not Freesync. If NVIDIA were using Freesync they would have to call it that. Freesync is the software side of things that makes Adaptive-Sync work, and even though it is an open standard AMD themselves are still the only ones using it.
I don't think that is accurate, as AMD's FreeSync and Nvidia's new G-Sync Compatible are both the same thing: two pieces of software (drivers) that support the open-standard Adaptive-Sync that all FreeSync monitors use. Basically, all FreeSync is is a trademarked brand name for AMD. The "software," or code in the drivers that supports the open standard, pretty much has to be the same to be fully compatible with it, and the standard itself can't be trademarked, as it is open. Nvidia would be sued if they even mentioned FreeSync compatibility, because they would be using AMD's "FreeSync" branding without permission, even though, in the end, they are basically the same thing software-wise.
I suspect there is some misinformation going on here. VESA's Adaptive-Sync is the open standard. FreeSync is AMD's proprietary brand name for it, which is trademarked BY AMD. Because AMD was the only manufacturer using the VESA open standard, people correlate FreeSync with the open standard, which it is not. It is just AMD's trademarked name for it, and cannot be used without their permission.

FreeSync is a royalty-free open standard. Anyone that wants to use it can use it. However, yes, it is AMD's brand name. I could only see Nvidia being sued if they attempted to claim creation. Nothing stops them, however, from saying they're using AMD FreeSync technology... but that doesn't sound good next to G-Sync.
Yep. http://tmsearch.uspto.gov/bin/showfield?f=doc&state=4803:n6oxle.2.2
Hence why it shows the TM after its use in many places on AMD's own site:
https://www.amd.com/en/technologies/free-sync
The royalty-free part you are referring to is ONLY for monitor manufacturers.
Taken straight from their FAQ:
AMD has undertaken efforts to encourage broad adoption for Radeon FreeSync technology, including:
- Royalty-free licensing for monitor manufacturers;
- Open and standardized monitor requirements (e.g. no non-standard display controllers or ASICs);
- Industry-standard implementation via the DisplayPort Adaptive-Sync amendment to the DisplayPort 1.2a specification; and
- Interoperability with existing monitor technologies.
https://www.amd.com/en/technologies/free-sync-faq
Nvidia is free to use the open VESA Adaptive-Sync standard; they just can't use "FreeSync" in its name or description in any shape or form without AMD's approval/license.