Discussion in 'HardForum Tech News' started by Megalith, Jan 6, 2019.
ONE INTEGER difference. I have the MG279Q NEIN NEIN NEIN
I'm honestly not surprised. A lot of stuff gets hidden by the raster/render hacks modern games use, so when you switch to a different technique you have to go over everything again to be sure it looks the way it's supposed to. And then, if you still want to support the old stuff (in this case just lighting, thankfully), you have to go back and make sure the changes you made didn't break that.
Also, still nothing about DLSS or ray tracing in Rise of the Tomb Raider....
Maybe next CES?
i kid i kid
How close does it get to 11GB?
You'll see when we publish the article
I'm still writing it!
/hint I see why the TITAN RTX exists now
Yeah, except last week you could buy an EVGA 1070 Ti for $359 with two free games thrown in... so you're paying 1070 Ti price for 1070 Ti performance (and with less RAM).
We were supposed to get HDR in Rise of the Tomb Raider, and that never happened. Starting to worry that the same thing will happen with RTX and Shadow of the Tomb Raider.
They keep hyping up how the 2060 is more powerful than the 1070 Ti, but they're leaving out the fact that the 1070 Ti has 8GB of VRAM vs 6GB on the 2060. And "With Turing’s RT Cores and Tensor Cores, it can run Battlefield V with ray tracing at 60 frames per second," says Nvidia's official press release... yeah, at 1080p.
Murphy's Law mother fucker!
Not sure if this was posted, but Nvidia just published a list of FreeSync monitors that will support VRR via drivers. The list is in one of the links in the blog, at the bottom of the table.
Hell froze over, kinda
EDIT: relevant part of the blog:
Here is a direct link to the table with the already certified monitors listed: https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
G-Sync Compatible models are posted all the way at the bottom. It will be interesting to see what others get added and what the difference will be between G-Sync Compatible monitors and the uncertified ones.
Problem with that, though, is that ray tracing has its own set of issues that rasterization does not. So really, all that's happening is trading one well-known set of hacks/tricks for a whole new set of hacks/tricks that probably also won't look quite right. You're back where you started. Unless it stays hybrid forever, in which case the only thing anyone has succeeded in doing is compounding the complexity and consuming more dev time simply to make it "pretty".
Them finally opening up to support adaptive sync just made me rethink my next GPU upgrade. Let’s see what AMD has to offer first; if their $250 1080 equivalent pans out, I might just stick with team Red.
Odds are it'll only be FreeSync 2 monitors, since the requirements are far stricter than standard FreeSync's.
The 2070 can't even reliably run 60fps, so there's no way in hell the 2060 is doing it with 6GB of RAM. You'd have to be running low/medium settings with low DXR at 1080p.
Maybe it's time to upgrade my GTX 970...
If the rest of your sig is accurate then you will need to upgrade more than that if you get a new card...
One nice thing about the 2060 is that it lowers the barrier to entry for the Turing features, which I think will be important for getting any sort of market adoption. If your workload fits in 6GB, it's also a cheaper way to get tensor and RT cores for machine learning or visualization tasks (and if your work scales out, it's cheaper per core than a 2070). It's not a particularly groundbreaking gaming card, but it's also not a step back. Realistically, we're looking at 1070 Ti-ish performance for 1070 Ti prices, but with the added benefit of new features (doesn't hurt to have them) and a newer architecture (likely improved driver support down the line if you keep your card for a couple of years).
Not cheap enough, IMO. The sweet spot is still $200-$250 for the mainstream. Until we see video cards all the way down to that level capable of performing well with NVIDIA Ray Tracing, it will be continually hard to gain adoption, IN MY OPINION.
This is why I think it won't happen until the next generation in 2020.
I need to double check but it looks like my AOC AGON 31.5 Monitor is on the approved list https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/ Ill have to check that model number but fairly sure thats it.
You're worrying me. Looking forward to the article, though.
Hopefully AMD announces something competitive soon to drive down prices.
That list looks more like a sponsor's list, or a list of vendors who paid to have each monitor pass, which is most likely the "passing" factor, with no other differences. In fact, I looked up the BenQ XL2740 to see what the difference is between that and my FreeSync BenQ XL2730Z. There are three differences: the XL2740 has a higher refresh rate (240Hz vs 144Hz), the XL2740 is a 1080p monitor vs my 1440p XL2730Z, and the XL2740 does not have FreeSync, or any adaptive sync, listed in its specifications. So it is interesting that a monitor that doesn't list adaptive sync or FreeSync in its specs at all can pass the test. It is possible it is unofficially supported, but it does make a person wonder what the "passing" factor is.
According to AMD, the monitor supports FreeSync.
I am going off the manufacturer's website, which has NO mention of supporting FreeSync, not a GPU manufacturer's site that is trying to promote its technology to sell cards. (In fact, none of the outlets selling the monitor list FreeSync or adaptive sync in its specifications.) It may be unofficially supported, though. Still makes a person wonder about the "passing" qualifications. It's kind of like how Comcast told me that my modem would fully support the 400Mbps speeds on their network... uh, it doesn't, per the manufacturer and actual usage after the upgrade. I had to buy a different modem to get the full speed.
I know what the site says. It's technically considered unofficial support. The monitor technically supports it, but it's disabled when using BenQ's blur reduction (which is on by default). It's the same with the XL2540, in fact.
Which is exactly how it works on my XL2730Z: you can't run both motion blur reduction and FreeSync at the same time, it's one or the other, yet FreeSync is listed in the specifications. I just find it odd that a monitor that "unofficially" supports FreeSync is listed, but one that fully supports it isn't, as I am pretty sure they both deliver the same FreeSync experience using identical FreeSync technology, short of the refresh rate differences. As I said, the list reads more like a sponsorship list.
If it was a sponsored list then the BenQ wouldn't be on there as BenQ does not seem at all interested in acknowledging that monitor's adaptive sync abilities.
I feel strange. I'm mostly an AMD fanboy, but I enjoyed this presentation from Nvidia. I thought they did a much more thorough presentation about the benefits of ray tracing, with demos. I liked that Chinese MMO with the reflections bouncing all around quite a bit. The effect where the light bounced off the water and cast fluctuating reflections on the stone arch looked great to me.
I still detect a strange visual on some of the reflections, almost as if they are fuzzier than ideal, but that is likely a function of this ray tracing not yet being as far along as it will eventually get. But that's fine.
I take the point that performance might still not be ideal with more stuff going on, but a journey of a thousand miles begins with the first step. It is enough that they kickstarted the push for more ray tracing in games.
Nvidia finally caving and supporting FreeSync is a huge win, since they are so dominant. That stuff about DLSS improving performance was interesting too. I wonder how detailed scenes with RTX off would compare to RTX on with a lot more stuff on screen. Would performance boosts from DLSS still be similar with the ray tracing turned off? It sounded like DLSS was something that improves over time, where they could keep chugging away to get more performance gains. After all, there have already been performance boosts in Battlefield 5, haven't there?
In any event, I thought it was a good presentation overall. Anthem is a game I might end up playing, and this kind of made me want to consider one of these cards. Though I'm probably going to stick with my 1080 for now and hope AMD comes out with something better next year.
Adaptive sync support for 10 and 20 series cards only?
No mobile 20XX announcements?
Even if you have a non-certified Freesync monitor, you'll still have the option to enable support manually in the Nvidia control panel.
Let me know when the 2080ti can be had for under $600 per card.
Eh? It was in there. Like 40 different laptops coming out with RTX 2080's.
You meant one digit?
One integer could be a huge difference.
Glad I didn't dump $$$ on a new monitor with my new build. I was waiting until the summer to get a new 34" G-Sync monitor.
Well, it's not Freesync. If NVIDIA were using Freesync they would have to call it that. Freesync is the software side of things that makes Adaptive-Sync work, and even though it is an open standard AMD themselves are still the only ones using it.
I don't think that is accurate, as AMD's FreeSync and Nvidia's new G-Sync Compatible are both the same thing: two pieces of software (drivers) that support the open-standard Adaptive-Sync that all FreeSync monitors use. Basically, all FreeSync is, is a trademarked brand name for AMD. The "software", or code in the drivers, that supports the open standard pretty much has to be the same to be fully compatible with it, and the standard itself can't be trademarked, as it is open. Nvidia would be sued if they even mentioned FreeSync compatibility, because they would be using AMD's "FreeSync" brand name without permission, even though in the end they are basically the same thing software-wise.
FreeSync is a royalty-free open standard. Anyone that wants to use it can use it. However, yes, it is AMD's brand name. I could only see Nvidia being sued if they attempted to claim they created it. Nothing stops them from saying they’re using AMD FreeSync technology... but that doesn’t sound good next to G-Sync.
Oh crap, I was wrong. I have the MG279Q also (IPS, 144Hz). Oh well.
I suspect there is some misinformation going on here. VESA's Adaptive-Sync is the open standard. FreeSync is AMD's proprietary brand name for it, which is trademarked BY AMD. Because AMD was the only manufacturer using the VESA open standard, people conflate FreeSync with the open standard, which it is not. It is just AMD's trademarked name for it, and it cannot be used without their permission.
Hence why it shows the TM after its use in many places on AMD's own site.
The royalty-free part you are referring to is ONLY for monitor manufacturers.
Taken straight from their FAQ:
AMD has undertaken efforts to encourage broad adoption for Radeon FreeSync technology, including:
Royalty-free licensing for monitor manufacturers;
Open and standardized monitor requirements (e.g. no non-standard display controllers or ASICs);
Industry-standard implementation via the DisplayPort Adaptive-Sync amendment to the DisplayPort 1.2a specification; and
Interoperability with existing monitor technologies.
Nvidia is free to use the open VESA Adaptive-Sync standard; they just can't use "FreeSync" in its name or description in any shape or form without AMD's approval/license.