Would you rather new tech and features never get introduced?
I'd rather pay for them when they actually work well enough to be worth spending my hard-earned cash on, not fund NVidia's R&D while they are still TRYING to develop it.
Is it just me, or does what you said EXACTLY describe this video? I kept saying the same thing: from a distance the lighting looks amazing, but as you get close to objects, it looks like textures from 10 years ago. From a distance you'll be in awe of the lighting, but as you come closer to an object, will you soon realize you're looking at a low-res texture? Or will better lighting/reflections actually make that low-res texture look better?
As others have said, this is a bit of a concern if they're citing "1080p, 60 FPS" as the target for a $1,200 card in 2018. How much better will it REALLY look using this tech, to justify what seems to be a potentially enormous performance hit? Furthermore, if you're launching brand-new cards supposedly built with dedicated cores for this sort of raytracing and AI, and charging an arm and a leg over previous-generation MSRPs, shouldn't the quality and viability of the features in question be exceptional, rather than struggling to reach what used to be an old baseline with them enabled?
Lastly, I'm even more concerned about how different the "RTX" raytracing and AI tech will be. Is this another G-Sync/PhysX/GameWorks-style thing where they're pushing yet another proprietary way to do things? If so, it means the game experience for those with AMD cards may be very different - assuming, of course, these features really look/feel different - in titles that use them, forcing developers to make a choice. If this does come to pass, I'm more interested in seeing what ideally open alternative AMD and others embrace instead, and I hope the industry prefers that to whatever proprietary approach NVidia may decide to push yet again.
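Worth noting: the raytracing API games are expected to call is Microsoft's DXR in DirectX 12, which is vendor-neutral on paper; "RTX" is NVidia's hardware backend for it. Here's a minimal sketch, assuming the DXR-era Windows 10 SDK headers, of how an engine checks for hardware raytracing with no vendor-specific calls:

```cpp
// Minimal DXR capability probe. Vendor-neutral: the same query works on any
// D3D12 device, whether or not the raytracing tier is backed by RT cores.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // OPTIONS5 carries the raytracing tier; NOT_SUPPORTED means the engine
    // falls back to rasterized lighting, not a different proprietary path.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("Hardware DXR raytracing available.");
    } else {
        std::puts("No hardware DXR on this adapter.");
    }
    return 0;
}
```

If there's a proprietary lock-in risk, it lives more in the GameWorks-style effect libraries layered on top than in the raytracing API itself.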
The only thing I saw in the demos that was drastically better was the Battlefield demo with the fire reflections.
Past that, I expected a card with dedicated ray-tracing cores NOT to take a performance penalty from a feature its hardware was designed for... a little sad.
Plus, the 2080 is the for-the-masses card, and I don't see it having enough horsepower to drive my high detail settings and refresh rate.
I wonder if we'll see ray-tracing add-on cards like the old PhysX ones?
Now that would be funny
Doubt it, but I wouldn't be surprised if they add an option in the drivers to use a second RTX card as a ray tracing processor, like they did when they bought PhysX.
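To be clear, no such driver option has been announced; but as a sketch of what the idea would look like from an application's side (the adapter ordering and the offload scheme here are my assumptions): enumerate the GPUs, render on the first, and give the ray-tracing work its own device on the second.

```cpp
// Hypothetical sketch: using a second GPU as a dedicated ray-tracing
// processor, in the spirit of the old dedicated-PhysX-card setups.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    // Collect all hardware adapters, skipping the software rasterizer.
    std::vector<ComPtr<IDXGIAdapter1>> gpus;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;
        gpus.push_back(adapter);
    }
    if (gpus.size() < 2) { std::puts("No second GPU to offload to."); return 0; }

    // First adapter renders; the second gets its own device that would run
    // the ray-tracing (compute) workload.
    ComPtr<ID3D12Device> renderDev, rtDev;
    D3D12CreateDevice(gpus[0].Get(), D3D_FEATURE_LEVEL_11_0,
                      IID_PPV_ARGS(&renderDev));
    D3D12CreateDevice(gpus[1].Get(), D3D_FEATURE_LEVEL_11_0,
                      IID_PPV_ARGS(&rtDev));

    // The catch: ray-tracing results have to cross PCIe back to the render
    // GPU every frame; that same tax made dedicated PhysX cards marginal.
    return (renderDev && rtDev) ? 0 : 1;
}
```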
^This
Everyone will turn it off to improve fps, just like they did with HairWorks.
Feels like this is one of those features that everyone gets worked up about but won't actually be usable until 3 generations from now. Just hope the 2080ti is able to provide a significant performance uplift in non-raytracing situations.
I lost it at "give consumers an option." How about they give consumers an option to not buy an RTX card ...
I'd rather pay for them when ...
But it's not about when you'd rather pay for it. It's about when you will pay for it.
So much fanboyism. AMD shills must realize that RTX is showing 2x the performance of the previous generation with DLSS (1080 vs. 2080) ...
Nah.
Are you all concerned this is the nail in the coffin for AMD's graphics division? ...
Given the ridiculous size of these Turing chips, NVidia may be targeting high-profit sales to the "1%" of PC gamers who don't care about cost. They may be perfectly willing to let AMD keep competing in the less profitable "mass market," where the GTX 1060 and below play now.
I lost it at "give consumers an option."
More seriously, NVidia has to solve a chicken-and-egg problem: RTX is useless without games that use it, and developers won't make games for it (unless NVidia pays them to) unless there is a substantial installed base of RTX-capable cards. Short term, offering a "GTX 2080" might net NVidia a few more card sales, but long term it would reduce the impact of what they probably hope will be a substantial competitive advantage.
So, sorry: if you want an RTX-free card, you'll have to settle for something less than NVidia's 2000 series.
They're not going to sell many when they're pricing them straight out of reach of most gamers. Those prices are fucking ridiculous.
IMO this is similar to programmable shaders.
When they first came out, the first generation of cards could barely take advantage of the feature.
Raytracing will bring huge changes to graphics the same way programmable shaders did, but it will have to go through a similar transition period.
I guess we tend to take that for granted now.
AA, tessellation, and now raytracing are all examples of this.
They're not going to sell many when they're pricing them straight out of reach of most gamers. Those prices are fucking ridiculous.
LOL. How many $1,000 iPhones has Apple sold?
So much fanboyism. AMD shills must realize that RTX is showing 2x the performance of the previous generation with DLSS (1080 vs. 2080), per an article that Kyle posted... which means what, 5x+ over Vega?
It's new tech, and it's very cool, in its first generation of consumer GPUs. You also don't need to turn it on if you prefer frame rate over graphical fidelity. The amount of whining in these forum posts from people who wouldn't buy NVidia in the first place is borderline obnoxious.
Are you all concerned this is the nail in the coffin for AMD's graphics division? It'd be terrible for the market (see RTX pricing), but many of the arguments against the cards' performance are silly. I get the pricing issue, but you can always get an RX 580... or even a 10-series card with the price drops.
This is why I hope DLSS works well. If it can upscale from 1080p to 1440p with RT on, that's doable.
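A rough back-of-envelope on why that's plausible, assuming (my assumption) that shading and ray cost scale roughly with pixel count:

```cpp
// Pixel-count math for rendering at 1080p and letting DLSS infer 1440p.
#include <cstdio>

int main() {
    const double p1080 = 1920.0 * 1080.0;  // 2,073,600 pixels
    const double p1440 = 2560.0 * 1440.0;  // 3,686,400 pixels

    // Rendering at 1080p shades ~56% of the pixels a native 1440p frame
    // would, roughly a 1.78x cut in per-pixel (and per-ray) work, minus
    // whatever the DLSS inference pass itself costs.
    std::printf("1440p/1080p pixel ratio: %.2fx\n", p1440 / p1080);
    return 0;
}
```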
From what AnandTech says, the majority of games only support one or the other, not both. There are exceptions, though, like MechWarrior 5.
I think four or five do both so far. But the card hasn't even launched yet, so I expect that list to expand. It's a nice combo to have, if this is the performance hit to expect.
Oh, I'm sure it will. The DLSS benefit will depend on what AA NVidia was using for those slides. If they had some crazy supersampling on, then the actual benefit of DLSS for the average person won't be so great.
Time will tell I guess. It's only a month, after all.