Would you rather new tech and features never get introduced?

I'd rather pay for them when they actually work well enough to be worth spending my hard-earned cash on, not fund nVIDIA's R&D while they are TRYING to develop it
 
I'd rather pay for them when they actually work well enough to be worth spending my hard-earned cash on, not fund nVIDIA's R&D while they are TRYING to develop it

Every video card you have ever bought has funded the R&D of that company while they work on new products and tech. Jensen claimed they have been developing RTX for the last decade so any Nvidia card you bought in the last 10 years has helped fund R&D for RTX, along with everything else they've designed in that time.

Getting the tech out early gets game support rolling. Don't know about you, but I love being able to go back to titles and run them at settings and resolutions that used to seem impossible on hardware I had when they came out.

I said this earlier in the thread, but no one should be buying these cards for the RTX features. The 2080 and 2080 Ti are both likely going to be outstanding 4K, or 1440p/144 Hz, cards.
 
Post after post, RTX feels like the initial Xbox One, when games couldn't hit 1080p at 60 FPS. Remember that? Pepperidge Farm remembers...
 
The only thing I saw in the demos that was drastically better was the BF demo with fire reflections.

Past that, I expected a card with ray tracing cores to NOT take a penalty on hardware designed for it... A little sad.

Plus, the 2080 is the for-the-masses card, and I don't see it having enough horsepower to fuel my high detail settings and refresh rate.

I wonder if we'll see ray tracing add-on cards like the old PhysX ones?

Now that would be funny
 
How about they give consumers an option to not buy an RTX card and offer a GTX equivalent of the 2080 Ti for $700?

I don't want ray tracing. It's not going to be useful yet. And I sure as fuck am not spending $1200 on a card that can barely do it.
 
From a distance, will you be in awe of the lighting, but then as you come closer to an object realize you're looking at a low-res texture? Or will better lighting/reflections actually make that low-res texture look better?
Is it just me, or does what you said EXACTLY describe this video? I kept saying the same thing: from a distance the lighting looks amazing, but as you get close to objects, it looks like textures from 10 years ago.
 
As others have said, it's a bit of a concern if they're saying "1080p, 60 FPS" is the target for a $1200 card in 2018. How much better will it REALLY look using this tech to justify what seems to be a possibly incredible performance hit? Furthermore, if you're launching brand new cards supposedly built with dedicated cores for this sort of ray tracing and AI, and upcharging an arm and a leg compared to previous-generation MSRPs, shouldn't the quality and viability of the features in question be exceptional, as opposed to struggling to reach what used to be an old baseline with it enabled?

Lastly, I'm even more concerned about how different the "RTX" raytracing and AI tech will be. Is this another GSync/PhysX/GameWorks style thing where they are pushing yet another proprietary way to do things? If so, it means that the game experience for those with AMD cards or whatnot may be very different - assuming of course these features really look/feel different - in titles that make use of them, forcing developers to make a choice. If this does come to pass, I'm more interested in seeing what sort of ideally open alternative AMD and others will embrace instead and hope that the industry prefers that interpretation to whatever proprietary crap that Nvidia may decide to push yet again.

Ray tracing in games will be using MS DXR (there is also a Vulkan version). Just like tessellation, async compute, etc., it all comes down to the IHV implementation. So at the very least there will be less concern about the tech being locked to one vendor only.
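Rough sketch of what that looks like in practice (mine, not from any Nvidia or Microsoft doc, so take it as an illustration only): DXR support is exposed through the normal D3D12 feature query, so the same check works regardless of whose GPU is in the box.

    #include <windows.h>
    #include <d3d12.h>

    // Ask the D3D12 device whether it exposes DXR at all. The query itself is
    // vendor-neutral; any IHV whose driver reports Tier 1.0 or better can take
    // the ray tracing path.
    bool SupportsDXR(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &options5, sizeof(options5))))
            return false;
        return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    }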
 
The only thing I saw in the demos that was drastically better was the BF demo with fire reflections.

Past that, I expected a card with ray tracing cores to NOT take a penalty on hardware designed for it... A little sad.

Plus, the 2080 is the for-the-masses card, and I don't see it having enough horsepower to fuel my high detail settings and refresh rate.

I wonder if we'll see ray tracing add-on cards like the old PhysX ones?

Now that would be funny

Doubt it, but I wouldn't be surprised if they add an option in the drivers to use a second RTX card as a ray tracing processor, like they did when they bought PhysX.
 
Doubt it, but I wouldn't be surprised if they add an option in the drivers to use a second RTX card as a ray tracing processor, like they did when they bought PhysX.

Probably right. I won't get sucked in a 2nd time and watch them abandon it once the next thing comes out. I did the dedicated PhysX card for a while. It was a great way to recycle my previous-gen cards and worked great too. Until they stopped supporting it.
 
I don't like that they used the RTX name. This gives the existing RTX a bad name.
 
Hope they can manage 1080p@60 on the 2070 or 2080. I have a 2080 Ti coming, with 2 extra gigarays/sec, so I should be able to manage ultrawide at 60 fps, or slightly above, since it has almost double the CUDA cores.
 
Feels like this is one of those features that everyone gets worked up about but won't actually be usable until 3 generations from now. Just hope the 2080ti is able to provide a significant performance uplift in non-raytracing situations.


I was honestly thinking the same thing when I saw the FPS for DLSS. It just seems that 4K was an afterthought for ray tracing...

I just don't see this being the be-all and end-all past the 1080 Ti. Guess we'll see.
 
How about they give consumers an option to not buy an RTX card ...
I lost it at "give consumers an option."

More seriously, NVidia has to solve the chicken-egg problem: RTX is useless without games that use it, and developers won't make games for it (unless NVidia pays them to) unless there is a substantial installed base of RTX-capable cards. Short-term, offering a "GTX 2080" might net NVidia a few more card sales, but long term it reduces the impact of what they probably hope will be a substantial competitive advantage.

So, sorry, you want an RTX-free card, you'll have to settle for something less than NVidia's 2000-series.
 
So much fanboyism. AMD shills must realize that RTX is showing 2x performance over the previous generation with DLSS (1080 vs 2080) from an article that Kyle posted... which means what, 5x+ over Vega?

It's new tech, and it's very cool, in its first generation of consumer GPUs. You also don't need to turn it on if you prefer frame rate over graphical fidelity. The amount of whining in the forum posts by people who wouldn't buy Nvidia in the first place is borderline obnoxious.

Are you all concerned this is the nail in the coffin for AMD's graphics division? It'd be terrible for the market (see RTX pricing), but many of the arguments against the cards' performance are silly. I get the pricing issue, but you can always get an RX 580... or even a 10xx with the price drops.
 
So much fanboyism. AMD shills must realize that RTX is showing 2x performance over the previous generation with DLSS (1080 vs 2080) from an article that Kyle posted... which means what, 5x+ over Vega?

It's new tech, and it's very cool, in its first generation of consumer GPUs. You also don't need to turn it on if you prefer frame rate over graphical fidelity. The amount of whining in the forum posts by people who wouldn't buy Nvidia in the first place is borderline obnoxious.

Are you all concerned this is the nail in the coffin for AMD's graphics division? It'd be terrible for the market (see RTX pricing), but many of the arguments against the cards' performance are silly. I get the pricing issue, but you can always get an RX 580... or even a 10xx with the price drops.
Nah
 
Are you all concerned this is the nail in the coffin for AMD's graphics division? It'd be terrible for the market (see RTX pricing), but many of the arguments against the cards' performance are silly. I get the pricing issue, but you can always get an RX 580... or even a 10xx with the price drops.
Given the ridiculous size of these Turing chips, NVidia may be targeting high-profit sales to the "1%" of PC gamers that don't care about cost. They may be perfectly willing to let AMD keep competing in the less-profitable "mass market," where the GTX 1060 and below play now.

It's a common business strategy: you make the most profit by selling to the people with the most money.
 
I lost it at "give consumers an option."

More seriously, NVidia has to solve the chicken-egg problem: RTX is useless without games that use it, and developers won't make games for it (unless NVidia pays them to) unless there is a substantial installed base of RTX-capable cards. Short-term, offering a "GTX 2080" might net NVidia a few more card sales, but long term it reduces the impact of what they probably hope will be a substantial competitive advantage.

So, sorry, you want an RTX-free card, you'll have to settle for something less than NVidia's 2000-series.

They're not going to sell many when they're pricing them straight out of reach of most gamers. Those prices are fucking ridiculous.
 
I may have found a solution for having 4k @ 60+FPS: One RTX card for Frame Rates, and a 2nd RTX for Ray Ray.
There, I solved it, you're thankome.
 
They're not going to sell many when they're pricing them straight out of reach of most gamers. Those prices are fucking ridiculous.

Two years since a new line? Yeah, I think they'll sell fine even at those prices. That 2070 is right on the edge of ridiculous. If it gives a decent boost over the 1070 I could see it selling very well.
 
I wonder if the resolution can be offset somewhat by other hacks like checkerboard rendering and upsampling.

Unreal Engine 4 has an upsampling feature and it's shocking how close it can look to the real resolution unless you're really digging. You could definitely lose 20% of your screen res and be hard pressed to notice most of the time.

But yeah, right now it's definitely in that awkward, lanky stage.
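Quick back-of-the-envelope on that (my numbers, not Epic's or Nvidia's; the 80% figure is just a hypothetical render scale): because the scale applies per axis, the shading savings are bigger than the headline percentage suggests.

    #include <cstdio>

    int main()
    {
        const int outW = 3840, outH = 2160;  // output/display resolution
        const float scale = 0.8f;            // hypothetical 80% per-axis render scale

        const int renderW = static_cast<int>(outW * scale);
        const int renderH = static_cast<int>(outH * scale);

        // 80% on each axis means only ~64% of the pixels get shaded before the upscale.
        const float ratio = (renderW * static_cast<float>(renderH)) /
                            (outW * static_cast<float>(outH));
        std::printf("Render %dx%d, upscale to %dx%d (~%.0f%% of the pixel work)\n",
                    renderW, renderH, outW, outH, ratio * 100.0f);
        return 0;
    }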
 
IMO this is similar to programmable shaders.
When they first came out, the first generation of cards could barely take advantage of the features.

Ray tracing will bring huge changes to graphics the same way programmable shaders did. But it will have to go through a similar transition period.

I guess we tend to take that for granted now.
 
IMO this is similar to programmable shaders.
When they first came out, the first generation of cards could barely take advantage of the features.

Ray tracing will bring huge changes to graphics the same way programmable shaders did. But it will have to go through a similar transition period.

I guess we tend to take that for granted now.

It's not surprising. Lots of features are barely usable when they come out, and after a couple of generations or so they're commonplace and the performance hit is at least acceptable. Everything from AA to tessellation, and now ray tracing, are all examples of this.
 
They're not going to sell many when they're pricing them straight out of reach of most gamers. Those prices are fucking ridiculous.
LOL. How many $1000 iPhones has Apple sold?

Sure, a $1000 GPU is too expensive for "most gamers," but the wealthiest 1% can certainly afford it, and that's millions of people.
 
So much fanboyism. AMD shills must realize that RTX is showing 2x performance over the previous generation with DLSS (1080 vs 2080) from an article that Kyle posted... which means what, 5x+ over Vega?

That's not a 2x performance increase as-is, that's a 2x performance increase over whatever unspecified AA method they are using. They could be doing 4x super sampling for all we know.

It's new tech, and it's very cool, in its first generation of consumer GPUs. You also don't need to turn it on if you prefer frame rate over graphical fidelity. The amount of whining in the forum posts by people who wouldn't buy Nvidia in the first place is borderline obnoxious.

While I do find it cool and interesting, I don't think it's unfair to be disappointed in the performance of a card that takes a major hit doing something it has dedicated hardware for. By all accounts the new range doesn't seem that much faster than the 10xx series if all you want is max frames. Definitely not for the money.

I was waiting for this range (and the expected Ti model ~8 months down the line) to buy my next card. I love AMD's generally better-value offerings, but I have more disposable income nowadays and was looking to this launch to be the decider on the next card I bought. Now it looks like I'll just wait and pick up a 1080 Ti for cheap instead.

Are you all concerned this is the nail in the coffin for AMD's graphics division? It'd be terrible for the market (see RTX pricing), but many of the arguments against the cards' performance are silly. I get the pricing issue, but you can always get an RX 580... or even a 10xx with the price drops.

We're all enthusiasts here, and I appreciate people are looking for different things when new tech comes out. But I don't think anyone is unjustified in being disappointed with nVidia's offerings when all they wanted was the next best card to watch their FPS counter go up, especially when they announced the prices. If you just want the bleeding edge and some fancier shadows and reflections, and are willing to take an abnormally high hit to the wallet and a slightly shocking hit to frame rate, be my guest. Someone has to fund the early adoption. But your somewhat derogatory attitude towards people who place value in different things than you do comes across as the whining you derided earlier.
 
They didn't specify: 1080p 60 Hz is the target for which RTX card? The 2080 Ti? The 2080? Or the 2070? Because that makes all the difference.
 
This is why I hope DLSS works well. If it upscales to 1440p from 1080p with RT on that’s doable.
 
This is why I hope DLSS works well. If it upscales to 1440p from 1080p with RT on that’s doable.

From what Anandtech say, the majority of games only support one or the other, not both. There are exceptions though, like Mechwarrior 5.
 
From what Anandtech say, the majority of games only support one or the other, not both. There are exceptions though, like Mechwarrior 5.

I think four or five do both so far. But the card hasn’t even launched yet so I expect that list to expand. It’s a nice combo to have if this is the hit to expect.
 
I think four or five do both so far. But the card hasn’t even launched yet so I expect that list to expand. It’s a nice combo to have if this is the hit to expect.

Oh I'm sure it will. Well the DLSS benefits will depend on what AA nVidia were using for those slides. If they had some crazy supersampling on then the actual benefits of DLSS for the average person won't be so great.

Time will tell I guess. It's only a month, after all.
 
Oh I'm sure it will. Well the DLSS benefits will depend on what AA nVidia were using for those slides. If they had some crazy supersampling on then the actual benefits of DLSS for the average person won't be so great.

Time will tell I guess. It's only a month, after all.

I’d bet they didn’t dedicate die space unless it’s significant... but yeah, time will tell. I am more interested in DLSS than RT tbh.
 
Feels like this is one of those features that everyone gets worked up about but won't actually be usable until 3 generations from now. Just hope the 2080ti is able to provide a significant performance uplift in non-raytracing situations.

Most features don't usually eat a significant area of the die to themselves.
 
Seeing game developers actually targeting these features is honestly pretty cool. I really didn't have much expectation that there would be much dev support for this in games any time soon. It seems I was wrong.

The truth is 4K isn't really that big a deal. HDR is fantastic, sure, but the extra pixels? Truth is, most people can't even see them. If ray tracing is going to start working its way into actual games now, then I'm sorry folks, expect 1080p to be the target resolution for the next 10 years.

After the 2080, the next generation will aim to add even more ray tracing, not do it faster... allow devs to use more of it. It's going to take 3-4 generations at minimum for the mid-range cards to be able to start pushing more than 1080p with tracing on. As long as that is the case, developers are not going to target anything more than 1080p with all the realism features cranked up.

IMO that isn't a bad thing at all for anyone. It's time to stop worrying about 150 FPS vs 160 FPS and get back to looking at IQ. I remember how excited I was the first time I had a card doing hardware lighting in Quake. I would love to get back to buying new hardware and getting excited about turning on new IQ stuff that I couldn't turn on with my old gear.
 
OK, good that they started the tech and have ray tracing working at 2007 resolutions, but I'll pass.
 
All this tells me is that it may be time to upgrade my 660 Ti to a 1060 Ti. That is the price range I would be willing to hit.
 
Too much focus and hype on a technology that's not mature.
I game at 4K. I would get a 2080 Ti only to get higher fps at 4K.
If RT can't be on, it's basically useless and not interesting at all. It's obviously not going to work at 1440p either.
This is the market that will actually buy the 2080 Ti.
I don't see any gamers purchasing a 2080 Ti for 1080p gaming, even if they want RT, because there are just not enough games and the future is uncertain. It could be just another PhysX.
 