AMD to reveal next-gen ‘Nvidia killer’ graphics card at CES?

Your rebuttal is literally "they'll do it because it's possible despite the drawbacks!"

Okay.
If Vulkan gets big enough they'll have to pay for multi-platform games to be ported to DX...

DXR dropped in Q4 2018. In less than a year we had AAA games on store shelves supporting RT.
You can hate on Nvidia all you like but their developer relations are on point.

Two, maybe three games at most in a year? I wouldn't be crowing about it, but it's a start nonetheless. What makes you think they only got DXR when it was public? That would be a little strange, don't you think, for such an important standard? They do have very good relationships, which probably wouldn't be possible without cash and free resources.
Either way, glad to see them giving it a go. I just think the hardware still has a while to go to do it at the level needed, especially outside the 1% or so that has a 2080+ level card. So naturally devs are only going to go so far if it's subsidised. As close to full-screen RT as possible is the goal, but the perf hit with just partial RT is already high currently.

I hope so. Watching RTX videos, the RT reflections are kinda jarring and just stick out way beyond normal, making it look even more artificial or contrived. Subtlety has a more impactful realism.
This.
Current RT implementations remind me of Avatar in 3D, with the over-the-top 3D effects in many places.
 
I hope so. Watching RTX videos, the RT reflections are kinda jarring and just stick out way beyond normal, making it look even more artificial or contrived. Subtlety has a more impactful realism.

That's why Metro Exodus looks so good with Ray Tracing. The effects are not over the top. This is the best example I have personally seen so far.
 
Two, maybe three games at most in a year? I wouldn't be crowing about it, but it's a start nonetheless.

Why wouldn't you be crowing? Look at the names. Battlefield, Tomb Raider, Control, Call of Duty, Metro. That's big time. Metro and Control in particular went all in on RT.

What makes you think they only got DXR when it was public? That would be a little strange, don't you think, for such an important standard? They do have very good relationships, which probably wouldn't be possible without cash and free resources.

They certainly had early access, but looking at the time between availability of the API and having product on shelves, we can compare to other feature launches. Remember how long it took for us to get proper DX11 or DX12 games?

Either way, glad to see them giving it a go. I just think the hardware still has a while to go to do it at the level needed, especially outside the 1% or so that has a 2080+ level card. So naturally devs are only going to go so far if it's subsidised. As close to full-screen RT as possible is the goal, but the perf hit with just partial RT is already high currently.

It's not just about hardware performance. There's also a learning curve for developers. It would be naive to expect a flood of raytraced games overnight. Though I hope that happens with this upcoming console generation.
 
That's why Metro Exodus looks so good with Ray Tracing. The effects are not over the top. This is the best example I have personally seen so far.

While I don't care much for Russian B movies, I agree. Also, Deliver Us The Moon is pretty cool as far as RT goes.

In the end, I'm just happy it's here (RT) and that I get to mess around with it.
 
Why wouldn't you be crowing? Look at the names. Battlefield, Tomb Raider, Control, Call of Duty, Metro. That's big time. Metro and Control in particular went all in on RT.
Yeah, big names, but slow to roll out considering there is support from Nvidia. I would have expected a few more, but they are now trickling out. As you said, the development schedule is tight and it's not so easy to just drop everything and add RT.
They certainly had early access, but looking at the time between availability of the API and having product on shelves, we can compare to other feature launches. Remember how long it took for us to get proper DX11 or DX12 games?
True... DX levels can take forever. In that regard they have done pretty well... thanks for the reminder.
It's not just about hardware performance. There's also a learning curve for developers. It would be naive to expect a flood of raytraced games overnight. Though I hope that happens with this upcoming console generation.
Yeah, I'm hoping consoles will shake things up a little; they are getting closer and closer to PC design with each iteration now. More Vulkan would be great to see too.
 
Yeah, big names, but slow to roll out considering there is support from Nvidia. I would have expected a few more, but they are now trickling out. As you said, the development schedule is tight and it's not so easy to just drop everything and add RT.

Well that's exactly it. I don't understand why you think this is a slow rollout. What would you consider a fast rollout and how would you make that happen?
 
the groundhog did not see his shadow and announced to both amd and nvidia, "3 more months of ZZZZZZ videocard news"

edit: spelled nvidia as invadia
 
Well that's exactly it. I don't understand why you think this is a slow rollout. What would you consider a fast rollout and how would you make that happen?
I just expected them to add a few more titles to the fray within a year. Two, barely three (one is really a gazillion-year-old game), was a little disappointing.
 
It is not just that titles have RTX, but what that RTX offers. The results overall tend to be bad for RTX cards: running out of VRAM, poor frame rates, big swings in frame rates, stutters/stops, having to turn down settings and lower the resolution, which counteracts more than the benefits received. DLSS did not really make it better; quality was even worse when using it. The majority of RTX cards (2060, 2060 Super, 2070, 2070 Super) may not have much use for RTX. Even with the 2080 Ti one will need to compromise on frame rates, resolution and maybe other items, but overall it could balance out and hopefully deliver something better using RTX; to me that seems the only viable card at this stage with the games out. The 2080 and 2080 Super are probably too limited too.

As for AMD, I don't think they will have any better magic; the more holistic the approach between software and hardware, the better. The 2080 Ti, with its theoretical 10 billion rays per second of RT ability, should be able to do 20 rays per pixel at 60 fps at 4K. 20 rays per pixel is not even low quality (low quality is more like 100-200) for rendering a scene. At 1440p 60 fps that becomes about 45 rays per pixel, which is better but still below low quality. And if you account for light bounces, divide that budget by the number of bounces.

At lower resolutions and frame rates one should actually be able to do very low-quality, fully ray-traced simple scenes where all the lighting is calculated by ray tracing. Since that would not go over too well for gaming, a mixture of RT techniques is used to enhance rasterization. That means an even lower rays-per-pixel budget, like 1 to 2, gets used for whatever effect RT is applied to, like reflections or shadows, to enhance quality or accuracy. The catch is that you cannot cull much of the normally unseen geometry, causing RAM usage to skyrocket and adding a lot more processing on top of rasterization: you're no longer rendering just what the camera sees, all the geometry is now being processed.

Crytek was able to do reflections pretty well with ordinary, non-RT-dedicated hardware, so there is much hope for a better software solution, maybe a combination of camera-view and reflected-surface-view culling so not all the geometry has to be present slowing everything down. Getting the hardware to do even more samples per second could help, but even at 4x one will still need very fast, high-quality denoising, and it will most likely still look worse quality-wise than a rasterized scene with 1000+ samples baked into its textures/light maps.
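To make that ray-budget arithmetic concrete, here is a quick back-of-the-envelope sketch (Python, using the 10 gigarays/s marketing figure from above; real throughput varies heavily with the scene):

Code:
# Rays available per pixel per frame = ray throughput / (pixels * fps),
# optionally split across light bounces, as described above.
def rays_per_pixel(gigarays_per_sec, width, height, fps, bounces=1):
    rays_per_frame = gigarays_per_sec * 1e9 / fps
    return rays_per_frame / (width * height) / bounces

print(rays_per_pixel(10, 3840, 2160, 60))             # ~20 rays/pixel at 4K 60 fps
print(rays_per_pixel(10, 2560, 1440, 60))             # ~45 rays/pixel at 1440p 60 fps
print(rays_per_pixel(10, 3840, 2160, 60, bounces=2))  # ~10 rays/pixel with two bounces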

It will be interesting to see what AMD will have for RT, and how well it works (or not) with current RTX games. The rabbit hole will just get deeper.
 
If Vulkan gets big enough they'll have to pay for multi-platform games to be ported to DX...



Two, maybe three games at most in a year? I wouldn't be crowing about it, but it's a start nonetheless. What makes you think they only got DXR when it was public? That would be a little strange, don't you think, for such an important standard? They do have very good relationships, which probably wouldn't be possible without cash and free resources.
Either way, glad to see them giving it a go. I just think the hardware still has a while to go to do it at the level needed, especially outside the 1% or so that has a 2080+ level card. So naturally devs are only going to go so far if it's subsidised. As close to full-screen RT as possible is the goal, but the perf hit with just partial RT is already high currently.


This.
Current RT implementations remind me of Avatar in 3D, with the over-the-top 3D effects in many places.

Nvidia literally paid the devs, in the form of monetary compensation, for anything ray-tracing-related they were able to implement in their games. If you followed the ray tracing fiasco from the start, you would notice the trend: developers weren't so keen on giving ray tracing priority over virtually anything else on their list of 'bare minimum we can release this game with'. While there was clearly an incentive to work ray tracing into essentially finished game code, if anything it highlighted how much crunch is still going on in every studio, and why games are always coming out half-baked. Working ray tracing into a game three months from release, when the game is barely going to hold together on launch, is probably asking a little too much of the studios. However, after a few stability patches, ray tracing was probably akin to a DLC-type project that rewarded post-launch studio hours and made sense to utilize where possible. Once future games are built from the ground up with ray tracing integrated as a starting point for the entire project, especially if AMD is able to deliver anything significant in the next-gen consoles, we will start to see some truly remarkable implementations.

Yeah, big names, but slow to roll out considering there is support from Nvidia. I would have expected a few more, but they are now trickling out. As you said, the development schedule is tight and it's not so easy to just drop everything and add RT.

It's hard enough to finish a game with business-first release schedules. Imagine telling your staff that they have to learn a new programming model, and implement it, in finished code, on top of the 60-80 hour crunch weeks they're pulling in the lead-up to launch. It's beyond ludicrous... it's straight plaid.
 
Nvidia literally paid the devs, in the form of monetary compensation, for anything ray-tracing-related they were able to implement in their games.

I would like to see some proof of that. Ray Tracing is a win-win for the developers and the gamer. Games are easier to develop and look much better.
 
I would like to see some proof of that. Ray Tracing is a win-win for the developers and the gamer. Games are easier to develop and look much better.

Really... easier to develop... a feature that less than 5% can actually turn on with the fps to enjoy it... sounds like an easy choice... and the best place for any developer to put their time and resources. It's a win-win! /S

And look much better...? Sometimes you have to look at screenshots with a magnifying glass to see the difference... but at least it tanks the performance utterly.

So, yeah - it makes sense that Nvidia paid upfront for this. Connect the dots.

And I speak as a 2080 owner.
 
Really... easier to develop... a feature that less than 5% can actually turn on with the fps to enjoy it... sounds like an easy choice... and the best place for any developer to put their time and resources. It's a win-win! /S

And look much better...? Sometimes you have to look at screenshots with a magnifying glass to see the difference... but at least it tanks the performance utterly.

So, yeah - it makes sense that Nvidia paid upfront for this. Connect the dots.

And I speak as a 2080 owner.

You don't need a magnifying glass to see the difference between reflections and no reflections. Performance issues are getting a lot better with the latest DLSS. Lots of info out there.
 
Really... easier to develop... a feature that less than 5% can actually turn on with the fps to enjoy it... sounds like an easy choice... and the best place for any developer to put their time and resources. It's a win-win! /S

And look much better...? Sometimes you have to look at screenshots with a magnifying glass to see the difference... but at least it tanks the performance utterly.

So, yeah - it makes sense that Nvidia paid upfront for this. Connect the dots.

And I speak as a 2080 owner.

Ok, if you are so sure, you show me the proof?

What we have at the moment is just the beginning. It has to start somewhere. Things will accelerate quickly when AMD GPUs and the consoles get Ray Tracing later this year.

Even you should have been able to see the improvement in Ray Tracing since it started. Look at Metro Exodus and Control compared to the mess that was Battlefield V.
 
You don't need a magnifying glass to see the difference between reflections and no reflections. Performance issues are getting a lot better with the latest DLSS. Lots of info out there.
I don't think there is a lot of info out there showing positive results, but you're right: DLSS, after a rather long period of time, is proving to be viable and usable, and looks to be a game changer if implemented right. Wolfenstein: Youngblood was/is an eye opener for DLSS, and that takes AI/Tensor cores to do right - something I am not sure AMD will have at their next launch unless they have a checkerboard-type solution to counteract it. Nvidia made some rather big gambles, I think, and overcompensated with less-than-truthful marketing, which did not help them, but the hardware can deliver some unique, beneficial stuff while being limited by the games themselves. The biggest advantage I see Nvidia has over AMD (just my view) is AI: their experience implementing different AI in cars/robots/etc. and making dedicated hardware/software. For me it may come down more to which high-end card better supports the next-gen monitors and HDTVs (a.k.a. HDMI 2.1 and DP 2.0) than to RT ability alone. I am a graphics whore and do like pretty stuff lighting up my face, but that is me, abnormal as it may be and not the norm overall, and that comes more from the display. Will be an interesting year, and no one has to settle for one or the other either - get both AMD's and Nvidia's best if you can afford it :D.
 
I don't think there is a lot of info out there showing positive results, but you're right: DLSS, after a rather long period of time, is proving to be viable and usable, and looks to be a game changer if implemented right. Wolfenstein: Youngblood was/is an eye opener for DLSS, and that takes AI/Tensor cores to do right - something I am not sure AMD will have at their next launch unless they have a checkerboard-type solution to counteract it. Nvidia made some rather big gambles, I think, and overcompensated with less-than-truthful marketing, which did not help them, but the hardware can deliver some unique, beneficial stuff while being limited by the games themselves. The biggest advantage I see Nvidia has over AMD (just my view) is AI: their experience implementing different AI in cars/robots/etc. and making dedicated hardware/software. For me it may come down more to which high-end card better supports the next-gen monitors and HDTVs (a.k.a. HDMI 2.1 and DP 2.0) than to RT ability alone. I am a graphics whore and do like pretty stuff lighting up my face, but that is me, abnormal as it may be and not the norm overall, and that comes more from the display. Will be an interesting year, and no one has to settle for one or the other either - get both AMD's and Nvidia's best if you can afford it :D.

Are we sure it's using Tensor cores?

The turning point for DLSS seemed to be Control, which didn't use Tensor cores, and was much better for it.

Wolf: YB seems to behave a lot more like Control's DLSS than the previous versions.
 
Are we sure it's using Tensor cores?

The turning point for DLSS seemed to be Control, which didn't use Tensor cores, and was much better for it.

Wolf: YB seems to behave a lot more like Control's DLSS than the previous versions.

Is there a link to Control not using Tensor cores? I thought they were required for DLSS?

"By enabling DLSS, which leverages Turing’s Tensor Core"

https://news.developer.nvidia.com/dlss-three-things-you-need-to-know/

Did I misunderstand something?
 
Control doesn’t use AI or tensors. It uses a fixed filtering algorithm that Nvidia is misleadingly branding as DLSS.

https://www.nvidia.com/en-us/geforce/news/dlss-control-and-beyond/

Thanks, read this: "Leveraging this AI research, we developed a new image processing algorithm that approximated our AI research model and fit within our performance budget. This image processing approach to DLSS is integrated into Control, and it delivers up to 75% faster frame rates."
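Out of curiosity about what a fixed "image processing algorithm" in that sense might look like, here is a purely illustrative sketch (Python/NumPy): a generic non-learned upscale-plus-sharpen filter, not Nvidia's actual, unpublished Control algorithm.

Code:
import numpy as np

# Illustrative only: a fixed (non-learned) upscale-and-sharpen pass, the general
# shape of an "image processing" upscaler. NOT Nvidia's actual Control filter.
def upscale_fixed(img, scale=2, sharpen=0.5):
    up = img.repeat(scale, axis=0).repeat(scale, axis=1)   # nearest-neighbour upscale
    blur = (np.roll(up, 1, 0) + np.roll(up, -1, 0) +
            np.roll(up, 1, 1) + np.roll(up, -1, 1)) / 4.0  # cheap neighbour average
    return np.clip(up + sharpen * (up - blur), 0.0, 1.0)   # unsharp-mask sharpen

frame = np.random.rand(540, 960)     # stand-in for a 540p render target
print(upscale_fixed(frame).shape)    # (1080, 1920): upscaled toward 1080p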
 
Are we sure it's using Tensor cores?

The turning point for DLSS seemed to be Control, which didn't use Tensor cores, and was much better for it.

Wolf: YB seems to behave a lot more like Control's DLSS than the previous versions.
I would take that as a yes. While Nvidia is not specific, the brief does specifically say Nvidia DLSS, and I see no other way they could get those results. If not, then any recent GPU could do it for the most part and TAA could be retired.

https://www.nvidia.com/en-us/geforce/news/wolfenstein-youngblood-ray-tracing-nvidia-dlss-highlights/
 
I would take that as a yes. While Nvidia is not specific, the brief does specifically say Nvidia DLSS, and I see no other way they could get those results. If not, then any recent GPU could do it for the most part and TAA could be retired.

https://www.nvidia.com/en-us/geforce/news/wolfenstein-youngblood-ray-tracing-nvidia-dlss-highlights/

From your link:

"As an AI algorithm, DLSS is constantly learning and improving. Wolfenstein: Youngblood demonstrates this continuous improvement, with DLSS delivering big boosts in frame rates while providing similar image quality to native resolution with TAA."

So if it's not Tensor core AI, what kind of Devil Magic is it?

Either way, I don't care if it's powered by old socks; it's pretty awesome in WolfYB.
 
From your link:

"As an AI algorithm, DLSS is constantly learning and improving. Wolfenstein: Youngblood demonstrates this continuous improvement, with DLSS delivering big boosts in frame rates while providing similar image quality to native resolution with TAA."

So if it's not Tensor core AI, what kind of Devil Magic is it?

Either way, I don't care if it's powered by old socks; it's pretty awesome in WolfYB.

We don’t know that WolfYB isn’t using tensors. It very well could be.

Nvidia confused everyone with what they did for Control, and now we don't know for sure what they're doing when they say "DLSS".
 