Will the new Radeons on 7nm compete with Nvidia?

Everything you see in games is a hack to simulate ray tracing.
If you don't want ray tracing... you have to give up on new games... the devs want RT... Microsoft has provided the DX12 DXR API... time to sell your gaming PC.

Oh, I want ray tracing, but neither the software nor the hardware is ready. I learned the hard way never to buy something for its future potential, only for what it does now, because far too often that promise is only partially delivered, or not delivered at all. Right now we have some reflections you have to pause the game to look at to see if they're ray traced, and that's about it. That's not even what I'd consider partially delivered.
 
Everything you see in games is a hack to simulate ray tracing.
If you don't want ray tracing... you have to give up on new games... the devs want RT... Microsoft has provided the DX12 DXR API... time to sell your gaming PC.
Yeah, how does that work again? The only title that has ray tracing had such severe performance problems that it required a patch (documented as reducing the load on the hardware), not only in the game but in the drivers as well.
And when game developers find out how limited they are in ray tracing, you should hope that Nvidia has enough cash to make miracles happen...
 
Yeah, how does that work again? The only title that has ray tracing had such severe performance problems that it required a patch (documented as reducing the load on the hardware), not only in the game but in the drivers as well.
And when game developers find out how limited they are in ray tracing, you should hope that Nvidia has enough cash to make miracles happen...
Consider that BFV only has reflective surfaces being ray traced, and for performance reasons had to add Screen Space Reflections (a traditional technique for believable reflections) on top, with none of the lighting, shadows, ambient occlusion (light bounce), etc. Also, DICE is not using Nvidia's noise reduction for this (performance reasons?) or DLSS, at least not yet. That does not reflect well on the hardware's capability. Nvidia should really have helped make this game multi-GPU capable for DXR; at the price they are asking, that would have paid off well, I think.

Now, a lot of ray tracing is already used in making games: it is baked into the textures, and lightmaps are used to light a scene. Games have been ray traced for a while now, just in a static way. Turning DXR on and off in BFV does not affect the rather awesome job DICE did with the lighting of this game, much of which was baked in using a lot of processing power and ray tracing techniques. That level of quality is not doable in real time today, so something like reflections, which can be dynamic and real time, is a good start for real-time ray tracing.
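To make the "baked ray tracing" point concrete, here's a toy offline bake, assuming a made-up one-occluder scene (all names and geometry are hypothetical): shadow rays are traced once, ahead of time, and the results are stored in a lightmap that the game just samples at runtime.

```python
def ray_hits_sphere(orig, d, center, radius):
    # Solve |orig + t*d - center|^2 = r^2 for t > 0 (d assumed normalized).
    oc = [orig[i] - center[i] for i in range(3)]
    b = 2 * sum(oc[i] * d[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return False
    t = (-b - disc ** 0.5) / 2
    return t > 1e-6

def bake_shadow_map(texels, light, occluder_center, occluder_radius):
    """Offline pass: store 1.0 (lit) or 0.2 (shadowed) per lightmap texel."""
    lightmap = []
    for p in texels:
        to_light = [light[i] - p[i] for i in range(3)]
        length = sum(x * x for x in to_light) ** 0.5
        d = [x / length for x in to_light]
        blocked = ray_hits_sphere(p, d, occluder_center, occluder_radius)
        lightmap.append(0.2 if blocked else 1.0)
    return lightmap

# Texel under the sphere is shadowed; the one off to the side is lit.
lightmap = bake_shadow_map([[0, 0, 0], [3, 0, 0]], [0, 5, 0], [0, 2, 0], 0.5)
```

Real bakers trace many bounces per texel and denoise the result; the point is only that the expensive rays happen offline, not per frame.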
 
Wake me up when RT is actually viable at something higher than e-sports resolutions.
 
Consider that BFV only has reflective surfaces being ray traced, and for performance reasons had to add Screen Space Reflections (a traditional technique for believable reflections) on top, with none of the lighting, shadows, ambient occlusion (light bounce), etc. Also, DICE is not using Nvidia's noise reduction for this (performance reasons?) or DLSS, at least not yet. That does not reflect well on the hardware's capability. Nvidia should really have helped make this game multi-GPU capable for DXR; at the price they are asking, that would have paid off well, I think.

Now, a lot of ray tracing is already used in making games: it is baked into the textures, and lightmaps are used to light a scene. Games have been ray traced for a while now, just in a static way. Turning DXR on and off in BFV does not affect the rather awesome job DICE did with the lighting of this game, much of which was baked in using a lot of processing power and ray tracing techniques. That level of quality is not doable in real time today, so something like reflections, which can be dynamic and real time, is a good start for real-time ray tracing.

It is funny you mention this; someone figured out a way to get it going on a Titan V:
https://www.guru3d.com/news-story/n...er-perf-(but-does-not-have-any-rt-cores).html

Not too sure if it's valid, but it sounds quite awesome, if you have a Titan V :)
 
I have yet to see a breakdown of what math, exactly, the "RT" cores perform.

Part of me thinks they just found a simple way to leverage the existing tensor core design to perform the math needed. If the low-level design is flexible enough, that would imply adapting existing tensor core versions is not hard. There is plenty of precedent for this kind of evolutionary change to chip architecture, despite what marketing and hype love to claim is revolutionary.

If the performance of "RT" code is roughly the same with similar Volta dies vs Turing dies, is that proof enough?
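For what it's worth, the work Nvidia has publicly said the RT cores accelerate is BVH traversal and ray-triangle intersection testing. A CPU sketch of the standard Moller-Trumbore intersection test gives a feel for the per-ray math being offloaded (pure illustration, not Nvidia's actual implementation):

```python
def ray_triangle_intersect(orig, d, v0, v1, v2, eps=1e-8):
    """Moller-Trumbore: return distance t along the ray to the triangle, or None."""
    def sub(a, b): return [a[i] - b[i] for i in range(3)]
    def cross(a, b): return [a[1]*b[2] - a[2]*b[1],
                             a[2]*b[0] - a[0]*b[2],
                             a[0]*b[1] - a[1]*b[0]]
    def dot(a, b): return sum(a[i] * b[i] for i in range(3))

    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(d, e2)
    a = dot(e1, h)
    if abs(a) < eps:                 # ray parallel to triangle plane
        return None
    f = 1.0 / a
    s = sub(orig, v0)
    u = f * dot(s, h)                # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(d, q)                # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)               # hit distance along the ray
    return t if t > eps else None

# A ray fired down the z-axis hits this triangle one unit away.
hit = ray_triangle_intersect([0, 0, -1], [0, 0, 1],
                             [-1, -1, 0], [1, -1, 0], [0, 1, 0])
```

A hardware RT core does tests like this (plus the BVH walk that decides which triangles to test at all) in fixed-function logic, which is why the question of whether tensor cores could do the same work is an interesting one.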
 
I have yet to see a breakdown of what math, exactly, the "RT" cores perform.

Part of me thinks they just found a simple way to leverage the existing tensor core design to perform the math needed. If the low-level design is flexible enough, that would imply adapting existing tensor core versions is not hard. There is plenty of precedent for this kind of evolutionary change to chip architecture, despite what marketing and hype love to claim is revolutionary.

If the performance of "RT" code is roughly the same with similar Volta dies vs Turing dies, is that proof enough?

I wouldn't call it the same, or even close to the same, performance. In areas with a lot of reflections the Titan RTX is 25% faster despite similar shader power and tensor cores, so those ray tracing cores are obviously doing something. Is their importance overblown? Probably. This does give some hope that ray tracing on AMD could be just a driver update away, especially if they can leverage their massive compute advantage to do it.

Interesting times ahead, but current rt implementation is not worth investing in.
 
RT for PC gaming is pretty much DOA without console support, at which point you might as well buy a PS5/XBOX2. I suspect that if RT is supported in the next-gen consoles, it will likely be implemented using compute as opposed to Nvidia's RT cores. In other words, the RT core is a waste of die space and money.
 
Seems nobody followed what happened recently and what RTX does.
First: Nvidia RTX cards are compatible with DXR, but so is Vega, so is Volta, and Navi will be too.
What RTX does that the others don't: it's all in the API, not the hardware. There was a leak of a hack on an AMD Vega 64 card that does render the games made for RTX cards, but only partially. DXR renders only what the DX scene shows. The problem is that games show and render only the 3D objects that matter for the view, so ray tracing the DXR way won't render the background and other objects unneeded for the scene. Solution 1: take all the objects into account for the direct view and the indirect ray trace, but that would be far too heavy. Solution 2: Nvidia's proprietary RTX API. It takes all the objects outside the view into account, ray traces them, then combines that with the DXR results. No way you can do that without Nvidia RTX and a supporting Nvidia card.

Since DXR is incapable of doing that and needs plenty of improvements, AMD is in a messy situation, including in the console world. The further we go, the worse it will become, since AMD will have to copy Nvidia's behaviour, and apart from copyright and licence infringement, having to develop the API and pull the gaming companies into its in-house, late development for its 10% share of the gaming market, and only on lower-end cards, is going to be impossible. AMD is screwed, and AMD GPU owners are screwed on this. If you want ray tracing, which is going to be mandatory in new games, it's going to be an Nvidia RTX card, whatever high-end super-GPU card AMD makes.
 
Seems nobody followed what happened recently and what RTX does.
First: Nvidia RTX cards are compatible with DXR, but so is Vega, so is Volta, and Navi will be too.
[...] If you want ray tracing, which is going to be mandatory in new games, it's going to be an Nvidia RTX card, whatever high-end super-GPU card AMD makes.
Mandatory? I think it will be a switch to turn on and off.
 
Seems nobody followed what happened recently and what RTX does.
First: Nvidia RTX cards are compatible with DXR, but so is Vega, so is Volta, and Navi will be too.
[...] If you want ray tracing, which is going to be mandatory in new games, it's going to be an Nvidia RTX card, whatever high-end super-GPU card AMD makes.

Mandatory? That will never happen... ever. Well, unless Nvidia starts developing games?!

This is really simple: no RT in consoles means no RT in the mainstream, so RT only shows up when NV pays the big bucks, and it performs like crap. Mandatory... great, thanks for putting a smile on my lips :)
 
Seems nobody followed what happened recently and what RTX does.
First: Nvidia RTX cards are compatible with DXR, but so is Vega, so is Volta, and Navi will be too.
[...] If you want ray tracing, which is going to be mandatory in new games, it's going to be an Nvidia RTX card, whatever high-end super-GPU card AMD makes.

This is so incredibly wrong, and shows such a degree of ignorance about how DXR works, I don't even know where to begin. So I won't, I have better things to do with my life. Know that you have no idea what you're talking about. And RT will never be mandatory until everyone ditches rasterizing, which likely will never happen as a combination of raster + RT brings the best of both worlds.

Crawl back to wherever you came from, troll.
 
Seems nobody followed what happened recently and what RTX does.
First: Nvidia RTX cards are compatible with DXR, but so is Vega, so is Volta, and Navi will be too.
[...] If you want ray tracing, which is going to be mandatory in new games, it's going to be an Nvidia RTX card, whatever high-end super-GPU card AMD makes.

The hack let people turn on RT in BFV on AMD cards, but it didn't actually do anything; it still showed the same screen-space reflections that are there without RT on. It tells you how shit ray tracing is that people can barely tell whether it's working; it's almost placebo levels. Ooh, the option is ticked and my FPS dropped 40%, so it must be doing something.

The emperor has no clothes.
 
Even according to your link, AdoredTV only expects somewhere around GTX 1080 to RTX 2070.

With the 1080 we are talking about a 2560-CUDA-core chip from 2016 that Navi, in 2019/2020, is just looking to match in performance and performance per watt... on a manufacturing node half the size of 16nm.

Even these idealistic leaks admit that, at best, Navi will catch up with Maxwell from 2015 in terms of architecture.
This is why Navi will likely be a good buy for those in the market for GTX 1080 performance on a smaller AMD node at a lower price point.
If you need greater-than-GTX-1080 performance, you might as well buy a used 1080 Ti or Turing.
 
The hack let people turn on RT in BFV on AMD cards, but it didn't actually do anything; it still showed the same screen-space reflections that are there without RT on. It tells you how shit ray tracing is that people can barely tell whether it's working; it's almost placebo levels. Ooh, the option is ticked and my FPS dropped 40%, so it must be doing something.

The emperor has no clothes.
Yes, it did show some things, just not everything. DXR shows partial rendering on Vega, and the RTX API brings the missing part on RTX cards only. So Nvidia uses DXR plus RTX to add a kind of global illumination to games. DXR-compatible cards will only render the DXR part, which applies only to the objects taken into account for the current view (no background or out-of-sight objects). Cards with AMD Polaris or Nvidia Pascal GPUs or earlier won't show anything.
Oh, and about the mandatory thing earlier: I meant mandatory if you want the full-quality game. The game will still be playable with a non-RTX card; you will just lack reflections and shadows. Why would devs still spend time building effects that the RTX API can do for them? They will put a sticker saying DX11/12 compatible, but "best experience with RTX card".
 
Yes, it did show some things, just not everything. DXR shows partial rendering on Vega, and the RTX API brings the missing part on RTX cards only. So Nvidia uses DXR plus RTX to add a kind of global illumination to games. DXR-compatible cards will only render the DXR part, which applies only to the objects taken into account for the current view (no background or out-of-sight objects). Cards with AMD Polaris or Nvidia Pascal GPUs or earlier won't show anything.
Oh, and about the mandatory thing earlier: I meant mandatory if you want the full-quality game. The game will still be playable with a non-RTX card; you will just lack reflections and shadows. Why would devs still spend time building effects that the RTX API can do for them? They will put a sticker saying DX11/12 compatible, but "best experience with RTX card".

Devs already made both shadows and reflections because, hmm, they want to sell their work. And, erm, well, imagine trying to cater only to the RTX crowd. What if you make a game that the 0.001% doesn't like?

So again: this is a pipe dream. It might be great someday, but this is not that day; this emperor is naked.
 
Yes, it did show some things, just not everything. DXR shows partial rendering on Vega, and the RTX API brings the missing part on RTX cards only. So Nvidia uses DXR plus RTX to add a kind of global illumination to games. DXR-compatible cards will only render the DXR part, which applies only to the objects taken into account for the current view (no background or out-of-sight objects). Cards with AMD Polaris or Nvidia Pascal GPUs or earlier won't show anything.
Oh, and about the mandatory thing earlier: I meant mandatory if you want the full-quality game. The game will still be playable with a non-RTX card; you will just lack reflections and shadows. Why would devs still spend time building effects that the RTX API can do for them? They will put a sticker saying DX11/12 compatible, but "best experience with RTX card".

It shows the same screen space reflections with DXR on or off on AMD. It doesn't add anything on those cards. The partial reflections are there for all cards.
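For anyone unclear on the distinction being made here, a toy sketch, with everything reduced to a lookup and every name invented for illustration: screen-space reflections can only re-use pixels that are already on screen, while a real ray trace queries the whole scene, including objects outside the view.

```python
def ssr_lookup(reflected_x, screen_buffer):
    """SSR: reuse the already-rendered frame; off-screen means no data."""
    return screen_buffer.get(reflected_x)  # None if not visible this frame

def world_trace(reflected_x, scene):
    """'Real' reflection: query the full scene, visible on screen or not."""
    return scene.get(reflected_x)

screen_buffer = {0: "wall", 1: "crate"}            # what the camera sees
scene = {0: "wall", 1: "crate", -3: "explosion"}   # everything in the world

assert ssr_lookup(1, screen_buffer) == "crate"     # on-screen: SSR works
assert ssr_lookup(-3, screen_buffer) is None       # off-screen: SSR fails
assert world_trace(-3, scene) == "explosion"       # ray trace still finds it
```

This is why reflections of things behind the camera (the BFV selling point) are exactly the case SSR cannot handle, and why identical-looking reflections with DXR "on" suggest the hack wasn't actually tracing anything.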
 
Well boys, it's a soft launch of "Radeon Vega II" 7nm... availability "sometime in Q2 2019".
 
Yes, it did show some things, just not everything. DXR shows partial rendering on Vega, and the RTX API brings the missing part on RTX cards only. So Nvidia uses DXR plus RTX to add a kind of global illumination to games. DXR-compatible cards will only render the DXR part, which applies only to the objects taken into account for the current view (no background or out-of-sight objects). Cards with AMD Polaris or Nvidia Pascal GPUs or earlier won't show anything.
Oh, and about the mandatory thing earlier: I meant mandatory if you want the full-quality game. The game will still be playable with a non-RTX card; you will just lack reflections and shadows. Why would devs still spend time building effects that the RTX API can do for them? They will put a sticker saying DX11/12 compatible, but "best experience with RTX card".
RTX is not an API. It is middleware, just like GameWorks. Let me know what you're smoking, because I want to stay far away from it.
 
So what are everyone's post-CES announcement thoughts? $699 for the R7, due early February. Seems comparable to the 2080 Ti based on AMD's own benchmarks. Should see reviews soon.
 
So what are everyone's post-CES announcement thoughts? $699 for the R7, due early February. Seems comparable to the 2080 Ti based on AMD's own benchmarks. Should see reviews soon.
Meh is my response; it seems to compete with the 1080 Ti/2080.
 
So what are everyone's post-CES announcement thoughts? $699 for the R7, due early February. Seems comparable to the 2080 Ti based on AMD's own benchmarks. Should see reviews soon.

No, it competes with the RTX 2080 (sans ray-tracing) and the GTX 1080 Ti. And at $699 that's not very competitive unfortunately.
 
No, it competes with the RTX 2080 (sans ray-tracing) and the GTX 1080 Ti. And at $699 that's not very competitive unfortunately.
What do you mean, not competitive? It will perform the same as or better than the RTX 2080 (sans ray tracing), and ray tracing is only used in one game right now anyway. It is priced the same. That's competitive to me.

Edit to add: AND it has 16GB of HBM2 memory, which will be very nice for applications that can use it, while the 2080 only has 8GB of GDDR6.
 
What do you mean, not competitive? It will perform the same as or better than the RTX 2080 (sans ray tracing), and ray tracing is only used in one game right now anyway. It is priced the same. That's competitive to me.

Edit to add: AND it has 16GB of HBM2 memory, which will be very nice for applications that can use it, while the 2080 only has 8GB of GDDR6.

It has the same performance as a GTX 1080 Ti from 2 years ago...and at the same price as the GTX 1080 Ti launched 2 years ago.

The RTX 2080 has the same price and performance, but also offers Ray Tracing and DLSS. The advantages from 16GB of HBM will very likely be much less noticeable than those from Ray Tracing and DLSS over the next few years.

So the Radeon VII has the performance of a 2 year old Nvidia card, without a feature set to compete with Nvidia's new cards, while still retaining the same price. That's what I mean by not competitive.

AMD should've stuck with GDDR5X or GDDR6 to reduce cost and launched this card at $500-550. Performance would've likely been almost identical yet much better priced.
 
It has the same performance as a GTX 1080 Ti from 2 years ago...and at the same price as the GTX 1080 Ti launched 2 years ago.

The RTX 2080 has the same price and performance, but also offers Ray Tracing and DLSS. The advantages from 16GB of HBM will very likely be much less noticeable than those from Ray Tracing and DLSS over the next few years.

So the Radeon VII has the performance of a 2 year old Nvidia card, without a feature set to compete with Nvidia's new cards, while still retaining the same price. That's what I mean by not competitive.

AMD should've stuck with GDDR5X or GDDR6 to reduce cost and launched this card at $500-550. Performance would've likely been almost identical yet much better priced.
It's perspective. DLSS is currently useless, and RTX ray tracing is only usable in one game, and only if you're doing a slow single-player walkthrough. Nothing Nvidia released lately is better than their own two-year-old tech: 2080 Ti = Titan V, 2080 = 1080 Ti, 2070 = 1080. Right now there are applications that can very well take advantage of the 16 GB of RAM, and there are even games that can take advantage of more than 8 GB. There is only ONE game that can take advantage of RTX, and NONE that can take advantage of DLSS. To claim that same-or-better performance at the same price, minus an unused feature, compared to Nvidia's second-best card is non-competitive....

'Competitive' - you keep using that word. I don't think that word means what you think it means.
 
2+ years later, same performance as a 1080 Ti, for the same price... 2 years later. Are you f-ing kidding me, AMD?

I mean, c'mon. This is some sort of bad dream. In what fairy-tale la-la land do people think this is even competitive?
 
2+ years later, same performance as a 1080 Ti, for the same price... 2 years later. Are you f-ing kidding me, AMD?

I mean, c'mon. This is some sort of bad dream. In what fairy-tale la-la land do people think this is even competitive?

Just old architecture on a new node though, right? Still waiting for something new.
 
I am honestly looking to see how they will perform compared to a highly OCed Titan X Pascal.
I would love to give AMD some cash this generation. Still, I might very well end up skipping the offerings from either team until the next next generation.
 
You know the PC gaming community has gone to shit when people are arguing against an openly available graphical advancement, not an exclusive one...
It might never be used in consoles, but it's like tessellation: not used that much, and when it is used on console, it's very little. That's been available for what, 10 years now? I got 15 fps back in the day when enabling it...

Now the whole community rages if the game doesn't do 100+ fps.

Is RTX on Nvidia perfect? Of course not. But RT reflections are already better than this SSR crap we have now...
 
Vega VII compute performance +62% -> that is a lot of improvement for an already formidable compute card.

I will probably sit this whole generation out but you never know. Looking forward to some future [H]ardOCP reviews.
 
The new Vega needs to perform better than a 1080 Ti at 4K in order for me to buy one.
 
It shows the same screen space reflections with DXR on or off on AMD. It doesn't add anything on those cards. The partial reflections are there for all cards.
Ok, I understand now. You're right.
I didn't understand why they needed ray tracing for reflections, since they could do that without ray tracing. Now I understand they don't. In fact it's done the old way and they tell people it's RTX. There are only some scarce lighting effects they can do with ray tracing. In render engines, rendering photon-level effects takes a lot more time to compute than the scene without them.
So, as I imagined at first, the ray tracing feature is complete BS; it cannot really be rendered in real time in games.
So only the non-RTX speed of the card needs to be taken into account.
Then the Titan V may be on par with the GTX 1080 Ti, and Vega II may be close to it.
I'm not sure Vega II isn't superior to the RTX 2080, since Vega II has 16GB of very fast VRAM, and 8GB has been shown to be about the minimum one should own now to play new games at top graphics settings.
If ray tracing doesn't count, Vega II is the best value among all the expensive graphics cards. And I am quite sure AMD can make a driver that performs well on DXR for Vega II. Vega II has all the features of Volta; even the Vega 10 GPU has most of what's needed (FP16 and Int8).
 
Be careful about your needs. The AMD Radeon VII is not the same GPU as in the Instinct: 60 compute units instead of 64, and FP64 has been deactivated, as have most of its professionally dedicated features.
It is really a kind of Vega 60 built on 7nm with 16GB of HBM2. Take it as an overclocked Vega Frontier Edition, which also had 16GB of HBM2, and nothing more.
 
No. It's an overclocked and gimped Instinct.

It will still beat the Vega 64 in compute by a large margin.
 
Be careful about your needs. The AMD Radeon VII is not the same GPU as in the Instinct: 60 compute units instead of 64, and FP64 has been deactivated, as have most of its professionally dedicated features.
It is really a kind of Vega 60 built on 7nm with 16GB of HBM2. Take it as an overclocked Vega Frontier Edition, which also had 16GB of HBM2, and nothing more.

It's very likely just disabled in the BIOS. I don't believe they went out of their way to disable it in hardware.
 