AMD RDNA 2 gets ray tracing

Static screenshots from a random forum user in 2020 vs DF's detailed video analysis...not even a choice for me...DF wins by a mile 🤷‍♂️

TL;DR You add nothing of value.
Actually, while DF and other YouTube creators make amazingly in-depth videos, there is never going to be a situation where a hands-on perspective from a user is of no value. Stop being so pretentious about what other people value. Following your line of thinking, we might as well not post at all and only watch videos on YouTube.
 
Actually, while DF and other YouTube creators make amazingly in-depth videos, there is never going to be a situation where a hands-on perspective from a user is of no value. Stop being so pretentious about what other people value. Following your line of thinking, we might as well not post at all and only watch videos on YouTube.

His pictures show what proper reviews (like DF's) show: that FFX has reduced image quality...but pictures are subpar to videos, as the days of "static picture" games are long gone.
Add to that that DF does objective analysis, something I question with the posted pictures.
 
His pictures show what proper reviews (like DF's) show: that FFX has reduced image quality...but pictures are subpar to videos, as the days of "static picture" games are long gone.
Add to that that DF does objective analysis, something I question with the posted pictures.
I see what he was able to obtain in Rage 2 with FFX; does the DF video cover how it works in this scenario? If it doesn't, then a picture, subpar or not, will always beat a video that doesn't exist. They can only cover so many game and hardware combinations, being limited in time and resources. That's where posts like his come in; while you may not care, someone may be interested in that use case.
 
Yep. Here's a test that shows the difference going from 4 cores to 12. The biggest jump in performance is going from 4 to 6 to 8, with 8 basically being the sweet spot. After that you kind of hit a wall.
How Many Cores Do You Really Need For Gaming?

Sweet spot is not the point where you hit a wall. Sweet spot is where you get maximum incremental enjoyment for minimum incremental cost.

Clearly that is not 8 cores, since the incremental benefit from 6 to 8 cores is practically zero.
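To put that in concrete terms, here's a quick sketch of the per-dollar math I mean. The prices and FPS numbers below are completely made up, just to illustrate incremental benefit vs incremental cost:

```python
# Hypothetical prices and average FPS, purely for illustration -- not from any benchmark.
configs = [
    ("4-core",  150,  70),   # (name, price in $, avg FPS)
    ("6-core",  200,  95),
    ("8-core",  300, 100),
    ("12-core", 450, 102),
]

prev = None
for name, price, fps in configs:
    if prev is not None:
        prev_name, prev_price, prev_fps = prev
        extra_fps = fps - prev_fps
        extra_cost = price - prev_price
        print(f"{prev_name} -> {name}: +{extra_fps} FPS for +${extra_cost} "
              f"({extra_fps / extra_cost:.2f} FPS per extra dollar)")
    prev = (name, price, fps)
```

With numbers like those, the 4-to-6 step is where the FPS per extra dollar is highest, which is all I mean by sweet spot.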
 
Sweet spot is not the point where you hit a wall. Sweet spot is where you get maximum incremental enjoyment for minimum incremental cost.

Clearly that is not 8 cores, since the incremental benefit from 6 to 8 cores is practically zero.
Um, no. If you want to lie, have at it. I guess your ego needs a pick-me-up. But that's not what the video showed.
 
Lol, words mean things. What does ego have to do with anything?
Well, there's a 10 FPS difference between 6 and 8 cores; if that's nothing then OK, but that number is greater than the difference between most CPU model SKUs.

So apparently there's some ego thing afoot, or you're just another angry person who feels wronged somehow. Either way, it really makes no difference to me.

I posted a link showing gains up to 8 cores. If you want to convince people that they should stick to 6 cores, then OK. It doesn't really affect me one way or another.
 
Sweet spot is not the point where you hit a wall. Sweet spot is where you get maximum incremental enjoyment for minimum incremental cost.

Clearly that is not 8 cores, since the incremental benefit from 6 to 8 cores is practically zero.
Did you watch the video? I know it's not the same for all games and this is just a small sample, but the first 3 or so games show a 30-40% difference in 1% lows... That's a huge difference and I would call that the sweet spot. Some of them showed barely any difference between 6 and 8, so depending on the chosen games, either of you could be right. Given that enough games are starting to show such a big difference in 1% lows, and given the cost difference between 6 and 8 cores, the 8-core is moving into sweet-spot territory, and it's only going to continue to be relevant.
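For anyone wondering where the "1% lows" figure comes from: it's derived from the slowest 1% of frames in a capture. Here's a minimal sketch of one common way to compute it from raw frame times (exact methods vary between capture tools; some report the 1st-percentile frame time instead of an average):

```python
def one_percent_low_fps(frame_times_ms):
    """Average FPS over the slowest 1% of frames (one common definition of '1% lows')."""
    slowest_first = sorted(frame_times_ms, reverse=True)        # longest frame times first
    worst = slowest_first[:max(1, len(slowest_first) // 100)]   # keep the worst 1%
    avg_ms = sum(worst) / len(worst)
    return 1000.0 / avg_ms                                      # ms per frame -> FPS

# Example: mostly 8 ms frames (~125 FPS) with a handful of 25 ms stutters
frames = [8.0] * 990 + [25.0] * 10
print(round(one_percent_low_fps(frames)))   # ~40, even though the average FPS is far higher
```

That's why 1% lows expose stutter that average FPS hides.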
 
I watched the video. There is a huge difference, especially at 1%. Think I might need to upgrade...
 
It's always time for an upgrade; the more you buy, the more you save.
I want to see what AMD does with RDNA2. If it really beats Nvidia, I'll consider going all AMD on my main rig (something I've never done).
 
Did you watch the video?

Yep.

I know it's not the same for all games and this is just a small sample, but the first 3 or so games show a 30-40% difference in 1% lows... That's a huge difference and I would call that the sweet spot.

For the games showing a 30-40% difference, would the average person enjoy the game any less on the 6-core? The only thing that matters is whether you would enjoy the game any more if you spent more money. When the answer is "no", that's your sweet spot. Unless of course you guys aren't talking about actually playing games.
 
I want to see what AMD does with RDNA2. If it really beats Nvidia, I'll consider going all AMD on my main rig (something I've never done).
I'm interested also; truthfully, running this cheapo RX 580 I had as a backup is the first time in a while that I haven't had Nvidia in my main system. I'm going to wait until all the cards are on the table, then purchase whichever is the better deal for me with improvements over the 2080 Ti.
 
Yep. Here's a test that shows the difference going from 4 cores to 12. The biggest jump in performance is going from 4 to 6 to 8, with 8 basically being the sweet spot. After that you kind of hit a wall.
How Many Cores Do You Really Need For Gaming?
Good comparison: CPU speed was locked the same for each test and kept constant. This was all on the same motherboard, using a 3900X:
System: Windows 10 Pro
AMD Ryzen 9 3900x 3.8GHz (SMT - OFF)
GIGABYTE X570 AORUS PRO
GeForce RTX 2080 OC 8GB
16GB RAM 3600MHz

Now, the downside was that all the tests were done with SMT off; having it on should have upped the performance of the 4-core, 6-core, and 8-core configs. So for me this video is inconclusive, other than for the case with SMT turned off.
 
Yep.



For the games showing a 30-40% difference, would the average person enjoy the game any less on the 6-core? The only thing that matters is whether you would enjoy the game any more if you spent more money. When the answer is "no", that's your sweet spot. Unless of course you guys aren't talking about actually playing games.
Seriously? Under-60 fps 1% lows vs over-60 fps 1% lows... Yes, even an average person would enjoy it more if it means not dipping below the refresh rate of an average monitor. If you disagree that's fine, but if your argument is that nobody will notice a 40% difference? That seems a little inconsistent. As I said, it's not all games; some are threaded to different extents, with some being fine on fewer cores and others wanting more. Given the difference in SOME of these games (i.e., assuming those are the ones someone would play), 8-core is the sweet spot. If you play games that don't need more cores, then 4 or 6 could be the sweet spot. But going forward, I think we'll start seeing more games take advantage of more cores. That doesn't mean 4 or 6 cores can't play a game, just that the point of diminishing returns for your $ is shifting up, like it has been doing since 4 cores was the sweet spot, then 6... In some games it's already 8, though probably not the majority yet, as this was too small of a sample, but the point is we are starting to see some pretty decent separation.
 
Good comparison: CPU speed was locked the same for each test and kept constant. This was all on the same motherboard, using a 3900X:


Now, the downside was that all the tests were done with SMT off; having it on should have upped the performance of the 4-core, 6-core, and 8-core configs. So for me this video is inconclusive, other than for the case with SMT turned off.
That's why, by itself, it's just a single data point and it's hard to draw real conclusions. The argument was 8 vs 6 cores, not cores + HT, since past comparisons used 6-core Intel chips without HT. I do agree though: since it's almost impossible to find new chips without HT now (and it has been that way with AMD for a while), it would make more sense to compare with HT on, where I fully expect the 6/12 would/could have had a much better showing.
 
Seriously? Under-60 fps 1% lows vs over-60 fps 1% lows...

In the video I watched, every single game stayed over 60 fps in the 1% lows the entire time, except for BFV, which dipped under for literally 2 seconds. So you're telling me the average person would care about that 2-second dip? You have a strange definition of average.

I think we just disagree on what sweet spot means. It certainly doesn't mean "let me find some extreme edge case to show where an 8-core provides negligible benefit".
 
In the video I watched, every single game stayed over 60 fps in the 1% lows the entire time, except for BFV, which dipped under for literally 2 seconds. So you're telling me the average person would care about that 2-second dip? You have a strange definition of average.

I think we just disagree on what sweet spot means. It certainly doesn't mean "let me find some extreme edge case to show where an 8-core provides negligible benefit".
I still don't see how 40% is negligible. But sure, if you think a 40% difference is, then yes, we do have a different opinion on it. Anyways, I for one think if RDNA2 is 40% faster than a 2080 Ti it would be pretty good ;). Of course, pricing means a lot too. Anyways, I even admitted it's very game dependent and it can go either way depending on what you play. But it's only going to continue (slowly?) shifting upward.
 
That's why, by itself, it's just a single data point and it's hard to draw real conclusions. The argument was 8 vs 6 cores, not cores + HT, since past comparisons used 6-core Intel chips without HT. I do agree though: since it's almost impossible to find new chips without HT now (and it has been that way with AMD for a while), it would make more sense to compare with HT on, where I fully expect the 6/12 would/could have had a much better showing.
Also, how he disabled the cores to compare the different configurations becomes important. Was his 4-core (a 3900X was used in all tests, with cores disabled) spread across two CCXs? Two CCDs? How did he disable the cores? Was it just done with affinity in Windows? It gets really complicated, and the results can be really skewed depending on what was done.

I would say most well-multithreaded games do better with SMT on, or at least for the 4-core and 6-core configs, better with SMT on than off.
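For what it's worth, if he did it in software rather than the BIOS, one guess is that he just pinned the game to a subset of logical CPUs. A rough sketch of that approach using psutil (the process name is a placeholder, and which logical CPU indices land on which CCX/CCD is exactly the complication I'm talking about):

```python
import psutil  # third-party: pip install psutil

TARGET = "game.exe"             # placeholder -- whichever game/benchmark process you mean
ALLOWED_CPUS = list(range(8))   # e.g. restrict it to the first 8 logical CPUs

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        proc.cpu_affinity(ALLOWED_CPUS)            # pin this process to that subset only
        print(proc.pid, "->", proc.cpu_affinity())
```

Affinity masking like this isn't the same as disabling cores in the BIOS: the full cache and both CCDs stay powered and available to every other process, so the results can differ from a real lower-core-count part.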
 
In the video I watched, every single game stayed over 60 fps in the 1% lows the entire time, except for BFV, which dipped under for literally 2 seconds. So you're telling me the average person would care about that 2-second dip? You have a strange definition of average.
It's 2020. High end PC gamers aren't gaming at 60Hz.
 
Also, how he disabled the cores to compare the different configurations becomes important. Was his 4-core (a 3900X was used in all tests, with cores disabled) spread across two CCXs? Two CCDs? How did he disable the cores? Was it just done with affinity in Windows? It gets really complicated, and the results can be really skewed depending on what was done.

I would say most well-multithreaded games do better with SMT on, or at least for the 4-core and 6-core configs, better with SMT on than off.
I think Intel and AMD have a few products, usually the high-core-count ones, for which they recommend turning SMT off for gaming, which makes sense because games utilize more of each core than a light, highly threaded app does.
 
It's 2020. High end PC gamers aren't gaming at 60Hz.

1% lows of 60fps aren’t good enough for “high end gamers”? Hahaha don’t make me laugh.

Not sure why high end gamers are relevant either way as they certainly aren’t running “sweet spot” hardware. Surely they’re running high end stuff!
 
Well, I can't speak for everyone, but I aim for 90 fps minimums on my setup (160Hz FreeSync panel). Below that looks choppy.
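For reference, the frame-time budgets those targets work out to (just simple arithmetic):

```python
for fps in (60, 90, 160):
    print(f"{fps} FPS = {1000 / fps:.1f} ms per frame")
# 60 FPS = 16.7 ms, 90 FPS = 11.1 ms, 160 FPS = 6.2 ms
```

So a 90 fps minimum means every frame has to come in under roughly 11 ms.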
 
I still don't see how 40% is negligible. But sure, if you think a 40% difference is, then yes, we do have a different opinion on it. Anyways, I for one think if RDNA2 is 40% faster than a 2080 Ti it would be pretty good ;). Of course, pricing means a lot too. Anyways, I even admitted it's very game dependent and it can go either way depending on what you play. But it's only going to continue (slowly?) shifting upward.

It can be 1000% but unless it improves your gaming experience it’s negligible wouldn’t you say?

Of course 40% faster FPS averages is a big deal! But that’s very different to 40% higher 1% FPS minimums that are already well over 60fps. Seems we’re veering onto a tangent now though...
 
It can be 1000% but unless it improves your gaming experience it’s negligible wouldn’t you say?

Of course 40% faster FPS averages is a big deal! But that’s very different to 40% higher 1% FPS minimums that are already well over 60fps. Seems we’re veering onto a tangent now though...
Agreed, tangent; that's why I said something about RDNA2 in my last comment, just to keep it somewhat on topic. Anyways, it'll be interesting to see not only how RDNA2 compares but how well its ray tracing implementation compares to Ampere's. I'm excited to see what's in store. Seems we will be seeing Ampere before we see RDNA2 though, so Nvidia will be out of the gate first. Those that have been waiting may just jump on it because it's first.
 
Yeah, even if AMD delivers the goods with RDNA2, there is the chance that everyone will have already upgraded by that point.

Hopefully we will get some solid information before that time so people have something to wait for (not just these bogus rumors, though; I mean something solid).
 
I kinda like the mystery AMD is putting out versus Nvidia: on performance, price, when it will release -> lol. That includes ray tracing performance, which we can't even put numbers on, even though a number of next-gen console games showed it. No AIB leaks either. What better way for AMD to hit Nvidia on any weakness once they see a substantial release of Nvidia products, pricing, and availability? It gives them time to find the games that show off the card best and the best price to compete in any segment.

AMD has basically been giving out zero information on next-gen GPUs, and Sony and Microsoft likewise have not been sharing more details of ray tracing performance or anything else unique or new, beyond rather vague hints at performance. We don't even know whether there will be only one model this year or several. Mobile models (one area where, at 7nm, AMD could make some significant inroads, more important than desktop)?

How important ray tracing will be will come down to game support, other application support, actual performance, and real benefits. Ray tracing may turn out to be a rather insignificant reason to buy for most folks, whether Nvidia or AMD, regardless of performance.
 
It feels like a calculated plan. Total blackout silence.

That means they have good cards. Sort of like when you get pocket Aces in Texas Hold 'em and you stop talking real quick.
 
They probably also don’t want to overhype and under deliver like basically every release in the past few years.
 
Oh yeah, it has begun.

Halo Infinite gameplay on XSX is universally panned for looking flat and last-gen. Digital Foundry breaks the problem down to the use of last-gen indirect lighting in scenes not directly lit by the sun. The solution, of course, is higher-precision bounce lighting, including hardware RT.

Halo is also a great example of why shitty graphics at 4K still look shitty. DF recommends lowering resolution and spending the horsepower on better lighting. I couldn't agree more.

So much for people not noticing the benefits of pixel-perfect light and shadows. This is a promising sign for the coming generation. Open-world games with the sun as the primary source of light will need RT or SVOGI to avoid looking "flat" as people's expectations change.

 
Wasn't sure if it was just me; the graphics look like what I remember Halo 3 looking like on the Xbox 360.
 