JustReason
Sorry to say, but we need better info than those comparisons. The Win7 review had the lower-tier cards and the Win10 one had the higher-tier cards, with very little overlap between the two.
And to everyone: in my experience, the OS has not been the cause of any performance improvements. If there are performance improvements in games, they have come through driver updates, not the OS, in the games I have used.
There are no games out yet which support the one feature through which Win10 would be a benefit, and that is DX12.
To expect any DX11 game to work better on Win10 vs. Win 8.1 boggles my mind, since it is the same API. The placebo effect is at play: out of sheer willpower and wanting performance to be better on a new OS, people believe that it is so. The mind is a powerful trickster.
WDDM 1.3 vs WDDM 2.0. Wouldn't be surprised if CPU overhead was much lower with WDDM 2.0.
Only if running DX12; the CPU overhead is actually similar with WDDM 2.0 when running DX11 (there is slightly less, but nothing significant, more like 1-2%).
It seems clear AMD has been putting all of their driver efforts into Windows 10 the last few months, maybe even starting this spring before Win10 launched. It's a free upgrade for everyone on 7 & 8, so with their limited resources it makes sense to drop the old OSes. Obviously AMD still supports previous versions of Windows, but future optimizations will probably be limited to Windows 10 exclusively.
On the flipside Nvidia is having the opposite problem with Windows 10.
I believe that unless Nvidia pulls an ace from their sleeve with Pascal, AMD will dominate the DX12 era. They already have a huge head start in both DX12 and Windows 10 drivers, especially considering all of the recent flubs Nvidia has made.
I do agree AMD looks pretty good initially in DX12/W10. But at least we can say DX12 has added parity: in most of the DX12 benches/games there is no clear winner at the top, at least not by the same margins as in DX11. And you have to admit the 7970 crowd has to be the most content and happy group of owners, given the longevity of their cards.
Hey Razor, I've got a question I've been wondering about: Nvidia's boost. Say a card boosts to 1500MHz from its stated base of 1100MHz. What does overclocking the base to 1386MHz do to the 1500MHz boost? I ask because when it's stated that the card was boosting to 1500 and received said score, most say the Nvidia cards still have a lot of OCing headroom, though I would think that boost already accounts for that headroom and the overclock would only achieve the same results as the original boost bench. Is there something I'm missing?
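For what it's worth, my understanding of how Nvidia's boost interacts with overclocking (GPU Boost 2.0-era cards; this is my assumption, not something from the reviews in this thread) is that the overclock is applied as an offset that shifts the whole clock curve, so the observed boost rises by roughly the same amount as the base, power and thermal limits permitting. A back-of-the-envelope sketch with the numbers from the question:

```python
# Rough model of Nvidia's boost with an offset overclock. Simplifying
# assumption: the offset shifts base and boost clocks equally, as long as
# the card stays inside its power/thermal limits.
base, observed_boost = 1100, 1500         # MHz, from the question above
oc_base = 1386                            # overclocked base from the question
offset = oc_base - base                   # +286 MHz offset
expected_boost = observed_boost + offset  # target boost if unconstrained
print(offset, expected_boost)             # 286 1786
```

In practice the card can still cap below that if it hits its power or temperature limit, which is why stock boost behavior doesn't simply use up the overclocking headroom.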
I don't know how you can say nothing has changed, even for the Fury X just check benchmarks since its release and it's gained around 10% more performance, at this rate it will surpass the 980 Ti across the board by mid-2016. The same could be said about their entire line-up, the 280X is nipping at the 780's heels, same for the 290 and 970. AMD has basically moved all of their GPUs up a tier.
Both AotS and Fable show ~10% performance gain for AMD over their average DX11 performance. Of course there are some DX11 games that perform similarly as current DX12 benchmarks (Battlefront for example) so it's too early to claim they dominate the entire API, but it's certainly looking good for them.
Anyone denying these performance margins is simply burying their head in the sand. How many benchmarks will it take before people acknowledge the gains? You can't deny it forever; it's right there in front of your eyes.
Dude, different games, different settings; read the review properly. Some of the new benchmarks aren't using AA like the old ones were, and we know nV cards tend to perform better with AA since they take less of a hit in most games. Did you even notice that? I already stated this, though I didn't point it out point blank. I think you just don't want to read the review, and are instead just looking at the final numbers.
If you want me to break down the TPU reviews that you linked to in the first post of this thread, I can, but it was easy to see that.
You know what, I will do it.
Assassin's Creed: old review 4x AA, new review no AA
Battlefield 3: old review 4x AA, new review no AA
Battlefield 4: old review 4x AA, new review no AA
Crysis 3: old review 4x AA, new review no AA
Shadow of Mordor: old review 4x AA, new review no AA
There are two other games in the old review that used AA; in the new review those games were replaced with games that don't use AA.
That's a pretty big change in the way the review was done, and the new one doesn't show any of the performance benefit nV got from AA in the first review you posted. This is why you see the changes you think you see.
Normalization doesn't "skew" anything, it's just a scale. I unnormalized the data and calculated the % change on each AMD card relative to its Nvidia counterpart (listed in the post). If you'd like a basic math explanation on how that works I would be happy to provide it.
I don't know why they're not labeling the use of AA in their latest review, but I'm pretty sure TPU is still using AA or maintaining the same test settings.
http://www.techpowerup.com/reviews/AMD/R9_Nano/9.html
http://www.techpowerup.com/reviews/MSI/GTX_980_Ti_Lightning/7.html
Comparing BF3 and looking at the 980 Ti numbers:
@1080 it's 163.9 vs 164.2
@1440 it's 102.6 vs 102
@2160 it's 49.5 vs 49
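Plugging those numbers in, the differences between the two reviews work out to about 1% or less at every resolution, i.e. run-to-run noise; a quick sketch:

```python
# 980 Ti BF3 FPS from the two TPU reviews linked above, (old, new) per resolution.
pairs = {"1080p": (163.9, 164.2), "1440p": (102.6, 102.0), "2160p": (49.5, 49.0)}

for res, (old, new) in pairs.items():
    delta = (new - old) / old * 100  # percent change between reviews
    print(f"{res}: {delta:+.2f}%")
```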
At 4x AA nV cards don't take much of a performance hit; it's pretty much free in most games, and that has been the case since the G80.
Edit: and if you look at the same games without AA in both reviews, AMD cards have similar frame rates across the two. The only major change is what I mentioned in the games that no longer use AA in the new review, and the impact is significant for AMD cards, close to 20% at times (depending on resolution, game and card).
You forget that amd cards do what you tell them whilst nvidia cards don't.
Math fail?
Since TaintedSquirrel doesn't seem to be interested, I'll do it. Oh, hey, you're back! You forgot to do something while you were gone:
x = ((Pn1 / Pa1) - (Pn2 / Pa2)) * 100
where
x = the percentage change
Pn1 = the relative performance of the 950 in Win7
Pa1 = the relative performance of the 370 in Win7
Pn2 = the relative performance of the 950 in Win10
Pa2 = the relative performance of the 370 in Win10
((88 / 76) - (31 / 32)) * 100 = 18.9%
Pn1 = (N1 / R1) * 100
Pa1 = (A1 / R1) * 100
Pn2 = (N2 / R2) * 100
Pa2 = (A2 / R2) * 100
where
N1 = the raw performance of the 950 in Win7
A1 = the raw performance of the 370 in Win7
R1 = the raw performance of the reference card (950 XtremeGaming) in Win7
N2 = the raw performance of the 950 in Win10
A2 = the raw performance of the 370 in Win10
R2 = the raw performance of the reference card (980 Ti Lightning) in Win10
x = ((((N1 / R1) * 100) / ((A1 / R1) * 100)) - (((N2 / R2) * 100) / ((A2 / R2) * 100))) * 100
= (((N1 / R1) / (A1 / R1)) - ((N2 / R2) / (A2 / R2))) * 100
= ((N1 / A1) - (N2 / A2)) * 100
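That algebra is easy to check numerically; here is a minimal Python sketch using the relative-performance numbers quoted above, plus made-up raw FPS values (purely for illustration) to show that the reference card really does cancel out:

```python
# x = ((Pn1 / Pa1) - (Pn2 / Pa2)) * 100, as defined in the post above.
def pct_change(pn1, pa1, pn2, pa2):
    return ((pn1 / pa1) - (pn2 / pa2)) * 100

# Relative-performance numbers quoted above:
x = pct_change(88, 76, 31, 32)
print(round(x, 1))  # 18.9

# Hypothetical raw FPS values, normalized against two *different* reference
# cards, give the same answer as the raw ratios, since R1 and R2 cancel:
N1, A1, R1 = 44.0, 38.0, 50.0    # Win7 raw FPS: 950, 370, reference card
N2, A2, R2 = 46.5, 48.0, 150.0   # Win10 raw FPS: 950, 370, reference card
norm = pct_change((N1 / R1) * 100, (A1 / R1) * 100,
                  (N2 / R2) * 100, (A2 / R2) * 100)
raw = ((N1 / A1) - (N2 / A2)) * 100
print(abs(norm - raw) < 1e-9)  # True
```

The second print confirms the simplification on the last line of the derivation: normalizing each card against its review's reference card changes the scale, not the ratio.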
Damn, this makes me want to switch over to team red with the way AMD is trending, unless team green can figure out a solution as well.
At 4x AA nV cards don't take much of a performance hit; it's pretty much free in most games, and that has been the case since the G80.
Well I'm not sure what kind of AA you're talking about, but 4x MSAA is most definitely not "pretty much free in most games", at least if we're talking modern DX11 AAA titles.
Not sure what you are reading; your link contradicts your argument. The Fury X has a smaller loss in performance when turning on AA and AF compared to the 980 Ti. Now, if you mean SSAA instead of AA (too general) I would agree, since the Fury X does not have enough VRAM. I didn't bother reading the entire second link; they are comparing a temp-constrained card to a full-TDP card, so I'm not sure what you expected to happen. When the 970 is TDP/temp constrained, as seen in the [H] review, it does poorly too, even though it is touted as an efficient card. Ah, yeah, sorry, bah lol.
But in any case, AMD does have a disadvantage even with AA, and definitely with AF.
http://www.hardwareluxx.com/index.p...5798-reviewed-amd-r9-fury-x-4gb.html?start=10
More AA and AF tests; AMD just takes more of a hit.
http://www.pcworld.com/article/2982...e-pcs-incredible-shrinking-future.html?page=2
And another; this one is 4x AA again, and the AMD cards take a greater hit.