AMD Radeon Software Crimson Edition 16.2

HardOCP News

AMD sends word that its new Radeon Software Crimson Edition 16.2 non-WHQL drivers are now out. Highlights of this driver release are as follows:




    • AMD has partnered with Stardock, in association with Oxide, to bring gamers Ashes of the Singularity – Benchmark 2.0, the first benchmark to release with DirectX® 12 benchmarking capabilities such as Asynchronous Compute, multi-GPU, and multi-threaded command buffer re-ordering. Radeon Software Crimson Edition 16.2 is optimized to support this exciting new release.
    • The SteamVR Performance Test: we are pleased to report that our Radeon R9 390, Nano, and Fury series GPUs are all able to achieve 'VR Recommended' status, the highest achievable level of experience possible. In addition, our affinity multi-GPU feature is already showing a significant performance uplift over a single GPU on Radeon cards in the aforementioned benchmark.
    • Performance and quality improvements for Rise of the Tomb Raider™ and Ashes of the Singularity – Benchmark 2
    • Crossfire Profiles available for The Division and XCOM 2
 
DL'ing now. So this benchmark, is that in the DL, or something I need to DL separately?
 
NVIDIA spends time getting game ready drivers going, AMD spends time getting canned benchmark drivers ready?
 
There's something wrong with showcasing new technologies that can improve future gaming performance?
How does tuning your drivers for a canned benchmark help gamers?
 
Did you read the last two lines? Just because Nvidia calls some of their drivers "game ready" doesn't mean that they are the only ones that release drivers that work with new games.

Fancy shmancy names mean nothing except to those who are not able to see past the BS marketing.


    • Performance and quality improvements for Rise of the Tomb Raider™ and Ashes of the Singularity – Benchmark 2



    • Crossfire Profiles available for The Division and XCOM 2
So two extra CF profiles, and also improvements for Rise of the Tomb Raider.


All the promo from AMD is pointing to the AoS Benchmark. I have seen no PR from AMD today telling me that AMD's AoS gameplay performance is better. It seems to me that PR about a canned benchmark is what it is focusing on.
 
So the idea is that it would be better if they kept quiet about what's new in the driver? It's not like nVidia hasn't spent man-years optimizing for benchmarks like 3DMark, and I'm sure if/when they have a DX12 showcase they'll shout it from the heavens.

If anything, at least this is an actual game benchmark, unlike 3DMark which they've both spent time on and which is completely useless. There's at least a chance here that optimizations will apply not only to this game, but that they're able to use it to fix bugs and performance issues in the DX12 driver in general. There's not that much DX12 code to experiment with now.

Besides, GCN is a very mature product at this point; the notion that we would need something like "game ready drivers" is the sort of insanity that'll hopefully go away. I've played a lot of releases lately and I haven't needed to 'wait for a driver' at any point. Many of the "game ready drivers" that nVidia releases seem to end up with people rolling back as they break other games, etc. It's all just dumb.

I'd rather AMD put their time and money into Vulkan/DX12 and their next hardware instead of wasting it on yesterday's garbage (DX11). Fix the APIs and the rest will follow.
 
As it is, AMD seems to be focusing on DX12 more right now.. and since there aren't really any DX12 games out yet, the only thing they can do is show the performance of benchmarks that are DX12.

Agreed.

So the idea is that it would be better if they kept quiet about what's new in the driver?

Not at all.

I think it is just interesting that websites still write big stories about canned benchmarks. Just being the Devil's advocate here a bit, as if this was a thread about NVIDIA and canned benchmarks, the pitchforks would be out! :)
 
Regardless of the benchmark talk, I'm rather surprised at seeing a Crossfire profile for The Division, an Nvidia game that isn't even out yet.
 
If you want interesting, consider that nVidia are now sending out notes that async compute is not enabled on 9xx series cards in the drivers (never mind that enabling it seems to impact performance negatively, which is a bit odd).

"Nvidia reached out to us this evening to confirm that while the GTX 9xx series does support asynchronous compute, it does not currently have the feature enabled in-driver. Given that Oxide has pledged to ship the game with defaults that maximize performance, Nvidia fans should treat the asynchronous compute-disabled benchmarks as representative at this time." (extremetech)

Yet, they've been marketing Maxwell cards as DX12 ready, explicitly mentioning "Async compute".

A devil's advocate would perhaps suggest that there might be some rather big problems there, or why isn't it in the drivers already?
 
If you want interesting, consider that nVidia are now sending out notes that async compute is not enabled on 9xx series cards in the drivers (never mind that enabling it seems to impact performance negatively, which is a bit odd).

"Nvidia reached out to us this evening to confirm that while the GTX 9xx series does support asynchronous compute, it does not currently have the feature enabled in-driver. Given that Oxide has pledged to ship the game with defaults that maximize performance, Nvidia fans should treat the asynchronous compute-disabled benchmarks as representative at this time." (extremetech)

Yet, they've been marketing Maxwell cards as DX12 ready, explicitly mentioning "Async compute".

A devil's advocate would perhaps suggest that there might be some rather big problems there, or why isn't it in the drivers already?

And all the tech journalists, like Kyle here, in the pocket of NV, remain silent. They simply don't care about false advertising or blatant lying about specs. $ is where it's at, and it's all good when they can get in a jibe against AMD for releasing their own "Game Ready" drivers. What a joke.
 
Yet, they've been marketing Maxwell cards as DX12 ready, explicitly mentioning "Async compute".

A devil's advocate would perhaps suggest that there might be some rather big problems there, or why isn't it in the drivers already?

Just my thoughts: it could be due to no real DX12 games existing yet. NV may be putting resources into tweaking drivers for games that are out, and will enable DX12 support in the drivers when DX12 games actually come out, especially ones that use async. And the statement said the feature is not enabled in the drivers; it did not say it is not present in the drivers.

AMD has spent a long time courting and working with Oxide on AOS. NV might not see the payout in working with Oxide at this time. Wouldn't be the first time either GPU company has ignored or limited resources when working with a game dev.
 
Wrong on so many levels.

It's an actual game you can play right now: it's very playable, very bug-free for a beta/early access title on Steam, and a very fun RTS game.

Save 50% on Ashes of the Singularity on Steam

In the same driver, AMD improved Tomb Raider, added CF profiles, fixed bugs.

Not to mention, NVIDIA actually released "Game Ready" drivers for Ashes as well, and their last one was for the Hitman "Beta".

At least pretend to be neutral, and contain your hatred of AMD if you want to be a credible tech journalist.

My question is wrong? You guys are pretty easy to bait. ;)
 
And all the tech journalists, like Kyle here, in the pocket of NV, remain silent. They simply don't care about false advertising or blatant lying about specs. $ is where it's at, and it's all good when they can get in a jibe against AMD for releasing their own "Game Ready" drivers. What a joke.

We just care about how video cards work when we play games. Like here in an AMD title? :) Rise of the Tomb Raider - Rise of the Tomb Raider Video Card Performance Review

We have also found that AMD video cards below the R9 390X are doing quite well in Rise of the Tomb Raider compared to their NVIDIA counterparts. The only video card we were not impressed with is the Radeon R9 Fury. For the price, and given that it is based on the latest GCN technology, it doesn't perform up to our expectations. The Radeon R9 Fury X is better, but both of these video cards are limited at 4K by their 4GB of VRAM.
 
Just my thoughts: it could be due to no real DX12 games existing yet.
Exackery. If NVIDIA were out touting its performance on a title that is yet to be released and on a canned benchmark, the Red Team guys would be going nuts. That is really my only point. It is in my best personal interest that AMD do well. Think about that for a moment. Without competition, there is no need for HardOCP.
 
Is this benchmark free? Or is AMD now pimping a benchmark stuck behind a paywalled Early Access game?
 
Is this benchmark free? Or is AMD now pimping a benchmark stuck behind a paywalled Early Access game?

Press release of Ashes of the Singularity <-- canned benchmark :)

I'm not too sure what they are using. I have heard that sometimes developers pick certain milestone builds because certain features in them are finished. If you don't know what is in a build, you can regard it as a canned benchmark; not that the game will be 100% different from it, but certain aspects could change, which in a way makes it less useful as a real-world benchmark.


At SA:
AMD sent us a prerelease driver for testing with this AotS build, but we didn't see a significant performance difference with that driver, so we only ended up using it for our async compute benchmarks. The public version of the game and AMD's driver were used for everything else.

So not everyone is benchmarking the same way...
 
Except it's actually released and fully playable. You haven't heard of Steam Early Access?

Look at all those people playing ARK: Survival Evolved. It hasn't been "released" either.

Steam Charts - Tracking What's Played

Currently #5 among all games on Steam.

Move on with the times.
There is a very good reason we do not use yet-to-be-released or "early access" games: many times they in no way represent what the final product will play like. Thanks for your insight, though.

And actually the price just dropped by half till 2/29/2016!!

Save 50% on Ashes of the Singularity on Steam
 
Except it's actually released and fully playable. You haven't heard of Steam Early Access?

Look at all those people playing ARK: Survival Evolved. It hasn't been "released" either.

Steam Charts - Tracking What's Played

Currently #5 among all games on Steam.

Move on with the times.

You mean AOS is in Beta and still being developed?????

Also, has DX12 been released for ARK yet? Last I saw from Jesse at Wildcard, DX12 support is not ready. Jesse even refers to ARK as pre-alpha.

Jan 21, 2016 "We will be releasing DX12 as soon as it's ready inside of UE4 and across all hardware targets."

But I am behind the times and want GPU devs to create drivers that work really well with the games that are already done and out.
 
I gave you the link to Ashes on Steam. Gamers have already been playing it for a while.
Thanks. I must have totally not read that the first time you posted it. No, wait, I totally got it.

So, how does tuning your drivers for a canned benchmark help gamers?
I don't have drivers and I don't tune those. We simply play released games and share our data and experience from that.

Btw, did you ask NV why they released "Game Ready" drivers for the Ashes benchmark ALPHA, before it was even available for gamers to play?
No. I did not ask AMD either.
 
"How does tuning your drivers for a canned benchmark help gamers?" - This was what you had asked.

I hope I am pointing out the now-obvious thing: many gamers are in fact already playing Ashes, because it's available as an Early Access beta on Steam. So AMD releasing a game-ready driver for Ashes is good for gamers.

You often criticized AMD for not releasing more game ready drivers, now that they do, you are taking a jibe at them for it. Why is that?

Gamers could not play the alpha back last year, but NV released a "Game Ready" driver for that anyway and you didn't make a jibe, not even a little squeak. Next time you talk to your NV contacts, ask them why. It will be nice to tell both sides of the story.
 
"How does tuning your drivers for a canned benchmark help gamers?" - This was what you had asked.

I hope I am pointing out the now-obvious thing: many gamers are in fact already playing Ashes, because it's available as an Early Access beta on Steam. So AMD releasing a game-ready driver for Ashes is good for gamers.

You often criticized AMD for not releasing more game ready drivers, now that they do, you are taking a jibe at them for it. Why is that?

Gamers could not play the alpha back last year, but NV released a "Game Ready" driver for that anyway and you didn't make a jibe, not even a little squeak. Next time you talk to your NV contacts, ask them why. It will be nice to tell both sides of the story.

Cool story bro.
 
All I would like to add is I have seen Nvidia release Game-ready Drivers for games in Alpha and Beta too. Both companies do it.
 
All I would like to add is I have seen Nvidia release Game-ready Drivers for games in Alpha and Beta too. Both companies do it.

But the PRs that I got from AMD had nothing to say about playing the game. The PR was only about a canned benchmark. This is my point.

AMD Radeon Software Crimson Edition 16.2

AMD has partnered with Stardock, in association with Oxide, to bring gamers Ashes of the Singularity – Benchmark 2.0, the first benchmark to release with DirectX® 12 benchmarking capabilities such as Asynchronous Compute, multi-GPU, and multi-threaded command buffer re-ordering. Radeon Software Crimson Edition 16.2 is optimized to support this exciting new release.

And the PR goes as far as to specify that the driver helps the benchmark. No mention of the actual game.

  • Performance and quality improvements for
    • Rise of the Tomb Raider™
    • Ashes of the Singularity – Benchmark 2

That just seems really odd to me. I dunno, but maybe it is just more poor PR verbiage from AMD. That has not exactly been its strong point for some time now.
 
Times are changing... people are starting to put more and more faith in canned benchmarks, as sad as it is. Even my 15-year-old nephew and his friends ask other computer nerds, "what's your 3DMark score?"
Hehe, I have lived through these times before, and they are the exact reason we changed the way we do things at HardOCP, and in turn, we changed the way the industry worked as well. AMD is clutching at straws here, or that is my initial thought about it.

Good reading here and here, from 8 and 12 years ago. AMD is coming full circle.
Introduction - Benchmarking the Benchmarks
Introduction - [H]istory of Change - Cheating the Cheaters
 
If you actually gave a damn (which you did not), you could have easily tweeted at AMD's reps and asked if it's for the benchmark or the game.

Funny stuff there, let me "Twitter" AMD to see what they are actually saying in their press releases. Too bad a multi-million dollar company (in losses) can't get a PR writer that can write with clarity.

Hilariously, you forgot NV released several "Game Ready" drivers for Ashes going all the way back to the alpha, back when it was ACTUALLY a benchmark only and not a game available to the public.

Awesome. It is just that I have not seen a lot of press releases flying about lately about that, and how great NV is at DX12. Funny how people like to become historians when there is no way to prop up their current argument. "Well, they did it too!!!!" LOL! That is some funny shit right there.
 
But you do realize it's not a "canned" benchmark, right? It is a benchmark, of course; ultimately, better performance in the benchmark is better performance in the game.

Not to mention the driver itself has other things in there.

Oh, you poor, poor soul. That is what they always want you to believe.
 
In this day and age, people don't want to take time with things; they want it done easy. Canned benchmark comparisons = the easy way.
It is just sad to see AMD pushing this agenda once again after all these years of doing the right thing. But like I said, maybe it is just a horrible marketing strategy, and the PR folks don't even realize what they are actually telling us.
 
I don't mind canned benchmarks much, however any benchmark that requires a purchase isn't going to gain much traction. I'd like to run the benchmark myself so I can see the whiz bang DX12 features with my own eyes.

Look at the SteamVR Performance Test thread: people are posting their own numbers and talking about it. AoS has been around a while, and it rarely gets mentioned unless people are talking about benchmarks run by others.
 
First, this is a thread about a driver released by AMD; there is no marketing campaign present, just a link to the driver. So I'm not sure why all the debate here (well, I know why, but there's no point in going on and on...).

Second, as far as DX12 and AMD talking about it, do you blame them? They are showing a commanding lead right now in AotS with DX12 enabled, and with async, which is astounding given the performance increases it gives. Seriously, if they didn't tout this victory, the marketing team should be shot, drawn, and quartered, not necessarily in that order. There are plenty of reviews out there, even a few not using canned benchmarks but actually playing the game.

So let's move on and accept this thread for what it is: a public service to let AMD GPU users know there is a new driver.

Thx.
 
True, but Nvidia pushes the agenda as well. Both companies do it. Why? Profit and marketing. You can't bitch about AMD trying to do the same thing as Nvidia.

You mean one company seems to do both profit and marketing, and the other is stuck with just the marketing. ;)


I really wish everyone would just decide if Kyle and Hard are red or green. A few years ago everyone was screaming that Hawaiian Kyle was too red, but now he is too green?!?!?! How could he be too green when he attended the AMD 30th celebration??? Just make up your fucking minds!!! It is getting too hard to keep up.

If any site is neutral and calls out companies' shit, it's HardOCP.

I have been around here, under various names, for years, and I have never seen Kyle or his team not be honest and fair. IMO, that is what scares AMD. They are in a very tough position and need only good press. It's a shame, since I miss red.
 
I don't mind canned benchmarks much, however any benchmark that requires a purchase isn't going to gain much traction. I'd like to run the benchmark myself so I can see the whiz bang DX12 features with my own eyes.

It's actually a game, with a benchmark mode.

You can buy the game and play it. You can also benchmark it.

The difference is it's in Early Access.

That did not stop anyone from playing and benching the heck out of Ark. It didn't stop NV releasing drivers for it, announcing it on their website and blogs, showing GameWorks tech in it.

So why the anti-AMD jibes from the editor of a tech site when they release an optimized driver for an EA title? Hypocrisy!
 
It's actually a game, with a benchmark mode.

You can buy the game and play it. You can also benchmark it.

The difference is it's in Early Access.

That did not stop anyone from playing and benching the heck out of Ark. It didn't stop NV releasing drivers for it, announcing it on their website and blogs, showing GameWorks tech in it.

So why the anti-AMD jibes from the editor of a tech site when they release an optimized driver for an EA title? Hypocrisy!

If they released the benchmark component for free, it would entice me to purchase the game (if it looked amazing).
 
I have been around here, under various names, for years, and I have never seen Kyle or his team not be honest and fair. IMO, that is what scares AMD. They are in a very tough position and need only good press. It's a shame, since I miss red.

Honest and fair? LOL gimme a break dude.

Here's from their own review.

Power and Temp - ASUS STRIX R9 Fury DC3 Video Card Review



"Fury isn't as big of a power hog as the Fury X is, but it isn't as efficient as the GTX 980 is either."

Efficiency is a measure of performance per watt.

Fury is faster than the 980 at settings that aren't unplayable, in their own data.


Case in point: enabling GameWorks (Enhanced Godrays + HBAO+) makes both GPUs unplayably slow.


In their conclusion:

Efficiency
There are still factors, other than raw performance, that people judge video cards by. You cannot deny the efficiency of the GeForce GTX 980 over the new Radeon R9 Fury. The GeForce GTX 980 is able to deliver more performance per watt. The overall system wattage usage is a lot less on GTX 980 versus R9 Fury.

Somehow NVIDIA's Maxwell architecture is magical when it comes to getting the most performance out of each watt of power. This is something AMD's Fiji hasn't mastered.



Would an unbiased reviewer conclude with such a blatant falsehood? It even contradicts their own DATA.
 
With these baseless statements and jibes at AMD, [H] isn't even pretending to be neutral anymore. It's just outright hostility against AMD.

I am sure it all came to light with the Nano incident and bridges were burnt.
 
With these baseless statements and jibes at AMD, [H] isn't even pretending to be neutral anymore. It's just outright hostility against AMD.

I am sure it all came to light with the Nano incident and bridges were burnt.

I don't know if you are so blinded by team red that you fail to read, if your tin-foil hat is on so tight it is cutting off the circulation, if you hope to win some subreddit award, or if you suffer from some type of diminished capacity. It could be all of the above.

If Kyle and Hard's bias were so great, then why would Kyle show up at the AMD 30th celebration? Kyle even posted today that AMD needs to stick around.



OMG LOOK AT THAT HATE!!! Can you believe Kyle said things like, "AMD has been part of my life for 17 or 18 years" and "good thing to see you all still around." You can just feel the hate! :rolleyes:


Since you mentioned the Nano: why did Hard give the Nano a gold award? If Hard and Kyle truly weren't "even pretending to be neutral anymore," or if it were "just outright hostility against AMD," then the review could have been "it's a small, hot piece of shit." Truth is, the Nano is a great little card, and Brent/Kyle reviewed it honestly. In fact, they have given AMD cards multiple gold and silver awards in the past 6 months.

It is abundantly clear, Bahanime, that you are a ridiculous fanboi, not a PC gaming enthusiast. You are blinded by some sort of bullshit loyalty that shouldn't exist. It shouldn't matter if it's green, red, blue, purple, etc., as long as it gives gamers the best gaming performance. You fail to see that.

It's fine to love a brand you trust, but making up unfounded vitriol against Kyle and his team is retarded. We are all now dumber for having read your posts. I am now going to add you to a list where I put special people. Enjoy your stay!
 
I haven't tried this new AMD driver yet, but after about 15 years of nothing but Nvidia, I switched to AMD in the last year. I now own an AMD Fury X card. It's very, very nice!
It's very quiet, runs cool, works flawlessly, and looks visually sharp. The 120mm fan spins up to about 1050-1100 RPM, which is still quiet under full gaming load, and the temp stays in the mid-60s °C in a case that doesn't have much ventilation (Cosmos 1010). It runs much cooler than a 980 Ti would, because the heat is pumped out of the case since the radiator is mounted to the back of it. My case is sitting on the floor, and between the Fury X rad and the Corsair H110i GTX, my machine is nearly silent under load, but still packs in a LOT of performance.

I read HardOCP's review of Rise of the Tomb Raider, which Kyle linked earlier in this thread. If you read the review, it comes across as if the Nvidia 980 Ti is the clear winner in the written text, but if you look at the FPS charts, it looks to me like the AMD Fury X is the clear winner.

....Stranger still???

The game is marketed heavily by Nvidia: it's bundled with Nvidia cards, the splash screen says Nvidia, the main menu says it was built in cooperation with Nvidia. It's promoted heavily as Nvidia-optimized. Yet the game appears to run better on the AMD Fury X, both in single-card and dual-card configurations, based on the benchmarks in the review.

Furthermore, I'm running the game on an i7 4770K at 4.5GHz with every single setting maxed out, and getting what I consider extremely smooth FPS at 2560x1600 on my Fury X, with no stuttering or hitching. I don't know why the reviewer was having to turn settings down at 2560x1440, since I have no trouble at max settings at 1600p.

The fact that AMD just released a performance update is commendable, because the game already ran so well on the Fury X, even without the recent update.

I've not followed [H] reviews much recently, and I don't think they historically were obviously biased, but darned if something doesn't seem a bit off with that particular review of Rise of the Tomb Raider. In fairness to this discussion, it does read a bit biased compared to the objective data published.

Signed, a long-time, nearly exclusive Nvidia advocate who is now very much surprised to be appreciating a Fury X card. It's quiet, fast, and cool, and I haven't encountered any weird/unique driver problems. AMD exclusively supports a hardware PLP 20"/30"/20" monitor configuration, and that's why I initially jumped to the red team, since that is my desktop monitor setup. Frankly, I didn't expect as good an experience as I'm having.

I know some will say this is rubbish, but I also feel like the old adage of a bit better picture quality seems to hold true as well. The image from the AMD card on my home theater projector through HDMI appears to have better black levels than I could ever manage with the Nvidia cards, regardless of the settings I tried to manipulate based on recommendations from different threads I found. I used the full 0-255 HDMI range on both, but the AMD has inkier blacks IMO, without being overly crushed (my opinions anyway). My previous cards have been Nvidia exclusively since 3dfx and my old Viper S2000. My last few cards have been a 460, 560 Ti, and 670; I then switched to AMD to get PLP support, starting with an AMD 285 and now a Fury X.

Reference:
Nvidia GTX670, Onkyo PR5508, Panasonic AE8000U - HTPC blacks suck... - AVS Forum | Home Theater Discussions And Reviews
 
I don't know if you are so blinded by team red that you fail to read, if your tin-foil hat is on so tight it is cutting off the circulation, if you hope to win some subreddit award, or if you suffer from some type of diminished capacity. It could be all of the above.

If Kyle and Hard's bias were so great, then why would Kyle show up at the AMD 30th celebration? Kyle even posted today that AMD needs to stick around.



OMG LOOK AT THAT HATE!!! Can you believe Kyle said things like, "AMD has been part of my life for 17 or 18 years" and "good thing to see you all still around." You can just feel the hate! :rolleyes:


Since you mentioned the Nano, why did Hard give Nano a gold award? If Hard and Kyle were truly not..."even pretending to be neutral anymore" or "it's just outright hostility against AMD" then the review could have been "its a small hot piece of shit." Truth is the Nano is a great little card and Brent/Kyle reviewed it honestly. In fact they gave AMD cards multiple gold and silver awards in the past 6 months.

It is abundantly clear, Bahanime, that you are a ridiculous fanboi, not a PC gaming enthusiast. You are blinded by some sort of bullshit loyalty that shouldn't exist. It shouldn't matter if it's green, red, blue, purple, etc., as long as it gives gamers the best gaming performance. You fail to see that.

It's fine to love a brand you trust, but making up unfounded vitriol against Kyle and his team is retarded. We are all now dumber for having read your posts. I am now going to add you to a list where I put special people. Enjoy your stay!

So I ran the numbers provided by Bahanime.
GTX 980: 320 watts, average FPS: 51.9
R9 Fury: 367 watts, average FPS: 60.4

GTX 980 = 0.162 frames/watt
Fury = 0.165 frames/watt

Or

GTX 980 = 6.2 watts/frame
Fury = 6.1 watts/frame

So mathematically, meaning empirically, speaking... Kyle's conclusion was wrong.

Bahanime is correct. The R9 Fury delivers more performance per watt than the GTX 980. The R9 Fury is therefore more "efficient" in terms of power usage relative to delivered performance.
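For anyone who wants to double-check the arithmetic, here is a minimal sketch using the system-wattage and average-FPS figures quoted above (the card names and numbers come from the posted review data; the helper function is just illustrative):

```python
# Performance-per-watt comparison from the quoted [H] review data:
# total system wattage and average FPS for each card.
def frames_per_watt(avg_fps: float, watts: float) -> float:
    """Higher is better: frames delivered per watt of system power."""
    return avg_fps / watts

cards = {
    "GTX 980": (51.9, 320),  # (average FPS, system watts)
    "R9 Fury": (60.4, 367),
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {frames_per_watt(fps, watts):.3f} frames/W, "
          f"{watts / fps:.2f} W/frame")
```

Run as-is, this prints roughly 0.162 frames/W for the GTX 980 versus 0.165 frames/W for the R9 Fury, so on these numbers the Fury comes out slightly ahead on performance per watt.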



P.S. Six years ago I was an nVIDIA customer; now I'm not. Kyle is biased, not only in his reviews (as shown mathematically) but in his comments here as well. A biased journalist isn't really a journalist; that's called being an editorial columnist. That's how he is behaving here, and it shows in the content of his reviews (when it comes time to opine on the data).

If anyone wishes to dispute any of this, I suggest you use some of that Maxwell "magic" Kyle was talking about, because you'd be wrong from an empirical standpoint.
 