ASUS STRIX Radeon R9 Fury DC3 CrossFire at 4K Review @ [H]

You do realize Reddit was the one who found that the GTX 970 only had 3.5GB of full-speed memory? If we hadn't listened to that Reddit poster, we would never have caught Nvidia in a lie.
There are statements from both NVIDIA and the developer that they are not using any NVIDIA GPU-specific PhysX features in Project CARS. So, given those two public statements, I would tend to take their word for it over the word of an anonymous poster on a website. Looking at the data, yes, AMD is slower. That doesn't necessarily indicate it was done intentionally, which is where I take issue with your statement. AMD and NVIDIA GPUs handle workloads differently, and both have strengths and weaknesses. There are cases where AMD outperformed NVIDIA - I believe this was the case with Far Cry 4 on release, and that was a GameWorks game. I don't believe that the developer intentionally crippled AMD performance.
 
I call shens.

My testing does not support your theory.

Maybe in CF at 4K. But at any other res, and in single-GPU, it's much slower on AMD GPUs. Adding Project Cars to your test suite makes it ridiculous, as future single-GPU tests will be even more biased. Project Cars is so freaking bad on AMD GPUs that review sites publish aggregate scores with and without Project Cars. lmao.

https://www.techpowerup.com/reviews/AMD/R9_Fury_X/31.html

Anyway, you guys should have a long hard look at your game selection methodology. This "we test the games which are popular" reasoning is just bullshit. At least back up your selection methodology by showing game sales numbers and proving that each game in your test suite is in the top 10 selling games of the last 1-2 years. There are games like Middle-earth: Shadow of Mordor or Dragon Age: Inquisition which are much better selling and more critically acclaimed than a Project Cars or Dying Light, and also demanding at high resolutions.
 
A lot of readers here are missing an important point. The assumption that the latest driver from AMD or nVidia is the right one for you buys a lot of people trouble that reading the vendor documentation (and to some extent sites like [H]ardOCP) would let you avoid.

To be fair, web sites face difficulties that you don't. One of the things that [H]ardOCP does well is testing to see if performance improvements really do increase your enjoyment in game. But no one enjoys driver hell, so why don't people read the appropriate vendor's page?

If you try to do that with AMD, you discover that the 15.8 beta drivers used in the article have been replaced by the 15.9 beta drivers. The currently recommended drivers are 15.7.1 for most setups, and there is specific advice not to use 15.9 beta with Windows 8, or with Windows 8.1 or 10 in 32-bit mode. Would [H]ardOCP have had any problems with 15.7.1? I don't know, but that's the way to bet. Beta drivers are drivers with known problems, put out both so game manufacturers and others can check for compatibility, and so that some people who need something not available in the current released drivers can get early access. (For example, a newly released game may need a bug fix found only in the beta.)

My most recent purchase was an R9 380. I installed the (recommended) 15.7.1: no problems. Why not 15.7? I don't know, and don't care. AMD doesn't recommend it for the R9 380 or R9 285, and that's good enough for me.

If I upgrade to a 4K display, I'll probably go to CrossFire. And yes, I'll stay with AMD/ATi video cards, as I have for years, because paying attention to their website has saved me lots of (potential) agony.

I would tell ya not to go CrossFire. As someone who has wasted money on AMD CrossFire for the last 5 years: it's just not worth the headache IMO. Single cards are great for price/performance, but when it comes to multi-GPU... my god man, save yourself the headache!
 
I have to say, it's surprising to me how close the 390X performs to the Fury, given the massive shader count boost, HBM, and the optimizations in GCN 1.2 vs 1.1. They must be really handicapping themselves by not having increased the ROP count.
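Rough napkin math backs that up. Here's a quick peak-rate comparison (a hypothetical Python sketch, with spec-sheet numbers quoted from memory, so double-check them):

    # Back-of-the-envelope peak rates: (shaders, ROPs, core clock in GHz)
    cards = {"R9 390X": (2816, 64, 1.05), "R9 Fury": (3584, 64, 1.00)}
    for name, (shaders, rops, ghz) in cards.items():
        tflops = shaders * 2 * ghz / 1000  # 2 FLOPs per shader per clock (FMA)
        gpix = rops * ghz                  # pixel fill rate through the ROPs
        print(f"{name}: {tflops:.1f} TFLOPS shader, {gpix:.0f} Gpix/s fill")
    # Shader throughput jumps ~21%, but pixel fill is basically flat --
    # consistent with the ROPs being the bottleneck at 4K.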
 
There are statements from both NVIDIA and the developer that they are not using any NVIDIA GPU-specific PhysX features in Project CARS. So, given those two public statements, I would tend to take their word for it over the word of an anonymous poster on a website. Looking at the data, yes, AMD is slower. That doesn't necessarily indicate it was done intentionally, which is where I take issue with your statement. AMD and NVIDIA GPUs handle workloads differently, and both have strengths and weaknesses. There are cases where AMD outperformed NVIDIA - I believe this was the case with Far Cry 4 on release, and that was a GameWorks game. I don't believe that the developer intentionally crippled AMD performance.

You actually believe press statements? Let me guess: you believe Nvidia didn't lie about the 970 being 3.5/0.5? It was just a press mistake.

I am sorry, but when it comes to games like Project Cars (hell, even Dirt 3/Showdown), those games are skewed toward one company's video cards.

The Witcher 3 and Tomb Raider are games that bench evenly and fairly IMO.
 
You actually believe press statements? Let me guess: you believe Nvidia didn't lie about the 970 being 3.5/0.5? It was just a press mistake.

I am sorry, but when it comes to games like Project Cars (hell, even Dirt 3/Showdown), those games are skewed toward one company's video cards.

The Witcher 3 and Tomb Raider are games that bench evenly and fairly IMO.
It depends on the circumstances. NVIDIA really had no reason at all to comment on this, since the game was developed by a third-party developer, so any perceived bias would have been the fault of the developer and not NVIDIA. If they came out, lied, and were caught, it would be massively negative vs just shutting up. I don't believe they would be so dumb as to do that.

Plus, the developer is an unbiased third party. The devs have commented on the issue several times.

Context is everything.
 
I am sorry, but when it comes to games like Project Cars (hell, even Dirt 3/Showdown), those games are skewed toward one company's video cards.

The Witcher 3 and Tomb Raider are games that bench evenly and fairly IMO.

Very well said. Project Cars is a joke of a game to be included in a test suite. The next time there is a game which plays much faster on AMD GPUs and is also graphically demanding, I will be looking at whether hardocp includes it in their test suite. The next 6 months have a lot of AMD GE titles: Star Wars Battlefront, Ashes of the Singularity, Hitman, Deus Ex: Mankind Divided, and Mirror's Edge Catalyst. hardocp will have to walk the talk about choosing the most popular and demanding games. :rolleyes:
 
I don't understand the driver fuss. I've never had AMD driver issues except launch weeks for a new OS. I've been running nothing but CrossFire setups since my old pair of X1600 XTs I put together so I could play LOTRO on my old 1280x1024 monitor with decent settings.
 
A lot of readers here are missing an important point. The assumption that the latest driver from AMD or nVidia is the right one for you buys a lot of people trouble that reading the vendor documentation (and to some extent sites like [H]ardOCP) would let you avoid.

To be fair, web sites face difficulties that you don't. One of the things that [H]ardOCP does well is testing to see if performance improvements really do increase your enjoyment in game. But no one enjoys driver hell, so why don't people read the appropriate vendor's page?

If you try to do that with AMD, you discover that the 15.8 beta drivers used in the article have been replaced by the 15.9 beta drivers. The currently recommended drivers are 15.7.1 for most setups, and there is specific advice not to use 15.9 beta with Windows 8, or with Windows 8.1 or 10 in 32-bit mode. Would [H]ardOCP have had any problems with 15.7.1? I don't know, but at least some of the problems they hit may not occur in 15.7.1. Beta drivers are drivers with known problems, put out both so game manufacturers and others can check for compatibility, and so that some people who need something not available in the current released drivers can get early access. (For example, a newly released game may need a bug fix found only in the beta.)

My most recent purchase was an R9 380. I installed the (recommended) 15.7.1: no problems. Why not 15.7? I don't know, and don't care. AMD doesn't recommend it for the R9 380 or R9 285, and that's good enough for me.

Fair enough point.
However, when it comes to AMD drivers, the beta can just as easily solve issues with new games as create them... and they might fix one game while breaking another. Hence the term "driver hell". I sure as fuck am not going to be uninstalling and reinstalling drivers all the time so I can swap between playing Far Cry 4 vs whatever the hell else I want to play. Seriously, AMD needs to get their shit together. They've laid off so many people in R&D that I'm worried about the company's future. The driver dept is either understaffed (with the talented guys) or staffed full of interns. AMD driver hell for me started with a 9700 Pro and lasted 6 months. It was so bad that I bought an nVidia 5800 hairdryer and didn't even care that it was slower, I just wanted it to work.

If I upgrade to a 4K display, I'll probably go to CrossFire. And yes, I'll stay with AMD/ATi video cards, as I have for years, because paying attention to their website has saved me lots of (potential) agony.

I'm glad AMD has worked out for you. But you will fall into the fanboy category if you are not evaluating all the options each generation.

Intel integrated graphics might make some magic leap and work just fine for half the price. AMD might make some awesome drivers where CrossFire is smooth. Nvidia might make a GPU so fast that some weird time-warping (as opposed to normal warping) of the space-time continuum happens, in which time runs faster, giving a free "OC" effect of 1000%, setting all sorts of records, and blowing the pixels out of our LCDs it's so fast.
 
I don't understand the driver fuss. I've never had AMD driver issues except launch weeks for a new OS. I've been running nothing but CrossFire setups since my old pair of X1600 XTs I put together so I could play LOTRO on my old 1280x1024 monitor with decent settings.
It all comes down to the games you play. In general I have had a better experience with NVIDIA SLI than CrossFire.
 
Project Cars is also a game that is on the list to possibly get a DX12 patch in the future, raising its importance for future GPU gameplay on Windows 10. I will be sure to test it when that patch comes out. This makes the game a little more relevant.
 
Without meaning to derail the thread any more, I'd like to know from the people who are claiming bias in game selection (which I call complete horseshit on): what mainstream games that are relevant to a large number of gamers today would you like to see tested?
 
How much longer can AMD afford to put out these terrible products? The AMD fan base can only stay ignorant for so long....
 
Well, the vast majority of video cards sold are in the sub $350 range, and AMD is a lot more competitive there with the R9 390 and R9 390X. It just makes the rest of their lineup look worse when the >$500 products are not competitive. I think Fury needs a price cut.
 
That was a very interesting review, the best I've read in a while. Loved the different data points, including the one with GameWorks on and off where AMD went off-the-charts high :D. It kinda hints at/maybe proves the handicap of having a closed-source code library. Also, using those data points to compare cards may skew the final findings unless all games use GameWorks.

In Far Cry 3 (I know you guys tested Far Cry 4), many had the same experience with CFX. When I tried it, lo and behold, stutter city. One setting in the game made it butter smooth. I don't recall exactly what it is called, but it was pre-rendered frames ahead; I set it to zero and CFX worked like a champ (could have been 1, but I believe it was zero that made a huge difference). I wonder if FC4 has this same setting. I don't have that game yet, next sale maybe.

Another thing I've found that can smooth out gameplay was limiting the frame rate to slightly faster than my monitor (60Hz); it smoothed out the game rather well - maybe something else to tinker with in CCC. Then hit AMD for making it this complicated to get a good CFX experience.
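For anyone wondering why capping helps: it trades peak fps for even frame delivery. Conceptually it's just this (a minimal Python sketch of the idea only - not how CCC or the driver actually implements it):

    import time

    CAP_FPS = 62                  # slightly above a 60Hz refresh, as described above
    FRAME_BUDGET = 1.0 / CAP_FPS  # seconds allowed per frame

    def capped_loop(render_frame):
        # Deliver frames at a steady cadence instead of as fast as possible.
        while True:
            start = time.perf_counter()
            render_frame()
            leftover = FRAME_BUDGET - (time.perf_counter() - start)
            if leftover > 0:
                # Sleeping off the unused budget evens out frame times,
                # which is what hides the CFX micro-stutter.
                time.sleep(leftover)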

Still, this was the best review I've read with current-generation multiple cards, by far, and there is much to study from it. Also, moving up to the rarer 3-card+ category, I just can't help but believe the 390X will shine rather well (recommend comparing three 390Xs to 980 SLI, Fury X CFX, and Titan X SLI, since cost-wise it is cheaper in the end for the cards).
 
CrossFire and SLI tend to not scale very well past two cards, plus you run into even more multi-GPU driver issues with 3+ cards. It's usually not worth it unless you have so much money you don't care.
 
I don't get the stuttering in the 3 games. Why are they so bad? Since AMD is not using CrossFire bridges anymore, and CrossFire runs through PCIe, could the problem be on the mobo chipset side? I know that benchmarking requires the best platform for the test, but could you try to see if the stuttering is still there on an AMD high-end platform? (Yea, I know, "AMD high-end" :p)
 
I have an AMD rig: FX-9590, 290X/290 CFX. Unfortunately I don't have any of the games except BF4. But for example, Shadow of Mordor: the "canned" benchmark in the game gives some great numbers for scaling using CFX, except when you play it, what [H] describes in this review is, well, very similar. It jerks and stutters, yet the FPS look great, but really the playability is shit. Fortunately it plays well with a single card at 3440x1440, virtually maxed out.

Other games like Tomb Raider (awesome game!), Crysis 3, Far Cry 3, etc. are really good in CFX. The key here is not the FPS but the real experience - this, I believe, really sets [H] apart from the rest.

Now, Kyle (or was it Brent?) did indicate that how you select the games can influence who looks better in the performance/game experience, and thus the conclusions. I just wish [H] could switch the games up more between reviews: cover 4-6 games, but give similar card reviews some different games. For example, one review has Project Cars, the next Dirt Showdown; one with Witcher, the next with Shadow of Mordor - something like that. I would think it would be more fun, some unexpected differences would show up - maybe a surprise game thrown in each review (maybe even an indie). The same game set does get boring, at least for me a little. I want more data to get a broader picture, in other words, to rule out isolated cases.
 
CrossFire and SLI tend to not scale very well past two cards, plus you run into even more multi-GPU driver issues with 3+ cards. It's usually not worth it unless you have so much money you don't care.

Well, from the other sites, albeit with "canned" benchmarks, Fiji scales virtually perfectly. As in, if one card gets 30fps, a 2nd card gets 55fps, a 3rd card 80fps, etc. Now, if you run into memory limitations, it would be interesting to see what happens.
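"Virtually perfectly" is worth quantifying. Running the quick math on those fps figures (which are just the illustration above, not measured data):

    # Scaling efficiency vs. perfect linear scaling, using the example numbers above
    fps = [30, 55, 80]  # 1, 2, and 3 cards
    for n, f in enumerate(fps, start=1):
        print(f"{n} card(s): {f} fps -> {f / (fps[0] * n):.0%} of linear")
    # 2 cards land at ~92% and 3 cards at ~89% of perfect scaling --
    # excellent by multi-GPU standards, even if not literally perfect.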
 
Also, moving up to the rarer 3-card+ category, I just can't help but believe the 390X will shine rather well (recommend comparing three 390Xs to 980 SLI, Fury X CFX, and Titan X SLI, since cost-wise it is cheaper in the end for the cards).


I like that idea. When CrossFire is working well, it works well for a 3rd card. With 8GB, it would be a nice option at around $1300 for the cards. However, I think you would need three of the regular 390 cards at $1200 total. That extra $100 easily gets eaten up by the massive power supply needed, as well as a motherboard that supports a 3rd PCIe card.
 
Now, Kyle (or was it Brent?) did indicate that how you select the games can influence who looks better in the performance/game experience, and thus the conclusions. I just wish [H] could switch the games up more between reviews: cover 4-6 games, but give similar card reviews some different games. For example, one review has Project Cars, the next Dirt Showdown; one with Witcher, the next with Shadow of Mordor - something like that. I would think it would be more fun, some unexpected differences would show up - maybe a surprise game thrown in each review (maybe even an indie). The same game set does get boring, at least for me a little. I want more data to get a broader picture, in other words, to rule out isolated cases.
IMHO doing this really hurts the continuity of reviews and the comparability of results. Using a relatively stable selection of games, particularly within a GPU generation, makes it easier to read consecutive reviews and get a better idea of the state of GPU competition.

So, if the original GTX 980 review has similar games to the Fury X review, despite the intervening time period, it makes it easier to really get an idea of what I can expect from the 980, and compare that to early results from the Fury X. If you mix up the games too often, you can't get that meta-view.

And the other thing is that there just aren't that many games out there worth benching. Too many are just console ports that cannot truly be used to differentiate between modern GPUs. Shadow of Mordor is a good example. Sure, it's a good-looking game, but for actually stressing a graphics card, when you turn up the settings, all it does is yank the texture res way up. That's a known value: VRAM capacity and VRAM capacity only. It's not otherwise doing anything interesting with SSAO, AA, etc. A good game for benchmarking is one that hammers the GPU on all fronts so that it identifies relative shortcomings in each card. Running a game like SoM will not do that: it will just tell you what you already know, i.e. VRAM capacity.
 
I fixed your quote, and this has (for the most part) been fixed by AMD and a high-end CPU.

Low-end CPUs still tank performance with AMD cards in the game. It is 100% CPU and driver related.

That's great. Reviews from a few months ago showed AMD tanking on review systems that had overclocked Haswells and Ivy Bridge-Es.
 
Can we PLEASE stop the crying about the games that are included in the test suite? They include games based on popularity. Right now there aren't a lot of new and/or popular games that are AMD-sponsored, other than Battlefield 4 and Dragon Age: Inquisition. DiRT Rally is AMD-backed but is still in early access. The next major AMD-backed game to release is Deus Ex: Mankind Divided. There just aren't as many AMD-backed titles as there are NVIDIA-backed titles. Read into that what you will.

And Project CARS DOES NOT use GPU PhysX so it doesn't matter if you have an AMD GPU or NVIDIA GPU. https://www.reddit.com/r/pcgaming/c...eworks_project_cars_and_why_we_should/crc3ro1

Every single game in their review except for Witcher 3 I have zero desire to play, after trying them all for a few hours. Dying Light is good to show what happens at memory limits, but it's shocking why it needs so much memory.

If it was based on popularity you'd still see Skyrim in there, plus WoW, LoL, CoD, etc.

Project Cars? No. I'm surprised that's even been added after their developers went full-on derp: "It used to work with AMD cards we tested, but we haven't tested with them for months or bothered to use newer drivers." Zero respect for developers that test on year-old drivers then claim to be innocent when issues are brought up.

Can't wait for the new games to start coming out though, Doom level building is just nostalgic and awesome!
 
Ya know, we just use games.

We aren't going to start cherry picking games, that'd be biased.

If you have a problem with a developer's decision in the implementation of technology in a game, take it up with them.

Our goal is to relate the gameplay experience of games to you using video cards. You can make your own decision which video cards best fit the bill for your needs.

If we tried to sort out games that everyone claims are "biased" there would be no games to use. Would you rather we leave new games out and never add them just because developers are integrating certain technologies into them? Think about what you are asking.

Try to look at games from an outside perspective: they are what they are. We evaluate them as they are and relate that experience to you, and which video cards provide the best experience for the money. It's really that simple, and to put anything more on it is just overthinking the whole thing.
 
I think a lot of what people are perceiving as negative "bias" in game selection comes from an extension of thinking of games as "benchmarks" used to benchmark video card performance.

That is not how we think about it. Games to us are just games. Our goal is to relate the gaming experience of the game to you, using video cards, and let you know how performance and IQ are and which card provides the best experience versus how much it costs versus other cards at similar prices. We aren't benchmarking video cards, we aren't using games as tools to find out which video card is faster, or how one video card does better at a certain 3D feature like tessellation; no, that isn't what we are testing. What we do is simply play games, and let you know how those games perform.

Try not to think about the game as a benchmark, but rather, just a game that people enjoy playing for entertainment. We want to help you decide which video card is going to provide you the best entertainment in that specific game.
 
Is this the first time you guys used Project Cars in your benches? Why add it now? :confused:

We have gone back to our original intentions in evaluating games and have now made 6 games standard in reviews again, versus 5 in the past couple of years. Project Cars was just added as a new game; I have instructed Grady and David on its use, and it will be used in all reviews moving forward.

This game is graphically intense, will probably get a DX12 patch, and is forward-looking in terms of 3D graphics features and options on the PC. This is a very PC-oriented game in its features. It is one of the best racing simulators in terms of visual quality I have ever seen; it is quite stunning at max settings and takes a fast video card solution to sustain the high framerates needed with max settings. It has been a popular game in its genre.
 
It doesn't matter what games you use; rabid fanboys will find a way to nitpick it into oblivion, otherwise they will never feel like they made the right choice on hardware.

I just sold my Fury, but I bought it out of simple curiosity to test it myself and enjoy it for what it is. I knew the general performance going in, and did not try to delude myself about what it could and could not do. Too many people get so damn emotionally invested in a graphics card company, it's really kind of sad.
 
Well, from the other sites, albeit with "canned" benchmarks, Fiji scales virtually perfectly. As in, if one card gets 30fps, a 2nd card gets 55fps, a 3rd card 80fps, etc. Now, if you run into memory limitations, it would be interesting to see what happens.
Yes, Fury does seem to scale extremely well and that meshes with what I have seen with AMD cards using XDMA. However, you still run into driver issues, and usually, but not always, there are more problems with 3+ card setups than 2 cards.
 
Fair enough point.

I'm glad AMD has worked out for you. But you will fall into the fanboy category if you are not evaluating all the options each generation.

I take your point, but don't miss mine. I did have a CrossFire setup with dual 6950 cards. The R9 380 was because my air conditioner couldn't keep up. As long as I was running just one card, why not get a (much) faster but lower-powered card and decide if I needed a second one? (Not for now. I have two 1920x1200 displays, but use only one for gaming.)

Anyway, since I tend to play just one game for months, it was worth reading the web sites, and AMD's CrossFire profiles, to find out what I should do. I do not upgrade to a new driver or CrossFire profile unless I know specifically what I expect to get from it. Of course, it means that I need to know a lot more about AMD cards and drivers, which is why I tend to stay with one vendor. Why did I switch from nVidia to ATi? The 4850 was just too good a bargain to pass up. I originally planned to go CrossFire if necessary, but it wasn't necessary with a 1600x1200 monitor. (Why did I go CrossFire later? The games were stressing the graphics much harder, and I had a chance to get a used card for $100. ;-)

If you install new drivers from nVidia or AMD without knowing how they will work in your system and with the games you play, you get what you deserve.

On the other hand, there is Microsoft. I upgraded this machine to Win10, and the USB drivers didn't work. Dug out a PS/2 mouse and keyboard, and Win10 is turning off all the USB drivers (three different ones), won't turn them back on, and says they are up to date. So this is being written in Win7 Pro...
 
[H] did an excellent trifire review of the 295X2 with a 290X. Trifire scaled excellently, while going to the 4th GPU showed little gain in most cases. Not to mention quadfire with 295X2s requires insane power, while trifire could get away with a 1200-watt PSU. Sorry, no link; too hard to do on my phone. Trifire 390 would be just as good, and the 8GB of RAM would only help.
 
If I'm spending that kind of money to begin with, I'm buying two 980 Tis for only another $150! That would have been the real comparison.
 
As if you need anyone to tell you, but keep doing what you're doing, [H].

I buy new GPUs to play new games, not to run synthetic benchmarks or old crap like WoW.
 
CrossFire and SLI tend to not scale very well past two cards, plus you run into even more multi-GPU driver issues with 3+ cards. It's usually not worth it unless you have so much money you don't care.

Numerous articles I've read indicate that 3 cards is actually the sweet spot: not for scaling performance but for smoothness.
 
Numerous articles I've read indicate that 3 cards is actually the sweet spot: not for scaling performance but for smoothness.

The sweet spot for smoothness is obviously a single GPU, provided you can get away with it at your resolution in your titles.
 
Those AMD drivers. :eek:

I'd love to see what the charts would look like if AMD had some NVidia driver people working for them. If there's anything that has ever kept NVidia looking good in the eyes of the public, it would definitely be their software side of the house.


I remember when Vista came out and Nvidia cards were crashing and burning, while irate fans were creating "Fcuk Nvidia!" websites and talking about getting lawyers. That was the most epic driver-related meltdown I ever saw.

Not that Nvidia drivers aren't often better (and they should be, since I see so many games flashing the Nvidia name in their intros). AMD drivers do have a history of underperforming. Alas, I've bought from red and green depending on who seems to be getting things right the best at the time.

One thing I don't do is get caught up with 1400W PSUs and $1K video cards to get tangled up in constant SLI/CF headaches - all to play the same mediocre games. PC gaming is hurting itself with all the hype over ridiculous overbuilds to get ego tickles and extra frames nobody can even see. Making kids feel they need $2000 in hardware to shoot zombies and dragons just makes consoles etc. look that much better.
 
We have gone back to our original intentions in evaluating games and have now made 6 games standard in reviews again, versus 5 in the past couple of years. Project Cars was just added as a new game; I have instructed Grady and David on its use, and it will be used in all reviews moving forward.

This game is graphically intense, will probably get a DX12 patch, and is forward-looking in terms of 3D graphics features and options on the PC. This is a very PC-oriented game in its features. It is one of the best racing simulators in terms of visual quality I have ever seen; it is quite stunning at max settings and takes a fast video card solution to sustain the high framerates needed with max settings. It has been a popular game in its genre.

The Project Cars engine is built around PhysX. AMD has no way to run PhysX on the GPU, thus their drivers get slowed down due to PhysX calculations hogging the CPU.

Why you would pick this game for a review I have no idea... it's not popular and no one I know on Steam plays it (100+ people).
 
The Project Cars engine is built around PhysX. AMD has no way to run PhysX on the GPU, thus their drivers get slowed down due to PhysX calculations hogging the CPU.

Why you would pick this game for a review I have no idea... it's not popular and no one I know on Steam plays it (100+ people).

100+ people? Really!? Such a sample size. :rolleyes:

Ask yourself this: Are people who own AMD cards playing Project Cars? Are people buying GPUs playing Project Cars? Is Project Cars one of the newest, top-end (graphically) racing games for PC? Yes? Well then, I'd say that it's a viable title to use. Don't you think that a Project Cars player looking to upgrade his card would be interested to see how AMD cards perform? And if it's garbage, is that [H]'s fault or Project Cars'/AMD's?

Furthermore, think about what you are saying: You want [H] only to test games that perform well on both platforms. WHAT THE HELL GOOD WILL THAT DO?! You'd never test the newest, hardest games (graphically) and all your charts would show equal performance.
 
The Project Cars engine is built around PhysX. AMD has no way to run PhysX on the GPU, thus their drivers get slowed down due to PhysX calculations hogging the CPU.

Why you would pick this game for a review I have no idea... it's not popular and no one I know on Steam plays it (100+ people).

Rizen mentioned on page 4 that Project Cars doesn't use GPU PhysX and provided a source, so your argument is invalid.

[H] staff have explained several times why Project Cars was included. Go back and read the thread.
 
Felt I should post this here since I did mention the (new) 15.9 beta: http://support.amd.com/en-us/kb-articles/Pages/AMD-Catalyst-15.9-Beta-Memory-Leak.aspx

A little more information (which is all anyone has): the graphics memory will fill with junk. In some games it is not a problem - but the game does show as using all of the available memory. Apparently in some browsers, if you resize the window often enough, you can produce a crash.

Do I expect this problem to be found and fixed fairly quickly? Yeah, genuinely new bugs can usually be traced to a (small) change made in the new driver.

But again, it may make sense for [H]ardOCP to use beta drivers. You and me? Stay with the stable, endorsed driver (15.7.1) for now.
 