AMD R9 Fury X benchmarks leaked, significantly faster than the GTX 980 Ti at 4K

"Upcoming Titles with GameWorks Technologies

Assassin’s Creed: Unity | HBAO+, TXAA, PCSS, Tessellation
Batman: Arkham Knight | Turbulence, Environmental PhysX, Volumetric Lights, FaceWorks, Rain Effects
Borderlands: The Pre-Sequel | PhysX Particles
Far Cry 4 | HBAO+, PCSS, TXAA, God Rays, Fur, Enhanced 4K Support
Project CARS | DX11, Turbulence, PhysX Particles, Enhanced 4K Support
Strife | PhysX Particles, HairWorks
The Crew | HBAO+, TXAA
The Witcher 3: Wild Hunt | HairWorks, HBAO+, PhysX, Destruction, Clothing
Warface | PhysX Particles, Turbulence, Enhanced 4K Support
War Thunder | WaveWorks, Destruction"

http://www.geforce.com/whats-new/tag/nvidia-gameworks

No more hearsay, here are the facts. The game uses GameWorks.


I'd think a higher percentage of AAA PC games will use AMD tech considering that's what's on the consoles.
 
Yeah that looks to be fair enough.
However, as I stated in my last post, PCars works fine with AMD on Windows 10.
It looks like the problem is the single threaded AMD video driver on Windows 7/8

Yeah, you're correct, single-threaded performance is what's killing AMD here. Since Bulldozer, AMD has bet on multi-core/multi-thread usage, but it hasn't come to fruition yet.
Win 10 with DX12 will be interesting for sure.
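To illustrate what "single-threaded driver" means in practice, here's a minimal sketch (plain C++, not actual driver or D3D code; the thread counts and names are made up for illustration): when every draw call has to funnel through one lock, adding more game threads doesn't help, which is roughly the DX11-era situation being described. DX12-style per-thread command lists are meant to avoid exactly that funnel.
Code:
#include <chrono>
#include <iostream>
#include <mutex>
#include <thread>
#include <vector>

std::mutex driver_lock;      // stands in for the single driver submission path
long long submitted = 0;     // total "draw calls" that made it through

// Hypothetical draw-call submission: every call serializes on the one lock.
void submit_draw_calls(int calls) {
    for (int i = 0; i < calls; ++i) {
        std::lock_guard<std::mutex> lk(driver_lock);  // all threads queue up here
        ++submitted;                                  // pretend the driver does its work
    }
}

int main() {
    const int threads = 4, calls_per_thread = 100000;
    std::vector<std::thread> workers;
    auto t0 = std::chrono::steady_clock::now();
    for (int t = 0; t < threads; ++t)
        workers.emplace_back(submit_draw_calls, calls_per_thread);
    for (auto& w : workers) w.join();
    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                  std::chrono::steady_clock::now() - t0).count();
    std::cout << submitted << " draw calls funneled through one lock in "
              << ms << " ms\n";
}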
 
Good Lord, what have I done?
I just want new benchmarks for PCars, because how is a 290 getting a lower avg fps than my 670? I had no idea this would turn into a GameWorks battle; I don't even care about that. I just want to know if the Fury X can pump out 60 fps and not get the halved frame rates the 200 series is getting. Let's focus on the details that actually matter...
 
Good Lord, what have I done?
I just want new benchmarks for PCars, because how is a 290 getting a lower avg fps than my 670? I had no idea this would turn into a GameWorks battle; I don't even care about that. I just want to know if the Fury X can pump out 60 fps and not get the halved frame rates the 200 series is getting. Let's focus on the details that actually matter...

I guess it will depend on whether or not the card can brute-force it more than the lack of driver efficiency holds it back.




PS: There is no GameWorks in Project Cars. Except for PhysX, which pre-dates GW, just about everything was developed in-house.
 
I believe the actual name is now "GameWorks PhysX", as it states on the official GameWorks website.
That being said, non-HW-accelerated PhysX is in hundreds of games with no problems.

The label is meaningless; GTA V has TXAA, which is undeniably a GameWorks feature, yet GTA V is not a "GameWorks game", and of course it runs fine on AMD hardware.
 
I should have gone to bed earlier last night...

PS: There is no GameWorks in Project Cars. Except for PhysX, which pre-dates GW, just about everything was developed in-house.

- Project Cars is not a GameWorks product. We have a good working relationship with nVidia, as we do with AMD, but we have our own render technology which covers everything we need.
- The Madness engine runs PhysX at only 50Hz and not at 600Hz as mentioned in several articles
- The Madness engine uses PhysX for collision detection and dynamic objects, which is a small part of the overall physics systems
http://www.pcgamer.com/project-cars...s_source=steam&ns_linkname=0&ns_fee=0

Can we just say it's debatable and leave it at that now?
What's important is: will Project Cars get half the fps (seriously, my 670 over a 290X, wtf is going on?) on the Fury X, like the previous gen has been suffering?
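On the 50 Hz point above: a fixed physics tick doesn't have to match the render rate at all. Here's a minimal fixed-timestep loop sketch (plain C++, illustrative names only, nothing from the Madness engine) showing a 50 Hz physics/collision update running independently of however fast frames are rendered:
Code:
#include <chrono>
#include <iostream>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double physics_dt = 1.0 / 50.0;    // fixed 20 ms physics/collision tick (50 Hz)
    double accumulator = 0.0;
    int physics_steps = 0, frames = 0;
    auto prev = clock::now();

    while (frames < 120) {                    // pretend we render 120 frames
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - prev).count();
        prev = now;

        // Run however many fixed 20 ms physics steps the elapsed time allows;
        // rendering is free to run at whatever frame rate the GPU manages.
        while (accumulator >= physics_dt) {
            ++physics_steps;                  // stand-in for the collision/dynamics update
            accumulator -= physics_dt;
        }

        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // fake ~60 fps render
        ++frames;
    }
    std::cout << frames << " frames rendered, " << physics_steps << " physics steps\n";
}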
 
I guess even if you prove people wrong with facts, they will stick with their story. Project Cars, as I linked above, has GameWorks and is even considered a GameWorks game on the Nvidia website. I mean, if Nvidia says it is a GameWorks game, it must be a GameWorks game.
 
I should have gone to bed earlier last night...



http://www.pcgamer.com/project-cars...s_source=steam&ns_linkname=0&ns_fee=0

Can we just say it's debatable and leave it at that now?
What's important is: will Project Cars get half the fps (seriously, my 670 over a 290X, wtf is going on?) on the Fury X, like the previous gen has been suffering?

I found the thread where they tested the Windows 10 AMD drivers on Windows 7.
If you want to have a go, it's here:
http://forums.guru3d.com/showthread.php?t=399591
 
You guys still going around and around in circles about Project Cars? The irony is not lost on me.
 
Who has a Fury X to test there?
I already know how bad it is on the 200 series; a new driver isn't going to fix half the frame rate.

We're all going to have to wait on the reviews I guess. Though I don't know if Project Cars will even be in review benchmarks. So maybe we'll have to wait until people can actually buy them and we get anecdotal feedback. But don't expect miracles. :)
 
Yeah, guess so. Obviously nothing else to leak since they didn't test it. Which probably doesn't bode well for that game, but I definitely want to compare the results once they are available.
 
Who has a Fury X to test there?
I already know how bad it is on the 200 series; a new driver isn't going to fix half the frame rate.

I thought you had an AMD card; it's not just a Fury X driver.

Windows 10 has been shown to run Project Cars on AMD hardware just fine.
The whole point of that thread is to use the Windows 10 threaded DLL on Windows 7 to get better performance in Witcher 3 and Project Cars.

Oh well, I tried :)
 
Oh, right.
I did have a 290X 2 months ago, but it had heat issues and had to be returned. Figured I'd just wait at that point.
Especially with those results...

PCars, I mean. Other games were fine...
 
So Fury X is faster than 980 Ti at 4K, but the question is, does it also deliver a stutter-free performance with that 4 GB of VRAM?

How does Fury X handle new textures having to be swapped in in the middle of gameplay because 4 GB cannot hold them all?
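For a rough sense of scale (every number below is an assumption for illustration, not measured data from any game): the render targets themselves only take a few hundred MB at 4K, so whether 4 GB is enough comes down almost entirely to the texture working set, and when that doesn't fit, textures have to stream over the PCIe bus mid-game, which is where stutter would come from.
Code:
#include <cstdint>
#include <iostream>

int main() {
    const std::uint64_t w = 3840, h = 2160, bpp = 4;   // 4K, RGBA8 (assumed)
    const std::uint64_t MB = 1024 * 1024;
    std::uint64_t backbuffers = 3 * w * h * bpp;       // assumed triple buffering
    std::uint64_t gbuffer     = 4 * w * h * bpp;       // assumed 4 full-res render targets
    std::uint64_t depth       = w * h * 4;             // 32-bit depth buffer
    std::uint64_t targets_mb  = (backbuffers + gbuffer + depth) / MB;
    std::uint64_t texture_mb  = 3000;                  // purely hypothetical streamed texture pool

    std::cout << "Render targets: ~" << targets_mb << " MB\n"
              << "Textures (assumed): " << texture_mb << " MB\n"
              << "Total: ~" << (targets_mb + texture_mb) << " MB of a 4096 MB card\n";
}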
 
So Fury X is faster than 980 Ti at 4K, but the question is, does it also deliver a stutter-free performance with that 4 GB of VRAM?

How does Fury X handle new textures having to be swapped in in the middle of gameplay because 4 GB cannot hold them all?

Don't know. Wait until the reviews come out and we see how it does.
 
Yeah, you're correct, single-threaded performance is what's killing AMD here. Since Bulldozer, AMD has bet on multi-core/multi-thread usage, but it hasn't come to fruition yet.
Win 10 with DX12 will be interesting for sure.

I am also very curious about Win 10's impact on my frame rate.
 
We're all going to have to wait on the reviews I guess. Though I don't know if Project Cars will even be in review benchmarks. So maybe we'll have to wait until people can actually buy them and we get anecdotal feedback. But don't expect miracles. :)

Bug Brent enough and maybe it'll get added to the benchmark list. ;)
 
So Fury X is faster than 980 Ti at 4K, but the question is, does it also deliver a stutter-free performance with that 4 GB of VRAM?

How does Fury X handle new textures having to be swapped in in the middle of gameplay because 4 GB cannot hold them all?

Why does it matter? No HDMI 2.0, no care. Pretty much a requirement for 4K right now.
 
Why does it matter? No HDMI 2.0, no care. Pretty much a requirement for 4K right now.

If you're a peasant who is using a 4K HDTV, you're in a minuscule group of PC gamers compared to those using a 4K monitor with DisplayPort.
 
Holy hardware angst, Batman. Look, if you have a niche goal like 4K TV gaming, buy the hardware that supports that goal. If you're too cheap to buy a monitor that supports DP, you're probably not shelling out top dollar for a flagship video card. Please just quit the moaning and complaining.
 
Less than a 5 fps difference in most games is not significant... a 30% increase over a 980 Ti would be considered significant.
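For scale, assuming a ~60 fps result as the baseline purely for illustration: 5 fps is roughly 5/60 ≈ 8%, while a 30% uplift over a 980 Ti doing 60 fps would be about 78 fps.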

This is a great example of a member looking for people to click on a thread. Worthless.
 
HDMI 2.0 will be a non-issue once the custom Furys hit the market. God willing they are full 4096 SP chips, otherwise AMD is basically abandoning a performance tier for all 4K TV users.
We still have to wait another three weeks for those cards to come out, for a feature that should probably be standard by now, considering a lot of their old models use it.

AMD is already telling SFF users with 4K TVs (living rooms, hello?) to go screw themselves. Weird move.
 
HDMI 2.0 will be a non-issue once the custom Furys hit the market. God willing they are full 4096 SP chips, otherwise AMD is basically abandoning a performance tier for all 4K TV users.
We still have to wait another three weeks for those cards to come out, for a feature that should probably be standard by now, considering a lot of their old models use it.

AMD is already telling SFF users with 4K TVs (living rooms, hello?) to go screw themselves. Weird move.

I think they mentioned an adapter is coming out in the summer. That's not telling 4K TV users to screw themselves.

BUT it's still freakin odd AMD didn't go HDMI 2.0; if the adapter works, though, it is a non-issue.

If someone is going to spend $1500+ on a TV and $650 on a video card, they can spend another $50-100? (who knows the price) on the adapter... that's IF AMD doesn't provide one.
 
AMD should stop giving consumers excuses to buy their competition.
If someone is on the fence, they'd rather buy a 980 Ti than gamble with adapters with the Fury X.
 
AMD should stop giving consumers excuses to buy their competition.
If someone is on the fence, they'd rather buy a 980 Ti than gamble with adapters with the Fury X.

I look at it this way: AMD wasn't worried about the 4K TV gaming crowd. Why? It's a very small niche market right now.

Now, would the Fury make a badass mITX/mATX video card for front-room gaming? Oh fuck yeah, especially the Nano, so it really is a head-scratcher why they didn't add HDMI 2.0. (Although I would rather get a nice low-input-lag TV that's 1080p and use VSR/DSR, which is what I do with my 46-inch large-format monitor.)

But the adapter (they say) will be released during the summer, so that very small niche of people who PC game on TVs can still buy a Fury if need be. AND if AMD provides the adapter with the card (which would be VERY VERY VERY SMART if they added it to the Fury Nano), all is solved.

All speculation, of course.
 
There is also the fact that DisplayPort can do both DVI and HDMI 2.0. In a case like that, it's simply smarter to have more of the versatile connection than niche ones.

If you need HDMI 2.0, get the converter when it comes.
 
I think they mentioned an adapter is coming out in the summer. That's not telling 4K TV users to screw themselves.

BUT it's still freakin odd AMD didn't go HDMI 2.0; if the adapter works, though, it is a non-issue.

If someone is going to spend $1500+ on a TV and $650 on a video card, they can spend another $50-100? (who knows the price) on the adapter... that's IF AMD doesn't provide one.

But if the adapter adds $100-$125 to the total cost of entry (not to mention latency), it's an issue. And people will just buy a 980 Ti, or a $329 970, or a $199 960. AMD is leaving money on the table by not bothering to tie their shoelaces before entering the ring here.

It's also very naive to keep referring to TV gaming as a "very small niche". It's only increasing, and people tend to buy with future-proofing in mind.
 
But if the adapter adds $100-$125 to the cost of entry (not to mention latency), it's an issue.

I don't think so, for people willing to shell out over $2k for a TV and video card. It's not like the 4K TV gaming crowd is cheap. Especially if you want a GOOD 4K 60 Hz TV for gaming with low input lag.

But yeah, who knows the price of the adapter, or if AMD will provide one. Too many unknowns right now.
 
I look at it this way: AMD wasn't worried about the 4K TV gaming crowd. Why? It's a very small niche market right now.

Now, would the Fury make a badass mITX/mATX video card for front-room gaming? Oh fuck yeah, especially the Nano, so it really is a head-scratcher why they didn't add HDMI 2.0. (Although I would rather get a nice low-input-lag TV that's 1080p and use VSR/DSR, which is what I do with my 46-inch large-format monitor.)

But the adapter (they say) will be released during the summer, so that very small niche of people who PC game on TVs can still buy a Fury if need be. AND if AMD provides the adapter with the card (which would be VERY VERY VERY SMART if they added it to the Fury Nano), all is solved.

All speculation, of course.

You're right, as I am not dropping money on 4K gaming yet...
 
But if the adapter adds $100-$125 to the total cost of entry (not to mention latency), it's an issue. And people will just buy a 980 Ti, or a $329 970, or a $199 960. AMD is leaving money on the table by not bothering to tie their shoelaces before entering the ring here.

It's also very naive to keep referring to TV gaming as a "very small niche". It's only increasing, and people tend to buy with future-proofing in mind.

I could understand the argument for picking the 980 Ti due to the need for an additional adapter, but a $329 970 or a $199 960 will definitely not be driving content at 4K 60 Hz anyway, so isn't the entire HDMI 2.0 argument moot if you want to include those other cards?
 
Why would AMD want to put HDMI 2.0 on the card just to sell more cards?
I don't think they are making the cards to make money...
 
I could understand the argument for picking the 980 Ti due to the need for an additional adapter, but a $329 970 or a $199 960 will definitely not be driving content at 4K 60 Hz anyway, so isn't the entire HDMI 2.0 argument moot if you want to include those other cards?

I guess it depends on the games people typically play, which will obviously vary wildly. Sure, a 970 isn't going to drive Witcher 3 at 4K60, but I've played many different games just fine at 4K60 on the living-room TV, driven by a single GTX 970.

Of course the lack of HDMI 2.0 won't affect everyone, and of course it might seem like it's being blown out of proportion, but that's mostly just because there's not much else to talk about until the 24th. In three more days I reckon the HDMI 2.0 chatter will die down and new squabbles will arise.
 
I am strongly considering a 970 because it includes HDMI 2.0; however, if the new AMD cards can use a passive adapter to the same effect, then I would consider a 390X instead. Though I returned my 4K TV and I'm not sure if I'm going to replace it anytime soon.
 
I am strongly considering a 970 because it includes HDMI 2.0; however, if the new AMD cards can use a passive adapter to the same effect, then I would consider a 390X instead. Though I returned my 4K TV and I'm not sure if I'm going to replace it anytime soon.

Or you can get a 290X with 8 GB, slap the 390X drivers on it, and remove the 2 and replace it with a 3.
 