AMD R9 Fury X Benchmarks Leaked, Significantly Faster Than the GTX 980 Ti at 4K

I am strongly considering a 970 because it includes HDMI 2.0; however, if the new AMD cards can use a passive adapter to the same effect, then I would consider a 390X instead. Though I returned my 4K TV, and I'm not sure if I'm going to replace it anytime soon.
So you don't have a UHDTV and you are worried about HDMI 2.0?
Also, that 970 won't see 60 fps at 4K, so it would be a moot point.
 
So you don't have a UHDTV and you are worried about HDMI 2.0?
Also, that 970 won't see 60 fps at 4K, so it would be a moot point.

Big tires = HDMI 2.0
The rest = GTX 970 at 4K

maxresdefault.jpg
 
Why does it matter? No HDMI 2.0, no care. Pretty much a requirement for 4K right now.

DP is a requirement for 4K. It is also a requirement for multi-monitor 4K FreeSync. That's more important than some lame 4K TV setup, and it addresses a larger market.
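For rough context on why the thread keeps coming back to HDMI 2.0 vs. DisplayPort, here is a back-of-the-envelope bandwidth check. This is only a sketch: the link capacities are the commonly cited post-encoding data rates, not figures taken from this thread, and blanking overhead is ignored.

```python
# Rough check: why 4K @ 60 Hz wants HDMI 2.0 or DisplayPort 1.2.
# Link capacities are the commonly cited max data rates after 8b/10b encoding;
# blanking intervals are ignored, so the real requirement is slightly higher.

def video_data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s, ignoring blanking."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

needed = video_data_rate_gbps(3840, 2160, 60)  # ~11.9 Gbit/s for 8-bit RGB

links_gbps = {
    "HDMI 1.4": 8.16,
    "HDMI 2.0": 14.4,
    "DisplayPort 1.2": 17.28,
}

print(f"4K @ 60 Hz, 8-bit RGB needs ~{needed:.1f} Gbit/s")
for link, capacity in links_gbps.items():
    verdict = "enough" if capacity >= needed else "not enough"
    print(f"  {link} ({capacity} Gbit/s): {verdict}")
```

Which is why HDMI 1.4 cards top out at 4K30 (or 4K60 only with chroma subsampling), while HDMI 2.0 and DP 1.2 can drive 4K60 at full 8-bit RGB.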
 
I didn't know 4k was such a big deal before this thread (other than for photos and productivity).

I've been using a Dell 3007WFP (2560x1600) almost since it came out. Every year was supposed to be the year that single cards would steamroll games at 2560x1600 @ 60 Hz, then a new game came out... then SLI was mandatory and still can't run things maxed out at 60 fps in a lot of cases. Witcher 3 is a great example.

Now it seems like the same marketing gimmicks with 4k.

The benchmarks posted at the start of this thread show FPS in the 30s in some newer games, with settings turned down. Who wants to play on Medium at 30 fps? 30 fps is 30 fps; I don't care how high the resolution is.

What does it take to run Witcher 3 or Crysis 3 or AC5 on Ultra at 4K? Triple 980s, or triple Tis/Fury Xs? What about when next year's games come out? Time to plop another $2k on video cards?

Could be I am out of touch, but I don't see 4K becoming more mainstream until the consoles make the jump and high-power single cards or midrange SLI can tackle the issue.
 
Could be I am out of touch, but I don't see 4K becoming more mainstream until the consoles make the jump and high-power single cards or midrange SLI can tackle the issue.

You are probably right. However, for some reason I don't feel like there will be a next-gen series of consoles, if for no other reason than the rate at which smartphone tech progresses; five years from now, phones could dwarf what is already in current consoles by a fair margin.

That said, next year's video cards could be a monstrous jump in performance due to the massive shrink from 28nm to 14/16nm, assuming TSMC and GloFo can get up and running with processes suitable for complex chips like GPUs, plus HBM 2.0 and DX12.
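For a rough sense of how big that shrink is on paper, a naive scaling estimate (a sketch only; "14/16 nm" FinFET node names are partly marketing labels, so real density gains came in well below this):

```python
# Naive density scaling for a 28 nm -> 14/16 nm jump.
# Transistor density scales roughly with the square of the linear shrink,
# but FinFET node names don't map cleanly to feature size, so treat this
# as an optimistic upper bound rather than a prediction.

def naive_density_gain(old_nm, new_nm):
    """Density gain if dimensions shrank literally from old_nm to new_nm."""
    return (old_nm / new_nm) ** 2

for new_node in (16, 14):
    gain = naive_density_gain(28, new_node)
    print(f"28 nm -> {new_node} nm: ~{gain:.1f}x transistor density (naive)")
```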
 
Could be... or they could be massively smaller with much lower power consumption and only a slight performance increase, which is the way the market is going.

I know it hurts me as a desktop enthusiast, but I think it's cool that laptop GPUs are less and less cut down. Hopefully phones follow.
 
The benchmarks posted at the start of this thread show FPS in the 30s in some newer games, with settings turned down. Who wants to play on Medium at 30 fps? 30 fps is 30 fps; I don't care how high the resolution is.

Could be I am out of touch, but I don't see 4K becoming more mainstream until the consoles make the jump and high-power single cards or midrange SLI can tackle the issue.

Not really sure what benchmarks you are looking at, but midrange SLI cards run 4K at about 80% maxed out. I think I have three things basically turned off in GTA V and am pegged at 60 fps in the city and the low 50s in the countryside (grass basically off, extended viewing distance only halfway, no post FX). All of the other 20+ sliders are at max settings.

Honestly, a 980 Ti at 4K plays Crysis 3 and GTA V better than my high-end rig handled the equivalent games in 2008. A GTX 295 (the fastest card available at the time), a new i7-920, and a 1920x1200 monitor couldn't max out GTA IV or Crysis 1 back then. FWIW, the rigs cost about the same; I think my 4K monitor was maybe $200 more than what I paid in 2007-2008, but after inflation probably very similar.

I just looked, too: nobody actually runs AA at 4K. The whole point of 4K is not having to run AA, yet those benchmarks do, and that is why the framerates are so low.
 
Yeah, I'm running a Dell 3415W on a single 290, and for the most part I'm running games at 75% details and it's fine. I had a 30" 2560x1600 before this, and the single 290 ran that all right too.

While it's certainly nice to have everything turned up, simply removing things like AA can make a world of difference, and it still looks a lot nicer at higher resolutions than 1080p with all the candy turned on, imo.

A 980 Ti or Fury would be fine for me with all the details turned on, I think; I'm happy with 40-60 fps. I'm getting that now with GTA V (although Witcher is pretty taxing).
 
GPU makers really know how to market their stuff. It appears that ULTRA with 16x AA, etc. is considered the minimum setting now. Remember back when HIGH would kill your frame rate and ULTRA brought everything to a standstill? We were gaming at 720p, where AA was absolutely needed. Now we're putting 16x MSAA on 4K, which is nine times the resolution of 720p.

Just when Ultra became achievable at 1080p, they threw in GameWorks. GPU makers are doing their thing and people are just buying into it. How else are they going to sell dual- and triple-GPU SLI/CF setups?
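The pixel-count math behind that comparison, for what it's worth (taking "4K" to mean 3840x2160 UHD, as the benchmarks in this thread do):

```python
# Pixel counts relative to 720p; 4K UHD works out to 9x, not quite 10x.
resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K UHD": (3840, 2160),
}

base = resolutions["720p"][0] * resolutions["720p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>7}: {pixels:>10,} pixels ({pixels / base:.2f}x 720p)")
```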
 
The benchmarks posted at the start of this thread show FPS in the 30s in some newer games, with settings turned down. Who wants to play on Medium at 30 fps? 30 fps is 30 fps; I don't care how high the resolution is.

I agree. All the benchmarks showed was how useless 4K is. I get the feeling that the whole 4K thing has been a big leap backwards in gaming. It used to be that we wanted eye-candy features cranked as high as they'll go, at high frame rates, to call the performance acceptable. With 4K, people now get all worked up over medium settings that can barely eke out 30 fps. Really? In a sane world we'd call that a massive fail.
 
I agree. All the benchmarks showed was how useless 4K is. I get the feeling that the whole 4K thing has been a big leap backwards in gaming. It used to be that we wanted eye-candy features cranked as high as they'll go, at high frame rates, to call the performance acceptable. With 4K, people now get all worked up over medium settings that can barely eke out 30 fps. Really? In a sane world we'd call that a massive fail.

Have you seen the 4K gameplay or are you saying this from your imagination?
 
Some of the custom 390X boards probably have HDMI 2.0. Better check.

Nope, HDMI 2.0 needs to be enabled in the GPU itself; you can't just add a chip to it. DVI will definitely be added to a lot of Fury cards because it's supported in the driver and GPU.
 
Have you seen the 4K gameplay or are you saying this from your imagination?

Been there, tried that. Switched to 1440p 144 Hz G-Sync. Never looked back. 4K isn't ready for gaming. For watching videos or other work it's fine, but for gaming it doesn't belong yet.
 
Been there, tried that. Switched to 1440p 144 Hz G-Sync. Never looked back. 4K isn't ready for gaming. For watching videos or other work it's fine, but for gaming it doesn't belong yet.

Same here. 4k gaming just isn't there yet.
 
Same here. 4k gaming just isn't there yet.

+1

That's why I am staying at 1080p and 1440p. I crave my fluid, fast FPS. So strapping these "4K" cards into my rig and gaming at <=1440p, I am cruising with full eye candy.
 
Am I missing something, or are these benchmarks kinda meh? The GTX 980 Ti and Fury X are pretty much the same price for very little variation in performance. Thought the Fury X was going to be something amazing, like the "Titan X Killer" or along those lines. Seems like a choice of personal preference now?
 
Am I missing something, or are these benchmarks kinda meh? The GTX 980 Ti and Fury X are pretty much the same price for very little variation in performance. Thought the Fury X was going to be something amazing, like the "Titan X Killer" or along those lines. Seems like a choice of personal preference now?
It is "meh", it should have been around 10% faster than the Titan X according to early math.
Now it looks like it's going to trade blows across the board.
 
I still love my 3440x1440 curved display. It looks beautiful, it's 34" wide, and I use DSR when I have extra performance to burn. Not unnecessarily brutal on the PPI. Almost like surround without all the bezels. For immersion, I have to think a 34" 21:9 beats a 16:9 4K. The only thing better, I would think, is VR.
 
It is "meh", it should have been around 10% faster than the Titan X according to early math.
Now it looks like it's going to trade blows across the board.

The TechReport podcast a week back hinted that it would trade blows, so I'm not surprised. There will likely be situations where the Fury is 10% ahead, but it'll be highly dependent on the game and settings used.
 
TIL "meh" = "this card is about as fast as this other card, but it's costs $400 less"
 
TIL "meh" = "this card is about as fast as this other card, but it's costs $400 less"

Looks like it's going to trade blows with the Ti, honestly, at $100 to $150 in savings, reference vs. reference. That's fine with me.
 
TIL "meh" = "this card is about as fast as this other card, but it's costs $400 less"
You mean the same price?
I posit the following:

Do you want an air-cooled 980 Ti or a water-cooled Fury X for the same price?
Are AMD's drivers worth the "free" AIO water-cooler?
How well does the Fury X OC? (20-25% performance gain required)

After pondering these questions myself, I can't help but feel this is a failed launch for AMD.
Fury might be in a good spot... We'll see. Fury X is not.

9BY0QuL.png
 
Never understood the hate for AMD's drivers. I was using nVidia cards for a while, then jumped over to 6950 Crossfire. Never noticed any real difference in the "quality" of the drivers, even though I was apparently using the worst combination ever: AMD and Crossfire.

Yeah, I'm fine with AMD's drivers.

Just amusing to me that the prevailing attitude seems to be "If any card from AMD doesn't completely dominate the card from nVidia then it's a complete fail". I mean, have that attitude if you want. I'll just be over here figuring out how to get the most performance with my money, and if AMD is 10% faster at the same or better price, that's a win in my book and I'll buy it.
 
Never understood the hate for AMD's drivers. I was using nVidia cards for a while, then jumped over to 6950 Crossfire. Never noticed any real difference in the "quality" of the drivers, even though I was apparently using the worst combination ever: AMD and Crossfire.

Yeah, I'm fine with AMD's drivers.

Just amusing to me that the prevailing attitude seems to be "If any card from AMD doesn't completely dominate the card from nVidia then it's a complete fail". I mean, have that attitude if you want. I'll just be over here figuring out how to get the most performance with my money, and if AMD is 10% faster at the same or better price, that's a win in my book and I'll buy it.
Based on past AMD release cycles, the regular Fury should be around $400. It's $550.
We're rightly disappointed with AMD. They've set their own bar and they failed to reach it this time.
 
Lol......Typical Nvidia fanboy........ Anyway, Nvidia fanboys should stick with Nvidia cards.

TaintedSquirrel shits on both sides of the fence, can't really accuse him of being a fanboy either way.

The number of revived and fresh accounts purely posting AMD shill material lately, though, is tremendous.
 
Lol......Typical Nvidia fanboy........ Anyway, Nvidia fanboys should stick with Nvidia cards.
I have my prejudices, but if the choice is "buy AMD for water" then I will go with Nvidia in a heartbeat since I don't like water-cooling, especially those cheap CLCs.

The Fury X's OCability and its 4 GB limitation remain to be seen. Those will be the deciding factors. If the FX hits a VRAM wall in games like GTA V and only manages a 10-15% OC, then it will be a flop.

Good, then don't buy it; stick with Nvidia. No one is forcing you to buy AMD cards.
Well, AMD was doing pretty well in recent years, but this is not impressive.
I'm more interested in the regular Fury models, anyway.

Depending on how the FX fares in benchmarks, I will end up with a 980 Ti or waiting for the Fury.
 
He is just a typical Nvidia fanboy......

Seems to me he's been genuinely interested in waiting for the Fury X to come out and, if it satisfied what he needed in a card from a feature/performance standpoint, he'd buy it. He's now disappointed and having a laugh about it. Don't see the issue. A guy with 50 posts, all in AMD threads, calling someone out as a fanboy is laughable.
 
OK, good for you. Glad you decided to go Nvidia.
I haven't decided anything yet.
The Fury X is simply not the "Titan Killer" we all expected, and thus considerations have to be made. If AMD's selling point is water, then I'm not interested in that.

AMD isn't competing as well on the value front anymore and that's a fact.
 
Lol......Typical Nvidia fanboy........ Anyway, Nvidia fanboys should stick with Nvidia cards.

He is right, though. If the Fury X is as good as a 980 Ti but can't overclock well, it is not a win for AMD. Why do you think they priced the card at $650 with an AIO?

As for Nvidia drivers, they drop support for older cards faster than AMD does. So you get great drivers to start, but in the long run AMD cards hold up better.
 