Fury X Benchmarked @ 4K (via AMD)

You can use an adapter if 2.0 is that important to you; if you don't want to hassle with an adapter:
Then don't buy it.
They won't miss you.

There is currently no adapter that does this. Check the Samsung thread. When one does come out, it will be ~100 bucks given prior pricing, and when used it will add additional lag.
 
So glad to see AMD alive and kicking. I knew those 128 ROPs and 512GB/s bandwidth would shine at 4K where raw shader performance isn't always king. I hope they sell the shit out of this card.
 

Me too, though unfortunately I might not be one to buy.
 
Looks like it will be 5-10% faster at stock settings than the 980 Ti, but GCN usually overclocks into the 1150-1200 MHz range, so it should end up very similar to a fully overclocked 980 Ti.
 
I imagine water will give the Fury X a slight edge, assuming both chips are equally capable.
Maybe higher binned as well.
 
I saw something that they are only HDMI 1.4? That's what the AMD rep said on overclock.net, at least.

If this can't do 4K 60 Hz through HDMI, it's kind of a deal breaker for me, as I want to get a 4K TV for gaming.
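For context, a quick back-of-the-envelope check explains the 60 Hz limit. This is a minimal sketch assuming the standard CTA-861 4K timing (594 MHz pixel clock), 8-bit RGB with no chroma subsampling, and the published maximum TMDS clocks (340 MHz for HDMI 1.4, 600 MHz for HDMI 2.0):

```cpp
#include <cstdio>

int main() {
    // 3840x2160 @ 60 Hz uses a 594 MHz pixel clock in the standard CTA-861 timing.
    const double pixel_clock_mhz = 594.0;
    const double bits_per_pixel  = 24.0;  // 8-bit RGB, no chroma subsampling
    const double needed_gbps = pixel_clock_mhz * bits_per_pixel / 1000.0;  // ~14.26 Gbit/s

    // Video payload each spec can carry at its maximum TMDS clock.
    const double hdmi_14_gbps = 340.0 * bits_per_pixel / 1000.0;  // ~8.16 Gbit/s
    const double hdmi_20_gbps = 600.0 * bits_per_pixel / 1000.0;  // ~14.40 Gbit/s

    std::printf("4K60 RGB needs   ~%5.2f Gbit/s\n", needed_gbps);
    std::printf("HDMI 1.4 carries ~%5.2f Gbit/s -> %s\n", hdmi_14_gbps,
                needed_gbps <= hdmi_14_gbps ? "enough" : "not enough");
    std::printf("HDMI 2.0 carries ~%5.2f Gbit/s -> %s\n", hdmi_20_gbps,
                needed_gbps <= hdmi_20_gbps ? "enough" : "not enough");
    return 0;
}
```

That is why HDMI 1.4 tops out at 4K 30 Hz (297 MHz pixel clock) for full RGB, and why a DP-to-HDMI 2.0 adapter is the only way to feed a 4K TV at 60 Hz from a card without an HDMI 2.0 port.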

I was going to buy the Fury X, but the cover looks like an ice cream sandwich.
This will make users hungry for ice cream every time they look at it...
Fury X leads to overeating and obesity....
Obesity leads to heart disease and cancer...

AMD Fury X... only buy it if you like cancer.
 
Looks promising. Fortunately I got a Panasonic 4K TV with DP, though I understand this year's model dropped the DP; anyway, looks like I may be saying goodbye to my old 770s. It does seem strange that a card touted for 4K doesn't have HDMI 2.0, though.
 

DVI was dropped, not DisplayPort.
 
I saw something that they are only HDMI 1.4? That's what the AMD rep said on overclock.net, at least.

If this can't do 4K 60 Hz through HDMI, it's kind of a deal breaker for me, as I want to get a 4K TV for gaming.

Not sure how well they will work but a few companies have been showing DP to HDMI 2.0 adaptors. If they work then why license HDMI ports on a card at all? Just give it 6 DP ports.
 
DVI was dropped, not DisplayPort.

He's talking about the latest Panasonic TVs.

I think Panasonic is the only TV manufacturer that has been incorporating DP. Every other vendor has been using HDMI... which sucks because DisplayPort is the superior interface and Sony's XBRs don't have them.
 
Not sure how well they will work but a few companies have been showing DP to HDMI 2.0 adaptors. If they work then why license HDMI ports on a card at all? Just give it 6 DP ports.

No options until recently, they're expensive, and they add lag.
 
Not sure how well they will work but a few companies have been showing DP to HDMI 2.0 adaptors. If they work then why license HDMI ports on a card at all? Just give it 6 DP ports.
Yeah, but they aren't out yet and there's no word on release. And if they're anything like the current active adapters, they're going to be rather expensive at $100+, and they will still have issues like the current dual-link DP-to-DVI adapters. At this point it's easier to spend $650 on a 980 Ti and not have to buy any extra cables or adapters. It's a bummer because I was really excited for this card too. Oh well, at least I won't have to deal with AMD Catalyst anymore.
 
For the last few weeks: "If Fury X is so fast, why doesn't AMD release benchmarks?"

Except nobody actually said that but cool strawman. It was more like why not lift the NDA if there's nothing to hide. But I realize they were probably buying time to tweak drivers, so all the reviewers could be on the same page.

If the [H] review mirrors these results, then they could have a winner, especially if AMD starts taking driver releases more seriously. Looks good for stock clocks.

Still need to know 1080p and 1440p performance and which card overclocks better. Will await the [H] review.
 
Except nobody actually said that but cool strawman. It was more like why not lift the NDA if there's nothing to hide.
I've been scouring the depths of the internet for months; trust me, plenty of people have said it.
The same sentiment here and here.

You can't lift the NDA on a card that nobody physically has in their hands, or at least until they're done testing it.
 
Oh please, everyone was crying for benchmarks, people can't even wait for a week.

Welps, look what I found: http://hardforum.com/showpost.php?p=1041647359&postcount=24

Well, the quoted post does have a point; they likely did tweak their clocks and drivers last minute to get neck and neck with a stock-clocked 980 Ti. Too bad for AMD that there are aftermarket 980 Tis already faster than that. It's good that they released benchmarks, but choosing to disable features in certain games still shows that AMD can't compete with everything enabled. GW may be an Nvidia initiative, but one would think AMD would've found a way to compensate at the driver level by now. Plus, having to sell a water-cooled card with HBM just to be able to compete against a GDDR5-equipped, air-cooled reference card can't be good for margins.
 

It's a 275W card. It doesn't REQUIRE liquid cooling.
 
GW may be an Nvidia initiative, but one would think AMD would've found a way to compensate at the driver level by now.

I don't think you understand how GW works... there are fundamental limits to how much performance you can squeeze out of the PC graphics stack when you're limited to driver-level optimizations only. By the time the D3D/GL call stream gets to the driver, there's little you can do to manipulate it for better performance.

Then again, this knowledge will fall on deaf ears, because neither you nor anyone from that playground cares anyway.
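To make that concrete: once a game ships only compiled shaders, the main lever a driver team has left is pattern-matching, i.e. recognize a known-slow shader and swap in a hand-tuned replacement. The sketch below is purely hypothetical (none of these names are AMD's actual internals); it just illustrates why that approach only helps for shaders the driver team has already seen and profiled:

```cpp
// Purely hypothetical sketch of driver-side shader replacement; not AMD's real code.
#include <cstdint>
#include <unordered_map>
#include <vector>

// Hash the incoming compiled shader bytecode so it can be used as a lookup key.
static uint64_t HashBytecode(const std::vector<uint8_t>& dxbc) {
    uint64_t h = 0xcbf29ce484222325ULL;                    // FNV-1a 64-bit offset basis
    for (uint8_t b : dxbc) { h ^= b; h *= 0x100000001b3ULL; }
    return h;
}

// Table of hand-tuned replacement shaders, filled in per title after profiling.
static std::unordered_map<uint64_t, std::vector<uint8_t>> g_replacements;

// Called when the game hands over a compiled shader: if the driver recognizes it,
// it can substitute an optimized version; otherwise it must run whatever shipped.
std::vector<uint8_t> SelectShader(const std::vector<uint8_t>& fromGame) {
    auto it = g_replacements.find(HashBytecode(fromGame));
    return (it != g_replacements.end()) ? it->second : fromGame;
}

int main() {
    std::vector<uint8_t> shipped = {0x44, 0x58, 0x42, 0x43};  // "DXBC" header stub
    // Without an entry in the table, the game's own shader is used unchanged.
    return SelectShader(shipped) == shipped ? 0 : 1;
}
```

Everything upstream of that, such as what the game asks the GPU to tessellate or draw in the first place, is outside the driver's reach, which is the point being made above.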
 
Shadow of Mordor on ultra still seems to run fine at 4K. Then again I'm sure some will be crying foul at average FPS so let's just wait for the [H] review.
 
I don't think you understand how GW works... there are fundamental limits to how much performance you can squeeze out of the PC graphics stack when you're limited to driver-level optimizations only. By the time the D3D/GL call stream gets to the driver, there's little you can do to manipulate it for better performance.

Then again, this knowledge will fall on deaf ears, because neither you nor anyone from that playground cares anyway.

Don't worry, some random guy from Reddit will figure out a relatively easy workaround within 48 hours while the AMD driver team drinks lattes and calls people out for their own shortcomings.
 
Deflecting and goalpost shifting, typical AMD fanboy tactics when faced with AMD driver shortcomings. What's next, personal insults?

I don't think you get it. Overriding tessellation values shouldn't be required. In fact, a workaround shouldn't be required at all.

Secondly, can you explain why the 960 was performing better than the 780 in that benchmark?

In fact, why are you even here commenting in AMD Flavor for a card you won't even purchase? You don't see me posting in nVidia Flavor about their products. You seem to love talking down all of AMD's cards, so what's the point of posting here?
 
Here it goes again! Thread meltdown! Fanboy testicles test!
 
I don't think you get it. Overriding tessellation values shouldn't be required. In fact, a workaround shouldn't be required at all.

Secondly, can you explain why the 960 was performing better than the 780 in that benchmark?

In fact, why are you even here commenting in AMD Flavor for a card you won't even purchase? You don't see me posting in nVidia Flavor about their products. You seem to love talking down all of AMD's cards, so what's the point of posting here?

Not talking about the 960 or 780. Talking about the AMD driver team, who had several weeks prior to release and

a) couldn't release drivers to fix the tessellation issue
b) couldn't even bother to figure out and release a temporary workaround
c) were likely too incompetent to even figure out there was a performance issue

Pick one. AMD chose d) blame Nvidia/CD Projekt Red
 

I'll pick e) TWIMTBP / GameWorks game, closed-source title; AMD is compromised from the start.
 
There really is no such thing as closed source when it comes to shaders; all shaders are compiled to ASM (which is readable) by the D3D compiler before they are sent to the drivers for final compilation to machine language. It is harder to read than high-level code and harder to optimize, for many reasons owing to ASM's nature, but it's doable; it just takes more time and money, both of which AMD is short of, I think.
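As a rough illustration of that point, here is a minimal sketch using the public D3DCompiler API on Windows (the shader source and file name are made up for the example): D3DCompile turns HLSL source into DXBC bytecode, and D3DDisassemble hands that bytecode back as readable shader ASM, which is essentially the form a driver receives at runtime.

```cpp
// Minimal sketch: compile a trivial HLSL pixel shader and dump its ASM.
// Assumes Windows with the D3DCompiler runtime; link against d3dcompiler.lib.
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>
#pragma comment(lib, "d3dcompiler.lib")

int main() {
    const char* hlsl =
        "float4 main(float4 pos : SV_Position) : SV_Target\n"
        "{ return float4(1.0f, 0.0f, 0.0f, 1.0f); }\n";

    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors   = nullptr;

    // Step 1: the D3D compiler turns HLSL source into DXBC bytecode.
    HRESULT hr = D3DCompile(hlsl, std::strlen(hlsl), "example.hlsl",
                            nullptr, nullptr, "main", "ps_5_0",
                            0, 0, &bytecode, &errors);
    if (FAILED(hr)) {
        if (errors) std::printf("%s\n", (const char*)errors->GetBufferPointer());
        return 1;
    }

    // Step 2: the bytecode can be disassembled back into readable shader ASM,
    // which is essentially what the GPU driver gets handed at runtime.
    ID3DBlob* asmText = nullptr;
    hr = D3DDisassemble(bytecode->GetBufferPointer(), bytecode->GetBufferSize(),
                        0, nullptr, &asmText);
    if (SUCCEEDED(hr))
        std::printf("%s\n", (const char*)asmText->GetBufferPointer());

    if (asmText) asmText->Release();
    if (errors)  errors->Release();
    bytecode->Release();
    return 0;
}
```

A driver team can read and pattern-match that disassembly even when a game ships no shader source; it's just slower and costlier than working from the original HLSL.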
 
There really is no such thing as closed source when it comes to shaders; all shaders are compiled to ASM (which is readable) by the D3D compiler before they are sent to the drivers for final compilation to machine language. It is harder to read than high-level code and harder to optimize, for many reasons owing to ASM's nature, but it's doable; it just takes more time and money, both of which AMD is short of, I think.

Or Nvidia could do it like all the non Nvidia sponsored games in my library. Stop making excuses for Nvidia's shitty coding practices. ;) If AMD drivers were so bad, why do the other 2,000 games I own run so well with my R9 290?
 
Christ, here come the fucking armchair engineers....

Yeah, not going to even bother replying anymore.

What's amusing is how they all refuse to acknowledge that their precious company keeps pulling the same tricks with tessellation that it has been playing since Fermi launched. They purposely ignore it because HURRDURR AMD sucks.

Extremetech covered it best:
AMD attempted to provide Warner Bros. Montreal with code to improve Arkham Origins performance in tessellation, as well as to fix certain multi-GPU problems with the game. The studio turned down both. Is this explicitly the fault of GameWorks? No, but it’s a splendid illustration of how developer bias, combined with unfair treatment, creates a sub-optimal consumer experience.

Under ordinary circumstances, the consumer sees none of this. The typical takeaway from these results would be “Man, AMD builds great hardware, but their driver support sucks.”

Source
 
Deflecting and goalpost shifting, typical AMD fanboy tactics when faced with AMD driver shortcomings. What's next, personal insults?

You pointed out drivers, so did he.

Was that dig on NVidia a little too close to home for you or something? Because no, personal insults are not next in a discussion about driver faults from either company...
 
Guys, there's a reason why AMD only holds 30% of the GPU market.

Their driver support does suck. I am talking from experience: using 2 x 290 TriX was a pain with Mantle. BF4 micro-stuttering and a host of other issues got me to shift to a single 290X.
 
Or Nvidia could do it like all the non Nvidia sponsored games in my library. Stop making excuses for Nvidia's shitty coding practices. ;) If AMD drivers were so bad, why do the other 2,000 games I own run so well with my R9 290?


Hey, I'm not saying it's right or wrong; nV made the code, so it's theirs by law. Now, if AMD wants to play the blame game, they should take a look at what they did with HL2; it's the same stuff, man. If AMD has the chance to do something like that, they do it too. And remember, the HL2 shader that caused performance issues on nV hardware at the time was fixed by a guy playing around with the shader. AMD didn't want to fix it for nV.

Sorry, at the time it was ATi.
 
People who might be mad = Titan X owners (going to be undercut by both)

People who should be happy to see all this performance from both brands at decent pricing = everyone else.
 