It's alright, let him get the Fury and have him realize it won't work with his monitors.

Is it though? Custom 980 Tis are already faster than most of these Fury X benches.
You can use an adapter if HDMI 2.0 is that important to you. If you don't want to hassle with an adapter:
Then don't buy it.
They won't miss you.
So glad to see AMD alive and kicking. I knew those 128 ROPs and 512GB/s bandwidth would shine at 4K where raw shader performance isn't always king. I hope they sell the shit out of this card.
I imagine water will give the Fury X a slight edge, assuming both chips are equally capable.

Looks like it will be 5-10% faster at stock settings than the 980 Ti, but GCN usually overclocks into the 1150-1200 MHz range, so it should end up very similar to a fully overclocked 980 Ti.
I saw something saying they are only 1.4? That's what the AMD rep said on overclock.net, at least.
If this can't do 4K 60Hz through HDMI it's kind of a deal breaker for me, as I want to get a 4K TV for gaming.
Looks promising. Fortunately I got a Panasonic 4K TV with DP, though I understand this year's model dropped the DP; anyway, looks like I may be saying goodbye to my old 770s. It does seem strange that a card touted for 4K doesn't have HDMI 2.0, though.
DVI was dropped, not DisplayPort.
Not sure how well they'll work, but a few companies have been showing DP-to-HDMI 2.0 adapters. If they work, then why license HDMI ports on a card at all? Just give it six DP ports.
Yeah, but they aren't out yet and there's no word on a release date. And if they're anything like the current active adapters, they're going to be rather expensive at $100+, and they'll still have issues like the current DP-to-dual-link-DVI adapters. At this point it's easier to spend $650 on a 980 Ti and not have to buy any extra cables or adapters. It's a bummer because I was really excited for this card too. Oh well, at least I won't have to deal with AMD Catalyst anymore.
For the last few weeks: "If Fury X is so fast, why doesn't AMD release benchmarks?"
Except nobody actually said that, but cool strawman. It was more like: why not lift the NDA if there's nothing to hide?

I've been scouring the depths of the internet for months; trust me, plenty of people have said it.
Except nobody actually said that, but cool strawman.
Oh please, everyone was crying for benchmarks; people can't even wait a week.
Welps, look what I found: http://hardforum.com/showpost.php?p=1041647359&postcount=24
Well, the quoted post does have a point: they likely did tweak their clocks and drivers at the last minute to get neck and neck with a stock-clocked 980 Ti. Too bad for AMD that there are already aftermarket 980 Tis faster than that. It's good that they released benchmarks, but choosing to disable features in certain games still shows that AMD can't compete with everything enabled. GameWorks may be an Nvidia initiative, but one would think AMD would've found a way to compensate at the driver level by now. Plus, having to sell a water-cooled card with HBM just to compete against a GDDR5-equipped, air-cooled reference card can't be good for margins.
GameWorks may be an Nvidia initiative, but one would think AMD would've found a way to compensate at the driver level by now.
I don't think you understand how GameWorks works... there are fundamental limits to how much performance you can squeeze out of the PC graphics stack when you're restricted to driver-level optimizations. By the time the D3D/GL call stream gets to the driver, there's little it can do to manipulate that call stream for better performance.
Then again, this knowledge will fall on deaf ears because neither you nor anyone from that playground cares anyway.
Don't worry, some random guy from Reddit will figure out a relatively easy workaround within 48 hours while the AMD driver team drinks lattes and calls people out for their own shortcomings.
It's not really their shortcoming when you had people with 780s getting outperformed by people who bought 960s.
Deflecting and goal post shifting, typical AMD fanboy tactics when faced with amd driver shortcomings. What's next, personal insults?
I don't think you get it. Overriding tessellation values shouldn't be required. In fact, a workaround shouldn't be required at all.
Secondly, can you explain why the 960 was performing better than the 780 in that benchmark?
In fact, why are you even here commenting in AMD flavor for a card you won't even purchase? You don't see me posting in Nvidia flavor about their products. You seem to love talking down about all of AMD's cards so what's the point of posting here?
Not talking about the 960 or 780. Talking about the AMD driver team having several weeks prior to release, and they:
a) couldn't release drivers to fix the tessellation issue
b) couldn't even be bothered to figure out and release a temporary workaround
c) were likely too incompetent to even realize there was a performance issue
Pick one. AMD chose d) blame Nvidia/CD Projekt Red.
I'll pick e) TWIMTBP / GameWorks game, closed-source title; AMD is compromised from the start.
There really is no such thing as closed source when it comes to shaders: all shaders are compiled to assembly (which is readable) by the D3D compiler before they are sent to the driver for final compilation to machine code. It's harder to read than high-level code, and harder to optimize for many reasons inherent to assembly's nature, but it's doable; it just takes more time and money, both of which AMD is short of, I think.
Christ, here come the fucking armchair engineers....
AMD attempted to provide Warner Bros. Montreal with code to improve Arkham Origins tessellation performance, as well as to fix certain multi-GPU problems with the game. The studio turned down both. Is this explicitly the fault of GameWorks? No, but it's a splendid illustration of how developer bias, combined with unfair treatment, creates a sub-optimal consumer experience.
Under ordinary circumstances, the consumer sees none of this. The typical takeaway from these results would be "Man, AMD builds great hardware, but their driver support sucks."
Or Nvidia could do it like all the non-Nvidia-sponsored games in my library. Stop making excuses for Nvidia's shitty coding practices. If AMD drivers were so bad, why do the other 2,000 games I own run so well on my R9 290?