AMD Fury X... Wait, no DVI?!

Amazing, with the number of these Korean 1440p monitors out there, that AMD would do this. I have one as well that is DVI-only ;/ No way I'd spend $100+ for an adapter. I'll just sit on what I have for a while and go SLI if I have to ;/

Well, fact is, there aren't that many when you consider the general population. Enthusiasts and the few who would buy these are few and far between compared to most people with computers, who will buy Dell, HP, and other brands' monitors with their computers.
 
Nobody wanted HBM...LOL. With that logic we would still be on DDR.

DDR reached its bandwidth limit. GDDR5 hasn't. Proof is that overclocking 980Ti's memory doesn't increase FPS. More bandwidth doesn't help frame rate if it wasn't bandwidth starved in the first place. And if 4GB doesn't hurt, why are some of the leaked benchmarks running at no AA and 0xAF? Marketers don't play games unless they have to.
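(For reference, here's a rough back-of-the-envelope comparison of the theoretical peak bandwidth of the two memory setups being argued about; the bus widths and per-pin rates are just the commonly quoted specs, so treat the result as approximate, not a measurement.)

```python
# Theoretical peak memory bandwidth in GB/s:
#   bandwidth = bus width (bits) * data rate per pin (Gbit/s) / 8
# Figures below are the commonly quoted specs (assumption, not measured numbers).

def peak_bandwidth_gb_s(bus_width_bits: int, rate_gbps_per_pin: float) -> float:
    return bus_width_bits * rate_gbps_per_pin / 8

fury_x_hbm = peak_bandwidth_gb_s(4096, 1.0)   # 4096-bit HBM1 @ 1 Gbps/pin
gtx_980_ti = peak_bandwidth_gb_s(384, 7.0)    # 384-bit GDDR5 @ 7 Gbps/pin

print(f"Fury X (HBM1):  {fury_x_hbm:.0f} GB/s")   # ~512 GB/s
print(f"980 Ti (GDDR5): {gtx_980_ti:.0f} GB/s")   # ~336 GB/s
```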
 
DDR reached its bandwidth limit. GDDR5 hasn't. Proof is that overclocking 980Ti's memory doesn't increase FPS. More bandwidth doesn't help frame rate if it wasn't bandwidth starved in the first place. And if 4GB doesn't hurt, why are some of the leaked benchmarks running at no AA and 0xAF? Marketers don't play games unless they have to.

Have you actually ever used AA in some of those games? In Battlefield it's almost completely useless and just costs a ton of performance. At 4K you're most likely going to run out of GPU power way before VRAM becomes an issue; plus, you go up to 4K so you don't have to use AA (higher resolution = fewer jaggies). Right now is the perfect time to introduce HBM. Why? Because 4GB is still good enough; why introduce it later, when/if 4GB won't be good enough anymore?
 
[Image: AMD-Radeon-R9-Fury-X-Overclock-performance.png]

:confused:
 
The configuration doesn't have to exclude DVI. Some partners might have it, some might not. Just find a model that has it or includes an adapter.
 

What are you trying to prove? That the 980 Ti is faster than the Fury in some games? VRAM doesn't have anything to do with that graph; if it did, the Fury would be in the single digits.
 
What are you trying to prove? That the 980 Ti is faster than the Fury in some games? VRAM doesn't have anything to do with that graph; if it did, the Fury would be in the single digits.

Do you work for AMD? You're a little too defensive.
 
With all this talk about DVI and HDMI 2.0 you guys are missing the real issue.
The Fury X has no VGA port. I know it's the flashy digital age for a lot of people, but many of us still prefer good old quality analog signals.

With their core customer base abandoned, I feel like AMD's sales are going to suffer here.
 
With all this talk about DVI and HDMI 2.0 you guys are missing the real issue.
The Fury X has no VGA port. I know it's the flashy digital age for a lot of people, but many of us still prefer good old quality analog signals.

With their core customer base abandoned, I feel like AMD's sales are going to suffer here.

VGA and DVI, yes I agree.

But HDMI 2.0? Your point fails on that one. ;)
 
With all this talk about DVI and HDMI 2.0 you guys are missing the real issue.
The Fury X has no VGA port. I know it's the flashy digital age for a lot of people, but many of us still prefer good old quality analog signals.

With their core customer base abandoned, I feel like AMD's sales are going to suffer here.

As much as I like AMD cards from a price/performance perspective, my next card will have to be Nvidia if AMD excludes VGA again.
 
Not to mention that any active adapter will also add lag, and 4K TV is no longer a small market segment when you can get a 4K 4:4:4 @ 60Hz 40" Samsung cheaper than any 32" 4K monitor.
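(Quick sanity check on why HDMI 1.4 can't carry that 4K 4:4:4 @ 60Hz signal while HDMI 2.0 can: the 594 MHz figure is the standard 4K60 pixel clock with blanking included, and the link limits are the spec ceilings, so this is an approximation rather than a cable-level analysis.)

```python
# Why HDMI 1.4 can't do 3840x2160 @ 60 Hz with full 4:4:4 chroma (8-bit),
# while HDMI 2.0 can. Numbers are the commonly quoted spec limits (assumption).

PIXEL_CLOCK_4K60_MHZ = 594   # standard 4K60 timing, blanking included
HDMI_1_4_LIMIT_MHZ   = 340   # HDMI 1.4 TMDS clock ceiling
HDMI_2_0_LIMIT_MHZ   = 600   # HDMI 2.0 TMDS clock ceiling

for name, limit in [("HDMI 1.4", HDMI_1_4_LIMIT_MHZ), ("HDMI 2.0", HDMI_2_0_LIMIT_MHZ)]:
    ok = PIXEL_CLOCK_4K60_MHZ <= limit
    print(f"{name}: limit {limit} MHz -> 4K60 4:4:4 {'fits' if ok else 'does not fit'}")
```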
 
I posted this in another thread. So I will copy/paste.

No serious gamer is worried about HDMI 2.0 for serious gaming on a 4K TV, which #1 will have horrible input lag and #2 most 4K TVs cannot even do 4:4:4 chroma.

I am 99% sure AMD will include a DisplayPort to DVI adapter.
 
What are you trying to prove? That the 980 Ti is faster than the Fury in some games? VRAM doesn't have anything to do with that graph; if it did, the Fury would be in the single digits.


That table shows Far Cry 4 at 4K with 2x MSAA, and the Fury X takes a 20% hit compared to when it's not using MSAA; the 4GB is hurting it.
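(To put a rough number on what MSAA alone asks for in memory at 4K: this is my own simplified estimate counting only a basic color and depth target. Real engines allocate several more surfaces and textures dominate total VRAM, so treat it as a lower bound, not an explanation of the whole hit.)

```python
# Crude estimate of the render-target memory MSAA asks for at 4K.
# Assumption: one RGBA8 color target + one 32-bit depth target, each with
# per-sample storage; real engines (deferred G-buffers, intermediates) use
# several more surfaces, and textures dominate total VRAM anyway.

WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4

def render_target_mb(samples: int) -> float:
    color = WIDTH * HEIGHT * BYTES_PER_PIXEL * samples
    depth = WIDTH * HEIGHT * BYTES_PER_PIXEL * samples
    return (color + depth) / (1024 ** 2)

print(f"No MSAA: {render_target_mb(1):.0f} MB")   # ~63 MB
print(f"2x MSAA: {render_target_mb(2):.0f} MB")   # ~127 MB
print(f"4x MSAA: {render_target_mb(4):.0f} MB")   # ~253 MB
```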
 
I wish my new truck came with a tape deck (DVI)! I for one am happy to see old tech get phased out. Next is getting the PS/2 ports eradicated. I hate seeing that ugly purple/green PS/2 port on my mobo; I'll overhaul again next tax season.

I can do it too, watch: "wai no floppy driv!" Am I doing it right? Except DVI isn't old tech; it's still relevant... And in any case there's no excuse for no HDMI 2.0. Hell, even the GTX 960 has HDMI 2.0.

In any case it's AMD's loss, not the loss of the person who needs DVI -- you guys don't seem to grasp that. This isn't going to force people to throw their perfectly working monitor out just to be able to get a card that's merely on par with the competing brand that still has DVI this generation.
 
SMAA = 45.31 fps
2x MSAA = 35.379 fps
w/ OC = 37 fps

Left column is Fury X, right column is Fury X with a core bump.

WhyCry: This chart has an error; the right column is not the 980 Ti but an overclocked Fury X. If you don't believe it, check the benchmark settings: the 980 Ti does not support Mantle.
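(Quick arithmetic check on the hit implied by those figures, assuming the chart numbers are accurate.)

```python
# Quick check of the performance hit implied by the figures above.
smaa_fps = 45.31
msaa_fps = 35.379

hit = (smaa_fps - msaa_fps) / smaa_fps
print(f"2x MSAA costs about {hit:.1%} vs SMAA")   # ~21.9%
```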
 
I posted this in another thread. So I will copy/paste.

No serious gamer is worried about HDMI 2.0 for serious gaming on a 4K TV, which #1 will have horrible input lag and #2 most 4K TVs cannot even do 4:4:4 chroma.

I am 99% sure AMD will include a DisplayPort to DVI adapter.

You were wrong then and you're wrong now. The Samsung is 4:4:4 and 20.7 ms in game mode. The future is big-screen 4K. It's glorious.
 
DDR reached its bandwidth limit. GDDR5 hasn't. Proof is that overclocking 980Ti's memory doesn't increase FPS. More bandwidth doesn't help frame rate if it wasn't bandwidth starved in the first place. And if 4GB doesn't hurt, why are some of the leaked benchmarks running at no AA and 0xAF? Marketers don't play games unless they have to.

That, plus the pre-launch defensive text from AMD about how devs could utilize VRAM better, has me a little worried. The thing hasn't even launched yet.

All I know is, if Star Citizen uses more than the VRAM you have available, it outright crashes. When I had a 980 I would run out of VRAM way before GPU horsepower. I'd rather not worry about that if I'm building a rig costing thousands of dollars.

Hopefully AMD hurries up with gen II HBM.
 
That, plus the pre-launch defensive text from AMD about how devs could utilize VRAM better, has me a little worried. The thing hasn't even launched yet.

All I know is, if Star Citizen uses more than the VRAM you have available, it outright crashes. When I had a 980 I would run out of VRAM way before GPU horsepower. I'd rather not worry about that if I'm building a rig costing thousands of dollars.

Hopefully AMD hurries up with gen II HBM.

Yeah, VRAM size is an issue if it depends on AMD drivers to optimize utilization. Unless there is something in the HBM that does it hardware-side, 4 GB is a limitation unless AMD suddenly starts pumping out driver updates frequently, and I wouldn't bet on that.
 
You were wrong then and you're wrong now. The Samsung is 4:4:4 and 20.7 ms in game mode. The future is big-screen 4K. It's glorious.

20 ms isn't glorious. It's downright horrible...

That isn't serious 4K gaming.
 
20 ms isn't glorious. It's downright horrible...

That isn't serious 4K gaming.

I doubt most serious twitch-shooter players are sitting in the living room, though. So it doesn't matter. 20 ms is perfectly fine for wandering around in The Witcher 3.

A lot of people enjoy relaxing and gaming on the couch in front of a big-ass TV. And with SteamOS and Steam Machines making the big push into the living room at the end of the year, AMD is going to regret the decision to omit HDMI 2.0 when people are making their living-room GPU buying choice.
 
I doubt most "serious" twitch shooter players are sitting in the living room. So it doesn't matter.

Exactly. They are using 4k monitors that use Display Port, Or 1440p144hz monitors with display port. Not 4k Tv's.
 
I doubt most serious twitch-shooter players are sitting in the living room, though. So it doesn't matter. 20 ms is perfectly fine for wandering around in The Witcher 3.

A lot of people enjoy relaxing and gaming on the couch in the living room. And with SteamOS and Steam Machines making the big push into the living room at the end of the year, AMD is going to regret the decision to omit HDMI 2.0 when people are making their living-room GPU buying choice.

You're right. It seems that twitch-game players believe they live in a bubble where everyone has to have 144 Hz, no-lag monitors. There is a world out there for RPG, action, tactical, and strategy game players who want graphical fidelity and casual FPS. The world doesn't revolve around twitch games.
 
I don't really know why people are so quick to defend this or accuse other people of "not being real gamers" or "not really being enthusiasts" because of the connection their monitor or TV has.

The fact of the matter is, this all could have been avoided by including HDMI 2.0 and a DVI port. If they truly don't plan to offer those connections, frankly it is shortsighted. And I'm saying this as someone who has an $800 1 ms 144 Hz 2560x1440 G-SYNC monitor with a DisplayPort connection. It doesn't impact me. But when I start to shop around for 4K TVs and receivers later in the year, I'm going to have to either shop from a VERY limited selection of receivers that have DisplayPort, or go NVIDIA. That's not a good position to put potential purchasers in. Especially since these cards are otherwise so well suited to SFF and HTPC machines.
 
I don't really know why people are so quick to defend this or accuse other people of "not being real gamers" or "not really being enthusiasts" because of the connection their monitor or TV has.

The fact of the matter is, this all could have been avoided by including HDMI 2.0 and a DVI port. If they truly don't plan to offer those connections, frankly it is shortsighted.

The HDMI 2.0 omission makes no sense. Not sure why AMD did not go that route. It is not a huge deal, but it is a head-scratcher.

The DVI port isn't a big deal, since it's considered a legacy port and AMD usually adds in a DisplayPort to DVI adapter.
 
The HDMI 2.0 omission makes no sense. Not sure why AMD did not go that route. It is not a huge deal, but it is a head-scratcher.

The DVI port isn't a big deal, since it's considered a legacy port and AMD usually adds in a DisplayPort to DVI adapter.
I understand it's a legacy port, and maybe on a high-end product it makes sense to target high-end monitors (I'm sure they are targeting users who have a FreeSync monitor, ultimately); it just seems strange given the continued popularity of DVI monitors with gamers. I don't feel like we are quite at the point where we can totally phase out DVI. My girlfriend is still playing on a 2560x1440 Korean IPS panel, which totally suits her, and I don't plan on replacing it or putting a $100 active adapter on it.
 
I understand it's a legacy port, and maybe on a high-end product it makes sense to target high-end monitors (I'm sure they are targeting users who have a FreeSync monitor, ultimately); it just seems strange given the continued popularity of DVI monitors with gamers. I don't feel like we are quite at the point where we can totally phase out DVI. My girlfriend is still playing on a 2560x1440 Korean IPS panel, which totally suits her, and I don't plan on replacing it or putting a $100 active adapter on it.

Well luckily AMD will probably provide you with one if you buy a Fury for her. So you will be set :)
 
Well luckily AMD will probably provide you with one if you buy a Fury for her. So you will be set :)

I'll eat a hat if AMD packs in an active DP to DVI adapter that supports 1440p 144Hz.

That's a 100 to 200 buck adapter.
 
I'll eat a hat if AMD packs in an active DP to DVI adapter that supports 1440p 144Hz.

That's a 100 to 200 buck adapter.

Luckily most 1440p/144Hz monitors have DisplayPort :) So no need for an active adapter.
 
No they don't, and they do not reach 144Hz either :)

Did you even READ the first page?

Here are some highlights:

Not only that - those active DP to dual-link adapters bork out over 75Hz or so.

I run one so I can drive an overclockable 120Hz Korean monitor from a DP output. The monitor does 120Hz fine over DVI, but with the adapter I can't get over 70Hz without the adapter crashing.

If there's no DVI on the Fury I'm not interested.



Given that a 1440p Korean overclockable PLS monitor is still the best gaming monitor out there, I'm sticking with it, and will wait for a video card with DVI.


Then there is the 4K TV issue... almost no TVs have DP, and you can't get 4K 60Hz over ANY converter.
 
I'm pretty sure you need an active adapter for 2560x1440 regardless of refresh rate, no?
 
DDR reached its bandwidth limit. GDDR5 hasn't. Proof is that overclocking 980Ti's memory doesn't increase FPS. More bandwidth doesn't help frame rate if it wasn't bandwidth starved in the first place. And if 4GB doesn't hurt, why are some of the leaked benchmarks running at no AA and 0xAF? Marketers don't play games unless they have to.

Other than the stuff we still don't know about HBM and its interaction with the on-GPU memory controller, there are tons more benefits beyond the silly bandwidth argument.

GDDR5 has essentially hit a stall where increasing clocks also increases board power usage considerably. Yes, there are benefits; no, AA doesn't really show them. HBM gives great advantages in board real estate and power use. It also makes cooling the card a trillion times easier, as you don't have to have a HUGE honking metal plate covering the entire card.

There is a ton of technology that has bettered the GPU landscape that no one asked for. HBM will be one of them, and soon both GPU makers will have their own version.

As for why some of the "leaked" benchmarks aren't running AA (other than the fake ones, of which there are literally thousands): it's to show off the greatest performance advantage. Nvidia currently has much more resources to run traditional AA, and right now it appears AMD will have a lot of resources to run post-processing AA. Those leaked benchmarks have no idea what the new AMD post-processing modes will be called, so they most likely just omitted them to begin with in order to maintain some sort of defense later.
 
I'm pretty sure you need an active adapter for 2560x1440 regardless of refresh rate, no?

Correct, as well as for 1080p 3D (i.e. 120Hz).

But almost none of the adapters work well past 60Hz, even the pricey ones that support 3D.
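(Rough sketch of why 2560x1440 already needs dual-link DVI, and hence an active adapter from DisplayPort, and why high refresh is even harder. The pixel clocks below are crude estimates with ~10% blanking overhead, not exact monitor timings; my understanding is the Korean panels hit 120Hz by pushing the DVI link past its official 330 MHz spec, which a spec-compliant adapter won't do.)

```python
# Rough pixel-clock estimate for 2560x1440 at various refresh rates vs the
# DVI link limits. The ~10% blanking overhead is a crude assumption, not an
# exact monitor timing.

SINGLE_LINK_DVI_MHZ = 165   # single-link DVI TMDS spec limit
DUAL_LINK_DVI_MHZ   = 330   # dual-link doubles it

def approx_pixel_clock_mhz(width: int, height: int, refresh_hz: int,
                           blanking_overhead: float = 0.10) -> float:
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

for hz in (60, 120, 144):
    clk = approx_pixel_clock_mhz(2560, 1440, hz)
    if clk > DUAL_LINK_DVI_MHZ:
        verdict = "beyond the dual-link spec (needs out-of-spec timings)"
    elif clk > SINGLE_LINK_DVI_MHZ:
        verdict = "needs dual-link DVI"
    else:
        verdict = "fits single-link DVI"
    print(f"2560x1440 @ {hz:3d} Hz ~ {clk:5.0f} MHz -> {verdict}")
```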
 