AMD Fury X... Wait, no DVI?!

I am shocked that in the middle of 2015 a brand new product from AMD would not support DVI and HDMI 2.0.

No DVI, no HDMI 2.0. They have killed this product for many of us. The options now are very narrow and limiting.

No DVI, No HDMI 2.0 .... No Thanks.
 
So I currently have a 30-inch Dell 2560x1600 and was planning on upgrading to a 40-inch Samsung 4K in the near future after buying the first Fury. Looks like there's no way for me to even use the card on my Dell, which I think is still one of the best screens around. What was AMD thinking? The 980 Ti is looking more and more tempting after having AMD cards for 5 years.
 

Doesn't your Dell have a DisplayPort connector? You can use that if it does. Maybe someone makes a DP -> dual-link DVI adapter.
 
I have the 3007WFP-HC with just a single DVI-D. I know the newer Dells have DisplayPort. Looks like they make adapters, but they are $100+ and people complain of lag. That pushes the card up out of what I want to spend. And an HDMI-to-DVI adapter won't work because it can't handle dual link; I think that's only good up to 1920x1200.
 
So.... anyone buying a new monitor for Fury X? lol.
I can't give up my 2560x1600 unless it's for a 4K Samsung, and neither of those is possible with this card, so I'll be going back to the green team I guess. My two 6950s have treated me decently, but I badly need an upgrade.
 
Ahhh, so it's OK to do away with one type of legacy port but not the other :rolleyes:

Again, there are a lot of enthusiasts who use the Korean 1440p panels, which only have a DVI port. Hell, go find the hundreds upon hundreds of pages on OCN. AMD is basically burning us all to do away with "legacy" all while keeping "legacy".

You're not an "enthusiast" if you're buying a $300 monitor, especially one with a single port.....
 
I can't give up my 2560x1600 unless it's for a 4K Samsung, and neither of those is possible with this card, so I'll be going back to the green team I guess. My two 6950s have treated me decently, but I badly need an upgrade.

Perfect for me because I could use a new monitor so no big deal. I like getting rid of old tech. I'll wait till later in the year though to see what cards come out. Also, dual 1440p monitors would be great for my Pro Tools boot.
 
What year is it again? I didn't know DVI was still a thing.

I guess people still haven't upgraded from QNIX/Catleap/Yamakasi/Shimians. :eek:
 
Just seeing the name QNIX I wouldn't look twice and would assume it's cheap and move on.
 
You're not an "enthusiast" if you're buying a $300 monitor, especially one with a single port.....

For gaming, I will buy a monitor with one port every day of the year. Why?
Low input lag.

Those Korean monitors have very low input lag and high refresh rates.
The amount of money spent doesn't mean anything when the monitor performs above what is currently available.
 
Maybe HDMI 2.0 isn't included because the GPU design was finished before the spec was ready? Or because AMD would rather push DisplayPort?

Me, I'd rather use a monitor than a TV for my computer display. Those 4k@60 TVs are nice-looking, but I prefer something specifically designed for computer usage/gaming.
 
LOL, search for a monitor, enter 1440p, sort by cheapest first.......... Everyone defending it has nicer stuff themselves. Anyway, yes, finally get rid of DVI!
 
Input lag has to be the most ridiculous thing people complain about when buying gaming monitors. I don't know why people chase better numbers they can't notice.
Most people would agree a can of beer adds more to your gaming experience than the reduced input lag going from an average monitor to a high-end model.
 
I need DVI for my Yamakasi. Can't buy an expensive 4k monitor just yet, and I'm not thrilled yet with any of the 120-144hz TN based panels.
 
Input lag has to be the most ridiculous thing people complain about when buying gaming monitors. I don't know why people chase better numbers they can't notice.
Most people would agree a can of beer adds more to your gaming experience than the reduced input lag going from an average monitor to a high-end model.

That, and also the gamer's age. Believe it or not, even an old fart like me still games at 50 years old. But at that age, 1, 2, or 5 ms of lag won't matter all that much :D
 
Yeah, even if you can see it, the human body can't respond that fast.

10ms is faster than the "O" in "Oh shit!".

Big-screen TVs tend to have much more lag, so it is more noticeable there. My 65-inch 4K has a 19ms response time on its special HDMI 2.0/PC port, which is supposedly pretty damn good for a 2014 model.
 
I'm sure we will get some better pricing later in the year on the 2K/4K stuff.

ASUS MG279Q Black 27" 144 Hz 4ms (GTG) WQHD widescreen 2560x1440 LED IPS panel, Adaptive-Sync (FreeSync), ergonomic high-performance monitor, pivot & height adjustable. $599.00 now; we'll see how the pricing is later this year. Not sure which monitor I want, but I will need to get two of them.
 
I'll be happy to upgrade once some decent freesync monitors come out.
 
No DVI support on the Korean enthusiast 1440p/120Hz end and no HDMI 2.0 support on the 4K/60Hz TV panel side seems like a terrible miscalculation.
 

They will probably include a DP-to-DVI adapter like they did before.

The lack of HDMI 2.0 isn't a HUGE deal to most; 4K TV gaming is a very small market.

Edit: I would like to add that I am shocked AMD did not include it with Fury.
 
OcUK nuked the whole thread :D Poor AMD representative; I hope they won't fire him.

If it was a mistake, the rep would have said so with a follow-up or edited post.
Nuking the thread means AMD is trying to limit the damage.

But the internet is already running with scissors.
 
They will probably include a DP-to-DVI adapter like they did before.

The lack of HDMI 2.0 isn't a HUGE deal to most; 4K TV gaming is a very small market.

Edit: I would like to add that I am shocked AMD did not include it with Fury.

Yes, but will that adapter work at 1440p@120Hz? Probably not...
 
DVI Single Link uses 19 pins and supports up to 1920x1200 at 60Hz, but can't transmit enough data to drive 2560x1600 or 2560x1440 at 60Hz, or 1680x1050 at 120Hz.

DVI Dual Link adds 6 more pins to the original 19 to support 2560x1600 and 2560x1440 at 60Hz, and 1920x1080 at 120Hz (the maximums are actually higher, apparently 2560x1440 at 144Hz; I'm copying and pasting from a much older post I made before such resolutions and refresh rates were available).

HDMI uses the same 19 pins as DVI Single Link for video, but supports higher resolutions and refresh rates by sending more data down those 19 pins.

HDMI can pass DVI Single Link through its 19 video pins, making conversion extremely simple with no data loss. However, HDMI cannot pass DVI Dual Link through, because it has no way of carrying the extra 6 pins that Dual Link requires. Therefore, when using a passive HDMI->DVI cable, you have the same resolution and refresh-rate restrictions as DVI Single Link.

DisplayPort also has 19 video pins; same deal as HDMI for passing through DVI.
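The single-link vs dual-link limits above come down to pixel clock: single-link TMDS tops out at 165 MHz, and dual link doubles the data pairs to reach an effective 330 MHz. Here is a back-of-the-envelope sketch of that check. The blanking overhead (roughly 160 extra horizontal pixels and 40 extra lines, in the spirit of CVT reduced blanking) is an assumption for illustration; real monitor timings vary.

```python
# Rough check of which DVI link type a video mode needs, based on the
# approximate TMDS pixel clock. Blanking overheads are assumed values
# (CVT reduced-blanking ballpark), not exact timings for any monitor.

SINGLE_LINK_MAX_HZ = 165_000_000  # single-link TMDS pixel clock ceiling
DUAL_LINK_MAX_HZ = 330_000_000    # dual link doubles the TMDS pairs

def pixel_clock(width, height, refresh, h_blank=160, v_blank=40):
    """Approximate pixel clock in Hz, including assumed blanking overhead."""
    return (width + h_blank) * (height + v_blank) * refresh

def link_needed(width, height, refresh):
    """Classify a mode as single-link, dual-link, or beyond DVI entirely."""
    clk = pixel_clock(width, height, refresh)
    if clk <= SINGLE_LINK_MAX_HZ:
        return "single-link"
    if clk <= DUAL_LINK_MAX_HZ:
        return "dual-link"
    return "beyond DVI"

if __name__ == "__main__":
    for mode in [(1920, 1200, 60), (2560, 1600, 60), (1920, 1080, 120)]:
        print(mode, link_needed(*mode))
```

Under these assumptions 1920x1200@60 lands around 155 MHz (fits single link), while 2560x1600@60 and 1920x1080@120 both exceed 165 MHz and need dual link, matching the limits described above. It also shows why a passive HDMI->DVI cable caps out at the single-link modes.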
 
Not only that: those active DP to dual-link DVI adapters bork out over 75Hz or so.

I run one so I can drive an overclockable 120Hz Korean monitor from a DisplayPort output. The monitor does 120Hz fine over DVI, but with the adapter I can't get over 70Hz without the adapter crashing.

If there's no DVI on the Fury I'm not interested.

Terrific post man, I agree with you completely. The fact that Fury may not have HDMI 2.0 is fucking embarrassing for AMD too.
 
I wish my new truck came with a tape deck (DVI)! I for one am happy to see old tech get phased out. Next is getting the PS/2 ports eradicated. I hate seeing that ugly purple/green PS/2 port on my mobo; I'll overhaul again next tax season.
 
Rofl, get off your high horse. People bought the QNIX because it was the only 1440p, 27", IPS-based monitor that could do 120Hz for YEARS, not because it was cheap; that was just a bonus. I'd also like to know how it was a knockoff of the XB270HU three years before it came out, considering that's the only monitor you can buy that does the same thing. :rolleyes:

No horses here, my friend. I am sorry I seem to have struck a sore spot with you, but whether it was the only thing on the market to fill that niche is irrelevant to the point that the Korean panels are cheap alternatives that cut corners and have no warranty compared to mainstream monitors. They are knockoffs because they use the same panels that are used in much higher-priced monitors, such as the Apple Cinema Display. In fact, most of those companies outright state that the panels are units that did not pass QC. Conversely, some or many of them do come in excellent condition with no dead pixels and little IPS glow.
I think the mainstream monitor manufacturers did not realize the demand for higher-refresh IPS/PLS panels until the Korean monitor craze among enthusiasts, and that is why we are now seeing more of them. Sure, you lose color accuracy and/or gamma, but to some gamers that is of little concern compared to refresh rate.

Calm down, big guy; the Korean panels are a great value if you are OK with their tradeoffs (no warranty and only DVI-D). Their higher-Hz panels typically do not have the extra connectors such as DP or HDMI, which are a little more future-proof, as you are now aware. The real tragedy with Fury is that it lacks HDMI 2.0.
 
I wish my new truck came with a tape deck (DVI)! I for one am happy to see old tech get phased out. Next is getting the PS/2 ports eradicated. I hate seeing that ugly purple/green PS/2 port on my mobo; I'll overhaul again next tax season.

Tape deck = VGA port. DisplayPort and HDMI 2.0 = MP3. CD player = DVI.

Your analogies need some work. The problem with AMD is not their technologies. The problem is that they have no idea what the current market wants. They push the wrong technology (HBM) when nobody asked for it, and leave off the ones that are needed (more than 4GB, HDMI 2.0). They just pushed the DVI and HDMI 2.0 users straight to Nvidia. Good job, AMD; you just lost another 5% market share. Congrats!
 
Nobody wanted HBM... LOL. By that logic we would still be on DDR.
 