AMD Fury X... Wait, no DVI?!

Eh, I use 3 DisplayPort to Dual-Link DVI adapters. They're not that expensive, especially if you buy them from eBay, and if you use dedicated USB power supplies for each adapter, they don't drop the signal (I use iPad chargers).
 
Eh, I use 3 DisplayPort to Dual-Link DVI adapters. They're not that expensive, especially if you buy them from eBay, and if you use dedicated USB power supplies for each adapter, they don't drop the signal (I use iPad chargers).

You run 96Hz, right??

Find me a way to get back to Fiji.
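
(For rough context on why 96Hz is near the edge, assuming the typical 2560x1440 Korean panel people run these adapters with: 2560 x 1440 x 96 works out to roughly 354 MHz of pixel clock before any blanking overhead, while dual-link DVI is nominally specified for 2 x 165 MHz = 330 MHz. Refresh rates in that range therefore rely on the adapter and monitor tolerating a somewhat out-of-spec pixel clock, which is exactly what the overclockable Korean 1440p panels are known for.)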
 
I'll be honest, you can't. Now, I know AIB partners will be able to make custom Furys (non-WCE), so there is a possibility they will add DVI ports.

It's pretty much a guarantee they will, most people still run DVI. That being said, they can't add an HDMI 2.0 port (I don't think)
 
I can do it too, watch: "wai no floppy driv!" Am I doing it right? Except DVI isn't old tech; it's still relevant. And in any case there's no excuse for no HDMI 2.0. Hell, even the GTX 960 has HDMI 2.0.

In any case it's AMD's loss, not the loss of the person who needs DVI -- you guys don't seem to grasp that. This isn't going to force people to throw out their perfectly working monitors just to get a card that's merely on par with the competing brand, which still has DVI this generation.

DVI was introduced in 1999. That's old tech last I checked, along with....

1999
January 4
Intel releases 366 and 400 MHz Celeron processors, priced at US$123 and US$158 each, respectively, in 1000-unit quantities. [1233.131] [1559]
 
Given that a 1440p Korean overclockable PLS monitor is still the best gaming monitor out there, I'm sticking with it, and will wait for a video card with DVI.

60Hz is pretty far away from the best gaming monitor. I am not into slide shows.
 
It's pretty much a guarantee they will, most people still run DVI. That being said, they can't add an HDMI 2.0 port (I don't think)

Nope. HDMI 2.0 has to be in the bowels of the chip, the architecture. Not something an AIB can just hack in.
 
DVI was introduced in 1999. That's old tech last I checked, along with....

1999
January 4
Intel releases 366 and 400 MHz Celeron processors, priced at US$123 and US$158 each, respectively, in 1000-unit quantities. [1233.131] [1559]

Semantics; by "not old" I meant not outdated and still relevant in display products today.

Regardless, people seem to be missing the bigger issue: all this silly elitism and condescension about DVI monitors isn't shaming anyone, it's simply costing AMD potential sales, since people aren't going to throw their monitors out just to get equal GPU performance from a different brand.
 
Wow, really lame. My monitor, a Samsung S27A950D (a 120Hz 1080p monitor), actually has 3 ports: a dual-link DVI port which works perfectly at 1080p@120Hz, an HDMI port which only does 1080p@60Hz, and a DisplayPort. The DisplayPort can do 1080p@120Hz, but has an issue where every time I alt-tab into or out of something like a game, the entire screen goes black until I manually turn the monitor off and on again.

It's lame that my only option would be to use the DP port and have to deal with that bug. I had always intended to use DP with this monitor, but the bug became so annoying... I tried dual-link DVI, it worked absolutely perfectly, and I never looked back.

I like HDMI, but it really needs to get to the point where it can do at least 1080p@120Hz and 4K@60Hz. It seems silly for them to have included an HDMI port on this monitor at all, since it can only do 1080p@60Hz. Hell, I would rather have had a VGA port...
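
(Rough numbers for context, as a back-of-the-envelope sketch only: this counts active pixels and ignores blanking overhead, so real modes need a bit more. The nominal maximum TMDS clocks are about 165 MHz for single-link DVI and early HDMI, 330 MHz for dual-link DVI, 340 MHz for HDMI 1.4, and 600 MHz for HDMI 2.0.)

# Back-of-the-envelope pixel-clock check. Counts active pixels only, so a
# real video mode needs roughly 5-20% more for blanking; treat these as
# lower bounds. Limits are the nominal maximum TMDS clocks per link type.
limits_mhz = {
    "single-link DVI / early HDMI": 165,
    "dual-link DVI": 330,
    "HDMI 1.4": 340,
    "HDMI 2.0": 600,
}
modes = {
    "1920x1080 @ 120 Hz": 1920 * 1080 * 120,
    "3840x2160 @ 60 Hz": 3840 * 2160 * 60,
}
for mode, pixels_per_second in modes.items():
    mhz = pixels_per_second / 1e6
    fits = [link for link, limit in limits_mhz.items() if mhz <= limit]
    print(f"{mode}: needs ~{mhz:.0f}+ MHz -> {', '.join(fits)}")

(Which lines up with the complaint: 4K@60Hz, at roughly 498 MHz plus blanking, genuinely needs HDMI 2.0, while 1080p@120Hz, at roughly 249 MHz plus blanking, fits within HDMI 1.4's link budget -- so the 60Hz cap above looks more like a limitation of that particular monitor's HDMI input than of the newer HDMI specs.)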
 
I should hold out on upgrading until a decent 4K 120Hz FreeSync monitor is released. That'll be a decent combo with the Fury X.
 
Apparently there is an adapter that will do ~90-110Hz or so. It's on Amazon; I'll find the link later.
But it's still $90...
 
I will never upgrade from my Korean 100Hz 1440p PLS with zero lag that I paid $240 for -- not until I can get an OLED screen at 120Hz or higher.
I refuse to support the display industry and its constant rehashing of LCDs.
I don't have a problem with paying a premium price -- but give me a premium panel, not 15-year-old tech.
 
DVI was introduced in 1999. That's old tech last I checked, along with....

1999
January 4
Intel releases 366 and 400 MHz Celeron processors, priced at US$123 and US$158 each, respectively, in 1000-unit quantities. [1233.131] [1559]

x86 32-bit was introduced in 1985. That's old tech last I checked, along with...

1985
Commodore 128
Amiga A1000
Atari 130XE, 130ST, 260ST, 520ST, 65XE, 65XEM, and 65XEP computers
Nintendo Entertainment System (NES)
CAT1 wiring
ATI founded
Coca Cola releases New Coke
 
Damn... there is nothing wrong with my 30" Dell monitor @ 1600p, but it looks like I'll have to move on if I want to continue upgrading my video cards.

Anyone running those new curved monitors? I would prefer that over Eyefinity.
 
I am cheap, so I am unlikely to buy these cards until they are outdated. But depending on what monitors I am running then, this could be bad.
My take is that many cards get sold to help pay for upgrades.
Less compatibility with old tech means a less desirable card.

The question is: are there more DisplayPort monitor people or "legacy" people?

As far as how good, capable, and available DisplayPort to dual-link DVI adapters are, I have no clue.

The chance of a manufacturer including a high-value adapter seems unlikely.

Hopefully another manufacturer will include dual-link DVI, if it can be done.
 
It's pretty much a guarantee they will, most people still run DVI. That being said, they can't add an HDMI 2.0 port (I don't think)

No, you can't; the ONLY thing AMD can do is add an adapter.

Other than that... yeah, like I said and will repeat: it is a head-scratcher why AMD left out HDMI 2.0.
 
AMD Fury has been in the works for a while. Maybe the design is just too old for them to have put HDMI 2.0 in the architecture. Fury has been "pushed back" so many times. :/
 
Damn... there is nothing wrong with my 30" Dell monitor @ 1600p, but it looks like I'll have to move on if I want to continue upgrading my video cards.

Anyone running those new curved monitors? I would prefer that over Eyefinity.

Nvidia still supports DVI.
 
Where do we have any confirmation that AMD cannot add HDMI 2.0, or that AIBs cannot?
I don't really care, honestly, seeing as adapters are not a big deal, but I find it odd that we are saying they cannot do it without any real official info on the subject.
 
The regular Fury boards might have it... but you still have to wait 3 weeks, and you get the lower-binned chips. Not really the best situation.
 
Where do we have any confirmation that AMD cannot add HDMI 2.0, or that AIBs cannot?
I don't really care, honestly, seeing as adapters are not a big deal, but I find it odd that we are saying they cannot do it without any real official info on the subject.

Someone said that HDCP 2.2 is hardware-related, but I think it was implied that HDMI 2.0 is not, so 2.0 is still a possibility.
 
Where do we have any confirmation that AMD cannot add HDMI 2.0, or that AIBs cannot?
I don't really care, honestly, seeing as adapters are not a big deal, but I find it odd that we are saying they cannot do it without any real official info on the subject.

From the Samsung display thread:
http://www.twitch.tv/thetechreport/b/670328467
Jump to 36:00...
"It is the case that the display block on this card hasn't changed substantially from the Tonga. As a result, HDMI 2.0 is not supported by this card."[/QUOTE]
 