Why would AMD release a 4K video card without HDMI 2.0 and with 4 GB of VRAM

You know the FX is in good shape when people can only complain about outputs. Well played, AMD. Although I have to wonder, if the FX did have HDMI 2.0 and DVI, what would people whine about instead?
 
You know the FX is in good shape when people can only complain about outputs. Well played, AMD. Although I have to wonder, if the FX did have HDMI 2.0 and DVI, what would people whine about instead?

- Only 4GB vram
- OC potential
- AIO
- Aesthetics (I've seen people say elsewhere they won't buy the card based on the looks alone)

Failing all that, the perennial talking point - drivers.
 
- Only 4GB vram
- OC potential
- AIO
- Aesthetics (I've seen people say elsewhere they won't buy the card based on the looks alone)

Failing all that, the perennial talking point - drivers.

You forgot power usage and no DX12.1 support! lol
 
People looking to compare cards might not go to the AMD subforum. Not to mention they might not want the overly biased point of view presented there. At least here you can get a more balanced and realistic point of view.

I found that, with the issues that were discussed in the AMD forum, people were informed and able to make a decision based on what their needs were.
On the other hand, just looking at the title of this thread shows quite a bit of bias. I mean, what was AMD thinking releasing a 4 GB card with no HDMI 2.0? No one should buy this card on principle alone; AMD should just file for bankruptcy right now, so I can enjoy playing at frame rates that my SLI configuration can't even produce (60 Hz smooth at 4K max settings). (Don't worry, I'll make a YouTube video to show you.) /sarcasm
 
You forgot power usage and no DX12.1 support! lol

Aww crap you're right, might as well throw in heat output + noise in that case then.

So to sum it up:

- Only 4GB vram
- No DVI
- No HDMI 2.0
- OC potential
- AIO
- Power consumption
- Heat output
- Noise
- Aesthetics (I've seen people say elsewhere they won't buy the card based on the looks alone)

Failing all that, the perennial talking point - drivers.

No wonder Fury is DOA, look at that laundry list of issues! :eek:
 
So get an adapter.

They don't exist.

No need, review sites have beaten you to it already: http://www.maximumpc.com/nvidia-gtx-980-ti-2-way-sli-crushing-performance/
Looks like that smooth 60 Hz 4K gameplay on 980 Tis in SLI is exclusive to people with super 4K TVs. :rolleyes:

I saw >60 FPS average framerates for 980 Ti 2-Way SLI in that table for every game except GTA V and The Witcher 3, and that's even WITH these guys stacking on retarded overkill settings (4x MSAA ON TOP OF running at 4K resolution? Are you serious? That's only something you do when you're playing an older title with tons of open GPU headroom; you don't try and run The Witcher 3 at 4K with 4x MSAA. You don't even really need AA on top of running at 4K, it makes much less of a difference than putting AA on top of 1080p.)

Games were tested at maximum quality, with all non-proprietary features (e.g., not PhysX and TXAA) enabled. 4xMSAA is used in all benchmarks that support it (everything but Metro: Last Light, Shadow of Mordor, and Tomb Raider); SSAA was not enabled. Yes, you could turn off 4xAA on a 4K display and probably not miss it much, but we left the settings alone for consistency.

I get 60 FPS at 4K resolution in The Witcher 3 without doing retarded things like stacking 4x MSAA on top of it.

Are you running them in SLI? If not, then feel free to show me a single 980 Ti locked at 60 FPS @ 3840 x 2160, which is the standard 4K TV resolution. You said stable 60, so it shouldn't dip below that, and please don't record a video with every feature turned off.

I play all my games with every feature on its max setting except:

-Depth of Field (looks like shit)
-Motion blur (looks like shit)
-Chromatic aberration (looks like shit)
-SSAA or MSAA stacked on top of running at 4K, unless the game is older and has extra headroom for overkilling (I do ridiculous things in the LEGO games, for example, like running at 4K with 8x SGSSAA stacked on top of that). Sleeping Dogs Definitive Edition actually performs so well on 980 Tis I can keep it at 60 FPS running at 4K with its first SSAA option enabled on top of that too. Call of Duty: Advanced Warfare can run at 4K with 2x SSAA before going below 60 FPS.

Aww crap you're right, might as well throw in heat output + noise in that case then.

So to sum it up:

- Only 4GB vram
- No DVI
- No HDMI 2.0
- OC potential
- AIO
- Power consumption
- Heat output
- Noise
- Aesthetics (I've seen people say elsewhere they won't buy the card based on the looks alone)

Failing all that, the perennial talking point - drivers.

No wonder Fury is DOA, look at that laundry list of issues! :eek:

Can you come up with a list of issues this extensive with the GTX 980 Ti cards?
 
I saw >60 FPS average framerates for 980 Ti 2-Way SLI in that table for every game except GTA V and The Witcher 3, and that's even WITH these guys stacking on retarded overkill settings (4x MSAA ON TOP OF running at 4K resolution? Are you serious? That's only something you do when you're playing an older title with tons of open GPU headroom; you don't try and run The Witcher 3 at 4K with 4x MSAA. You don't even really need AA on top of running at 4K, it makes much less of a difference than putting AA on top of 1080p.)



I get 60 FPS at 4K resolution in The Witcher 3 without doing retarded things like stacking 4x MSAA on top of it.



I play all my games with every feature on its max setting except:

-Depth of Field (looks like shit)
-Motion blur (looks like shit)
-Chromatic aberration (looks like shit)
-SSAA or MSAA stacked on top of running at 4K, unless the game is older and has extra headroom for overkilling (I do ridiculous things in the LEGO games, for example, like running at 4K with 8x SGSSAA stacked on top of that). Sleeping Dogs Definitive Edition actually performs so well on 980 Tis I can keep it at 60 FPS running at 4K with its first SSAA option enabled on top of that too.

Come on man, you are talking like you own an AMD card now. With Nvidia I don't settle for anything less than the max settings; I need all the MSAA I can get and nothing less, but only on a 4K TV with heavy input lag, because monitors with 1 ms input lag are for ants. Hell, I'll go fire up Fallout 3 to get that smooth 60 Hz 4K gameplay right now :rolleyes:
 
What is wrong with this company?

Being a little faster than a 980 Ti doesn't mean jack squat when you are restricted to 30 FPS when you hook it up to your 4K TV because it only has HDMI 1.4 on it. All Maxwell cards can drive 4K @ 60 FPS thanks to having HDMI 2.0.

Being a little faster than a 980 Ti also doesn't matter when there's only 4 GB of VRAM. Fury X will constantly be hitting that 4 GB barrier, and every time it does there will be huge stuttering as the GPU has to swap in new textures. This was the biggest problem that plagued the GTX 980, one that the Titan X / 980 Ti corrected, and AMD is repeating the issue. The minute I swapped my 980s for 980 Tis is when the stuttering in all the games I played at 4K stopped, because the 980 Tis finally had enough VRAM for 4K. It wasn't a power issue that made 4K gaming on SLI 980s a troubling experience; it was the 4 GB of VRAM that modern games running at 4K were constantly hitting. The speed of the VRAM is not what is critical for preventing the stuttering at 4K; it is solely about the amount of VRAM available. So AMD's HBM technology is useless as long as it is restricted to 4 GB.

As an Nvidia owner this makes me mad, because I'm tired of Nvidia continuing to have a de facto monopoly due to AMD's constant incompetence at presenting a real alternative. Nvidia correctly recognized the necessity of HDMI 2.0 and more VRAM for smooth 4K gaming when they designed Maxwell. AMD might be able to put out cards that appear to run faster than Nvidia cards, but what they fail to realize is that raw speed is not the deal breaker; it's the critical features Nvidia GPUs offer that convince gamers to pay more for Nvidia's slightly slower competing cards.
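To put rough numbers on the 4 GB point above, here is a minimal sketch using a made-up but plausible set of render targets, not any real game's memory layout. It only shows how the resolution-dependent buffers scale from 1080p to 4K and with MSAA; textures, shadow maps and geometry, usually the bigger VRAM consumers, come on top of this.

[code]
# Illustrative only: a hypothetical deferred-rendering buffer set, NOT any
# real game's layout. Shows how the resolution-dependent buffers alone scale
# with display resolution and MSAA.

def buffers_mb(width, height, msaa=1):
    bytes_per_pixel = [
        4, 4, 4, 4,  # four 32-bit G-buffer targets (albedo, normals, etc.)
        4,           # 32-bit depth/stencil
        8,           # 16-bit-per-channel HDR colour target
    ]
    # MSAA multiplier applied to everything here for simplicity.
    return sum(width * height * bpp * msaa for bpp in bytes_per_pixel) / 1024 ** 2

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    for msaa in (1, 4):
        print(f"{name}, {msaa}x MSAA: ~{buffers_mb(w, h, msaa):.0f} MB of buffers")
[/code]

Even this toy layout goes from roughly 55 MB at 1080p to about 220 MB at 4K, and close to 900 MB at 4K with 4x MSAA, before any assets are loaded, which is why high resolutions plus MSAA eat into a 4 GB budget so quickly.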

Ok ok...this is like the 10th thread we created about this topic and we are only getting better.

I'm sure AMD is going to release rev 1.1 with HDMI 2.0 or whatever. But if it really bothers you, go buy a GTX 980 Ti and shush it.
 
He already has two 980 Tis, so this thread is just trolling and AMD bashing. Must make prime proud :D
 
Can you come up with a list of issues this extensive with the GTX 980 Ti cards?

Where is my VGA port?
Why is the card running so hot? Was AMD taking up all the hot-card market share?
Wow, that power usage is through the roof; I would hate to see the power draw on an overclocked 980 Ti.
Why is the card not water cooled out of the box?
Why does the card sound like a leaf blower?
Why wasn't it released before the Titan X?

Somewhat; let me know if I missed anything.
I wonder why people were not parading this information around when the 980 Ti came out?
 
I saw >60 FPS average framerates for 980 Ti 2-Way SLI in that table for every game except GTA V and The Witcher 3,


The only game in that list that can maintain 60 FPS is Tomb Raider; that's it. As I said before, you need SLI or a Titan to do it, and that list actually even undercuts SLI somewhat for 4K gaming. Can you? Sure. But you are definitely turning off quite a bit to do it. Personally, I'd much rather have the lower resolution with every feature turned on. But that's just me.


Can you come up with a list of issues this extensive with the GTX 980 Ti cards?

Probably not. However, probably the biggest tell in all of this is how you already know all of that applies to a card which hasn't been released yet.
 
Why would anyone be upset about this? If you don't want one, don't buy a Fury X. It's not like something better won't come out in a few months anyway. AMD's new refresh is just a stopgap that only exists because they (and Nvidia) are skipping a process node.
 
Not a huge deal in my view, as there are very few PC monitors supporting HDMI 2.0.

Still, this is an odd decision from AMD, as NVIDIA has had cards featuring HDMI 2.0 since September.
Seems like a (desperate?) cost-saving measure.
 
Why would anyone be upset about this? If you don't want one, don't buy a Fury X. It's not like something better won't come out in a few months anyway. AMD's new refresh is just a stopgap that only exists because they (and Nvidia) are skipping a process node.


/thread
 
Not a huge deal in my view, as there are very few PC monitors supporting HDMI 2.0.

Still, this is an odd decision from AMD, as NVIDIA has had cards featuring HDMI 2.0 since September.
Seems like a (desperate?) cost-saving measure.

It was most likely a mix-up of information and the card does in fact have 2.0, but if it does not, I find it hard to believe, with the introduction of HBM, that it is a cost-saving measure. More likely an oversight by AMD.
 
Well, I am currently running two R9 290X cards and bought a 4K TV with HDMI 2.0, only to find out that my 290X cards would only run a max refresh rate of 30 Hz (= 30 FPS) at 3840 x 2160. While I live with it, and games like Shadow of Mordor look damn good, I have no incentive to update my cards now, so I guess I will go on playing games at 30 FPS.

Thanks AMD for helping me save my money :)
 
Not a huge deal in my view, as there are very few PC monitors supporting HDMI 2.0.

Still, this is an odd decision from AMD, as NVIDIA has had cards featuring HDMI 2.0 since September.
Seems like a (desperate?) cost-saving measure.

Exactly. It is a head scratcher that AMD did not go HDMI 2.0, but it isn't as huge a deal as some people are making it out to be.

The 4K 60 Hz TV PC gaming crowd is like 0.000001% of the minority who want to game like that.
 
I looked for adapters and found no such adapter that magically makes my refresh rate go up at 4K. I even thought that I would need to get a faster HDMI cable, then found out 1.4a cables would still output a 60 Hz refresh rate on Nvidia cards.
 
NVIDIA can do 4K60 over HDMI 1.4, but only at 4:2:0 color, which isn't very good quality.
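Some back-of-the-envelope math on why that is, using commonly quoted spec-sheet figures rather than anything official: a full 8-bit 4:4:4 4K60 stream needs more data rate than HDMI 1.4 can carry, while 4:2:0 halves the chroma data and squeezes under the limit; HDMI 2.0 has the headroom for full colour.

[code]
# Sketch with spec-sheet numbers (not the HDMI spec itself): why 4K60 needs
# HDMI 2.0 for full 4:4:4 colour but fits over HDMI 1.4 with 4:2:0 subsampling.

PIXEL_CLOCK_4K60 = 594e6           # Hz, 3840x2160@60 CTA timing incl. blanking
BPP = {"4:4:4": 24, "4:2:0": 12}   # average bits per pixel at 8-bit depth

LINKS = {
    "HDMI 1.4": 8.16e9,   # bit/s payload (10.2 Gb/s TMDS minus 8b/10b overhead)
    "HDMI 2.0": 14.4e9,   # bit/s payload (18 Gb/s TMDS minus 8b/10b overhead)
}

for fmt, bpp in BPP.items():
    needed = PIXEL_CLOCK_4K60 * bpp
    verdicts = ", ".join(
        f"{link}: {'fits' if needed <= rate else 'too slow'}"
        for link, rate in LINKS.items()
    )
    print(f"4K60 {fmt} needs {needed / 1e9:.2f} Gb/s -> {verdicts}")
[/code]

4:4:4 works out to roughly 14.3 Gb/s, over HDMI 1.4's budget but inside HDMI 2.0's; 4:2:0 drops to about 7.1 Gb/s, which is why 1.4 can manage it at reduced colour resolution.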
 
I think it's a pretty big deal, considering Windows 10 is right around the corner, bringing console games like Gears of War Ultimate and Killer Instinct crossplay. More and more people are building PCs (Steam boxes); AMD showed a small form factor a little bigger than the Xbox One, so of course people will want to hook up their bigger displays! An HDMI 2.0 compliant 4K TV is affordable. Surely AMD is smarter than this?

I prefer playing The Witcher 3 @ 1080p 60 FPS vs. only 900p or 1080p on the X1 or PS4.
 
I think it's a pretty big deal, considering Windows 10 is right around the corner, bringing console games like Gears of War Ultimate and Killer Instinct crossplay. More and more people are building PCs (Steam boxes); AMD showed a small form factor a little bigger than the Xbox One, so of course people will want to hook up their bigger displays! An HDMI 2.0 compliant 4K TV is affordable. Surely AMD is smarter than this?

I prefer playing The Witcher 3 @ 1080p 60 FPS vs. only 900p or 1080p on the X1 or PS4.

I will be honest. I will be shocked if Gears of War and KI both work at 4K without any form of downsampling.

I honestly have no idea if they will or won't, but I just don't have faith.

Now I AM looking forward to KI on the PC!!!!!! DSR here I come!
 
Aww crap you're right, might as well throw in heat output + noise in that case then.

So to sum it up:

- Only 4GB vram
- No DVI
- No HDMI 2.0
- OC potential
- AIO
- Power consumption
- Heat output
- Noise
- Aesthetics (I've seen people say elsewhere they won't buy the card based on the looks alone)

Failing all that, the perennial talking point - drivers.

No wonder Fury is DOA, look at that laundry list of issues! :eek:

Can you come up with a list of issues this extensive with the GTX 980 Ti cards?

Well for starters, we can cross power consumption, heat output, and noise right off the list. The 980 Ti is only 20 W less power hungry than a 290X, runs right into its 84°C thermal limit on the default fan curve, and even then is fairly loud.

Aesthetics is 100% subjective, and an AIO is seen as a plus by some. So neither of these is necessarily an "issue", and they may even be positives for certain people.

So that leaves us with 4GB VRAM, no DVI + HDMI 2.0, and OC potential. OC potential remains to be seen, and Joe Macri, AMD's CTO, made a comment about how 4GB of HBM won't be an issue. Yes, it could be 100% BS and just PR spin, but again, that remains to be seen.

And then someone said AMDMatt is going to recheck to see if he misspoke about not having HDMI 2.0. So really the only truly confirmed "issue" at this point is no DVI. Not exactly what I'd call extensive.
 
Tucked away in a corner of the VESA consortium booth dedicated to DisplayPort, we could see a prototype adapter that converts a DisplayPort 1.2 signal to HDMI 2.0. Offered by BizLink, who was also first with a DisplayPort hub, it is an active adapter cable but does not need external power.
To prove it was functional, a demonstration connected a 4K 60 Hz HDMI 2.0 TV to a Mac Pro whose FirePros cannot natively drive that connection. It was indeed a 24-bit-per-pixel format, with no reduction in quality, but probably without HDCP 2.2. We do not know whether it will go on sale, or when that might happen, but technically the conversion seems to be possible.
http://www.hardware.fr/news/09-01-2015/
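A quick sanity check on why such an active converter is plausible, using spec-sheet figures rather than anything from the article: DisplayPort 1.2 at HBR2 carries more payload than an 8-bit 4:4:4 4K60 stream needs, so the adapter only has to repacketize the signal for HDMI 2.0, not throw away colour information.

[code]
# Spec-sheet arithmetic only (an assumption-laden sketch, not the adapter's
# actual design): does DisplayPort 1.2 carry enough data for an adapter to
# output 4K60 at full 8-bit 4:4:4 over HDMI 2.0?

DP12_LANES = 4
DP12_LANE_RATE = 5.4e9                               # bit/s per lane at HBR2
DP12_PAYLOAD = DP12_LANES * DP12_LANE_RATE * 8 / 10  # after 8b/10b coding -> 17.28 Gb/s

PIXEL_CLOCK_4K60 = 594e6                             # Hz, incl. blanking
STREAM_4K60_444 = PIXEL_CLOCK_4K60 * 24              # 8-bit 4:4:4 -> ~14.26 Gb/s

print(f"DP 1.2 payload:   {DP12_PAYLOAD / 1e9:.2f} Gb/s")
print(f"4K60 4:4:4 needs: {STREAM_4K60_444 / 1e9:.2f} Gb/s")
print(f"Headroom:         {(DP12_PAYLOAD - STREAM_4K60_444) / 1e9:.2f} Gb/s")
[/code]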
 
Well for starters, we can cross power consumption, heat output, and noise right off the list. The 980 Ti is only 20 W less power hungry than a 290X, runs right into its 84°C thermal limit on the default fan curve, and even then is fairly loud.

Aesthetics is 100% subjective, and an AIO is seen as a plus by some. So neither of these is necessarily an "issue", and they may even be positives for certain people.

So that leaves us with 4GB VRAM, no DVI + HDMI 2.0, and OC potential. OC potential remains to be seen, and Joe Macri, AMD's CTO, made a comment about how 4GB of HBM won't be an issue. Yes, it could be 100% BS and just PR spin, but again, that remains to be seen.

And then someone said AMDMatt is going to recheck to see if he misspoke about not having HDMI 2.0. So really the only truly confirmed "issue" at this point is no DVI. Not exactly what I'd call extensive.

AMDMatt should know if the cards are capable of HDMI 2.0 - he would have already tweeted a correction by now. He sees and talks to the lead engineer on a daily basis. I highly doubt he would screw something of this magnitude up; HDMI 2.0 was one of Nvidia's selling points when the 980 was released. I was just holding out to see what AMD's offering was, and if it is not HDMI 2.0 compliant, well, they are just not with the times. They had Tomb Raider and other games running on those big displays in the background; were those monitors or TVs?
 
Exactly. It is a head scratcher that AMD did not go HDMI 2.0, but it isn't as huge a deal as some people are making it out to be.

The 4K 60 Hz TV PC gaming crowd is like 0.000001% of the minority who want to game like that.

I think part of it is that their major competitors have had this feature for 9 months (!!!!!), yet AMD haven't seen fit to update theirs! Pretty poor show tbh, but like you say it's not a deal-breaker for most
 
[jF] said:
I think part of it is that their major competitors have had this feature for 9 months (!!!!!), yet AMD haven't seen fit to update theirs! Pretty poor show tbh, but like you say it's not a deal-breaker for most

And in 9 months MAYBE 0.01% of the population run 4K 60 Hz on a TV.
 
The next card needs to have HDMI 2.0a + HDCP 2.2 for future-proofing - otherwise, 4K Blu-ray content ain't gonna play.
 
"Why would AMD release a 4K video card without HDMI 2.0 and with 4 GB of VRAM"

He only forgot "and market it for the living room". Because AMD is a typical corporation where everyone is there to collect a paycheck and is completely disconnected from reality? That's always been the feeling I got from AMD. Reminds me of my shitty company, where management is totally disconnected and there are absolutely no incentives.

I use a 34" curved monitor. I much prefer this over any 4K TV. I sit back a few feet and recline for games like The Witcher 3. Then for FPS games I can sit at a normal distance.
 
I blame her:
[image: lisa_su.jpg]
 
Well, I am currently running two R9 290X cards and bought a 4K TV with HDMI 2.0, only to find out that my 290X cards would only run a max refresh rate of 30 Hz (= 30 FPS) at 3840 x 2160. While I live with it, and games like Shadow of Mordor look damn good, I have no incentive to update my cards now, so I guess I will go on playing games at 30 FPS.

Thanks AMD for helping me save my money :)

Why don't you sell those two 290Xs and buy a 980 Ti?

30 Hz is absolutely horrible. I am in a similar predicament, waiting for a 980 to arrive, and I would currently rather use 1080p @ 60 Hz than 4K @ 30 Hz.
 
And in 9 months MAYBE 0.01% of the population run 4K 60 Hz on a TV.

Thankfully Nvidia caters to the 0.01%, or the 0.000001%, or any other number you're pulling out of your ass.
Good companies stay ahead of the curve; the bad ones are always trying to catch up.

Also, if I need an adapter with a $650 card, it's not worth the money.
Clearly, I would have the wrong tool for the job.
 