You know the FX is in good shape when people can only complain about outputs. Well played, AMD. Although I have to wonder, if the FX did have HDMI 2.0 and DVI, what would people whine about instead?
Simple logic is to go to the AMD subforum, where all this information is.
Just like the 3.5+0.5 GB VRAM issue with the GTX 970.
Afaik he can't go to the AMD forum.
You know the FX is in good shape when people can only complain about outputs. Well played, AMD. Although I have to wonder, if the FX did have HDMI 2.0 and DVI, what would people whine about instead?
- Only 4GB vram
- OC potential
- AIO
- Aesthetics (I've seen people say elsewhere they won't buy the card based on the looks alone)
Failing all that, the perennial talking point - drivers.
People looking to compare cards might not go to the AMD subforum. Not to mention they might not want the overly biased point of view presented there. At least here you can get a more balanced and realistic point of view.
You forgot power usage and not being DX12.1! lol
So get an adapter.
No need, review sites have beaten you to it already: http://www.maximumpc.com/nvidia-gtx-980-ti-2-way-sli-crushing-performance/
Looks like that smooth 60 Hz 4K gameplay on 980 Tis in SLI is exclusive to people with super 4K TVs.
Games were tested at maximum quality, with all non-proprietary features (e.g., not PhysX and TXAA) enabled. 4xMSAA is used in all benchmarks that support it (everything but Metro: Last Light, Shadow of Mordor, and Tomb Raider); SSAA was not enabled. Yes, you could turn off 4xAA on a 4K display and probably not miss it much, but we left the settings alone for consistency.
Are you running them in SLI? If not, then feel free to show me a single 980 Ti locked at 60 FPS @ 3840 x 2160, which is the standard 4K TV resolution. You said stable 60, so it shouldn't dip below that, and please don't record a video with every feature turned off.
Aww crap you're right, might as well throw in heat output + noise in that case then.
So to sum it up:
- Only 4GB vram
- No DVI
- No HDMI 2.0
- OC potential
- AIO
- Power consumption
- Heat output
- Noise
- Aesthetics (I've seen people say elsewhere they won't buy the card based on the looks alone)
Failing all that, the perennial talking point - drivers.
No wonder Fury is DOA, look at that laundry list of issues!
I saw >60 FPS average framerates for 980 Ti 2-Way SLI in that table for every game except GTA V and The Witcher 3, and that's even WITH these guys stacking on retarded overkill settings (4x MSAA ON TOP OF running at 4K resolution? Are you serious? That's only something you do when you're playing an older title with tons of open GPU headroom; you don't try and run The Witcher 3 at 4K with 4x MSAA. You don't even really need AA on top of running at 4K, it makes much less of a difference than putting AA on top of 1080p.)
I get 60 FPS at 4K resolution in The Witcher 3 without doing retarded things like stacking 4x MSAA on top of it.
I play all my games with every feature on its max setting except:
-Depth of Field (looks like shit)
-Motion blur (looks like shit)
-Chromatic aberration (looks like shit)
-SSAA or MSAA stacked on top of running at 4K, unless the game is older and has extra headroom for overkill (I do ridiculous things in the LEGO games, for example, like running at 4K with 8x SGSSAA stacked on top of that). Sleeping Dogs Definitive Edition actually performs so well on 980 Ti's I can keep it at 60 FPS running at 4K with its first SSAA option enabled on top of that too.
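To put rough numbers on the MSAA-at-4K point above, here's a quick sample-count sketch (pure arithmetic; it counts coverage samples only and ignores shading and bandwidth costs):

```python
# Why 4x MSAA on top of 4K is such a heavy ask: raw sample counts.
def samples(width, height, msaa=1):
    """Rasterized coverage samples per frame."""
    return width * height * msaa

base_1080p = samples(1920, 1080)     # ~2.07M pixels
base_4k    = samples(3840, 2160)     # ~8.29M pixels
msaa4_4k   = samples(3840, 2160, 4)  # ~33.2M coverage samples

print(f"4K alone is {base_4k / base_1080p:.0f}x the pixels of 1080p")
print(f"4K + 4x MSAA rasterizes ~{msaa4_4k / 1e6:.1f}M samples per frame")
```

Which is also why AA matters less at 4K: on the same screen, each pixel is already a quarter the size it would be at 1080p.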
What is wrong with this company?
Being a little faster than a 980 Ti doesn't mean jack squat when you are capped at 30 Hz the moment you hook it up to your 4K TV, because it only has HDMI 1.4 on it. All Maxwell cards can drive 4K @ 60 Hz thanks to having HDMI 2.0.
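For anyone wondering where that 30 Hz cap comes from, here's a rough sketch of the link math (the TMDS clock limits are the commonly quoted spec figures; the 4400 x 2250 total raster is the standard 4K timing including blanking):

```python
# Why HDMI 1.4 tops out at 4K @ 30 Hz: pixel clock vs. TMDS clock limits.
def pixel_clock_mhz(total_w=4400, total_h=2250, hz=60):
    """Pixel clock in MHz for a given total raster (incl. blanking)."""
    return total_w * total_h * hz / 1e6

HDMI_1_4_MAX_MHZ = 340  # max TMDS clock, HDMI 1.4 (10.2 Gbit/s link)
HDMI_2_0_MAX_MHZ = 600  # max TMDS clock, HDMI 2.0 (18.0 Gbit/s link)

for hz in (30, 60):
    clk = pixel_clock_mhz(hz=hz)
    print(f"4K @ {hz} Hz needs ~{clk:.0f} MHz: "
          f"HDMI 1.4 {'OK' if clk <= HDMI_1_4_MAX_MHZ else 'no'}, "
          f"HDMI 2.0 {'OK' if clk <= HDMI_2_0_MAX_MHZ else 'no'}")
```

4K30 needs ~297 MHz, which fits within HDMI 1.4; 4K60 needs ~594 MHz, which only HDMI 2.0 can carry.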
Being a little faster than a 980 Ti also doesn't matter when there's only 4 GB of VRAM. Fury X will constantly be hitting that 4 GB barrier, and every time it does there will be huge stuttering as the GPU has to swap in new textures. This was the biggest problem that plagued the GTX 980 that the Titan X / 980 Ti corrected, and AMD is repeating the issue. The minute I swapped my 980's for 980 Ti's is when the stuttering in all the games I played at 4K stopped, because the 980 Ti's finally had enough VRAM for 4K. It wasn't a power issue that made 4K gaming on SLI 980's a troubling experience; it was the 4 GB of VRAM that modern games running at 4K were constantly hitting. The speed of the VRAM is not what is critical for preventing the stuttering at 4K; it is solely about the amount of VRAM available. So AMD's HBM technology is useless as long as it is restricted to 4 GB.
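For a rough sense of where the 4 GB goes at 4K, a sketch (every buffer format and count below is an illustrative assumption, not a measurement from any specific game):

```python
# Rough sketch of 4K VRAM pressure. Buffer formats/counts are assumptions.
W, H = 3840, 2160
MB = 1024 ** 2

def buffer_mb(bytes_per_pixel, samples=1):
    """Size of one full-screen render target in MB."""
    return W * H * bytes_per_pixel * samples / MB

back_buffer = buffer_mb(4)      # ~32 MB  (RGBA8)
depth       = buffer_mb(4)      # ~32 MB  (D24S8)
gbuffer     = 4 * buffer_mb(8)  # ~253 MB (four 16-bit RGBA targets, assumed)

print(f"Render targets alone: ~{back_buffer + depth + gbuffer:.0f} MB")
# Even a heavyweight set of render targets is only a few hundred MB;
# the rest of the 4 GB is textures and geometry, which is why texture
# streaming hits the wall (and stutters) first at 4K.
```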
As an Nvidia owner this makes me mad, because I'm tired of Nvidia continuing to have a de facto monopoly due to AMD's constant incompetence at presenting a real alternative. Nvidia correctly recognized the necessity of HDMI 2.0 and more VRAM for smooth 4K gaming when they designed Maxwell. AMD might be able to put out cards that appear to run faster than Nvidia's, but what they fail to realize is that raw speed is not the deal breaker; it's the critical features Nvidia GPUs offer that convince gamers to pay more for slightly slower Nvidia cards over AMD's.
Can you come up with a list of issues this extensive with the GTX 980 Ti cards?
I saw >60 FPS average framerates for 980 Ti 2-Way SLI in that table for every game except GTA V and The Witcher 3,
Can you come up with a list of issues this extensive with the GTX 980 Ti cards?
Why would anyone be upset about this? If you don't want one, don't buy a Fury X. It's not like something better won't come out in a few months anyway. AMD's new refresh is just a stopgap that only exists because they (and Nvidia) are skipping a process node.
Not a huge deal in my view as there are very few PC monitors supporting HDMI 2.0.
Still, this is an odd decision from AMD, as NVIDIA has had cards featuring HDMI 2.0 since September.
Seems like a (desperate?) cost saving measure.
I think it's a pretty big deal considering Windows 10 is right around the corner, bringing console games like Gears of War Ultimate and Killer Instinct crossplay. More and more people are building PCs (Steam boxes); AMD showed a small form factor a little bigger than the Xbox One, so of course people will want to hook up their bigger displays! An HDMI 2.0 compliant 4K TV is affordable. Surely AMD is smarter than this?
I prefer playing The Witcher 3 @ 1080p 60 fps vs only 900p or 1080p on the X1 or PS4.
Aww crap you're right, might as well throw in heat output + noise in that case then.
So to sum it up:
- Only 4GB vram
- No DVI
- No HDMI 2.0
- OC potential
- AIO
- Power consumption
- Heat output
- Noise
- Aesthetics (I've seen people say elsewhere they won't buy the card based on the looks alone)
Failing all that, the perennial talking point - drivers.
No wonder Fury is DOA, look at that laundry list of issues!
Can you come up with a list of issues this extensive with the GTX 980 Ti cards?
Tucked away in a corner of the VESA consortium booth dedicated to DisplayPort, we could see a prototype adapter that converts a DisplayPort 1.2 signal to HDMI 2.0. Offered by BizLink, who was also first with a DisplayPort hub, this is an active adapter cable, but it does not need external power.
http://www.hardware.fr/news/09-01-2015/
To prove that it was functional, a demonstration connected a 4K 60 Hz TV over HDMI 2.0 to a Mac Pro equipped with FirePros that don't natively support that connection. It was indeed a 24-bit-per-pixel format, with no reduction in quality, but probably without HDCP 2.2. We do not know if it will go on sale, or when, but technically this conversion seems to be possible.
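A quick bandwidth sanity check on why such a conversion is plausible (spec figures are the commonly quoted ones; this is a sketch, not a claim about the BizLink product itself):

```python
# DisplayPort 1.2 -> HDMI 2.0 adapter: does the bandwidth even fit?
DP12_PAYLOAD_GBPS = 4 * 5.4 * 0.8  # 4 lanes x HBR2, after 8b/10b = 17.28
FOURK60_GBPS = 594e6 * 24 / 1e9    # 594 MHz pixel clock x 24 bpp ~= 14.26

fits = FOURK60_GBPS <= DP12_PAYLOAD_GBPS
print(f"4K60 @ 24 bpp needs ~{FOURK60_GBPS:.2f} Gbit/s; "
      f"DP 1.2 carries {DP12_PAYLOAD_GBPS:.2f} Gbit/s -> {'fits' if fits else 'no'}")
# The hard part is protocol conversion (DP micro-packets -> HDMI TMDS),
# which is why the adapter has to be active even without external power.
```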
Well, for starters, we can cross power consumption, heat output, and noise right off the list. The 980 Ti is only 20W less power hungry than a 290X, runs right into its 84C thermal limit on the default fan curve, and even then is fairly loud.
Aesthetics is 100% subjective, and an AIO is seen as a plus by some. So both of these aren't necessarily "issues" and may even be positives for certain people.
So that leaves us with 4GB VRAM, no DVI + HDMI 2.0, and OC potential. OC potential remains to be seen, and Joe Macri, AMD's CTO, made a comment about how 4GB of HBM won't be an issue. Yes, it could be 100% BS and just PR spin, but again, that remains to be seen.
And then someone said AMDMatt is going to recheck to see if he misspoke about not having HDMI 2.0. So really the only truly confirmed "issue" at this point is no DVI. Not exactly what I'd call extensive.
Exactly. It is a head scratcher that AMD did not go HDMI 2.0, but it isn't as huge a deal as some people are making it out to be.
The 4K 60 Hz TV PC gaming people are like 0.000001% of gamers; it's a tiny minority who want to game like that.
[jF] said: I think part of it is that their major competitors have had this feature for 9 months (!!!!!), yet AMD haven't seen fit to update theirs! Pretty poor show tbh, but like you say it's not a deal-breaker for most.
Well, I am currently running 2 R9 290X GPUs and bought a 4K TV with HDMI 2.0, only to find out that my R9 290X cards would only run a max refresh rate of 30 Hz (= 30 fps) at 3840 x 2160. I live with it, and games like Shadow of Mordor look damn good, but I have no incentive to update my cards now, so I guess I will go on playing games at 30 fps.
Thanks AMD for helping me save my money!
And in 9 months, MAYBE 0.01% of the population will run 4K 60 Hz on a TV.