AMD Fury series coming soon.

I was going to buy 50 Fury X cards if they had HDMI 2.0. I guess I am going to have to give all my life savings to Nvidia now.
 
Part of the reason I bought my specific high-ish end 4K TV last year was because it could do 4:4:4 at 60fps. I figured I could run content from my computer to it once I upgraded to either a 980 variant or to AMD's new halo card.

Sure, for gaming I'll probably replace my ZR30W next year with a 30-32 inch DisplayPort-capable screen, but there is something to be said for watching very fluid content on a 65 inch screen. If this is accurate, it really would be a big and stupidly self-inflicted strike against AMD.
 
If anything, that overclocking comparison chart doesn't exactly paint a rosy picture, lol. So I think it's safe to just take it at face value.
 
AMD sure picks some specific graphics settings.

Not sure if I should read too much into AMD choosing to do an OC comparison at only +100 MHz over stock.

You should see how Nvidia does benchmarks and reviews.
It's horrible.
 
http://forums.overclockers.co.uk/sho...8676801&page=9



Someone from AMD (legit source) posted over at the Overclockers forum that the Fury doesn't support HDMI 2.0. I hope this isn't true, because I have a Samsung 2015 4K 48" display and will have to go with Nvidia. Seems kind of dumb for a card targeted at 4K gaming not to offer HDMI 2.0 support at this point in time.

I had read that the Nvidia cards with HDMI 2.0 are not really HDMI 2.0. They only utilize a little more bandwidth to enable 4K at 60 Hz, without any of the other HDMI 2.0 or 2.0a goodies. This is still problematic for the few of us that want to use a card in an HTPC and play Ultra HD Blu-rays or 4K streams with HDR, improved color depth, the new copy protection and so on. But maybe AMD will let you game at 4K at 60 Hz and just isn't going to label it HDMI 2.0. Just a thought.

It's still no help to me, as I want an HDMI 2.0a card to use with an HDMI 2.0a receiver and TV. I don't have to worry until there is a 4K TV better than (and less than $10,000) my trusty old Kuro.
 
I had read that the Nvidia cards with HDMI 2.0 are not really HDMI 2.0. They only utilize a little more bandwidth to enable 4K at 60 Hz, without any of the other HDMI 2.0 or 2.0a goodies. This is still problematic for the few of us that want to use a card in an HTPC and play Ultra HD Blu-rays or 4K streams with HDR, improved color depth, the new copy protection and so on. But maybe AMD will let you game at 4K at 60 Hz and just isn't going to label it HDMI 2.0. Just a thought.

It's still no help to me, as I want an HDMI 2.0a card to use with an HDMI 2.0a receiver and TV. I don't have to worry until there is a 4K TV better than (and less than $10,000) my trusty old Kuro.

You are incorrect. Nvidia's cards utilize the full HDMI 2.0 spec; they just don't include HDCP 2.2 protection, which is what you need to play Ultra HD Blu-rays, Netflix 4K, etc. But this has been known for quite a while. I'm OK with this. I just want to be able to game in full 4K/60Hz 4:4:4.
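For anyone wondering why 4K60 4:4:4 needs HDMI 2.0 in the first place, here's some rough back-of-the-envelope math (assuming the standard CTA-861 4K60 timing and HDMI's 8b/10b TMDS overhead, so the figures are approximate):

Code:
# Rough link bandwidth needed for 3840x2160 @ 60 Hz with 8-bit 4:4:4 color
# (assumed CTA-861 timing: 4400 x 2250 total pixels including blanking)
pixel_clock_hz = 4400 * 2250 * 60        # ~594 MHz pixel clock
bits_per_pixel = 3 * 8                   # RGB / YCbCr 4:4:4, 8 bits per channel
tmds_overhead  = 10 / 8                  # 8b/10b encoding on the TMDS link

required_gbps = pixel_clock_hz * bits_per_pixel * tmds_overhead / 1e9
print(f"4K60 4:4:4 needs roughly {required_gbps:.1f} Gbps")   # ~17.8 Gbps
print("HDMI 1.4 tops out around 10.2 Gbps (340 MHz TMDS)")
print("HDMI 2.0 tops out around 18.0 Gbps (600 MHz TMDS)")

That's why HDMI 1.4 can only manage 4K60 by dropping to 4:2:0 chroma; full 4:4:4 at 60 Hz needs the 600 MHz TMDS clock that HDMI 2.0 brings.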
 
You are incorrect. Nvidia's cards utilize the full HDMI 2.0 spec; they just don't include HDCP 2.2 protection, which is what you need to play Ultra HD Blu-rays, Netflix 4K, etc. But this has been known for quite a while. I'm OK with this. I just want to be able to game in full 4K/60Hz 4:4:4.

Well, HDCP 2.2 is a big part of HDMI 2.0, and you need it for Ultra HD Blu-rays and 4K streams through a receiver and a TV. It doesn't seem like full support to me.
 
AMD should have added HDMI 2.0 and more VRAM to their flagship card.

This is 2015 we are talking about.
 
Here is the full reviewer guide the press received today.
http://videocardz.com/56728/amd-radeon-r9-fury-x-reviewers-guide

Frame Rate Targeting Control

Following along the topic of power draw and efficiency, Frame Rate Targeting Control is a new feature we're introducing with the Radeon™ Fury X graphics card, giving users the control to set a target maximum frame rate when playing an application in full screen mode. The benefit is that FRTC can reduce GPU power consumption (great for games running at frame rates much higher than the display refresh rate) and therefore reduce heat generation and fan speeds/noise on the graphics card. Below is a screenshot of the feature in Catalyst Control Center.

Frame rate targeting control caps performance not only in 3D rendered in-game scenes, but also in splash screens, loading screens and menus, where framerates often run needlessly into the hundreds of fps.

Users might wish to set a very high cap just to limit wasteful fps like that seen in menus and such, while still taking advantage of the responsiveness given by fps well beyond 60.

FRTC is especially useful when rendering relatively ‘easy’ content on powerful hardware, e.g. when you’ve got a relatively low resolution monitor connected to a higher end graphics board, or when playing an older title, or a game with a relatively lightweight graphics load.

Limiting the framerate not only saves power, but also heat and noise, keeping your GPU cool and quiet.

Note: Changes to the Framerate Target must be done outside of the game, i.e. exit the game completely, make your changes, and then start the game again. The current implementation of Frame Rate Targeting Control works with DirectX 10 and 11 titles, and offers targets in the range of 55 to 95 fps.
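For what it's worth, a frame cap like this is conceptually simple. Here's a minimal sketch of the idea (not AMD's actual driver code; render_frame() is just a hypothetical stand-in for drawing one frame):

Code:
import time

TARGET_FPS = 75                     # FRTC accepts targets in the 55-95 fps range
FRAME_BUDGET = 1.0 / TARGET_FPS     # seconds allowed per frame

def run_capped(render_frame):
    # Render frames, then sleep away whatever is left of each frame's budget.
    while True:
        start = time.perf_counter()
        render_frame()              # hypothetical: draw one frame (game, menu, loading screen)
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)   # idle instead of spinning out extra frames

The GPU sits idle during that sleep instead of churning out hundreds of frames nobody sees, which is where the power, heat and noise savings come from.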

Well that ended the speculation we had here - http://hardforum.com/showthread.php?t=1845509&highlight=frame+rate+target+control
 
Hmm, I got an extra half a rent check and security deposit coming my way... Gonna get me at least one of these Fury X's.
 
It's 2015, there are thousands of 4K TVs around, most of which perform better than cheap 4K TN screens, and what AMD delivers is a 2011-era card with HDMI 1.4...
I've been waiting for over a year to get an AMD card that can do 4K@60Hz for my 4K TV. I am just shocked it doesn't come with an HDMI 2.0 port.
AMD is living in a dream world; they have no idea how many people are using their 4K TVs as monitors, and those TVs don't have that ...g DisplayPort.
All I can say is: epic fail, AMD!

I don't care about fps or power draw, just give me a card with HDMI 2.0.
Gonna grab an overpriced GTX 960 for my PC (for 2D purposes) and play games on my PS4.
I am not going to pay ridiculous money for high-end Nvidia cards.
 
And this is spam that we are talking about.

Can you refute my claim? There are enough titles out there that require more than 4 GB of VRAM, and I'm pretty sure we won't see the last of those going forward.
 
Can you refute my claim? There are enough titles out there that require more than 4 GB of VRAM, and I'm pretty sure we won't see the last of those going forward.

Require or utilize? I would like to see some hard proof that you can't run all currently released games with 4GB or less.
 
Require or utilize? I would like to see some hard proof that you can't run all currently released games with 4GB or less.

I have seen my Titan X consuming close to 5.6 GB of VRAM whenever I am in open vegetation in GTA 5. Now please don't tell me that 4 GB of HBM will magically consume less.

This is a flagship card we are talking about, and premium cards should have extra headroom.

I have owned plenty of 290s and 290Xs in the past, and they all had a decent amount of VRAM headroom when they launched in 2013.

I am assuming AMD just couldn't build an 8 GB HBM configuration. There would have been many buyers who would have grabbed the Fury X if it had twice the VRAM headroom.
 
I have seen my Titan X consuming close to 5.6 GB of VRAM whenever I am in open vegetation in GTA 5. Now please don't tell me that 4 GB of HBM will magically consume less.

I wasn't. I was merely pointing out that utilization is not the same as a requirement. Running a game such as GTA V at ultra presets is not a requirement. You are distorting the picture by saying games require more than 4 GB.
 
I wasn't. I was merely pointing out that utilization is not the same as a requirement. Running a game such as GTA V at ultra presets is not a requirement. You are distorting the picture by saying games require more than 4 GB.

So what's the use of a flagship card from AMD if I can't run it at the same settings as a 980 Ti?

Is it more of a mid-range card?
 
You conveniently ignore even [H]'s own review of VRAM utilization. No point in arguing. The review and a lot of people out there have shown VRAM utilization jump to more than 5 GB.

Just give the damn people more VRAM!
 
You conveniently ignore even [H]'s own review of VRAM utilization. No point in arguing.

I do not, and I am not. You choose to ignore what I've said. Utilization is not a requirement. There are games that will utilize all the VRAM you have, even if they don't actually require it for the settings.
 
Wrong. Scene complexity and polygon count, along with the content being rendered, govern VRAM utilization. At 4K we are talking billions of polygons, which naturally pushes VRAM utilization up.

If I went by your explanation, I would have seen close to 10 GB of VRAM usage.
 
You conveniently ignore even [H]'s own review of VRAM utilization. No point in arguing. The review and a lot of people out there have shown VRAM utilization jump to more than 5 GB.

Just give the damn people more VRAM!

What is it with Nvidia owners coming into AMD threads just to troll? Why can't you just be happy with your $1,000... wait, sorry. $650 Titan X?
 
I have seen my Titan X consuming close to 5.6 GB of VRAM whenever I am in open vegetation in GTA 5. Now please don't tell me that 4 GB of HBM will magically consume less.

This is a flagship card we are talking about, and premium cards should have extra headroom.

Notice how AMD mysteriously left GTA 5 out of the 4K benches they released yesterday? It's the biggest release of the year, so naturally people are going to want to know. At first I thought maybe they just weren't including the latest games, but then The Witcher 3 is on there, which is a title that came out a month later.

In any case, I don't think the 4 GB of VRAM is a dealbreaker, honestly; most games will still run fine with it, especially below 4K.
 
 
You conveniently ignore even [H]'s own review of VRAM utilization. No point in arguing. The review and a lot of people out there have shown VRAM utilization jump to more than 5 GB.

Just give the damn people more VRAM!

Just have patience, ffs. Reviews will show how the memory is managed for the 4K gimmick, or in two years' time the Fury X will be rebranded as the Fury XXX with 8 GB of memory and higher clock speeds.
 
How about by the beginning of next year? Games are already out that hit the 4 GB barrier, and there are more coming. Game devs have to keep up with the competition from their peers too...
 
Wrong. Scene complexity and polygon count, along with the content being rendered, govern VRAM utilization. At 4K we are talking billions of polygons, which naturally pushes VRAM utilization up.

If I went by your explanation, I would have seen close to 10 GB of VRAM usage.

A game can show 3.5 GB+ of VRAM usage on an R9 290, yet a 780 Ti with 3 GB of VRAM can still outperform it with higher minimums.

The answer is that the game is caching more in VRAM than it actually needs to make use of.
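As a rough illustration of that (not any engine's real code, just a sketch of the idea), a texture streamer will typically keep assets resident up to whatever budget the card exposes, so reported "usage" tracks the card's capacity more than the game's actual working set:

Code:
from collections import OrderedDict

class TextureCache:
    # Keeps textures resident until the VRAM budget is exceeded, then evicts least-recently-used.
    def __init__(self, vram_budget_mb):
        self.budget = vram_budget_mb
        self.used = 0
        self.textures = OrderedDict()            # texture_id -> size in MB

    def request(self, texture_id, size_mb):
        if texture_id in self.textures:
            self.textures.move_to_end(texture_id)    # recently used, keep it hot
            return
        # Evict only when the new texture would blow past the budget;
        # otherwise keep everything cached, because free VRAM does nothing for you.
        while self.used + size_mb > self.budget and self.textures:
            _, evicted_size = self.textures.popitem(last=False)
            self.used -= evicted_size
        self.textures[texture_id] = size_mb
        self.used += size_mb

# A 12 GB card and a 4 GB card rendering the same scene report very different "usage":
# the bigger card simply never has to evict, so its counter keeps climbing.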
 
Does anyone know if there will be aftermarket water blocks for this card? If I get this card, I want to use my current cooling loop instead of the stock cooler.
 
People who have the money to buy these cards, or multiple cards, aren't waiting a year. A year is a long time for them; they upgrade tech every couple of months. You think people with a Titan X just stick to old 1080p monitors? They get the newest shit they can get their hands on.

The Fury X is AMD's top-of-the-line card. They need to cater to their audience (the top percent of the enthusiast market), and that's the people who have the money to buy multiple gaming monitors, large 40"+ 4K TVs, beastly systems with custom watercooling, etc. It just feels like they are alienating a large portion of the market that has the actual money to buy the card. Everyone else sitting on their 1080p monitors and R9 270 / GTX 670 graphics cards probably isn't actually buying a $650 graphics card; they just want to talk about it and how great AMD or Nvidia is compared to the competition.
 