AMD Fury series coming soon.

He says "don't touch it" yet AMD is smathering the whole thing in thermal paste lol.
We already saw what Fiji looked like with its heatsink freshly removed.

the paste they use is non conductive and I'm sure has a process used to ensure none of the components are damaged or interfered with. different than joe blow putting AS5 all over it with a qtip
 
the paste they use is non conductive and I'm sure has a process used to ensure none of the components are damaged or interfered with. different than joe blow putting AS5 all over it with a qtip
Realistically there's no excuse for removing the Fury X's heatsink.
You can't really put it in your own loop (just buy a regular Fury and get a better PCB). The only reason to remove it would be to replace the paste with your own.

The card is already running below 60C; what else do you want? :cool:
 
https://en.m.wikipedia.org/wiki/DisplayPort#Cost

[DisplayPort has a royalty rate of $0.20 per unit from patents licensed by the MPEG LA.[45] The royalty applies to DisplayPort products that are manufactured or sold in countries that are covered by one or more of the patents in the MPEG LA license.[45] The MPEG LA license includes DisplayPort patents from Hitachi Maxell, Philips, Silicon Image, and Sony.]

Lots of defenders of AMD's bad business practice. Apparently there are only 10 individuals who buy 4K TVs, and these same individuals are in all the Fury X threads all over the internet. These folks sure get around. :rolleyes:

Never mind the fact that the market for TVs exceeds monitors 100 to 1. There are economies of scale involved in TVs, and any monitor over $400 is a niche product. Apparently these individuals believe the opposite, because in their world nobody buys TVs and DisplayPort is the de facto connection standard. Never mind that HDMI exceeds DisplayPort by a magnitude of 100 to 1, counting game consoles, DVD players, cable boxes, etc.

There's no point in arguing about this anymore because fanboyism is irrational. The proof will be 3rd quarter AMD and Nvidia financial statements. If AMD doesn't lose market share, then we will agree that they made the right choice in not including HDMI 2.0. Let's move on to the benchmarks.
 
Realistically there's no excuse for removing the Fury X's heatsink.
You can't really put it in your own loop (just buy a regular Fury and get a better PCB). The only reason to remove it would be to replace the paste with your own.

The card is already running below 60C; what else do you want? :cool:


regular fury is supposed to be cut down
 
https://en.m.wikipedia.org/wiki/DisplayPort#Cost



Lots of defenders of AMD's bad business practice. Apparently there are only 10 individuals who buy 4K TVs, and these same individuals are in all the Fury X threads all over the internet. These folks sure get around. :rolleyes:

Never mind the fact that the market for TVs exceeds monitors 100 to 1. There are economies of scale involved in TVs, and any monitor over $400 is a niche product. Apparently these individuals believe the opposite, because in their world nobody buys TVs and DisplayPort is the de facto connection standard. Never mind that HDMI exceeds DisplayPort by a magnitude of 100 to 1, counting game consoles, DVD players, cable boxes, etc.

There's no point in arguing about this anymore because fanboyism is irrational. The proof will be 3rd quarter AMD and Nvidia financial statements. If AMD doesn't lose market share, then we will agree that they made the right choice in not including HDMI 2.0. Let's move on to the benchmarks.

Lol, where were you when fanboys were defending the 970 with its memory issues? That is a lot more serious than some people who use TVs as monitors.
 
regular fury is supposed to be cut down
Oh yeah, good point.
I hope all the water-cooling enthusiasts out there have steady hands.

I would be very surprised if Fury is cut down. Although AMD has surprised me a lot this cycle. In a lot of bad ways.
 
https://en.m.wikipedia.org/wiki/DisplayPort#Cost



Lots of defenders of AMD's bad business practice. Apparently there are only 10 individuals who buy 4K TVs, and these same individuals are in all the Fury X threads all over the internet. These folks sure get around. :rolleyes:

Never mind the fact that the market for TVs exceeds monitors 100 to 1. There are economies of scale involved in TVs, and any monitor over $400 is a niche product. Apparently these individuals believe the opposite, because in their world nobody buys TVs and DisplayPort is the de facto connection standard. Never mind that HDMI exceeds DisplayPort by a magnitude of 100 to 1, counting game consoles, DVD players, cable boxes, etc.

There's no point in arguing about this anymore because fanboyism is irrational. The proof will be 3rd quarter AMD and Nvidia financial statements. If AMD doesn't lose market share, then we will agree that they made the right choice in not including HDMI 2.0. Let's move on to the benchmarks.

The problem is not that no one buys TVs, it's that not a lot of people buy $1300 worth of graphics cards, and even people who would spend that money may not prefer multi-card setups. It is irrational for everyone to be up in arms, because not a lot of people buy multi-card setups and not a lot of people need HDMI 2.0.

Don't believe me? Go look at benches like 3DMark. When it says your score is better than 99% of everyone else's, that means not a lot of people own that hardware.

As far as monitors go, let's make a poll just to see what the users on [H] use to play games at 4K (what HDMI 2.0 is used for). Then we can have a realistic number that shows how niche what everyone is spouting about really is.
 
Go to the Steam hardware page and look at the percentage of gamers that game above 1440p. The percentage is ridiculously low. And yet the Fury X isn't good enough, or only has 4GB, no HDMI 2.0, blah, blah... ridiculous
 
What do you mean by artificial support? It's all about bandwidth and connectivity. You can create a connector of any shape, but it has to be able to carry enough bandwidth for your targeted resolution. As it turns out, HDMI 2.0 is perfectly capable of doing just that.

I mean, I see HDMI as a dying format, one that is reaching the end of its life. Sure, HDMI 2.0 works today, but it's right at the limit of its capacity, and I have a hard time seeing it move further than where it is today, when we already have a better solution in DisplayPort. The only thing needed is for the TV manufacturers to see it too and start implementing DP in TVs alongside HDMI for a transition period. The specs for HDMI 2.0 are enough for today, but will they be in one, two, or three years? I doubt that.
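The "at the limit" point checks out on the back of an envelope. A minimal sketch, assuming the standard 4K60 timing (4400 x 2250 total pixels including blanking) and 8b/10b line coding on both links:

```python
# Link budget for 4K60 at 24 bpp over HDMI 2.0 vs. DisplayPort 1.2.
# Timing: standard 4K60 mode, 4400 x 2250 total pixels including blanking.
h_total, v_total, fps, bpp = 4400, 2250, 60, 24

payload = h_total * v_total * fps * bpp  # bits/s the mode actually needs
hdmi20_effective = 18.0e9 * 8 / 10       # 18 Gbit/s raw TMDS, 8b/10b coding
dp12_effective = 21.6e9 * 8 / 10         # 4 lanes of HBR2, 8b/10b coding

print(f"4K60 payload:       {payload / 1e9:.2f} Gbit/s")           # 14.26
print(f"HDMI 2.0 effective: {hdmi20_effective / 1e9:.2f} Gbit/s")  # 14.40
print(f"DP 1.2 effective:   {dp12_effective / 1e9:.2f} Gbit/s")    # 17.28
```

So HDMI 2.0 carries 4K60 with almost no headroom left, while DP 1.2 has roughly 3 Gbit/s to spare, which is exactly the "on the limit of its capacity" situation described above.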
 
The problem is not that no one buys TVs, it's that not a lot of people buy $1300 worth of graphics cards, and even people who would spend that money may not prefer multi-card setups. It is irrational for everyone to be up in arms, because not a lot of people buy multi-card setups and not a lot of people need HDMI 2.0.

Don't believe me? Go look at benches like 3DMark. When it says your score is better than 99% of everyone else's, that means not a lot of people own that hardware.

As far as monitors go, let's make a poll just to see what the users on [H] use to play games at 4K (what HDMI 2.0 is used for). Then we can have a realistic number that shows how niche what everyone is spouting about really is.

What you're forgetting is that this is a niche GPU that's marketed at high-end 144Hz and 4K. Just look at AMD's marketing for the Fury X (hint: 4K, 4K, 4K). The Samsung 32" 4K was $2000, now $1300. The most reasonable 4K monitor is the 32" BenQ for $1000. A 40" 6500 is $700. A 48" quantum friggin dot 4K TV is $1500. What makes the most sense? In every single article on the Fury X there are a bunch of people complaining about HDMI 2.0. Are they all Nvidia paid shills or actual consumers? You decide, but I think this horse has been beaten to the ground. 3rd quarter results will tell us everything.
 
What you're forgetting is that this is a niche GPU that's marketed at high-end 144Hz and 4K. Just look at AMD's marketing for the Fury X (hint: 4K, 4K, 4K). The Samsung 32" 4K was $2000, now $1300. The most reasonable 4K monitor is the 32" BenQ for $1000. A 40" 6500 is $700. A 48" quantum friggin dot 4K TV is $1500. What makes the most sense? In every single article on the Fury X there are a bunch of people complaining about HDMI 2.0. Are they all Nvidia paid shills or actual consumers? You decide, but I think this horse has been beaten to the ground. 3rd quarter results will tell us everything.

What makes the most sense to me? The $1000 4K monitor. I mean, 4K TVs are nice and all, but if I'm going to game from the couch, upscaled 1080p or 30Hz 4K won't bother me one bit, since I'd be playing on the TV to relax.

If I wanted to be serious about refresh rates and FPS while gaming, that's why I would have a desktop with either a high-refresh 1440p/1600p display or a 60Hz+ 4K monitor.

Simple, people: if the lack of HDMI 2.0 bothers you, don't buy a Fury. Also, no point in complaining about it if it doesn't have what you need.
 
Personally, I am not happy with the guy's lackluster answer, "it was a time constraints issue."

Look, if you are going to price a video card at $649 and release it in the middle of 2015, it had better make me, your customer, feel like I just bought the hottest, latest, and greatest video card ever.

I don't get that from this product at all.

Like a lot of others, I'm pretty bummed out.
 
What you're forgetting is that this is a niche GPU that's marketed at high-end 144Hz and 4K. Just look at AMD's marketing for the Fury X (hint: 4K, 4K, 4K). The Samsung 32" 4K was $2000, now $1300. The most reasonable 4K monitor is the 32" BenQ for $1000. A 40" 6500 is $700. A 48" quantum friggin dot 4K TV is $1500. What makes the most sense? In every single article on the Fury X there are a bunch of people complaining about HDMI 2.0. Are they all Nvidia paid shills or actual consumers? You decide, but I think this horse has been beaten to the ground. 3rd quarter results will tell us everything.

0.06% of Steam users play on 4K displays. Steam has 125 million users. Out of everyone that plays on Steam (the place you play all the modern games, so everyone), that is 75,000 users, assuming they all participated in the survey. We also have to consider that 4.93% of users (being very generous) have 4K-capable cards (not even at 60 fps), so that is (75,000 * 0.0493) about 3,700 people. The percentage that can play at 60 fps 4K (with settings at medium) is 2.29% (I added the same amount for the 980 Ti as for the 980, since it is not listed yet), so (75,000 * 0.0229) about 1,700 users can play at 60 fps 4K. Seeing as they already play on 4K, they must have cards. So who does this affect? The 1000 or so people complaining online?
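That estimate is easy to reproduce. Note that 0.06% of 125 million is 75,000; the other percentages are the post's own generous assumptions from the Steam survey, not fresh data:

```python
# Estimating how many Steam users the missing HDMI 2.0 port could affect,
# using the percentages assumed in the post above.
steam_users = 125_000_000
share_4k_display = 0.0006  # 0.06% report a 4K primary display
share_4k_capable = 0.0493  # generous share owning 4K-capable cards
share_4k_60fps = 0.0229    # share with cards managing 4K at 60 fps (medium)

users_4k = steam_users * share_4k_display  # 75,000
capable = users_4k * share_4k_capable      # ~3,700
at_60fps = users_4k * share_4k_60fps       # ~1,700

print(f"{users_4k:,.0f} on 4K, ~{capable:,.0f} capable, ~{at_60fps:,.0f} at 60 fps")
```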

AMD is on its last legs in my opinion. They are hemorrhaging money and will most likely file for Chapter 11 in a few years. A million dollars lost from the 1000 people genuinely complaining is not going to make or break the company. The Fury X will sell out regardless and continue to sell. How much market share AMD loses is more important; they need a better brand to win back all the mid-range users.
 
I really want to support AMD, but same price as the 980 Ti, minus 2GB of VRAM, no HDMI 2.0, and a mandatory AIO watercooler. It's a no-brainer to go green.
 
AMD is on its last legs in my opinion. They are hemorrhaging money and will most likely file for Chapter 11 in a few years. A million dollars lost from the 1000 people genuinely complaining is not going to make or break the company.

Feels like I've read this some years ago...
 
Takes a lot of backwards logic to describe free watercooling as a 'con'.
I, for example, would not purchase one, since I'm currently rocking a Mini-ITX build.

There is no place for me to put a radiator, while the Titan X (which I have), or the 980ti would fit and work just fine.
 
People are benchmarking these cards at similar performance to the Titan X, and he's mad because he paid $1000 for it.
 
I'd find a place for the rad... card is shorter. Grab a drill and make a place. :D

When do official reviews come out? Midnight or during the day sometime?
 
I hope AMD does well. Without AMD, we will get incremental 10% increases from Nvidia, like Intel CPUs now. It's a shame that I can't give them my money.
 
I met this guy at PDX LAN 24 last year, he was pretty cool. Some lucky bastard won a FREE Sapphire 7950 with 8 gigs of VRAM when he asked "Monty Python or Dr Who?"
 
"Cash, cash equivalents and marketable securities were $906 million at the end of the quarter, down $134 million from the end of the prior quarter." http://www.amd.com/en-us/press-releases/Pages/press-release-2015apr16.aspx

So how many quarters are left at that rate? This time it seems more real than last time, or the time before that, but hey, they could get another loan.
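Taking the press-release figures at face value, and assuming a constant burn rate (which it certainly won't be), the runway works out like this:

```python
# Quarters of runway at a constant burn rate, from the Q1 2015 figures above.
cash = 906e6              # cash, equivalents and marketable securities
burn_per_quarter = 134e6  # decline from the prior quarter

runway_quarters = cash / burn_per_quarter
print(f"Roughly {runway_quarters:.1f} quarters at that rate")  # ~6.8
```

Call it a bit under two years before the cash runs out, ignoring any new loans, asset sales, or a change in burn rate.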


Well, I think their graphics division will keep them going, possibly flat to a small loss, much less than they had this quarter, until Zen comes out. If Zen isn't good, they are going to have problems.

Their credit rating has been slashed from B+ to B- this past quarter, if I'm not mistaken, so loans are out of the question, not to mention they are paying high interest on the loan they are in right now.

Now, if bankruptcy happens, for the first few quarters it's only to discharge debt, so there will be a new executor, but the company can still operate as it has. Which might be a good thing, but only in the short term.
 
FuryLicious
InFuryAting
InGloriusFury
LordOfTheFury
HarryFuryPotter
StarTrekTheNextFury
MayTheFuryBeWithYou

I'm Batman
 
Go to the Steam hardware page and look at the percentage of gamers that game above 1440p. The percentage is ridiculously low. And yet the Fury X isn't good enough, or only has 4GB, no HDMI 2.0, blah, blah... ridiculous

Maybe AMD shouldn't have marketed it so hard as a "4K card", with 4K slathered over all the slides in their E3 presentation.

And Steam HW survey doesn't account for all the people like me gaming 4K on a 1080 or 1440 display with DSR/VSR.
 