Richard Huddy talks about rebrands, Fury X, 4 GB limits & more.

So... he seemed to agree with the assertion that AMD themselves were putting out an active adapter for HDMI 2.0. Hm.

Interesting vid.
 
I would've liked to see him sweat a bit more over the exaggerated 300 series rebrand claims >P

Specifically mentioning the clock-for-clock and power tests.

Likewise the performance gains that appear to come from the newer 300 series drivers.

Will we see these at some point for other cards (in a unified driver), or will we be artificially crippled (with separate forks), as some pessimists on Guru3D speculate?
 
The semantics argument is really a deflection, skirting the actual issue: a newer generation of products that we had to wait for offers minimal improvements/differences but vastly higher costs.

I'm sure from AMD's perspective it's beneficial in terms of margins, well, if they can actually move the units, but for actual consumers there are no real gains.
 
Actually, the 300 series launched with AIB cards, so there were hefty differences in performance levels. For instance, the MSI 390X here showed little to no increase, yet the Sapphire 390 (non-X) showed improvement over the 290X in most tests. Results were all over the place.

Does anyone know if there were designations to differentiate the 200 series dies from the 300 series dies using GPU-Z? The only thing that concerns me is that some may use 200 series chips in 300 series cards to clear stock. Maybe that is why results were so inconsistent.
 
You can't look at it without the entire context. You need to consider how much of a price increase you are paying, along with the wait time. That is the issue: the market is not more consumer-favorable based on this release than it was six months ago.

GPU-Z will show Grenada. I guess you can also technically remove the HSF and look at the die as well.
 
The wait was to clear stock, a business maneuver. That was stated at the beginning of the year and restated after the first quarter. Technically there is no price hike, as the original price of the 290X was $549 with 4 GB. Even now the MSRP of the 290X 8 GB is $429, which is the price of the 390X 8 GB. Prices below MSRP do not constitute price hikes for cards that sell at MSRP.
 
OK

Here is my take on AMD's issues:

1. Don't ever say you don't compete in a market where you do.
2. Please go back to releasing low- to mid-range parts first on new tech, so you can tweak it for the high-end card that comes later.
3. Get the FIRE back in your soul. Remember the epicness that was the 9500, 1800, 3870, 4870, and 5870.
4. Get out in front of people, even if you have to do it on a reduced budget.

A lot of us buy your stuff as a value/performance deal, and this time you have let us down. There is NO WAY I am going to spend $650 on a card that does not dominate the performance of the other card at the same price point.
 

For now you are correct but I am hopeful of driver refinement. Look at GCN altogether, especially the 7970. It is possible this is as good as it will be, but I find that to be highly unlikely.
 
We will see driver improvements, but I do not think we will get to the point of dominating the 980 Ti; at best we get equal or slightly better performance.

Hearing that AMD guy talk about using system RAM as GPU memory was just painful. We all know what happens when you page-swap from super fast memory to slower memory...
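The page-swap concern is easy to put numbers on. A rough back-of-envelope sketch in Python, using approximate spec-sheet bandwidth figures (HBM1 on Fury X around 512 GB/s, PCIe 3.0 x16 around 16 GB/s; these are rough public numbers, not measurements):

```python
# Back-of-envelope transfer times, illustrating why spilling from HBM to
# system RAM over PCIe hurts. Bandwidths are approximate spec figures.
HBM_GBPS = 512.0        # HBM1 on Fury X, roughly
PCIE3_X16_GBPS = 15.8   # PCIe 3.0 x16, roughly

def transfer_ms(size_gb, bandwidth_gbps):
    """Time in milliseconds to move size_gb at the given bandwidth."""
    return size_gb / bandwidth_gbps * 1000.0

# Moving a 1 GB working set:
hbm_ms = transfer_ms(1.0, HBM_GBPS)         # ~2 ms
pcie_ms = transfer_ms(1.0, PCIE3_X16_GBPS)  # ~63 ms

print(f"HBM:  {hbm_ms:.1f} ms")
print(f"PCIe: {pcie_ms:.1f} ms ({pcie_ms / hbm_ms:.0f}x slower)")
```

Whatever the exact figures, the bus is more than an order of magnitude slower than the local VRAM, which is the whole point of the objection.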
 
Improved silicon and power efficiency on the 300 series? :D He's talking some serious bullshit, right?
 
That's not all; he also stated that 4 GB is enough because they can use system RAM as well as VRAM since they have HBM. What the hell happened to the PCIe bus bottleneck? lol
 
We as AMD fans hate to admit it, but Nvidia actually helped keep the price of the top-tier AMD cards from being too high, because had the 980 Ti not released when it did, the Fury X would be flying off the shelves at a hundred dollars higher, all while still being just a 4 GB card.
Not saying the Fury is a bad card, but it's way less future-proof than the eventual 8 GB versions. I would love to see more about memory usage in benchmarks on that card than what we have been shown so far. [H] seemed to talk about it a lot, but didn't show any screenshots of that specifically, or did they?
 
We only got our Fury X card Saturday morning. We would have had it Friday, but AMD sent the card out signature-required and did not give us notice of shipment or a tracking number. Brent was burning the candle at both ends to get out what we did.

We are going to finish up a 390X overclocking article, then we will move back to Fury X coverage that is a bit more granular at the 4K resolution, which we think it deserves. We will cover highest playable settings at 4K, do a set of apples-to-apples comparisons with GameWorks features on and off, and include the Titan X in the results as well. We will also focus on VRAM usage and graph that data as we move through it. We will try to get a couple more games in as well. This is a ton of work for Brent to do. We hope to get this out before the 14th. As always, things are subject to change, but this is the focus we are moving forward with.
 
I get the push for 4K, but it seems unreasonable. Even with $1000+ dual-card systems it's tough to get ultra settings and decent frame rates. 2560x1440 seems like a decent step up from 1080p while still allowing good frame rates and higher graphics settings on single-card systems.
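The pixel counts behind that trade-off, as a quick sanity check (standard resolutions, nothing assumed beyond them):

```python
# Pixel counts for the resolutions discussed. 4K pushes exactly 4x the
# pixels of 1080p, while 1440p is a more modest ~1.78x step up.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```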
 
Compared to Nvidia's products they're very competitive. The 390X competes well with the GTX 980 for a vastly lower price ($429 vs. $550). That's all that matters. (Looking now, I guess the 980 has been reduced to $480; still $50 more and half the RAM.)

Nvidia has an edge in power usage if you care about that (I don't). Yet the Fury X, I think, uses less power than the 980 Ti, so the tables are turned there and people are silent; suddenly power is no longer a problem.

It just highlights how bad Nvidia's lineup was, in a way. Everybody says "the 390X is no better than the old 290X," yet go look at the benchmarks: it trades blows with the $550 980 for $429, AND has double the RAM. The R9 390 is an even better deal, sporting a massive 8 GB for $329. Granted, Nvidia's 970 is $329 and very close to their 980 in performance (then again, it has only 3.5 GB of full-speed RAM vs. 8 GB, if RAM is an issue). It's not really AMD's fault Nvidia was charging a $200 premium for a few percent more performance.
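To make one slice of that value argument concrete, here is a price-per-GB-of-VRAM calculation using only the prices quoted in this thread (mid-2015 figures, not current); it deliberately ignores performance differences:

```python
# Dollars per GB of VRAM for the cards discussed above.
# Prices are the ones quoted in the thread (circa mid-2015).
cards = {
    "R9 390X": {"price": 429, "vram_gb": 8},
    "GTX 980": {"price": 480, "vram_gb": 4},
    "R9 390":  {"price": 329, "vram_gb": 8},
    "GTX 970": {"price": 329, "vram_gb": 4},  # 3.5 GB of it at full speed
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['vram_gb']:.2f} per GB of VRAM")
```

By this one metric the 390 at 8 GB for $329 is the standout, which is exactly the point being made; whether VRAM capacity matters to you is a separate question.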

I don't see a problem with it at all. I would rather we got Tonga-like architecture improvements, but they probably weren't worth implementing for AMD.
 
The 980/970 is a superior architecture to Hawaii, if I go by the AMD fanboy logic they apply to the Fury X ;)
 
It's funny, I hadn't really seen you around until the new AMD stuff. Then you come here pooh-poohing AMD and have the gall to call people fanboys.
 
HDMI 2.0 should be onboard the card itself; an adapter shouldn't be necessary. I prefer avoiding adapters whenever possible, especially when you are talking about higher-resolution displays and signaling.
 
How can you link to the [H] review showing the same performance at the same clocks and then try to claim they are pulling any shenanigans with the drivers? The [H] review completely disputes any claim that the drivers offer different performance.
 
Tahiti beats GK104 and Hawaii beats GK110 but it didn't start out that way.
 
A bit of bad wording on my part, perhaps. I'm not suggesting there's a heinous driver conspiracy, or that the 300 series cards are winning due to vastly better software.

We all agree that the [H] article debunks his claims that the 300 series rebrands are:
>More efficient
>Good/better value
>Significantly faster

As far as I can see, the results from the cards are practically identical, and there's no shame in grabbing a previous-series card, especially if it's on offer, as I did.

I'll be amazed if the higher memory count makes any difference in CrossFire or otherwise; I'd expect that for most launch titles that feature simply won't work, like usual!


Eurogamer have done a similar comparison further down the page.

I was only really suggesting that there do appear to be some improvements when using modded 300 series drivers on a 200 series card: Assassin's Creed Unity is improved, my cutscene issues in GTA V are solved, as is my crashing in The Witcher 3, and others report less stuttering overall and better results in synthetic tests, which should bode well for the future.

Why keep the older cards on the outdated driver branch when a few modders can add a few lines to the ini?
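For context, the "few lines" modders add live in the driver's INF file, which maps PCI hardware IDs to driver packages. A purely illustrative sketch of the idea; the section names, device strings, and IDs below are assumptions for illustration, not taken from any actual AMD driver package:

```ini
; Hypothetical excerpt from a modded display-driver INF.
; Adding an older card's PCI hardware ID to the new driver's device
; list lets the installer accept that card. All names/IDs illustrative.
[ATI.Mfg.NTamd64.6.3]
"AMD Radeon R9 390 Series" = ati2mtag_R9390, PCI\VEN_1002&DEV_67B1
; Line a modder might add so a 200 series card installs the same driver:
"AMD Radeon R9 290 Series" = ati2mtag_R9390, PCI\VEN_1002&DEV_67B0
```

The catch, as the reply below this post notes, is that a hand-edited INF skips the validation and signing a vendor would normally do before listing a device as supported.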
 
Likely because they didn't have time to certify them with all cards. There's a difference between someone hacking drivers and a company coming out with drivers that are certified for a particular device.
 
But you have to look at all the reviews to see what might have happened. PCPer did an R9 390 (non-X) review and it was beating the 290X. Also, I think theirs was a Sapphire whereas [H]'s was an MSI; it seems the AIBs were the difference more than any driver. PCPer also went back and tested the driver to see if there was any difference or hanky-panky and found none. Even the power usage was improved, albeit slightly, which is notable when you consider it carries 8 GB of VRAM instead of 4 GB.

It is unfortunate, but this same thing happened with the 280X, if you kept up with such things. The first batch used the old chip, with the second batch being the new revision. I don't remember exactly what the improvements were; I just remember a lot of complaining when it was found out.
 
I can't see YouTube at work, but I did catch this the other day:

AMD: The Radeon R9 390 and 390X Aren’t Rebadges – New Power Management uArch, Higher Bandwidth and 8GB vRAM

· Complete rewrite of the GPU's power management micro-architecture
· Under "worst case" power virus applications, the 390 and 390X have a similar power envelope to the 290X
· Under "typical" gaming loads, power is expected to be lower than the 290X while performance is increased

http://wccftech.com/amd-radeon-r9-390-390x-not-rebadges-power-optimization/


If that is true, then 290X => 390X is sort of like Trinity => Richland.
 
To be honest, this sounds like fighting over semantics and grasping at straws. For example, the GTX 770 had some minor stuff tweaked, but it is called a GTX 680 rebrand and no one argues about it. The 390 and 390X also have stuff tweaked, plus an extra 4 GB added (the only really major thing), but otherwise it's exactly the same chip, to the point that people have been flashing the 290X with the 390X BIOS.
 
You could almost say it's the same, and I think that's also including the power usage from the watercooling kit, since it's the entire system's usage (could be wrong though). Although I have to give them credit: for what they added, I'm amazed it still uses less power than the 290X. I fully expected it to be a power-hungry monster when they added the watercooling.
 
Guess we have to wait and see what the 980 Ti Hybrid uses to get an idea of the water pump's power usage.
 
This latest AMD refresh has been so polarizing and, personally, disappointing.
 