From ATI to AMD back to ATI? A Journey in Futility @ [H]

RX 480, aptly named, as it is the spiritual successor to the GTX 480, only at a lower (relative) performance level

Kyle, why are you buying two? I thought they sent you one for review.
 
Kyle, why are you buying two? I thought they sent you one for review.
Video card companies have in the past sent cards to reviewers that were substantially different from the cards available at retail.
And by in the past, I mean last month.
 
Video card companies have in the past sent cards to reviewers that were substantially different from the cards available at retail.
And by in the past, I mean last month.


Hmm, this seems unnecessary for this launch; I don't see how the retail cards could possibly be any worse. Maybe they're better, and actually draw 110W and overclock 10%.
 
RX 480, aptly named, as it is the spiritual successor to the GTX 480, only at a lower (relative) performance level

Kyle, why are you buying two? I thought they sent you one for review.

Their card is probably only theirs for a limited time. Most of the time, review samples are called back for return, or they're lent for a certain review period and then it's done, no more. This is common with some manufacturers, such as ASUS: "hey, here's the review sample of our card, you can use it from this date to that date, then you have to return it to us so we can sample it to another site."

As far as I know, only Nvidia, with their own Nvidia-branded cards, lets reviewers keep the cards. With AMD that's probably not the case.
 
Hmm, this seems unnecessary for this launch; I don't see how the retail cards could possibly be any worse. Maybe they're better, and actually draw 110W and overclock 10%.

The problem is that 110W draw (versus the 160W seen by reviewers) probably means a substantial drop in performance. You can lower power for a particular chip by lowering the voltage (which will decrease how fast a CMOS chip can run) and/or lowering the clock speed. Neglecting thermal effects (that is, if you can keep the die at a constant temperature) and leakage, it winds up being roughly a P ∝ V³ relationship: max clock increases roughly linearly with voltage, and power increases as voltage squared times clock speed. That's the physics of CMOS.

So unless AMD just happened to send all the reviewers cards from a yield corner full of leaky transistors, there is no way 110W retail cards will perform anywhere near as well as 160W reviewed cards.

We can get a rough sense of the leakage from the idle numbers, and that won't change much, I think. So a power-reduced 110W card will probably deliver around 85% or so of the performance of a 160W card, which would make the card a lot less attractive.

Disclaimer: this is based on my knowledge of Intel's pre-FinFET CMOS processes circa 2005. I left Intel in 2007.
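A back-of-the-envelope sketch of that estimate (assuming the idealized P ∝ V³ scaling above, with leakage and fixed board power ignored; the 110W and 160W figures are the ones from this thread):

```python
# Rough estimate: relative performance of a 110 W card versus the
# 160 W review cards, under the idealized CMOS scaling P ~ V^3 and
# f ~ V, which together give f ~ P^(1/3). Leakage and fixed board
# power (memory, VRMs, fan) are ignored, so treat this as optimistic.

P_REVIEW = 160.0  # W, board power seen by reviewers
P_RETAIL = 110.0  # W, AMD's claimed figure

clock_ratio = (P_RETAIL / P_REVIEW) ** (1.0 / 3.0)
print(f"estimated clock/perf ratio: {clock_ratio:.1%}")  # ~88%
```

That lands right around the ~85% ballpark above once real-world leakage eats a bit more of the budget.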
 
The problem is that 110W draw (versus the 160W seen by reviewers) probably means a substantial drop in performance. You can lower power for a particular chip by lowering the voltage (which will decrease how fast a CMOS chip can run) and/or lowering the clock speed. Neglecting thermal effects (that is, if you can keep the die at a constant temperature) and leakage, it winds up being roughly a P ∝ V³ relationship: max clock increases roughly linearly with voltage, and power increases as voltage squared times clock speed. That's the physics of CMOS.

So unless AMD just happened to send all the reviewers cards from a yield corner full of leaky transistors, there is no way 110W retail cards will perform anywhere near as well as 160W reviewed cards.

We can get a rough sense of the leakage from the idle numbers, and that won't change much, I think. So a power-reduced 110W card will probably deliver around 85% or so of the performance of a 160W card, which would make the card a lot less attractive.

Disclaimer: this is based on my knowledge of Intel's pre-FinFET CMOS processes circa 2005. I left Intel in 2007.

Thank you for sharing that. I want to see how Kyle's purchased 480s compare with his review sample.
 
Well, I imagine [H] will be doing an X-Fire review, as that was a significant part of AMD's Computex keynote.
Should be interesting to see if the advertised setup (dual RX 480s) is a compelling alternative to a single 1080.
 
Kyle, why are you buying two? I thought they sent you one for review.
Yes, but Brent and I are not in the same location. I bought the two to do specific power and VR testing here in my office. I have all the power equipment here and have all the VR equipment as well. Brent stays focused on the real-world desktop content, which requires a constant flow. I can handle these asides without totally stalling Grady and Brent's schedule.
 
[Image: emotional AMD commercial invites you to join the Radeon Rebellion]


I think we can all agree that Raja Koduri and Radeon Technologies Group will rebel against AMD. LMAO.
 
Sooooo

AMD Robert is claiming my statement regarding the 150W TDP, the newly claimed 110W, and the measured 170W is wrong. He says I am disseminating false information.

Have at him, folks.

 
LOL, so the GPU alone uses 110 watts and the rest is from the other components? Is that how they are measuring power now, only on the GPU?

Let's add another term in there: TGP. We had TDP and TBP before!

What are these hairless monkeys doing? LOL.
 
Sooooo
AMD Robert is claiming my statement regarding the 150W TDP, the newly claimed 110W, and the measured 170W is wrong. He says I am disseminating false information.
Have at him, folks.

I like that you quoted my post on the implications of violating the PCI Express trademark. :)
And who the hell cares what the chip alone (absent memory and voltage regulators) draws? Walking back their power spec to the chip alone is just lame. Consumers aren't buying the chip, and the chip can't do a damn thing without memory and voltage regulators.

I smell a desperate marketing department trying to keep the stock price from collapsing until their next stock-option exercise window opens. There's always a blackout period surrounding big events like this product launch ...
 
I like that you quoted my post on the implications of violating the PCI Express trademark. :)
And who the hell cares what the chip alone (absent memory and voltage regulators) draws? Walking back their power spec to the chip alone is just lame. Consumers aren't buying the chip, and the chip can't do a damn thing without memory and voltage regulators.

I smell a desperate marketing department trying to keep the stock price from collapsing until their next stock-option exercise window opens. There's always a blackout period surrounding big events like this product launch ...
You summed things up very nicely. Frankly, I have no idea how anything works in legal terms; all I know is that you raise a very good point regarding electrical safety for low-end mobos. Not to mention it is outright deceitful to release a 6-pin-only card claiming 110W and then have it suck 170W. It should have been an 8-pin, or at the very least the card should draw the above-spec power from the 6-pin, not from the mobo.
 
Well, damn... I thought it would finally make sense, from both a price and power perspective, for me to run CrossFire. It looks like once pricing stabilizes, a 1070 is what makes sense. Oh well.
 
Hats off to Kyle, it looks like you were right about the heat and power usage.

Technically, Kyle's sources were correct. Kyle reported accurately what they told him, and he wrote the editorial because he had reason to trust them, since they have given him accurate information in the past. What you and others did is question his judgement of his sources and accuse him of yellow journalism and AMD hate/bias. As a matter of fact, some jerks around here even compared this site to WTFtech. Kyle calls it like he sees it and doesn't make shit up. He also won't report stuff from sources unless he implicitly trusts them. Enjoy your crow.
 
Regarding AMD's claim that they passed the PCI-SIG compliance test performed by the PCI-SIG:

1. AMD selected the card to be tested. There's no guarantee the tested card was identical to reviewed or retail cards, unless the PCI-SIG requires re-testing every time a BIOS is updated, which I doubt.

2. Volkswagen diesels were programmed to detect when they were being emissions tested and to alter the operation of the engine so that they passed, and no one caught them at it for years. This has happened in the PC space too, with graphics drivers detecting that a benchmark was being run and switching to an optimized set of code for just that benchmark.
I suggest people just wait to see what [H]ardOCP testing discovers. The delay won't kill you.
 
Regarding AMD's claim that they passed the PCI-SIG compliance test performed by the PCI-SIG:

1. AMD selected the card to be tested. There's no guarantee the tested card was identical to reviewed or retail cards, unless the PCI-SIG requires re-testing every time a BIOS is updated, which I doubt.

2. Volkswagen diesels were programmed to detect when they were being emissions tested and to alter the operation of the engine so that they passed, and no one caught them at it for years. This has happened in the PC space too, with graphics drivers detecting that a benchmark was being run and switching to an optimized set of code for just that benchmark.
I suggest people just wait to see what [H]ardOCP testing discovers. The delay won't kill you.

Comparing AMD to VW in this case is a VERY good analogy (though unproven at this time).
 
Technically, Kyle's sources were correct. Kyle reported accurately what they told him, and he wrote the editorial because he had reason to trust them, since they have given him accurate information in the past. What you and others did is question his judgement of his sources and accuse him of yellow journalism and AMD hate/bias. As a matter of fact, some jerks around here even compared this site to WTFtech. Kyle calls it like he sees it and doesn't make shit up. He also won't report stuff from sources unless he implicitly trusts them. Enjoy your crow.

Go over the thread and see what I mainly talked about. The main thing that I criticized was the claim that Polaris 10 was supposed to be a GP104 competitor.
I don't think that Polaris 10 was ever supposed to be a high-end card (read: Radeon 490).

The 480 does not have high-end characteristics, just given its size, and I stand by that even more now that we know it only has 32 ROPs.
 
Kyle did say AMD needed more time to possibly clock it higher, and that it was going to be hot.
Drawing more power than advertised suggests they're artificially pushing more power to the card to hold the increased clocks, which also makes the card hot, with little OC room.

The journey in futility continues.
 
What if I told you the RX 490 that gets released in October is a 1.6GHz RX 480, and it will come with a 300W TDP?
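For what it's worth, running that speculation through the rough cube-law scaling discussed earlier in the thread lands in the same ballpark. A sketch only: the 1266MHz reference boost clock of the RX 480 and the ~160W board draw reviewers measured are my assumptions, not anything AMD has said about an RX 490.

```python
# Sanity check of the "1.6 GHz RX 480 at 300 W" speculation using the
# idealized cube-law scaling from earlier in the thread (P ~ f^3 when
# voltage rises roughly linearly with clock). The 1266 MHz boost clock
# and 160 W board draw are assumptions, not AMD figures for an RX 490.

BASE_CLOCK_MHZ = 1266.0    # RX 480 reference boost clock (assumed)
BASE_POWER_W = 160.0       # board power measured by reviewers (assumed)
TARGET_CLOCK_MHZ = 1600.0  # the hypothetical 1.6 GHz part

power = BASE_POWER_W * (TARGET_CLOCK_MHZ / BASE_CLOCK_MHZ) ** 3
print(f"predicted draw: {power:.0f} W")  # ~323 W, so 300 W isn't far-fetched
```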
 
I would be a bit wary of using two in a single PC, as even one seems to draw the limit of the mainboard PCIe slot spec.
[Image: bar chart of average and maximum gaming power draw]



The max figures are not too much of a concern, as they are closer to instantaneous bursts lasting a few milliseconds, but the averages are a concern if going with two cards.
The consideration is the mainboard PCIe slot power, which will be shared between both cards.

No idea how much leeway motherboards have had with the PCIe slot spec since 2004.
Separately, the more recent PCIe Molex PEG 6-pin connector is rated at 8A per contact, in theory supporting up to 192W (a lot will depend upon wire gauge/PSU/card circuit rating), but I would not want to push near that myself; what Tom's Hardware measured is high enough IMO. (See the sketch below.)
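As a rough check on where that 192W figure comes from (a hedged sketch: the 8A-per-contact rating is the one quoted above, and I'm assuming two 12V hot pins in the 6-pin PEG connector):

```python
# Where the theoretical "up to 192 W" ceiling for a 6-pin PEG
# connector comes from: contact current rating x 12 V rail x hot pins.
# Assumes two 12 V pins at the quoted 8 A Molex rating; real safe
# limits depend on wire gauge, PSU, and card circuitry. For contrast,
# the PCIe spec itself only allots 75 W to the 6-pin connector.

VOLTAGE = 12.0      # V, PEG supply rail
AMPS_PER_PIN = 8.0  # A, quoted Molex contact rating
HOT_PINS = 2        # 12 V pins carrying current (assumed)

print(f"theoretical ceiling: {VOLTAGE * AMPS_PER_PIN * HOT_PINS:.0f} W")  # 192 W
```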

I appreciate that further information and clarification are required to see whether this is general card behaviour or a fault with some cards.
Cheers

Tom's Hardware measures with a maximum resolution of 500ns.

500ns.

Overkill doesn't cut it lol
 
Sooooo

AMD Robert is claiming my statement regarding the 150W TDP, the newly claimed 110W, and the measured 170W is wrong. He says I am disseminating false information.

Have at him, folks.


You know, you really have an agenda. At no point did he say 170W was wrong. He was saying how the power is distributed. You are right, but the way you are handling this is desperation mode. The card is using more; it is out of spec.

What's the harm in waiting a few days and seeing how things play out? I am damn sure the review sites will do their work; relax a little. I think the guy made a mistake responding to you; it's like feeding the fire, and he is stupid enough to do it, I think. If I were him I would just shut up and let you go at it lol.

You are right, you won; stop going crazy over it. Seriously man, I don't wanna pay 600 dollars for an 1170 next year if there is no AMD, lol. I am going to have to charge you the extra money. Give it a few days and see what they say. You are gonna lose your mind; do some homework.

I think I need to solve that problem for you, brother. HAHAHA
 
You know, you really have an agenda. At no point did he say 170W was wrong. He was saying how the power is distributed. You are right, but the way you are handling this is desperation mode. The card is using more; it is out of spec.

What's the harm in waiting a few days and seeing how things play out? I am damn sure the review sites will do their work; relax a little. I think the guy made a mistake responding to you; it's like feeding the fire, and he is stupid enough to do it, I think. If I were him I would just shut up and let you go at it lol.

You are right, you won; stop going crazy over it. Seriously man, I don't wanna pay 600 dollars for an 1170 next year if there is no AMD, lol. I am going to have to charge you the extra money. Fuck, give it a few days and see what they say. You are gonna lose your mind; do some homework. HAHAHA

I ain't going crazy! You don't know crazy! Look at my inbox and you'll see crazy lol
 
I ain't going crazy! You don't know crazy! Look at my inbox and you'll see crazy lol

HAHAHA, where? Here? Holy shit. That's how you should know: every time I criticize, it's because I am probably older than you (in a bad fuckin' way, in that it sucks being 30+) and look at life like an old guy who is like, whatever, I am not buying this shit, so who cares lol. I don't go that far lol.
 
HAHAHA, where? Here? Holy shit. That's how you should know: every time I criticize, it's because I am probably older than you (in a bad fuckin' way, in that it sucks being 30+) and look at life like an old guy who is like, whatever, I am not buying this shit, so who cares lol. I don't go that far lol.

Man, updating a reddit thread takes no more effort than making a thread here; I've been studying and playing CS games this whole time :D
 
Seriously man, I don't wanna pay 600 dollars for an 1170 next year if there is no AMD, lol.

I can't imagine that anyone wants to see AMD fold. That's not it at all. But this just seems beyond stupid. The 1070 is rated at 150W and has an 8-pin power connector. The 480 is rated at 150W but has a 6-pin power connector, yet it actually uses more power than the 1070? Why did they not just put an 8-pin power connector on the 480? It just doesn't make any sense.
 
I can't imagine that anyone wants to see AMD fold. That's not it at all. But this just seems beyond stupid. The 1070 is rated at 150W and has an 8-pin power connector. The 480 is rated at 150W but has a 6-pin power connector, yet it actually uses more power than the 1070? Why did they not just put an 8-pin power connector on the 480? It just doesn't make any sense.

I honestly think it's because the card would look less efficient.
 
Review just opened up for me.

So disappointed in this power consumption from a 14nm part :(
 
HAHAHA where? here? Holy shit. Thats how you should know everytime I criticize its because I am probably older than you
If age matters, I started programming and building digital electronic devices back in 1972. Odds are I was soldering and coding before you were born. Ever hear of the Altair 8800? I debugged one of those for a local computer shop when they were new. I remember buying new-fangled 16Kbit DRAMs (200ns access time!) in a tube of DIPs as a teenager.

So what privilege does age give me here, again? :)
 
I can't imagine that anyone wants to see AMD fold. That's not it at all. But this just seems beyond stupid. The 1070 is rated at 150W and has an 8-pin power connector. The 480 is rated at 150W but has a 6-pin power connector, yet it actually uses more power than the 1070? Why did they not just put an 8-pin power connector on the 480? It just doesn't make any sense.

Now that was a fucking stupid move, I think. I never understood why they did that. They could have easily avoided this situation. IDK, I think they wanted to frickin' stop people from overclocking these cards so they can promote AIB cards. Kinda silly of them; just call it a day, set a hard limit in drivers for reference cards, and give people fair warning that overclocking may go over spec. I think they didn't do the work required in the drivers for this card. Seems like there is no control on power on it. They should have set a limit of 150W; I mean, after all, they are boost clocks. Let it hover up and down; Nvidia does this just fine.
 
Review just opened up for me.
So disappointed in this power consumption from a 14nm part :(

I'm really having a hard time understanding how AMD creates a part on a smaller process that's getting killed in perf/watt by 16nm Pascal.
 
Now that was a fucking stupid move, I think. I never understood why they did that. They could have easily avoided this situation. IDK, I think they wanted to frickin' stop people from overclocking these cards so they can promote AIB cards. Kinda silly of them; just call it a day, set a hard limit in drivers for reference cards, and give people fair warning that overclocking may go over spec. I think they didn't do the work required in the drivers for this card. Seems like there is no control on power on it. They should have bumped up the speed a little and set a limit of 150W; I mean, after all, they are boost clocks. Let it hover up and down; Nvidia does this just fine.

Because they are introducing their brand-new GPU, on a brand-new process, in the midrange. They wanted to promote that the card uses less power because of all of these things.

AMD is trying to hide their shortcomings with GCN.
 
I'm really having a hard time understanding how AMD creates a part on a smaller process that's getting killed in perf/watt by 16nm Pascal.

I'm still trying to wrap my head around the fact that they made this dramatic process shrink and only cut actual power by 45%. Good card for the price, but c'mon ...
 
Because they are introducing their brand-new GPU, on a brand-new process, in the midrange. They wanted to promote that the card uses less power because of all of these things.

AMD is trying to hide their shortcomings with GCN.

When I first saw 170W, I swear I thought it was total system power using a stock 6600K or something.
 