AMD Radeon HD 6950 to HD 6970 Mod

So here's the question: has anybody removed the heatsinks on a 6970 and a 6950 to compare the RAM?

It would be interesting to know what the manufacturers are using for each card. It would also settle a lot of bickering.
 
So here's the question: has anybody removed the heatsinks on a 6970 and a 6950 to compare the RAM?

It would be interesting to know what the manufacturers are using for each card. It would also settle a lot of bickering.

The good people at TechPowerUp did, the same folks who discovered the mod and who make GPU-Z and RBE.

The 6950 cards come with 5GHz Hynix memory, and the 6970 cards come with 6GHz Hynix memory.
 
Thanks for the heads-up, wolfman3k5. I never actually read their review, just the unlock article.
 
Thanks for the heads-up, wolfman3k5. I never actually read their review, just the unlock article.

You're welcome. I'm somewhat sorry that I got these cards, though. I should have gotten a pair of GTX 570's or a single GTX 580. I wonder if anyone with a GTX 580 can give me some input on performance.

Thanks!
 
Well, I thought I was stable, or maybe it was a freak accident, but I was playing BC2 for a few hours today and all of a sudden the left quarter of the screen was showing on the right side of the screen, so it was split, I guess you could say. I alt-tabbed and lowered the memory clock down to 1410, and everything has been running fine since at 900/1410. Not sure if that was the reason, but it was a little sketchy at first, haha.
 
The memory is what gives me pause. How low can the memory be turned down with the 6970 BIOS?
 
So here's the question: has anybody removed the heatsinks on a 6970 and a 6950 to compare the RAM?

It would be interesting to know what the manufacturers are using for each card. It would also settle a lot of bickering.
I've seen a couple of reviews showing the memory on the cards.

Here is a page with the 6950 memory chips displayed (the last photo on the page):
http://www.hardwaresecrets.com/article/AMD-Radeon-HD-6950-and-6970-Video-Card-Review/1151/3

Here is a page with the 6970 memory chips displayed:
http://www.hardwaresecrets.com/article/AMD-Radeon-HD-6950-and-6970-Video-Card-Review/1151/5
 
I originally flashed my 6950 with the 6970 BIOS; it was working great at 925/1400, up from the 800/1250 6950 clocks.

Just for shits and grins I flashed the 6950 with the unlocked-shaders BIOS that Wizzard made, which has 1.1 vcore and 1.5 vdimm instead of 1.175 and 1.6 vdimm.
My card still clocks to 1400 on the memory, no problem... The core tops out at 900 now... I ran 45 minutes of BC2 and 30 minutes of Black Ops, looped Heaven while I took a shower, and ran 3DMark 11 twice without a single artifact or glitch.
The card runs cooler, though; with a fan profile in Afterburner I haven't hit 60C in anything yet, and the fan hasn't gone over 60%...

I personally wouldn't worry about a 0.1V overvolt on the memory of these cards...

It seems that BC2 is especially picky about clocks, though. The other thing is that with the bandwidth of GDDR5 memory, even a 6950 at 1250 is effectively 5000MHz, so overclocking the memory is essentially pointless anyway.
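To put rough numbers on that point (a quick illustrative sketch, not from the original post; the 256-bit bus width is the published spec for the 69xx cards):

Code:
# GDDR5 transfers data four times per memory clock, so a 1250 MHz clock is an
# effective 5000 MT/s ("5 GHz"). Peak bandwidth also depends on the bus width,
# which is 256-bit on the 6950/6970.
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits=256):
    """Peak bandwidth in GB/s: effective rate (MT/s) x bus width (bytes)."""
    effective_mts = mem_clock_mhz * 4
    return effective_mts * (bus_width_bits / 8) / 1000

print(gddr5_bandwidth_gbs(1250))  # stock 6950: 5000 MT/s -> 160.0 GB/s
print(gddr5_bandwidth_gbs(1375))  # stock 6970: 5500 MT/s -> 176.0 GB/s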
 
I originally flashed my 6950 with the 6970 BIOS; it was working great at 925/1400, up from the 800/1250 6950 clocks.

Just for shits and grins I flashed the 6950 with the unlocked-shaders BIOS that Wizzard made, which has 1.1 vcore and 1.5 vdimm instead of 1.175 and 1.6 vdimm.
My card still clocks to 1400 on the memory, no problem... The core tops out at 900 now... I ran 45 minutes of BC2 and 30 minutes of Black Ops, looped Heaven while I took a shower, and ran 3DMark 11 twice without a single artifact or glitch.
The card runs cooler, though; with a fan profile in Afterburner I haven't hit 60C in anything yet, and the fan hasn't gone over 60%...

I personally wouldn't worry about a 0.1V overvolt on the memory of these cards...

It seems that BC2 is especially picky about clocks, though. The other thing is that with the bandwidth of GDDR5 memory, even a 6950 at 1250 is effectively 5000MHz, so overclocking the memory is essentially pointless anyway.

Where can I get a copy of Wizzard's BIOS?
 
Well, I thought I was stable, or maybe it was a freak accident, but I was playing BC2 for a few hours today and all of a sudden the left quarter of the screen was showing on the right side of the screen, so it was split, I guess you could say. I alt-tabbed and lowered the memory clock down to 1410, and everything has been running fine since at 900/1410. Not sure if that was the reason, but it was a little sketchy at first, haha.

Happened to me too, and I think it was due to the core, not the memory.
 
^^this^^

Just extract Wizzard's zip to a regular folder and put your stock BIOS in...

Then flash away.

http://forums.techpowerup.com/showpost.php?p=2137760&postcount=381
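For anyone who prefers to script it, the usual ATIFlash back-up-then-program sequence looks roughly like the sketch below. This is a hedged illustration only: the adapter index (0) and the file names are assumptions, and the TechPowerUp post linked above remains the authoritative procedure.

Code:
# Minimal sketch of the usual ATIFlash sequence (run from an elevated prompt).
# Adapter index 0 and the .rom file names are assumptions; adjust them to match
# what "atiflash -i" reports and what the unlock package actually provides.
import subprocess

subprocess.run(["atiflash", "-i"], check=True)                           # list adapters
subprocess.run(["atiflash", "-s", "0", "stock_6950.rom"], check=True)    # back up the stock BIOS first
subprocess.run(["atiflash", "-p", "0", "unlocked_6970.rom"], check=True) # program the modified BIOS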

I did the core unlock on my pair of ASUS HD6950s. It went without a hiccup, but the performance difference is nothing to write home about.

On stock 6950 speeds before the unlock I was getting 30600 points in 3DMark Vantage; now I'm getting 31630. So the unlock added a measly 1000 points, but in actuality I haven't noticed anything else. To settle this once and for all, the performance difference comes from the clock speeds and not from the shader count. I believe that AMD used the exact same GPU on both cards (no binning needed, as they have perfected the manufacturing) and just disabled one cluster of shaders (128). I believe that they did this for marketing reasons, because it would have been odd to sell the exact same GPU with the same shader count on two cards that are so similar. The real differences are the 8-pin connector, which enables the 6970 to overclock higher, and the higher-speed RAM on the 6970. I hope that this settles it once and for all.

To be honest, I expected a little more. I got 29500 out of my slightly overclocked pair of GTX 470's. I got 25000 points out of a single, NOT overclocked EVGA GTX 580 in 3DMark Vantage. I think that I will return the cards and either get a pair of GTX 570's and overclock them, or a single GTX 580 (I sold the EVGA). These 6950's might be nice toys, but they are big, boxy, power-hungry (yes, once you disable PowerTune) monsters that are a tad faster than the good old GTX 470.
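For context on the numbers above (a quick check, not part of the original post; the shader counts are the commonly published specs, 1408 for the 6950 and 1536 for the 6970):

Code:
# The unlock enables roughly 9% more shaders, while the quoted 3DMark Vantage
# delta is closer to 3-4%, which is the comparison being argued here.
shader_gain = (1536 - 1408) / 1408      # ~0.091 -> ~9.1% more shaders
score_gain = (31630 - 30600) / 30600    # ~0.034 -> ~3.4% higher Vantage score
print(f"{shader_gain:.1%} more shaders vs {score_gain:.1%} higher score")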
 
pair of GTX 570's
Why?
As I've said before, the 570 right now is obsolete, especially if we're talking about 6950 CF.

PowerTune only kicks in under extreme tests; one review said the only way to trigger it was in Furmark. Under Crysis load, Anandtech measured about 70W less power on the 6950 than on the 570.

6950 CF will beat 570 SLI very often; I would say they trade blows, if the 6950's don't win outright... for about $100 less.
Throw in the unlock and there's no competition.

The only reason to buy a 570 right now is if you really must have Nvidia for some reason. If you're disappointed with 6950 CF, and then further disappointed with free 6970 CrossFire, I'm sorry to tell you that swapping out for 570 SLI will be even more disappointing for a lot more cash.
 
Why?
As I've said before, the 570 right now is obsolete, especially if we're talking about 6950 CF.

PowerTune only kicks in under extreme tests; one review said the only way to trigger it was in Furmark. Under Crysis load, Anandtech measured about 70W less power on the 6950 than on the 570.

6950 CF will beat 570 SLI very often; I would say they trade blows, if the 6950's don't win outright... for about $100 less.
Throw in the unlock and there's no competition.

The only reason to buy a 570 right now is if you really must have Nvidia for some reason. If you're disappointed with 6950 CF, and then further disappointed with free 6970 CrossFire, I'm sorry to tell you that swapping out for 570 SLI will be even more disappointing for a lot more cash.

Thanks for your insight. Yeah, NVIDIA cards are really old school; I would say that NVIDIA has fallen behind the times. I've just tested my unlocked cards and I get zero hiccups. For those who didn't follow my posts, I unlocked only the shaders. I left the core voltage and clock speeds alone; they are at 6950 stock levels.

Thank you again and Happy New Year!
 
I did the core unlock on my pair of ASUS HD6950s. It went without a hiccup, but the performance difference is nothing to write home about.

On stock 6950 speeds before the unlock I was getting 30600 points in 3DMark Vantage; now I'm getting 31630. So the unlock added a measly 1000 points, but in actuality I haven't noticed anything else. To settle this once and for all, the performance difference comes from the clock speeds and not from the shader count. I believe that AMD used the exact same GPU on both cards (no binning needed, as they have perfected the manufacturing) and just disabled one cluster of shaders (128). I believe that they did this for marketing reasons, because it would have been odd to sell the exact same GPU with the same shader count on two cards that are so similar. The real differences are the 8-pin connector, which enables the 6970 to overclock higher, and the higher-speed RAM on the 6970. I hope that this settles it once and for all.

To be honest, I expected a little more. I got 29500 out of my slightly overclocked pair of GTX 470's. I got 25000 points out of a single, NOT overclocked EVGA GTX 580 in 3DMark Vantage. I think that I will return the cards and either get a pair of GTX 570's and overclock them, or a single GTX 580 (I sold the EVGA). These 6950's might be nice toys, but they are big, boxy, power-hungry (yes, once you disable PowerTune) monsters that are a tad faster than the good old GTX 470.


Go play some games, man; benches mean nothing. I cannot believe how much smoother Black Ops is on this card, and go look at the IQ with 8xAA on one card in BC2 and the FPS you get... all on a shitty driver.

My last card was a GTX 470 and I am happy as hell with it.

Seriously, I do not understand your posts... you're not happy, so spend $700 on GTX 570 SLI and lose over $100, for what, the same performance?

I just played Black Ops for 30 minutes at 880/1375 with my unlocked 6950 and hit 55C with the fan at 50%...
 
I did the core unlock on my pair of ASUS HD6950s. It went without a hiccup, but the performance difference is nothing to write home about.

On stock 6950 speeds before the unlock I was getting 30600 points in 3DMark Vantage; now I'm getting 31630. So the unlock added a measly 1000 points, but in actuality I haven't noticed anything else. To settle this once and for all, the performance difference comes from the clock speeds and not from the shader count. I believe that AMD used the exact same GPU on both cards (no binning needed, as they have perfected the manufacturing) and just disabled one cluster of shaders (128). I believe that they did this for marketing reasons, because it would have been odd to sell the exact same GPU with the same shader count on two cards that are so similar. The real differences are the 8-pin connector, which enables the 6970 to overclock higher, and the higher-speed RAM on the 6970. I hope that this settles it once and for all.

To be honest, I expected a little more. I got 29500 out of my slightly overclocked pair of GTX 470's. I got 25000 points out of a single, NOT overclocked EVGA GTX 580 in 3DMark Vantage. I think that I will return the cards and either get a pair of GTX 570's and overclock them, or a single GTX 580 (I sold the EVGA). These 6950's might be nice toys, but they are big, boxy, power-hungry (yes, once you disable PowerTune) monsters that are a tad faster than the good old GTX 470.

6950 is power hungry? Am I missing something?
 
On stock 6950 speeds before the unlock I was getting 30600 points in 3DMark Vantage; now I'm getting 31630. So the unlock added a measly 1000 points, but in actuality I haven't noticed anything else. To settle this once and for all, the performance difference comes from the clock speeds and not from the shader count.

Uhm... What?

Tell that to all the people who ran benchmarks on their unlocked 6950's at the same clocks, like I did. There's a certain synthetic benchmark delta, and a few FPS in games too. It's not a major difference, you're right, but it is there.
 
6950 is power hungry? Am I missing something?

I love how people's emotions run high over this crap. Yes, once you turn PowerTune up to +20% you basically disable PowerTune, and then the card gets as much power as it wants. Read the TechPowerUp guide, read the Anandtech article: http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/9

For the real HD6970 at stock clocks with PowerTune disabled, the max power draw under Furmark was 422 watts. Under Crysis Warhead it was 355 watts. These values are astronomical.

For the HD6950 I don't have published measurements yet, but from my own measurements, when I disable PowerTune my entire system is in the 600W range, which is the same as when I had a pair of GTX 470's. I guess that because it's not NVIDIA, people will not bitch about power usage and will just let it go. It is an interesting phenomenon how people can't be objective about these two companies. People always, always bitch about AMD drivers. But it ends there. With NVIDIA, people bitch about everything. In reality, both companies fuck up on a regular basis. Remember the NVIDIA driver fuck-up from February-March last year? The fan would turn off, the GPU would burn up? Did I even mention AMD's recent fuck-up with the cut power connectors? They had to cut off part of one of the power connectors on each card because the fan wouldn't fit? I thought it was all bullshit when I read it online, until I took a small LED flashlight and looked at my own cards. So really, are you going to be angry at me for pointing out the obvious?

But here is the kicker! I did the shader count mod last night (last year ;)) and now every time I reboot, my fan control switches itself to manual on each card. The first card wants to be at 56%, the second card wants to be at 40%. OK, I agree, maybe that's not strange enough. It gets better. The second card always shows 99% GPU usage, but its temperature doesn't go up. I have to disable CrossFireX, then re-enable it, and then things go back to normal... for a while. A couple of minutes later, the second card is at it again. I guess those shaders were disabled for a reason after all...
 
What is the point behind PowerTune anyway? I mean, the point of not allowing users to simply disable it. I bought my PSU for a reason. It seems like it will be an annoyance once you start raising voltages, as you won't be able to raise clocks as much with higher voltages, due to the increased power consumption causing PowerTune to throttle the card.
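As a rough illustration of why voltage matters so much for that throttling behaviour: dynamic power scales roughly with V^2 x f, so a voltage bump eats PowerTune headroom much faster than a clock bump alone. The numbers below are purely hypothetical, not measurements for these cards.

Code:
# Toy model of dynamic power, P ~ V^2 * f, relative to an assumed baseline.
# Voltages and clocks here are illustrative placeholders, not validated settings.
def relative_power(v, f, v0=1.100, f0=800.0):
    """Power relative to the baseline voltage/clock, using P ~ V^2 * f."""
    return (v / v0) ** 2 * (f / f0)

print(relative_power(1.100, 900))  # clock-only bump: ~1.13x baseline power
print(relative_power(1.175, 900))  # clock + voltage bump: ~1.28x baseline power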
 
Wolfman, your issues could be driver-related. Testing each unlocked card individually could shed some light on the issue you describe.
 
I love how people's emotions run high over this crap. Yes, once you turn PowerTune up to +20% you basically disable PowerTune, and then the card gets as much power as it wants. Read the TechPowerUp guide, read the Anandtech article: http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/9

For the real HD6970 at stock clocks with PowerTune disabled, the max power draw under Furmark was 422 watts. Under Crysis Warhead it was 355 watts. These values are astronomical.

For the HD6950 I don't have published measurements yet, but from my own measurements, when I disable PowerTune my entire system is in the 600W range, which is the same as when I had a pair of GTX 470's. I guess that because it's not NVIDIA, people will not bitch about power usage and will just let it go. It is an interesting phenomenon how people can't be objective about these two companies. People always, always bitch about AMD drivers. But it ends there. With NVIDIA, people bitch about everything. In reality, both companies fuck up on a regular basis. Remember the NVIDIA driver fuck-up from February-March last year? The fan would turn off, the GPU would burn up? Did I even mention AMD's recent fuck-up with the cut power connectors? They had to cut off part of one of the power connectors on each card because the fan wouldn't fit? I thought it was all bullshit when I read it online, until I took a small LED flashlight and looked at my own cards. So really, are you going to be angry at me for pointing out the obvious?

But here is the kicker! I did the shader count mod last night (last year ;)) and now every time I reboot, my fan control switches itself to manual on each card. The first card wants to be at 56%, the second card wants to be at 40%. OK, I agree, maybe that's not strange enough. It gets better. The second card always shows 99% GPU usage, but its temperature doesn't go up. I have to disable CrossFireX, then re-enable it, and then things go back to normal... for a while. A couple of minutes later, the second card is at it again. I guess those shaders were disabled for a reason after all...

As you can see, there is only about a 2-10W increase across the board in games; only Furmark shows a dramatic increase...

Go ahead and try to disable the GTX 570/580 power limit, then tell me which one is power hungry... :eek:
 
I love how people's emotions run high over this crap. Yes, once you turn PowerTune up to +20% you basically disable PowerTune, and then the card gets as much power as it wants. Read the TechPowerUp guide, read the Anandtech article: http://www.anandtech.com/show/4061/amds-radeon-hd-6970-radeon-hd-6950/9

For the real HD6970 at stock clocks with PowerTune disabled, the max power draw under Furmark was 422 watts. Under Crysis Warhead it was 355 watts. These values are astronomical.

For the HD6950 I don't have published measurements yet, but from my own measurements, when I disable PowerTune my entire system is in the 600W range, which is the same as when I had a pair of GTX 470's. I guess that because it's not NVIDIA, people will not bitch about power usage and will just let it go. It is an interesting phenomenon how people can't be objective about these two companies. People always, always bitch about AMD drivers. But it ends there. With NVIDIA, people bitch about everything. In reality, both companies fuck up on a regular basis. Remember the NVIDIA driver fuck-up from February-March last year? The fan would turn off, the GPU would burn up? Did I even mention AMD's recent fuck-up with the cut power connectors? They had to cut off part of one of the power connectors on each card because the fan wouldn't fit? I thought it was all bullshit when I read it online, until I took a small LED flashlight and looked at my own cards. So really, are you going to be angry at me for pointing out the obvious?

But here is the kicker! I did the shader count mod last night (last year ;)) and now every time I reboot, my fan control switches itself to manual on each card. The first card wants to be at 56%, the second card wants to be at 40%. OK, I agree, maybe that's not strange enough. It gets better. The second card always shows 99% GPU usage, but its temperature doesn't go up. I have to disable CrossFireX, then re-enable it, and then things go back to normal... for a while. A couple of minutes later, the second card is at it again. I guess those shaders were disabled for a reason after all...

You give out to Shansoft about being emotional about this crap? You do realise he has a GTX 480? That he has no bias towards AMD in this, so why would he get emotional? He pointed out that the 6950 isn't power hungry.

I think that I will return the cards and either get a pair of GTX 570's and overclock them, or a single GTX 580 (I sold the EVGA). These 6950's might be nice toys, but they are big, boxy, power-hungry (yes, once you disable PowerTune) monsters that are a tad faster than the good old GTX 470.

This is the phrase he was referring to, and it made me laugh as well. Power hungry? You do know that the 6950 uses less power than the 5870 and the GTX 570, and that is with PowerTune off? The 6970 uses less power than the GTX 570.

And to say that the 6950 is just a tad faster than the GTX 470... I think I know who the person with a bias on this thread is.

And, yes, everybody knows about the power connector issue; where have you been? Everybody also knows about the power consumption. You gave out earlier about how stupid people were for not doing any research before doing the BIOS upgrade, but it's pretty plain that you did no research at all on the 69xx series before purchase, as you are mentioning things now that everybody knew before Christmas; most people knew all this one or two days after launch.

In fact, all you have done on this thread is make dire predictions about how all these cards will fail after a few months. And since then you have done nothing but point out how bad the 6950's are, even though for both things your arguments are flawed or just plain wrong.
 
You give out to Shansoft about being emotional about this crap? You do realise he has a GTX 480? That he has no bias towards AMD in this, so why would he get emotional? He pointed out that the 6950 isn't power hungry.

This is the phrase he was referring to, and it made me laugh as well. Power hungry? You do know that the 6950 uses less power than the 5870 and the GTX 570, and that is with PowerTune off? The 6970 uses less power than the GTX 570.

And to say that the 6950 is just a tad faster than the GTX 470... I think I know who the person with a bias on this thread is.

And, yes, everybody knows about the power connector issue; where have you been? Everybody also knows about the power consumption. You gave out earlier about how stupid people were for not doing any research before doing the BIOS upgrade, but it's pretty plain that you did no research at all on the 69xx series before purchase, as you are mentioning things now that everybody knew before Christmas; most people knew all this one or two days after launch.

In fact, all you have done on this thread is make dire predictions about how all these cards will fail after a few months. And since then you have done nothing but point out how bad the 6950's are, even though for both things your arguments are flawed or just plain wrong.

Did I mention shitty drivers? No, I forgot... Never mind, the 6900 series drivers are crap... oh wait, there are no official drivers for these cards yet... Look, man, I've expressed my personal dissatisfaction with the cards. I bought them based on benchmarks, and I'm not talking 3DMark numbers; I'm talking real games. I don't get the numbers that Anandtech or [H]ard got, and my system isn't exactly a slouch either. So I was asking people that have a GTX 570 or 580 how they like their cards. That was all. Thank you.
 
Does anyone know how to completely clean out the registry entries for the Catalyst driver? I'm using Driver Cleaner.NET, but it seems that there are still old settings left in there. Thank you.
 
All I see are pointless claims (power connector filing) which have no effect on you, and false claims about power consumption. I don't see how any of that is a "personal dissatisfaction".
Swapping to a 570 will only ease your mind that you own Nvidia, which, for some reason, seems to be a serious concern for you. You will lose performance.
 
Oh, fuck it... I bought the ASUS 6950 for the all-aluminum heatsink and the utilities; I'll flash it and OC it.

The enthusiast in me won out over the anxious part.
 
Oh, fuck it... I bought the ASUS 6950 for the all-aluminum heatsink and the utilities; I'll flash it and OC it.

The enthusiast in me won out over the anxious part.

The aluminum thing is just the sticker on top that is aluminum...

If you think that sticker is going to do anything, then it's a plus, I guess... :p
 
Did I mention shitty drivers? No, I forgot... Never mind, the 6900 series drivers are crap... oh wait, there are no official drivers for these cards yet... Look, man, I've expressed my personal dissatisfaction with the cards. I bought them based on benchmarks, and I'm not talking 3DMark numbers; I'm talking real games. I don't get the numbers that Anandtech or [H]ard got, and my system isn't exactly a slouch either. So I was asking people that have a GTX 570 or 580 how they like their cards. That was all. Thank you.

Shitty drivers? Lol, wrong again, the drivers are fine. They aren't fully optimized yet, but that will come. Both companies seem to do this with new-generation cards: Nvidia did it with Fermi, with only a beta driver on release of the 4xx series, and now AMD is doing it with the 69xx series.

And why do you keep posting about Nvidia cards? This is a thread about 6950 overclocking, yet you keep asking about the 570 and 580, saying how good your 470's were, and then asking about a 470 with a special PCB. Isn't there an Nvidia section of the forum for these questions?

Can you please keep on topic.
Thank you.
 
wolfman3k5 said:
OK, I agree, maybe that's not strange enough. It gets better. The second card always shows 99% GPU usage, but its temperature doesn't go up. I have to disable CrossFireX, then re-enable it, and then things go back to normal... for a while. A couple of minutes later, the second card is at it again. I guess those shaders were disabled for a reason after all...

This is a bug in the current versions of GPU-Z and MSI Afterburner. IF YOU REALLY BELIEVE that the GPU is at 99% activity while its temperatures are not going up, I have some beachfront property in Kansas to sell you.

FWIW, this happened to me before I flashed to the 6970 as well.
 
Oh, fuck it... I bought the ASUS 6950 for the all-aluminum heatsink and the utilities; I'll flash it and OC it.

The enthusiast in me won out over the anxious part.


Where did you order yours from?

I ordered the ASUS 6950 from Newegg, and my order has been stuck in processing since Thursday morning.

:mad:
 
The aluminum thing is just the sticker on top that is aluminum...

If you think that sticker is going to do anything, then it's a plus, I guess... :p

Please read the product description.

See where under features it says:
Full aluminum cover dissipates surface heat evenly and rapidly.
That is what ASUS uses, unlike all the other manufacturers that use plastic. How much will it help? Who knows, but it doesn't take a genius to interpret the laws of thermal energy conduction.
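For what the conduction argument is worth, here is a back-of-the-envelope Fourier's-law comparison (the cover area, thickness, and temperature difference are arbitrary assumptions; the conductivities are typical handbook values). It only shows that aluminum conducts heat roughly a thousand times better than ABS plastic, not how much the card's temperatures actually change.

Code:
# Steady-state conduction through a thin cover: Q = k * A * dT / d.
# Area, thickness and dT are arbitrary assumptions; k values are typical
# handbook figures (aluminum ~205 W/m*K, ABS plastic ~0.2 W/m*K).
def conducted_watts(k_w_per_mk, area_m2=0.02, thickness_m=0.001, delta_t_c=10.0):
    return k_w_per_mk * area_m2 * delta_t_c / thickness_m

print(conducted_watts(205))  # aluminum cover: ~41,000 W could be conducted
print(conducted_watts(0.2))  # ABS plastic cover: ~40 W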
 
Someone delete this crap, way off topic.


While this thread is about HD6950 unlocking, it is nevertheless about video card unlocking, so no, it is not off topic. Not even close. What it is, is that you're a little cockroach that discovered this forum and now you want to act all tough guy and [H]ard.
 
I did a clean Windows 7 install today (it was overdue), and all my driver-related issues went away. So the mod was a success. For those that didn't follow all my posts, I only did the shader unlock; I believe that it is far safer. Because the HD6970 runs at higher voltages, I recommend that others do the shader unlock only, and then overclock the card to HD6970 speeds with 3rd-party software. It should keep temps down, and it will keep the RAM at 1.5V instead of 1.6V. Best of luck and happy unlocking!

I've posted that GIGABYTE GTX 470 link on Newegg. I was wondering if anyone thinks that it will unlock to a full GTX 480; I'm tempted to buy one and try.
 
Please read the product description.

See where under features it says:
That is what ASUS uses, unlike all the other manufacturers that use plastic. How much will it help? Who knows, but it doesn't take a genius to interpret the laws of thermal energy conduction.

Yes, it covers the surface of the heatsink, which is the sticker on top...

Can't believe me? Go ask anyone with an ASUS card and tell them to check the sticker on top... That is the aluminum surface cover right there...

Not even kidding, tons of people have already complained about this in other forums...
They thought they got scammed, but not really, based on the description. :p
 
I did the mod but found the temps to be unacceptable in my case (in excess of 90C). I went back and installed an ASUS BIOS (so that I can mess with voltages); I figured the temps were not worth the 8% gain, at least until I can get better cooling.
 
Yes, it covers the surface of the heatsink, which is the sticker on top...

Can't believe me? Go ask anyone with an ASUS card and tell them to check the sticker on top... That is the aluminum surface cover right there...

Not even kidding, tons of people have already complained about this in other forums...
They thought they got scammed, but not really, based on the description. :p

I own two ASUS HD6950s. First of all, ASUS is the best of the worst. The reason I bought them was that MSI and GIGABYTE were out of stock, and I wanted a card with a 3-year warranty from a manufacturer that provides at least bare-minimum warranty service. Everyone else that sells AMD cards is worthless. Don't even get me started on Sapphire...

That being said, the ASUS card is the same card as all the other 6900 series cards. All these cards were made in the same factory. It's the same place that makes NVIDIA reference cards, Intel SSDs, most WD HDDs and so on... oh, the same place where all the ASUS motherboards come from. I don't know what the company is called, but I see their logo on a lot of computer parts these days. The only places where I don't see this logo are MSI motherboards, Gigabyte motherboards, or anything manufactured by anyone else.

The logo looks like this:
[image: logo.jpg]


If anyone knows what the company is called, please post a message.

I bring all of this up to let people know that right now all 6900 series cards are identical. They were all made in the same place, with the only difference being the sticker on top and the warranty. If you're buying ASUS because you think it's in any way superior, or because of some claim that the housing of their cards is made from aluminum, it's all marketing talk from the ASUS website. I know, because I've read it as well. I hope this helps.
 