HD 6970 vs GTX 580 benchmarks

Hmmm, just a few more hours for the NDA to expire. I'm still thinking it's possible the store versions contain fully working but performance reduced drivers to stymie store-based leaks, but we'll know for sure tomorrow.

My plan to buy AMD stocks failed, funds took too long to transfer to my broker.
 
I'm saying if he was talking about min fps performance, then your average FPS chart is pretty much meaningless. I don't know if he can prove it.

No, my chart is 100% meaningful; only a person who wants to "see" NV winning the scaling match, and who thinks AMD can't improve scaling performance, will say it's meaningless. Regardless of what he meant, he never once specified min fps, and even if it IS min fps he's talking about, he still has not provided any ACTUAL data or information other than what it "seems" like and what it "is".

So again: provide some sort of DATA, some NUMBERS, from a reputable source, or just can your crusade and move on.

And on that note.

Great, so both cards have a minimum FPS of 0 in Civilization 5, great comparison. Both cards are unplayable :)?

This conversation has derailed enough, I just hope the 69xx scale as well as the 68xx do in CF, because at the listed prices, 2 6950s could be a winner.
 
Ok ok joking aside.

I am sure his point was that the benchmarks tell one story while his personal user experience tells another.

I get it, I really do. I feel the same way about my OC'd i860 versus a 980, or about the value of buying a cheap scratch-and-dent Alienware for $1,209 over the value of building from scratch and spending $1,400.

Some things "seem one way," and lots of empirical data and objective facts can be given to back that up... while the actual user experience for a particular individual goes against those numbers.

We all get it..

However, that in and of itself... is not going to convince anyone of anything.

I mean... good for him, I am glad SLI is working out better for HIM.

Now getting back to reality and the rest of us... the real data states that x-fire is now working just fine and should not enter into the discussion AGAINST AMD in any manner whatsoever (unless you are speaking of the 5xxx series).
 
Hmmm, just a few more hours for the NDA to expire. I'm still thinking it's possible the store versions contain fully working but performance reduced drivers to stymie store-based leaks, but we'll know for sure tomorrow.
Hey grandpa, midnight past your bedtime? ;)

He still has not provided any ACTUAL data or information other than what it "seems" like, and what it "is"
Let's examine two graphs I quickly made in Paint.
http://imgur.com/WNUGw.png

Both graphs are 60fps average.
While it might seem both graphs represent the same fps, the 2nd one is actually worse, since the frame rate is ping-ponging all over the place.

That's why your average fps graph is worthless.
It depends on what that guy is talking about; I'm assuming he means min fps, since the average fps boost clearly favors CrossFire.
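The point those Paint graphs make can be sketched numerically too. The frame rates below are made up purely for illustration, not from any real benchmark run:

```python
# Two hypothetical fps traces with identical averages: a steady one and a
# "ping-ponging" one. The average hides how different they feel to play.
steady   = [58, 60, 62, 59, 61, 60, 60, 60]
pingpong = [30, 90, 25, 95, 30, 90, 25, 95]

def avg(fps):
    return sum(fps) / len(fps)

print(avg(steady), avg(pingpong))  # 60.0 60.0 -- the averages match
print(min(steady), min(pingpong))  # 58 25 -- the experience does not
```

Same 60fps average, but only one of them would feel smooth, which is the whole point about an average-only chart.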
 
Ok, but wasn't it decided that some of that "dippage" was attributable to the 68xx series having less VRAM than, say, a 580?

We saw that very clearly in the review of the 5970 versus a 580 in xfire/sli modes, which I think was done on [H] (although I am not sure at the moment).

The 6970 has 2GB of VRAM.

Problem with excess "dippage" in comparison to a 580 solved?
 
No, my chart is 100% meaningful; only a person who wants to "see" NV winning the scaling match, and who thinks AMD can't improve scaling performance, will say it's meaningless. Regardless of what he meant, he never once specified min fps, and even if it IS min fps he's talking about, he still has not provided any ACTUAL data or information other than what it "seems" like and what it "is".

So again: provide some sort of DATA, some NUMBERS, from a reputable source, or just can your crusade and move on.

And on that note.

Great, so both cards have a minimum FPS of 0 in Civilization 5, great comparison. Both cards are unplayable :)?

This conversation has derailed enough, I just hope the 69xx scale as well as the 68xx do in CF, because at the listed prices, 2 6950s could be a winner.

I saw a lot of people talking about multi monitor gaming, which is why I think you are talking right past the point being made. I agree for single monitor, crossfire has caught up with SLI. The difference is in multi monitor, where crossfire eyefinity was a shambles for a long time, with many games not even touching the second GPU. I waited until the 480's dropped, for like 6 months, holding on to my money and hoping with each catalyst that AMD would fix the crossfire eyefinity drivers.

As has been pointed out, that is where SLI really shines right now; otherwise AMD has better price/performance for single-monitor gaming. And if I didn't run 3 24" monitors, I would own a pair of 5870's by now.
 
This doesn't affect most, but there is one other area where NVIDIA trumps AMD right now, with regard to multiple monitors and certain production-type scenarios. I use my PC for all my work and I frequently have multiple RDP sessions open to various systems. When I have these windows open, I get artifacting on one or more displays. I had this with dual displays without Eyefinity being enabled, and now with three of them. It happens far less than it did and is less noticeable than it was with earlier drivers, but when it comes to this sort of thing I've always found ATI / AMD to lag behind NVIDIA.

It's just one more thing that pushed me towards NVIDIA this round.
 
Well, we all know nVidia is better because they use both uppercase AND lowercase letters in their name, while "AMD" is just a typo for the conjunction "AND".

Add to that the fact that the word "SLICK" (which, let's be honest here, is an awesomely cool word) BEGINS with "SLI". There's the whole ball game, right there...

Even if you were to rearrange the letters A, M, and D, to make it sound better it ends up either spelling "MAD", which is just another word for "crazy", or it spells "DAM", which is the very epitome of a bottleneck. So it must be slower...

Need even more proof? nVidia uses more cool letters like "GTX" and "GT" in their model names, which obviously means "fast as hell" because cars like the Pontiac Sunfire use it.

I mean, you gotta fill out forms and stuff to even be able to use GT, right? That's what my older brother's former roommate told me. AMD uses "HD" which comes from like mini-vans or something.
 
Well, we all know nVidia is better because they use both uppercase AND lowercase letters in their name, while "AMD" is just a typo for the conjunction "AND".

Add to that the fact that the word "SLICK" (which, let's be honest here, is an awesomely cool word) BEGINS with "SLI". There's the whole ball game, right there...

Even if you were to rearrange the letters A, M, and D, to make it sound better it ends up either spelling "MAD", which is just another word for "crazy", or it spells "DAM", which is the very epitome of a bottleneck. So it must be slower...

Need even more proof? nVidia uses more cool letters like "GTX" and "GT" in their model names, which obviously means "fast as hell" because cars like the Pontiac Sunfire use it.

I mean, you gotta fill out forms and stuff to even be able to use GT, right? That's what my older brother's former roommate told me. AMD uses "HD" which comes from like mini-vans or something.

You've got me convinced! (y)
 
You know... given that the bickering really is for guys doing multiple monitors and running high resolutions...

It does make me think that by admitting that, EVEN the guys arguing in favor of Nvidia just inadvertently gave the thumbs up to AMD as the best value for the rest of us, the 95% of "normal users" who have just one normal desk-size monitor but expect smooth performance.

So... Nvidia for the 5%-10% of extreme top users... AMD for pretty much everyone else. I am still eyeing a second 5870 for less than $200 myself (as one of the users described above; I meet the criteria he listed perfectly).
 
This doesn't affect most, but there is one other area where NVIDIA trumps AMD right now, with regard to multiple monitors and certain production-type scenarios. I use my PC for all my work and I frequently have multiple RDP sessions open to various systems. When I have these windows open, I get artifacting on one or more displays. I had this with dual displays without Eyefinity being enabled, and now with three of them. It happens far less than it did and is less noticeable than it was with earlier drivers, but when it comes to this sort of thing I've always found ATI / AMD to lag behind NVIDIA.

It's just one more thing that pushed me towards NVIDIA this round.

It's like I said earlier, you simply prefer Nvidia. Nothing devious about that.
 
This doesn't affect most, but there is one other area where NVIDIA trumps AMD right now, with regard to multiple monitors and certain production-type scenarios. I use my PC for all my work and I frequently have multiple RDP sessions open to various systems. When I have these windows open, I get artifacting on one or more displays. I had this with dual displays without Eyefinity being enabled, and now with three of them. It happens far less than it did and is less noticeable than it was with earlier drivers, but when it comes to this sort of thing I've always found ATI / AMD to lag behind NVIDIA.

It's just one more thing that pushed me towards NVIDIA this round.

This is typical of memory-speed fluctuations, and didn't you just recently purchase your second and third monitors? I ran RDP with no issues with a 5970 on each of my 3 monitors as long as my GDDR5 was running at least 1000 MHz. If it downclocked, flickering and artifacts were present.
 
It's like I said earlier, you simply prefer Nvidia. Nothing devious about that.

I don't really have a preference for either. I'll buy whatever meets my needs best. Right now that happens to be NVIDIA. Last time I bought a video card, that was AMD.
 
Nest2001 is pretty dead on. He leaks the stuff right from China, where he probably follows the forums there. He was dead on with the GTX 580 leaks and is probably dead on here. Pricing is going to be critical for the HD 6970/6950.
 
Hey grandpa, midnight past your bedtime? ;)


Let's examine two graphs I quickly made in Paint.
http://imgur.com/WNUGw.png

Both graphs are 60fps average.
While it might seem both graphs represent the same fps, the 2nd one is actually worse, since the frame rate is ping-ponging all over the place.

That's why your average fps graph is worthless.
It depends on what that guy is talking about; I'm assuming he means min fps, since the average fps boost clearly favors CrossFire.

No, see, this is where you don't seem to understand: neither YOU nor HE has provided anything other than YOUR opinions. You have not provided a single shred of proof that 470 SLI provides better minimum fps.

And as it was already mentioned, simply stating minimum fps is even more useless, as the card might drop there ONCE.
E.g.:

The 6870 hits a min of 15fps for only 1% of the test, while the GTX 470 hits a 17fps minimum but for 20% of the test. A simple number will not address your issue either; you need a graph, like [H] does. And still, I've brought something to the table: avg FPS. You and he have brought nothing to the table :p
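That 1%-vs-20% argument can be sketched like this (the traces are invented to match the hypothetical numbers above, not measured from any card):

```python
# Card A dips to 15fps once (1% of samples); Card B's minimum is a
# "better" 17fps, but it sits down there for 20% of the run.
card_a = [15] * 1 + [60] * 99
card_b = [17] * 20 + [60] * 80

def pct_below(fps, threshold):
    """Percentage of samples spent below a playability threshold."""
    return 100.0 * sum(f < threshold for f in fps) / len(fps)

print(min(card_a), min(card_b))                      # 15 17 -- B "wins"
print(pct_below(card_a, 30), pct_below(card_b, 30))  # 1.0 20.0 -- A wins
```

A single min-fps number picks Card B; the time-below-threshold view, which is what a graph shows, picks Card A.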
 
No, see, this is where you don't seem to understand: neither YOU nor HE has provided anything other than YOUR opinions. You have not provided a single shred of proof that 470 SLI provides better minimum fps.
I didn't say SLI provides better min fps.
I said if it does, then that's what he might be talking about. In your averages they would appear the same (the table you posted), but anyone playing games with the actual cards will see a clear difference.

I don't care, or know, who's right or wrong about the fps, I'm just trying to tell you what he might be talking about.
 
Hi folks.
I just found this:
6970.jpg


From this shop: http://tienda.grupoodin.es/componentes/tarjetas-graficas/pci-express/ddr5/

If prices are at all reliable, we can be sure that performance-wise these Cayman products won't be what many expected :mad::mad: We will find out in the next few days when the NDA is over :cool:
 
Can we stop the back and forth now and get back to the topic at hand? A leaked review from Russia is out now, and it shows the 6970 is slower than the 580.
 
It's possible that AMD did a much better initial job with its 5xxx series and thus there isn't as much for them to get out of this refresh.

Contrast that to Nvidia, who did a crappy job with the 4xx series and they were able to make a much steeper improvement with the 5xx refresh.

So it's entirely feasible that the final DX11 products from both parties end up being within 5%-10% of each other, with Nvidia coming out on top from an engineering standpoint.

Of course, the x-fire/sli scores could still prove to be very close... and for less.
 
"So, the pricing. The new 6970 and 6950, launched today, will have an RRP of $419 and $339 respectively. That's in Aussie dollars, by the way. Incidentally, US pricing for the 6970 sits at $US369. Just so you know.

However, it's worth noting that the 6970 is not AMD's best performer. That crown still sits firmly on the head of the previous generation's 5970 - which is a dual-core monster that'll cost you around $200 more than the 6970.

That's the trick here - the 6970 sits at a much more attractive price-performance ratio than any other AMD card. It's not the top dog, but it's probably the better enthusiast choice. Our testing, which is still in progress, backs that up - be sure to check out issue 121 for the full details. As a teaser, it does double the tessellation performance over previous cards.

In terms of NVIDIA cards, this release still leaves the GTX 580 as the single fastest card on the market, but again, it's a level of performance that you'll pay through the nose for - and will be unlikely to take full advantage of. And, to be honest, it's not that much faster than the 6970.

Though we only got the card yesterday... but so far, it's a sweet bit of kit."


http://www.atomicmpc.com.au/News/241778,amds-6970-pricing-revealed.aspx
 
Over at OcUK:

Gibbed said:
The new drivers are much faster, but I did not get them in time. The new drivers basically mean a 6970 beats a GTX 580 in tessellation or at least is no longer chasing it.

Wait for reviews tomorrow, guys; it's a fantastic card for the money. The GTX 580 does still have outright power, but it costs more.

For the same money I'd be bagging myself a pair of 6950's.

They also mention 6950 Xfire scaling was on average 95%.
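"95% scaling" presumably means the second card adds 95% of a single card's frame rate. A quick sketch of the usual arithmetic, with hypothetical fps figures rather than anything from the leak:

```python
def scaling_pct(single_fps, dual_fps):
    """Second-card gain as a percentage of single-card performance."""
    return 100.0 * (dual_fps - single_fps) / single_fps

# e.g. a single card at 50fps and the CrossFire pair at 97.5fps:
print(scaling_pct(50.0, 97.5))  # 95.0, i.e. "95% scaling"
```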
 
If it's only slightly better than the 570, for $369, it's more "meh" than "sweet".
Especially since ATI is late to the party.
 
$370 for 5% less performance than a $500 card is :drools:

The top cards from both sides are lacking the knockout punch, it feels like: 100% more price for 30% more performance.
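That "100% more price for 30% more performance" complaint is just a perf-per-dollar ratio; a toy calculation with those same round numbers:

```python
def relative_value(perf_gain, price_gain):
    """Perf-per-dollar of the flagship relative to the cheaper card."""
    return (1.0 + perf_gain) / (1.0 + price_gain)

# +30% performance for +100% price:
print(relative_value(0.30, 1.00))  # 0.65 -- 35% worse value per dollar
```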
 
If it's only slightly better than the 570, for $369, it's more "meh" than "sweet".
Especially since ATI is late to the party.

It's not even one fucking month. Chill out, will you? :rolleyes: It's not like it took 6 months to release, or one year (HD2900XT).
 
Seriously, people are overreacting so much. Fermi was late by 6 months. 6900 series was late by 3 weeks.
 
You sound like you WANT it to be true.

Dan, like everyone else, bases his decisions on whatever info is available to him, including his past experiences. He already stated more than once that:
1) the 5970 was not enough for his 3x30" setup, and
2) from his experience, and others' it seems, Crossfire+Eyefinity scaling sucks.

Given those 2 points, it seems like a logical decision for him to go with the 580s. Why?
1) Even if the 6970 is faster, is their Eyefinity + Crossfire scaling better, and if so, is it enough to beat out a multi-monitor GTX 580 SLI setup?
2) If the 6970 is on par or slower, then the only loss he is taking is a bit of $$ on each card, and that's assuming the Eyefinity+Crossfire scaling is working well.

I'll simplify it down for you one more time:
Even if the 6970 is FASTER or a better VALUE than the GTX 580, Dan would probably still have gone with the GTX 580, because for his application, NV has proven to be more reliable so far.


Have you seen the 6850/6870 crossfire scaling? It's MUCH improved over 58xx, and it even seems to be better than SLI scaling.

OK, fanboy, the fact remains that Dan voted with his money for the GTX 580. Slice it, dice it, that's the bottom line. GTX 580, baby! I thought Eyefinity was a major selling point, yet we have Danny-boy here going with the GTX 580 for his 3-monitor setup... Are you saying Nvidia is beating AMD in a technology that AMD is trying to popularize? What special needs are you saying Dan has that AMD can't meet?

The bottom line here is that Nvidia got a major endorsement here by Dan. He usually only buys the top of a generation; I was expecting him to buy a card when the next shrink came, but he spent his money earlier. That says it all.

You need to remain impartial when you look at the facts.
 
OK, fanboy, the fact remains that Dan voted with his money for the GTX 580. Slice it, dice it, that's the bottom line. GTX 580, baby! I thought Eyefinity was a major selling point, yet we have Danny-boy here going with the GTX 580 for his 3-monitor setup... Are you saying Nvidia is beating AMD in a technology that AMD is trying to popularize? What special needs are you saying Dan has that AMD can't meet?

The bottom line here is that Nvidia got a major endorsement here by Dan. He usually only buys the top of a generation; I was expecting him to buy a card when the next shrink came, but he spent his money earlier. That says it all.

You need to remain impartial when you look at the facts.
Call me a fanboy all you want; I went from a GTX 285 to a 5970, and now I'm waiting to see who gives me better performance. How many AMD cards have you ever run?

If you want to follow Dan's decision like a sheep without even waiting for the facts, or even understanding what he posted, be my guest; your wallet can share his vote. I'll wait till my wallet finds what suits it best :)

And since you can't seem to read and comprehend at the same time, let me break it down for you:
DAN RUNS 3 30" MONITORS. WITH THE 5XXX, CROSSFIRE + EYEFINITY (MULTI-MONITOR) SUCKED, WHICH DAN CLEARLY STATED; IF YOU HAD READ PAST THE FIRST LINE OR 2 YOU MIGHT HAVE GOTTEN THAT PART TOO. SO IT MAKES SENSE FOR DAN TO BUY NVIDIA CARDS THIS ROUND, SINCE HIS LAST EXPERIENCE WITH AMD AND MULTI-MONITOR / MULTI-GPU SUCKED. EVEN IF THE 6970 IS A BETTER VALUE, OR FASTER THAN THE GTX 580, OR EVEN SLOWER AND MORE EXPENSIVE, IT MAKES SENSE FOR DAN TO BUY GTX 580S.

Hope that clears it up for you.
 
Dude, I read your words and try to give you the benefit of the doubt, that's why I say I don't understand you. Everyone wants eyefinity. That's what [H]ard gaming is all about.

If you are one of those guys who still plays on a 17" monitor then I get your point...

I like your colors though.
 
Dude, I read your words and try to give you the benefit of the doubt, that's why I say I don't understand you. Everyone wants eyefinity. That's what [H]ard gaming is all about.

If you are one of those guys who still plays on a 17" monitor then I get your point...

I like your colors though.

I play on a single 30" display. I don't think I'll be getting 3 30" monitors any time soon, and I don't want to give up the 30" for 3 smaller displays either. I'm part of the MAJORITY that will not be gaming on multiple monitors; one day I might, but for now I won't. So I, like most other people, do not care about xfire / eyefinity scalability, and given that Crossfire scalability has been fixed with the 68xx cards, there is a chance that eyefinity + crossfire is fixed too.
 
OK, fanboy, the fact remains that Dan voted with his money for the GTX 580. Slice it, dice it, that's the bottom line. GTX 580, baby! I thought Eyefinity was a major selling point, yet we have Danny-boy here going with the GTX 580 for his 3-monitor setup... Are you saying Nvidia is beating AMD in a technology that AMD is trying to popularize? What special needs are you saying Dan has that AMD can't meet?

The bottom line here is that Nvidia got a major endorsement here by Dan. He usually only buys the top of a generation; I was expecting him to buy a card when the next shrink came, but he spent his money earlier. That says it all.

You need to remain impartial when you look at the facts.

You have clearly said it like a true fanboy yourself. I have a GTX 480 and I go back and forth between ATI and Nvidia, so I am sure Dan does the same. I have no excuse for why I do it; I just do it for the hell of it. I have an AMD board and AMD processor right now, so I am thinking about going Crossfire soon after I sell off my GTX 480. AMD seems to have finally made Crossfire worth the money with the 6000 series. I am sure they did something in hardware, because the 5000 series never excelled that much; maybe it's the way the new shaders are designed.
 
OK, fanboy, the fact remains that Dan voted with his money for the GTX 580. Slice it, dice it, that's the bottom line. GTX 580, baby! I thought Eyefinity was a major selling point, yet we have Danny-boy here going with the GTX 580 for his 3-monitor setup... Are you saying Nvidia is beating AMD in a technology that AMD is trying to popularize? What special needs are you saying Dan has that AMD can't meet?

The bottom line here is that Nvidia got a major endorsement here by Dan. He usually only buys the top of a generation; I was expecting him to buy a card when the next shrink came, but he spent his money earlier. That says it all.

You need to remain impartial when you look at the facts.

What AMD hasn't been able to do very well was scaling beyond dual GPUs. Dual GPU scaling with AMD has been fine with the 5870 and 5970 cards. Having reached that level of performance, I know damned well that it isn't enough. Dual 6970's may very well serve my needs. However in the past Crossfire and Eyefinity performance suffered compared to SLI + NVSurround performance. So I went with NVIDIA. There are other factors leading me that direction as well, but that's the basic gist of the decision.

I typically buy the highest end solution within a given generation. I often skip the refreshes if there isn't enough of a leap over the previous iteration of the high end product. In other words I had three GeForce GTX 280's and skipped the GeForce GTX 285's because they didn't offer enough of a boost to make buying three new cards and selling my three old cards worth the extra hit to my wallet. I didn't opt for the GeForce GTX 295 because in Quad-SLI two of those were a weaker solution than three GeForce GTX 280's or GTX 285's. I grabbed the 5970 knowing I could overclock it to 5870 speeds and thus get the same performance for less money. The main selling point for me was actually Eyefinity. I had hoped I'd be able to run two 5970's in Eyefinity at some point but AMD never got the CrossfireX scaling issues resolved to a point where I felt the cost of a second 5970 was worthwhile. In fact no matter how I sliced it, GeForce GTX 480's and now GeForce GTX 580's in SLI were the fastest solution especially where multi-monitor gaming was concerned.

Call me a fanboy all you want; I went from a GTX 285 to a 5970, and now I'm waiting to see who gives me better performance. How many AMD cards have you ever run?

If you want to follow Dan's decision like a sheep without even waiting for the facts, or even understanding what he posted, be my guest; your wallet can share his vote. I'll wait till my wallet finds what suits it best :)

And since you can't seem to read and comprehend at the same time, let me break it down for you:
DAN RUNS 3 30" MONITORS. WITH THE 5XXX, CROSSFIRE + EYEFINITY (MULTI-MONITOR) SUCKED, WHICH DAN CLEARLY STATED; IF YOU HAD READ PAST THE FIRST LINE OR 2 YOU MIGHT HAVE GOTTEN THAT PART TOO. SO IT MAKES SENSE FOR DAN TO BUY NVIDIA CARDS THIS ROUND, SINCE HIS LAST EXPERIENCE WITH AMD AND MULTI-MONITOR / MULTI-GPU SUCKED. EVEN IF THE 6970 IS A BETTER VALUE, OR FASTER THAN THE GTX 580, OR EVEN SLOWER AND MORE EXPENSIVE, IT MAKES SENSE FOR DAN TO BUY GTX 580S.

Hope that clears it up for you.

It made sense for me to make the choice I made. At least in my mind. I think once NDA lifts and we can all see the data it will point to the GeForce GTX 580 and the 6970 being fairly close performance wise. Cost looks like it may favor AMD but I'm not interested so much in bang for my buck, but rather absolute performance which I feel will favor NVIDIA in the end when all is said and done. I think two GeForce GTX 580's and possibly a third down the line will yield the best possible results I can expect and give me the most performance for my triple 30" display setup. Running this now on a single Radeon HD 5970 is brutal. While on one hand I'm surprised at how well the card handles the strain, it's too much for any one card, even with dual GPUs to handle this at present. For the first time ever I'm having to turn off AA completely in some games and other details as well in order to get playable frame rates in my favorite games. I've not had to do that since the days of Doom and Descent on my 486. Since then I've always been able to afford enough graphics power and CPU power to max out games as much as possible or anyone else could at a given time with all the current games. To me, dropping eye candy quality and taking my monitors out of native resolution is unacceptable. If it takes me purchasing three GeForce GTX 480's or hell even four of the bastards to stop that I'll do it.

Dude, I read your words and try to give you the benefit of the doubt, that's why I say I don't understand you. Everyone wants eyefinity. That's what [H]ard gaming is all about.

If you are one of those guys who still plays on a 17" monitor then I get your point...

I like your colors though.

Eyefinity kicks ass. No doubt about it. I love it and knew I had to have it when I first tried it at the Eyefinity Challenge we did here in Dallas. It got me excited about gaming again. Now that I'm running it I'm even more excited about it. However, NVIDIA's NVSurround technology gives you the same thing, but does so with a little more elegance (in some ways) and NVIDIA's Fermi cards did so without the performance hit that AMD's suffered in comparison. Believe me, the 5970's overclocked to 5870 speeds can chew up and spit out damned near any game at 2560x1600 but it struggles at 7680x1600 (3x2560x1600).
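For a sense of why the triple-30" setup is so brutal, the raw pixel counts (simple arithmetic, nothing measured):

```python
single = 2560 * 1600  # one 30" panel
triple = 3 * single   # 7680x1600 Eyefinity across three of them
print(single, triple) # 4096000 12288000 -- three times the pixels per frame
```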

I play on a single 30" display. I don't think I'll be getting 3 30" monitors any time soon, and I don't want to give up the 30" for 3 smaller displays either. I'm part of the MAJORITY that will not be gaming on multiple monitors; one day I might, but for now I won't. So I, like most other people, do not care about xfire / eyefinity scalability, and given that Crossfire scalability has been fixed with the 68xx cards, there is a chance that eyefinity + crossfire is fixed too.

I didn't want to give up my 30" monitor for 3 smaller displays. I almost did. I had three Dell U2410's in the cart at Dell's online store ready to confirm the purchase and couldn't do it. I waited a bit longer and got two more Dell 3007WFP-HC's. I am GLAD I did. Given that 6000 series scaling has improved over 5x00 series scaling, I hope their Eyefinity performance improved. I really do. I just don't have a good feeling about AMD's ability to improve scaling beyond dual GPUs. From all the leaked data that's been posted in this thread, and articles from around the world, it appears that the 5970 will remain AMD's fastest single card for some time at least. Thus I can only conclude that I'd need at least two 6970's to make me happy. However I've got the feeling that dual GTX 580's will be slightly faster in multi-monitor gaming and even if they aren't, they scale well enough for me to add a third card. I doubt AMD's 6970's will do the same.

I thought the 5770 didn't have the problems the 5870 did in xfire/eyefinity...

It may or may not but regardless it's not fast enough for me. It may be enough for others even where Eyefinity is concerned with say 22" monitors or whatever.

You have clearly said it like a true fanboy yourself. I have a GTX 480 and I go back and forth between ATI and Nvidia, so I am sure Dan does the same. I have no excuse for why I do it; I just do it for the hell of it. I have an AMD board and AMD processor right now, so I am thinking about going Crossfire soon after I sell off my GTX 480. AMD seems to have finally made Crossfire worth the money with the 6000 series. I am sure they did something in hardware, because the 5000 series never excelled that much; maybe it's the way the new shaders are designed.

Yeah, I go back and forth all the time. I vote with my wallet based on what I believe provides the absolute fastest performance possible. I don't care so much about bang for my buck unless I can get 95% as much bang for significantly less buck.
 
I wonder if Charlie is still laughing now?

I am sure he is; for the price, the cards are pretty much a win. Plus, there are no official reviews out yet.

Probably. As long as the performance of the 6970 is close enough to the GTX 580 he'll be able to highlight what he considers NVIDIA's weaknesses, and given the price point it looks like the 6970 will hit, he'll argue its value. The sad part is he'll be right for the most part even if his hatred for NVIDIA has reached irrational proportions.
 