GTX 580 rumors

If you guys are referencing the more recent chart:

gtx580A_copia.jpg


The GTX 580 is showing as 20% or more faster (usually more) than the GTX 480 in the following benchmarks:

Batman Arkham Asylum
Starcraft 2
3dmark Vantage
Crysis Warhead
Hawx
Just Cause 2
Alien Vs Predator
Battleforge
Battlefield Bad Company 2
Dirt 2
Lost Planet 2
Metro 2033
Stalker COP
Stone Giant
Unigine Heaven

It's less than 20% faster only in the following:

COD MW2
Resident Evil 5
World in Conflict

Please re-read the chart and look at the data, not the length of the bars. Each line up is a 20% boost over the other card.

Now, granted, all of these charts could be canned, and some are obviously synthetic (Vantage, Stone Giant, Heaven, to name a few), but man, in other forums they are laughing and saying that most of you guys can't read the charts.

Lol, like I said in the other thread you posted this in... wrong math. Recheck.
 
Lol, like I said in the other thread you posted this in... wrong math. Recheck.

My math seems right. Each line up is a 20% performance difference. For example, the GTX 480 is at 1.4 or just under in Batman Arkham Asylum, and the GTX 580 is at the line above, at 1.6. That is 20% faster than the GTX 480. What am I missing here?
 
That would be true if you were only talking about the 5870. Those lines between the 480 and the 580 are going to be less than 20% when they are higher than the 5870.
 
My math seems right. Each line up is a 20% performance difference. For example, the GTX 480 is at 1.4 or just under in Batman Arkham Asylum, and the GTX 580 is at the line above, at 1.6. That is 20% faster than the GTX 480. What am I missing here?

Let's break it down, shall we?

GTX 580 ~ 1.6
GTX 480 ~ 1.4

1.6 - 1.4 = 0.2

0.2 / 1.4 = 0.1428571

0.1428571 x 100 = 14.28571%
 
That would be true if you were only talking about the 5870. Those lines between the 480 and the 580 are going to be less than 20% when they are higher than the 5870.

Ah, I see what you're saying, in correlation to the 5870. The GTX 480 is 3 bars = 1 in Batman, so it doesn't equate the same.
 
My math seems right. Each line up is a 20% performance difference. For example, the GTX 480 is at 1.4 or just under in Batman Arkham Asylum, and the GTX 580 is at the line above, at 1.6. That is 20% faster than the GTX 480. What am I missing here?

Your maths *is* wrong
:rolleyes:

Here is an arithmetic lesson...
If 'A' has a speed of 1.4, and 'B' has a speed of 1.6, how much faster is 'B' than 'A'?
Express the answer as a percentage advantage of 'B' over 'A'.

Ans: (1.6 / 1.4) * 100 - 100 ≈ 14%

:)

(ok - vererz beat me to it! )
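
For anyone who wants to sanity-check it themselves, here's a quick Python sketch of the same percent-faster arithmetic. The 1.4 and 1.6 are just approximate values read off the leaked chart (normalized to some baseline card), so treat them as assumptions:

```python
# Quick sanity check of the percent-faster arithmetic.
# The 1.4 and 1.6 values are approximate readings off the leaked chart,
# normalized to a baseline card, so treat them as assumptions.

def percent_faster(b, a):
    """How much faster card 'b' is than card 'a', as a percentage."""
    return (b / a - 1.0) * 100.0

gtx_480 = 1.4  # approx. normalized score
gtx_580 = 1.6  # approx. normalized score

print(f"GTX 580 vs GTX 480: {percent_faster(gtx_580, gtx_480):.1f}% faster")
# prints: GTX 580 vs GTX 480: 14.3% faster -- not 20%
```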
 
Your maths *is* wrong
:rolleyes:

Here is an arithmetic lesson...
If 'A' has a speed of 1.4, and 'B' has a speed of 1.6, how much faster is 'B' than 'A'?
Express the answer as a percentage advantage of 'B' over 'A'.

Ans: (1.6 / 1.4) * 100 - 100 ≈ 14%

:)

(ok - vererz beat me to it! )


LOL. Thanks guys.
 
I thought the rumor was 560x2 or 460x2 for the dual card.
That'd be sick.

The 560x2 could be true, but the truth of the matter is, due to the better TDP of GF110, if you can do dual Cayman, you can do dual GF110. Dual GF114 (GTX 560) would be dual mainstream chips, and I don't expect those to beat dual Cayman, which is dual high-end chips. Dual GTX 570s or dual GF110s have a shot. We have to wait patiently and see.

If you had asked me before whether Nvidia had an answer to the 5970, I would have told you no. Too much power/heat/noise for a dual GTX 470. Now I think it's obvious Nvidia is coming with the big guns.
 
If you had asked me before whether Nvidia had an answer to the 5970, I would have told you no. Too much power/heat/noise for a dual GTX 470. Now I think it's obvious Nvidia is coming with the big guns.

Hehehe.... Welcome to the GTX 5XX dual card side... :D I bet dual 384 core though.
 
Hehehe.... Welcome to the GTX 5XX dual card side... :D I bet dual 384 core though.

LOL, you didn't convert me; you said the 580 was a dual card and that GF110 = dual GF104. I say the GTX 595 is a dual card, aka dual GF110 :D Oh yeah, that's my prediction for the name ;)
 
LOL, you didn't convert me; you said the 580 was a dual card. I say the GTX 595 is a dual card :D Oh yeah, that's my prediction for the name ;)

Ah, but I am still on the 768 core/384x2/128TMU/512-bit rumor. :D It was not GF110, but then GF110 was not the 512-bit chip many argued it would be, either.
 
Ah, but I am still on the 768 core/384x2/128TMU/512-bit rumor. :D It was not GF110, but then GF110 was not the 512-bit chip many argued it would be, either.

My rumor/guess was closer than yours :D Either way, I'm calling dual GF110, basically a GTX 570 x2, not 384x2; let's see who is closer there. Bet you a six-pack :D
 
A 570x2 would be sick.
700 dollars?

(looking into the forum-famous crystal ball) They will launch days after Antilles, match the price, Antilles will cut its price by $20, price wa.... Sorry, it got cloudy again; I'll check the crystal ball later. :D
 
Ah, but I am still on the 768 core/384x2/128TMU/512-bit rumor. :D It was not GF110, but then GF110 was not the 512-bit chip many argued it would be, either.

Yes, but what you argued was that the 'GF110' was a 2 x GF104 (or later 2 x GF114) card...
- which is not the same argument as saying a 2 x GF104 card could be produced
- since there already are examples of this card out there...

Since in NV terminology GF110 refers to a chip, not a card, your argument didn't make sense....
 
My rumor/guess was closer than yours :D Either way, I'm calling dual GF110, basically a GTX 570 x2, not 384x2; let's see who is closer there. Bet you a six-pack :D

The problem with a Dual GF110 board is going to be power consumption
- AFAIK a PCIe board is limited to 300W, so basically, it can't be approved by the PCI board unless it's < 300W
- this is why the HD5970 used down-clocked HD5870 chips....
- (good ones)

If the HD6990 really is a Dual Cayman, then it's going to have the same problem...
- not sure what they can do about it.
A Dual Bart board would be <300W, but would it perform well enough....?

A Dual GF104 board would give about GTX480 + 20% (i.e. GF110 levels)
- but take more power, and cost more to make than a GF110
- which is why (I think) it hasn't been produced
- once the GF110 was deemed ok, there was no need for the Dual GF104

Maybe a Dual GF114 board would give better perf/watt than the GF104
- but we need to see how this one performs once it's released...
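
To put rough numbers on the 300W problem, here's a quick Python sketch of the budget math. Every figure in it (board overhead, single-GPU draw) is an assumption for illustration, not a leaked spec:

```python
# Rough budget check for a dual-GPU board under the 300W PCIe limit.
# Every number here is an assumption for illustration, not a leaked spec.

PCIE_LIMIT_W = 300        # max total board power the spec allows
BOARD_OVERHEAD_W = 40     # assumed: memory, VRM losses, fan, bridge chip

single_gpu_w = 230        # assumed draw of one full-speed high-end GPU

budget_per_gpu_w = (PCIE_LIMIT_W - BOARD_OVERHEAD_W) / 2
cut_needed = 1 - budget_per_gpu_w / single_gpu_w

print(f"Each GPU gets about {budget_per_gpu_w:.0f}W,")
print(f"so per-GPU power would have to drop roughly {cut_needed:.0%}")
```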
 
The problem with a Dual GF110 board is going to be power consumption
- AFAIK a PCIe board is limited to 300W, so basically, it can't be approved by the PCI board unless it's < 300W
- this is why the HD5970 used down-clocked HD5870 chips....
- (good ones)

If the HD6990 really is a Dual Cayman, then it's going to have the same problem...
- not sure what they can do about it.
A Dual Bart board would be <300W, but would it perform well enough....?

A Dual GF104 board would give about GTX480 + 20% (i.e. GF110 levels)
- but take more power, and cost more to make....
- which is why (I think) it hasn't been produced
- once the GF110 was deemed ok, there was no need for the Dual GF104

Maybe a Dual GF114 board would give better perf/watt than the GF104
- but we need to see how this one performs once it's released...

I agree with everything you said. Only thing is, if they can make a dual Cayman, they can make a dual Fermi 2. What they might have to do is downclock the core and speed up the shaders & RAM. Either way, I have high hopes for the GTX 570 pulling what a 5870 pulls, or slightly less, with more performance.
 
I agree with everything you said. Only thing is, if they can make a dual Cayman, they can make a dual Fermi 2. What they might have to do is downclock the core and speed up the shaders & RAM. Either way, I have high hopes for the GTX 570 pulling what a 5870 pulls, or slightly less, with more performance.

Yes, it could be possible, depending....
- down-clocking allows down-volting, which reduces the power consumption enormously...
- so my guess is that they must be planning to do something like that (both AMD & NV)

(Pulling some numbers out of my ass ...)
If the GF110 / GTX580 was an 800MHz part, it could be that clocking it at 600MHz would allow enough down-volting to get a Dual GF110 card under 300W
- which might be worthwhile, even allowing for SLI -scaling losses....

Same goes for HD6990....

That's the only way I can see these cards being viable under 300W

But, what's the point?
- why not just stick with SLI & CF? Faster, cheaper

(BTW, the shaders are always 2x the core clock in Fermi (GF10x at least) )
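
A minimal sketch of that clock/voltage reasoning, assuming dynamic power scales roughly with frequency times voltage squared. The 800MHz/600MHz clocks are the made-up figures from above, and the voltages are assumptions too:

```python
# Why down-clocking plus down-volting helps so much: dynamic power scales
# roughly as frequency * voltage^2. The 800MHz/600MHz clocks are the made-up
# figures from the post above; the voltages are pure assumptions.

def power_ratio(f_new, f_old, v_new, v_old):
    """Approximate dynamic-power ratio after a clock and voltage change."""
    return (f_new / f_old) * (v_new / v_old) ** 2

ratio = power_ratio(f_new=600, f_old=800, v_new=0.95, v_old=1.10)
print(f"~{ratio:.0%} of the original per-GPU power")  # ~56% in this example
```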
 
Yes, it could be possible, depending....
- down-clocking allows down-volting, which reduces the power consumption enormously...
- so my guess is that they must be planning to do something like that (both AMD & NV)

(Pulling some numbers out of my ass ...)
If the GF110 / GTX580 was an 800MHz part, it could be that clocking it at 600MHz would allow enough down-volting to get a Dual GF110 card under 300W
- which might be worthwhile, even allowing for SLI -scaling losses....

Same goes for HD6990....

That's the only way I can see these cards being viable under 300W

But, what's the point?
- why not just stick with SLI & CF? Faster, cheaper

(BTW, the shaders are always 2x the core clock in Fermi (GF10x at least) )

You're on the right track, Samurai, but my forum-famous crystal ball says the GTX 570 will not have the full 512 cores and will come with 480 cores and slightly lower clocks than the GTX 580. I anticipate that a dual version of the card will not need to be clocked down to 600MHz. :D

Oh, and I agree; I think single dual-GPU cards are a waste IMO. I'm more of a dual/triple card guy myself. I might buy 6970s if they perform better overclocked to max stable. Three, actually :D
 
Was the typo typing 3 instead of 4, or 5?

Nope, typo in the currency conversion in one of the cells in my spreadsheet...

Reposted with accurate data. Sorry about that.

I had a look at the ASUS releases today, and did some statistical pricing analysis and here is my best guess at launch pricing based on that.

I predict the 580 will replace the 480; I think the 480 will be discontinued.

On launch, I think the 580 will sell for $570. According to my analysis, it could go as high as $600, or as low as $548.

After launch, once partners start doing discounts and mail-ins, I think we could see them for about $535. As high as $570 and as low as $480.

My analysis is based on the following:

ASUS's leaked pricing earlier today was all in Yuan. I compared the Yuan pricing to the lowest prices currently available on Newegg for existing products, then multiplied that ratio by the Asus listed pricing for the 580.

Note, the Asus price list had no 480 on it. This could be due to it being discontinued by Nvidia (using the parts for Tesla boards) or just due to Asus no longer seeing a point in making a 480.

I could - of course - be wrong, but this is my best estimation based on numerical methods.
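
For anyone curious, here's a rough Python sketch of that ratio method. The Yuan list prices and Newegg prices in it are placeholders, not the actual leaked figures:

```python
# Sketch of the ratio-based price estimate described above. The Yuan list
# prices and Newegg prices are placeholders -- the real ASUS figures were
# not posted in this thread.

# For cards already on sale: (lowest Newegg USD, ASUS Yuan list price)
existing = {
    "GTX 470": (260.0, 2299.0),   # placeholder values
    "GTX 460": (200.0, 1699.0),   # placeholder values
}

ratios = [usd / yuan for usd, yuan in existing.values()]
avg_ratio = sum(ratios) / len(ratios)

gtx580_yuan = 4999.0  # placeholder ASUS list price for the GTX 580
estimate_usd = gtx580_yuan * avg_ratio

print(f"Estimated GTX 580 launch price: ${estimate_usd:.0f}")
```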
 
Zarathustra[H] said:
Nope, typo in the currency conversion in one of the cells in my spreadsheet...

Reposted with accurate data. Sorry about that.

I had a look at the ASUS releases today, and did some statistical pricing analysis and here is my best guess at launch pricing based on that.

I predict the 580 will replace the 480; I think the 480 will be discontinued.

On launch, I think the 580 will sell for $570. According to my analysis, it could go as high as $600, or as low as $548.

After launch, once partners start doing discounts and mail-ins, I think we could see them for about $535. As high as $570 and as low as $480.

My analysis is based on the following:

ASUS's leaked pricing earlier today was all in Yuan. I compared the Yuan pricing to the lowest prices currently available on Newegg for existing products, then multiplied that ratio by the Asus listed pricing for the 580.

Note, the Asus price list had no 480 on it. This could be due to it being discontinued by Nvidia (using the parts for Tesla boards) or just due to Asus no longer seeing a point in making a 480.

I could - of course - be wrong, but this is my best estimation based on numerical methods.

Amazon listing was for $529, but they've since blanked it.
 
What do you think about this GPU?
Is it worth buying, or is it better to wait six months and get Kepler?
 
What do you think about this GPU?
Is it worth buying, or is it better to wait six months and get Kepler?

I think that depends on how fast it ends up being, how fast the 6970 ends up being, how much they both end up being, and what card you are using now. Other things that will affect this decision are heat, power consumption, noise, size, what PSU you have, what resolution you play at, etc. We don't know ANY of those things:)
 
I think that depends on how fast it ends up being, how fast the 6970 ends up being, how much they both end up being, and what card you are using now. Other things that will affect this decision are heat, power consumption, noise, size, what PSU you have, what resolution you play at, etc. We don't know ANY of those things:)

I don't want to know if a 580 is suitable for my needs; I want to know if it's worth the $500 when a new architecture will soon hit the road. :)
 
I don't want to know if a 580 is suitable for my needs; I want to know if it's worth the $500 when a new architecture will soon hit the road. :)

Ahh- I don't think it will be worth $500 when I anticipate the 6970 being $450 tops, but I really don't know. Especially with both Kepler, and AMD's 28nm creations coming reasonably soon.
 
Ahh- I don't think it will be worth $500 when I anticipate the 6970 being $450 tops, but I really don't know. Especially with both Kepler, and AMD's 28nm creations coming reasonably soon.

The 580 is tempting, but Kepler is too imminent...
 