As above.
Looking at the specs and details of the two cards, it's just...weird.
Geforce GTX 560 = $250
Geforce GTX 570 = $350
($100 difference with nothing in that slot?!)
Geforce GTX 560 = 30A on +12V
Geforce GTX 570 = 38A on +12V
(my PSU supplies 34A on +12V, so... I shouldn't use the 570, but the 560 isn't maxing out what my PSU is designed to handle...)
Looking at pretty much any of the metrics - shader cores, etc. - there seems to be a pretty big gap between the two.
Am I missing something?