Decisions, Decisions! Which EVGA GTX660?

PGHammer

2[H]4U
Joined: Oct 8, 2002 · Messages: 3,315
Because my (refurbished) GTX550Ti is holding up in everything *except* BF4, I'm looking down the road at eventual replacements, and, due to previous experience with it in non-personal hardware, naturally, the GTX660 came to mind.

Advantages - no PSU upgrade needed (same power requirements as current GPU), no driver replacement needed, full-size HDMI (the one quibble about my 550Ti is the mini-HDMI port), decent price.

Disadvantage - not latest or greatest in terms of nV GPU technology.

I'm trying to decide between a pair of GTX660s, both from EVGA. The only differences are cooling and price. The choice is between the "base" Superclocked (http://www.evga.com/Products/Product.aspx?pn=02G-P4-2662-KR) and the base Signature 2 (http://www.evga.com/Products/Product.aspx?pn=02G-P4-2661-KR).

Which is the better choice paired with a Haswell i5-K?
 
Both will work fine. But, if you're wanting to overclock, the dual fan version will work better. Worth the $10? Probably.

Also, I see in your sig that you have a 3570k. Upgrading to a Haswell chip will net you almost no performance benefit. It isn't worth the money.
 
My vote is for the Signature 2; it will run cooler and quieter than the reference cooler, and the lower temps will also let it hold a higher turbo boost frequency.
 
I repeat, there will be NO OC difference between them or pretty much any 660 card on the market. The thing stopping the 660 from becoming a powerhouse card at $200 (which it already is), is the fact it's limited to just one SLI (not a problem for most) and the power draw limits.

You are limited to an extremely mild OC on any 660 card, so both will run cool; the Signature 2 will just run cooler, obviously, due to its dual over-the-top fan design. You can easily set a software-based fan curve on the blower one and get the same results with mild noise. The 600 series was a game changer for blower-style fans; they're nearly silent in comparison to previous-generation cards.

If you're using EVGA Bucks to offset the cost, then pay the extra $10 to get the dual fans; otherwise, go somewhere else online and buy the cheapest non-used EVGA 660 you can get your hands on. Many are < $200 before a $20 rebate on top. So long as you have good airflow in your case, the Signature 2 will be better. If airflow in the case isn't great, then it won't matter which you choose; in fact, a blower fan would probably be better.

You're going to be limited on any OC regardless, though, and that's the thing that should be pointed out here. Which is why I say go for the cheapest 660 you can get your hands on, then SLI them when the prices drop even further in November, throw a 110% power ratio in EVGA Precision X, and call it a day.
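A software fan curve is just a mapping from GPU temperature to fan duty cycle; Precision X lets you draw it in the GUI, but conceptually it's nothing more than this (a rough sketch; the breakpoints below are made up for illustration, not EVGA's defaults):

```python
# Hypothetical sketch of a custom fan curve like one you would draw in
# EVGA Precision X. Breakpoints are invented for illustration only.

CURVE = [(30, 30), (50, 45), (65, 60), (75, 80), (85, 100)]  # (deg C, fan %)

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan duty cycle between curve breakpoints."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))  # 70.0 (halfway between the 65C and 75C points)
```

A steeper curve trades noise for temps; on a blower card you'd just shift the whole curve up a bit.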
 
Both will work fine. But, if you're wanting to overclock, the dual fan version will work better. Worth the $10? Probably.

Also, I see in your sig that you have a 3570k. Upgrading to a Haswell chip will net you almost no performance benefit. It isn't worth the money.

There is really zero price difference between Ivy Bridge and Haswell (i5-K vs. i5-K); the reason I still have the original planned configuration in my sig (labelled "under construction") is that I am still nailing down other items. Since the price difference is zero, and the performance difference (at the same clocks) is in Haswell's favor, Haswell has slightly greater bang for the buck than Ivy Bridge (or Sandy Bridge, for that matter). Had I already bought Ivy Bridge (the i5-3570K), I'd agree; moving to Haswell would, in fact, be pointless.

With the CPU sorted out, it then becomes a matter of which GPU has the best bang for the buck.

The $10 price gap is at my preferred local retailer - MicroCenter. (I have a choice between Fairfax, VA and Rockville, MD - Rockville has better inventory, but the higher tax bite; the offset is that it costs less to get to via mass-transit than Fairfax.)

I use PrecisionX as my OC tool today, so there would (literally) be no change whatever in my GPU software loadout.

While GTX7xx is newer, it also significantly jacks up power requirements (even for just one) - if I had a larger display, that might be worth it. However, with a larger display not feasible (not enough desk space - literally; that is the same reason why SLI is a non-starter), GTX7xx, unless the price comes down rather significantly, makes no sense, even for BF4 (the one game pushing the upgrade).

With GTX7xx out, that leaves GTX6xx. The GTX660 is a somewhat known quantity - other than for BF4, it's almost overkill; not exactly something I'd expect to say about a sub-$200 GPU. However, it does have the advantage of the same power envelope as what I have now, and it doesn't force insane overclocking (which is not its reason for being in any case). The GTX670 and GTX680 have the same problems that the GTX770 and GTX780 do - outsized power requirements and outsized (for now) prices.
 
Last edited:
Hammer, did you end up making a purchase yet? Since you're set on SLI, and I assume you'll have it for a while, have you looked at 3GB 660s? You might be able to price match them if you can find some cheaper online. I say this because Nvidia is supposed to drop their prices significantly soon on lower-end cards, e.g. a GTX 660 at $150 or so.
In SLI, I'd bet the 660s have enough power for playable fps with over a 2GB framebuffer as newer games come out. If I could rebuy my cards, I'd go 3GB, price permitting...

However, with a larger display not feasible (not enough desk space - literally; that is the same reason why SLI is a non-starter)

Not trying to be daft here, but in the last few words, what did you mean by "SLI is a non-starter"?
Game on in BF4!
 
Definitely go for the 3GB card. It's only a few $$ more from EVGA. If the early beta results from the BF4 [H] review are an indication, you'll want all the video RAM you can get. The recommended settings for BF4 indicate 3GB video RAM. If you're doing an upgrade just for BF4, it doesn't make much sense to not at least hit the recommended system specs.

Maybe you won't notice a difference... maybe you'll be kicking yourself for not spending the extra $20 per card...

http://www.hardocp.com/article/2013/10/10/battlefield_4_beta_performance_preview/1
 
Definitely go for the 3GB card. It's only a few $$ more from EVGA. If the early beta results from the BF4 [H] review are an indication, you'll want all the video RAM you can get. The recommended settings for BF4 indicate 3GB video RAM. If you're doing an upgrade just for BF4, it doesn't make much sense to not at least hit the recommended system specs.

Maybe you won't notice a difference... maybe you'll be kicking yourself for not spending the extra $20 per card...

http://www.hardocp.com/article/2013/10/10/battlefield_4_beta_performance_preview/1
A 660 has nowhere near the power of a card that will utilize more than 2GB of VRAM. Even at the maximum possible settings, the highest I have seen is 1800MB in Crysis 3, and it was completely unplayable well before that anyway. All games stay well below 1500MB at settings that are smooth, especially at settings that give 60 fps. So unless BF4 needs that VRAM just for high-res textures, it will go to waste. And the only reviews I have seen for the 660 3GB have shown it to actually be a wee bit slower than the 2GB model, perhaps due to extra latency from a 192-bit bus trying to access more memory chips.
 
A 660 has nowhere near the power of a card that will utilize more than 2GB of VRAM. Even at the maximum possible settings, the highest I have seen is 1800MB in Crysis 3, and it was completely unplayable well before that anyway. All games stay well below 1500MB at settings that are smooth, especially at settings that give 60 fps. So unless BF4 needs that VRAM just for high-res textures, it will go to waste. And the only reviews I have seen for the 660 3GB have shown it to actually be a wee bit slower than the 2GB model, perhaps due to extra latency from a 192-bit bus trying to access more memory chips.

Perhaps so. He was asking about SLI, so that ups the game a bit. I honestly don't care what anyone buys; I'm just pointing out that the [H] review showed a recommended video card memory of 3GB for BF4, which is unheard of previously. I don't think that recommended spec is going to go down in the future with the new Xbone and PS4 having a lot more video RAM capacity.

I can say from experience running NVSurround that when you max out your video memory you'll go from playable to slideshow almost instantly. Happened to me in Star Wars and World of Warcraft MMOs when I had my old 1GB 460s in SLI. That was a pretty decent vRAM size at the time.

There's no way my next cards will be less than 4GB.
 
Perhaps so. He was asking about SLI, so that ups the game a bit. I honestly don't care what anyone buys; I'm just pointing out that the [H] review showed a recommended video card memory of 3GB for BF4, which is unheard of previously. I don't think that recommended spec is going to go down in the future with the new Xbone and PS4 having a lot more video RAM capacity.

I can say from experience running NVSurround that when you max out your video memory you'll go from playable to slideshow almost instantly. Happened to me in Star Wars and World of Warcraft MMOs when I had my old 1GB 460s in SLI. That was a pretty decent vRAM size at the time.

There's no way my next cards will be less than 4GB.
Again, the recommended card will also be faster than a 660. They are clearly referring to the 7950 or 7970, as they are working with AMD on the game, of course.

But yeah, for SLI, then of course I would get the 3GB model.
 
A 660 has nowhere near the power of a card that will utilize more than 2GB of VRAM. Even at the maximum possible settings, the highest I have seen is 1800MB in Crysis 3, and it was completely unplayable well before that anyway. All games stay well below 1500MB at settings that are smooth, especially at settings that give 60 fps. So unless BF4 needs that VRAM just for high-res textures, it will go to waste. And the only reviews I have seen for the 660 3GB have shown it to actually be a wee bit slower than the 2GB model, perhaps due to extra latency from a 192-bit bus trying to access more memory chips.

In fact, the real problem of the 660 and 660 Ti is the 192-bit bus. That bus isn't an even match for 2GB, so in short the 660 Ti is basically a 1.5GB (or a full 3GB) card that, thanks to the asymmetric memory controller, can access an extra 512MB. The first 1536MB works at the full 192-bit width, while the other 512MB works at only 64 bits, so bandwidth drops like a rock to 48GB/s and becomes a big bottleneck for the rest of the bus. That's why normal VRAM usage on a 660/660 Ti never exceeds 1.5GB; in practice you can go up to about 1.65GB without losing performance, and from that point every extra MB slows things down badly, so the full potential is within about 1.6GB of VRAM. The math: 6008 * 192 / 8 = 144,192 MB/s versus 6008 * 64 / 8 = 48,064 MB/s. That's it. A 3GB 660 Ti could and would use 3GB more correctly and efficiently than a 2GB card, but we have to agree it doesn't have enough power for that, and that's why games like Crysis 3 that go past 1.65GB suffer badly from that extra 1.8GB of VRAM usage.
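To spell out that peak-bandwidth arithmetic in one place (6008 MT/s is the effective GDDR5 data rate quoted above; this just restates the multiplication, and says nothing about the controller internals):

```python
# Restating the peak-bandwidth arithmetic from the post above.
# data rate in MT/s (effective GDDR5 rate), bus width in bits.

def peak_bandwidth_mb_s(data_rate_mt_s: int, bus_width_bits: int) -> int:
    """Peak transfer rate in MB/s = data rate (MT/s) * bus width (bytes)."""
    return data_rate_mt_s * bus_width_bits // 8

print(peak_bandwidth_mb_s(6008, 192))  # 144192 MB/s (~144 GB/s, first 1.5GB)
print(peak_bandwidth_mb_s(6008, 64))   # 48064 MB/s (~48 GB/s, last 0.5GB)
```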
 
In fact, the real problem of the 660 and 660 Ti is the 192-bit bus. That bus isn't an even match for 2GB, so in short the 660 Ti is basically a 1.5GB (or a full 3GB) card that, thanks to the asymmetric memory controller, can access an extra 512MB. The first 1536MB works at the full 192-bit width, while the other 512MB works at only 64 bits, so bandwidth drops like a rock to 48GB/s and becomes a big bottleneck for the rest of the bus. That's why normal VRAM usage on a 660/660 Ti never exceeds 1.5GB; in practice you can go up to about 1.65GB without losing performance, and from that point every extra MB slows things down badly, so the full potential is within about 1.6GB of VRAM. The math: 6008 * 192 / 8 = 144,192 MB/s versus 6008 * 64 / 8 = 48,064 MB/s. That's it. A 3GB 660 Ti could and would use 3GB more correctly and efficiently than a 2GB card, but we have to agree it doesn't have enough power for that, and that's why games like Crysis 3 that go past 1.65GB suffer badly from that extra 1.8GB of VRAM usage.
The mixed memory has ZERO real-world performance penalty. Performance does not drop off at 1.5GB on my 2GB GTX 660 Ti, just like it did not drop off at 768MB on my 1GB GTX 560 SE. In fact, after using the 192-bit GTX 560 SE and testing the crap out of it, I knew first-hand there was no issue there. And if you paid attention to what else I said earlier, you would see that the 3GB 660/660 Ti cards are no faster and can actually be slower.
 
Heavily bandwidth-sensitive games (like Crysis 3, Hitman: Absolution, Far Cry 3, etc.) will for sure show a GREAT performance penalty. If you compare a 3GB 660 Ti vs. a 2GB 660 Ti, both using 1GB of VRAM in a game with soft textures or low bandwidth sensitivity, fine, you've got a point. But if you compare a 2GB 660 Ti using 1.8GB of VRAM against a 3GB 660 Ti using 1.8GB of VRAM (and it will use even more if needed, for example in Crysis 3), then you will see the real impact in the game. That's why the 660 Ti gets a nicer bump from overclocking the memory than other cards with plenty of bandwidth and fully matched memory controllers. If you have the opportunity, or have a friend with a 3GB 660 Ti, test it in Crysis 3 and you will understand what happens with the limited 48GB/s bandwidth above 1.5GB of VRAM. I say this because I was able to test a great number of games with a friend on my machine: same day, same machine, same drivers, same games, same tests over and over and over. I repeat: bandwidth-sensitive games show a drop in performance on the 2GB compared to the 3GB 660 Ti.
 
Again, there is NO drop going past 1.5GB in anything. What you are claiming is nothing but theory, and not one single test has proven it. I, on the other hand, have had TWO cards with mixed memory and tested them both. It was easy to test the 1GB 192-bit GTX 560 SE, since most games went over 768MB easily; scaling did not change at all past 768MB. In fact, when I overclocked it to come up with basically the same overall raw numbers as the plain GTX 560, it matched it exactly in benchmarks. I spent hours testing that card, which is why I had no concerns getting the 660 Ti. I also compared it to my 670, and again nothing abnormal. Any mixed-memory issues do not impact real-world performance in the least.
 
Hammer, did you end up making a purchase yet? Since you're set on SLI, and I assume you'll have it for a while, have you looked at 3GB 660s? You might be able to price match them if you can find some cheaper online. I say this because Nvidia is supposed to drop their prices significantly soon on lower-end cards, e.g. a GTX 660 at $150 or so.
In SLI, I'd bet the 660s have enough power for playable fps with over a 2GB framebuffer as newer games come out. If I could rebuy my cards, I'd go 3GB, price permitting...



Not trying to be daft here, but in the last few words, what did you mean by "SLI is a non-starter"?
Game on in BF4!

Because SLI would require a new PSU, which would in turn play hob with my build budget. I can justify a single GPU at up to $200USD; however, two (and a new PSU) won't fly.

Also, there is another reason why SLI isn't in the cards - I tend to pass down, not sell, my used hardware (including GPUs) - and nobody is set up for SLI. And I've never passed down an entire rig at once.

Lastly, only BF4 and NFS Rivals will be out this year - and I will NOT be purchasing BF4 at launch. No other game I am even remotely interested in for PC will be out until, at the earliest, the spring of 2014. That means that the rest of my gaming time will be taken up with what I already have.
 
In fact, the real problem of the 660 and 660 Ti is the 192-bit bus. That bus isn't an even match for 2GB, so in short the 660 Ti is basically a 1.5GB (or a full 3GB) card that, thanks to the asymmetric memory controller, can access an extra 512MB; the first 1536MB works at the full 192-bit width, while the other 512MB works at only 64 bits, so bandwidth drops like a rock to 48GB/s.

Please, just stop. That's not how it works.

My PNY 660 just arrived yesterday!
 
Please, just stop. That's not how it works.

My PNY 660 just arrived yesterday!

CONGRATULATIONS, your PNY 660 just arrived yesterday! So?

If you want to "please, just stop" me, then stop me with BASES! With proof and with sources, just as I can stop you with this fast, reliable source as proof (if you want more, just tell me; I have a lot of them). If you don't know about GPU architecture, please do not talk. I was talking with Cannondale. Why? Because he knows what he's talking about. You? You don't know.

Thanks ;)!
 
CONGRATULATIONS, your PNY 660 just arrived yesterday! So?

Thanks. It's a gift for my parents.

If you want to "please, just stop" me, then stop me with BASES! With proof and with sources, just as I can stop you with this fast, reliable source as proof (if you want more, just tell me; I have a lot of them). If you don't know about GPU architecture, please do not talk. I was talking with Cannondale. Why? Because he knows what he's talking about. You? You don't know.

Thanks ;)!
I'm sorry, but that page doesn't have any supporting evidence. All it shows is that the memory is not balanced in capacity per channel. It reveals nothing about the hardware's actual striding/striping strategy for minimizing the performance impact. Do you really think NV would design something that suddenly tanks to 33% performance for the last 25% of VRAM? Wake up; it would have been reported by now in actual benchmark testing. Your suggested performance issue doesn't exist, thanks to clever architects.

"In the past we’ve tried to divine how NVIDIA is accomplishing this, but even with the compute capability of CUDA memory appears to be too far abstracted for us to test any specific theories. And because NVIDIA is continuing to label the internal details of their memory bus a competitive advantage, they’re unwilling to share the details of its operation with us. Thus we’re largely dealing with a black box here, one where poking and prodding doesn’t produce much in the way of meaningful results."
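For what it's worth, here's a toy picture of what interleaving across unequal channels could look like. This is purely hypothetical (per the quote above, NVIDIA hasn't disclosed the real scheme); it only shows why the last chunk of VRAM can end up served by a single channel:

```python
# Toy illustration of unequal-channel interleaving; NVIDIA treats the real
# scheme as a trade secret, so this is NOT how the 660 actually stripes memory.

STRIPE = 256  # hypothetical stripe size in bytes

def channel_for(addr: int, capacities=(1024, 512, 512)) -> int:
    """Map a byte address to one of three channels with unequal capacities (MB).

    Addresses stripe across all three channels until the smaller channels
    fill up; everything beyond that lands on the large channel alone.
    """
    balanced_bytes = 3 * min(capacities) * 2**20  # region all 3 channels serve
    if addr < balanced_bytes:
        return (addr // STRIPE) % 3               # full-speed, 3-way striped
    return 0                                      # overflow: big channel only

# Addresses in the balanced region rotate across channels 0, 1, 2, ...
print([channel_for(i * STRIPE) for i in range(4)])  # [0, 1, 2, 0]
# ...while anything past the balanced region hits channel 0 alone.
print(channel_for(3 * 512 * 2**20))                 # 0
```

In this toy model the overflow region has one third of the aggregate width, which is where the "48GB/s on the last 512MB" claim comes from; whether the real hardware behaves this naively is exactly what's in dispute above.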
 
Because SLI would require a new PSU, which would in turn play hob with my build budget. I can justify a single GPU at up to $200USD; however, two (and a new PSU) won't fly.
Fair enough. I misread your first post when you said between two cards, thinking you wanted two cards :eek: I assumed (wrongly?) that your PSU also had two 6-pin PEG connectors, which with your rig would power two 660s.
Just the same, though, if it does have two 6-pins, maybe a 660 Ti? I've seen a few with rebate under $200. A good deal more power and a lot more overclocking headroom vs. a vanilla 660.
Keep us posted!
If you want to "please, just stop" me, then stop me with BASES! With proof and with sources, just as I can stop you with this fast, reliable source as proof (if you want more, just tell me; I have a lot of them). If you don't know about GPU architecture, please do not talk. I was talking with Cannondale. Why? Because he knows what he's talking about. You? You don't know.
Thanks ;)
Xorbe & Cannondale were saying essentially the same thing. You are correct with your numbers about the mixed memory, but wrong about the performance loss that accompanies the architecture.
Not trying to be rude, but you may need to work on your English language comprehension Araxie as well. Cheers!
 
Fair enough. I misread your first post when you said between two cards, thinking you wanted two cards :eek: I assumed (wrongly?) that your PSU also had two 6-pin PEG connectors, which with your rig would power two 660s.
Just the same, though, if it does have two 6-pins, maybe a 660 Ti? I've seen a few with rebate under $200. A good deal more power and a lot more overclocking headroom vs. a vanilla 660.
Keep us posted!

Xorbe & Cannondale were saying essentially the same thing. You are correct with your numbers about the mixed memory, but wrong about the performance loss that accompanies the architecture.
Not trying to be rude, but you may need to work on your English language comprehension Araxie as well. Cheers!

Nyet - just the one six-pin PSU plug (Diablotek 600EL - which is barely suitable for a single GPU with a six-pin PSU plug). If I manage to get a decent 750W PSU at a decent price (Corsair AX750 or PCP&C Silencer 750) then - and only then - could I safely reconsider. However, that is down the road yet.
 