GTX 670 or 7950

COD4 was a console game made for console kids. It was ported to PC, and they then REHASHED the same shit over and over again every single year...

United Offensive was the last release of the "good COD's". The "good COD's" are COD 1, COD 2 and United Offensive, and you can stop there. COD 4 was built on a shitty, watered-down Q3 engine and sucked ass. The single player campaign was horrible, and so was the entire game when rounded up.

And I don't have good reflexes? Right, which should be why I play fuckin Painkiller MP, QuakeWorld and Adrenaline Gamer...

do you even have any idea what AG is?

oh and lol at that vid. Fuck that shit. If you were THAT good, you'd be watching THIS video instead:

http://www.youtube.com/watch?v=kw8ufsXHLw8
Sorry if I jumped to conclusions about you; you just sounded like the typical COD hater who lacks the skills to get good at it or plays in the wrong servers.

I know all about Painkiller, Quake and AG, just never got into them; I got addicted to Novalogic games like Delta Force instead. I really enjoy watching Quake vids, and a few of the older guys in my squad are hardcore Quake players. Some of these guys have crazy skills. These are my favorites:
https://www.youtube.com/watch?v=YiX9d_j1Xao
https://www.youtube.com/watch?v=RxVmWaprNDY * good HD quality
https://www.youtube.com/watch?v=cRV0tclxXMQ
https://www.youtube.com/watch?v=fDz0deTBUQM

Oh and 670>7950 :)
 
If you want the better price for performance, get the 7950.

If you want adaptive vsync and physx because you feel the added cost is worth those over the 7950, get the 670.
 
If you want adaptive vsync and physx because you feel the added cost is worth those over the 7950, get the 670.

PhysX is really a moot point with a single card as it bogs down your performance and causes massive, instant slowdowns.

It's only useful when you have a dedicated GPU to render PhysX calculations only.

Even then, there's little to no use for a dedicated PhysX card. Only a handful of games use PhysX. The amount of PhysX games is so low there's absolutely no point in bothering with it.

PhysX isn't getting much support anymore and is going to die once the next gen of consoles comes out and devs STOP making Unreal 3 Engine games. At this rate it's nothing more than a cheap gimmick, like 3D gaming.
 
PhysX is an absolutely moot point with a single card as it bogs down your performance and causes massive, instant slowdowns.

It's only useful when you have a dedicated GPU to render PhysX calculations only.
that is simply not true. only in BL 2 is the physx really demanding, and even then I stay pegged at 60 fps for 99.9% of the game. I even went and tested the most demanding parts when I got my gtx660ti to see how it compared to the gtx560, and the lowest I saw was in the 50s and 40 fps, and that only happened for a split second on a few occasions. it was not even really noticeable when using adaptive vsync if I was not looking at the fps indicator. in all other games physx is not even an issue at all on a gtx570, never mind a gtx670.
 
that is simply not true. only in BL 2 is the physx really demanding, and even then I stay pegged at 60 fps for 99.9% of the game. in all other games physx is not even an issue at all on a gtx570, never mind a gtx670.

If you crank up everything to the max, including nVidia CP and nHancer settings, then even a 670 won't be able to max it all out without slowing down.

I'm speaking for myself, as that's my way of doing it.
 
If you crank up everything to the max, including nVidia CP and nHancer settings, then even a 670 won't be able to max it all out without slowing down.

I'm speaking for myself, as that's my way of doing it.
again, your comment about needing a dedicated card in games with physx was wrong.
 
again, your comment about needing a dedicated card in games with physx was wrong.

It's not. A 480 completely gets bogged down in Mafia 2 and adding a dedicated card helps a lot, especially at raising the min. FPS.

You don't NEED one, for sure. But a dedicated card helps so much that there's almost no point in running PhysX + GFX on a single card.
 
It's not. A 480 completely gets bogged down in Mafia 2 and adding a dedicated card helps a lot, especially at raising the min. FPS.
I tested Mafia 2 and was going to include it in my gtx570 to gtx670 comparison but did not bother because it NEVER went below 60 fps, even on the gtx570. that was in the actual game on all max settings at 1920x1080. there are a few unoptimized spots that have to do with clothing using too much physx, but even those areas were not an issue, plus they can be edited in the config. AA in that game is probably more demanding than physx and is the bigger framerate killer.

same goes for the other games with physx out there. using FXAA or 2x MSAA in Batman AC, the only time the fps drops is because of the game hitching from loading new areas, especially in DX11. Batman AC is the second most physx-demanding game out there behind BL 2.
 
I tested Mafia 2 and was going to include it in my gtx570 to gtx670 comparison but did not bother because it NEVER went below 60 fps, even on the gtx570. that was in the actual game on all max settings at 1920x1080. there are a few unoptimized spots that have to do with clothing using too much physx, but even those areas were not an issue, plus they can be edited in the config. AA in that game is probably more demanding than physx and is the bigger framerate killer.

Yeah, I was referring to the Mafia bench.

AA isn't a bigger framerate killer than PhysX though, as Mafia 2 uses APEX PhysX instructions which are pretty damn heavy. Yes, AA is a giant framekiller in Mafia, but not as much as PhysX, no.

If you check this out, you can see there're some pretty GIANT gains to be had from a dedicated PhysX card, even in Batman: AA:

[Image: Batman Stacked Results 16xQ.png]
 
Yeah, I was referring to the Mafia bench.

AA isn't a bigger framerate killer than PhysX though, as Mafia 2 uses APEX PhysX instructions which are pretty damn heavy. Yes, AA is a giant framekiller in Mafia, but not as much as PhysX, no.

If you check this out, you can see there're some pretty GIANT gains to be had from a dedicated PhysX card, even in Batman: AA:

http://p0wifg.bay.livefilestore.com...FcphZmM0QV1FQ/Batman Stacked Results 16xQ.png


again, what you said about physx games needing a dedicated gpu or they would bog down is wrong. BL 2 is the most demanding physx game out there and the only one where a dedicated physx card would be needed to keep the framerate above 60 fps. even with a single gtx660ti though, it plays just fine, as I already mentioned.

and here I just tested Mafia 2 for a LOOONG time on max settings and high physx at 1080 with my modest gtx660ti. not once did it bog down, and if it had then I would get rid of the crappy AA cause it does nothing in this game anyway.

Frames, Time (ms), Min, Max, Avg
84843, 828724, 66, 151, 102.378
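
For anyone who wants to sanity-check that summary line: FRAPS reports elapsed time in milliseconds, and the average is just total frames divided by total time. A quick check in Python using the numbers above:

```python
# Sanity check on the FRAPS summary above: avg FPS = frames / seconds.
frames = 84843
time_ms = 828724  # FRAPS reports elapsed time in milliseconds

avg_fps = frames / (time_ms / 1000.0)
print(f"run length: {time_ms / 60000.0:.1f} minutes")  # ~13.8 minutes
print(f"average fps: {avg_fps:.3f}")                   # ~102.378, matching the log
```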
 
again, what you said about physx games needing a dedicated gpu or they would bog down is wrong.

here I just tested Mafia 2 for a LOOONG time on max settings and high physx at 1080 with my modest gtx660ti.

Frames, Time (ms), Min, Max, Avg
84843, 828724, 66, 151, 102.378

Nowhere did I say PhysX games "NEED" a dedicated card, but they DO bog down. Look at those results I posted and analyze the chart. Even 260 Tri-SLi (which is easily faster than a 570) has some pretty big, significant gains from a dedicated card.

As for your test of Mafia, as you also said before, there're PhysX-heavy areas IN-GAME that, no matter which GPU you run it with, cause slowdowns unless you have a dedicated PhysX card.
 
Nowhere did I say PhysX games "NEED" a dedicated card, but they DO bog down. Look at those results I posted and analyze the chart. Even 260 Tri-SLi (which is easily faster than a 570) has some pretty big, significant gains from a dedicated card.

As for your test of Mafia, as you also said before, there're PhysX-heavy areas IN-GAME that, no matter which GPU you run it with, cause slowdowns unless you have a dedicated PhysX card.
okay, I will make it simple: the only spots that bog down in any physx game with a gtx670 will bog down no matter what.
 
If you crank up everything to the max including nVidia CP and nHancer settings, then even a 670 won't be able to max it out all without slowing down.

I'm thinking of myself as that's my way of doing it.

So, what you're really trying to say is that you are an AMD fanboi and everyone should stay away from nVidia since they offer more usable features with their products, so fuck them? :p

If maxing out everything to the extreme were the sole intent, why would a single 7950 or 670 even be under consideration here? A 670 with a high FXAA setting and maxed PhysX settings would work just fine for 99% of games while pleasing a large percentage of people, imo... but that does come with a performance hit and a higher price tag.

If the added (and usable) features of adaptive vsync and physx don't matter to you, that doesn't make them moot or worthless features to everyone. It's up to each and every one of us to gauge the value of the features and determine if the added price of (for the subject of this thread) the 670 is worth it over the 7950.

If not, then the 7950 is the clear winner. It's that simple. Both products are outstanding performers. Don't argue that point with extreme examples to slightly discredit one and idolize the other.
 
okay, I will make it simple: the only spots that bog down in any physx game with a gtx670 will bog down no matter what.

No, it won't. If you have a 670 with a heavily OC'ed 680 for PhysX (in theory), then you won't get bogged down.

What bogs down is the graphical + PhysX rendering at the same time. Most of the time it's the graphics that require all the GFX power available, but when HEAVY PhysX gets in action, the card can't render both concurrently at high FPS; therefore, it slows down and significantly lags.

PhysX is such shittily coded, cookie-cutter, bogus tech. It was never implemented in enough games to be worthwhile.

So, what you're really trying to say is that you are an AMD fanboi and everyone should stay away from nVidia since they offer more usable features with their products, so fuck them? :p.

Yeah, which is why I got a 470 Quad setup and about 2000 posts over eVGA...
 
No, it won't. If you have a 670 with a heavily OC'ed 680 for PhysX (in theory), then you won't get bogged down.

What bogs down is the graphical + PhysX rendering at the same time. Most of the time it's the graphics that require all the GFX power available, but when HEAVY PhysX gets in action, the card can't render both concurrently at high FPS; therefore, it slows down and significantly lags.

PhysX is such shittily coded, cookie-cutter tech.
sorry, but I have spent way more time testing this than you have. I have tested the crap out of physx across multiple cards and setups and know first hand what to expect in game. if it bogs down with a gtx670 then it is just some unoptimized spot, much like GTA 4 bogging down in spots no matter what cpu is used. yes, a dedicated card could help out in those rare spots, but is that worth it at all? and again, many of those spots will still bog down no matter what anyway.
 
sorry, but I have spent way more time testing this than you have. I have tested the crap out of physx across multiple cards and setups and know first hand what to expect in game. if it bogs down with a gtx670 then it is just some unoptimized spot, much like GTA 4 bogging down in spots no matter what cpu is used. yes, a dedicated card could help out in those rare spots, but is that worth it at all? and again, many of those spots will still bog down no matter what anyway.

I've also tested various configs. 285's, 470's, a 570 and even an AGEIA PPU with an AMD card, since 2009.

How the hell do you even know how much I tested? Yeah, you def. know better than I do buddy!

The CPU of course doesn't mean shit. It's the GPU, and it has always been ALL about the GPU, as we're talking about HW PhysX acceleration on nVidia cards.

A dedicated card helps in those spots A LOT. But yes, you're correct. There're some shittily coded spots that slow down no matter what card you have.

I think the point we both agree on here is that PhysX is NOT worth bothering with because so few games use it. :)
 
Yeah, which is why I got a 470 Quad setup and about 2000 posts over eVGA...

Sarcasm, my friend, sarcasm...

How does your quad 470 stack up against a dual 670, out of curiosity? I've never seen such a comparison.
 
I've also tested various configs. 285's, 470's, a 570 and even an AGEIA PPU with an AMD card, since 2009.

How the hell do you even know how much I tested? Yeah, you def. know better than I do buddy!

The CPU of course doesn't mean shit. It's the GPU, and it has always been ALL about the GPU, as we're talking about HW PhysX acceleration on nVidia cards.

A dedicated card helps in those spots A LOT. But yes, you're correct. There're some shittily coded spots that slow down no matter what card you have.

I think the point we both agree on here is that PhysX is NOT worth bothering with because so few games use it. :)
the cpu most certainly matters, and that just goes to show how much you know on the subject. using hardware physx puts a bigger load on the whole system, not just the gpu. yes, the gpu is the main focus, but I did scaling tests at different cpu speeds with my core 2 back when Batman AA came out, and yes, the cpu matters more when hardware physx is used. there was an article about it mattering with Mafia 2 also, so I will see if I can find that, but then I am done here.

the last thing I am going to do is argue any more about this. I know what many hours and hours of testing have shown, and I am not going to repeat all of that. I am not bogging down at all, so no problems on my end anyway.
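
For reference, here's a minimal sketch of the kind of CPU-scaling test being described: lock the graphics settings, rerun the same benchmark at several CPU clocks, and see where FPS stops scaling. The clock/FPS pairs below are made up purely for illustration:

```python
# Sketch of a CPU-scaling test: same benchmark, same GPU settings,
# only the CPU clock changes. All numbers here are hypothetical.
runs = [
    (2.0, 41.0),  # (CPU clock in GHz, measured average FPS)
    (2.6, 52.5),
    (3.2, 58.0),
    (3.8, 59.5),  # FPS flattens out -> GPU-bound, CPU no longer the limit
]

base_clock, base_fps = runs[0]
for clock, fps in runs[1:]:
    clock_gain = (clock / base_clock - 1) * 100
    fps_gain = (fps / base_fps - 1) * 100
    # If FPS gains trail the clock gains badly, the CPU isn't the bottleneck.
    print(f"{clock:.1f} GHz: +{clock_gain:.0f}% clock -> +{fps_gain:.0f}% FPS")
```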
 
Sarcasm, my friend, sarcasm...

How does your quad 470 stack up against a dual 670, out of curiosity? I've never seen such a comparison.


They (unfortunately) don't Quad. I'm running the fourth one for PhysX.

Triple 470 is more or less about as fast as 580 SLi.

the cpu most certainly matters, and that just goes to show how much you know on the subject. using hardware physx puts a bigger load on the whole system, not just the gpu. yes, the gpu is the main focus, but I did scaling tests at different cpu speeds with my core 2 back when Batman AA came out, and yes, the cpu matters more when hardware physx is used. there was an article about it mattering with Mafia 2 also, so I will see if I can find that, but then I am done here.

the last thing I am going to do is argue any more about this. I know what many hours and hours of testing have shown, and I am not going to repeat all of that. I am not bogging down at all, so no problems on my end anyway.

I'm not saying you bog down or anything. I'm saying PhysX is NOT a worthwhile tech since only a handful of games use it. It's nothing more than a cheap gimmick at this rate, and it's going to turn into completely dead tech once the next gen of consoles comes out and the Unreal 3 Engine (which was originally made for consoles) (finally) dies.

The CPU doesn't matter as much as it USED to back in the Core 2 Duo days. With modern quads, there's little to NO difference from an old i7 to an Ivy chip.

Those Core 2 Duo's sucked shit in Unreal 3 Engine games because they kept running out of processing power.

Upgrading my E8600 to a QX9650 took off ALL the slowdowns I had in the bonus battles of Batman: AA.

Unreal 3 Engine is not a CPU-heavy engine like the Dunia of Far Cry 2. I DOUBT there's a SIGNIFICANT difference from a Phenom 2 to Sandy Bridge on the Unreal Engine...
 
They (unfortunately) don't Quad. I'm running the fourth one for PhysX.

Triple 470 is more or less about as fast as 580 SLi.



I'm not saying you bog down or anything. I'm saying PhysX is NOT a worthwhile tech since only a handful of games use it. It's nothing more than a cheap gimmick at this rate, and it's going to turn into completely dead tech once the next gen of consoles comes out and the Unreal 3 Engine (which was originally made for consoles) (finally) dies.

The CPU doesn't matter as much as it USED to back in the Core 2 Duo days. With modern quads, there's little to NO difference from an old i7 to an Ivy chip.

Those Core 2 Duo's sucked shit in Unreal 3 Engine games because they kept running out of processing power.

Upgrading my E8600 to a QX9650 took off ALL the slowdowns I had in the bonus battles of Batman: AA.

Unreal 3 Engine is not a CPU-heavy engine like the Dunia of Far Cry 2. I DOUBT there's a SIGNIFICANT difference from a Phenom 2 to Sandy Bridge on the Unreal Engine...
well, you are sort of changing your argument now. only the user can decide if physx is worth it or not. the interactive smoke/fog in the batman games was nice. the effects in BL 2 are really noticeable and add much more to the scene when things are going bonkers. I do agree that most of it is gimmicky though.

when I tested with my Core 2 it was still a fairly high end chip. sure, cpus have gotten much better, but physx is also way more demanding than it was back then. they have made physx more cpu friendly though, as you can actually run BL 2 with high physx on a good quad like the 2500k and only a few parts of the game will slow way down. try running high physx in Batman AA though and it's even slower overall, even though there are way fewer effects.

of course, to be clear, what I was referring to earlier was the cpu still being important when running hardware level effects on the gpu as intended. and depending on the exact game and settings, Unreal Engine 3 can most certainly be pretty cpu heavy.

and I don't know why you think Core 2 duos sucked in Unreal Engine 3 games, because that is the first I have ever heard of that.
 
well, you are sort of changing your argument now. only the user can decide if physx is worth it or not. the interactive smoke/fog in the batman games was nice. the effects in BL 2 are really noticeable and add much more to the scene when things are going bonkers. I do agree that most of it is gimmicky though.

when I tested with my Core 2 it was still a fairly high end chip. sure, cpus have gotten much better, but physx is also way more demanding than it was back then. they have made physx more cpu friendly though, as you can actually run BL 2 with high physx on a good quad like the 2500k and only a few parts of the game will slow way down. try running high physx in Batman AA though and it's even slower overall, even though there are way fewer effects.

of course, to be clear, what I was referring to earlier was the cpu still being important when running hardware level effects on the gpu as intended. and depending on the exact game and settings, Unreal Engine 3 can most certainly be pretty cpu heavy.

and I don't know why you think Core 2 duos sucked in Unreal Engine 3 games, because that is the first I have ever heard of that.

I'm not. Ever since the beginning of the PhysX discussion, I've been saying that PhysX is NOT worth it because so FEW games use it. You should not base your purchase on PhysX, period. If the 7950 OC is faster than the 670 OC, then one should NOT care about PhysX and get the 7950. That's the point.

I had the highest end C2D, an E8600 at 4.4, so I know what I'm talking about.

UE 3 can of course be CPU heavy, but most of the time it is NOT. Nowhere near as CPU heavy as the Source Engine, Dunia or EGO.

Core 2 Duo's sucked because they offered sub-par perf. in comparison to quads, since the Unreal 3 Engine scaled extremely well over multiple cores (it was written for the multi-core PS3 and ShitBox CPU's...).

The bonus levels of Arkham sucked shit on my 4.4 E8600. The 3.6 QX9650 took away all the slowdowns.

Dual cores, no matter how fast they are, just don't cut it anymore. They haven't since 2010. BFBC2 came out in early 2010 and completely ENDED the dual-core era.
 
I'm not. Ever since the beginning of the PhysX discussion, I've been saying that PhysX is NOT worth it because so FEW games use it. You should not base your purchase on PhysX, period. If the 7950 OC is faster than the 670 OC, then one should NOT care about PhysX and get the 7950. That's the point.

I had the highest end C2D, an E8600 at 4.4, so I know what I'm talking about.

UE 3 can of course be CPU heavy, but most of the time it is NOT. Nowhere near as CPU heavy as the Source Engine, Dunia or EGO.

Core 2 Duo's sucked because they offered sub-par perf. in comparison to quads, since the Unreal 3 Engine scaled extremely well over multiple cores (it was written for the multi-core PS3 and ShitBox CPU's...).

The bonus levels of Arkham sucked shit on my 4.4 E8600. The 3.6 QX9650 took away all the slowdowns.

Dual cores, no matter how fast they are, just don't cut it anymore. They haven't since 2010. BFBC2 came out in early 2010 and completely ENDED the dual-core era.
sorry, but my E8500 at 3.8 never once caused any slowdowns in any UE 3 games. in my testing, I had to drop to 2.0 for there to be a cpu limitation in Batman AA, so you had some other issue there. my gtx260 would have been the limitation in any UE 3 game even if the engine did scale well with 4 cores. heck, I can't even think of a UE 3 game back then where either of those was really a limitation.
 
sorry, but my E8500 at 3.8 never once caused any slowdowns in any UE 3 games. in my testing, I had to drop to 2.0 for there to be a cpu limitation in Batman AA, so you had some other issue there. my gtx260 would have been the limitation in any UE 3 game even if the engine did scale well with 4 cores. heck, I can't even think of a UE 3 game back then where either of those was really a limitation.

That's your issue there. I had 285 SLi with E8600 and getting a quad was much needed and broke Alice free.
 
That's your issue there. I had 285 SLi with E8600 and getting a quad was much needed and broke Alice free.
well, running SLI of course puts more stress on a cpu, and Alice is a much newer, much more demanding game than Batman AA and the other games I was referring to. Core 2 duo may struggle now compared to newer cpus, but it most certainly did not back then.
 
well, running SLI of course puts more stress on a cpu, and Alice is a much newer, much more demanding game than Batman AA and the other games I was referring to. Core 2 duo may struggle now compared to newer cpus, but it most certainly did not back then.

Well, for me, with 285 SLi, it did. There were tests done back in the day showing MASSIVE gains with an i7 over a Core 2 Duo, even with mid-end, single GPU setups.

I'd say the release of the i7 is by far one of the biggest technology advancements on the enthusiast front. It completely removed the CPU bottleneck from the equation.

I've had MANY CPU slowdowns; that E8600 bottlenecked a setup like 285 SLi big time. Slowdowns in Batman's bonus battles, in GRID or DIRT 2's vehicle-heavy areas and hell, EVEN in F.E.A.R, which is 100% a GPU intensive, old game that doesn't give a shit about the CPU.

By "breaking Alice free", I was making a reference to Alice in Chains actually. Not the game Alice. :D :p
 
Well, for me, with 285 SLi, it did. There were tests done back in the day showing MASSIVE gains with an i7 over a Core 2 Duo, even with mid-end, single GPU setups.

I'd say the release of the i7 is by far one of the biggest technology advancements on the enthusiast front. It completely removed the CPU bottleneck from the equation.

I've had MANY CPU slowdowns; that E8600 bottlenecked a setup like 285 SLi big time. Slowdowns in Batman's bonus battles, in GRID or DIRT 2's vehicle-heavy areas and hell, EVEN in F.E.A.R, which is 100% a GPU intensive, old game that doesn't give a shit about the CPU.

By "breaking Alice free", I was making a reference to Alice in Chains actually. Not the game Alice. :D :p
I played FEAR on a stock 5000 X2 and even a single core 3200 A64 and never had any slowdowns because of the cpu. you must have had some sli overhead issue if you think your E8600 at 4.4 was the cause of slowdowns. heck, even one core of that chip should have kept FEAR way over 60 fps.

and you can see here that even wimpy single cores from 10 years ago had no issue with FEAR. http://www.firingsquad.com/hardware/fear_cpu_performance/page5.asp
 
I played FEAR on a stock 5000 X2 and even a single core 3200 A64 and never had any slowdowns because of the cpu. you must have had some sli overhead issue if you think your E8600 at 4.4 was the cause of slowdowns. heck, even one core of that chip should have kept FEAR way over 60 fps.

and you can see here that even wimpy single cores from 10 years ago had no issue with FEAR. http://www.firingsquad.com/hardware/fear_cpu_performance/page5.asp

Having "no" issue isn't the same as running the game with everything maxed out at 2048x1536 and with all nHancer settings turned on.

The Wolfdale chip HELD back my 285's performance. When a lot of shit exploded in MP, I had slowdowns solely due to the CPU and NOT the GPU.

A dual core is a NO GO with such a setup. 285 SLi is only a few frames behind a 580, and just as fast as, if not faster than, a stock 580 once OC'ed. If you know this, then you'd also know that pairing a 580 up with a dual core is a BIG FAT NO-GO.
 
so again, it was your setup, not the cpu itself. a Core 2 duo at 4.4 will laugh at those games and would never be a noticeable limitation. also, again, even one core of that cpu would never cause a slowdown in FEAR unless there was another issue. please stop saying that Core 2 duos sucked back then, because that is flat out not true.
 
so again, it was your setup, not the cpu itself. a Core 2 duo at 4.4 will laugh at those games and would never be a noticeable limitation. also, again, even one core of that cpu would never cause a slowdown in FEAR unless there was another issue. please stop saying that Core 2 duos sucked back then, because that is flat out not true.

It HAS been a noticeable limitation because I run a CRT at high Hz, and when I got rid of that skinny chip and replaced it with a beastly QX9650, ALL my slowdowns went away.

Core 2 Duo's didn't flat out "suck". They "sucked" for higher-end setups like mine, or when compared to quads in games that scale well over multiple cores.

Look at this:

http://www./charts/index.php?pid=63,70&tid=5

Frankly, to say that an E8600 is NOT a limitation for a 580 is flat out ignorant and silly.
 
COD4 was a console game made for console kids. It was ported to PC, and they then REHASHED the same shit over and over again every single year...

United Offensive was the last release of the "good COD's". The "good COD's" are COD 1, COD 2 and United Offensive, and you can stop there. COD 4 was built on a shitty, watered-down Q3 engine and sucked ass. The single player campaign was horrible, and so was the entire game when rounded up.

And I don't have good reflexes? Right, which should be why I play fuckin Painkiller MP, QuakeWorld and Adrenaline Gamer...

do you even have any idea what AG is?

oh and lol at that vid. Fuck that shit. If you were THAT good, you'd be watching THIS video instead:

http://www.youtube.com/watch?v=kw8ufsXHLw8

Hipster COD players "Before the game went mainstream.. and got ruined" :rolleyes:

Right. COD4 was garbage... sure. I would agree that everything after MW2 however is pretty much garbage.
 
Roughly 2 fps difference in that game. Looks like a toss up in some games when you consider overclocking.

If you look at how the overall shape of the RED cross goes, it stays at 90 FPS while the blue cross goes as low as 40 FPS. The consistency of the 7950's performance is superior to that of the 670's; the red cross is almost always significantly above the blue one, sometimes TWICE as high, which means the 7950 offers significantly "snappier" performance.
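
Min and average alone hide that kind of consistency difference. With a per-frame time log (FRAPS can dump one), you can compare it directly; a rough sketch, with made-up example data:

```python
# Sketch: compare "snappiness" beyond min/avg by looking at the worst
# percentile of frames, which is where stutter actually lives.
def consistency_stats(frame_times_ms):
    fps = sorted(1000.0 / t for t in frame_times_ms)  # per-frame FPS, ascending
    n = len(fps)
    return {
        "avg fps": round(sum(fps) / n, 1),
        "1% low fps": round(fps[max(0, int(n * 0.01) - 1)], 1),  # worst 1st percentile frame
        "min fps": round(fps[0], 1),
    }

# Hypothetical card holding ~90 FPS with occasional dips to ~40 FPS:
frame_times = [11.1] * 95 + [25.0] * 5  # milliseconds per frame
print(consistency_stats(frame_times))
```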

Hipster COD players "Before the game went mainstream.. and got ruined" :rolleyes:

Right. COD4 was garbage... sure. I would agree that everything after MW2 however is pretty much garbage.

Indeed. The game DID go mainstream and got fucked up.

Everything "after MW2" is more or less the same as MW1. They released the same game like what, 10 times now?
 
It HAS been a noticeable limitation because I run a CRT at high Hz, and when I got rid of that skinny chip and replaced it with a beastly QX9650, ALL my slowdowns went away.

Core 2 Duo's didn't flat out "suck". They "sucked" for higher-end setups like mine, or when compared to quads in games that scale well over multiple cores.

Look at this:

http://www./charts/index.php?pid=63,70&tid=5

Frankly, to say that an E8600 is NOT a limitation for a 580 is flat out ignorant and silly.
again, you keep changing your arguments up a bit as you go. of course a gtx580 would be held back by a Core 2 duo, and I never said it would not. if you actually knew me then you would know I am the first person to mention cpu limitations around here. AGAIN though, a Core 2 duo would NOT be causing noticeable slowdowns in really old games, especially something like FEAR when even a Pentium 4 could run it at 60 fps. your issues were on your end, and you had slowdowns that people with much less than half the processing and gpu power did not incur.
 
If you look at how the overall shape of the RED cross goes, it stays at 90 FPS while the blue cross goes as low as 40 FPS. The consistency of the 7950's performance is superior to that of the 670's; the red cross is almost always significantly above the blue one, sometimes TWICE as high, which means the 7950 offers significantly "snappier" performance.

And yet the red spikes are the ones that most often sink the farthest down. I agree the 7950 is a great performing card, but can anyone point out an obvious HUGE difference?

It doesn't appear that buying either card is a mistake at this point.
 
And yet the red spikes are the ones that most often sink the farthest down. I agree the 7950 is a great performing card, but can anyone point out an obvious HUGE difference?

It doesn't appear that buying either card is a mistake at this point.


Not a huge difference, fo sho. It's a coin flip.

And yes, you're right. Both are great cards; however, the 7950 is slightly faster once OC'ed, and cheaper to boot.

again, you keep changing your arguments up a bit as you go. of course a gtx580 would be held back by a Core 2 duo, and I never said it would not. AGAIN, a Core 2 duo would NOT be causing noticeable slowdowns in really old games, especially something like FEAR when even a Pentium 4 could run it at 60 fps. your issues were on your end, and you had slowdowns that people with much less than half the processing and gpu power did not incur.

My point was that I was running extreme, over-the-top settings and my 285 SLi setup was being HELD DOWN even in F.E.A.R.

Even my F.E.A.R slowdowns went away when the E8600 was replaced with the QX9650.

Then again, I was running a CRT at 85 Hz, so I required 85 FPS for consistent gameplay and NOT 60 FPS.
 
Roughly 2 fps difference in that game. Looks like a toss up in some games when you consider overclocking.

Don't forget to take into account that the 7950 in that chart is using a modest 1050/1250 overclock due to the lack of a voltage boost. Once you overvolt the 7950, expect another 125-200 MHz on the core and 300-500 on the VRAM. The GTX 670 and GTX 680 don't have that luxury.
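
For rough context on what that extra headroom is worth (a ballpark sketch, assuming performance scales close to linearly with core clock, which is optimistic):

```python
# Ballpark gain from the extra voltage-boosted headroom over the charted OC.
charted_oc = 1050                 # 7950 core clock used in the chart (MHz)
for extra in (125, 200):          # extra core headroom from overvolting (MHz)
    new_clock = charted_oc + extra
    gain = (new_clock / charted_oc - 1) * 100
    print(f"{new_clock} MHz core: roughly +{gain:.0f}% over the charted OC")
```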
 
Don't forget to take into account that the 7950 in that chart is using a modest 1050/1250 overclock due to the lack of a voltage boost. Once you overvolt the 7950, expect another 125-200 MHz on the core and 300-500 on the VRAM. The GTX 670 and GTX 680 don't have that luxury.

Yeah, I noticed that as well but didn't bother to mention it, since the red spikes of the 7950 are already significantly ON TOP of the blue spikes of the 670 pretty much the entire time. It easily offers better FPS between min and average.

The 670 is only better at stock, like I said many times before here and in the other threads; once both are max OC'ed, the 670 can barely keep up.
 
gtx670's are just plain overpriced right now... selling for the same price they were last July/August is somewhat silly. I'd wait a few months for the next gen if you want to go nvidia. If you want to go AMD, the price is quite a bit more reasonable right now, especially with the free game bundle. (To get a decent card with aftermarket cooling, that is, not some stock card with the reference cooler.)
 
If you want to go AMD, the price is quite a bit more reasonable right now, especially with the free game bundle. (To get a decent card with aftermarket cooling, that is, not some stock card with the reference cooler.)

Actually, the reference 7950 with the stock cooler is an excellent card. Absolute best build quality with the Volterra VRM, though, yes, the stock cooler isn't up there with custom coolers. It never has been.

But, like I tried to say, the stock cooler on the 7950 is perfectly fine for all practical purposes, as the 7950 is an efficient card and AMD uses a rather effective vapor chamber cooler on the reference 7950's. They're pretty hard to find nowadays, though. Since the Volterra VRM on the reference card is the absolute best VRM and costs so much, they then change the card's design (do a second revision) with a cheaper VRM.
 