4850 or 4870 for me?

So basically, you're saying that an HD 4870 doesn't benefit, either. It's the same exact GPU, just underclocked.
Well, the 4870 has much more memory bandwidth to work with, so that plus the faster clocks lets it take advantage of more than 512MB a little more easily than the 4850.
 
So basically, you're saying that an HD 4870 doesn't benefit, either. It's the same exact GPU, just underclocked.

No, not at all. The 4870 has just under twice the memory bandwidth (63GB/s vs. 115GB/s). Memory bandwidth is already a slight bottleneck on the 4850 (a 4850 OC'd to 4870 clocks is still slower than the 4870 by ~10% or so, if I remember correctly). When the 4850 gets to the point that it needs more than 512MB of VRAM, it will also need to access more memory, which means it will run headfirst into that memory bandwidth bottleneck it is already flirting with. The 4870 can access 1GB of RAM in roughly the time it takes the 4850 to access 512MB. That is a *huge* difference.
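For anyone wondering where those numbers come from, here's the back-of-the-envelope math (a rough sketch in Python; the memory clocks are the reference-card specs as I remember them, so treat them as assumptions):

def bandwidth_gb_s(bus_width_bits, effective_mhz):
    # bytes moved per transfer (bus width / 8) times transfers per second
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(256, 1986))  # HD 4850: GDDR3 @ 993MHz, 1986MT/s effective -> ~63.6 GB/s
print(bandwidth_gb_s(256, 3600))  # HD 4870: GDDR5 @ 900MHz, 3600MT/s effective -> ~115.2 GB/s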
 
I dunno... I've seen that when both cards (4850, 4870) are run at the same speeds, they bench about the same. Not sure just how much that GDDR5 and the extra memory bandwidth are really worth. I'm reasonably sure that with a mild OC you could run Fallout 3 on Ultra settings with 2X AA and it would be playable on an HD 4850, as long as you put enough CPU power behind it. I guess a 939 X2 wouldn't cut it, but a Q6600 @ 3.2GHz or an E7200 @ 3.6GHz could get the most out of a single HD 4850. And it would probably use over 512 MB of VRAM @ 1680x1050 and above.

I could test it... but I don't feel like installing XP again. Regardless, there's no reason not to have over 512 MB of VRAM for a [H]ard gamer these days...
 
I dunno... I've seen that when both cards (4850, 4870) are run at the same speeds, they bench about the same. Not sure just how much that GDDR5 and the extra memory bandwidth are really worth. I'm reasonably sure that with a mild OC you could run Fallout 3 on Ultra settings with 2X AA and it would be playable on an HD 4850, as long as you put enough CPU power behind it. I guess a 939 X2 wouldn't cut it, but a Q6600 @ 3.2GHz or an E7200 @ 3.6GHz could get the most out of a single HD 4850. And it would probably use over 512 MB of VRAM @ 1680x1050 and above.

I could test it... but I don't feel like installing XP again. Regardless, there's no reason not to have over 512 MB of VRAM for a [H]ard gamer these days...

http://www.xtremesystems.org/forums/showthread.php?t=192690
Company of Heroes (page 4)
[Company of Heroes benchmark graph]


Oh, and I play Fallout 3 on Ultra with 4X AA on my 4850 at 1280x1024, no problems. 'Tis silky smooth. (X2 @ 3GHz)
 
7.7% is more than I expected...
I always thought GDDR5 was a marketing ploy.
 
Don't think it really does much since it's on a 4850. The 4870 can use the extra memory better.
 
Don't think it really does much since it's on a 4850. The 4870 can use the extra memory better.

But seeing as the resolution is only 1680x1050, would the 1GB be an improvement over the 512MB? Not all of us can afford the 1GB 4870...
 
But seeing as the resolution is only 1680x1050, would the 1GB be an improvement over the 512MB? Not all of us can afford the 1GB 4870...

1680x1050 would see little to no improvement with 1GB of memory over 512MB.
 
1680x1050 would see little to no improvement with 1GB of memory over 512MB.
Or you could be a complete moron and get the 2GB 4850. lol :D http://www.newegg.com/Product/Product.aspx?Item=N82E16814131127

Got to love the reviews for it... "Pros: This is a good card for High res because it has 2GB of MEM. I suggest this card to anyone who wants to play crysis at 1680x1050 or above at very high. It makes it smoother when you look around, it can load more into the MEM at a time."
 
So basically, you're saying that an HD 4870 doesn't benefit, either. It's the same exact GPU, just underclocked.

With roughly half the memory bandwidth as well. And actually the fastest single core CPU made by AMD was the FX-57 @ 2.8GHz...
 
So, back on topic.
The Sapphire 4850 1GB is quite cheap.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814102802
It's not much more than the 512MB version, and the extra memory should keep everything a bit smoother, don't you think?

Personally I say keep the extra $40 and get the 512MB version, but in the end it's your money. Make the call however you want. Flip a coin if you want. No one here will really be able to tell you how the two will compare in 6 months to a year, or if the 1GB will really be worth it in the end. One thing to note, though, is that Sapphire doesn't have a very good reputation. Visiontek is the best ATI brand in terms of warranty and support, afaik. My Sapphire 4850 works great and I love it, but it is something to take note of.

Fallout 3 isn't silky smooth on anybody's machine, no matter what the framerate.

Joke? Sarcasm? I promise you, on my rig and my friend's, Fallout 3 is silky smooth. I don't measure FPS to determine whether it's smooth, I just play it.
 
Joke? Sarcasm? I promise you, on my rig and my friend's, Fallout 3 is silky smooth. I don't measure FPS to determine whether it's smooth, I just play it.
I'm certainly not joking, and it's pretty well known that Fallout 3 has little hitches and stutters in it, just like Oblivion. http://www.google.com/search?q=fall...s=org.mozilla:en-US:official&client=firefox-a


Even right here on HardOCP: "Overall, we experienced higher than expected framerates. But as we've said time and time again, that did not necessarily indicate an acceptable level of playability. The biggest performance problem with Fallout 3 is the issue of stuttering or "juddering". It produces a jerky quality in the gameplay experience, where it feels as though frames are being skipped. This problem is not reflected in the framerate counter, but it definitely happens, and it has a real detrimental effect on the gameplay experience." http://hardocp.com/article.html?art=MTU4NywzLCxoZW50aHVzaWFzdA==
 
Oh good, I thought I was the only one who noticed the stutters in FO3 and Oblivion.
I never play any games with vsync, but even with vsync enabled, the game still stutters.

I don't even think vsync helps much, since the game is capped to 60fps by default.
 
Oh good, I thought I was the only one who noticed the stutters in FO3 and Oblivion.
I never play any games with vsync, but even with vsync enabled, the game still stutters.

I don't even think vsync helps much, since the game is capped to 60fps by default.
Yeah, I can't believe kllrnohj said it was "silky smooth" for him. I would bet money his game is no smoother than anybody else's. ;)
 
Fallout 3 runs fine on my 8800GTX. I don't recall any stuttering.
So you're actually saying that when you're walking outside you get absolutely no hitches or stuttering? I think you just don't notice it, because it's not like 99% of people got a bad version and you and a few others got the stutter-free one. :p
 
My whole point in this thread is this: I've taken measurements with different games to see how much VRAM they use. Some games will use more than 512 MB of VRAM @ 1680 x 1050.

What any one person does with that information, is up to them.

Also, I played many hours of Fallout 3 with a GTX 260 and it ran pretty darn well, I thought. I haven't played it much with my HD 4850 X2 (what little I did play seemed pretty smooth)... I accidentally finished it too soon. I got caught up in the main story and before I knew it, it was done... :confused: I bet I had a good 20-30 hours of side missions left. :(
 
If you don't see the stuttering then it's probably because you just don't notice it.
Not everyone is using the same pair of eyes!

It's really not as bad as HardOCP makes it seem. It definitely wasn't "detrimental" to my game experience; I put 70 hours into FO3.

I didn't even realize it was a problem until it was mentioned in this thread.
 
If you don't see the stuttering then it's probably because you just don't notice it.
Not everyone is using the same pair of eyes!

It's really not as bad as HardOCP makes it seem. It definitely wasn't "detrimental" to my game experience; I put 70 hours into FO3.

I didn't even realize it was a problem until it was mentioned in this thread.
It's not really a problem, but when someone calls Fallout 3 "silky smooth" like kllrnohj did, then a little reality needs to be interjected.
 
Yeah, I can't believe kllrnohj said it was "silky smooth" for him. I would bet money his game is no smoother than anybody else's. ;)

I promise you, it is smooth. I absolutely *HATE* stuttering of any kind. Both Oblivion and Fallout 3 are very smooth on my system, as they both are on my friend's system (9600GT).

So you're actually saying that when you're walking outside you get absolutely no hitches or stuttering? I think you just don't notice it, because it's not like 99% of people got a bad version and you and a few others got the stutter-free one. :p

I highly doubt that 99% of people see stuttering. I'm not saying the problem isn't real, I'm saying it isn't on my system. 24,500 Google results indicate it's a problem that affects a *small* portion of users. Games play differently on different systems. There is probably a common culprit, and you and the [H] both have the same problem, but I assure you I do not.

If you don't see the stuttering then it's probably because you just don't notice it.
Not everyone is using the same pair of eyes!

More likely it's because he didn't have the problem that is causing the stuttering, just like I didn't.
 
I promise you, it is smooth. I absolutely *HATE* stuttering of any kind. Both Oblivion and Fallout 3 are very smooth on my system, as they both are on my friend's system (9600GT).



I highly doubt that 99% of people see stuttering. I'm not saying the problem isn't real, I'm saying it isn't on my system. 24,500 Google results indicate it's a problem that affects a *small* portion of users. Games play differently on different systems. There is probably a common culprit, and you and the [H] both have the same problem, but I assure you I do not.



More likely it's because he didn't have the problem that is causing the stuttering, just like I didn't.

Well, when even HardOCP mentions it, on top of all the users with high-end systems, then it's likely there for most people. Also, I'm pretty sure not every person with hitching or stuttering has made a comment or thread about it, so the Google results don't even come close to counting all the users with this issue.
 
Well, when even Kyle mentions it, on top of all the users with high-end systems, then it's likely there for most people. Also, I'm pretty sure not every person with hitching or stuttering has made a comment or thread about it.

Kyle may be totally awesome, but that is still only one to ten rigs at most. Kyle's system(s) are still just a drop in the bucket; they aren't more important than anyone else's results (sorry Kyle ;) ). And I assume the "all of the users with high-end systems" doesn't include the 3 of us that all just said we didn't have the stuttering problem on 5 different systems?

Just because you and others have the problem doesn't mean everyone does. And remember, no one ever makes a support thread to say that they aren't having any problems. The small minority with the problem will always be the most vocal.
 
Kyle may be totally awesome, but that is still only one to ten rigs at most. Kyle's system(s) are still just a drop in the bucket; they aren't more important than anyone else's results (sorry Kyle ;) ). And I assume the "all of the users with high-end systems" doesn't include the 3 of us that all just said we didn't have the stuttering problem on 5 different systems?

Just because you and others have the problem doesn't mean everyone does.
It was actually Mark that said it, so I had already edited Kyle's name out. Anyway, I believe it's there for everyone because it's the game's engine, but whether you notice it or not is a whole other issue.
 
It was actually Mark that said it, so I had already edited Kyle's name out. Anyway, I believe it's there for everyone because it's the game's engine, but whether you notice it or not is a whole other issue.

Game engines behave differently on different systems; this has always been the case for every game ever (seeing as the game engine *is* the game ;) ). Do you know what the cause of the stuttering is? No? Then you really can't say that it is there for everyone. The only way it could be there for everyone is if it is a bug in the engine's algorithms, but I suspect the flaw lies in the combination of certain hardware or software (just like the X-Fi stuttering bug was only a problem on select nvidia chipsets).

If the bug DID exist for everyone then Bethesda would have fixed it. Bugs that occur on all hardware all the time are extremely easy to track down and fix.

Regardless, it doesn't really matter if you believe me and the others reporting that Fallout 3 is smooth, as I still get to enjoy its silky smoothness on my system :p
 
Game engines behave differently on different systems; this has always been the case for every game ever (seeing as the game engine *is* the game ;) ). Do you know what the cause of the stuttering is? No? Then you really can't say that it is there for everyone. The only way it could be there for everyone is if it is a bug in the engine's algorithms, but I suspect the flaw lies in the combination of certain hardware or software (just like the X-Fi stuttering bug was only a problem on select nvidia chipsets).

If the bug DID exist for everyone then Bethesda would have fixed it. Bugs that occur on all hardware all the time are extremely easy to track down and fix.

Regardless, it doesn't really matter if you believe me and the others reporting that Fallout 3 is smooth, as I still get to enjoy its silky smoothness on my system :p
They probably can't fix it completely, although it does appear smoother than in Oblivion. It may not really be a bug, but instead just the way the engine acts, and yes, maybe certain hardware makes it act up even more. I still say it's there on some level for everybody, but I would love to play it on your PC just to see. lol. Anyway, I wasn't trying to take this thread completely off track, so we will just agree to disagree, as they say. ;)
 
4850. OC it to 700MHz; you can't lose. With the way the economy is and new cards around the corner, get the best bang for the buck, which is the 4850, easily.
 
Gah... I've gotta correct a few people here with some comments that might not be too popular. I apologize in advance for the monster post.

No, and you originally quoted me when I was directly responding to someone about their single-core 3500+. See all the confusion you caused for nothing. :p:D

Also, you are VERY WRONG to think that dual cores don't matter, and you need a BIG reality check. Saying that a fast single core is all that's needed to run modern games is beyond silly, especially for the newest titles like Far Cry 2.

Here are a few games that really aren't very CPU intensive, but even in these you will see that the dual core helps.

In BioShock at 1600x1200 and high quality, even a slow dual core gets 50% better framerates than a fast single-core Athlon. Also, that Athlon that is getting killed is stronger than his 3500+, so you do the math.

http://www.gamespot.com/features/6177688/p-7.html

Even worse results for a single core in CoD4 at max settings. Again, the Athlon 64 that is getting killed is faster than his 3500+ CPU.
http://www.gamespot.com/features/6183967/p-5.html
Crysis at 1600x1200 and high quality: the single-core Athlon is still losing pretty badly. Don't forget that his 3500+ is slower than an Athlon 64 4000+.
http://www.gamespot.com/features/6182806/p-6.html
World in Conflict at 1600x1200 and Very High quality gets 10fps on an Athlon 64 4000+, which is less than half of the slowest Core 2's performance. Again, remember that the A64 4000+ that's getting killed is faster than his 3500+.
http://www.gamespot.com/features/6179006/p-7.html

Here are some newer, more CPU-intensive games where a single core will not even get very playable framerates, if at all.

A single-core CPU can't really even get playable framerates in Far Cry 2.
http://www.pcgameshardware.com/aid,663817/Reviews/Far_Cry_2-_GPU_and_CPU_benchmarks/?page=2
A single-core A64 4000+, which is stronger than his 3500+ CPU, can only get 17fps in Warhead.
http://www.pcgameshardware.com/aid,...ecial_and_general_info_about_the_game/?page=2
Hey, his single-core CPU will only get him SINGLE-digit fps in GTA 4. http://www.pcgameshardware.com/aid,...U_benchmark_review_with_13_processors/?page=2

Here are a few of my own results with my GTX 260. His 3500+ CPU is about equal to one of my E8500 cores downclocked to 2.0GHz, so his results would be close to these. This will give you an idea of how much better a fast dual core is compared to a relatively slow single-core CPU like his.

E8500 @ 3.16GHz stock vs. E8500 @ 2.0GHz single core (as strong or slightly stronger than his 3500+ A64)


Call of Juarez 1680x1050 all high no AA
E8500 @3.16.... 61 fps
E8500 @2.00.... 27 fps
Fallout 3 1680x1050 max settings 4x AA
E8500 @3.16.... 59 fps
E8500 @2.00.... 28 fps
Crysis 1680x1050 DX10 high no AA
E8500 @3.16.... 37.25 fps
E8500 @2.00.... 12.27 fps
Far Cry 2 1680x1050 ultra settings 2x AA
E8500 @3.16.... 52.22 fps
E8500 @2.00.... 12.06 fps
Turning down the settings will not really help in those newer games, because it's the CPU that is holding them back from getting good framerates.

This is really annoying. Please don't do it anymore.

HardForum doesn't get its benchmarks from gamespot.com. Whether you agree with [H]'s testing methodology or not is irrelevant; get your stuff from a source that knows what they're doing.

Logically this makes sense, and practically it's provable:
A dual core with Core 1 disabled at 4.0GHz will outrun a dual core at 2.0GHz in every single situation you put it in, every single time. Plenty of single-core processors still have plenty of juice to drive plenty of stuff. This forum is pretty OCD about upgrading, with the average member upgrading every year or so I'd say. Normal people don't do that. Normal people go every 3 years, and that's only if a nice PC-fluent guy doesn't come in and reformat for them (if they do, it's every 5 years).

Proof:

Fallout 3 1680x1050 8XAA 15XAF GTX 260 After 5 mins. of gameplay...




Crysis and Warhead will also go over 512 MB at that res. with higher shaders and objects.

If you're running newer games with high-ultra settings and have a GPU that can handle it, it's best to have over 512 MB of VRAM.

Unless RivaTuner is erroneously reporting vidmem usage... I can only comment on what I've tested and observed. Games run smoother with more VRAM.

An engine-inspired driver crash occurs and you blame the VRAM. I can't say more on this subject than that.

Ultra-high settings aren't a prerequisite to "running the game at 1680x1050". Yes, a 512MB on-board cache may run out of space with super-high-density textures, and he may have to turn some settings down in some games at that resolution. More to the point, raise your hand if you thought Crysis, Crysis Warhead, or Far Cry 2 were good games.

And as many know, resolution isn't the enemy of VRAM; texture density and AA levels are.

I dunno... I've seen that when both cards (4850, 4870) are run at the same speeds, they bench about the same. Not sure just how much that GDDR5 and the extra memory bandwidth are really worth. I'm reasonably sure that with a mild OC you could run Fallout 3 on Ultra settings with 2X AA and it would be playable on an HD 4850, as long as you put enough CPU power behind it. I guess a 939 X2 wouldn't cut it, but a Q6600 @ 3.2GHz or an E7200 @ 3.6GHz could get the most out of a single HD 4850. And it would probably use over 512 MB of VRAM @ 1680x1050 and above.

I could test it... but I don't feel like installing XP again. Regardless, there's no reason not to have over 512 MB of VRAM for a [H]ard gamer these days...

There's something I don't know about GDDR5, because everything I know says that that RAM isn't at any point running at 900MHz; it's running at 1800MHz SDR (3600 effective). The HD 4870's 110GB/s of memory bandwidth (256-bit @ 3600MHz effective) does come into play when you crank the AA, as has been revealed by multiple websites (some even say the HD 4870 has "free 8X AA" in many situations).

If you want to force the usefulness of the extra memory throughput to show itself, put it in a situation where it can show itself: crank the textures and increase the AA, then tell me what your benchmarks say. I'm betting you could get a 40% spread with the right setup. You'd know this if you took a look at [H]ard|OCP's tests.

Oh good, I thought I was the only one who noticed the stutters in FO3 and Oblivion.
I never play any games with vsync, but even with vsync enabled, the game still stutters.

I don't even think vsync helps much, since the game is capped to 60fps by default.

Vsync and stuttering have nothing to do with each other. V-sync is an option that forces the framebuffer to be filled before the image is displayed. With V-sync off, you may experience tearing. With it on, you shouldn't. Nothing to do with stuttering.
NOTE: when measuring the FPS of a game with V-sync off, you're not actually measuring the FPS, you're measuring PARTIAL frames per second. With V-sync off in a high-intensity area, the framebuffer will borrow pixels from the previous frame and use them, then start construction from the bottom up.

And the game isn't capped at 60FPS, your monitor is capped at 60FPS. A pixel, like the physical blue, red, and green dots on your monitor, can only change in colour (each of the three previously mentioned sub-channels changing in intensity) so fast, usually 60 times per second. LCDs are usually faster than 60, but 60 was a standard and it stuck. Video drivers know this and adjust your output signal as needed. Effectively, any frame that's created when the output is above 60FPS is tossed. If anyone's wondering, that's what all this kerfuffle about 120Hz TVs is about: it's taking your 24/30FPS source (DVD or TV signal) and upconverting it to 120FPS by taking the average colour (like a gradient) of the pixel changes between the two frames.
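As a toy illustration of the kind of averaging being described (purely a sketch; real motion-compensated interpolation in 120Hz TVs is far more involved than a straight blend), an in-between frame could be built something like this:

def midpoint_frame(frame_a, frame_b):
    # frame_a / frame_b: lists of (r, g, b) tuples for two consecutive source frames
    # returns a synthetic in-between frame by averaging each colour channel
    return [((ra + rb) // 2, (ga + gb) // 2, (ba + bb) // 2)
            for (ra, ga, ba), (rb, gb, bb) in zip(frame_a, frame_b)]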
 
FO3 has a hard cap at 60fps unless you manually edit a config file to remove it; it's not vsync related.
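(If I remember right, the cap comes from the iPresentInterval setting in Fallout.ini, same as in Oblivion; setting iPresentInterval=0 is supposed to remove it. Double-check the exact file and section before editing, though, I'm going from memory here.)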
 
And as many know, resolution isn't the enemy of VRAM; texture density and AA levels are.

If you want to force the usefulness of the extra memory throughput to show itself, put it in a situation where it can show itself: crank the textures and increase the AA, then tell me what your benchmarks say. I'm betting you could get a 40% spread with the right setup. You'd know this if you took a look at [H]ard|OCP's tests.


Quote from the [H]ardOCP link that you provided above: "We did try to crank up the AA setting even higher to 4X AA on the Sapphire Radeon HD 4870, but the game became too choppy for our tastes. We liken this more to the fact that the HD 4870 has 512 MB of RAM versus the GTX 280’s 1 GB (which was able to do 4X AA at this resolution) rather than memory bandwidth being a factor."


So what good is that extra memory bandwidth if you don't have enough VRAM to use it properly? Which is entirely what I've been saying...


Ultra-high settings aren't a prerequisite to "running the game at 1680x1050". Yes, a 512MB on-board cache may run out of space with super-high-density textures, and he may have to turn some settings down in some games at that resolution. More to the point, raise your hand if you thought Crysis, Crysis Warhead, or Far Cry 2 were good games.

*raises hand for Warhead*
 
The 4850 plays Far Cry 2 and Warhead just fine. Plus you won't have buyer's remorse after buying a GTX 280 and realizing you can't max out Warhead.
 
Proof:

Fallout 3 1680x1050 8XAA 15XAF GTX 260 After 5 mins. of gameplay...

[RivaTuner screenshot showing VRAM usage]



Crysis and Warhead will also go over 512 MB at that res. with higher shaders and objects.

If you're running newer games with high-ultra settings and have a GPU that can handle it, it's best to have over 512 MB of VRAM.

Unless RivaTuner is erroneously reporting vidmem usage... I can only comment on what I've tested and observed. Games run smoother with more VRAM.

Thanks for this information! Interesting to see that even at 1680x1050 it uses more than 512MB of RAM. That sealed it for me: I'll need more than 512MB of VRAM. Might as well get the 1GB 4850 (or, if the price is right, maybe a GTX 260 or a 1GB HD 4870 if the pricing is pretty close).
 
Gah... I've gotta correct a few people here with some comments that might not be too popular. I apologize in advance for the monster post.



This is really annoying. Please don't do it anymore.

HardForum doesn't get its benchmarks from gamespot.com. Whether you agree with [H]'s testing methodology or not is irrelevant; get your stuff from a source that knows what they're doing.

Logically this makes sense, and practically it's provable:
A dual core with Core 1 disabled at 4.0GHz will outrun a dual core at 2.0GHz in every single situation you put it in, every single time. Plenty of single-core processors still have plenty of juice to drive plenty of stuff. This forum is pretty OCD about upgrading, with the average member upgrading every year or so I'd say. Normal people don't do that. Normal people go every 3 years, and that's only if a nice PC-fluent guy doesn't come in and reformat for them (if they do, it's every 5 years).
Actually, I will get my results from wherever I want, and some of the last set of benchmarks are actually FRAPS runs that I did myself. While you're at it, why don't you also remember that [H] doesn't waste their time with slow CPUs, not to mention outdated single-core CPUs. If someone is silly enough to stick a high-end card with a 4-year-old single-core CPU, then they should be made aware of just what to expect.

The rest of your comment makes no sense, because fast single cores aren't even made anymore. There have been so many improvements to CPUs that even one core of a modern CPU at 2.0-2.2GHz is as fast as an old high-end single-core CPU ever was. Sure, one of the cores of a fast dual core may outrun two at half the speed, but so what? You really missed the entire point: a real single-core CPU from 4 years ago is not able to keep a high-end card happy in most newer games. ;)
 
Thanks for this information! Interesting to see that even at 1680x1050 it uses more than 512MB of RAM. That sealed it for me: I'll need more than 512MB of VRAM. Might as well get the 1GB 4850 (or, if the price is right, maybe a GTX 260 or a 1GB HD 4870 if the pricing is pretty close).

The resolution is largely irrelevant. At 2560x1600 the framebuffer will use ~24MB for double buffering, ~36MB for triple buffering. That is it; that is the most space the resolution itself will take up. The rest of the VRAM is used for things like textures, vertex lists, etc. That usage is independent of the resolution; it will only change with different game quality levels. So going from 1280x1024 to 2560x1600 will only increase VRAM usage by about 17MB when double-buffered - a drop in the bucket.
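Here's a rough sketch of the arithmetic behind those numbers (assuming a plain 3-bytes-per-pixel colour buffer and ignoring any depth/stencil surfaces, which is what the figures above work out to):

def framebuffer_mb(width, height, buffers, bytes_per_pixel=3):
    # width x height pixels, bytes_per_pixel each, times the number of buffers
    return width * height * bytes_per_pixel * buffers / 1e6

print(framebuffer_mb(2560, 1600, 2))  # ~24.6 MB, double buffered
print(framebuffer_mb(2560, 1600, 3))  # ~36.9 MB, triple buffered
print(framebuffer_mb(2560, 1600, 2) - framebuffer_mb(1280, 1024, 2))  # ~16.7 MB increase over 1280x1024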

I still say that the 1GB 4850 is a waste, but I hope you are happy with your purchase.

So what good is that extra memory bandwidth if you don't have enough VRAM to use it properly? Which is entirely what I've been saying...

What good is the extra VRAM if you don't have the memory bandwidth to use it? ;)

The rest of your comment makes no sense, because fast single cores aren't even made anymore. There have been so many improvements to CPUs that even one core of a modern CPU at 2.0-2.2GHz is as fast as an old high-end single-core CPU ever was. Sure, one of the cores of a fast dual core may outrun two at half the speed, but so what? You really missed the entire point: a real single-core CPU from 4 years ago is not able to keep a high-end card happy in most newer games.

You continually ignore overclocking. How many people on this forum are actually running a stock CPU? And from what I've seen, an Athlon 64 needs to be clocked about 400MHz higher than a Core 2 Duo to roughly match overall performance. So an FX-57 @ 2.8GHz (stock clock) would definitely beat one of your C2D cores at 2.2GHz.
 
You continually ignore overclocking. How many people on this forum are actually running a stock CPU? And from what I've seen, an Athlon 64 needs to be clocked about 400MHz higher than a Core 2 Duo to roughly match overall performance. So an FX-57 @ 2.8GHz (stock clock) would definitely beat one of your C2D cores at 2.2GHz.
I don't ignore that at all. In fact, an old Athlon 64 single core at 2.8GHz would be about like one of my cores at 2.0-2.2GHz. So what, though? You already have the results right there in front of you that show just how shitty having just one modern core at 2.0GHz would be. :confused:
 
I don't ignore that at all. In fact, an old Athlon 64 single core at 2.8GHz would be about like one of my cores at 2.0-2.2GHz. So what, though? You already have the results right there in front of you that show just how shitty having just one modern core at 2.0GHz would be. :confused:

No, it would likely be one of your C2D cores at 2.4GHz, and if it's OC'd to 3-3.2GHz, then it would be one of your C2D cores at 2.6-2.8GHz. So I don't care how shitty a 2.0GHz C2D runs it; how does a single 2.8GHz C2D core run it? That would be far more representative of what he could hit with a single core.
 
No, it would likely be one of your C2D cores at 2.4GHz, and if it's OC'd to 3-3.2GHz, then it would be one of your C2D cores at 2.6-2.8GHz. So I don't care how shitty a 2.0GHz C2D runs it; how does a single 2.8GHz C2D core run it? That would be far more representative of what he could hit with a single core.
No, that's not true. If you look at most benchmarks, a 2.13GHz Core 2 Duo will easily beat a 2.6GHz Athlon 5000+ X2. So if you take one core of each, the results would be the same. Hell, in some games even the 1.86GHz Core 2 Duo beat the 2.6GHz Athlon 5000+ X2, and that was the old Core 2 that doesn't have the improvements in cache and IPC that the newer ones do. One of my E8500 cores at 2.0GHz would easily be as fast as the fastest single-core CPU (A64 4000+). So even if you overclocked that old single-core Athlon to 3.0GHz, my CPU would easily beat it at 2.2GHz. My CPU is probably going to scale much better too.

The fact remains that overclocking the hell out of an old single core will still only net you the performance of one slow core of a modern CPU. Also, in this case not only did I have to disable one core, I had to lower my speed way down just to make it slow enough to be comparable to the fastest single core made. Also, if you had been paying attention, you would have seen that I already ran the FC2 benchmark with one core at the full 3.16GHz speed and the game was still not really playable.

Please just get it through your head that a 4-year-old single-core CPU is not what you want in a gaming PC. There are already several games that would not be playable at all.
 
No, that's not true. If you look at most benchmarks, a 2.13GHz Core 2 Duo will easily beat a 2.6GHz Athlon 5000+ X2. So if you take one core of each, the results would be the same. Hell, in some games even the 1.86GHz Core 2 Duo beat the 2.6GHz Athlon 5000+ X2, and that was the old Core 2 that doesn't have the improvements in cache and IPC that the newer ones do. One of my E8500 cores at 2.0GHz would easily be as fast as the fastest single-core CPU (A64 4000+). So even if you overclocked that old single-core Athlon to 3.0GHz, my CPU would easily beat it at 2.2GHz. My CPU is probably going to scale much better too.

The fastest single core was the FX-57. It had a clock speed of 2.8GHz. And all you did was back up what I said. 2.13GHz C2D vs. 2.6GHz X2 == ~400MHz difference (in this case 470, but whatever). So an A64 core at 3GHz would compare to, gasp, a C2 core at 2.4-2.6GHz, and a 3.2GHz A64 to a 2.6-2.8GHz C2 core. The new Core 2s really didn't improve that much over their older brethren.

The fact remains that overclocking the hell out of an old single core will still only net you the performance of one slow core of a modern CPU. Also, in this case not only did I have to disable one core, I had to lower my speed way down just to make it slow enough to be comparable to the fastest single core made. Also, if you had been paying attention, you would have seen that I already ran the FC2 benchmark with one core at the full 3.16GHz speed and the game was still not really playable.

Yes, and FC2 is very demanding. Try disabling 1 core and playing Left4Dead, WoW, TF2, or any other modern, popular game that isn't as demanding.

Please just get it through your head that a 4-year-old single-core CPU is not what you want in a gaming PC. There are already several games that would not be playable at all.

Now you are putting words in my mouth. I never made any claim at all that a single core would be good or that it was what people wanted. I said that "the guy with the single core really isn't all that bad off". I never said or made any claim that all you need is a single core to crush Crysis. But he also doesn't *need* to upgrade unless he wants to play the very demanding games like Crysis, FC2, etc...
 
The fastest single core was the FX-57. It had a clock speed of 2.8GHz. And all you did was back up what I said. 2.13GHz C2D vs. 2.6GHz X2 == ~400MHz difference (in this case 470, but whatever). So an A64 core at 3GHz would compare to, gasp, a C2 core at 2.4-2.6GHz, and a 3.2GHz A64 to a 2.6-2.8GHz C2 core. The new Core 2s really didn't improve that much over their older brethren.



Yes, and FC2 is very demanding. Try disabling 1 core and playing Left4Dead, WoW, TF2, or any other modern, popular game that isn't as demanding.



Now you are putting words in my mouth. I never made any claim at all that a single core would be good or that it was what people wanted. I said that "the guy with the single core really isn't all that bad off". I never said or made any claim that all you need is a single core to crush Crysis. But he also doesn't *need* to upgrade unless he wants to play the very demanding games like Crysis, FC2, etc...
Yes, the newer 45nm Core 2 CPUs are about 10% faster clock for clock than the first Conroe models, so my numbers are right. Yes, there are several modern games that do okay with a single core, but most of those do even better with a modern dual core even if they are not really multi-threaded. The reason is that newer CPUs are much faster clock for clock than the old Athlons and P4s used to be. One core of a modern fast dual core such as an E8400 or E8500 would beat the fastest single-core CPU (FX-57) in everything. If he needs to upgrade to a high-end video card because the games are demanding, then he would also need to get rid of that slow single-core 3500+ to play most of those same games.
 