How badly will an Athlon 64 3400+ bottleneck a 4850?

Cork1

Hey y'all,

I know even the mighty 3850 is bottlenecked some by said CPU. My friend was gonna throw a new PSU and an AGP 3850 into his older system. I forgot I had an nForce3 system in storage with said CPU and a PCI-E x16 slot, so I just gave it to him.

He wants to wait for the next generation of Intel bad boys before building a new PC. What do you gaming nuts think... slap in an MSI OC 3850 or a 4850? Cost isn't really an issue. I know modern CPU-intensive games like Crysis will hold the card back seriously, but classics like FEAR that use very little CPU should freaking fly with a 4850.

Thanks for your time.
 
That CPU shouldn't bottleneck the videocard unless you play at ultra-high resolutions.
 
That CPU shouldn't bottleneck the videocard unless you play at ultra-high resolutions.

It will bottleneck LESS at ultra-high resolutions. Any older game will fly even with a 3850. Maybe he can get a 3870, or a 4650 if they come out soon, and save the rest for later. A 4850 would be a waste of money if he's not going to upgrade the CPU anytime soon.
 
It will bottleneck LESS at ultra-high resolutions. Any older game will fly even with a 3850. Maybe he can get a 3870, or a 4650 if they come out soon, and save the rest for later. A 4850 would be a waste of money if he's not going to upgrade the CPU anytime soon.

+1. Spend the money and get a cheap motherboard and CPU combo.
 
It's a bottleneck at any resolution for the latest games; heck, I know it struggles in HL2-engine games such as CS:S and TF2 with a lot of people on the server.

lol Ghost, you're a little confused. Sure, upping the resolution stresses the videocard more, but it doesn't use the CPU any less.


Edit: Saw that you gave him the system as an upgrade from his AGP setup and that he wants to wait... that's nice of you, but I would spend it on something cheaper to get by unless he's deep-pocketed.

A used 7900-series or X1900-series card goes for around $40. Or just get the dang HD 4850 and reuse it in the new system.
 
I am a little confused by your statements concerning bottlenecks.

I thought this poster was using an AGP platform and that the 3400+ AMD is plenty fast. I thought using high resolutions would cause bottlenecks on his system? I figured that if he moves to PCI-E the bandwidth is there, and therefore the CPU would have a hard time keeping up with the videocard?

Does it depend on the software and gaming resolution relative to the hardware platform? I always thought that if the videocard is able to process the information and the CPU isn't supplying the needed info, then the CPU is the bottleneck? And if the CPU is able to supply the information and the videocard can't process it, then the videocard is the bottleneck?

Please correct me if my understanding is wrong. Are you telling me that the amount of strain on the PC is related to the FPS (frames per second)?
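
Put another way, my mental model is something like this rough sketch, with completely made-up numbers (not benchmarks): whichever of the two rates is lower is the bottleneck.

# Rough bottleneck model: delivered FPS is capped by the slower stage.
# All numbers below are hypothetical, just to illustrate the idea.

def delivered_fps(cpu_fps, gpu_fps):
    # The frame rate you see is limited by the slower component.
    return min(cpu_fps, gpu_fps)

cpu_fps = 40            # frames/sec an Athlon 64 3400+ might prepare (made up)
gpu_fps_low_res = 120   # frames/sec an HD 4850 might render at 1024x768 (made up)
gpu_fps_high_res = 70   # frames/sec it might render at 1680x1050 (made up)

print(delivered_fps(cpu_fps, gpu_fps_low_res))   # 40 -> CPU is the bottleneck
print(delivered_fps(cpu_fps, gpu_fps_high_res))  # 40 -> still CPU-limited

If that's roughly right, then raising the resolution only drags the GPU number down toward the CPU number; it never lifts the CPU number.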
 
The CPU will definitely hold that card back; a 3800+ will hold back anything faster than an X1900-series card.
 
I ran an Athlon 3800+ with a 1950 Pro before I upgraded to my current system, and the games DEFINITELY got higher FPS with the new system and the same 1950 Pro...
 
I think that's going to be a bottleneck. In fact, some games are starting to require multi-core CPUs as standard; you need to upgrade that CPU fairly soon to stay in the game, so to speak ;)
 
A 3850 won't be bottlenecked by the system too badly, and neither will a 3870. A 4850 won't be used to its full potential unless you enable high levels of AA and AF and play at very high resolutions... and even then I doubt you'd use much of the card's potential. If it were an X2 it would be a different story, but a single core these days doesn't provide enough performance.
 
Hmmm, if cost is not an issue, why don't you just pick up a cheap P35/E2200 combo and match that with a 3850 or 4850? The 3400+ will bottleneck your 3850 or 4850 in most modern games, and I'm not just talking about Crysis. I saw a large improvement in GRAW 2 after I was able to overclock my CPU higher thanks to newer, better RAM.
 
Of course it will be bottlenecked, but the card is only $150, the games will play fine, and what your friend doesn't know won't hurt him. It was nice to give him that stuff. I'm sure he'll love it.
 
The CPU bottleneck thing gets a little overblown on these forums. All it means is that the 4850 will not perform at its full potential. But it will still give you a very nice FPS bump, much more so than a new CPU/mobo with an old video card would.

Although a new CPU/mobo will definitely improve your FPS (and make your system more upgradable in the future), a GPU upgrade will always give you the best FPS-per-dollar ratio.
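
Here's a back-of-the-envelope illustration of that ratio, with completely made-up prices and frame rates, just to show how the comparison works:

# Hypothetical upgrade comparison: FPS gained per dollar spent.
# None of these numbers are benchmarks; they only illustrate the ratio.

baseline_fps = 25          # old CPU + old GPU (made up)

gpu_upgrade_cost = 150     # roughly an HD 4850's street price
gpu_upgrade_fps = 45       # still partly CPU-limited (made up)

cpu_upgrade_cost = 250     # new CPU + mobo + RAM (made up)
cpu_upgrade_fps = 35       # old GPU now becomes the limit (made up)

print((gpu_upgrade_fps - baseline_fps) / gpu_upgrade_cost)  # ~0.13 FPS per dollar
print((cpu_upgrade_fps - baseline_fps) / cpu_upgrade_cost)  # ~0.04 FPS per dollar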
 
An A64 3400+ will bottleneck it.

Look at COD 4 as an example, with an 8800GT @ 1280x1024, max settings:
A64 4000+: 26 fps
FX-60 @ 2.6 GHz: 60 fps
X2 6000+ / E6600: 77 fps
 
An A64 3400+ will bottleneck it.

Look at COD 4 as an example, with an 8800GT @ 1280x1024, max settings:
A64 4000+: 26 fps
FX-60 @ 2.6 GHz: 60 fps
X2 6000+ / E6600: 77 fps

QFT. There are threads about people running Optys at 2.6 GHz who didn't see any improvement going from their old cards to the current gen.
 
TigerDirect: AMD Athlon 64 X2 5000+ Black Edition + 2 GB memory bundle, $80.
http://www.tigerdirect.com/applicat...D&cm_mmc=Email-_-Main-_-WEM1654-_-100under100

MSI K9VGM-V, $50.
http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=2366197&CatId=2320

Add $15 for shipping and you're in for $145 total, and you're back to being GPU-limited. Pretty painless upgrade.

Very, very nice upgrade. Thanks for the link too; I might snatch one up for an HTPC build.
 
An A64 3400+ will bottleneck it.

Look at COD 4 as an example, with an 8800GT @ 1280x1024, max settings:
A64 4000+: 26 fps
FX-60 @ 2.6 GHz: 60 fps
X2 6000+ / E6600: 77 fps

That's a fairly low resolution; how about 1680x1050, max settings?
 
And what difference would it make, since it's not playable even at 1280 with that CPU?

Suppose all three are unplayable at 1680 x 1050? Then the 4000+ is not the bottleneck.

I just think CPU dependency tests should be at resolutions people play at, not ones that exaggerate the differences.
 
Suppose all three are unplayable at 1680 x 1050? Then the 4000+ is not the bottleneck.

I just think CPU dependency tests should be at resolutions people play at, not ones that exaggerate the differences.

Yes, but the first bottleneck is the CPU. It's true that the difference between processors shrinks as the pixel count goes up, but there's a point where adding more GPU won't do anything if you can't feed it. If the processor can't prepare frames fast enough for 1280x1024, it can't do it for 1680x1050 either. Don't get me wrong, CPU needs are almost always exaggerated, usually by a large margin, but in this case the GPU is going to be waiting on the CPU constantly.
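
Here's a crude way to picture it, with made-up numbers (an illustration, not a benchmark): the CPU-side frame cap barely moves with resolution, while the GPU-side rate falls roughly with pixel count, so a CPU-limited game stays CPU-limited as you go up.

# Crude illustration of why a CPU-limited game stays CPU-limited at
# higher resolutions. Every number here is made up for the example.

CPU_CAP = 26  # frames/sec the CPU can prepare, roughly resolution-independent

def gpu_fps(base_fps, base_pixels, pixels):
    # Very rough model: GPU throughput scales inversely with pixel count.
    return base_fps * base_pixels / pixels

base = 1280 * 1024
for w, h in [(1280, 1024), (1680, 1050), (1920, 1200)]:
    gpu = gpu_fps(120, base, w * h)  # assume a hypothetical 120 fps GPU cap at 1280x1024
    print(f"{w}x{h}: {min(CPU_CAP, gpu):.0f} fps (GPU alone could do {gpu:.0f})")

# Delivered FPS stays pinned at 26 until the GPU line finally drops
# below the CPU cap at some very high resolution or AA level.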
 
Suppose all three are unplayable at 1680 x 1050? Then the 4000+ is not the bottleneck.

I just think CPU dependency tests should be at resolutions people play at, not ones that exaggerate the differences.
But the point is that the game IS playable with the dual cores, where it's not with the single core. Bumping up the res may make the gap smaller, but so what? Any single-core CPU, especially a slow one, is going to really hamper you in modern games NO MATTER WHAT RES YOU PLAY AT.
 
But the point is that the game IS playable with the dual cores, where it's not with the single core.

This guy disagrees:

http://www.overclock.net/ati/354958-just-got-hd-4850-a.html

I was so stunned by Crysis when I was running @ high and getting 35+ fps, and I can say it was a completely different game from Crysis @ medium.

Edit: I got 8600 points in 3DMark, which I think is good for a single-core CPU.

Looks like a single core is fine for the HD 4850 with an Athlon 64 in Crysis and 3DMark06. Everything else should run fine, too.

Everyone realizes there will be a bottleneck, but when the alternative is buying a far inferior video card for not much less money, it doesn't matter. People who recommend the 3850 because of the "slow" CPU haven't been paying attention to the AA/AF fixes in the 48xx series.
 
This guy disagrees:

http://www.overclock.net/ati/354958-just-got-hd-4850-a.html

Looks like a single core is fine for the HD 4850 with an Athlon 64 in Crysis and 3DMark06. Everything else should run fine, too.

Everyone realizes there will be a bottleneck, but when the alternative is buying a far inferior video card for not much less money, it doesn't matter.
I don't give a crap what that guy thinks, because it doesn't change what I said. FACT: an older single core WILL bottleneck you in modern games that take advantage of dual cores. In fact, in some games the user would see a bigger FPS improvement by upgrading to a dual-core CPU than by upgrading his GPU.


In Crysis, even the slowest-clocked dual core is 50% faster than the fastest single core, and a good Intel Core 2 Duo is 100% faster or more:
http://www.gamespot.com/features/6182806/p-6.html

In BioShock, even a much slower-clocked dual core would be nearly 40% better:
http://www.gamespot.com/features/6177688/p-7.html

Call of Duty 4 shows massive gains with a dual-core CPU, and using a Core 2 Duo will result in more than a 100% improvement:
http://www.gamespot.com/features/6183967/p-5.html
 
I don't give a crap what that guy thinks, because it doesn't change what I said. FACT: an older single core WILL bottleneck you in modern games that take advantage of dual cores. In fact, in some games the user would see a bigger FPS improvement by upgrading to a dual-core CPU than by upgrading his GPU.

I hope you realize there's a difference between being "bottlenecked" and being "unplayable" because that's a distinction you've failed to make. A 3400+ and HD 4850 will run games fine. When you calm down, you'll realize your error.
 
I hope you realize there's a difference between being "bottlenecked" and being "unplayable" because that's a distinction you've failed to make. A 3400+ and HD 4850 will run games fine. When you calm down, you'll realize your error.
When you read my edited post, maybe YOU will realize that it's you who is missing the point.
 
I don't give a crap what that guy thinks, because it doesn't change what I said. FACT: an older single core WILL bottleneck you in modern games that take advantage of dual cores. In fact, in some games the user would see a bigger FPS improvement by upgrading to a dual-core CPU than by upgrading his GPU.

In Crysis, even the slowest-clocked dual core is 50% faster than the fastest single core
http://www.gamespot.com/features/6182806/p-6.html

In BioShock, even a much slower-clocked dual core would be nearly 40% better
http://www.gamespot.com/features/6177688/p-7.html

Looking at your links, the Crysis bench is at 1024x768, medium, which is meaningless.

The BioShock benchmarks show 44 FPS with the 4000+ at 1600x1200, a nicely playable resolution.
 
You still don't get it, cannondale06. Sorry. OP has given some decent hardware to a friend and wants to know if he can play games with an HD 4850. Everyone here has acknowledged that there will be a CPU bottleneck, but I believe only you posted the words "the game IS playable with the dual cores, where it's not with the single core," which is 100% FALSE. If Crysis can be played by the poster in my link, ANY game can be played. OP's friend will be fine. Feel free to continue arguing with yourself, though.
 
You still don't get it, cannondale06. Sorry. OP has given some decent hardware to a friend and wants to know if he can play games with an HD 4850. Everyone here has acknowledged that there will be a CPU bottleneck, but I believe only you posted the words "the game IS playable with the dual cores, where it's not with the single core," which is 100% FALSE. If Crysis can be played by the poster in my link, ANY game can be played. OP's friend will be fine. Feel free to continue arguing with yourself, though.
The reply I posted that to was showing a 26 fps benchmark for COD 4, which is not really playable.
 
Well, NONE of those are single core. Also, their dual-core results are a little strange, as they suggest that clock speed doesn't matter at all.

Just to be clear, a modern game that utilizes dual cores will be MUCH slower on a single core. I have already provided three crystal-clear links to show that.

That's what I mean: your numbers show a big difference between two dual cores, while their numbers show none.

Your links showed the CPU is a big bottleneck at low res, which nobody uses.
 
That's what I mean: your numbers show a big difference between two dual cores, while their numbers show none.

Your links showed the CPU is a big bottleneck at low res, which nobody uses.

Well, in a modern game that utilizes dual cores, the difference between single core and dual is still there NO MATTER WHAT resolution you play at.
 
My girlfriend's rig is an Opty 146 @ 2.7 GHz with an X800 GTO (16 pipes), and the only modern games it can't play are the ones that require SM3.0. Crysis was actually quite playable after some tweaking, same for Supreme Commander.
 
My girlfriend's rig is an Opty 146 @ 2.7 GHz with an X800 GTO (16 pipes), and the only modern games it can't play are the ones that require SM3.0. Crysis was actually quite playable after some tweaking, same for Supreme Commander.
Yes, and there are people on YouTube playing Crysis on a Sempron with a 6200 video card too. That doesn't mean a dual-core CPU isn't a significant improvement in the games that benefit from it.
 
Hmm, those COD 4 single-core results are too low. Heck, I played COD 4 with everything maxed @ 4x FSAA on an X1950XT at 1024x768, which is a heavily CPU-dependent resolution, and the game never dipped under 36 fps and always averaged in the mid 40s.
 
You can't put much stock in COD4; my Phenom system with an 8800GTX plays it at 100+ FPS at 1920x1200. I have never seen a game that runs that well.
 