Please stop the nonsense. GTX 460 too much for 775?

SpeedyVV

Can [H]ardOCP or someone here put an end to this nonsense?

There is so much contradictory info on this, it makes my head spin.

Is the GTX 460 or the equivalent ATI card (6850?) too much GPU for the current batch of 775 processors like the Core 2 Duo or Core 2 Quad?

OCed or not, my Q6600 does not show any of its cores maxed in some games like COD:MW2 or F1 2010. Yet the GPU is way underutilized, and the FPS are somewhat low.

I see people in threads who have gone from an 8800GT to a GTX 460 and their FPS dropped :confused:

The problem is there is too much FUD going around on the forums to come to a conclusion:

- CPU is bottlenecking
- Fermis are bad HW
- Fermis are too much GPU
- Drivers are bad
- Games are bad.
- you need an i7 to game with current gen GPUs
- on and on and on

Can someone put an end to this nonsense? What is what?

Do you have the equipment to prove one way or the other what is what?
 
It is going to depend on the individual game. A fast 775 quad will give you nearly the same performance as an i5/i7 with any single GPU, but a dual core will certainly hold a user back in some games. A Q6600 would need to be OCed a bit to give the best overall experience, especially since many games don't effectively utilize a quad.
 
I had some issues when I upgraded to a GTX 460 when I still had a Core 2 Duo as well, mostly in games that were known to be "CPU intensive" like StarCraft. I upgraded to an i7 930, and the game runs amazingly well with the same video card and clocks on it.

Aside from that, stuffing the GTX 460 into my old system made it feel like it was going to catch fire, it ran so hot. I'd definitely say there are a few games out there that will make the old CPUs peg out with a new video card.
 
The reality:

You can run nearly any modern game on a fast Core 2 Duo with a single GPU. Anything more is just icing.

If you want to run at some insanely low resolution just to get 200fps, then yes, you need more CPU power, but if your target is 60fps with all eye candy enabled, then a fast Core 2 Duo is all you need.

There is no such thing as "too much GPU" for a given processor. If the CPU is the bottleneck, it doesn't matter how low-end you go with the GPU; the game will still be unplayable. On the other hand, if your CPU makes a game playable, then the sky is the limit as far as GPU power goes (barring SLI/CFX) - just turn up the eye candy and resolution!
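To put that argument in concrete terms, here is a toy sketch (my own illustration with made-up frame costs, not measured numbers): treat each frame as costing some CPU time and some GPU time, with the slower side setting the framerate.

```python
# Toy model of the bottleneck argument above: each frame costs some CPU
# work and some GPU work, and the slower of the two sets the framerate.
def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# CPU-bound case: a beefier GPU doesn't raise fps, but it doesn't lower
# it either -- the headroom can go into resolution and eye candy instead.
print(fps(cpu_ms_per_frame=20, gpu_ms_per_frame=10))  # 50.0 (CPU-bound)
print(fps(cpu_ms_per_frame=20, gpu_ms_per_frame=18))  # 50.0 (more eye candy, same fps)
```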

My GTX 460 experience:

The drivers for Fermi suck, and the game makers aren't bothering to support the new architecture, so each new game is a minefield. There was a well-publicized stuttering issue in BFBC2, and then there were the (mostly) Fermi-specific performance issues (and white flashing artifacts) in Fallout: New Vegas. That's just how it is being in last place.
 
The reality:

You can run nearly any modern game on a fast Core 2 Duo with a single GPU. Anything more is just icing.

If you want to run at some insanely low resolution just to get 200fps, then yes, you need more CPU power, but if your target is 60fps with all eye candy enabled, then a fast Core 2 Duo is all you need.

There is no such thing as "too much GPU" for a given processor. If the CPU is the bottleneck, it doesn't matter how low-end you go with the GPU; the game will still be unplayable. On the other hand, if your CPU makes a game playable, then the sky is the limit as far as GPU power goes - just turn up the eye candy and resolution!
Absolutely false. There are plenty of games where no Core 2 Duo can manage 60 fps. Also, many settings have an impact on the CPU. And yes, you can have too much GPU, because if a cheaper one will give you the same playable experience, then it is silly to buy a much more expensive one with a slow CPU.
 
I went from an 8800 GT to a GTX 460 on a Core 2 Duo at 2.6GHz and my FPS didn't drop. I'm currently only playing WoW.

Holding out for Sandy Bridge.
 
Quads at the right speed, 3GHz and above stock or overclocked, and a good video card can run most games just fine. Mind you, the settings will have to be mid to mid-high.
 
I dunno, not unless the memory subsystem being put on a separate chip causes massive issues, or the cache is too teeny, or whatever. At any rate, I don't think the Core 2 series is dead yet.
 
A GTX 460 will run great on a 775 CPU running at 3GHz or better. If it's running slower than that, please overclock it - it's a trivial overclock for just about all of them. For the most part, the only time you'll be seriously CPU bound is on games that are CPU bound even with an i7 at 4GHz (which usually means poorly designed game code).
 
I ended up with a GTX 470 for the same price I could have purchased a 5770 for. I didn't really want nVidia, as yes, the drivers aren't as good as they used to be (not that they have become worse over time, but, say, not as uniformly excellent as they used to be). The idea that you are wasting your money on a card that is "more powerful" than your CPU is bunk. While I always look for balance, I think the reports of people getting a Fermi and their frames dropping in games are dubious, barring some super driver conflict. RTSs tend to need way more CPU; or, put another way, you won't necessarily get better FPS from just an increase in GPU unless it is the bottleneck. Plus, Civilization V gets better framerates from a 470 than a 480 (according to Anandtech - by like 1 frame at 1080p). The 460 is not a terrible card, my 470 doesn't draw as much power as everyone makes it out to be, and I'm not sad I bought it. I wanted a 6850, but I could get a new 470 cheaper, so I got the 470. I'm using a Phenom II X3 with it; the Phenom isn't as close to the i5/i7 as it is to the 775 C2D. Even the higher-end C2Qs can break it like a shotgun and have their way with my AMD. But I'm very pleased with this combination. Fallout NV doesn't use very much CPU or GPU maxed out @ 60fps. Mafia II does, but still runs locked at 60 at all times at 1080. I can't think of any game where using a three-year-old card would be better than a new higher-end card.

So in closing,

The 8800GT is not the terrible cesspool people make it out to be. People who think their 8800 is faster than their 460 are kidding themselves.

I haven't had a problem with the 260.99 drivers and Fallout:NV. nVidia may have some driver issues, but so does AMD. And CCC is not anything to write home about.
The new Radeons are the funk. I wasn't lying about not having buyer's remorse, but sometimes I do wish I had purchased a new Radeon. I like to shop for value.
 
Rig in sig

I'm sending back the GTX 460 I bought. It is great in some games and gives me trouble in others. It also gives me anywhere from 2-4k all the way to 90k DPC latency spikes, which are probably caused by the drivers and my other hardware. I have little faith in Nvidia fixing the DPC issues for older 775 chipsets anytime soon, and since I have a 14-day window to return my purchased item and get my money back, I'm doing that.
 
I don't get full utilization in COD or Fallout, but I'm pretty positive I'm not CPU constrained. I can, however, peg the utilization in shader-intensive games like Crysis.

What I think is happening is a direct result of Nvidia adding a 3rd shader core on the GF104 for every 2 on the GF100 without increasing the ratio of other execution resources. My guess is that the utilization is being measured against the number of shader cores pegged, but if the game isn't shader constrained then the other chip resources will bottleneck them around the 66% mark. Therefore, if you are playing something that isn't shader constrained, then the card is indeed fully utilized, but the shaders aren't.

You can read more about the architecture differences here.
http://www.anandtech.com/show/3809/nvidias-geforce-gtx-460-the-200-king/2

I could of course be utterly wrong here and would be interested to see if others have similar experiences on other platforms.
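The arithmetic behind that ~66% figure, as a quick sketch in Python (the per-SM core counts are from the AnandTech article linked above; the ceiling is just the ratio, not a measured value):

```python
# Rough arithmetic behind the ~66% utilization ceiling described above.
# Per the AnandTech article: a GF100 SM has 32 CUDA cores, a GF104 SM has
# 48, with broadly similar scheduling/dispatch resources per SM.
gf100_cores_per_sm = 32
gf104_cores_per_sm = 48

# If a non-shader-bound workload can only keep a GF100-sized slice of each
# SM busy, the fraction of GF104 shaders that reads as "utilized" caps at:
ceiling = gf100_cores_per_sm / gf104_cores_per_sm
print(f"theoretical utilization ceiling: {ceiling:.0%}")  # ~67%
```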
 
The 45nm Core 2 Duos are wicked fast, especially the ones with larger caches, and can keep a GTX 460 busy. A good Core 2 Duo and a GTX 460 lets you play StarCraft 2 at ultra quality at 1920x1200 at over 40 fps, even with 4xAA/8xAF.

StarCraft 2 is being bottlenecked by the CPU, but so what? The 460 still gives you good FPS with everything maxed out, and a lesser graphics card would give you fewer FPS.
 
Smells like FUD to me. I am running GTX 460 SLI in my second rig with a Core 2 Quad CPU, and I have no complaints at all. The 260.99 drivers have been rock solid so far for the games I'm playing as well.

I ran some benchmarks with this rig not too long ago. You can check it out on this thread:

http://hardforum.com/showthread.php?t=1552813
 
I have an E6600 @ 3.2GHz (400x8) with 4GB XMS2 4-4-4-12 and an EVGA GTX 460 at stock, and it sucks holding out for Sandy. Am I CPU bottlenecked??
 
I have an E6600 @ 3.2GHz (400x8) with 4GB XMS2 4-4-4-12 and an EVGA GTX 460 at stock, and it sucks holding out for Sandy. Am I CPU bottlenecked??
Like I said in the very first reply... IT DEPENDS ON THE INDIVIDUAL GAME. Depending on your res you are fine for the most part, but yes, your CPU will limit the GTX 460 in the more CPU-intensive games.
 
Like I said in the very first reply... IT DEPENDS ON THE INDIVIDUAL GAME. Depending on your res you are fine for the most part, but yes, your CPU will limit the GTX 460 in the more CPU-intensive games.

From personal experience this is false. I have a Q6700 at 3.2GHz and a 5870. I haven't thrown a single game at it (sans Metro 2033) that I can't play maxed at 1920x1080. Everything is entirely playable. Going from my old GTX 260 to the 5870 saw a massive increase in performance.

My cores never peg 100% or close to it. A C2Q with a decent OC will still push almost anything.
 
With a multicore CPU, you often won't see 100% utilization due to imperfect optimization.
 
With a multicore CPU, you often won't see 100% utilization due to imperfect optimization.

Even if poor optimization were the culprit, you would see 1 or 2 of the cores spike up as the CPU tries to keep up with the system. This simply doesn't happen.
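For anyone who wants to check this on their own rig, here is a minimal sketch (assuming Python with the third-party psutil package installed) that prints per-core load once a second; a single pegged core would stand out right away:

```python
# Minimal per-core CPU load sampler (assumes the third-party 'psutil'
# package is installed: pip install psutil). Run it while the game is
# running; a single core pinned near 100% would stand out immediately.
import psutil

for _ in range(30):  # sample once a second for ~30 seconds
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"{load:5.1f}%" for load in per_core))
```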
 
From personal experience this is false. I have a Q6700 at 3.2GHz and a 5870. I haven't thrown a single game at it (sans Metro 2033) that I can't play maxed at 1920x1080. Everything is entirely playable. Going from my old GTX 260 to the 5870 saw a massive increase in performance.

My cores never peg 100% or close to it. A C2Q with a decent OC will still push almost anything.
That makes no sense, because it IS going to be 100% dependent on the game. Do you think all games run the same? Of course not. In a game like Dragon Age or GTA 4, an i5/i7 easily beats any Core 2 Quad. You obviously have no clue what bottleneck actually means. It's not like the game stops working or you get a warning telling you that you could be getting 10 more fps with a faster CPU.

Not to mention the person I was replying to has a dual core, NOT a quad core like yours. Your Q6700 at 3.2 and a 5870 are a very balanced match, and only in a few cases would an i7 make any difference.
 
The first number is my FPS with my old HD4830 (slightly overclocked), and the 2nd is with my GTX 460. System is in my sig:

1080p                                                    HD4830   GTX 460
--------------------------------------------------------------------------
BFBC2, 4xAA/16xAF, settings highest, HBAO enabled (min)      17        22
BFBC2, 4xAA/16xAF, settings highest, HBAO enabled (avg)      28        50
F1 2010, 4xAA, all settings maxed (min)                      16        40
F1 2010, 4xAA, all settings maxed (avg)                      22        48
L4D2, 8xAA/16xAF, all settings maxed (min)                   11        36
L4D2, 8xAA/16xAF, all settings maxed (avg)                   62       108

STALKER: CoP went from barely playable at medium settings to 40-70 FPS at maxed settings with 4xAA.
 
Someone in the forums did a pretty complete benchmark suite on exactly this. CPU bottlenecking only affected lower-end and non-overclocked 775s. Other than that, they performed really well. I am holding out for the SB enthusiast line myself.
 
GTX 460 = GTX 280.

If 775 bottlenecks it, then none of the GTX 280 benchmarks (which were run on 775-era systems) are valid.
 
From personal experience this is false. I have a Q6700 at 3.2GHz and a 5870. I haven't thrown a single game at it (sans Metro 2033) that I can't play maxed at 1920x1080. Everything is entirely playable. Going from my old GTX 260 to the 5870 saw a massive increase in performance.

My cores never peg 100% or close to it. A C2Q with a decent OC will still push almost anything.

The CPU in question is a dual core, not a quad core. So he's right; in certain games his GPU will be bottlenecked by his CPU.
 
My Q6600 and GTX 280 still run games pretty damn well. I don't have an uber-high resolution like some of you, but then again a 24" monitor is enough for me.
 
Run at 2560x1600 or higher and never worry about bottlenecking again (assuming max video quality settings).
 
Core 2 Quads are still great if overclocked a bit, even in CPU-intensive games; however, Core 2 Duos really feel the pain in many, like Civ5, BFBC2, WoW, WAR, SC2, and a *lot* of others, when paired with a nice video card, bottlenecking the system. Still, the newer CPUs such as the i5/i7 quads have large IPC improvements that do translate to modern game performance boosts.

I've advocated buying quad-cores since they were introduced, knowing people keep their CPUs a long time, simply because multicore usage was coming into vogue "in the near future" (which is *NOW*). More and more games will rely on multithreading as time goes on; already most make full use of 2 cores, many will thread onto the others some, and a good number will see improvement from quads simply from running everything else on the system on the extra cores.
 
Check out any of bit-tech's CPU reviews, and look at the Crysis benchmark.
Of interest is the following, Crysis @ 1,680x1,050, DX10, 64-bit, High, no AA, no AF (minimum framerates):
Q6600 stock: 12fps
Q6600 @ 3.7GHz: 24fps
i5-750 stock: 20fps
i5-750 @ 4.15GHz: 33fps
i7-920 stock: 35fps
Overclocking an i7-965 to over 4GHz does not change this.

Personal experience in upgrading to an i7 = smoother gameplay using a 5870.
I would imagine that the way around this is overclocking the bus to get similar transfer rates to the X58 regardless of platform. It is probably more chipset related than anything.
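As a rough sketch of what "similar transfer rates" would mean in numbers (the 64-bit, quad-pumped bus width is standard for the socket 775 FSB; the two clocks are just illustrative):

```python
# Back-of-the-envelope FSB bandwidth for the "overclock the bus" idea above.
# The socket 775 FSB is 64 bits (8 bytes) wide and quad-pumped.
def fsb_gb_per_s(base_clock_mhz: float) -> float:
    return base_clock_mhz * 4 * 8 / 1000.0  # 4 transfers/clock * 8 bytes

print(f"FSB1066 (266MHz base): {fsb_gb_per_s(266):.1f} GB/s")  # ~8.5
print(f"FSB1600 (400MHz base): {fsb_gb_per_s(400):.1f} GB/s")  # ~12.8
```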
 
What did you expect? lol. I summed it up in my first reply by saying it depends on the game.

Well, I expected nothing, but was hoping (yeah, I know) that more replies would be like the ones travbrad and avddreamr posted.

Basically, ones based on real experience or referring to relevant articles.

The thing that I don't understand is why so many sites, including [H]ardOCP, benchmark CPUs at resolutions like 800x600. I mean, who cares? Sure, it will show CPU performance without worrying about GPU bottlenecking, but still.

I'd much rather see a triple SLI or quad CrossFire setup and see what the CPU delivers. Does my E4300 perform the same as my Q6600? Should I upgrade to an i7?

Of course, a more realistic example would be low-, mid-, and high-end GPU setups, both ATI and NVidia, across a few games and resolutions.

I guess that is why no one does this :(
 
Check out any of bit-tech's CPU reviews, and look at the Crysis benchmark.
Of interest is the following, Crysis @ 1,680x1,050, DX10, 64-bit, High, no AA, no AF (minimum framerates):
Q6600 stock: 12fps
Q6600 @ 3.7GHz: 24fps
i5-750 stock: 20fps
i5-750 @ 4.15GHz: 33fps
i7-920 stock: 35fps
Overclocking an i7-965 to over 4GHz does not change this.

Personal experience in upgrading to an i7 = smoother gameplay using a 5870.
I would imagine that the way around this is overclocking the bus to get similar transfer rates to the X58 regardless of platform. It is probably more chipset related than anything.

I find it odd that a 55% overclock on a Q6600 doubled the minimum framerate.
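For reference, the arithmetic (a Q6600's stock clock is 2.4GHz; the fps figures are from the quoted benchmark):

```python
# Clock scaling vs. minimum-fps scaling for the quoted Q6600 numbers.
stock_clock, oc_clock = 2.4, 3.7  # GHz (Q6600 stock is 2.4GHz)
stock_fps, oc_fps = 12, 24        # minimum fps from the quoted benchmark

print(f"clock: +{oc_clock / stock_clock - 1:.0%}")  # ~ +54%
print(f"min fps: +{oc_fps / stock_fps - 1:.0%}")    # +100%
# Superlinear scaling like this hints at more than the raw core clock
# changing: a Q6600 at 3.7GHz implies a much higher FSB too, since its
# multiplier tops out at 9x, so bus/memory bandwidth rises along with it.
```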
 
It's complete FUD.

This.

These days people like to throw the term 'bottlenecking' around so much that they don't even know what it means anymore.

What I'd really like to see is an in depth article from the [H] on this, pairing some current mainstream cards up with 'older' hardware... even MUCH older hardware.

Case in point: fact is, you can plop a GTX 460 into an old Socket 754 AMD Athlon 64 rig and it will do just fine, totally blowing away any GPU that was 'current' back when the Socket 754 platform was popular. Yes, there will be bottlenecking knocking off a few frames compared to running the GPU on something more current, but it's hardly a cut-and-dried 'wasting the power of the GPU' type of thing.


A more realistic case that I recently had first hand experience with:

A client of mine had an aged but perfectly good system built around a Socket 754 board and an Athlon 64 3200+ chip. Plenty of RAM (2x2GB DDR), Corsair TX750 PSU, etc. In fact, I'd say the mobo and CPU were the oldest parts of the rig, but he didn't want to upgrade; they still served him perfectly fine. He wanted something better than the 7800GT that was in there. So I took the machine and played around with it for a week, testing out various PCIe cards I had lying about. I tried the GTX 260, GTX 280, GTX 285, and even a GTX 460, and not only was the difference between any of those cards and the old 7800GT night and day, but there was a noticeable difference between, say, the GTX 260 and GTX 460... on that very same system.

He ended up going with the GTX 260 since I had a used one lying around for cheap anyway, and he is quite happy with the results. *shrug*
 