Nvidia claims "GPU matters more than a CPU" controversy

AuDioFreaK39

Fudzilla posted an article about a week ago that caught my attention:



http://www.fudzilla.com/index.php?optio … p;Itemid=1

"Nvidia is publicly claiming that the GPU is better and smarter than the dull CPU. CPUs are boring and Nvidia pushes the message that if you have a low-end CPU and a high-end graphics card you will play better than with high-end quad-core CPU and cheap graphics.

We agree with Nvidia, as a GPU upgrade usually means better gaming performance; but we believe that an open confrontation with Intel is the last thing that Nvidia needs at the moment. The company is arrogant, but not as arrogant as Intel, and history teaches us that Intel finds a way to penalize the bad boys.

The future just got more interesting."
-----------------------------------

Although I'd love to see the GPU eventually win this fight, I have a premonition that it's going to end up the underdog (much like HD-DVD). It won't be cut out of the computing process entirely; it will just serve a different general purpose. My guess is that the multicore CPUs after Larrabee will have around 32 to 64 cores, with a section of them (a quarter or so) dedicated solely to ray tracing through parallel processing, directly interlinked with the other main logic cores. In effect, the GPU would become the device in the system that "refines" the CPU's ray-traced graphics, adding depth of perception and "realization". So in theory, the computing system would work similarly to the human mind, only in reverse order:

CPU (ray traces the image with parallel processing techniques and sends it on) >> GPU (which translates the image "realistically", adding detail, texture, non-physics effects, and "perception" in a sense)
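To picture it, here's a toy sketch in Python (everything here is invented for illustration, including the gpu_refine stand-in; it's not any real architecture): CPU cores ray trace a base image in parallel, then a "GPU" stage refines it.

Code:
from multiprocessing import Pool

WIDTH, HEIGHT = 64, 48  # tiny framebuffer, just for illustration

def trace_row(y):
    """Ray trace one scanline: shade by a fake disc hit test."""
    row = []
    for x in range(WIDTH):
        u = 2.0 * x / WIDTH - 1.0   # map pixel to [-1, 1]
        v = 2.0 * y / HEIGHT - 1.0
        d2 = u * u + v * v          # "hit" a unit disc at the origin
        row.append(1.0 - d2 if d2 < 1.0 else 0.0)
    return row

def gpu_refine(image):
    """Stand-in for the imagined GPU pass: here, just a brightness tweak."""
    return [[min(1.0, p * 1.2) for p in row] for row in image]

if __name__ == "__main__":
    with Pool() as pool:                            # one worker per CPU core
        image = pool.map(trace_row, range(HEIGHT))  # parallel "ray tracing"
    image = gpu_refine(image)                       # hand the frame to the "GPU"
    print("rendered a %dx%d frame" % (WIDTH, HEIGHT))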

Would anyone agree with this, or am I just going crazy? Can't really tell 0_O
 
I'm leaning more toward crazy. Most people have no need for raytraced fancy graphics so they won't buy a bunch more cores that they don't really have a use for.

Even with many-many cored processors and a switch to raytraced graphics I think GPU manufacturers will just refine cores and set up graphics cards for raytracing, rather than CPU manufacturers suddenly becoming far more important.

It could go either way; we're talking about 10+ years from now, more than likely.
 
Even with many-many cored processors and a switch to raytraced graphics I think GPU manufacturers will just refine cores and set up graphics cards for raytracing, rather than CPU manufacturers suddenly becoming far more important.

Agreed. Sure, it's great for Intel; they get to sell super expensive many-core CPUs, but the GPU manufacturers would lose because everything would go back to software rendering. nVidia and ATi would never allow this to happen. Besides, using just raytracing presents many more problems (processing power, character/object animations, unnecessary polygon overhead, etc.) than it solves over traditional rasterization. Even David Kirk from nVidia says the future isn't pure raytracing, but lies somewhere in between: rasterization for everything that can be done accurately with that technique, and raytracing for all the other tricky effects that raster graphics can't do accurately.
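To make that split concrete, here's a minimal Python sketch of the hybrid idea (all values and names are mine, invented for illustration; it's not Kirk's or anyone's real code): rasterize everything cheaply, then ray trace only the pixels rasterization can't handle accurately.

Code:
CHEAP_SHADE = 0.5   # flat color from the raster pass
TRACED_SHADE = 0.9  # pretend result of following a reflection ray

def hybrid_render(materials):
    """materials: one string per pixel, 'diffuse' or 'mirror'."""
    # Raster pass: every pixel gets a cheap color plus a "needs rays" flag.
    frame = [(CHEAP_SHADE, m == "mirror") for m in materials]
    # Ray pass: revisit only the pixels rasterization gets wrong.
    return [TRACED_SHADE if needs_ray else color
            for color, needs_ray in frame]

print(hybrid_render(["diffuse", "mirror", "diffuse"]))  # [0.5, 0.9, 0.5]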

I call shens and BS on Intel saying pure raytracing is the future, because they are the only ones that stand to profit from such a future.
 
Seems unlikely; lighting effects are getting good enough that ray tracing has lost much of its edge. Even though you're not getting full true reflections, it looks quite good to the common user, who would probably prefer more objects over photo-realistic reflections. Rendering tricks do the trick well enough.
 
Wait... what the hell is the point of this debate anyway? It seems rather like two kids in elementary school fighting over whether plain milk is better than chocolate milk.

 
So let's see the GPU control AI and shit if it is SO great and powerful...

Each is good at what it does, and they need each other.
 
You know what's funny?

The bottleneck in current applications is the GPU and not necessarily the CPU, as Nvidia themselves pointed out. Doesn't this mean that Intel is so far ahead of the game and Nvidia needs to step up? lol
 
You know what's funny?

The bottleneck in current applications is the GPU and not necessarily the CPU, as Nvidia themselves pointed out. Doesn't this mean that Intel is so far ahead of the game and Nvidia needs to step up? lol

You got it the other way around. NVIDIA is right. The GPU is more important, because it IS the current bottleneck. It's more important to have a powerful GPU than to have a powerful CPU.
So if you have a Q9650 and an 8600 GT, you will not have a better gaming experience than someone with an E6400 and an 8800 GT/GTS/GTX.
 
So let's see the GPU control AI and shit if it is SO great and powerful...

Each is good at what it does, and they need each other.

Before mentioning AI, mention physics. That was a CPU-only task until just a year or two ago. With unified architectures (especially G80 and its derivatives), you will start seeing stream processors dedicated to physics calculations. Remember that the purpose of 3D accelerators was to take load off the CPU. Nowadays, they are actually "replacing" the CPU in many things.
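Loosely, that's what "stream processing" buys you: the same tiny kernel applied to thousands of independent elements. Here's a Python sketch of a physics tick in that style (the numbers and names are illustrative, and NumPy's array ops stand in for GPU stream processors):

Code:
import numpy as np

N, DT = 10_000, 1.0 / 60.0
pos = np.zeros((N, 3))                 # particle positions
vel = np.random.randn(N, 3)            # particle velocities
gravity = np.array([0.0, -9.81, 0.0])

def step(pos, vel):
    """One physics tick: the same instructions run over N independent particles."""
    vel = vel + gravity * DT
    pos = pos + vel * DT
    return pos, vel

pos, vel = step(pos, vel)
print(pos.shape)  # (10000, 3)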
 
This is bloody bullshit to the extreme. Why? Because I have a P4 Northwood CPU, and if I stick an AGP 3850 card in it, I still won't be able to run games as well as someone with a dual core and a PCI-E 3850 card.

Why is that? Because the CPU is the bottleneck of my PC, not the GPU. The AGP version of that card is practically the same as the PCI-E one, just a different connection and maybe, just maybe, slightly slower, if I'm correct.

If their logic were true, then my P4 CPU along with an 8800 GTX (if an AGP version was made) would perform near identical to a dual or quad core PCI-E setup with the exact same 8800 GTX card.
 
This is bloody bullshit to the extreme. Why? Because I have a P4 Northwood CPU, and if I stick an AGP 3850 card in it, I still won't be able to run games as well as someone with a dual core and a PCI-E 3850 card.

Why is that? Because the CPU is the bottleneck of my PC, not the GPU. The AGP version of that card is practically the same as the PCI-E one, just a different connection and maybe, just maybe, slightly slower, if I'm correct.

If their logic were true, then my P4 CPU along with an 8800 GTX (if an AGP version was made) would perform near identical to a dual or quad core PCI-E setup with the exact same 8800 GTX card.

CPUs are boring and Nvidia pushes the message that if you have a low-end CPU and a high-end graphics card you will play better than with high-end quad-core CPU and cheap graphics.


Try reading what it said...
 
This is bloody bullshit to the extreme. Why? Because I have a P4 Northwood CPU, and if I stick an AGP 3850 card in it, I still won't be able to run games as well as someone with a dual core and a PCI-E 3850 card.

Why is that? Because the CPU is the bottleneck of my PC, not the GPU. The AGP version of that card is practically the same as the PCI-E one, just a different connection and maybe, just maybe, slightly slower, if I'm correct.

If their logic were true, then my P4 CPU along with an 8800 GTX (if an AGP version was made) would perform near identical to a dual or quad core PCI-E setup with the exact same 8800 GTX card.

No. The logic is based on different ranges of graphics cards and different ranges of CPUs.

(E2180 or E6300-E6400) + 8800 GT/GTS/GTX > (Q9650 or Q9770) + 8500/8600 GT

i.e.

low-mid end CPU + high-end GPU > high-end CPU + low-mid end graphics card.
 
This is bloody bullshit to the extreme. Why? Because I have a P4 Northwood CPU, and if I stick an AGP 3850 card in it, I still won't be able to run games as well as someone with a dual core and a PCI-E 3850 card.

Why is that? Because the CPU is the bottleneck of my PC, not the GPU. The AGP version of that card is practically the same as the PCI-E one, just a different connection and maybe, just maybe, slightly slower, if I'm correct.

If their logic were true, then my P4 CPU along with an 8800 GTX (if an AGP version was made) would perform near identical to a dual or quad core PCI-E setup with the exact same 8800 GTX card.

Makes a lot of sense.
 
To be quite honest, if Nvidia could make a GPU work like a CPU and give it the same bandwidth, it would outrun any CPU we have. The problem is that they would have to change everything inside the chip to run x86/64 instructions, and it would perform just like a slower 900MHz CPU. But I do agree that GPUs can boost gaming performance more than a CPU can. The reason I say this is because every time I upgrade a CPU I see a 1-5 fps increase in my games, whereas when I upgrade my GPUs I can see anywhere from a 30-40 fps increase.
 
So why can't I just buy an 8800 Ultra AGP card (if there was one), slot that into my current rig, and get the same framerates as you PCI-E owners then?

The PCI-E version of the card would have slightly better performance than an AGP version...

But what the heck does AGP vs. PCI-E have to do with this thread? Did you even read what was said by the OP?
 
I don't believe that CPUs and GPUs are the only things people will be arguing or fighting about in the market. What about hard drives, bus speed, and the bit width of the bus? People forget about providing the CPU and GPU with enough bandwidth to do what they need. This includes future flash cards with super fast read/write times or something using fiber optics, PCI Express on its 24th hardware revision, and motherboard manufacturers able to keep up with hardware that supports new processors and new chipsets.

My old PIII 500MHz could not run Battlefield 2 with a GeForce 6600 like my Intel Core 2 Duo E8400 can. CPU, GPU, and hardware components are all equally important in keeping bottlenecks down.
 
Just get a QX9650 with integrated graphics vs. an E2140 with a 9800GX2 and see which one is "faster".
 
The PCI-E version of the card would have slightly better performance than an AGP version...

But what the heck does AGP vs. PCI-E have to do with this thread? Did you even read what was said by the OP?

Yes, it has everything to do with the OP. According to Nvidia, you can have the slowest CPU and an 8800 GTX and still perform as well as, if not better than, a quad core CPU with the exact same GPU card.

PCI-E is only slightly faster, plus these cards don't even use the full bandwidth of PCI-E x16 speeds.
 
My old PIII 500MHz could not run Battlefield 2 with a GeForce 6600 like my Intel Core 2 Duo E8400 can. CPU, GPU, and hardware components are all equally important in keeping bottlenecks down.

EXACTLY. Some people here are blind to the fact that the CPU, along with other things such as bus speeds, is just as important as the GPU card for a game. Yes, the GPU is the main difference, but if your CPU and mobo can't keep up, then what's the use? Why do you think Nvidia has stopped making AGP cards? Because AGP systems just can't cope with the speeds of an 8800 GTX.
 
Yes, it has everything to do with the OP. According to Nvidia, you can have the slowest CPU and an 8800 GTX and still perform as well as, if not better than, a quad core CPU with the exact same GPU card.

PCI-E is only slightly faster, plus these cards don't even use the full bandwidth of PCI-E x16 speeds.

That's the problem. nVidia didn't say that the CPU doesn't matter.

What they did say was that a high-end GPU and low-end CPU combo will perform better in games than a high-end CPU with low-end GPU.

In literal terms, what you're saying is that a great graphics card in your Northwood-based system would not run as well as the same card in a current quad-core system. That is true. However, if you took a GeForce 5700 (a contemporary of your CPU) and slammed it in a quad-core machine, it wouldn't perform much, if at all, better than a 5700 in a Northwood machine.

It's a well-known fact that for gaming ONLY, the gfx card is the single most important piece. It's not even close. You could manage to bottleneck a good card with crappy supporting hardware, but you can't manage to get good performance without a good graphics card.
 
My old PIII 500MHz could not run Battlefield 2 with a GeForce 6600 like my Intel Core 2 Duo E8400 can. CPU, GPU, and hardware components are all equally important in keeping bottlenecks down.

You're right that anything can bottleneck a system. Beyond that you are taking it to an extreme.

P3 500? What's a contemporary GPU of that, a GeForce 256? Or is that even too new?

If you were running a current CPU/mobo with a TNT2 or a GF256 and then stepped up to a modern mid-range graphics card, it would be a bigger difference in games than even what you saw. That's essentially the mirror image of your upgrade. In your case, it was obvious what you had to do to remove your bottleneck and you did the right thing, but that is a unique circumstance that doesn't prove the rule but rather defines the limits.
 
I don't believe that CPUs and GPUs are the only things people will be arguing or fighting about in the market. What about hard drives, bus speed, and the bit width of the bus? People forget about providing the CPU and GPU with enough bandwidth to do what they need. This includes future flash cards with super fast read/write times or something using fiber optics, PCI Express on its 24th hardware revision, and motherboard manufacturers able to keep up with hardware that supports new processors and new chipsets.

My old PIII 500MHz could not run Battlefield 2 with a GeForce 6600 like my Intel Core 2 Duo E8400 can. CPU, GPU, and hardware components are all equally important in keeping bottlenecks down.

EXACTLY. Some people here are blind to the fact that the CPU, along with other things such as bus speeds, is just as important as the GPU card for a game. Yes, the GPU is the main difference, but if your CPU and mobo can't keep up, then what's the use? Why do you think Nvidia has stopped making AGP cards? Because AGP systems just can't cope with the speeds of an 8800 GTX.

Nvidia probably meant a slower CPU within the same architecture/generation. You can't take the statement "if you have a low-end CPU and a high-end graphics card you will play better than with high-end quad-core CPU and cheap graphics" and apply it to a CPU and GPU from different generations.

Of course a Celeron D 2.53GHz paired with an 8800 GTX is no good for modern gaming, but can you say a Q9650 paired with an 8500 GT will perform better than an E8200 paired with an 8800 GTX? My guess is the E8200 system will fare better in 99% of the games available.

Just get a QX9650 with integrated graphics vs. an E2140 with a 9800GX2 and see which one is "faster".

Exactly what I'm trying to say.
 
Can't you see? Nvidia is ruffling its feathers at Intel; in two years they're going to be competing for the same customer base that is arguing over this now. And NV wants to get in all the punches it can throw between now and then.

Technically they are right, though: GPU > CPU. But time will tell.
 
So in the future, could we see an Nvidia board that has a built-in CPU that also acts as a GPU? If Nvidia can come up with a quad core CPU that can be used as a GPU as well, then Intel is in trouble.
 
So in the future, could we see an Nvidia board that has a built-in CPU that also acts as a GPU? If Nvidia can come up with a quad core CPU that can be used as a GPU as well, then Intel is in trouble.

I don't think that would happen.
There are a whole boatload of instruction sets that Nvidia would have to license from Intel and AMD to build a CPU. And since Nvidia doesn't want to play nice with the SLI drivers and instruction sets, Intel and AMD would just tell Nvidia to go play in another sandbox.
 
I thought everyone knew that the GPU is much more important than the CPU for gaming.
 
I thought everyone knew that the GPU is much more important than the CPU for gaming.

You'd be surprised. ;) This just reminds me of some of my friends who think they know what they're talking about, but they really are novices. I had two at the same time who built themselves new rigs. Both got the exact same parts: a C2Q 2.4GHz paired with an 8600 GTS. I told them not to plan on playing anything more demanding than COD4 at medium settings unless they wanted under 30fps.
 
EXACTLY. Some people here are blind to the fact that the CPU, along with other things such as bus speeds, is just as important as the GPU card for a game. Yes, the GPU is the main difference, but if your CPU and mobo can't keep up, then what's the use? Why do you think Nvidia has stopped making AGP cards? Because AGP systems just can't cope with the speeds of an 8800 GTX.

No one is saying that everything else doesn't matter. The point is, the GPU makes by far the biggest difference of any component in a PC.

The difference between a 1.8GHz Core 2 Duo and a 3GHz Core 2 Quad is tiny in comparison to the difference between an 8400GS and an 8800GTX. Even worse, most PCs ship with Intel GMA graphics, which are totally blown away by even an 8400GS.
 
I wonder if this is pre-posturing due to nVidia fearing they will be stuck with crippled Nehalem licenses or being forced into AMD chipsets only.

You don't NEED that super high end Nehalem! You can get by with low end Nehalem or AMD processors! (Which is true, but marketing is important, too).

Conversely, it could be as innocent as "don't buy that extreme edition, spend more on US instead!".
 
I agree with Nvidia. I have an E2140 with only a 3850, and it runs Crysis with some settings on high, most on medium, though I do tolerate lower FPS in some situations. The E2140 is overclocked, but I can set it to stock and it games just as well.

If Nvidia's argument stays within just gaming and 3D applications, then it's right, but if you open it up beyond that, then they are quite wrong.
 
Hmm. Let's see. Oblivion, with a 7800GT.

My Athlon 64 3200, compared to my Q6600. Yep, same goddamn frame rate.

Upgraded to an 8800GTS... whoops, framerate doubled!

nVidia ++
 
I wish games utilized multi-core CPUs better. I hate having a quad core CPU sitting there at 25% load while gaming. I wish games would use that power for physics and AI; it's sitting right there, it just needs to be used in new ways. Right now only a select few games leverage the power of multiple CPU cores. Supreme Commander is a good example of a game that uses it well.

GPUs and CPUs are powerful, we need game developers to start using them to their full potential.
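For what it's worth, here's a rough Python sketch of the kind of split I mean (the function names and tick bodies are made up placeholders, not any engine's real code): the main loop keeps rendering on one core while physics and AI tick on worker processes, i.e. the spare cores.

Code:
from concurrent.futures import ProcessPoolExecutor

def physics_tick(state):
    return state + 1   # placeholder for a real simulation step

def ai_tick(state):
    return state * 2   # placeholder for pathfinding/decision logic

def game_loop(frames=3):
    physics_state, ai_state = 0, 1
    with ProcessPoolExecutor(max_workers=2) as pool:
        for _ in range(frames):
            # Kick physics and AI off to spare cores...
            phys = pool.submit(physics_tick, physics_state)
            ai = pool.submit(ai_tick, ai_state)
            # ...while this core would be rendering the previous frame.
            physics_state, ai_state = phys.result(), ai.result()
    return physics_state, ai_state

if __name__ == "__main__":
    print(game_loop())  # -> (3, 8)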
 