All GPUs suck

Stinkfist

So I've given a lot of thought to Conroe being released and not really showing any improvement (or minimal at best) in performance at higher resolutions. From what I've read so far, it seems like Conroe has approximately a 350MHz head start on an A64 CPU. The problem I have goes back to the [H] article that showed you basically need a 3.2GHz P4 or PD, a 2.4GHz A64, or roughly a 2GHz Conroe to get the most out of the current high-end video card solutions. Why the huge freaking bottleneck here? Not to mention the fact that on platforms like the Intel 965 (currently, anyhow), SLI in either dual-card or single-card configurations is NOT EVEN SUPPORTED. I just find it irritating that I have to pair at LEAST a $500 GPU solution with my $300 CPU to get the most out of my computer, let alone spend any more (or overclock, for that matter) on my CPU. WTF would I need that FX-62/64 or X6800 for again?
 
Yes, GPUs suck, but without one you couldn't have typed that message! ;)
 
R1ckCa1n said:
Yes, GPUs suck, but without one you couldn't have typed that message! ;)

Don't remind me; it just pisses me off more. To be honest, there's probably more competition between ATI and Nvidia than there is between AMD and Intel, so why the hell can't the GPU side keep up with the CPU side? At the very least, why the hell the premium on performance?
 
Stinkfist said:
Don't remind me; it just pisses me off more. To be honest, there's probably more competition between ATI and Nvidia than there is between AMD and Intel, so why the hell can't the GPU side keep up with the CPU side? At the very least, why the hell the premium on performance?
Probably because MOST people who game would be happy with a 7900GT, since it delivers playable fps even above 1600x1200 in most games. There are still a lot of people out there who are happy gaming on a 19-inch LCD at 1280x1024, so anything above a 7900GT might be overkill for them anyway. It's nice to get a good CPU and just make a GPU upgrade every 12 to 18 months or so; that way a rig can be very useful for 3 to 4 years.
 
Stinkfist said:
So I've given a lot of thought to Conroe being released and not really showing any improvement (or minimal at best) in performance at higher resolutions. From what I've read so far, it seems like Conroe has approximately a 350MHz head start on an A64 CPU. The problem I have goes back to the [H] article that showed you basically need a 3.2GHz P4 or PD, a 2.4GHz A64, or roughly a 2GHz Conroe to get the most out of the current high-end video card solutions. Why the huge freaking bottleneck here? Not to mention the fact that on platforms like the Intel 965 (currently, anyhow), SLI in either dual-card or single-card configurations is NOT EVEN SUPPORTED.

It's because at high resolutions like that, games become very GPU limited, so a CPU upgrade means very little. In almost all cases you would EXPECT not to see a jump in 'best playable settings', simply because that phrase translates to 'image quality settings at which the game becomes excessively GPU limited'. If you read places OTHER than [H], you'd actually notice that Conroe brings a fairly significant jump in performance, and even more so in performance per watt.

WTF would I need that FX-62/64 or X6800 for again?

Consider this: MAYBE YOU DON'T NEED IT. I'm living with an X800 Pro and a 3.0GHz Northwood P4, playing games at fairly good image settings and getting a good framerate. You don't have to constantly upgrade your computer if you feel you won't get much benefit from it!
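
To put rough numbers on the "GPU limited" idea: a frame takes about as long as the slower of the CPU's work and the GPU's work, so once the GPU dominates, a faster CPU barely moves the fps. A minimal sketch in Python (the millisecond timings are invented purely for illustration, not benchmarks):

Code:
# Toy frame-pipeline model: frame rate is set by whichever of the
# CPU or GPU takes longer per frame. All timings are made up.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Say the GPU needs ~40 ms at 1600x1200 with AA/AF, and the CPU
# needs ~8 ms for game logic, AI, and physics:
print(fps(cpu_ms=8.0, gpu_ms=40.0))   # 25.0 fps
print(fps(cpu_ms=4.0, gpu_ms=40.0))   # 25.0 fps - twice the CPU, no gain
print(fps(cpu_ms=8.0, gpu_ms=20.0))   # 50.0 fps - twice the GPU, double the fps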
 
GPUs have become so advanced in the last year, going from 8 pipes to 32 pipes in just two refreshes...

The mere thought of calling a 7900GTX or X1900XTX a weak piece of garbage rather annoys me. CPUs have it easy; they don't have to do a lot, just AI and physics, and games aren't designed to push physics to a great extent, so the physics are very easy for CPUs to handle. CPUs just don't have a hard job to fulfill. GPUs, on the other hand, have some of the hardest calculations to make, from lighting to textures, with RAM throughput roughly quadrupling in the last year. I just find this whole thing utterly annoying.

GPUs are the most advanced processor/memory cards on the planet. Live with it.
 
Cypher19 said:
It's because at high resolutions like that, games become very GPU limited, so a CPU upgrade means very little. In almost all cases you would EXPECT not to see a jump in 'best playable settings', simply because that phrase translates to 'image quality settings at which the game becomes excessively GPU limited'. If you read places OTHER than [H], you'd actually notice that Conroe brings a fairly significant jump in performance, and even more so in performance per watt.

Regardless, only one Conroe is incapable of providing more than the GPU can keep up with; four AM2 processors are more than enough (the A64 3800+, the X2 4600+ and 5000+, and the FX-62), as are three Pentium Ds (the 940, 950, and 960, including the EE), ALL AT STOCK SPEEDS. That kinda makes overclocking or buying a faster CPU for gaming pointless, which is the point I'm trying to make. As it is now, Conroe is pointless for gaming, when it really should be shelling out an ass-whoopin' to everything out there.
 
Cypher19 said:
Consider this: MAYBE YOU DON'T NEED IT. I'm living with an X800 Pro and a 3.0GHz Northwood P4, playing games at fairly good image settings and getting a good framerate. You don't have to constantly upgrade your computer if you feel you won't get much benefit from it!

By this line of thinking, nobody needs Conroe for gaming.
 
Because CPUs aren't required to do much at all in contrast to GPUs. Notice how low the fps are in 3DMark05 when the CPU is rendering compared to the GPU? The GPU is just a powerhouse, but some people ask too much of it. I'm on a 7600GT right now, and the games I play competitively (UT2004 and CS:S) run more than fast enough with everything on high and 16x AF on my 19" LCD. A while ago 800x600 was the standard res and 30 fps was standard; now both the resolution and the fps have more than doubled.
 
CleanSlate said:
GPUs have become so advanced in the last year, going from 8 pipes to 32 pipes in just two refreshes...

The mere thought of calling a 7900GTX or X1900XTX a weak piece of garbage rather annoys me. CPUs have it easy; they don't have to do a lot, just AI and physics, and games aren't designed to push physics to a great extent, so the physics are very easy for CPUs to handle. CPUs just don't have a hard job to fulfill. GPUs, on the other hand, have some of the hardest calculations to make, from lighting to textures, with RAM throughput roughly quadrupling in the last year. I just find this whole thing utterly annoying.

GPUs are the most advanced processor/memory cards on the planet. Live with it.

QFT. Good post.
 
What I don't get is people going to such great lengths to squeeze a bit more out of their CPU by overclocking, spending hours lapping and watercooling and all that stuff, when none of it really translates to a significant fps gain.
 
Big Fat Duck said:
What I don't get is people going to such great lengths to squeeze a bit more out of their CPU by overclocking, spending hours lapping and watercooling and all that stuff, when none of it really translates to a significant fps gain.
Folding, crunching video, encoding MP3s, decoding hi-def, zipping files, or other tasks you can do on a PC, perhaps? :p
 
Big Fat Duck said:
What I don't get is people going to such great lengths to squeeze a bit more out of their CPU by overclocking, spending hours lapping and watercooling and all that stuff, when none of it really translates to a significant fps gain.

Most people use there cpu(s) for tasks other than Gaming....
 
Firebat said:
Most people use there cpu(s) for tasks other than Gaming....

Learn how to grammaratize; it's obvious that you meant "they're" instead of "there"... geebus christopherous
 
lol, it's 7:30am and I haven't slept, excuse my poorly executed joke.

I was kidding about the they're and grammaratize :D
 
Stinkfist said:
you basically need a 3.2GHz P4 or PD, a 2.4GHz A64, or roughly a 2GHz Conroe to get the most out of the current high-end video card solutions. Why the huge freaking bottleneck here?

The reason is by design... a CPU cannot process graphics as fast as a dedicated graphics card, which is why we abandoned asking the CPU to process graphics. Therefore frame rates are graphics card limited. It's that simple.

Don't believe me? Do you remember the days of the 386/486/P5, before hardware graphics acceleration? If you don't, that's OK, but GPUs did not always exist. CPUs were too slow (and still are too slow) to handle graphics.

Now... if you were to ask the CPU to do a lot of physics calculations and made the graphics rendering dependent on that physics data, you would be CPU limited really quickly. Conroe won't even help you there.
 
Another thing that needs to be pointed out is that no other piece of computer hardware has the constant throughput that a GPU does.

If you are running at a 75Hz refresh rate, your GPU is rendering every pixel on your screen 75 times a second... yup, even right now while you read this! So if you are at 1024x768, your GPU is processing 58,982,400 pixels every second. Now imagine running 1280x1024 at 85Hz: 111,411,200 pixels per second. ;)

Now suppose you're gamin'... Not only does the GPU have to display the pixels, it has to work out the result on a per-pixel basis: AA, AF, texturing, lighting, etc. The GPU indeed does not have an easy task. Also, for those who don't know, calculating lighting is among the most computationally intensive things a machine can do. Radiosity, reflection, and refraction are not computed easily.

Now, after that oversimplified explanation, what else in your machine processes as much data as your GPU, and is pushed as far?
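
For anyone who wants to verify those figures, it's just width x height x refresh rate. A quick sketch (using only the display modes from the post above):

Code:
# Raw pixel throughput for a given display mode.
def pixels_per_second(width, height, refresh_hz):
    return width * height * refresh_hz

print(pixels_per_second(1024, 768, 75))    # 58982400
print(pixels_per_second(1280, 1024, 85))   # 111411200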
 
Stinkfist said:
By this line of thinking, nobody needs Conroe for gaming.

By my line of thinking we don't need electricity, air conditioning, plumbing, sewage or commercialism to live.

Choose your words more wisely.
 
MGCJerry said:
Another thing that needs to be pointed out is that no other piece of computer hardware has the constant throughput that a GPU does.

If you are running at a 75Hz refresh rate, your GPU is rendering every pixel on your screen 75 times a second... yup, even right now while you read this! So if you are at 1024x768, your GPU is processing 58,982,400 pixels every second. Now imagine running 1280x1024 at 85Hz: 111,411,200 pixels per second. ;)

Now suppose you're gamin'... Not only does the GPU have to display the pixels, it has to work out the result on a per-pixel basis: AA, AF, texturing, lighting, etc. The GPU indeed does not have an easy task. Also, for those who don't know, calculating lighting is among the most computationally intensive things a machine can do. Radiosity, reflection, and refraction are not computed easily.

Now, after that oversimplified explanation, what else in your machine processes as much data as your GPU, and is pushed as far?

Would it be easier if there were a dual-core GPU?
 
It's hard to compare the two architectures, but basically, it's because GPUs are doing way more. CPUs don't help with the extra polygons, textures, pixel shaders, anti-aliasing, and anisotropic filtering that we demand. The processing power required of GPUs is simply going up at a faster rate. CPUs have physics, AI, and a few other tasks to handle. Those demands do go up, but not nearly as much as on the GPU. That's why a GPU becomes obsolete much more quickly. Try letting a Conroe render FEAR at 1600x1200 with full AA and AF in software mode. You might get a frame every few minutes. :p

Seriously, even though you can't compare the two architectures directly, the GPU has made way more advances than the CPU. GPUs seem to double in power every six months to a year. It's starting to take CPUs two and a half to three years to do that now.
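
To see how fast those different doubling rates diverge, here's a small sketch (the one-year and three-year doubling periods are just the ballpark figures from this post, not measured data):

Code:
# Compound growth under different doubling periods.
def relative_perf(years, doubling_period_years):
    return 2 ** (years / doubling_period_years)

for years in (1, 3, 6):
    gpu = relative_perf(years, 1.0)  # GPUs: ~1 year per doubling
    cpu = relative_perf(years, 3.0)  # CPUs: ~3 years per doubling
    print(f"after {years} yr: GPU x{gpu:.1f}, CPU x{cpu:.1f}")
# after 1 yr: GPU x2.0, CPU x1.3
# after 3 yr: GPU x8.0, CPU x2.0
# after 6 yr: GPU x64.0, CPU x4.0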
 
sabrewolf732 said:
Notice how low the fps are in 3DMark05 when the CPU is rendering compared to the GPU?

That's not how fast the CPU can render the scene. Futuremark deliberately limits the frame rate in that test, so it is not indicative of how fast the CPU can render the scene at all.
 
Flyboat said:
Would it be easier if there were a dual-core GPU?


Yes... which is why my GPU has 16 separate pipelines, and each of those pipelines has multiple shaders as well (3 each).

If you noticed, I made a jump in equating each pipeline to an additional core, because it effectively is one, allowing multiple threads to run simultaneously. That's because, in CPU terms, a multithread-capable CPU is physically made of multiple cores mounted together.

You may have been asking about multiple GPUs instead, for which we do have SLI and CrossFire available; that is what would equate to multiple pairs of multiple cores.

Anyway... there you go :)
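
The reason pipelines count as extra "cores" is that per-pixel work is independent: no pixel waits on its neighbors, so the work splits cleanly across units. A minimal sketch of that idea (the "shader" here is a made-up brightness function, and each worker stands in for one pipeline):

Code:
# Per-pixel shading is embarrassingly parallel: every pixel can be
# computed independently, which is why adding pipelines scales so well.
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 640, 480
CHUNK = WIDTH * HEIGHT // 4  # four chunks, one per "pipeline"

def shade(i):
    x, y = i % WIDTH, i // WIDTH
    return (x * y) % 256  # toy per-pixel brightness value

def shade_chunk(chunk):
    # One chunk of pixels per worker, like one pipeline on a GPU.
    return [shade(i) for i in chunk]

if __name__ == "__main__":
    total = WIDTH * HEIGHT
    chunks = [range(s, min(s + CHUNK, total)) for s in range(0, total, CHUNK)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        frame = [p for part in pool.map(shade_chunk, chunks) for p in part]
    print(len(frame))  # 307200 shaded pixels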
 
Oh noes, we finally have things like TRSSAA and HQAF, HDR lighting, and shadow tech that would make a teenager from 1997 drool... and you're complaining because we can't stress our CPUs enough?

If you want to do that, you're perfectly welcome to; just kindly turn off the AA and AF and HDR, and I GUARANTEE the highest-end processors won't be able to squeeze ALL the performance out of modern graphics cards.

You people forget just how much of a performance hit we take to enable all the goodies, and just how shitty games looked without real-time lighting and AA/AF. Maybe you need a reminder:

http://img397.imageshack.us/img397/4929/quake04lb4.jpg


You like that? You used to need SLIed Voodoo 2s to play Quake II in such glory! You see any AA? How about AF? Lighting? Just the vertex lighting for projectiles.

Now you can do the same with onboard GMA 900. So quit complaining about GPUs; they're having trouble handling all the newfangled features you kids take for granted.
 
Gatticus said:
That's not how fast the CPU can render the scene. Futuremark deliberately limits the frame rate in that test, so it is not indicative of how fast the CPU can render the scene at all.

What? I thought it was capped at 2 fps, but most CPUs hang around 1 fps or lower.
 
Aren't there way more transistors in a modern GPU core than in a CPU core?

I mean... they are doing the best they can. Look at the highest-end cards: sold at a premium, and already clocked close to their limits...

Plus, memory doesn't scale at the same rate as processors, and memory speed is a lot more important to a GPU... memory will always be a bottleneck until memory tech can somehow catch up.
 
defaultluser said:
Oh noes, we finally have things like TRSSAA and HQAF, HDR lighting, and shadow tech that would make a teenager from 1997 drool... and you're complaining because we can't stress our CPUs enough?

If you want to do that, you're perfectly welcome to; just kindly turn off the AA and AF and HDR, and I GUARANTEE the highest-end processors won't be able to squeeze ALL the performance out of modern graphics cards.

You people forget just how much of a performance hit we take to enable all the goodies, and just how shitty games looked without real-time lighting and AA/AF. Maybe you need a reminder:

http://img397.imageshack.us/img397/4929/quake04lb4.jpg

You like that? You used to need SLIed Voodoo 2s to play Quake II in such glory! You see any AA? How about AF? Lighting? Just the vertex lighting for projectiles.

Now you can do the same with onboard GMA 900. So quit complaining about GPUs; they're having trouble handling all the newfangled features you kids take for granted.

Excellent post and a good reminder to the kiddies of how good they have it today. BTW... I don't remember my Quake II even looking that good! :p
 
I agree, all GPUs SUCK! Let's all go throw our GPUs into the street since they suck so much!
 
Stinkfist said:
So I've given a lot of thought to Conroe being released and not really showing any improvement (or minimal at best) in performance at higher resolutions. From what I've read so far, it seems like Conroe has approximately a 350MHz head start on an A64 CPU. The problem I have goes back to the [H] article that showed you basically need a 3.2GHz P4 or PD, a 2.4GHz A64, or roughly a 2GHz Conroe to get the most out of the current high-end video card solutions. Why the huge freaking bottleneck here? Not to mention the fact that on platforms like the Intel 965 (currently, anyhow), SLI in either dual-card or single-card configurations is NOT EVEN SUPPORTED. I just find it irritating that I have to pair at LEAST a $500 GPU solution with my $300 CPU to get the most out of my computer, let alone spend any more (or overclock, for that matter) on my CPU. WTF would I need that FX-62/64 or X6800 for again?
Because developers code to take advantage of graphics effects more than CPUs; CPU development takes longer and performance increases don't happen as fast, so the market won't program games that demand too much CPU power.

GPUs get a new generation somewhere every 1.0 to 1.5 years, so since GPU performance rises much quicker, developers will code to utilize as much GPU performance as possible, knowing full well that more is on the way.
 
Darakian said:
I agree, all GPUs SUCK! Let's all go throw our GPUs into the street since they suck so much!

No, that isn't going far enough!

We need to organize a lobbying group and get Congress to legislate against these evil GPUs!

Then we can have ourselves a grand old-fashioned GPU-burnin' in the square!
 
To the person who started this topic: if you start a topic like this in the VGA forum, you're asking for trouble. Maybe I can relate to your frustration, since I too want all the settings on high at the highest possible resolution with 60-plus fps in the newest games available. But to want to do so on anything other than the premium hardware this market currently has to offer is simply asking too much. And please understand that GPU technology is one of the fastest-progressing technologies on Earth right now (I don't know about the other planets, though, so on your planet it may be progressing more slowly), so I have high hopes for what the next few years can bring for GPUs and gaming, seeing how gaming technology has advanced at an increasing rate over the last 10 years. Who knows how much better GPUs will be (when it comes to handling games) in a couple of years, given the recent advancements. Don't be blind; open your eyes and realise that GPUs have come a LONG way.
 