AuDioFreaK39
Limp Gawd
- Joined
- Jan 10, 2005
- Messages
- 475
Fudzilla posted an article about a week ago that caught my attention:
http://www.fudzilla.com/index.php?optio p;Itemid=1
"Nvidia is publicly claiming that the GPU is better and smarter than the dull CPU. CPUs are boring and Nvidia pushes the message that if you have a low-end CPU and a high-end graphics card you will play better than with high-end quad-core CPU and cheap graphics.
We agree with Nvidia, as a GPU upgrade usually means better gaming performance; but we believe that an open confrontation with Intel is the last thing that Nvidia needs at the moment. The company is arrogant, but not as arrogant as Intel, and history teaches us that Intel finds a way to penalize the bad boys.
The future just got more interesting."
-----------------------------------
Although I'd love to see the GPU win this controversy, I have a premonition that it will end up the underdog (much like HD DVD), though it won't be cut out of the computing process entirely - it will just serve a different purpose. My guess is that the multicore CPUs after Larrabee will have around 32 to 64 cores, with a section of them (a quarter or so) dedicated solely to ray tracing in parallel, directly interlinked with the main logic cores. The GPU would then be the device in the system that "refines" the CPU's ray-traced output, adding depth of perception and "realization". So in theory, the computing system would work something like the human mind, only in reverse order:
CPU (ray traces the image using parallel processing and sends it on) >> GPU (which renders the image "realistically" by adding detail, textures, non-physics effects, and "perception" in a sense)
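To make the first half of that pipeline concrete, here's a toy sketch of the "CPU ray tracing" stage, with scanlines farmed out across a pool of workers standing in for dedicated ray-tracing cores. The one-sphere scene, the pixel counts, and the thread pool are purely my own illustration - nothing Nvidia or Intel has actually shown:

```python
# Toy illustration: the "CPU ray tracing" stage of the speculated
# pipeline, with scanlines split across a pool of workers.
import math
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 16, 16
SPHERE_CENTER = (0.0, 0.0, -3.0)   # hypothetical scene: a single sphere
SPHERE_RADIUS = 1.0

def trace_ray(x, y):
    """Return 1 if the camera ray through pixel (x, y) hits the sphere."""
    # Camera at the origin looking down -z; pixel mapped to [-1, 1].
    dx = (x + 0.5) / WIDTH * 2.0 - 1.0
    dy = 1.0 - (y + 0.5) / HEIGHT * 2.0
    dz = -1.0
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    # Ray-sphere intersection: t^2 + 2bt + c = 0, with (ox, oy, oz)
    # the ray origin relative to the sphere center.
    ox = -SPHERE_CENTER[0]
    oy = -SPHERE_CENTER[1]
    oz = -SPHERE_CENTER[2]
    b = ox * dx + oy * dy + oz * dz
    c = ox * ox + oy * oy + oz * oz - SPHERE_RADIUS ** 2
    disc = b * b - c
    return 1 if disc >= 0.0 and -b - math.sqrt(disc) > 0.0 else 0

def trace_row(y):
    """One scanline - the unit of work handed to each 'ray-tracing core'."""
    return [trace_ray(x, y) for x in range(WIDTH)]

# Rows are traced in parallel; real hardware would use dedicated cores,
# not Python threads, but the partitioning idea is the same.
with ThreadPoolExecutor(max_workers=4) as pool:
    image = list(pool.map(trace_row, range(HEIGHT)))

hits = sum(sum(row) for row in image)
# This finished hit/depth buffer is what would then be handed off to the
# GPU for the "refinement" pass (texturing, effects, perception).
print(f"{hits} of {WIDTH * HEIGHT} rays hit the sphere")
```

The point of the sketch is just the division of labor: the ray intersection math is embarrassingly parallel per pixel, which is why dedicating a block of CPU cores to it isn't a crazy idea.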
Would anyone agree with this, or am I just going crazy? Can't really tell 0_O
![nvidia.gif](http://www.fudzilla.com/images/stories/Logos/nvidia.gif)
![intel_logo_new.gif](http://www.fudzilla.com/images/stories/Logos/intel_logo_new.gif)