BigBallinGPR
Limp Gawd
- Joined
- Oct 7, 2009
- Messages
- 232
It's all in the die sizes. Just look at the similarities between this battle and the Intel/AMD battle. Who is winning that one? Intel. Why? Smaller die sizes = more CPUs per wafer = more CPUs for the consumer. Who is seemingly winning the ATI/Nvidia battle? ATI at the moment. Why? Again, because of its ability to shrink its GPU and produce more GPUs per wafer.
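The "smaller die = more chips per wafer" point can be roughly quantified with the standard dies-per-wafer approximation (gross dies, before yield losses; the wafer diameter and die areas below are illustrative numbers, not figures from any specific ATI or Nvidia part):

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Estimate gross dies per wafer using the common approximation:
    (wafer area / die area) minus a correction for partial dies lost
    along the circular wafer edge."""
    radius = wafer_diameter_mm / 2
    wafer_area = math.pi * radius ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return math.floor(wafer_area / die_area_mm2 - edge_loss)

# On a 300 mm wafer, a 100 mm^2 die vs. a 500 mm^2 die:
print(dies_per_wafer(300, 100))  # 640
print(dies_per_wafer(300, 500))  # 111
```

A 5x smaller die yields nearly 6x as many candidate chips per wafer here, because the edge-loss penalty also shrinks with die size, and that's before defect yield, which punishes large dies even harder.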
Nvidia needs to step up its engineering a little bit and get with the program. Before we know it (granted, it will be at least 5 years), CPUs and GPUs will be using carbon nanotubes as transistors, and the resulting die sizes will be so small that the number of processors per wafer will be through the roof.
Charlie has the basics right, and while he is embellishing a lot (for God's sake, he worked for the UK Inquirer, it comes with the territory), there is quite a bit of truth to his article. Unless Nvidia starts shrinking its GPUs and coming out with logic that will enable this, I don't see them lasting all that long, especially with the upcoming introduction of Larrabee-based GPUs from Intel, who are already quite a few steps ahead of Nvidia with their advanced logic and small die sizes.