vick1000
The 4870X2 has fixed most of these issues: microstuttering and many of the other multi-GPU problems have been addressed. This is the assumption I don't understand: many people take it for granted that the technology behind the X2, and the direction AMD is heading, is the same as what nVidia has done. It's not.
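For anyone who hasn't seen microstuttering explained: under AFR the two GPUs can deliver frames unevenly, so the average FPS looks fine while the frame-to-frame intervals alternate short/long. A minimal Python sketch, with invented frame times just to show the effect:

```python
# Rough illustration of AFR microstuttering (numbers are made up).
# Two GPUs alternate frames; the second GPU lags the first, so the
# intervals between frames alternate even though average FPS looks fine.

frame_times_ms = [5, 28, 5, 28, 5, 28, 5, 28]  # intervals between displayed frames

avg_interval = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000 / avg_interval
worst_fps = 1000 / max(frame_times_ms)

print(f"average interval: {avg_interval:.1f} ms (~{avg_fps:.0f} FPS)")
print(f"but intervals swing from {min(frame_times_ms)} to {max(frame_times_ms)} ms,")
print(f"so motion feels closer to the worst case (~{worst_fps:.0f} FPS) than the average.")
```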
Ever think about why AMD chose GDDR5 over GDDR3? GDDR5 puts in place the means to overcome a huge hurdle in multi-GPU solutions: a shared frame buffer. That wasn't enabled in the preview; my hope is that it will be in place for the release.
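To put numbers on why a shared frame buffer matters: in today's AFR setups each GPU keeps its own full copy of every texture and render target, so memory doesn't add up. A back-of-the-envelope sketch, assuming a hypothetical 2x1GB card:

```python
# Back-of-the-envelope: duplicated vs. shared frame buffer
# (hypothetical 2x1GB dual-GPU card, for illustration only).

per_gpu_mem_gb = 1.0
num_gpus = 2

# Classic CrossFire/SLI: every GPU holds a full copy of all assets,
# so the usable pool is only one GPU's worth of memory.
effective_duplicated = per_gpu_mem_gb

# A truly shared frame buffer would let both GPUs draw from one pool.
effective_shared = per_gpu_mem_gb * num_gpus

print(f"duplicated: {effective_duplicated} GB usable of {per_gpu_mem_gb * num_gpus} GB on the card")
print(f"shared:     {effective_shared} GB usable")
```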
The fact about AMD is that they are designing their cards to be multi-GPU capable and to scale together. nVidia isn't. They make a card that you can run in AFR, but they aren't trying to go the route AMD is. That isn't bad, and I'm not saying one is better than the other. What I am saying is this: the route AMD is taking with its multi-GPU technology is not the same as nVidia's. Because of that, you cannot say 'AMD loses because it's two dies'. AMD is betting on their multi-GPU strategy, and it's working! I don't understand how people can bash ATi for going this route and think ATi has done something 'wrong'.
I do think the X2 and a GT200b will be closely competitive, but ruling out the X2 just because it's 'multi-GPU' is wrong.
When they start putting multiple GPUs on a single package, it may mean something. As it is, the R700 is still just Xfire on a single card, a few tweaks aside, and power draw and heat will reflect that. A 55nm or 40nm GT200 part would undoubtedly run cooler and draw less power while performing similarly.
What makes the multi-GPU strategy a negative is that it reduces the requirement for R&D on new architectures and stagnates new silicon growth. RV770 is not much different from RV670, just a few more SPs and some die tweaks, while GT200 is a whole new animal (albeit an inefficient beast), as I hope GT300 will be as well (just not a beast).