At low resolution or with no eye candy, I'd agree. Truth is, other than Crysis, nothing needs more than an 8800-series card. Better cards would spur more demanding games, though.
Guessing, since the G92 is 256-bit.
Why are people saying it will have a 256-bit bus?
And G80 is 384-, 320-, and 128-bit (and 64?).
Because of the quoted RAM amount on the 'card': 1GB.
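Rough sketch of the chip-count arithmetic behind that guess; the chip density and width here are assumptions that happen to fit the G80-era boards, not anything confirmed:

# Assumed: 512Mbit (64MB) GDDR chips with 32-bit interfaces each.
CHIP_MB = 64
CHIP_BITS = 32

def bus_width_bits(mb_per_gpu):
    # One 32-bit channel per chip, so bus width scales with chip count.
    return (mb_per_gpu // CHIP_MB) * CHIP_BITS

print(bus_width_bits(768))  # 384-bit (8800 GTX)
print(bus_width_bits(640))  # 320-bit (8800 GTS)
print(bus_width_bits(512))  # 256-bit (1GB card split across two GPUs)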
I wonder if it'll use GDDR3 or 4?
This is based on the G92, though, which so far has been 256-bit.
And I don't think the low-end 8x00 parts were based on the G80.
That thing is friggen HUGE!!!!!
Completely retarded. This isn't what anyone really wants, NVIDIA.
I like their products and all, but this is stopgap bullshit.
So why not a 512-bit bus then?
Well, there were still 384-bit (GTX/Ultra) and 320-bit (GTS).
Also, it might use GDDR5, since Samsung (?) is producing that now. It's supposed to be a lot better, from what I recall of a news post a few months ago, so maybe it would help make up for the smaller bus if there is one.
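If you want to sanity-check the "faster memory vs. narrower bus" trade-off, bandwidth is just bus width times per-pin data rate. The rates below are illustrative guesses, not announced specs:

def bandwidth_gbs(bus_bits, gbps_per_pin):
    # GB/s = (bus width in bits / 8) * per-pin data rate in Gbps
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(384, 1.8))  # ~86 GB/s: 384-bit GDDR3, GTX-like
print(bandwidth_gbs(256, 2.0))  # ~64 GB/s: 256-bit GDDR3/4 falls short
print(bandwidth_gbs(256, 3.6))  # ~115 GB/s: 256-bit at GDDR5-class rates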
Something seems off about Kyle's info here. No offence, Kyle, but this is a really weak offering, even from the undisputed high-end champ at the moment. If NVIDIA isn't interested in jacking up the bar until ATI gets a competitive part, then why not just stay with the Ultra, or release a single-PCB G92 variant that slightly beats the Ultra? It would seem that would be cheaper than this... thing...
The GTX/Ultra and old GTS are based on the old G80, not the new die-shrink G92, so I'm not at all sure what your point is.
My point was that there are multiple bus sizes for the G80 cards, so it's not impossible to have different bus sizes for the G92.
Steve said: "If two GeForce 8800 GPUs on a dual-PCB, single video card doesn't make your heart flutter, the fact that these will most likely support quad-SLI surely will."

Maybe not my heart, but at least my power supply. Three Ultras draw 800W+ from the wall at peak... this will be a lower power draw per PCB than an Ultra, but there are still four of them. I wouldn't be surprised to see this hit 900W of power draw.
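Back-of-envelope on the wall draw; every figure here is a guess, picked only to be roughly consistent with the numbers above:

PSU_EFF = 0.80    # assumed PSU efficiency at load
BASE_W = 200      # assumed rest-of-system draw
ULTRA_W = 155     # assumed per-Ultra board draw
GX2_PCB_W = 130   # assumed per-PCB draw, lower than an Ultra

def wall_draw(boards, per_board_w):
    # DC load divided by efficiency gives AC draw at the wall.
    return (BASE_W + boards * per_board_w) / PSU_EFF

print(round(wall_draw(3, ULTRA_W)))    # ~830W: three Ultras, the "800W+" case
print(round(wall_draw(4, GX2_PCB_W)))  # ~900W: a quad-SLI GX2 guess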
I realize a lot of people are disappointed by GX2 revisions, but does anyone think this is where graphics processing is going?
The average high-end gamer will not accept 400W+ of power draw to go SLI or CF just to play the latest games. True enthusiasts will find it acceptable, but they make up a very small percentage of potential buyers.