9800 GTX vs. SLI vs. 3-way SLI vs. Quad SLI @ [H]

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
55,635
9800 GTX vs. SLI vs. 3-way SLI vs. Quad SLI - How does a 9800 GTX compare to SLI, 3-way SLI, and Quad SLI? We have completed an SLI scaling evaluation using our real world gameplay techniques as well as apples-to-apples testing to see what SLI, 3-way SLI, and Quad SLI can do to improve your gameplay experience.
At this time there just aren’t enough advantages past GeForce 9800 GTX SLI to recommend 9800 GTX 3-way SLI. If you are looking for the absolute highest framerates, 3-way SLI will provide this, but it will not allow you to improve your gameplay experience in some games, so you are paying extra money for nothing unless you want the performance for canned benchmarks. GeForce 9800 GTX SLI seems to be rather efficient for high-end gaming at this time, with Crysis being the exception. While Assassin’s Creed was just released, and is a beautiful game, it ended up playing very well even on a single GeForce 9800 GTX with the highest in-game settings. GeForce 9800 GTX SLI will allow you to run it at 2560x1600, but 3-way SLI will not improve upon this.
 
Awesome article. I still find it amazing that in 3way sli, those GTX's get enough air to stay cool enough to operate, and not down-throttle or artifact...
 
Another fine piece of work; thanks Brent & Kyle :)

Yes, clearly the choices made by nVidia regarding memory performance on the 9800 series are having quite an impact, as ably demonstrated by yet another fine piece of [H] investigation.
 
Nice article.
It looks like the 8800 GTX is still "the" card to me.
I've had triple SLi with the 8800 GTX for a while now and I really think it offers some good gameplay.
I've never regretted it once. Some games benefit and I'm sure some don't.

The straight comparison using the same cards was very well done. (as usual)
 
This article proves what I tell everyone who asks... anything beyond 2 way SLI is useless, overkill and requires Windows ME 2.0 (Vista ;))

I agree with magoo... the 8800GTX is still the card to have... I've had a pair in SLI since two weeks after launch and they're absolutely amazing. The only game I can't fully max out (other than AA settings and such) is Crysis.
 
To summarize the GeForce 9800 GTX, it is a 65nm GPU consisting of 128 stream processors clocked at 1.688MHz with a.......

DAAAAAAAAAUM these things have gotten up to 1.688MHz!!!!!!!! ;)

I think that would produce... What, a whole 4 triangles per second!!!! ;)
 
People still get caught in the scheme of having 4 GPUs in a single machine.

Now I'm wondering how 8800GTX 3-Way SLI would work.

Nice read. Shame I don't have the money for anything but a single card, lol.

~Ibrahim~
 
People still get caught in the scheme of having 4 GPUs in a single machine.

Now I'm wondering how 8800GTX 3-Way SLI would work.

Nice read. Shame I don't have the money for anything but a single card, lol.

~Ibrahim~

I can tell you from having run it that 8800GTX 3-Way SLI doesn't provide much over 8800GTX SLI.

9800GX2 quad SLI is somewhat disappointing. I agree with the article that these cards have great shader performance, but their 256bit bus and 512MB of memory just limit their maximum performance. This becomes very apparent at 2560x1600. I've found this to be the case on my own system.
 
Can someone please tell me WTF nVidia was thinking with such limited memory bandwidth on these cards? I surely hope the 89xx series, when and if it's released, will at least fix this (1 GB of memory plus a 512 bit bus, anyone?)
 
Of course it is arguable that if these expensive multi-GPU solutions are not being engineered for those gamers that can afford large resolution displays, just who are they being designed for?
+1 ...one does tend to wonder.

That's especially true for those of us who have had a Dell 30" for a while - it didn't seem like cheaping out when the Apple Cinema Display was the only alternative, but it leaves you to choose between 2560x1600 or an unimpressive 1280x800 :(

Well, I hear good things about nvidia cards being able to render at lower res and scale on-card. I'll have to see if I can figure out how to render at 1920x1200 (the apparent sweet-spot with the memory on cards these days), and scale to 2560x1600. I haven't a clue where to look.
 
People still get caught in the scheme of having 4 GPUs in a single machine.

Now I'm wondering how 8800GTX 3-Way SLI would work.

Nice read. Shame I don't have the money for anything but a single card, lol.

~Ibrahim~

8800 GTX 3-Way SLI works just fine :D. At least for me.
I happen to disagree with Dan; I saw dramatic increases in IQ and FPS in Crysis.
I haven't used anything but 3-Way SLI for several months (since Crysis came out) and it has been flawlessly implemented by nvidia.
I played Crysis through twice (plain SLI and 3Way SLI) and then put it on the shelf, but every game I play looks sweet. I won't be giving this set-up up for quite a while, unless a more demanding game comes along.
 
are there any 3rd party nvidia vendors who have done 1gb of ram on these cards? if not, why?

what was nvidia thinking when they released these cards?

nvidia - "let's trick a whole bunch of people into thinking these are better than the old (8800gtx) cards so we can reduce our operating costs and simultaneously release 'newer, faster' cards."
 
are there any 3rd party nvidia vendors who have done 1gb of ram on these cards? if not, why?

what was nvidia thinking when they released these cards?

nvidia - "let's trick a whole bunch of people into thinking these are better than the old (8800gtx) cards so we can reduce our operating costs and simultaneously release 'newer, faster' cards."

I think the 256bit bus would castrate them even if they did have more than 512MB of RAM.
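For what it's worth, the raw numbers bear this out. Here's a quick back-of-the-envelope sketch in Python (reference clocks quoted from memory, so treat the exact figures as approximate): peak theoretical bandwidth is just bus width in bytes times the effective memory clock.

```python
# Rough peak memory bandwidth comparison.
# bandwidth (GB/s) = bus width (bits) / 8 * effective memory clock (MHz) / 1000
# Clocks below are the stock reference values as I recall them, so approximate.

def bandwidth_gbs(bus_bits, effective_mhz):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz / 1000

cards = {
    "8800 GTX (384-bit, 1800 MHz effective)": (384, 1800),
    "9800 GTX (256-bit, 2200 MHz effective)": (256, 2200),
}

for name, (bus, clk) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, clk):.1f} GB/s")
# The old 384-bit card comes out ahead (~86 GB/s vs ~70 GB/s) despite
# the 9800 GTX's faster memory chips.
```

So even with its higher memory clock, the 9800 GTX ends up with noticeably less raw bandwidth than the card it replaced, which fits what we're seeing at 2560x1600 with AA.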
 
'we have increased the resolution to 2560x1600 and dropped AA so we won’t be bottlenecking the video cards.'

What's the point of uber GPUs that aren't increasing frame rate in general? I mean, my monitor is 1600x1200 native resolution. I must be part of the 'out' crowd here, as to me, a 20 inch display is big enough.
 
sigh.. I am looking at these benchmarks in disbelief. The HardOCP staff should know that GeForce 174.88 performs much smoother with 9800GX2 cards and the 175.12 drivers are more optimized for single GPU chips. With this in mind, I have reason to believe that this review should be redone with GeForce 174.88 to compensate for GeForce 175.xx driver setbacks with GX2 cards.
 
Excuse me for being picky, but where are the GX2 results?

Otherwise great article, it clears up a lot of doubts.
 
So would it be true that a single 9800 GX2 clocked to 675/1687/1100 would provide about the same performance as 9800 GTX SLI? I have my 9800 GX2 clocked to 675/1687/1050 so stock 9800 GTX SLI wouldn't provide any benefit correct?
 
I wonder how the 1GB 8800GTS from Palit would compare? Can't get 'em anymore, but on paper it's pretty competitive. It's got more framebuffer, would that alleviate the AA perf hit, or is that purely a bus issue?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814261006&Tpk=8800gts+1gb

I think it is an issue of both. The larger 384bit bus of the 8800GTX is one thing that allowed it to shine, but the other thing was the 768MB of memory. These two things were why the 8800GTX still out performed the 8800GTS 512MB at higher resolutions with AA and AF enabled. I'd bet that just fixing the bus or just adding more memory alone wouldn't do much. You need both a larger bus and more memory.

'we have increased the resolution to 2560x1600 and dropped AA so we won’t be bottlenecking the video cards.'

What's the point of uber GPUs that aren't increasing frame rate in general? I mean, my monitor is 1600x1200 native resolution. I must be part of the 'out' crowd here, as to me, a 20 inch display is big enough.

Yes, you are part of the out crowd. I use a 30" monitor and I can't imagine gaming on less than a 24" these days. Even a 22" seems small to me.

So would it be true that a single 9800 GX2 clocked to 675/1687/1100 would provide about the same performance as 9800 GTX SLI? I have my 9800 GX2 clocked to 675/1687/1050 so stock 9800 GTX SLI wouldn't provide any benefit correct?

They should be relatively close.

Most likely a bus issue. The 8800gt by palit (or was it galaxy) reviewed by [H] showed no real difference over the 512 version. The G92 GTS is not THAT much better.

I think that's because the increased memory size didn't have a large enough memory bus to feed the extra RAM.
 
This is a great comparison. And, I think it shows a lot more than gaming experiences. It's really showing that while SLI has improved, it's still terribly inefficient.

Is it possible that the 8000 or 9000 lines could have SLI reworked to not have to duplicate everything? Does Nvidia have any plans to try to get it there?
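One crude way to put a number on that inefficiency is a scaling-efficiency figure: actual speedup over a single card, divided by the number of GPUs. The FPS values below are hypothetical, just to show the arithmetic, not numbers from the review:

```python
# Scaling efficiency: how close a multi-GPU setup gets to ideal linear scaling.
# efficiency = (multi-GPU fps / single-GPU fps) / number of GPUs
# All FPS figures below are made up for illustration.

def scaling_efficiency(single_fps, multi_fps, n_gpus):
    """Fraction of ideal (linear) scaling achieved, 1.0 = perfect."""
    return (multi_fps / single_fps) / n_gpus

single = 30.0                            # hypothetical single-card FPS
configs = {2: 52.0, 3: 63.0, 4: 66.0}    # hypothetical SLI / 3-way / quad FPS

for n, fps in configs.items():
    print(f"{n}-GPU: {scaling_efficiency(single, fps, n):.0%} of ideal")
# With numbers like these, each extra GPU contributes less than the last.
```

With AFR duplicating the full frame's data into every card's memory, it's not surprising the curve flattens like this.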
 
Long time reader first time poster.

First of all, a good read! I think the review gives a rather comprehensive view of what one can expect from the different 9800 GTX SLI configurations; however, I have to agree with Stoly that a 9800 GX2 result would have been useful to see whether it performs above, equal to, or below a 9800 GTX 2-way SLI configuration.

Also, AuDioFreaK39's point on the different driver versions seems valid, though I've had no experience with SLI, so I'm not sure how large an impact the different drivers would have on benchmark results in different SLI configurations.
 
Isn't the GX2 essentially two 9800GTX's bolted together? It's also cheaper than two GTX's. That would seem to provide better value than two 9800GTX cards.

If the cards shared data instead of replicating it, wouldn't that reduce memory bandwidth further? It was this sharing of data that allowed the SDR Voodoo5 cards to provide the same bandwidth as the DDR Geforce cards of the time.

I wonder how much of a benefit 3-way and quad SLI would offer at lower resolutions. If you play Crysis at 1024x768, maybe quad SLI would offer some real benefit. The problem is that people who play at low resolutions are not likely to invest that much money into video cards. Also, at that resolution, even mid-range single-GPU cards provide good performance.

I'm still using a 19" display at 1280x1024 and find it perfectly fine. It depends on how far from the screen you sit. If you want to sit on a sofa at the opposite end of the room, you need a bigger screen, of course. These ultra-high resolutions are slowing down the pace of development of other aspects of the games - in five years, will we all be looking at the same crude blocky geometry and blurry textures, but this time at 5500x4000 with 8x AA?
 
I think it is an issue of both. The larger 384bit bus of the 8800GTX is one thing that allowed it to shine, but the other thing was the 768MB of memory. These two things were why the 8800GTX still out performed the 8800GTS 512MB at higher resolutions with AA and AF enabled. I'd bet that just fixing the bus or just adding more memory alone wouldn't do much. You need both a larger bus and more memory.

I think that the G92 compression technology works wonders and compensates for the lack of bandwidth; unfortunately, it seems it doesn't work well with AA. Most reviews indicate that the performance hit is lower when increasing the resolution than when enabling AA.
 
Think I see two 9800 GTXs in SLI under the Christmas tree this year to replace my 8800GTs in SLI and if it doesn't do what I want, I can always add a third later for tri sli.
 
Think I see two 9800 GTXs in SLI under the Christmas tree this year to replace my 8800GTs in SLI and if it doesn't do what I want, I can always add a third later for tri sli.

I don't think the 9800GTX will last till then, I'd ask Santa for the 9900GTX instead :D
 
At this time there just aren’t enough advantages past GeForce 9800 GTX SLI to recommend 9800 GTX 3-way SLI. If you are looking for the absolute highest framerates, 3-way SLI will provide this, but it will not allow you to improve your gameplay experience in some games

Yesterday I started reading some complaints on another forum about latency increases (felt as input lag) & synchronisation issues with SLI & Crossfire. I didn't absorb too much detail, but the gist of it is that AFR can have problems outputting video frames at regular intervals, i.e. odd frames may take 4x longer to update than even frames, so the effect is a stuttery, jerky frame rate even though the overall framerate may measure high enough to be considered smooth. 3-way SLI seems to be the configuration where this 'micro stuttering' occurs the most.

Is this an effect that is being observed in the [H]ardOCP testing? Do you get any sense that 'input lag' is increased on the multi GPU configurations?
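To make the micro-stutter point concrete, here's a toy Python sketch with made-up frame times: both sequences average exactly the same FPS, but one alternates fast and slow frames the way AFR reportedly can, so the headline number hides the jerkiness.

```python
# Toy illustration of AFR micro-stutter: two frame-time sequences with the
# same average FPS, one evenly paced and one alternating fast/slow frames.
# Frame times (in milliseconds) are invented purely for illustration.

even_ms    = [25, 25, 25, 25, 25, 25, 25, 25]  # smooth pacing
stutter_ms = [10, 40, 10, 40, 10, 40, 10, 40]  # same total time, jerky pacing

def avg_fps(frame_times_ms):
    """Average FPS implied by a list of per-frame times."""
    return 1000 / (sum(frame_times_ms) / len(frame_times_ms))

def worst_frame_ms(frame_times_ms):
    """Longest single frame time - what the stutter actually feels like."""
    return max(frame_times_ms)

print(f"even:    {avg_fps(even_ms):.0f} FPS avg, worst frame {worst_frame_ms(even_ms)} ms")
print(f"stutter: {avg_fps(stutter_ms):.0f} FPS avg, worst frame {worst_frame_ms(stutter_ms)} ms")
# Both report 40 FPS average, but the second sequence has 40 ms frames in it,
# which is the pacing of a 25 FPS game - hence it feels stuttery.
```

That's why a frame-rate graph (or a frame-time graph) tells you more about multi-GPU smoothness than a single average number.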
 
I've seen the [H] say on many occasions that the new 9800 cards are bottlenecked at higher resolutions. So... why in the world wouldn't you test with 8800 cards to prove that theory?

IIRC, nVidia released this thing called an 8800Ultra... which is never seen in these reviews.

Let's compare specs of an eVGA 8800Ultra SC to an eVGA 9800GTX SC:

8800 SC - 768MB / 384bit - [655 core / 2250 mem / 1.66 shader]
9800 SC - 512MB / 256bit - [700 core / 2200 mem / 1.72 shader]

Looks close enough to me except the added memory and memory bus... which would be perfect to prove a bandwidth point.
 
I think that the G92 compression technology works wonders and compensates for the lack of bandwidth; unfortunately, it seems it doesn't work well with AA. Most reviews indicate that the performance hit is lower when increasing the resolution than when enabling AA.

That's possible. In any case the conventional 384bit bus doesn't have the limitations that the 256bit bus with compression has.
 
That's possible. In any case the conventional 384bit bus doesn't have the limitations that the 256bit bus with compression has.

Yup, but at least it's faster than the 320 bit bus.

Now, a 384 bit bus + compression... hopefully we'll see this with the 9900
 
Yup, but at least it's faster than the 320 bit bus.

Now, a 384 bit bus + compression... hopefully we'll see this with the 9900

They could even go to a full 512bit bus, though whether the bandwidth provided by that type of bus is actually useful is a subject of debate. It hasn't enabled ATI to dethrone NVIDIA as the performance king.
 
They could even go to a full 512bit bus, though whether the bandwidth provided by that type of bus is actually useful is a subject of debate. It hasn't enabled ATI to dethrone NVIDIA as the performance king.

I don't think nvidia will go the 512 bit route; although it seems the next logical step, it's still too costly.

Keep in mind that there are other ways to increase (or save) bandwidth. Bus width is just one of them, and it's quite expensive to implement. Faster memory + compression + optimizations + architectural efficiency are better alternatives, simply because they can be implemented across the complete product family, not just ultra high end products.

Keep in mind that it took ages for a 256bit bus to appear on a mid range product (the 9600GT). How long until we see a cost-effective 512bit bus?
 
Yesterday I started reading some complaints on another forum about latency increases (felt as input lag) & synchronisation issues with SLI & Crossfire. I didn't absorb too much detail, but the gist of it is that AFR can have problems outputting video frames at regular intervals, i.e. odd frames may take 4x longer to update than even frames, so the effect is a stuttery, jerky frame rate even though the overall framerate may measure high enough to be considered smooth. 3-way SLI seems to be the configuration where this 'micro stuttering' occurs the most.

Is this an effect that is being observed in the [H]ardOCP testing? Do you get any sense that 'input lag' is increased on the multi GPU configurations?

I have those issues with a single 9800 GX2, annoying to say the least.
 
Isn't the GX2 essentially two 9800GTX's bolted together? It's also cheaper than two GTX's. That would seem to provide better value than two 9800GTX cards.

It's more like two 8800 GTS 512MBs bolted together with their clock speeds reduced slightly; two 9800 GTXs in SLI are faster than a GX2, IIRC.
 
I've seen the [H] say on many occasions that the new 9800 cards are bottlenecked at higher resolutions. So... why in the world wouldn't you test with 8800 cards to prove that theory?

IIRC, nVidia released this thing called an 8800Ultra... which is never seen in these reviews.

Let's compare specs of an eVGA 8800Ultra SC to an eVGA 9800GTX SC:

8800 SC - 768MB / 384bit - [655 core / 2250 mem / 1.66 shader]
9800 SC - 512MB / 256bit - [700 core / 2200 mem / 1.72 shader]

Looks close enough to me except the added memory and memory bus... which would be perfect to prove a bandwidth point.

I agree, it would have been awesome for you guys to have included SLI/Tri-SLI'd 8800 GTX/Ultras in the mix.
 