BFGTech GeForce 8800 GTX and 8800 GTS

I just bought an 8800 GTS... I am curious to see how it compares to a 7900 GTX SLI....

Also, I have an Enermax 500-watt PSU with dual 12V rails (22A each)... will this be enough for the GTS?!
 
Outstanding review as always ! Well done .

Love the way [H] has changed the way cards are compared! It's in no way subjective; it's comparison in its purest form. Peeps are just too used to seeing the old balls-out approach: first card to choke loses. For gamers, which most enthusiasts are (for the most part), the old way does nothing to give a true performance analysis. If you read the article for what it is and forget the old ways of benching, this is truly the way to compare the REAL-world use of a product.

Again, Well done.
 
Very nicely done.

500w might be pushing it depending on what the person has in their PC and what 500w they have.
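For the dual-rail question above, a quick back-of-the-envelope check helps. This is only a sketch: the 350 W combined 12V system-load figure is an assumption for illustration, not a measured number for any particular build.

```python
# Rough 12V capacity check for a multi-rail PSU, per the question above.

def twelve_volt_capacity(rail_amps, rail_volts=12.0):
    """Total wattage available across the PSU's 12V rails (watts = amps x volts)."""
    return sum(amps * rail_volts for amps in rail_amps)

# Enermax 500W with dual 12V rails at 22A each, as in the question:
capacity = twelve_volt_capacity([22, 22])

# Assume roughly 350 W of combined 12V load for a GTS system under gaming
# (an illustrative guess, not a measurement):
headroom = capacity - 350

print(f"12V capacity: {capacity:.0f} W, headroom: {headroom:.0f} W")
```

By this rough math the rails themselves have room to spare; whether the total 500 W budget holds still depends on the rest of the system, as the post above says.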
 
May I suggest a comparison with Quad-SLI?

Also, are the new 8x and 16x AA modes better than their SLI equivalents?
 
Great review. Detailed, covered pretty much everything.

What about an AGP version? Is there a roadmap, and if so, does it include an AGP variant? If so, when, and how crippled is NVIDIA going to make it?
 
The next version of DirectX is DirectX 10, also referred to as D3D10. DirectX 10 will be released alongside Windows Vista early in 2007.


I thought about getting one now... then read that line :(
 
USMC2Hard4U said:
I just bought an 8800 GTS... I am curious to see how it compares to a 7900 GTX SLI....

Also, I have an Enermax 500-watt PSU with dual 12V rails (22A each)... will this be enough for the GTS?!
Not to bust on you, man, but if you can laugh at your own expense, you said this at 12:41 PM today:

USMC2Hard4U said:
I will be waiting as well. This 8800 is amazing. However, I cannot justify the price right now... Even if I had it to spend, I think waiting until after Vista comes out would be a good idea. This way we can see if the 8900s will be worth purchasing or if an 8800 is good enough at a lower price point.
I bring it up because I set off a tizzy in another thread a couple of weeks ago by sarcastically ripping on the naysayers and predicting this would happen. The biggest huffiness came from nullzero, who dedicated an entire thread to proclaiming his ability to resist the lure of the G80 (the very thread, in fact, that you posted in with the statement I just quoted):

nullzero said:
I myself just bought an eVGA 7900GT about 2 months ago, and I have decided to skip over the new 8800 series until prices drop to the $200 range or the next generation comes out. Learning from the past, this new card will be old news in 6 months from the ATI counter or NVIDIA revisions. I'm expecting prices to drop sometime in January or later. My money is going to a Wii.
This same nullzero is now boasting about being #12 in the eVGA step-up queue for a G80.

You guys just warmed the cockles of my dark, evil heart!:D
 
To put it into estimates: a GeForce 8800 GTX, with its specified total thermal design power of 185 watts, is going to add a solid 150 watts of load to your PSU when you put it into your system.
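A rough sanity check on how a 185 W rating turns into "a solid 150 watts" of added PSU load. The load factor and PSU efficiency below are assumptions for illustration only, not figures from the review.

```python
# Sanity check on the ~150 W figure quoted above.

TDP_W = 185            # stated max board power of the GeForce 8800 GTX
LOAD_FACTOR = 0.80     # assume real gaming load is ~80% of the TDP (illustrative)
PSU_EFFICIENCY = 0.80  # assume an ~80%-efficient power supply (illustrative)

added_psu_load = TDP_W * LOAD_FACTOR               # extra DC load put on the PSU
added_wall_draw = added_psu_load / PSU_EFFICIENCY  # what the wall outlet sees

print(f"Added PSU load: {added_psu_load:.0f} W")   # close to the quoted 150 W
print(f"Added wall draw: {added_wall_draw:.0f} W")
```

Under those assumptions the DC load works out to roughly 148 W, which matches the quoted estimate; the draw at the wall is higher because the PSU wastes some power as heat.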

[H] is 2leet4physics.
 
Anyone looking for an apples-to-apples sort of review needs to go to Tom's Hardware and use their VGA charts. If I recall correctly, they bench with default settings, so you're getting out-of-the-box comparisons for everything. It's not worth much to a gamer, but it can help give you a feel for relative performance without any FSAA or AF, if you think that's helpful info.

I personally used the VGA charts when trying to decipher some of the more obscure NVIDIA and ATI product lines. When you run across a 7900 GT, GTX, and GS that are all fairly similar in specs but spaced apart in price, looking at "defaults" benchmarks can be useful in determining what exactly the extra money is buying you. But other than that specific case, it's not really a good idea to base purchasing decisions on so-called apples-to-apples comparisons, because like the [H] guys say, the cards are so different now that it's impossible to use the exact same settings on two cards.

Benching at defaults is one reason why default settings sometimes look like crap... Vendors lower the default settings to get benchmark scores as high as possible without making the image look too bad. It's a race to the bottom of the image-quality barrel, so those numbers don't mean much to a gamer who wants his game to look nice.
 
Silus said:
The FEAR Extraction Point issue is definitely not a hardware one (ATI or NVIDIA). There's a thread around where multiple users (including myself) complained about its incredible choppiness. You just need to turn a corner or open a door and *BAM*, the game lags, for no apparent reason.

Maybe it's running on the eq2 engine :)

Any idea if two 8800 GTXs can finally run EQ2 on something other than the Balanced graphics setting without horrible lag and choppiness?
 
You can't directly compare AA modes, MSAA, CSAA, TR MSAA/SSAA etc.. You'd have to turn off AA and AF (making you more CPU bound), and even then the base filtering is different, so you'd have to go to point sampling. There is just no way to make the two video cards have the same exact image quality. It would be pointless to anyway, pure max FPS means nothing.

You completely missed my point.

You are trying to read too far into this. It's ok. Like I said... The results of your reviews are very useful...

Every card has plain-Jane 16x AF, for example (regardless of the actual math behind it, the little Windows control panel for each card brand has a slider with 16x AF at one end)... you are wrong when you say there is no way to set up two card brands reasonably the same. And it would not be hard to just do a couple of one-off benchmarks cranking the "usual suspects" up to the max... forget about the new AA modes for a second. Those are part of the subjective review... that is not what I am talking about. It is still useful to see how a card performs with just good ol' 4x+ AA and 16x AF at a high resolution in a demanding game or other benchmark, compared to other cards at those same settings.

I know you guys put a lot of work into this... and with all that work, going the extra mile, as they say, would make every single other review completely redundant. Isn't that what you want? To be the number one enthusiast hardware site on the net, period? Don't let other sites provide information that you are failing to provide. Maybe I see too much from a business perspective... I don't know.

As I said... I know why you do what you do in the way that you do it. I think you are mostly right... I just hate having to leave [H] and load up a second-class hardware site to get some simple one-off graphs that I can use to explain a new piece of hardware to lay people. Enthusiasts use this information, but so do parents, non-enthusiasts, young gamers, and other potential *customers* (for whatever reason. gifts?).

Keep up the good work... seriously. Don't think that H isn't the first review that I read... because it is. I just wish it was the *only* review I had to read. :)
 
Just installed my BFG 8800 GTS, woot - but how do I get back to the old-style ForceWare panel? I can't seem to find it anywhere, so that I can use Coolbits!
 
TheGoat Eater said:
Just installed my BFG 8800 GTS, woot - but how do I get back to the old-style ForceWare panel? I can't seem to find it anywhere, so that I can use Coolbits!
View -> Classic Control Panel.
 
InorganicMatter said:
View -> Classic Control Panel.
It's telling me that "This is now a Performance feature. Visit the NVIDIA website to learn more and download the NVIDIA nTune performance application. Would you like to go there now?"
 
Brent_Justice said:
You are missing the point, there is no way we can do anything apples to apples because nothing is apples to apples.

You can't directly compare AA modes, MSAA, CSAA, TR MSAA/SSAA etc.. You'd have to turn off AA and AF (making you more CPU bound), and even then the base filtering is different, so you'd have to go to point sampling. There is just no way to make the two video cards have the same exact image quality. It would be pointless to anyway, pure max FPS means nothing.

What matters is how high can you push the visuals and still maintain playable performance.


QFT,

I have two on order, one today, and I have already processed a step-up on my current card.
 
TheGoat Eater said:
It's telling me that "This is now a Performance feature. Visit the NVIDIA website to learn more and download the NVIDIA nTune performance application. Would you like to go there now?"
And it's telling me I need an nForce board - which I figured anyway ---------- WHAT THE F*CK???
 
Commander Suzdal said:
Not to bust on you, man, but if you can laugh at your own expense, you said this at 12:41 PM today:

I bring it up because I set off a tizzy in another thread a couple of weeks ago by sarcastically ripping on the naysayers and predicting this would happen. The biggest huffiness came from nullzero, who dedicated an entire thread to proclaiming his ability to resist the lure of the G80 (the very thread, in fact, that you posted in with the statement I just quoted):

This same nullzero is now boasting about being #12 in the eVGA step-up queue for a G80.

You guys just warmed the cockles of my dark, evil heart!:D


nullzero cannot be #12; I am #14, and I know for a fact that jasonx82 is #12 :D
 
Kibbles said:
The next version of DirectX is DirectX 10, also referred to as D3D10. DirectX 10 will be released alongside Windows Vista early in 2007.


I thought about getting one now... then read that line :(
Actually, it is a wise decision to skip getting the card when there is no application out there demanding enough for current video cards.

Once Crysis, UT2007, and other DX10 titles come out is when one should start thinking about getting one of these cards.

By then the price will be a lot lower. I suggest people think wisely, instead of rushing to buy one.
 
Brent_Justice said:
As I mentioned in the overclocking section you can no longer use coolbits. You have to use the nTune software for GPU overclocking.
But nTune won't work with my P5B Deluxe - what do I do? Any links would be great, thanks.
 
When do you think we will see the SLI review? Also, I don't remember reading it in the review, so that's why I am asking, but can you tell us what the two SLI connectors are for? Triple SLI, Quad SLI, etc.?
 
I remember reading somewhere that Vista RC2 has a beta of DX10 included in it. If this is true, are there any DX10 demos/beta apps that could be run on RC2 to get a look at DX10 in action, now that the hardware is out there?
 
Aztlan said:
Actually, it is a wise decision to skip getting the card when there is no application out there demanding enough for current video cards.

Once Crysis, UT2007, and other DX10 titles come out is when one should start thinking about getting one of these cards.

By then the price will be a lot lower. I suggest people think wisely, instead of rushing to buy one.
And still this drivel goes on...

No application demanding enough? Is Oblivion not demanding enough for you? Or widescreen gaming? Or Company of Heroes? Or Call of Duty 2? Or Prey? How about with a level of image quality never seen before?

I guess if you like gaming at 1024X768 with 0AA/0AF on Quake 2, then current hardware is more than enough...:rolleyes:
 
TheGoat Eater said:
But will it work with 965 chipsets? BTW, at stock speed I am getting 15858 in 3DMark05.


SORRY - saw this in the notes ->"nTune 5.05 adds support for GPU temperature monitoring and overclocking on all motherboards." Yeah yeah!
 
I really enjoyed the in-depth coverage of the new architecture stuff. It looks like this series can really perform well at 1920x1080, which is great because the HDTV I am buying soon is 1920x1080 and has a DVI input and dot-for-dot mode.

Suhweet!
 
Great review. I have been waiting for this moment. :D

"Overclocking the GeForce 8800 GTX and GeForce 8800 GTS is done through a different piece of software now. In the past you could run “coolbits” and unlock a hidden tab in the control panel. You can still do that, but when you click on the overclocking option it will tell you to go download nTune."

The above is a quote from your review. Does that mean that a non-nForce MB will not be able to OC the 8800 GTX and GTS, since the nTune software only works with nForce MBs? Or does nTune work with all chipsets (Intel, ATI, VIA, etc.)?

If that is the case, people who bought Intel chipset MBs for the new Conroe CPUs will not be able to OC the new 8800 cards :mad:

Edit: well just read the post above re my issue. The nVidia website for nTune indicates that ntune supports all MB for OCing.
 
Is it possible to run 8800 GTX SLI on a non-32x board? I only have an nF4 SLI-DR Expert, which has dual x16 PCI-E slots, but they run at x8 speed during SLI.
 
jedicri said:
Great review. I have been waiting for this moment. :D

"Overclocking the GeForce 8800 GTX and GeForce 8800 GTS is done through a different piece of software now. In the past you could run “coolbits” and unlock a hidden tab in the control panel. You can still do that, but when you click on the overclocking option it will tell you to go download nTune."

The above is a quote from your review. Does that mean that a non-nForce MB will not be able to OC the 8800 GTX and GTS, since the nTune software only works with nForce MBs? Or does nTune work with all chipsets (Intel, ATI, VIA, etc.)?

If that is the case, people who bought Intel chipset MBs for the new Conroe CPUs will not be able to OC the new 8800 cards :mad:

The new version of nTune will install on other chipsets:

http://www.nvidia.com/object/ntune_5.05.18.00.html
 
Pretty good review and a kick-ass GPU. I wish you guys had added a CrossFire setup to the mix to show people if it's worth upgrading to, or adding another card. Probably not, but it would have been nice.

I gambled and ordered two GTXs, hoping NVIDIA would announce SLI support on 975X chipsets. I love my AW9D-Max and would hate to get rid of it.

Did you guys maybe hear anything about that??
 
Aztlan said:
Actually, it is a wise decision to skip getting the card when there is no application out there demanding enough for current video cards.

Once Crysis, UT2007, and other DX10 titles come out is when one should start thinking about getting one of these cards.

By then the price will be a lot lower. I suggest people think wisely, instead of rushing to buy one.

Vote For Ban.
 
Aztlan said:
Actually, it is a wise decision to skip getting the card when there is no application out there demanding enough for current video cards.

Once Crysis, UT2007, and other DX10 titles come out is when one should start thinking about getting one of these cards.

By then the price will be a lot lower. I suggest people think wisely, instead of rushing to buy one.

It's hard to argue, financially speaking, with you or anyone who is against buying one of these cards. We are already seeing price inflation and low initial stock. That's tough to argue with.

But you and many others seem to overlook one of the biggest advantages of the G80 line: the designers and architects of those games you mentioned get to "use and abuse" the G80.

They will be able to use the G80 as the template and foundation of the games being designed, and that is huge.

With that being said, it is hard not to recommend the 8800 GTX/GTS, based on them being the foundation cards of DX10 and Vista.

Just an observation. :D
 
Just OCed my GTS and got... 17940 '05 marks at 620/900. WOOT! What a card, can't wait to go more in-depth with OCing!
 
Brent_Justice said:
Or if you want to run really high AA levels, at that rez 16X TR SSAA would be easy.

jebo_4jc said:
It's pretty easy to answer this question. Do you ever have to turn down the visual quality to keep games smooth? Then a video card upgrade would benefit you.

Good points. I might have to step up from my 7900 GTO then.
 
Chefboy said:
Outstanding review as always ! Well done .

Love the way [H] has changed the way cards are compared! It's in no way subjective; it's comparison in its purest form. Peeps are just too used to seeing the old balls-out approach: first card to choke loses. For gamers, which most enthusiasts are (for the most part), the old way does nothing to give a true performance analysis. If you read the article for what it is and forget the old ways of benching, this is truly the way to compare the REAL-world use of a product.

Again, Well done.

god help us if we actually have to read something! :p

thankfully, traffic lights don't come with words.
 