NVIDIA GeForce 7900 Series Preview @ [H] Enthusiast

Slightly off topic, but I had a BFG 7800GT and found its fan more annoying than my current X1800XT's. During system POST the X1800XT was noticeably louder; however, in Windows at idle the 7800GT's fan was slightly whinier.
 
Yoshiyuki Blade said:
Yeah, I have to agree. I think Brent and Kyle really outdid themselves this time. I mean, we got real-world benchmarks from like 5 different cards across 8 different games (the popular ones too, which is worth noting), and they re-evaluated 4 games in widescreen WITH SLI/CrossFire configurations. I can't imagine the amount of time that would take.

I also enjoyed reading about the thermal testing, evaluation summary, and the conclusion. It addressed some pretty key points that couldn't be made through the benchmarks, and I'm glad that there were mini-essays about them rather than just saying "we noticed it and didn't like it."

Thanks! What's impressive is the amount of time I actually did all of that in! :eek:
 
I would have really liked to see how HL2 did, but other than that it was a good article.
 
Fantastic review; easily the best & most comprehensive on the web. Hats off to the [H] crew.

Ordered 2 GTXs btw. ;)
 
Kyle you do realize that now you're including the 2405FPW you're simply going to HAVE TO break down and get a 3007WFP, right? :D
 
The 7900GT seems like a -really- kick-ass card now... wow... great review too... lots of game benches... :)
 
That is the BEST evaluation I have seen of an NV product in 2 years. Way to go, guys. I am beside myself.
 
revenant said:
The 7900GT seems like a -really- kick-ass card now... wow... great review too... lots of game benches... :)
Get the XFX XXX 7900GT: 565MHz core, 1.65GHz memory, 52.8GB/sec of bandwidth!
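For anyone double-checking that figure, the bandwidth follows directly from the 7900GT's 256-bit memory bus and the 1.65GHz effective memory clock:

$$ \frac{256~\text{bits}}{8~\text{bits/byte}} \times 1.65\times10^{9}~\text{transfers/s} \;=\; 32~\text{B} \times 1.65~\text{GT/s} \;=\; 52.8~\text{GB/s} $$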
 
pandora's box said:
Get the XFX XXX 7900GT: 565MHz core, 1.65GHz memory, 52.8GB/sec of bandwidth!

Very impressive indeed. :D My 7800GTXs are having memory speed envy. ;)
 
theseeker said:
I'm on the fence. I have two 256MB 7800GTXs and am not convinced that I should go with two 512MB 7900GTXs. I mainly play COD2, BIA2, and Rome: Total War. Anybody have any suggestions?

Remember what you're doing with these cards, people. There isn't a game out there that two 7800GTXs can't run well. Unless you're a benchmarking whore and like losing money on the used card market, keep what you have and buy a DX10 card when they're available.
 
Mrwang said:
I don't want to sound ungrateful for your hard work and reviews, but you were asked clear questions in this thread and you are beating around the bush.

7900GTX = slightly faster overall
X1900XTX = better image quality

Simplicity should be a primary goal for any reviewer. Your reviews and responses are a bit confusing, to say the least. To say the very least.

Did you even read the review? Do you really need it summed up for you in such a quick and easy answer? In some games one card ran certain settings better; in other games it was the other card. If you can't read a review and make intelligent decisions by interpreting the data, maybe you should leave computer hardware to someone else.

It's a reviewer's job to provide his experience and the data he gathered. It's your job to make purchasing decisions based on the available information.
 
aldamon said:
Remember what you're doing with these cards, people. There isn't a game out there that two 7800GTXs can't run well. Unless you're a benchmarking whore and like losing money on the used card market, keep what you have and buy a DX10 card when they're available.
No, I think that would be an "ePenis whore". :D
 
OK, I'm in for one but have a problem. The overclocked eVGA 7900 GTX looks nice and I'm going to go for it, but here's the problem: do you guys think eVGA's website will have it cheaper than other resellers like Monarch? Please help me out here because I really want one bad.
 
Kyle, Brent:

Which of the current top-end cards do you consider more future-proof, the X1900XTX or the 7900GTX? I think I have seen some talk about future-proofing in your earlier reviews, but there's no mention of it in your latest article.

Good review anyway, and that "don't go looking for what you don't see" tip is a good one; I wish I had followed it many times with PC/HTPC stuff :)
 
a-lamer said:
Kyle, Brent:

Which of the current top-end cards do you consider more future-proof, the X1900XTX or the 7900GTX? I think I have seen some talk about future-proofing in your earlier reviews, but there's no mention of it in your latest article.

Good review anyway, and that "don't go looking for what you don't see" tip is a good one; I wish I had followed it many times with PC/HTPC stuff :)

Technically speaking, both have high shader processing performance; there are some differences in FP16 framebuffer support. The GF 6/7 series supports FP16 texture filtering, but not AA with FP16 render targets. The X1000 series does not support FP16 filtering, but does support AA with FP16 render targets. The X1000 series also potentially offers faster dynamic branching. Those are the main differences.

Therefore it is extremely hard to tell at this point which one is going to win out feature-wise; they are so close, and we are so content-limited it's not funny. There just aren't any games out there that take advantage of these features. Will the lack of FP16 filtering hurt the X1000 series in games? Will the lack of fast dynamic branching or of AA+FP16 support hurt the GF 7 series in games?

Who knows!?

All we can do is show you what is what in today's games and speculate based on what we know about future titles. UT2007 will be the next really, really big game this year, so we'll have to see what happens with it.
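For the developers in the thread, these feature differences are exactly what a D3D9 engine probes at startup. Here's a minimal sketch (my own illustration, not from the article) of the two standard Direct3D 9 capability queries in question; it assumes a Windows build linked against d3d9.lib:

```cpp
// Sketch: probe the two FP16 capabilities discussed above via standard
// Direct3D 9 queries. Windows-only; link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Can the hardware filter FP16 (A16B16G16R16F) textures?
    // GeForce 6/7: yes. Radeon X1000: no.
    HRESULT fp16Filter = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_FILTER, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F);

    // Can an FP16 render target be multisampled (i.e. AA + HDR together)?
    // Radeon X1000: yes. GeForce 6/7: no.
    DWORD quality = 0;
    HRESULT fp16AA = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        FALSE /* fullscreen */, D3DMULTISAMPLE_4_SAMPLES, &quality);

    std::printf("FP16 texture filtering:       %s\n",
                SUCCEEDED(fp16Filter) ? "yes" : "no");
    std::printf("4x AA on FP16 render targets: %s\n",
                SUCCEEDED(fp16AA) ? "yes" : "no");

    d3d->Release();
    return 0;
}
```

Games that support HDR rendering do checks along these lines and fall back (e.g. HDR without AA) when a query fails, which is why the same title can behave differently on the two architectures.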
 
I've been fighting with EQ2 stuttering for a long time, and last week something worked:

A full reinstall of the OS. The EQ2 install itself was actually copied from backup (not reinstalled). After this was done, EQ2 was smooth as glass (1440x900, Sempron 3000+, 1GB PC2700, ATI 9800 Pro 128MB, Abit NF7-S2 (not NF7-S rev 2)) on Balanced settings with texture detail set to High for characters and world and Medium for LOD. Max and average framerates were about the same as before, but the low-FPS dips were gone.

Other data points:
- Since this started working, EQ2 has patched twice and is still smooth.
- I did *not* install the NVIDIA board network drivers. Network throughput is considerably better than before, as tested through several sites on DSLreports.com.

So, my current working theories are:
1. EQ2 may be highly sensitive to fragmentation of its files. I won't be able to test this for a while, since the new install was to a 250GB drive and it'll take some time to scatter EQ2's files all over the place. Evidence supporting this is that EQ2's stuttering often corresponds to disk access as data for approaching characters or objects is loaded. Also, some people on the EQ2 boards report good results from having an excess of system memory.

2. Motherboard-specific network drivers may be introducing a delay into EQ2's processing cycle. Again, when approaching characters or objects reach the range where they are displayed in detail, there's network traffic telling the client what textures and items to render. Also, my wife's machine (Epox 8RDA+, Athlon 2500+, BFG 6600GT, same memory), which has plenty of spare room on the drive and a nice clean unfragmented install, stutters; but it has the nForce2 network drivers installed.

I have a third machine with a VIA chipset, but it's not EQ2-capable. It has enough CPU and memory, but the video card is a Ti 4200... so I'm out of test data.

Hope this helps clarify EQ2 testing for future reviews. If you guys can validate either of these theories or use this data to form a more correct theory, you'll make a lot of people playing EQ2 happy, not to mention SOE's EQ2 maintenance team. :)
 
I'd be willing to wager that in a "blind taste test," so to speak, none of us, not even Kyle or Brent, would be able to tell the difference between the two flagship cards. Things are good for us gamers right now. Looking forward to Unreal Tournament.
 
I liked this evaluation more than any other, I think, mostly because there are finally 1920x1200 numbers! I also like the fact that shimmering was addressed. I've been on my 2405FPW since the first week they came out, and coming from an X850XT PE at the time, I notice it a lot on SLI'd GTXs. People on smaller CRTs don't, and have made shimmering out to be a myth, when in fact it's not. Glad it was brought up.

I'm pretty surprised by your BF2 results. It looks like both cards were tested at the same settings (except HQ AF for ATI), yet you show the 7900 being a lot faster, whereas in every other review the XTX is a lot faster. Total reversal. Since there is no built-in benchmark, I'm assuming you and everyone else used FRAPS to measure the frames. I don't understand how it can be so far off; I don't think HQ AF makes nearly that much of a difference.

Even so, I liked it. It's pretty much what I expected: a slightly faster 7800 512MB card, and that's about it. Both cards seem about even, with ATI faster in the games I care about most. It seems that when you raise the resolution and settings in these newer games, ATI takes the lead. Now I need to decide what to buy after I get the cash from the GTXs I just sold... Hmm.
 
I am not sure if this was noted already... but the 7900GT of this generation is much more like the 6800GT of the NV40 generation: same internal architecture, just running at reduced clocks. I had guessed that by giving the 7800GT four fewer pipes and one less vertex unit than the flagship model, NVIDIA wanted to discourage people from buying the GT flavor when they wanted GTX performance but were planning to overclock the card... but I guess this time their plan was to keep the clocks far enough apart to prevent that, to a certain extent. I mean, without some serious cooling and probably a volt mod, this generation's GT will likely not be able to rival the GTX; maybe it can match the default GPU core clocks, but not the overclocked GTXs, and not the memory unless vmodded. Gah! Anyway, just some mental meanderings...
 
I'm wondering, since [H] didn't post benches of the 7800GS in their review, why not save several pages and hours of testing and just put a title in the 7900GT/GTX review:

"Same thing, only smaller, somewhat faster and cheaper".
 
revenant said:
I am not sure if this was noted already... but the 7900GT of this generation is much more like the 6800GT of the NV40 generation: same internal architecture, just running at reduced clocks. I had guessed that by giving the 7800GT four fewer pipes and one less vertex unit than the flagship model, NVIDIA wanted to discourage people from buying the GT flavor when they wanted GTX performance but were planning to overclock the card... but I guess this time their plan was to keep the clocks far enough apart to prevent that, to a certain extent. I mean, without some serious cooling and probably a volt mod, this generation's GT will likely not be able to rival the GTX; maybe it can match the default GPU core clocks, but not the overclocked GTXs, and not the memory unless vmodded. Gah! Anyway, just some mental meanderings...
I noticed that too. I can see the 7900GT reaching stock GTX core speeds with better cooling. As far as memory goes, well, there are 7900GTs with faster memory than some 7900GTXs, such as the XFX XXX model with a 1.65GHz memory clock. The core speed on that card is 560MHz, only 90MHz slower than the GTX. It's kinda hard to justify the price of the GTX while there are overclocked GT cards in the $350 range; the performance difference between an overclocked GT and a stock GTX isn't worth 150 bucks, IMO.

It should be interesting to see what aftermarket cooling can do for the 7900GT.


I have a theory as to why the 7800GT had only 20 pipelines versus the GTX's 24:

Perhaps NVIDIA had a hard time getting cores with 24 pipelines to reach the desired clock speeds. They had to do something with the G70 cores that didn't make the mark, hence disabling 4 pipelines.
 
This may come off as a bit of a dumb question, but I'll ask anyway:

I noticed that you didn't put a "Must Have [H]ardware" thingy at the conclusion of the preview, whereas you put one on the X1900 series article. Is there any particular reason for that, or did you maybe forget to put one there?
 
RoffleCopter said:
This may come off as a bit of a dumb question, but I'll ask anyway:

I noticed that you didn't put a "Must Have [H]ardware" thingy at the conclusion of the preview, whereas you put one on the X1900 series article. Is there any particular reason for that, or did you maybe forget to put one there?
I think the difference is that this is a preview of NVIDIA reference cards, while the X1900 series article was a review of an actual retail card. But ultimately the authors will have to answer this one.
 
Very good preview! I've been waiting for both companies' spring products before making the final move to PCIe graphics.

For myself, between my past experience with ATI's stock cooling being horrid (9800) and their current coolers being very loud while still approaching the boiling point of water (93°C under load?!? OMFG!), NVIDIA is what I've chosen to purchase. I also feel their drivers tend to be more stable, and I like the control panel better too.

As far as overclocking goes, it seems obvious to me that a card already running at 93°C doesn't have much headroom left. I may not be able to eke much extra performance out of NVIDIA's card either, but I don't think I'll need to worry about that anyway. Still, 93°C leaves me wondering what the MTBF on that card is... [no, I'm not going to dick with watercooling]

Ordering:
Asus A8N32-SLI Deluxe
7900 GTX 512MB
(already have an AMD 4400+ dual-core)
SATA II RAID card
5 x 400GB SATA II drives

Possibly a new power supply, but this Vantec NeoPower 480W has done me fine for a while (6 HDDs, a 6800U GPU, an XP 2700+, and then the 4400+ dual-core running on an ASRock 939Dual).


Can't wait to see reviews of retail cards, and SLI vs. CrossFire vs. single cards.

GoodBoy
 
RoffleCopter said:
This may come off as a bit of a dumb question, but I'll ask anyway:

I noticed that you didn't put a "Must Have [H]ardware" thingy at the conclusion of the preview, whereas you put one on the X1900 series article. Is there any particular reason for that, or did you maybe forget to put one there?
And if it's not MHH, why not?
 
pandora's box said:
I noticed that too. I can see the 7900GT reaching stock GTX core speeds with better cooling. As far as memory goes, well, there are 7900GTs with faster memory than some 7900GTXs, such as the XFX XXX model with a 1.65GHz memory clock. The core speed on that card is 560MHz, only 90MHz slower than the GTX. It's kinda hard to justify the price of the GTX while there are overclocked GT cards in the $350 range; the performance difference between an overclocked GT and a stock GTX isn't worth 150 bucks, IMO.

It should be interesting to see what aftermarket cooling can do for the 7900GT.

OK, wow, I need to read more... so the XXX one does have faster or equal memory performance compared to the GTX... wow... it -is- really tough to justify the GTX price tag with those out... wow.

pandora's box said:
I have a theory as to why the 7800GT had only 20 pipelines versus the GTX's 24:

Perhaps NVIDIA had a hard time getting cores with 24 pipelines to reach the desired clock speeds. They had to do something with the G70 cores that didn't make the mark, hence disabling 4 pipelines.

That's what I like to call NVIDIA's "no silicon goes unused" plan. ;)
 
revenant said:
OK, wow, I need to read more... so the XXX one does have faster or equal memory performance compared to the GTX... wow... it -is- really tough to justify the GTX price tag with those out... wow.



That's what I like to call NVIDIA's "no silicon goes unused" plan. ;)

http://www.bjorn3d.com/read.php?cID=891

They review all three of XFX's XXX lineup (7900GTX, 7900GT, 7600GT).
 
pandora's box said:
http://www.bjorn3d.com/read.php?cID=891

They review all three of XFX's XXX lineup (7900GTX, 7900GT, 7600GT).


"You can see in the table above that the XFX 7900 GT XXX Edition boasts very nice overclocks: a 110MHz boost of the core and a 165MHz (330MHz) overclock for memory! All that and XFX's industry-leading Double Lifetime Warranty. I'm betting this card will be in high demand right out of the gates."

jeeeeeezuz.. those are going to be in -uber- demand!
 
I would like to see some comments about vsync/triple-buffering performance in future reviews. Tearing is significantly more noticeable on LCDs and tends to produce horrible image quality, particularly in darker games with muzzle flashes that use DirectX, since triple buffering is a pain in the ass to force in DirectX but works fine in OpenGL.

Personally, I prefer synced frames over AA in any game where the scene can change significantly frame to frame, like an FPS. I run dual 512MB 7800s with a 3.3GHz Opteron on a 2405FPW and find it marginally acceptable at native res vsynced in most games. It is certainly the most demanding quality improvement you can run, and I consider it a better step up than 30" LCD-res gaming, which is out of most people's budget. Running frame-locked at 1280x1024 is something many people can enjoy, and I would like to see it addressed in at least a minimal manner.
 
on the egg:

* eVGA 512-P2-N570-AX Geforce 7900GTX 512MB GDDR3 PCI Express x16 Video Card - Retail
* $499.00

* Leadtek PX7900 GTX Geforce 7900GTX 512MB GDDR3 PCI Express x16 Video Card - Retail
* $529.00


* eVGA 512-P2-N575-AX Geforce 7900GTX 512MB GDDR3 PCI Express x16 Video Card - Retail
* $599.00

* Leadtek PX7900 GTX Extreme Geforce 7900GTX 512MB GDDR3 PCI Express x16 Video Card - Retail
* $599.00

No price gouging here! $499 and in stock. eVGA too! Imagine these babies a few weeks from now....
 
Really glad to see some widescreen benchmarks. The 7900GTX and X1900XTX seem pretty evenly matched.
 
:)

Great new card, BUT this is NOT a DX10 part, is it? It's just DX9.0c?

So when and if I go for Vista (gaming and so on...) I'll need another card?

I don't have $350 to spare every six months....

:rolleyes:

PS: Hello to the HardForum gang...
 
101 said:
I would like to see some comments about vsync/triple-buffering performance in future reviews. Tearing is significantly more noticeable on LCDs and tends to produce horrible image quality, particularly in darker games with muzzle flashes that use DirectX, since triple buffering is a pain in the ass to force in DirectX but works fine in OpenGL.

Personally, I prefer synced frames over AA in any game where the scene can change significantly frame to frame, like an FPS. I run dual 512MB 7800s with a 3.3GHz Opteron on a 2405FPW and find it marginally acceptable at native res vsynced in most games. It is certainly the most demanding quality improvement you can run, and I consider it a better step up than 30" LCD-res gaming, which is out of most people's budget. Running frame-locked at 1280x1024 is something many people can enjoy, and I would like to see it addressed in at least a minimal manner.

The problem with forcing vsync is that your framerate snaps to an integer fraction of the display's refresh rate, unless of course you force triple buffering, but that adds a third framebuffer's worth of video memory.
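To put numbers on that, here's a toy model (my own illustration, assuming a steady render rate and double buffering): the display only flips on a refresh boundary, so the displayed rate snaps to refresh/1, refresh/2, refresh/3, and so on.

```cpp
// Toy model of double-buffered vsync: each frame is held until the next
// refresh boundary, so displayed fps snaps to refresh/n (60, 30, 20... at 60Hz).
// Triple buffering avoids the stall by letting the GPU render into a third
// buffer, at the cost of one extra framebuffer of video memory.
#include <cmath>
#include <cstdio>
#include <initializer_list>

double vsyncedFps(double renderFps, double refreshHz)
{
    // A frame that takes longer than one refresh interval occupies
    // ceil(refresh / render) intervals before it can be displayed.
    return refreshHz / std::ceil(refreshHz / renderFps);
}

int main()
{
    const double refresh = 60.0; // typical LCD refresh rate
    for (double fps : {120.0, 75.0, 59.0, 45.0, 25.0})
        std::printf("render %5.0f fps -> displayed %2.0f fps\n",
                    fps, vsyncedFps(fps, refresh));
    return 0;
}
```

Note the cliff: 75 fps renders display at 60, but 59 fps drops the display all the way to 30. That cliff is exactly the stutter triple buffering is meant to smooth out.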
 
abel said:
:)
Great new card, BUT this is NOT a DX10 part, is it? It's just DX9.0c?
So when and if I go for Vista (gaming and so on...) I'll need another card?
..

Let me put it this way: Vista gaming is not seriously coming this year. I can bet my balls on it. :D Even if it does, the first-generation DX10 cards won't be mind-blowing, and even if they are, the first games won't be. I expect some crappy 3D FPS to be the first DX10 "tech show".

Somebody will be shouting "Crysis, Crysis," but then again, I thought Far Cry was a very poor game. Pretty, yes, but poor as a game. Personal opinion, of course. :)
 
101 said:
I would like to see some comments about vsync/triple-buffering performance in future reviews. Tearing is significantly more noticeable on LCDs and tends to produce horrible image quality, particularly in darker games with muzzle flashes that use DirectX, since triple buffering is a pain in the ass to force in DirectX but works fine in OpenGL.

Personally, I prefer synced frames over AA in any game where the scene can change significantly frame to frame, like an FPS. I run dual 512MB 7800s with a 3.3GHz Opteron on a 2405FPW and find it marginally acceptable at native res vsynced in most games. It is certainly the most demanding quality improvement you can run, and I consider it a better step up than 30" LCD-res gaming, which is out of most people's budget. Running frame-locked at 1280x1024 is something many people can enjoy, and I would like to see it addressed in at least a minimal manner.

Brent_Justice said:
The problem with forcing vsync is that your framerate snaps to an integer fraction of the display's refresh rate, unless of course you force triple buffering, but that adds a third framebuffer's worth of video memory.


Having a good monitor with high refresh rates is every bit as important as having a quality video card.

People fail to realize that without a high refresh rate your monitor will not be able to keep up with these speed-demon video cards, producing more "visual tearing" and increasing the need for vsync. This disparity grows with each new generation of video cards.
I, for one, spent $750 on a high-quality CRT a long time ago, capable of 100Hz at 1600x1200. Most of the time I don't even need to enable vsync.

What good is a video card that can produce 200 frames per second if your monitor cannot display them properly?
 
Thank you for finally addressing the shimmering. While some people may not notice it, I cannot even use NVIDIA's products anymore because of it. I'm glad to see you guys finally mentioned this publicly, as NVIDIA just doesn't believe there is an issue.
 