NVIDIA GeForce 7900 Series Preview @ [H] Enthusiast

Coolmanluke said:
I'd be willing to wager that in a "blind taste test," so to speak, none of us, not even Kyle or Brent, would be able to tell the difference between the two flagship cards. Things are good for us gamers right now. Looking forward to Unreal Tournament.

I could hear the difference for sure.
 
RoffleCopter said:
This may come off as a bit of a dumb question but I'll ask anyway:

I noticed that you didn't put a "Must Have [H]ardware" thingy at the conclusion of the preview, whereas you put one for the X1900 series. Is there any particular reason for that, or did you maybe forget to put one there?


I think we should save those for retail products. And to be fair, there is not a hard and fast rule governing this. Had the 7900 simply knocked my socks off or left my jaw on the ground, we would have probably given it one. It is PURELY subjective.
 
Awesome job on the preview. By far the best I've read so far. I love how there is a new widescreen section. That helps a lot to see the performance I will be expecting. Keep it up!
 
I don't know about you guys, but I expected a little bit more out of this. The 7900GT and GTX only differ in clock speeds. They only outperform ATI at ridiculously high resolutions [according to Anandtech]. And, the prices aren't so low as to dissuade people from buying ATI.

Are my facts wrong, or am I overlooking something? I refuse to believe that nVidia messed this up.
 
Bona Fide said:
I don't know about you guys, but I expected a little bit more out of this. The 7900GT and GTX only differ in clock speeds. They only outperform ATI at ridiculously high resolutions [according to Anandtech]. And, the prices aren't so low as to dissuade people from buying ATI.

Are my facts wrong, or am I overlooking something? I refuse to believe that nVidia messed this up.

It's shorter, so it's obviously different hardware. Just because the pipeline count is the same doesn't mean it's the same card.

And who really cares about low resolutions? The 7900GTX was competing just fine at 1600x1200. These cards ARE supposed to be even, remember. They are the SAME generation, and only about 1.5 months apart, unlike the 7800 and X1800.

And $500 on launch day isn't low?!? I think you need to check Newegg again. X1800s weren't available for less than $600 when they launched.
 
Kyle/Brent, like NVIDIA, ATI has two Adaptive AA modes (performance and quality), but in the case of ATI you only mention one mode. Which mode do you use for ATI?
 
I was under the impression that the 7900GTX-DUO (2 GPUs on one board) would be announced at the same time. Does anyone know when that's coming?

You guys are making me broke. I bought a 7800GTX 256MB one week ago; now I have to return it and buy a 7900GTX 512MB (or the DUO!!!).
 
The Widescreen benchmarks are long overdue.


Many gamers on this forum are using either 24" or 20" WS LCDs. It would be nice to see 1680x1050 take the place of 1600x1200 but I guess the results would be similar enough that it may not matter.

However, 1920x1200 is a great start and is probably the most commonly used resolution for widescreen gaming. The next logical step would be 27" WS displays, as they seem to be coming to the forefront. Both Dell and Samsung have 27" displays on the way, and many 2405 and 2005 owners will be upgrading to those. Kudos for WS resolution benchmarks.
 
Kyle,
*cough* 7600GT *cough*? There is mention of the card at the beginning of the article, pictures too, so you obviously had access to one. Why no benchmarks/evaluation?

The 6600GT is/was one of my favorite GPUs of all time (for its time), so I was excited to see what the 7600GT would bring to the table.
 
emaciated said:
Kyle,
*cough* 7600GT *cough*? There is mention of the card at the beginning of the article, pictures too, so you obviously had access to one. Why no benchmarks/evaluation?

The 6600GT is/was one of my favorite GPUs of all time (for its time), so I was excited to see what the 7600GT would bring to the table.

Explained in detail in the article.
 
Coolmanluke said:
Having a good monitor with high refresh rates is every bit as important as having a quality video card.

People fail to realize that without high refresh rates your monitor will not be able to keep up with these speed-demon video cards, producing more visual tearing and increasing the need for v-sync. This disparity grows with each new generation of video cards.
I for one spent $750 on a high quality CRT a long time ago, capable of 100Hz at 1600x1200. Most of the time I don't need to enable it.

What good is having a video card that can produce 200 frames per second if your monitor cannot display them properly?


You guys are missing the point. I don't run an LCD because I am a moron and don't understand what refresh rate is. The problem is what is state of the art right now, and what an individual's requirements are. If I could buy 3 1920x1200 120Hz SED displays that consumed 100W total to replace my 2405s, I would do it this second. There are always compromises in a specification, and simplifying one requirement as an umbrella for all the others is just ignorant. There are no cheap (<$1k), high speed (>100 Hz), low power (<80W), light weight (<30lbs), high resolution (>=1920x1200) displays for general consumption anywhere.

The question is how you maximize the potential for what is possible at a consumer price point, looking at ALL of the specs, and then optimize for your own application. For me? Vsync on a 1920x1200@60Hz LCD means a SHITLOAD more than running at 2560x1600 at any speed, and running anything on a 4:3 CRT that consumes 200W is not even an option. I've thrown out 3 21" Trinitron CRTs in the last 2 years; it's a dead path as far as I'm concerned.

However, that is MY requirement for MY needs. I do know that LCDs are quite popular now, and they are all limited to low refresh rates and display horrible frame tearing. So anything that can be done on the video card end to mitigate the problem is welcome, since the display technology more or less sucks ass right now.
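For what it's worth, one of the few things that can be done on the video card end today is v-sync itself: asking the driver to hold each buffer swap until the vertical blank, which removes the tearing at the cost of capping the frame rate at the display's refresh. A rough sketch of what that request looks like in application code (assuming a Windows/OpenGL program with a current GL context and the WGL_EXT_swap_control extension; purely illustrative, not something from the article) is:

/* Illustrative only: request v-sync so buffer swaps wait for the vertical
   blank. Assumes a Windows/OpenGL app with a current GL context and the
   WGL_EXT_swap_control extension (needs opengl32.lib). */
#include <windows.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void enable_vsync(void)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT != NULL)
        wglSwapIntervalEXT(1);  /* 1 = sync every swap to the refresh, 0 = off */
}

Beyond that one knob, it really does come down to the display itself.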
 
I don't plan on upgrading my 6800NU PCI-E (overclocked 441/842) until NVIDIA comes up with 1GB of memory and DirectX 10 support, but I must say they have done a superb job with cooling improvements and transistor reduction in the 7900 series.
 
Apple740 said:
Kyle/Brent, like NVIDIA, ATI has two Adaptive AA modes (performance and quality), but in the case of ATI you only mention one mode. Which mode do you use for ATI?

The default setting, which is quality.
 
Kyle/Brent:

Any chance you're going to get hold of one of the XFX 7900GT XXX cards (560MHz core / 1650MHz memory) to test?
 
Now with this preview out... would it be worth going from a 7800GT to a 7900GTX, with my rig as in my sig?
 
Rishy said:
Now with this preview out... would it be worth going from a 7800GT to a 7900GTX, with my rig as in my sig?

Probably smarter to get a 7900GT and see how far you can overclock it :)
 
If it's a matter of performance rather than money, I would say yes.

I have a 6800GT @ Ultra and I'm thinking of getting the 7900GT.
 
codeflux said:
Looks like Newegg has sold out of the 690MHz versions of the 7900GTX, the eVGA and the XFX, at $599 each.

So I keep refreshing the page every 10-20 minutes to see if they still have one :), and see...

They raised the prices from $599 to $748!!!

eVGA
http://www.newegg.com/Product/Product.asp?Item=N82E16814130280

XFX
http://www.newegg.com/Product/Product.asp?Item=N82E16814150135

Prices on each are $599 but out of stock. No need to alarm people into thinking prices are going through the roof. Now if my damn tax returns would just hurry up and get here.
 
SpoogeMonkey said:
Prices on each are $599 but out of stock. No need to alarm people into thinking prices are going through the roof.
It seems that they took it back down to $599, but yes, it is still out of stock.

I have just ordered an XFX 7900GTX with a 675MHz core. :) :) :)
 
They run cooler, are smaller, have fewer transistors, and they don't make you stuff cotton in your ears.

Kyle:

I read through the entire review; great article and very useful info as well.

Comparing this review to an ATI X1800XT review, I vaguely recall that in the ATI review there was a dB measurement test.

The noise that one person accepts as quiet may be considered somewhat noisy by another, so in terms of "quiet", how many dB are we talking about? Is there no need for water cooling? There is no detailed description of the noise level in this review; is there a way for you to describe how quiet it is?
 
This was a superb review; kudos to the [H] team.

The widescreen eval was spot on, and I thoroughly enjoyed it. The sections in the review seemed very fair and provided exactly the information most people want or look for in a review, which is an experienced opinion with some facts.

From the review I come away feeling that both cards are top notch and that it comes down to price and personal preference, which isn't bad.

The 7600GT is interesting though, considering it outperforms an X1600 by large margins and is comparable or even with an X1800 GTO while being cheaper. The 7900GT and X1800 GTO are close in price from what I saw, which makes it a no-brainer. It will be interesting to see what the prices are in two months, hmmm.
 
I'm surprised this slipped through; normally you guys are very good:
provides H.264 accelerated support for all GeForce 6 and GeForce 7 series video cards.
Emphasis mine
Here:
http://enthusiast.hardocp.com/article.html?art=MTAwMSwzLCxoZW50aHVzaWFzdA==

It's been known for a while that the 6800 GT and 6800 Ultra cards do NOT support H.264 acceleration in hardware, but the 6600GT and its lesser brothers do. It's kind of like how the GF4 MX sucked in almost all areas but had a few video acceleration features that the GF4 Ti series didn't.
I'm pretty sure I've seen this info in a number of reviews of the GeForce 6 series (maybe even in some [H] ones).

Here's the official word on which chips support H.264 hardware acceleration and which don't:
http://www.nvidia.com/page/purevideo_support.html
Scroll down to the PCI Express chart and check it out.
Under: H.264 Decode Acceleration
Some other minor players in the GF6 family don't support it either, like some of the 6200s and the 6100s.
(AGP people have it even worse: NONE of the 6800 series supports it in the AGP version.)

You may want to re-word that so that it says most GF6 and all GF7.

BTW, NVIDIA charges you money for the PureVideo media player/decoder that supports this acceleration. Depending on the version it's anywhere from $20-$50.
http://store.nvidia.com/
Although they do offer a 30 day trial:
http://www.nvidia.com/object/dvd_decoder.html

IMHO, this should be provided free to owners of their cards; I mean, the cards are expensive enough. But I imagine they have to charge for licensing reasons (like the DVD decoding stuff).


I would be remiss if I didn't express my thanks for widescreen gaming tests. As you say, this resolution is one place where SLI can really show its stuff. Thanks!!
 
101 said:
You guys are missing the point. I don't run an LCD because I am a moron and don't understand what refresh rate is. The problem is what is state of the art right now, and what an individual's requirements are. If I could buy 3 1920x1200 120Hz SED displays that consumed 100W total to replace my 2405s, I would do it this second. There are always compromises in a specification, and simplifying one requirement as an umbrella for all the others is just ignorant. There are no cheap (<$1k), high speed (>100 Hz), low power (<80W), light weight (<30lbs), high resolution (>=1920x1200) displays for general consumption anywhere.

The question is how you maximize the potential for what is possible at a consumer price point, looking at ALL of the specs, and then optimize for your own application. For me? Vsync on a 1920x1200@60Hz LCD means a SHITLOAD more than running at 2560x1600 at any speed, and running anything on a 4:3 CRT that consumes 200W is not even an option. I've thrown out 3 21" Trinitron CRTs in the last 2 years; it's a dead path as far as I'm concerned.

However, that is MY requirement for MY needs. I do know that LCDs are quite popular now, and they are all limited to low refresh rates and display horrible frame tearing. So anything that can be done on the video card end to mitigate the problem is welcome, since the display technology more or less sucks ass right now.

What are you talking about? Like you said, LCDs are quite popular now and they suffer from low refresh rates.

Coolmanluke said:
What good is having a video card that can produce 200 frames per second if your monitor cannot display them properly?

Did you bother to read this? It's an excellent point.
 
Happy Hopping said:
Kyle:

I read through the entire review; great article and very useful info as well.

Comparing this review to an ATI X1800XT review, I vaguely recall that in the ATI review there was a dB measurement test.

The noise that one person accepts as quiet may be considered somewhat noisy by another, so in terms of "quiet", how many dB are we talking about? Is there no need for water cooling? There is no detailed description of the noise level in this review; is there a way for you to describe how quiet it is?

Let me put it this way: the 7900 GTX's fan barely registers on the meter at full load at its highest RPM. The ambient sound in my office is louder than the fan, and that's with the video card sitting right here on my desk beside me.
 
Yeah, the 7900s run really quiet. I know because I have a couple sitting next to me in benches, and the CPU fan is louder than they are.
 
Impatience, thy name is cruelty. I knew I should have waited, but then, if I must wait for "top o' the line" then I will forever be waiting. It's like waiting for Ms. Right, when you could be extremely happy with Ms. Perfect-But-Not-Quite-Right but not perfectly happy. It's a stupid attitude because there will always be something that seems better on the horizon, until you see the price tag... which actually isn't bad at all this time.

Thanks for the info guys. I'm still thinking the benchmark data could be... made less equivocal... but the facts, general info, and your personal experiences are very helpful.

I thought that this preview in particular was very unbiased (granted, I skimmed over the actual benchmark data after the first page or so of it), especially in the compulsorily tactful section about shimmering.

This is also the first time I've seen an explanation of why I'm not seeing this shimmering. Ever since the big 3DMark incident, I have disabled whatever optimizations are possible through registry hacks and control panel settings. I haven't believed this shimmering issue the whole time, until now, because I never saw it. Now I know why I never saw it. I think I'll try it out later today with the optimizations on, to see what all the hubbub is about.
 
Nice evaluation guys.

I have just one request: can you also add, say, Day of Defeat (or any Source game) to your list of "games to test", please? Especially widescreen.

I reckon a lot of people (well, some at least) like to play with their Source, games that is, and it would be nice to see one tested (DoD:S especially, what with the HDR and all).

Cheers fellas
 
Quoted from article:
"What do we think about this texture crawling issue? We don’t like it and we hope NVIDIA makes some improvements with their filtering quality in their drivers soon. We are aware that you can force a higher mode of texture filtering by turning off some of their optimizations, but honestly we’d like the default filtering quality to simply be better without having to turn off optimizations by hand in their driver control panel. Be aware that some folks are much more alert to this shimmering issue than others, so if you have not seen it, I suggest you don’t go out and look for it.


Do understand that the reason NVIDIA leaves these harsh filtering optimizations on is two-fold. First, on smaller displays, it simply does not make a difference to the vast majority of gamers out there. Second, it gives NVIDIA based GPUs better benchmark numbers. And in the land of video card marketing, where the size of your ePenis score is king, you don’t want to give up a few benchmark points lest you have the other team waving their ePenis back at you.


If NVIDIA is going to leave these optimizations on by default in the driver, and they are well aware that these optimizations negatively impact your gaming experience on larger displays, that is the ruler we are going to judge them by. We will continue to point out filtering issues where we see them impacting our gameplay."


Having owned top of the line, latest generation NVIDIA and ATI cards, I've come to realize that you simply CANNOT compare their quality at the default driver levels. NVIDIA's default settings simply look worse than ATI's. This is not a Red vs Blue ordeal either.

Adjusting settings so both have similar visual quality will cause a very lopsided shootout, with ATI firmly trouncing NVIDIA. Like I said, I've owned the best from both companies, and ATI's visual quality is simply better with drivers set to default.

That being said, I am glad you acknowledge NVIDIA's problem, but then you make little of it. Sad indeed...
 
Kudos for the 1920x1200 testing and for covering the filtering issue. You even kind of took the LCD angle of opting to tune some other things for the sake of resolution in the slightly lower price bracket. All in all, best review yet. :D
 
Brent_Justice said:
Let me put it this way: the 7900 GTX's fan barely registers on the meter at full load at its highest RPM. The ambient sound in my office is louder than the fan, and that's with the video card sitting right here on my desk beside me.

Hey thanks, you just saved a lot of future 7900 owners a bunch of money on GPU water blocks.
 
That being said, I am glad you acknowledge NVIDIA's problem, but then you make little of it. Sad indeed...

What surprises me again and again is that [H] tries to obtain the highest possible and playable Image Quality, but sticks with the default Q mode for Nvidia...
 
Apple740 said:
What surprises me again and again is that [H] tries to obtain the highest possible and playable Image Quality, but sticks with the default Q mode for Nvidia...

They use the default driver settings for both NV and ATI. Don't make it sound like they are just doing it for one and not the other.

That being said, I too would like benches with the quality tab all the way to the right on all cards, as that's how I always try to play games.
 
Well, considering the fact that Kyle and Brent failed to answer a direct question as to which card produced better image quality, I can only assume ATI is better.

It doesn't matter if the XTX sounds like a jet engine screaming in my ear; if it has better image quality, sign me up. ;)
 
fallguy said:
They use the default driver settings for both NV and ATI. Don't make it sound like they are just doing it for one and not the other.

That being said, I too would like benches with the quality tab all the way to the right on all cards, as that's how I always try to play games.

At ATI's default, the quality slider (mipmap etc.) is maxed out. NVIDIA's default isn't.

And in almost every game they use angle-independent HQ AF for the XTX, while they don't push NVIDIA's AF to the max (HQ). What I want to know is how NVIDIA performs with its HQ AF.
 
Coolmanluke said:
Well, considering the fact that Kyle and Brent failed to answer a direct question as to which card produced better image quality, I can only assume ATI is better.

It doesn't matter if the XTX sounds like a jet engine screaming in my ear; if it has better image quality, sign me up. ;)

Hehe, people and their image quality. I think my 6800GT is decent, but it slows down in Counter-Strike: Source on some levels at 2xAA/8xAF at 1280x960, so I'm jumping up to SLI'ed 7900GTs (going to keep them for a long time, probably a year or so).

BTW, I only ever notice shimmering when I set the image setting to High Performance; otherwise it's virtually unnoticeable (for me).
 
RE image quality

We use the default driver configs on BOTH ATI and NV for our testing. This means in the NV drivers under Image Settings they are at "Quality" and ATI's drivers have Catalyst A.I. on at "Standard". THESE are the optimization settings and they are both at default settings.
 