bit-tech's 7950GX2 review

Mercutio

Weaksauce
Joined
May 8, 2006
Messages
86
bit-tech.net's review of the 7950GX2 calls into question that card's superiority. They reported higher quality playable settings with the X1900XTX in some significant tests.

Oblivion CRT & LCD - X1900XTX has higher playable settings with comparable framerates.

DOD:S - 7950GX2 has 8xAA vs. 4xAA for the X1900XTX. But the X1900XTX's minimum framerate is 42 fps vs. 25 for the 7950. LCD performance is equal for both cards.

BF2 - Again the 7950GX2 pulls 8xAA vs. the X1900XTX's 4xAA with CRT settings. Bottom framerates are within 6 fps of each other, but the 1900 averages ~20 fps higher. With LCD settings, the 7950 posts higher framerates at higher AA settings.

COD2 - The 7950GX2's sole unequivocal win in this roundup.


The 7950GX2 definitely performs better in COD2 and the X1900XTX performs better in Oblivion, but each card takes something away from DOD:S and BF2. IIRC, bit-tech's review is a bit different from all of the others thus far, so it'll be interesting to see what [H] comes up with.
 
-=ABUSIVE-69=- said:
I think that these cards just might be cpu bound

How'd you come to this conclusion? Normally you would need to see a test that shows differences between two CPUs to even justify that thought (the bit-tech article only has one CPU).


Anyway, your statement is useless and wrong unless you include the application and settings used.

Is a 7950 CPU bound in FEAR at 2560x1600 8xAA? Most certainly not. This may be an extreme example, but hopefully you get the drift.
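
To make that concrete, here's a minimal sketch of the usual way to diagnose a CPU bottleneck: run the same timedemo at several resolutions and see whether the framerate scales. If fps barely moves as resolution climbs, the CPU is the limit at those settings; once fps starts dropping, the GPU is. This is just my own illustration - the run_benchmark helper and its numbers are invented, not taken from any review.

def run_benchmark(width, height):
    # Stand-in for an actual timedemo run; returns average fps.
    # These numbers are invented purely for illustration.
    fake_results = {
        (1024, 768): 118.0,   # fps nearly flat at low resolutions...
        (1600, 1200): 116.5,  # ...which points at a CPU limit
        (2560, 1600): 61.2,   # fps finally drops: the GPU is now the limit
    }
    return fake_results[(width, height)]

resolutions = [(1024, 768), (1600, 1200), (2560, 1600)]
results = [(res, run_benchmark(*res)) for res in resolutions]

# Compare each step up in resolution: if fps stays within ~5%,
# the test was CPU bound at the lower resolution's settings.
for (res_a, fps_a), (res_b, fps_b) in zip(results, results[1:]):
    verdict = "CPU bound" if fps_b / fps_a > 0.95 else "GPU bound"
    print(f"{res_a} -> {res_b}: {fps_a:.1f} -> {fps_b:.1f} fps ({verdict})")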
 
-=ABUSIVE-69=- said:
I think that these cards just might be cpu bound
I don't think so. The resolutions and settings seem to be aimed at maxing out the video card.
 
Mercutio said:
bit-tech.net's review of the 7950GX2 calls into question that card's superiority. They reported higher quality playable settings with the X1900XTX in some significant tests.

Oblivion CRT & LCD - X1900XTX has higher playable settings with comparable framerates.

DOD:S - 7950GX2 has 8xAA vs. 4xAA for the X1900XTX. But the X1900XTX's minimum framerate is 42 fps vs. 25 for the 7950. LCD performance is equal for both cards.

BF2 - Again the 7950GX2 pulls 8xAA vs. the X1900XTX's 4xAA with CRT settings. Bottom framerates are within 6 fps of each other, but the 1900 averages ~20 fps higher. With LCD settings, the 7950 posts higher framerates at higher AA settings.

COD2 - The 7950GX2's sole unequivocal win in this roundup.


The 7950GX2 definitely performs better in COD2 and the X1900XTX performs better in Oblivion, but each card takes something away from DOD:S and BF2. IIRC, bit-tech's review is a bit different from all of the others thus far, so it'll be interesting to see what [H] comes up with.

Quote from the conclusion of this article:

There is no doubting that the GeForce 7950 GX2 is the fastest video card on the market today that fits into a single PCI-Express x16 interconnect, and it holds that crown by a good distance in the majority of titles.

They further go on to say about Oblivion:

If you haven't had the chance to play The Elder Scrolls IV: Oblivion, and are considering buying a new video card to play the game, the choice will depend on your preference for Anti Aliasing, or not as the case may be. A lot of gamers cannot live without Anti Aliasing - if you're one of those people and want to play Oblivion with HDR turned on (we recommend you do turn it on, by the way), we'd recommend looking at ATI's current flagship card.

ATI is keen to point out the advantages of running HDR and AA. However, Oblivion is currently the only major game that ATI can do that NVIDIA can't - Tomb Raider and the Source engine are both supported by NVIDIA, whilst Ghost Recon doesn't work on either.

So I don't see how you come to the conclusion that they're questioning the power of the 7950.
 
J-Mag said:
How'd you come to this conclusion? Normally you would need to see a test that shows differences between two CPUs to even justify that thought (the bit-tech article only has one CPU).


Anyway, your statement is useless and wrong unless you include the application and settings used.

Is a 7950 CPU bound in FEAR at 2560x1600 8xAA? Most certainly not. This may be an extreme example, but hopefully you get the drift.
I didn't say that they are, I just stated what I thought. Read more carefully.
 
Mercutio said:
Oblivion CRT & LCD - X1900XTX has higher playable settings with comparable framerates.

The CRT chart actually shows that the 7950GX2 allows for higher shadow detail and max fade distances with a higher minimum and average framerate over the X1900XTX. As usual, however, it won't do AA+HDR.

The LCD section shows similar results: higher in-game detail settings and a higher average framerate, but again it won't do AA+HDR.
 
-=ABUSIVE-69=- said:
I didn't say that they are, I just stated what I thought. Read more carefully.

So you admit your thoughts are wrong then? I think you should think more carefully :p
 
J-Mag said:
So you admit your thoughts are wrong then? I think you should think more carefully :p
No, I said that I THINK, I have no proof and I never said I did, that these cards might be CPU bound. Can you give me a better reason why they, in many cases, perform worse than a 1900XTX?
 
-=ABUSIVE-69=- said:
No, I said that I THINK, I have no proof and I never said I did, that these cards might be CPU bound.

When people finalize a thought, they come to a conclusion. You just happen to have come to a conclusion with no reasoning. If you hadn't come to a conclusion, then your post would have read something along the lines of "I wonder if the 7950 is CPU bound?"

Anyway, you NEED to include an application and its settings when even talking about something being CPU bound. Do you ever just use your CPU and GPU? NO, you use a CPU and GPU with SOFTWARE!

There will be many circumstances where a 7950 is CPU bound and many where it is not.

-=ABUSIVE-69=- said:
Can you give me a better reason why they, in many cases, perform worse than a 1900XTX?

Yeah, actually, I could explain, but I am worried about your ability to comprehend, as you don't seem to understand the concept of being CPU bound.
 
d00d everyone knows the 7950 is totally cpu bound in all games on all pc's with any grafix settingz lawl imo 111!!11oneone
 
headless said:
d00d everyone knows the 7950 is totally cpu bound in all games on all pc's with any grafix settingz lawl imo 111!!11oneone


d00d, everyone knows your sig is longer than the allotted 10 lines.... !!!111!! oneone.
 
J-Mag said:
How'd you come to this conclusion? Normally you would need to see a test that shows differences between two CPUs to even justify that thought (the bit-tech article only has one CPU).


Anyway, your statement is useless and wrong unless you include the application and settings used.

Is a 7950 CPU bound in FEAR at 2560x1600 8xAA? Most certainly not. This may be an extreme example, but hopefully you get the drift.

someone's flexing their e-penis
 
Time for a resurrection. The [H] review, by far the highest quality 7950GX2 review, had me thinking that card would be the way to go. I'm ready to trade in my X1900XT.

But then I took another gander at bit-tech's review... I don't understand why bit-tech shows the X1900XT outperforming the 7950GX2 in several tests whilst the [H] review shows the 7950GX2 outperforming CrossFired X1900XTs. Keep in mind that I'm comparing like with like (Oblivion @ 1920x1200).

Perhaps bit-tech didn't enable SLI mode?


bit-tech's Oblivion LCD benchies
H's Oblivion LCD benchies
 
Mercutio said:
Time for a resurrection. The [H] review, by far the highest quality 7950GX2 review, had me thinking that card would be the way to go. I'm ready to trade in my X1900XT.

But then I took another gander at bit-tech's review... I don't understand why bit-tech shows the X1900XT outperforming the 7950GX2 in several tests whilst the [H] review shows the 7950GX2 outperforming CrossFired X1900XTs. Keep in mind that I'm comparing like with like (Oblivion @ 1920x1200).

Perhaps bit-tech didn't enable SLI mode?


bit-tech's Oblivion LCD benchies
H's Oblivion LCD benchies
We use 'high quality' driver settings for our testing as the shimmering gets rather unbearable in many circumstances - I prefer shimmer-free gaming and we have made our readers well aware of our feelings on that matter since the launch of GeForce 7900 GTX.

From my understanding of the linked HardOCP table, the X1900 CrossFire outperformed the single GeForce 7950 GX2 by a long shot in Oblivion at 1920x1200 (as you would expect). The X1900s are using HDR & 2xAA, HQ AF with near maximum details, while the 7950 GX2 has no grass, medium shadows and no AA with HDR.

Maybe I've missed the point you're trying to make?


As for the other games you mentioned, running with higher AA settings is going to result in a lower frame rate (because the card is doing more work). However, the fact that one is 'playable' and the other is not at a given setting implies that the higher frame rate on the Radeon X1900XTX is nullified by the higher quality settings on the GeForce 7950 GX2.


Tim
 
bigz said:
We use 'high quality' driver settings for our testing as the shimmering gets rather unbearable in many circumstances - I prefer shimmer-free gaming and we have made our readers well aware of our feelings on that matter since the launch of GeForce 7900 GTX.

There shouldn't be many, if any, shimmering problems with the new ForceWare 90 series drivers. The shimmering issues should, for the most part, be fixed. And if you're going to run high quality mode on the nVidia drivers, you need to disable Catalyst AI as well. nVidia has disabled HQAF in the drivers while ATI has given the option to enable it. It's mostly a marketing thing, plus the fact that HQAF has a pretty big impact on performance. ATI's HQAF also causes some noticeable graphical anomalies at times in certain games.

We set Oblivion's graphical quality settings to "High." The screen resolution was set to 1600x1200 resolution, with HDR lighting enabled. 16X anisotropic filtering was forced on via the cards' driver control panel.

http://techreport.com/reviews/2006q2/geforce-7950-gx2/index.x?pg=5

The 7950 GX2 pulled an average of 14.2 fps more than the X1900 XTX at 1600x1200 w/ HDR + 16xAF at High Quality settings in Oblivion. At 1920x1200 the difference would be even greater in favor of the 7950 GX2. The X1900 XTX supports higher IQ, being able to do HQAF and HDR+AA, but the 7950 GX2 is still the much faster card, especially at high resolutions like 1920x1200.

ATI cards do have better AF filtering, though, while nVidia cards produce better AA. For the most part you're not going to notice either when actually playing the game, but you will notice drops in fps. The nVidia 7950 GX2 is going to pull the highest fps without even trying.
 
Bunch of lame haters. Wait for some new drivers that will flex the card's muscle. They only released it this soon cuz 90% of you salivating video card freaks hunger for cards and none of you have any patience.
 
Homeslice said:
Bunch of lame haters. Wait for some new drivers that will flex the card's muscle. They only released it this soon cuz 90% of you salivating video card freaks hunger for cards and none of you have any patience.

Yeah, the card was launched last week and nVidia STILL doesn't have WHQL drivers for the 7950. I'm curious if the next WHQL driver will support QUAD-SLI officially or not.

Seems like they could have released a WHQL certified driver by now if they weren't trying to get QUAD-SLI support in there as well.
 
burningrave101 said:
There shouldn't be many, if any, shimmering problems with the new ForceWare 90 series drivers. The shimmering issues should, for the most part, be fixed. And if you're going to run high quality mode on the nVidia drivers, you need to disable Catalyst AI as well. nVidia has disabled HQAF in the drivers while ATI has given the option to enable it. It's mostly a marketing thing, plus the fact that HQAF has a pretty big impact on performance. ATI's HQAF also causes some noticeable graphical anomalies at times in certain games.
Trust me, it is still there - Brent also picked up on it.

I don't know what you're talking about when it comes to HQ AF graphical anomalies - I've used an X1900 across at least 20 different titles and not seen graphical issues with the option enabled. There were some issues with Oblivion early on, but that was a driver issue in general - it's fixed now. HQ AF isn't a marketing thing, as it does provide visual benefits and massively reduces texture shimmering. Both video cards produce similar outputs when HQ AF / HQ drivers are set up, with a slight advantage to ATI.

Disabling Catalyst AI removes virtually all shimmering - it's not possible to do that with NVIDIA hardware at the moment. Even with the highest driver settings, there are still elements of shimmering, which leads me to believe that there are still optimisations turned on. Or alternatively, the filtering hardware just isn't as good as ATI's.

burningrave101 said:
http://techreport.com/reviews/2006q2/geforce-7950-gx2/index.x?pg=5

The 7950 GX2 pulled an average of 14.2 fps more than the X1900 XTX at 1600x1200 w/ HDR + 16xAF at High Quality settings in Oblivion. At 1920x1200 the difference would be even greater in favor of the 7950 GX2. The X1900 XTX supports higher IQ, being able to do HQAF and HDR+AA, but the 7950 GX2 is still the much faster card, especially at high resolutions like 1920x1200.

ATI cards do have better AF filtering, though, while nVidia cards produce better AA. For the most part you're not going to notice either when actually playing the game, but you will notice drops in fps. The nVidia 7950 GX2 is going to pull the highest fps without even trying.
No disrespect to Scott - I always enjoy reading his thoughts because they are usually right on the mark. However, his testing area (or areas) in Oblivion are nowhere near as stressful as the ones used by myself and Brent. The minimum frame rate he recorded on GeForce 7950 GX2 at 1600x1200 is higher than my average frame rate at the same resolution.

If you read what I said in my concluding remarks, and also underneath the table at 1920x1200, I made reference to the fact that the 7950 GX2 didn't suffer from a lack of speed; it suffered from a lack of features. I feel that there was enough performance available for higher details, but they're not available because there is no HDR+AA support.
 
bigz said:
Trust me, it is still there - Brent also picked up on it.

I don't know what you're talking about when it comes to HQ AF graphical anomalies - I've used an X1900 across at least 20 different titles and not seen graphical issues with the option enabled. There were some issues with Oblivion early on, but that was a driver issue in general - it's fixed now. HQ AF isn't a marketing thing, as it does provide visual benefits and massively reduces texture shimmering. Both video cards produce similar outputs when HQ AF / HQ drivers are set up, with a slight advantage to ATI.

nVidia can't do HQAF. They haven't given the option to enable it in the NV4x and G7x series cards because of the performance hit. It is an option for the older FX series though. What ATI is doing is angle-independent anisotropic filtering, and nVidia has tried to say their cards don't support it, which is false. It's a marketing thing on nVidia's part. nVidia should give the option to enable HQAF on the 7950 GX2 though, because it's more than fast enough to handle it.

nVidia does have the advantage with shadow rendering, though, and the method used in most games is what nVidia uses, so it's better in that respect than ATI cards for IQ. A lot have said the AA quality on the G7x series is better than the X1900's as well. I guess your purchase just has to be based on whether you want max IQ or whether you want to pull a lot higher fps with pretty good IQ.

Another thing to remember, though, is the fact that a single X1900 XTX pulls around 27W more under load than a 7950 GX2. I would also say the X1900 XTX is probably still hotter and noisier as well.

I've got a new XFX 7950 GX2 Extreme that came yesterday, but I'm thinking now I may just keep my X1900 XTX and wait till the next generation for more of a performance improvement and maybe some other advances like DX10.
 
I fired up my 7950s last night with BF2 and if there was shimmering, I did not notice it. I was using the 91.31 beta in Quad SLI mode.

I must say that's the first time I could say that in a long time.
 