I'm not going to trust Anandtech for GPU reviews any more.

Rocksta107

Limp Gawd
Joined
Nov 9, 2007
Messages
195
If I were to trust Anand, I would think that the 280 GTX was a worse card with a worse gaming experience than the 9800 GX2 for more money! Judging by their performance curve, I would look like a moron buying a 280 GTX. Then I come over to the [H] and see that the GTX 280 blows the 9800 GX2 away. Take Crysis for example. Looking below, the 9800 GX2 is the clear winner:
[Anandtech Crysis benchmark chart]

Then you check the [H] graph, and lo and behold, the GTX 280 is clearly showing a better experience in the game:
[[H] Crysis highest-playable-settings graph]


I think Anand has lost a lot of credibility in my view lately. I don't think the 280 GTX is what it should be for a $650 card, but it's clearly the current leader in gaming performance, and Anand just fails to reach this conclusion through inferior testing methodology.

I still don't think I'll get one, and I'll at least wait for the 4800 series to come out, but if I went with Anand, I would think this was a bloody disaster!

(Sorry for the repost from Nvidia, but I think this is more relevant as a general video card post)
 
i don't mean to sound off, but nothing about anand's article is wrong; it just reflects different testing methodology etc. to me it seems almost dead even between the gx2 and the 280gtx, that's my opinion. but i get that from both reviews :p
 
i don't mean to sound off, but nothing about anand's article is wrong; it just reflects different testing methodology etc. to me it seems almost dead even between the gx2 and the 280gtx, that's my opinion. but i get that from both reviews :p

No off sounding taken here! I appreciate the dialogue.

It seems to me that the results are very different. Looking above, the GX2 seems stuck at medium quality throughout, whereas the 280 handles high quality faster than the GX2 handles medium. Looking at the near 40 FPS from Anand, the suspicion is that all settings were turned to low.

Now, what good does that do at this point? If I'm shopping for a video card (which I am :p) I want to know which card is going to give me the best experience. Anand might be useful for knowing which card will run a certain setting faster, but really, I want the highest setting possible for the most satisfying visual experience I can get.

In that respect, those results are about as useful as an a$$hole on my elbow.
 
Those numbers are quite drastically different, aren't they?
I'd say there's more at work here than "testing methodology". I don't know who's right, [H] or Anand, so it's difficult to take sides.

I suppose I should take [H]'s side after looking at which website I'm on... Right?

Your best bet is to look at all the scores from the major reviewers and base your opinion on that. I wouldn't simply trust [H] or Anand by themselves.
 
You shouldn't base your purchasing decision on one review alone anyway. Try to read at least four or five reviews from different hardware sites before you decide.
 
Those numbers are quite drastically different, aren't they?
I'd say there's more at work here than "testing methodology". I don't know who's right, [H] or Anand, so it's difficult to take sides.
Well, AT uses timedemos and [H] records the performance during actual gameplay. Since I tend to play games... ;)

I do like to see both types of test results. Too bad the GTX 280 and 260 are so expensive. IMO, not worth it at $650 and $400. A pair of 8800 GT cards are available for around $300 and offer great performance if you have an SLI motherboard. That's still the current price/performance target to beat.
 
The GTX 280 at [H] is also an OC'd version. Yeah, not a huge amount, but still not the same as the reference one that Anand used.

But like others have said, you have to look at all of the major reviewers and decide. I also like to wait for users' first-hand experiences.
 
the problem with the anand test, and many other review sites out there for that matter, is the lack of information they provide about how they achieve their numbers. if you read the blurb the anand writer made about the crysis test, he doesn't tell readers what settings he used at all. differing numbers among reviewers are fine, as long as we know how they got them.
 
Well, AT uses timedemos and [H] records the performance during actual gameplay. Since I tend to play games... ;)

I do like to see both types of test results. Too bad the GTX 280 and 260 are so expensive. IMO, not worth it at $650 and $400. A pair of 8800 GT cards are available for around $300 and offer great performance if you have an SLI motherboard. That's still the current price/performance target to beat.

I very much agree. It has been shown that a number of drivers are optimised for timedemos. I hate to purchase a video card on one review alone, but these days I cannot trust the data anywhere but here. I would love to make my decision based on more than one data source, but I cannot name another source that I think is accurate and unbiased enough to give me solid ground for a major purchase.

I think this was well shown at the conclusion of "Benchmarking the Benchmarks", where Anand had reported a 3870X2 getting 34 FPS at certain settings in Crysis, but when you actually play the game at those settings, you end up with 18 FPS.

I just don't trust 'em.
 
There are a number of sites out there (hothardware, anandtech, and others) that show the 9800GX2 outperforming the GTX 280 in Crysis at 1920x1200. The things I have noticed which may explain the difference are as follows.

The [H] review has an overclocked CPU (3.66) and the drivers they used for the GTX cards are the 177.26 drivers.

The other reviews all have 3.0GHz CPUs and they all used the updated 177.34 drivers.

So the performance difference could easily be explained by either of these factors. If Crysis is CPU-bottlenecked, the 9800GX2 might perform very similarly to the GTX 280. Drivers could also affect these results, which goes without saying.
 
I'm glad I can look at both tests. It shows me that the GTX 280 is arguably faster than a 9800GX2, but depending on how you test, it can be beaten by 8800 GTs in SLi!

If GT200 were truly the monster it was supposed to be, the 9800 GX2 and 8800 GTs in SLi shouldn't be able to beat it no matter how you massage the numbers.
 
this thread is stupid, people are seeing what they want to see. reviews are running like 6 to 1 saying the gtx280 is not that great compared to the gx2; could every one of them be in an anti-gtx280 conspiracy?
 
There are a number of sites out there (hothardware, anandtech, and others) that show the 9800GX2 outperforming the GTX 280 in Crysis at 1920x1200. The things I have noticed which may explain the difference are as follows.

The [H] review has an overclocked CPU (3.66) and the drivers they used for the GTX cards are the 177.26 drivers.

The other reviews all have 3.0GHz CPUs and they all used the updated 177.34 drivers.

So the performance difference could easily be explained by either of these factors. If Crysis is CPU-bottlenecked, the 9800GX2 might perform very similarly to the GTX 280. Drivers could also affect these results, which goes without saying.

That's a good point. I wish I had some way of knowing whether that was the case or not. I would much prefer more thorough testing methodology from the other sites. I do not want to argue from silence, though; it seems to me via [H] that, all other components being the same, the 280 GTX is simply a faster card.

I would also think that if performance were highly CPU-limited, other cards would be affected by this as well. We would likely see more uniform results across the board.
 
I'm glad I can look at both tests. It shows me that the GTX 280 is arguably faster than a 9800GX2, but depending on how you test, it can be beaten by 8800 GTs in SLi!

If GT200 were truly the monster it was supposed to be, the 9800 GX2 and 8800 GTs in SLi shouldn't be able to beat it no matter how you massage the numbers.

I don't necessarily think that's true. It was well shown, back when Tom's liked to test everything at 800x600 with low quality to show how great new CPUs were for gaming, that at settings someone would actually want to play on, the CPU became irrelevant. Sure, maybe at 1900x1200 with all settings at low in Crysis, 8800GTs in SLi could beat a 280 GTX, but am I going to play Crysis with settings on low? No? Then which card setup will give me better gaming performance? I think this is where [H] shines through.

Though, admittedly, this doesn't take value into consideration. If you can have 8800 GTs in SLi for under $300 these days (which you likely can), then obviously the 280 GTX is a horrible value. I'm not arguing for the value of the 280 GTX here at all. I'm simply stating that in trying to establish performance, which is incidentally a key component of value, Anandtech is irrelevant.
 
There are a number of sites out there (hothardware, anandtech, and others) that show the 9800GX2 outperforming the GTX 280 in Crysis at 1920x1200. The things I have noticed which may explain the difference are as follows.

The [H] review has an overclocked CPU (3.66) and the drivers they used for the GTX cards are the 177.26 drivers.

The other reviews all have 3.0GHz CPUs and they all used the updated 177.34 drivers.

So the performance difference could easily be explained by either of these factors. If Crysis is CPU-bottlenecked, the 9800GX2 might perform very similarly to the GTX 280. Drivers could also affect these results, which goes without saying.
[H] also has an overclocked GTX 280, though they claim no gains:
The “OC” model we are evaluating today is overclocked out of the box according to BFGTech, though it is just barely overclocked from NVIDIA’s stock frequencies. The core clock is set at 615MHz which is only a 13MHz overclock from NVIDIA’s stock frequency of 602MHz. The stream processors are overclocked to 1.350GHz which is only a 54MHz overclock. The memory frequency is untouched and operates at the stock 2.214GHz. These overclocks do not yield any noticeable performance gains in our testing compared to NVIDIA stock frequencies.
 
Why is the stock GPU frequency 602 MHz anyway, and not 600 MHz? Did they push the card to the very limit to discover exactly what would be a stable frequency on all production cards?
 
Why is the stock GPU frequency 602 MHz anyway, and not 600 MHz?
With this kind of decoupled architecture, where the SP frequency scales with the core clock off a 27MHz crystal clock, every MHz counts ;)
 
Why is the stock GPU frequency 602 MHz anyway, and not 600 MHz?
That's probably the closest frequency at least equal to 600MHz after the core reference clock is multiplied/divided on the chip.
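That "closest achievable frequency" idea can be sketched with a toy model (this is purely illustrative, not NVIDIA's actual clock tree; the 27 MHz reference and the multiplier/divider ranges are assumptions):

```python
# Toy model of clock synthesis: the core clock is generated from a 27 MHz
# reference as ref * M / D for small integers M and D, and the firmware picks
# the lowest achievable frequency at or above the marketing target.
def closest_at_or_above(target_mhz, ref_mhz=27, max_mult=128, max_div=4):
    candidates = {
        ref_mhz * m / d
        for d in range(1, max_div + 1)
        for m in range(1, max_mult + 1)
    }
    return min(f for f in candidates if f >= target_mhz)

# Targeting a round 600 MHz lands on an off-round value under these assumed
# ranges, which is the flavor of effect being described above.
print(closest_at_or_above(600))  # 600.75 in this toy model
```

With different assumed multiplier/divider ranges you would land on a different off-round number; the point is just that a synthesized clock rarely hits a round marketing figure exactly.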
 
Personally I went the opposite route and stopped reading [H]'s video card reviews. [H]'s testing methodology, although it sounded like a good idea when first conceived (best playable settings for each card), leaves too much up to the discretion of the reviewer. From the benchmarks you quoted, [H] got a higher framerate with the GX2. I mean, the framerate hit going from 8X AF to 16X AF is pretty much nothing on modern graphics cards; you can be pretty sure that on the GX2 you could enable 16X AF with almost no performance hit in Crysis. Anyone can verify this on their own and see the hit going from 8X AF to 16X AF is almost nothing.

The same could be said about the other settings that [H] dropped from high to medium. When I tried tweaking Crysis, I found that there were only a few settings that made a significant difference in framerate. Anyways, [H]'s testing methodology leaves too much room for the reviewer to massage a review in the manner they want.

Lowering AF settings, which is pointless in terms of framerate gained and which I've seen time and time again in [H] reviews, is why I no longer bother reading them. Although it had good intentions, I wish [H] would just ditch this stupid max-playable-settings testing.
 
[H]'s reviews provide an "apples to apples" section too.

I agree that the method is kind of unreliable. One gamer might think FSAA is most important, sacrificing things like shader quality, AF, etc. to achieve 4x AA at a good framerate... another may be willing to turn off AA and AF altogether to be able to crank every in-game setting to max.
 
that on the GX2 you could enable 16X AF with almost no performance hit in Crysis; anyone can verify this on their own and see the performance hit is almost nothing going from 8X AF to 16X AF.

you would be wrong sir. the hit may equate to a handful of frames, but a handful of frames means a lot in crysis.
 
if you're so sold on the 280 gtx, why don't you just dump $580 on it NOW? nvidia'll just laugh all the way to the bank
 
[H]'s reviews provide an "apples to apples" section too.

I agree that the method is kind of unreliable. One gamer might think FSAA is most important, sacrificing things like shader quality, AF, etc. to achieve 4x AA at a good framerate... another may be willing to turn off AA and AF altogether to be able to crank every in-game setting to max.

I wish they would run 2xAA and 8x AF with everything in-game set to highest at 1920x1200, and make it apples to apples. Is that too much to ask, hard?
 
I wish they would run 2xAA and 8x AF with everything in-game set to highest at 1920x1200, and make it apples to apples. Is that too much to ask, hard?

this setting is unplayable for overclocked 8800 ultra sli, so i'm guessing it wouldn't fly for a 280 either.
 
Rocksta - Anand's testing isn't wrong from the standpoint of average frames per second. That said, where HardOCP's testing shows its value is in demonstrating that the newer cards maintain a more even framerate, with higher lows against a similar level of highs. There is less variation in the framerate, giving a more consistent experience relative to the SLI 9800GTX.

Anand's benchmarks would conclude that the new cards are a poor value, while HOCP's actually demonstrate greater value, since the cards keep things more consistent. You should email the author and let him know that their testing methodology doesn't provide the whole picture.
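The average-versus-lows point is easy to demonstrate with a quick sketch (the frame-time traces below are made up for illustration, not taken from either review):

```python
# Two hypothetical 10-second traces of per-frame render times (milliseconds).
# Both average exactly 40 FPS, but one stutters badly; a timedemo-style
# average can't tell them apart, while the worst-frame number can.
def fps_stats(frame_times_ms):
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    worst_fps = 1000 / max(frame_times_ms)
    return round(avg_fps, 1), round(worst_fps, 1)

steady = [25.0] * 400                  # a constant 40 FPS
spiky = [20.0] * 380 + [120.0] * 20    # mostly 50 FPS with 8 FPS stutters

print(fps_stats(steady))  # (40.0, 40.0)
print(fps_stats(spiky))   # (40.0, 8.3)
```

Both traces report the same 40 FPS average, which is why a review that only publishes averages can rank a stuttery setup above a smooth one.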
 
Personally I went the opposite route and stopped reading [H]'s video card reviews. [H]'s testing methodology, although it sounded like a good idea when first conceived (best playable settings for each card), leaves too much up to the discretion of the reviewer. From the benchmarks you quoted, [H] got a higher framerate with the GX2. I mean, the framerate hit going from 8X AF to 16X AF is pretty much nothing on modern graphics cards; you can be pretty sure that on the GX2 you could enable 16X AF with almost no performance hit in Crysis. Anyone can verify this on their own and see the hit going from 8X AF to 16X AF is almost nothing.

The same could be said about the other settings that [H] dropped from high to medium. When I tried tweaking Crysis, I found that there were only a few settings that made a significant difference in framerate. Anyways, [H]'s testing methodology leaves too much room for the reviewer to massage a review in the manner they want.

Lowering AF settings, which is pointless in terms of framerate gained and which I've seen time and time again in [H] reviews, is why I no longer bother reading them. Although it had good intentions, I wish [H] would just ditch this stupid max-playable-settings testing.

I don't really do much more than glance at [H]'s numbers. Way too much left to the discretion of the reviewer. Plus, once they start breaking down high/medium/low settings with "draw distance 2300", "grass distance" and all those other settings, it's just too much work to go back and forth between charts. I don't think I've ever seen a resolution/AA/AF/advanced setting that they use in their tests that I would personally use. Having the "apples to apples" helps a bit, but they're breaking things down too much with draw distances and such in their normal testing. I think there should only be 3 things changed if they want to use their current testing method: resolution, AF and AA. Everything else should be the same across the board. IMO they are making more work for themselves than necessary. I appreciate the effort, but for me it doesn't translate into results.
 
I found another review that showed the 9800GTX in SLI beating the 280 in four of seven tests. I'm more tempted to spend on two 9800 GTXs than a 280, primarily because you're getting value for money (about $200 cheaper each), and fps-wise, would you honestly notice a difference of 2-3 fps?
 
I don't really do much more than glance at [H]'s numbers. Way too much left to the discretion of the reviewer. Plus, once they start breaking down high/medium/low settings with "draw distance 2300", "grass distance" and all those other settings, it's just too much work to go back and forth between charts. I don't think I've ever seen a resolution/AA/AF/advanced setting that they use in their tests that I would personally use. Having the "apples to apples" helps a bit, but they're breaking things down too much with draw distances and such in their normal testing. I think there should only be 3 things changed if they want to use their current testing method: resolution, AF and AA. Everything else should be the same across the board. IMO they are making more work for themselves than necessary. I appreciate the effort, but for me it doesn't translate into results.
Agreed, I can't even follow [H]'s reviews anymore; way too many variables, settings, changes and differences to compare.
Apples to apples is the way to go. It's much easier for the reviewer and for the reader.
 
I found another review that showed the 9800GTX in SLI beating the 280 in four of seven tests. I'm more tempted to spend on two 9800 GTXs than a 280, primarily because you're getting value for money (about $200 cheaper each), and fps-wise, would you honestly notice a difference of 2-3 fps?

No, that's the point of a lot of reviews. Sure the 280's give you an extra 3 FPS and are "top of the line" but it's not worth the price increase at all. Even SLI'd the new ones don't show a huge performance increase, and they SHOULD. :mad:

I'm going to go with the GX2's if the new ATIs perform as badly as the new Nvidias.

I'm more likely to trust anandtech, simply because their review is straightforward. The [H] one is all complicated when I just want solid hard facts. Plus there seems to be a general consensus among a lot of people that the GTX 280 is not worth it, while [H] is saying the opposite (and seems to be in the minority).
 
No, that's the point of a lot of reviews. Sure the 280's give you an extra 3 FPS and are "top of the line" but it's not worth the price increase at all. Even SLI'd the new ones don't show a huge performance increase, and they SHOULD. :mad:

I'm going to go with the GX2's if the new ATIs perform as badly as the new Nvidias.

I'm more likely to trust anandtech, simply because their review is straightforward. The [H] one is all complicated when I just want solid hard facts. Plus there seems to be a general consensus among a lot of people that the GTX 280 is not worth it, while [H] is saying the opposite (and seems to be in the minority).

It sounds like you want the reviewers to tell you what to buy. IMO, reviewers should just review the product, mention the pricing or maybe throw in a chart that shows the relation of price vs. performance, but they should leave it at that. It is then the reader's responsibility to take the information they are given and figure out what best fits their needs. Some people will be more than willing to pay a premium even if the performance gains aren't as large as others would like.

As for the [H] review being "complicated", I found it to be one of the more informative reviews, giving educated readers more information to help make their decision. I would have liked to see some SLI info, but the overall review was really good.
 
I enjoy the variety of reviews. You can take something from all of them. I agree that Anandtech seems to use canned benches a bit much, but their thorough explanations of technologies are great. They are a very knowledgeable crew. Being a GX2 owner, I loved how they compared quad SLI vs. 280 SLI. There's something to take from every review, I think. The more you read, the more you learn to pay closer attention to testing platforms and drivers. Users should be able to come to some conclusions on their own.
 
Why is the stock GPU frequency 602 MHz anyway, and not 600 MHz? Did they push the card to the very limit to discover exactly what would be a stable frequency on all production cards?

It's a multiplier artifact.
 
Anandtech reviews are generally interesting, but... while the use of canned benchmarks shows consistent results, it may not reflect real gaming performance. Like someone wrote earlier, we play games, not demos.
It is true that [H]'s methodology depends significantly on the reviewer, but this is the best we have. To increase confidence, I suggest reading other review sites such as XbitLabs.com.

On the other hand, I also find it interesting that [H] always uses BFG video cards...
 
Agreed, I can't even follow [H]'s reviews anymore; way too many variables, settings, changes and differences to compare.
Apples to apples is the way to go. It's much easier for the reviewer and for the reader.

QFT - too many variables, too many differences. I just look at the conclusion anymore.
 
I'm not looking for someone to tell me what to buy. But I'm not going to spend an extra $200 when the only benefit is that the card is new and shiny. It performs no better than the GX2, only occasionally giving out a couple extra FPS.

I've been reading every review posted on another forum (all from different sites, not just that forum's own review), and this is my overall conclusion: the GTX is a piece of junk and a flat-out lie. If you just want to sit around playing Oblivion, then get the GTX while the rest of us invest our extra $400 for SLI cards into a new monitor or something.

I'm actually LOOKING FORWARD to the ATI reviews, and I never thought I'd be saying that. I was all prepared to go and buy the 280's, but unless hell freezes over, it isn't going to happen. I don't need the (what do girls get for an e-peen? :p) points; I don't sit around all day running benchmarks. I need a card that will perform well in games, that is all.

No one has given me good reasoning to believe that the extra $200 is worth it. I'm not going to take one [H] review over 5 others.

It would be better if the [H] review set every card the same and then posted the results; all these different settings just complicate it. I feel like I'm comparing an apple to an orange. As someone else said, apples to apples is overall better, and I didn't really see that in the [H] review.
 
If I were to trust Anand, I would think that the 280 GTX was a worse card with a worse gaming experience than the 9800 GX2 for more money! Judging by their performance curve, I would look like a moron buying a 280 GTX. Then I come over to the [H] and see that the GTX 280 blows the 9800 GX2 away. Take Crysis for example. Looking at their chart, the 9800 GX2 is the clear winner.

Then you check the [H] graph, and lo and behold, the GTX 280 is clearly showing a better experience in the game.


I think Anand has lost a lot of credibility in my view lately. I don't think the 280 GTX is what it should be for a $650 card, but it's clearly the current leader in gaming performance, and Anand just fails to reach this conclusion through inferior testing methodology.

I still don't think I'll get one, and I'll at least wait for the 4800 series to come out, but if I went with Anand, I would think this was a bloody disaster!

(Sorry for the repost from Nvidia, but I think this is more relevant as a general video card post)

The anandtech charts do not state whether any AA or AF was enabled.
 