What if I told you...

Ok, so let me get this straight. The people who like time demos are basically saying they want to see two cards compared to one another using canned time demos so they can see like-for-like results in a controlled setting, correct? Thus they can compare which card is faster in this controlled setting, and this does what exactly? Tells them which card can run a time demo faster, and/or which company optimized their drivers for said demo better? Ok, so I go buy a $400-$600 video card and I base my decision on which card is better/faster/more worth my hard-earned money on one running said time demo better than the other card.

Hmmmm, ok, so then I get the card installed in my rig. I fire up some of those hot games I've been lusting to play on my new hardware, and I find out that said game isn't as fast on this hardware I purchased as the review sites with the canned time demos showed.

Now I look at a review site that showed me some real-world results, the same results I will see when I plug that hardware into my own rig, and I see pretty much WHAT I WAS SHOWN TO EXPECT, and this is a bad thing??

I know I'm playing devil's advocate here, but I hope it proves my point.

There is no reason in the world why said consumer would want/need to see old-fashioned canned demos. No reason whatsoever, when I will never get said performance as was shown in that demo. So I ask you all: why do you care?

So if I want to buy a $450 card, which card is better for me to play Bioshock or Call of Juarez @2560x1600, or @1920x1200? Will I be able to use 2xAA @2560x1600 or 4xAA @1920x1200?

I doubt that I can make this decision by just looking at [H] numbers, but by looking at numbers from Firingsquad, for example, I can make a good guess at which card would be better in which game, which is better at high resolution, and which would be better with AA.
 
If this test is gonna include a built-in timedemo then it's gonna be a useless test, as most people know it's gonna be optimized just like 3DMark and the rest. Most reputable sites make their own custom timedemos to eliminate this problem. Oh, and sometimes renaming the executable works too: instead of crysis.exe, rename it bob.exe.
 
I think one of the things this whole debate comes down to is a trust issue. With canned demos you can run the test and spit out FPS numbers, then you can compare various resolutions and AA/AF settings. With [H]'s method there's more variance in each test, which is something that is known already. I think for readers to understand [H]'s method they need to put some faith and trust in Kyle, Brent, and other [H] employees that they are out to show the readers what hardware will give them the best gaming experience possible. Since having a good experience can be very subjective, there has to be a level of trust on the reader's part, and this is why I think [H] readers are so unique. They are able to let Kyle, Brent, et al. make the decisions necessary to get a good balance of eye candy and performance while playing games.

It seems that a lot of people just want numbers to look at when making comparisons. This is fine, but I don't think it should be the end-all method of selecting hardware. [H]'s reviews do require that you have some knowledge of hardware and technology, so that you can take what [H] has demonstrated and apply it to your individual situation. [H] doesn't review every piece of hardware out there, nor do they keep graphs of canned timedemos for every piece of hardware, but I think the reader should be able to see that if one video card performs better than most while running Crysis, then that card is probably one of the better bets in terms of overall gaming experience. If it can handle a game like Crysis well, then it's safe to say that it'll handle any other game on the market today.

That's my $0.02. I'm not sure if that makes complete sense because I'm pretty tired this morning (waking up at 4am will do that), but I hope I got my point across well enough. :eek:
 
So if I want to buy a $450 card, which card is better for me to play Bioshock or Call of Juarez @2560x1600, or @1920x1200? Will I be able to use 2xAA @2560x1600 or 4xAA @1920x1200?

I doubt that I can make this decision by just looking at [H] numbers, but by looking at numbers from Firingsquad, for example, I can make a good guess at which card would be better in which game, which is better at high resolution, and which would be better with AA.

EXACTLY

[H] provides you with a small subset of results compared to other review sites, which is fantastic if you happen to fall inside that small subset, because then you're very well informed indeed. However, a lot of people simply do not fall within that subset.

I would rather have a whole range of results benched at a range of resolutions with a range of settings, even if those settings give us back horrifically extreme results like very low or very high frame rates, because some of us don't care about frame rate and are perfectly happy with 25fps, and some of us care very much and won't use less than a 60 minimum (100 average, usually).

You can look through the graphs and pick out the numbers which are suitable for you, and that gives you a range of settings you could comfortably use. So if you're an AA nut you can see how it works with AA on, or maybe you hate AA and want to use a higher resolution with no AA, in which case you get that as well.

I understand what he's getting at: custom timedemos are better than built-in timedemos, and we ALL know that, because we know built-in timedemos are usually optimised for by the video card vendors.

I think you're going to have a VERY hard time demonstrating this disparity with any kind of realistic metric, though; many different people could play the same game through and get hugely different frame rates simply depending on how they play.

Take a game like Crysis: the vertical angle of your view probably has a massive impact on your frame rate. If you keep it low and look more towards the ground, your frame rate is going to be significantly higher than if it's angled straight ahead.

I bet I could produce the same disparity in frame rate, 15-20%, by simply slightly altering how I play the game: by looking on average 10 degrees further down than usual, or through a great number of other factors. Do you look directly at explosions, for example, which greatly dip frame rate? Do you happen to cause a lot of interactions between physics objects?

In Crysis I went around punching people through walls, collapsing houses onto people's heads, blowing them up with explosive barrels and grenades, and ramming cars through huts and then blowing them up in the middle, sending pieces of building everywhere.

If I was simply shooting my way through with a machine gun, then none of those effects would be happening.

So I say, pshaw, [H]: show us this disparity and I will show the same disparity back within the same level, simply by using different play styles.

Somewhere along the line I think you lost track of the link between you and your users. It's all about how YOU play the game through, as if that's somehow magically the best way, and what frame rates YOU expect to get, and what settings YOU prefer.
 
So explain to me how your benchmarks show me which card is going to serve me best when I want an average of 100FPS in my games, since I play online in a competitive environment and having a high frame rate is important. What sort of settings will a card let me use at that frame rate?

Since the purpose of [H] reviews is not to tell you which card is going to serve you best when you want a 100 fps average in your games, it is fair to say that [H] reviews won't inform you about such settings. Such is the meaning of best playable settings. Capisce?
 
Since the purpose of [H] reviews is not to tell you which card is going to serve you best when you want a 100 fps average in your games, it is fair to say that [H] reviews won't inform you about such settings. Such is the meaning of best playable settings. Capisce?

THEIR playable settings.

Not mine. What I accept as playable is greatly different from theirs. What I expect from the review is what they're trying to deliver; they simply cannot with their current method.

"Playable" and "best settings" are totally subjective, capisce?
 
THEIR playable settings.

Not mine. What I accept as playable is greatly different from theirs. What I expect from the review is what they're trying to deliver; they simply cannot with their current method.

"Playable" and "best settings" are totally subjective, capisce?


So other sites do test at what you consider playable settings, but their results are done with timedemos and in no way relate to real-world gameplay, and that is what you want?


I am good with that. Enjoy their sites, because you have plenty to choose from! :D
 
You people are making something simple into something complex. This is pretty straightforward: either a video card plays games well at the settings you prefer or it doesn't. The optimal settings that [H] reviews show should give anybody an accurate indicator of whether or not you can play the games you want at the settings you want.

I don't know how in the world you people think that timedemos and synthetic benchmarks relate to whether or not a game that you are actually going to play is going to play well; I thought we already learned this lesson from 3DMark. As far as timedemos go, they are rarely an accurate comparison to actually playing the game, because timedemos are stripped of a lot of the factors that determine performance, and you have no user input or control besides setting the video config it is going to run at.
 
So if I want to buy a $450 card, which card is better for me to play Bioshock or Call of Juarez @2560x1600, or @1920x1200? Will I be able to use 2xAA @2560x1600 or 4xAA @1920x1200?

I doubt that I can make this decision by just looking at [H] numbers, but by looking at numbers from Firingsquad, for example, I can make a good guess at which card would be better in which game, which is better at high resolution, and which would be better with AA.

Ok, so since [H] doesn't tell you how much AA and the maximum screen resolution you can play at with a given card, then why are there those lovely charts on every review that tell what the highest amount of AA/AF could be used at a given resolution?:rolleyes:

And if you go back and read some of the other reviews, then you would also know that Kyle provided a COMPLETE summary of "highest possible settings" for the card(s) being tested.

Link: http://enthusiast.hardocp.com/article.html?art=MTQ1NCw1LCxoZW50aHVzaWFzdA==
 
There's nothing wrong with a company optimizing their drivers for a timedemo. If it improves the software, we all benefit in the next driver update. That's applicable to our real world game experience.

That doesn't mean timedemos are a perfect measure. However, they do provide a useful way to compare one card's performance (in a relative way) to another's, i.e., card X is faster than card Y by Z%.

That helps me understand whether an 8800GT spanks a 3870, or is just slightly faster. It also helps differentiate how much faster a 3870 is than a 3850. That seems helpful for weighing price vs. performance.
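The relative comparison described above is simple arithmetic. As a minimal sketch (the fps figures below are hypothetical placeholders, not measured results from any review):

```python
# Sketch of the "card X is faster than card Y by Z%" comparison.
# The fps figures used here are made-up placeholders, not benchmark data.

def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, as a percentage of B's fps."""
    return round(100 * (fps_a - fps_b) / fps_b, 1)

# Hypothetical timedemo averages: 60 fps vs 48 fps.
print(percent_faster(60, 48))  # 25.0
```

A negative result simply means card A is slower; the same number read the other way.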

Real world testing helps me ensure that the card I choose will give me the actual experience I want. If real world testing shows that a 3850 will play a certain set of games just fine, then it might go back on my list. Based on timedemo results alone I may have taken it off my list.

Seems like both sets of data have their place, but maybe that's just how I think about things.
 
1. There's nothing wrong with a company optimizing their drivers for a timedemo. If it improves the software, we all benefit in the next driver update. That's applicable to our real world game experience.

2. Real world testing helps me ensure that the card I choose will give me the actual experience I want. If real world testing shows that a 3850 will play a certain set of games just fine, then it might go back on my list. Based on timedemo results alone I may have taken it off my list.

1. Actually, I would suggest that companies optimizing for timedemos and optimizing for game performance may not always be the same thing.

2. This is exactly what our goal is with our video card evaluations.
 
I don't really see how other sites can claim timedemos are in any way "better." It's only done that way because it's much easier. And it's absolutely absurd when they run cut scene timedemos.
 
I see where HardOCP's review style comes into play, and I enjoy it. I prefer it to others', as I know what to expect and how high I can crank the settings and still play the game, which is why I buy the game and hardware in the first place.

Although, in games, a structured timedemo has its place, too. Almost like reviewing a car: you want your test to be uniform, with the only change to the test being the two pieces of hardware being tested. You wouldn't test a Ferrari on a dirt road against a Corvette on a chunk of pavement. It wouldn't be a fair test. You want the test to be equal for all contestants... (Also, think of an obstacle course: it has to be the same for all people for the results to mean anything.)

So, it does have its place in a controlled environment. But it is just that: controlled. A Ferrari isn't going to be running 1/4 miles all day. A runner isn't doing obstacles all day. I want to know how I can play the game, and if the conclusion is yes, I can, then that's my result. A timedemo can show 25% more and say the game is playable, and compared to another card that's not, that means something. But if I can't play the rest of the game, it doesn't really mean much.

Contradicting myself? No. I just see the benefits of both styles, but I prefer HardOCP's method, as it allows me to decide based on what I do with my computer: play games. :)
 
Ok, so since [H] doesn't tell you how much AA and the maximum screen resolution you can play at with a given card, then why are there those lovely charts on every review that tell what the highest amount of AA/AF could be used at a given resolution?:rolleyes:

And if you go back and read some of the other reviews, then you would also know that Kyle provided a COMPLETE summary of "highest possible settings" for the card(s) being tested.

Link: http://enthusiast.hardocp.com/article.html?art=MTQ1NCw1LCxoZW50aHVzaWFzdA==

No, other sites won't tell me the info directly or accurately either, but I can make a better estimate of the performance by looking at other reviews than by looking at the [H] review. For example, the performance in BioShock here:

@1600x1200 I think that I could enable 4x/8x AA with the 8800GTX/Ultra if the game allows it, but with the X2, I think that 2x AA would probably be the limit.

@1920x1200 I think that I can still enable 2x AA with the 8800GTX/Ultra if the game allows it, but with the X2, I don't think that AA is possible.

@2560x1600 I think that I can still play the game with the X2 without AA and without lowering any graphics setting, or maybe just lowering one setting. On either the 8800GTX or 8800Ultra, I think I would need to turn down at least one setting to play the game.

This is from looking at numbers from one site; with different sites, I think that I can make a better estimate. If I just look at [H] numbers, I don't even know how the card would perform in BioShock at different AA settings and resolutions.
 
No, other sites won't tell me the info directly or accurately either, but I can make a better estimate of the performance by looking at other reviews than by looking at the [H] review. For example, the performance in BioShock here:

@1600x1200 I think that I could enable 4x/8x AA with the 8800GTX/Ultra if the game allows it, but with the X2, I think that 2x AA would probably be the limit.

@1920x1200 I think that I can still enable 2x AA with the 8800GTX/Ultra if the game allows it, but with the X2, I don't think that AA is possible.

@2560x1600 I think that I can still play the game with the X2 without AA and without lowering any graphics setting, or maybe just lowering one setting. On either the 8800GTX or 8800Ultra, I think I would need to turn down at least one setting to play the game.

This is from looking at numbers from one site; with different sites, I think that I can make a better estimate. If I just look at [H] numbers, I don't even know how the card would perform in BioShock at different AA settings and resolutions.

I think you are missing the point here, buddy. It is my personal experience that some demos are tweaked for certain vendors (i.e., Source Engine timedemos do better with ATi cards vs. nVidia) even though in real-world gameplay that may not be so. Also, you have to look at it in the sense that single-player mode vs. online in a game are gonna put different kinds of stress on a PC and all of the key components (video card(s), RAM, and CPU). Even on my system, when I am playing single-player mode in any given game, when I go to play online sometimes I may not be able to run the insane levels of AA/AF that I do in SP.

The same theory applies when comparing timedemos and "real world" gameplay: the timedemo is like single player, and the "real world" is like multiplayer. So let's say we were comparing video card "A" to video card "B" in a timedemo. Said timedemo benches card A at 80 FPS and card B at 90 FPS. Then, put into real-world gameplay where nothing is PRE-RENDERED, say a 64-player BF2142 (or BF2) server, card B gets 60 FPS and card A is getting 70 FPS, even though card B benched higher in the timedemo.

:cool:

Allow me to refer you back to this post, I don't feel like typing all of that again.
 
No, other sites won't tell me the info directly or accurately either, but I can make a better estimate of the performance by looking at other reviews than by looking at the [H] review. For example, the performance in BioShock here:

This is from looking at numbers from one site; with different sites, I think that I can make a better estimate. If I just look at [H] numbers, I don't even know how the card would perform in BioShock at different AA settings and resolutions.

What does that tell you exactly? Does it even show how they got those numbers in that game???? I can't tell what that is representative of at all.

And just to note, we stopped using BioShock on high-end cards because they simply all do a beautiful job of handling the game. I think you can buy anything from a 3870 up and have a great BioShock experience; therefore we don't find it important anymore.
 
Time demos are fun for checking out quick overclocking results
 
1. I would definitely defer to your experience on this point. Kinda sad, but I guess it's why you guys work so hard on real world testing.

2. No doubt you've succeeded. Glad I'm interpreting the results as intended.

BTW, I only discovered [H] a few months ago and love the site. Great discussions, and [H] was really helpful when I built my new rig. Wish I had more time to participate.

1. I think ATI's last driver was a good example of this. AMD told us that the new driver gave a 60% fps increase in the Crysis GPU benchmark, but in the game we saw 1 to 2 frames better.

2. Thanks.

Thanks for the support, and glad we could help you out. The community here is pretty freaking awesome as well; a lot of smart cookies around here that I learn from all the time.
 
Time demos are fun for checking out quick overclocking results

Absolutely no argument from me there! Still, it depends on what you are overclocking and whether you run high-res or low-res demos.
 
Allow me to refer you back to this post, I don't feel like typing all of that again.

I think that you missed my point. I said that the actual numbers aren't important; what is important for me is the ability to see how the cards react to a certain game and to changes in resolution, graphics settings, and AA settings.

If card A takes a bigger hit than card B when the resolution is increased in a timedemo, I doubt that card A would take a smaller hit in the REAL WORLD at higher resolution.

I already gave you the example from the [H] review before: the X2 performs @2560x1600 without AA on par with the 8800GTX because the X2 takes a smaller hit at higher resolution compared to the 8800GTX.

@1920x1200, the 8800GTX can run with 8x AA and match the speed of the X2 with 4x AA, because the 8800GTX takes a smaller hit with AA compared to the X2.

What [H] numbers can't show me is the trend of how the cards react to games, resolution, and AA, because [H] only uses a very limited number of games and settings. Timedemo or REAL WORLD, optimized or not, the reaction trend to those changes would still be the same.
 
What if ATI sent Brent a new card, but there was a booger on the cooler, so Brent retaliated by throwing constant grenades in a wide open environment in a game with that card, while with the competing Nvidia card he stared at a wall indoors for the length of testing

what then, huh?!

WHAT THEN



OMGorsh how would we know what Brent had done.

Kyle put a webcam on Brent 24/7
 
I can't wait to read this article and witness the aftermath of it in the community. It amazes me, after all the explanations from Kyle and Brent over the years, that still, still, many are hard-headed and blind to the truth of it all.

I have also never seen Kyle draw so deep a line in the sand over an issue such as this, and I have followed his comments on this and other 'gaming' issues closely since the site's inception.

Interesting times ahead, to be sure. :D
 
I can't wait to read this article and witness the aftermath of it in the community. It amazes me, after all the explanations from Kyle and Brent over the years, that still, still, many are hard-headed and blind to the truth of it all.

I have also never seen Kyle draw so deep a line in the sand over an issue such as this, and I have followed his comments on this and other 'gaming' issues closely since the site's inception.

Interesting times ahead, to be sure. :D

For sure, and I do appreciate the work guys. You have helped me and I know many others over the years on hardware purchases.
 
I think that you missed my point. I said that the actual numbers aren't important; what is important for me is the ability to see how the cards react to a certain game and to changes in resolution, graphics settings, and AA settings.

If card A takes a bigger hit than card B when the resolution is increased in a timedemo, I doubt that card A would take a smaller hit in the REAL WORLD at higher resolution.

I already gave you the example from the [H] review before: the X2 performs @2560x1600 without AA on par with the 8800GTX because the X2 takes a smaller hit at higher resolution compared to the 8800GTX.

@1920x1200, the 8800GTX can run with 8x AA and match the speed of the X2 with 4x AA, because the 8800GTX takes a smaller hit with AA compared to the X2.

What [H] numbers can't show me is the trend of how the cards react to games, resolution, and AA, because [H] only uses a very limited number of games and settings. Timedemo or REAL WORLD, optimized or not, the reaction trend to those changes would still be the same.

Well, obviously [H] is getting you those numbers; otherwise how would you know which card takes less of a hit at what resolution? So the fact that you can draw a conclusion between two cards says to me that you are getting plenty of info. And as far as "trends" go, I think you personally are overthinking this. The numbers speak for themselves, and like that long post I wrote, timedemos will show different gameplay than when you actually sit down and play the game, where NOTHING is PRE-RENDERED. Therefore, the real-world gameplay benches [H] does are closer to the performance levels you will see when you buy said video card.
 
Well, obviously [H] is getting you those numbers; otherwise how would you know which card takes less of a hit at what resolution? So the fact that you can draw a conclusion between two cards says to me that you are getting plenty of info. And as far as "trends" go, I think you personally are overthinking this. The numbers speak for themselves, and like that long post I wrote, timedemos will show different gameplay than when you actually sit down and play the game, where NOTHING is PRE-RENDERED. Therefore, the real-world gameplay benches [H] does are closer to the performance levels you will see when you buy said video card.

Firstly, I would like you to know that I don't think that the result from [H]'s latest review is wrong; for me it is consistent with the trend that I see all over the net.

What I disagree with is the way the result is presented. It is presented in such a way that it won't help me to see the trends of the cards, due to the limited number of games, resolutions, and AA settings.

With more numbers, even if they are just from a canned benchmark, I can see the trend quite clearly. I like Firingsquad because of the way they align the graphs one after another. That way I don't need to do any calculation to see the trend of the changes, but here, let me show you how I can see the trend with some calculation.

This is how I know which card would take a bigger performance hit at higher resolutions:

F.E.A.R.       HD3870 X2    8800Ultra
1600x1200      100%         100%
1920x1200      85%          80%
2560x1600      50%          47%

CoH            HD3870 X2    8800Ultra
1600x1200      100%         100%
1920x1200      82%          76%
2560x1600      54%          46%

Oblivion       HD3870 X2    8800Ultra
1600x1200      100%         100%
1920x1200      87%          88%
2560x1600      59%          62%

HL2 E2         HD3870 X2    8800Ultra
1600x1200      100%         100%
1920x1200      87%          86%
2560x1600      55%          54%

Lost Planet    HD3870 X2    8800Ultra
1600x1200      100%         100%
1920x1200      86%          86%
2560x1600      53%          51%

CoD4           HD3870 X2    8800Ultra
1600x1200      100%         100%
1920x1200      89%          87%
2560x1600      60%          59%

Crysis         HD3870 X2    8800Ultra
1600x1200      100%         100%
1920x1200      91%          87%
2560x1600      50%          47%

Bioshock       HD3870 X2    8800Ultra
1600x1200      100%         100%
1920x1200      89%          85%
2560x1600      70%          54%

This is from one website; with multiple websites, I can see the trend better, and I can also see which brand is generally better with which game and which card is better with higher AA. I can't see this from a [H] review.
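The percentages in those tables come from a simple calculation: normalize each card's fps at every resolution against its own 1600x1200 result. A minimal sketch of that arithmetic, using made-up fps numbers rather than any site's actual data:

```python
# Normalize each resolution's fps to the 1600x1200 baseline to see how big
# a hit a card takes as resolution rises. The fps values are hypothetical.

def retention(fps_by_res: dict, baseline: str = "1600x1200") -> dict:
    """Percent of baseline fps retained at each resolution."""
    base = fps_by_res[baseline]
    return {res: round(100 * fps / base) for res, fps in fps_by_res.items()}

# Hypothetical card: 80 fps at 1600x1200, 68 at 1920x1200, 40 at 2560x1600.
print(retention({"1600x1200": 80, "1920x1200": 68, "2560x1600": 40}))
# {'1600x1200': 100, '1920x1200': 85, '2560x1600': 50}
```

Comparing two cards' retention dictionaries side by side gives exactly the kind of scaling trend the tables above show.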

Now let me explain why [H]'s latest review looks quite different from other websites' but is actually consistent with the trend shown by other websites:

Crysis: The trend is that this game runs better on nVidia cards. The result: with the same settings, the 8800Ultra would probably get a higher fps, or you can use a higher setting and get the same fps.

COD4: The trend is that both cards are almost equal, or the HD3870 X2 is just slightly faster. What [H] did was play the game @1920x1200, and because the trend is that the 8800Ultra takes a smaller performance hit with AA, it could run 4x AA while the X2 could only run 2x. If they played the game @2560x1600 without AA, the difference would be closer, and the X2 would probably be slightly faster.

HL2 E2: This game favours the X2; however, with AA, the difference is not that big.
 
This is done in any in-game cinematic (or should be); that's also why you have "cinematic" characters for Gears of War and regular characters. Same with UT3, but Epic took a dump on PC players and made all the in-game cinematics pre-rendered, which just looks hideous.
 
Coming from a game programmer...


Have you made sure the disparity between gameplay and timedemos isn't purely a result of less CPU overhead when playing back a demo? I mean, things like AI, pathfinding, collisions, and physics all factor into the framerate and don't necessarily have to be performed when playing back a recorded demo; the results of all those calculations could be stored instead.


I've never played the game, so I can't comment specifically on how they've implemented their demo recording, but I recall back in the days of Quake and Unreal Tournament (and it still makes sense nowadays) doing all the hard calculations on the server and just sending the (compact) updated player positions over the network to the clients.

In single-player mode, the local machine has to do all the hard calculations, and the network packets can be recorded into the demo file for easy playback, since demo playback is essentially a playback of the network stream anyway. This is how a good, clean implementation would look, and if Crysis is a modern game, I would assume this is how they do it.
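That record-the-network-stream idea can be sketched in miniature. Everything below is illustrative only, not how any particular engine actually implements demos:

```python
# Toy model of demo recording as a timestamped packet log: the expensive work
# (AI, physics, collisions) happens once while recording; playback just
# re-reads the stored packets in order. Names here are purely illustrative.

def record(demo: list, tick: int, packet: dict) -> None:
    """Append one update packet (e.g. player positions) for a given tick."""
    demo.append((tick, packet))

def replay(demo: list):
    """Yield recorded packets in tick order; no simulation is re-run."""
    for tick, packet in sorted(demo, key=lambda entry: entry[0]):
        yield tick, packet

demo = []
record(demo, 2, {"player": (1.5, 0.0)})
record(demo, 1, {"player": (0.0, 0.0)})
print([tick for tick, _ in replay(demo)])  # [1, 2]
```

In this model, playback skips the AI/physics cost entirely, which is exactly the CPU-overhead disparity the post is asking about.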



Before you make any serious accusations about "cheating" you should definitely consider what I've just said.
 
Coming from a game programmer...


Have you made sure the disparity between gameplay and timedemos isn't purely a result of less CPU overhead when playing back a demo? I mean, things like AI, pathfinding, collisions, and physics all factor into the framerate and don't necessarily have to be performed when playing back a recorded demo; the results of all those calculations could be stored instead.


I've never played the game, so I can't comment specifically on how they've implemented their demo recording, but I recall back in the days of Quake and Unreal Tournament (and it still makes sense nowadays) doing all the hard calculations on the server and just sending the (compact) updated player positions over the network to the clients.

In single-player mode, the local machine has to do all the hard calculations, and the network packets can be recorded into the demo file for easy playback, since demo playback is essentially a playback of the network stream anyway. This is how a good, clean implementation would look, and if Crysis is a modern game, I would assume this is how they do it.



Before you make any serious accusations about "cheating" you should definitely consider what I've just said.

Been fully considered. Sort of an old issue not generally on point in today's gaming world, or at least not with the titles we focus on.
 
I can't wait to read this article and witness the aftermath of it in the community. It amazes me, after all the explanations from Kyle and Brent over the years, that still, still, many are hard-headed and blind to the truth of it all.

I have also never seen Kyle draw so deep a line in the sand over an issue such as this, and I have followed his comments on this and other 'gaming' issues closely since the site's inception.

Interesting times ahead, to be sure. :D

Unfortunately, I already know the outcome of this test for the most part, and it's gonna be exactly as I expect. Using a built-in timedemo hasn't been done on most sites in years. The good ones use custom timedemos.

I already know both Nvidia and ATI optimize for benchmarks; I don't need to be told that, and that's what this is gonna prove... again. So this won't prove anything that gamers don't already know. It won't disprove the value of timedemos.

Use a game people play and make a custom timedemo at the same settings. Examples: Call of Duty 4, BioShock, ET: Quake Wars, Company of Heroes, Call of Juarez, STALKER, even Crysis as long as it's not the built-in benchmark.
 
Firstly, I would like you to know that I don't think that the result from [H]'s latest review is wrong; for me it is consistent with the trend that I see all over the net.

What I disagree with is the way the result is presented. It is presented in such a way that it won't help me to see the trends of the cards, due to the limited number of games, resolutions, and AA settings.

So then your main complaint is that:

A - There are not enough games being tested (or maybe just not the ones you play).
B - You don't like the way the graphs look.


A - Many sites, including [H], test only the most taxing/popular games; therefore, say a video card comes along and starts getting 70 FPS in Crysis, it is implied that it will perform well in Bioshock, UT3, BF, CoD4, etc.
B - I doubt sites are gonna change the way their graphs look for one person.
 
No, my main concern is that the information given is very limited; even though it is not wrong, it is not very helpful. Like I said before, I can't see the trend by looking at the [H] review alone.
 
Well, my question is: how are you comparing all of these different reviews? Most sites are very different, and most of them don't do deep enough testing, so to speak. You keep talking about this "trend"; what "trend" are you referring to here? You need to be a little more clear on that.
 
It's obvious that he is saying trend in terms of performance.
 
You keep talking about this "trend"; what "trend" are you referring to here?

Judging from his posts ad nauseam, it would seem that he looks at how well cards A, B, and C do, then draws his conclusion from there. So if card A scores low across multiple review sites, he can "safely" assume it will perform poorly for him; if card B scores high, then he can justify his purchase; and if card C performs all over the place, then he goes to the sites that have given him more accurate info in the past that fits his playstyle.

Or he just goes to some forums and bitches incessantly about the testing methods.

This whole thing is blown way out of proportion. It is overshadowing the card and taking away from a great review that shows a very good card for the price performing at higher levels than the company has shown for a great while.

The reviews and testing methodology here at [H] may still apparently leave some scratching their heads, but the results have always provided me, at least, with an accurate portrayal of what I'll get on either my system or one I'm building. I wasn't familiar with DH, but TH, AT, and HHH have generally been either flat out wrong or so one-sided over the years that I just gradually stopped visiting them.
 
Well, my question is: how are you comparing all of these different reviews? Most sites are very different, and most of them don't do deep enough testing, so to speak. You keep talking about this "trend"; what "trend" are you referring to here? You need to be a little more clear on that.

What he means to say is that he would like to see scaling at several different points, say AA at 1600, 1920, and 2560 resolutions. It is just that every time you add a "little tidbit," it usually increases our workload by 2X or 3X. Consider that, and the fact that we usually have less than 7 days to put these together, and you can see where it gets resource-intensive for us.
 
Thanks for understanding what I'm trying to say. I know that it is harder to do, so I think it would be much better if you just skipped the graph and did a couple more of these tables with different resolutions and AA settings instead.

I think that a few numbers for min, avg, and max FPS are good enough to show how the GPUs scale with resolution and AA. With a few more numbers, the picture would be much clearer.
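The kind of summary table being asked for is cheap to produce once you have per-frame render times; a rough sketch of the reduction (the frame-time numbers and setting labels here are invented for illustration):

```python
def fps_summary(frame_times_ms):
    """Reduce a list of per-frame render times (ms) to (min, avg, max) FPS."""
    fps = [1000.0 / t for t in frame_times_ms]
    return min(fps), sum(fps) / len(fps), max(fps)

# Hypothetical per-frame timings from runs at different settings.
runs = {
    "1600x1200 4xAA": [14.2, 15.1, 13.8, 16.0, 14.9],
    "1920x1200 4xAA": [17.5, 18.3, 16.9, 19.2, 18.0],
    "2560x1600 2xAA": [24.1, 26.7, 23.5, 28.2, 25.4],
}

# Print one compact table row per resolution/AA combination.
print(f"{'Setting':<16} {'Min':>6} {'Avg':>6} {'Max':>6}")
for setting, times in runs.items():
    lo, avg, hi = fps_summary(times)
    print(f"{setting:<16} {lo:6.1f} {avg:6.1f} {hi:6.1f}")
```

A table of rows like these across resolutions and AA levels is exactly the scaling view being requested.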

As I understand it, the table and the graph are from the same run - getting rid of the graph wouldn't save any time, just space (and as this is on the web, space isn't really at a premium ;) )
 
Well, the graph is needed to prove their consistency, and you don't need to be spot-on consistent just to produce the table alone ;) That would save quite some time actually, and people don't actually play games that consistently :D
 