Not that my opinion matters, but I'd say that while FPS lies, it's only because it's a very coarse way of recording what you see. Your eyes take in a lot more than one sample per second, and a lot more than the 30/60 FPS some people claim the eye is limited to. Vision just doesn't work in terms of "FPS"; it's analog, not digital. So you need a much finer grain of measurement to get any statistical data worth noting. A lot happens in that second: if 80% of it runs at 60 FPS you may never see the number change, but you will CERTAINLY notice the 20% that doesn't, especially if it's off by a large margin, because your eye is PARTICULARLY sensitive to major changes in fluidity at ANY FPS.

As such, those FCAT tests are borderline invaluable right now, but they are not the only invaluable information. How did the drivers make your system feel and behave? Did they break things you regularly use? Did features work seamlessly? How is the hardware BUILT, how loud are the fans, and so on: all of these must also be represented, and they are an important part of a product as well. And quite a lot of these things can only be checked "by hand" by the reviewer you are trusting. As always it's their opinion, but it's an educated and well-thought-out opinion, you would hope, so you can weigh their words against your own point of view.

I guess my bottom line is that FCAT, or whatever the finest degree of measurement is, can show you -EXACTLY- what the performance is, but there is more to GPUs these days than just performance. All the speed in the world doesn't matter if it crashes half your programs, none of the features work, your machine takes ten minutes to load, and it's loud as fuck. All of these things are important too. Guess that's all I've got to say, and for the record, professional FPS player here: high FPS is mandatory for me, but my machine can't sound like a dust blower or I can't hear footsteps with my open-ear headphones, and crashing mid-match is not acceptable to me or my teammates; it just has to work. I rely on full-body reviews for that, and sadly I pretty much can't get exactly what -I- look for in any one place. But that's just my perspective.
 
I'd be perfectly fine with the removal of framerate data. To be honest I have been ignoring it for some time now, going directly to the impressions. With G-Sync it really doesn't matter what the exact framerate is from second to second. All I want to know is if the game looks good, runs smoothly and plays well using a given set of configurations.

Extra thought
I can tell you one thing, though: people wouldn't take this change if it happened at [H] nearly as well as they did when the focus moved away from apples-to-apples benchmarks ;).
Well, these app-style games from the Windows Store do not support G-Sync.
 
Looks like you just need adaptive sync/g-sync for DX12 gaming. I wouldn't call that a horrible compromise for quality, but there is only so much of that hardware around. Just need to pay a lot more attention to the minimum framerate when benchmarking.
 
There are, at the time of this writing, 133 games on Steam that are locked to 30FPS or less, and they are curated by The Framerate Police. Some of them can be unlocked to 60FPS or higher and the curator makes notes for those; some of those break badly if you do unlock them. You may be surprised by some of the familiar names on the list.
 
Looks like you just need adaptive sync/g-sync for DX12 gaming. I wouldn't call that a horrible compromise for quality, but there is only so much of that hardware around. Just need to pay a lot more attention to the minimum framerate when benchmarking.
AGAIN, these app-style games from the Windows Store do NOT support things like G-Sync.
 
I think the biggest issue with a purely subjective review is that people are biased by nature. The more knowledge we have, the more predetermined our expectations are even before we analyze anything. I think we need to verify our impressions with empirical data. I do think, though, that it would be a good idea to do the subjective analysis before measuring everything, for the same reasons: if you were to measure that one card was faster, you would naturally expect it to play better. It would be like dismantling a PSU before the review and finding it used crappy Teapo caps. You might decide to push it harder, just to see if they fail, and then if they do, proclaim that the PSU sucks, when you very well might be able to get Chemicon caps to fail under the same conditions but wouldn't try, simply because you trust them. (Just an example that might not be a good one, but hopefully it makes my point.)
 
I point out framerate, for the purposes of determining playability, is a useless statistic. Case in point, you would think 60 FPS is solid, but if you render all 60 frames in a single 16ms timeslice, you have the net effect of 1 frame being drawn to the screen per second. Frame latency is much more informative, as it gives an indication of how many unique frames actually get rendered by the display, which is what most people care about, whether they know it or not. The statistics we should care about are:

% frames rendered > 16ms (number of times you get frame drops, as a percentage of all frames drawn)
% frames rendered > 33ms (number of times you get >2 frames skipped at once, as a percentage of all frames drawn)

That's it. FPS, outside of comparing the maximum throughput of two GPUs, is a useless statistic, and has been for some time.
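
A minimal sketch of how those two percentages could be pulled out of a frametime capture, assuming a plain text file with one frametime in milliseconds per line (such as an FCAT or PresentMon-style export; the file name here is hypothetical):

def frame_drop_stats(frametimes_ms):
    """Return (% of frames over 16.7 ms, % of frames over 33.3 ms)."""
    total = len(frametimes_ms)
    if total == 0:
        return 0.0, 0.0
    over_16 = sum(1 for t in frametimes_ms if t > 1000.0 / 60)  # missed one 60 Hz refresh
    over_33 = sum(1 for t in frametimes_ms if t > 2000.0 / 60)  # missed two or more refreshes
    return 100.0 * over_16 / total, 100.0 * over_33 / total

if __name__ == "__main__":
    with open("frametimes.txt") as f:                # hypothetical capture file
        frametimes = [float(line) for line in f if line.strip()]
    pct16, pct33 = frame_drop_stats(frametimes)
    print(f"frames > 16.7 ms: {pct16:.2f}%   frames > 33.3 ms: {pct33:.2f}%")

Both numbers come from the same capture as an FPS average; the only extra decision is where to set the thresholds.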
 
I like what you guys do now: highest playable settings, show min, max, average FPS, and the time-series graph. I also like when you show screenshots between the cards so we can compare visual quality. If you took out the frame rates, I would still come back. HardOCP's value to me is the highest playable settings and their impressions of gameplay. I can get frame rates at 100 other sites. When buying a card, I really want to know which card will make the game look the best and if there is anything weird happening when playing the game (crashes, stuttering, tearing, frequent large drops in frame rate). You could tell me about the latter without showing frame rates, through your experience.
 
I point out framerate, for the purposes of determining playability, is a useless statistic. Case in point, you would think 60 FPS is solid, but if you render all 60 frames in a single 16ms timeslice, you have the net effect of 1 frame being drawn to the screen per second. Frame latency is much more informative, as it gives an indication of how many unique frames actually get rendered by the display, which is what most people care about, whether they know it or not. The statistics we should care about are:

% frames rendered > 16ms (number of times you get frame drops, as a percentage of all frames drawn)
% frames rendered > 33ms (number of times you get >2 frames skipped at once, as a percentage of all frames drawn)

That's it. FPS, outside of comparing the maximum throughput of two GPUs, is a useless statistic, and has been for some time.

^This. This is right on the money. CPU data would be worth noting too.
 
I think the biggest issue with a purely subjective review is that people are biased by nature. The more knowledge we have, the more predetermined our expectations are even before we analyze anything. I think we need to verify our impressions with empirical data. I do think, though, that it would be a good idea to do the subjective analysis before measuring everything.

While I agree that the data helps to back up the subjective part of a review, if I trust the reviewer then the numbers become less important. I understand that reviews are subjective. However, after coming here for years (a lot longer than my join date suggests), I have developed a level of trust with what the authors of reviews here are telling me. It is like my brother telling me that Game A sucks, or Game B has technical issues, such as crashing or stuttering. I do not need any kind of numbers from my brother. If he is telling me Game A sucks, then I will probably avoid that game, as chances are that I will find it sucks too. I cannot recall a single time when a reviewer here has steered me wrong. That track record means that I am going to trust that the reviews here are accurate, even if they are completely subjective with no data backing them up. If they started to steer me wrong, then it would break that trust and they would lose me as a long-time reader.
 
Well, these app-style games from the Windows Store do not support G-Sync.
While the quote Kyle used to kick off this discussion was directly related to the issues with games built on UWA, my reply was meant to give a bigger picture of why I come here to read reviews and how I read them. Not all games in the future will be released as UWA, with Steam still being the top dog in digital distribution on PC for the foreseeable future. Not everyone has moved to Windows 10. And once again, speaking personally, I refuse to buy anything from the Microsoft Store in its current state.
I point out framerate, for the purposes of determining playability, is a useless statistic. Case in point, you would think 60 FPS is solid, but if you render all 60 frames in a single 16ms timeslice, you have the net effect of 1 frame being drawn to the screen per second. Frame latency is much more informative, as it gives an indication of how many unique frames actually get rendered by the display, which is what most people care about, whether they know it or not. The statistics we should care about are:

% frames rendered > 16ms (number of times you get frame drops, as a percentage of all frames drawn)
% frames rendered > 33ms (number of times you get >2 frames skipped at once, as a percentage of all frames drawn)

That's it. FPS, outside of comparing the maximum throughput of two GPUs, is a useless statistic, and has been for some time.
If you have to include objective data in your reviews to maintain trust with outside readers, then I agree that this would be the best way to do it. That way you can quantify what you feel through the gameplay experience.
 
Framerate data is very important to me, especially the "apples to apples" portion. The reason I say this is because my personal preferences do not always reflect what the "max playable settings" constitute. For example, AA is one of the last things I would turn on in any game if I'm having framerate issues. But for some, it's one of the first things they enable, and it seems to be very important to [H] as well. Just different strokes for different folks.
 
I like numbers. I like having something to compare to. This is the only reason I like to see canned benchmarks like 3DMark or the Unigine stuff: it's an easily reproducible control for others to compare hardware with. I don't see them in relation to gameplay by any means, and anyone who does is misinformed.
The other thing I like about numbers in a review is troubleshooting. If I have my game settings at X, and I feel that it should be running smoother, then the review with numbers provides a quick source to compare and determine if I indeed have a problem, or if I need to turn some settings down.
 
I think PCPer Ryan went into headless chicken mode and started waving his arms around like crazy.
This is one game where it is not even clear whether it is in alpha or beta. That this game can use a different display method for Nvidia or AMD is pretty normal, since it is written on a new API. The whole point is not to go into headless chicken mode but for Oxide to see what the best approach to this problem is. Oxide is the company that can ask Nvidia, AMD, and MS to sit down and figure it out; it should not take them that long to decide which screen mode to use and reach agreement on it.

Unlike what MS wants, I can see the default overlay function as intrusive. God forbid some genius at Microsoft decides that the overlay is to be used for advertising or something else, and we'll all wonder why the hell we needed the overlay in the first place. Very bad idea ....
 
While I enjoy seeing the playable experiences in reviews, usually with backing statements about how the gameplay was, etc., I really do like to see the apples-to-apples comparison, along with game data on framerates, etc.

This kind of data allows me to help evaluate the card for my specific gaming needs, as a review cannot possibly cover every combination that people use to game on. I also usually look at several reviews, different games, a variety of card versions before I settle on what I am going to spend my money on. Mainly because of money I end up hanging onto hardware a lot longer than many people, so I want to make sure what I am getting works well for my needs.
 
Nearly everything in science and engineering uses some metric to measure the outcome of a hypothesis.
There is little room for subjective findings; they are in fact rarely considered, unless they are the only method of comparison (mostly in medical studies: the patient had less nausea, felt improved, etc.).

In this respect your hypothesis is GPU X is better than GPU Y in a series of tests.

Your reviews DO in fact use a benchmark. Your reviews use a suite of games. Each game is played and a segment is chosen to use for each GPU tested. This is your benchmark, you just don't call it that.
True, it may not play out exactly the same in each run-through, but with the same tester over and over, it's pretty damn close.

Your metric may not be frames per second; in fact it is which settings in a game can be turned up/down/off/on, and this becomes the metric by which "better or worse" performance is measured. The tester then reports that the test segment was "acceptable" using those setting metrics. The number of settings turned up or utilized then defines "better or worse".

While you do not use a pre-defined benchmark, you do use a representative segment of a game that generally includes use of the effects you wish to turn on/off/up/down, and you do have a metric; it just isn't the standard numeric frames per second.
 
I don't pay attention to the framerates in your reviews because, other than the odd time I borrow the use of my roommates 4K TV, I use a 1080p TV. So does my gf, daughter and brother. I am using SLI 980ti, brother has SLI 970s, another has a 780ti and the kids PCs have GTX 680s. All of this is overkill for 1080p at max settings on almost every single game. You tend to use high resolution monitors with high panel speeds and low input lag that I do not own.

So why do I come here every day? Because you talk about fan noise being too loud in your opinion, temperatures feeling too hot to the touch, your experiences with gameplay stuttering, how bad textures and/or "popin" looks on a brand new game, your experiences installing drivers or firmware, how warped a motherboard is right out of the box. How much faster CPU/GPU X feels compared to Y. How frustrated you were dealing with <insert brand here>'s support. How you noticed a video card company suddenly stopped responding on forums and emails. That is why I come here.

I don't need numbers. I need to know the experience of someone who has had their hands on thousands of pieces of hardware or game titles for over a decade, because I don't like experimenting with my hard earned money. Period.
 
I found HardOCP around 2003/4, when I was building my first system as a university student. I scoured the internet for reliable reviews on components, I had several sites at the time that I used (honestly don't remember many of them now).

After building about 7 complete systems since then (and periodic updates to each system), I have become increasingly reliant on [H] as I have personally found your information contributes to a better gaming and PC experience more so than other sites. I absolutely appreciate the 'real world' analysis of components and their impact on actual experience.

I've found benchmarks are too much of a ballpark on performance; some of those original systems were built around the highest benchmarks. While those systems ran fine, I was disappointed in the actual gaming experience. As I switched to using [H]-recommended products, which did not always have the highest bench, my experience improved. Hence the increased reliance on [H] reviews.

So I guess, keep up the great work.
 
The moment the computer hardware industry stops releasing numbers, it will be the same as the high-end audiophile industry is today: a pool of liars and fake impressions, instead of what it is today, an industry that is focused on performance and uses numbers to back up its claims.
That is a very interesting comparison. I like the association.
 
I found HardOCP around 2003/4, when I was building my first system as a university student. I scoured the internet for reliable reviews on components, I had several sites at the time that I used (honestly don't remember many of them now).

After building about 7 complete systems since then (and periodic updates to each system), I have become increasingly reliant on [H] as I have personally found your information contributes to a better gaming and PC experience than other sites. I absolutely appreciate the 'real world' analysis of components and their impact on actual experience.

I've found benchmarks are too much of a ballpark on performance; some of those original systems were built around the highest benchmarks. While those systems ran fine, I was disappointed in the actual gaming experience. As I switched to using [H]-recommended products, which did not always have the highest bench, my experience improved. Hence the increased reliance on [H] reviews.

So I guess, keep up the great work.

Despite my earlier comments in support of FPS, if HardOCP decided to go the way of qualitative experiential reviews, I would still visit just as often.

Your post is bang on - at the end of the day, HardOCP has been my 'Go to' source for computers since the day of the Celeron 300. If almost two decades of quality work isn't enough to earn your trust and faith in the reviews, then maybe you should move on to a different site.
 
Let me say here that this is simply a discussion; there are no plans, etc., in the works. Ryan's article, while truly on a totally different topic, got me thinking about all of this again in terms of how HardOCP presents GPU reviews. It has been a few years since I visited all of this "in my head", to be honest.

So what if we did something like this? One of the biggest complaints about "highest playable settings" over the years is that we are "comparing" cards at different settings. So what if we got rid of the frame data associated with highest playable settings, but surely retained it for Apples to Apples? That way, maybe we could focus more on talking about the actual IQ settings and what those brought to the "highest playable settings" gameplay experience?
 
I'd be perfectly fine with the removal of framerate data. To be honest I have been ignoring it for some time now, going directly to the impressions. With G-Sync it really doesn't matter what the exact framerate is from second to second. All I want to know is if the game looks good, runs smoothly and plays well using a given set of configurations.

Extra thought
I can tell you one thing, though: people wouldn't take this change if it happened at [H] nearly as well as they did when the focus moved away from apples-to-apples benchmarks ;).

I will say that this is a dumb idea, and I have a G-Sync monitor. Why? Very few people have one, and if that weren't enough (it is), there have been many instances where G-Sync produces unexpected results.

I will also say that trying to make a case for excluding actual data just seems like a way to justify lazy/incomplete reviews. Anyone can play a game and write a short article about how they feel it performed. This thread feels like the powers that be are testing the waters to gauge public opinion on doing just that. The data is important. Not having any, or keeping it to a minimum, wouldn't make the reviews merely "different"; it would make them border on useless. Heck, might as well just have someone video record you with their smartphone playing a game while you talk about it at that point.

And yes, I know we have the option of not reading them or reading other reviews, since I can feel a reply like that coming. That isn't the point.
 
Honestly, for me, I find apples to apples one of the best, especially when all you show is max, min, and avg. That pretty much clinches decisions for me as to which video card to buy, especially when I'm shopping for mid-range and low-end cards.
 
Honestly, for me, I find apples to apples one of the best, especially when all you show is max, min, and avg. That pretty much clinches decisions for me as to which video card to buy, especially when I'm shopping for mid-range and low-end cards.

This and I look at minimums. Minimums are what matter to me the most and you can also see stutter.

Maybe adding a standard deviation to the normal chart would be interesting too. Maybe two cards have the same average but one is more consistent, and that wouldn't be more work.
 
Kyle, I understand the idea of dropping metrics but for me personally, not having them would make the articles less useful because the experience I aim for is different than what you are looking for.

For me, clarity trumps visual effects, which is why I try to run every game at 4K, even if that means sacrificing all those lovely lighting effects; the only settings I keep at maximum are texture-related. It should be noted here that I use a FreeSync monitor, otherwise this would be a fool's errand. But when you can go as low as 33 FPS without adverse effects, even a 290X can be surprisingly 4K-capable, as long as you are prepared to dial back on the effects.

And that brings me back to the topic at hand: I can roughly calculate whether or not I will be able to play with the settings I want by extrapolating your settings & numbers - but if you drop the numbers, I would be left guessing.
 
While we are speaking of hypothetical content changes on [H]...

Some context:

I bought a used car and made the mistake of only reading reviews of the car and neglected to Google "2010 Audi A4 issues". All reviews pointed to it being roughly on par with Mercedes C series and BMW 3 series models and a fantastic car.
Within days of purchase, the A4 burned through 1 litre/quart of oil every 300 km/186 miles. A bit of Googling and I would have known that's very common on these cars from '09-'12 due to a poor PCV and piston ring design.
Dealer won't take the car back and I'm screwed.

Conclusion:

I would love to see hardware revisits and long term subjective experiences on [H] articles. Because initial reviews aren't everything.
 
I would love to see hardware revisits and long term subjective experiences on [H] articles. Because initial reviews aren't everything.
I 100% agree with you. I would love to produce that content. The fact of the matter is that I see no way we could get the resources needed to tackle that content scope.
 
The more replies I read, the more I realize that I simply think the data being provided both here and at other outlets is no longer meaningful to me as a reader, which explains my reluctance to look at it. I think the frametime percentage idea from gamer2k is a good suggestion as to what would be more meaningful, but as a programmer and analyst I don't like the idea of setting a static 60 FPS as the benchmark, because as many people have said, what is an acceptable framerate in one game may not be acceptable in another.

I like where Dayaks is going:
Maybe adding a standard deviation to the normal chart would be interesting too. Maybe two cards have the same average but one is more consistent, and that wouldn't be more work.
As an analyst I can get behind this idea. This could be a better representation than the frame time reply above. One standard deviation would tell you that 68% of your game time was in this particular FPS range, with 2 giving you 95%. I think this would give a better idea of the gameplay experience than simply min, max and average. Could pretty it up with overlaid bell curves (or burves, as my fingers wanted to type). A skinnier curve should represent a better gameplay experience so long as the average is within reason.
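
As a rough illustration of the standard-deviation idea, here is a minimal sketch using only the standard library, assuming a list of per-second FPS samples like the ones behind the usual time-series graph (the sample values are made up, and the 68%/95% reading assumes the distribution is roughly normal, which is what the overlaid bell curve would show):

import statistics

def fps_spread(fps_samples):
    mean = statistics.mean(fps_samples)
    sd = statistics.stdev(fps_samples)          # sample standard deviation
    return mean, sd

if __name__ == "__main__":
    samples = [62, 58, 61, 44, 60, 59, 63, 57]  # hypothetical per-second FPS readings
    mean, sd = fps_spread(samples)
    print(f"average: {mean:.1f} FPS, std dev: {sd:.1f}")
    print(f"~68% of the run inside {mean - sd:.1f} to {mean + sd:.1f} FPS")
    print(f"~95% of the run inside {mean - 2*sd:.1f} to {mean + 2*sd:.1f} FPS")

A skinnier spread at the same average shows up immediately as a smaller standard deviation, which is the "more consistent" card in the quote above.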

While we are speaking of hypothetical content changes on [H]...

Some context:

I bought a used car and made the mistake of only reading reviews of the car and neglected to Google "2010 Audi A4 issues". All reviews pointed to it being roughly on par with Mercedes C series and BMW 3 series models and a fantastic car.
Within days of purchase, the A4 burned through 1 litre/quart of oil every 300 km/186 miles. A bit of Googling and I would have known that's very common on these cars from '09-'12 due to a poor PCV and piston ring design.
Dealer won't take the car back and I'm screwed.

Conclusion:

I would love to see hardware revisits and long term subjective experiences on [H] articles. Because initial reviews aren't everything.
We already kind of get that with articles like the recent 4 Weeks with NVIDIA TITAN X SLI - 4 Weeks with NVIDIA TITAN X SLI at 4K Resolution article. I think it would be very prohibitive, resource-wise, to do that kind of review for a broader range of hardware, though. Maybe get the feedback directly from the users of the forum in a survey with quantifiable data and post some select personal impressions to go along with it after a certain amount of time has passed since the hardware was released to consumers (like one month, three months, and/or six months).
 
I like knowing if card X is going to give me the best experience in games for the money at the resolution or on the device I'm viewing it on. I guess in a year or so we will be seeing VR performance too :)
FPS really does not matter to me; it's more bragging rights, imo.
 
Sorry, but no.

The moment the computer hardware industry stops releasing numbers, it will be the same as the high-end audiophile industry is today: a pool of liars and fake impressions, instead of what it is today, an industry that is focused on performance and uses numbers to back up its claims.

Objective data is all that is needed to analyze anything. That or blind testing when you can't simply use numbers. On the matter at hand you have your own opinion of what is playable or not, and my idea might not be the same. Numbers have no bias (so long as they are taken honestly)

There's at least one problem, though: even raw numbers can lie, e.g. the "runt frames" that were eventually caught.

1) fps + frame times + latency
2) game mouse-kb-video code
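
For context on the runt-frame point, here is a minimal sketch of the idea behind runt filtering: frames that technically reach the display but occupy so little screen time that they add nothing visible get excluded before computing FPS. The 2 ms cutoff and the sample values are illustrative assumptions, not FCAT's actual scanline-based threshold:

def filter_runts(display_times_ms, runt_threshold_ms=2.0):
    # Split per-frame on-screen times into visible frames and runts.
    real = [t for t in display_times_ms if t >= runt_threshold_ms]
    runts = [t for t in display_times_ms if t < runt_threshold_ms]
    return real, runts

if __name__ == "__main__":
    times = [16.7, 1.1, 15.6, 16.9, 0.8, 33.4]   # hypothetical on-screen times per frame
    real, runts = filter_runts(times)
    raw_fps = 1000.0 * len(times) / sum(times)
    filtered_fps = 1000.0 * len(real) / sum(times)
    print(f"raw FPS {raw_fps:.1f} vs runt-filtered FPS {filtered_fps:.1f} "
          f"({len(runts)} runt frames discarded)")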
 
The main point of expensive, high-powered video cards is to provide more framerate than cheaper alternatives. So until all GPUs can surpass the refresh rates of our monitors in all games, measuring framerate is necessary.

And this is a bit off topic, but as someone who's on a budget and is just looking for a smooth gameplay experience, I like it when review sites use the same benchmark and settings for all cards. Benchmarking the featured card along with competitors in its price range is useful for figuring out which one provides the best price/performance ratio, but it has no use in figuring out if I should upgrade an older card, which hasn't been benchmarked recently with those same titles. The only reason I still visit anandtech and toms hardware is for their CPU/GPU charts...
 
The only reason I still visit anandtech and toms hardware is for their CPU/GPU charts...
We do not keep those history charts for two main reasons. As drivers change and game patches change, so does the performance of those cards. Neither of those sites goes back and runs new data when patches and drivers are released, so you are seeing old data. I have an issue with that and am not willing to show that old data to our readers. That is just my personal choice. The resources needed for staying on top of that are not ones I am willing to spend.
 
It seems like this might be the time to set up something similar to MINIMUM and RECOMMENDED system requirements, but for Subjective Gaming Ratings (SGR). Just a suggestion, but maybe have the top 5 games of the month each get tested and given an SGR rating, and those could be posted.

I like that approach. At the end of the day, that is mainly what I care about. I don't give a damn what the numbers for a particular card vs. another are if they don't let me play a game well and have it look nice. Oh look, this AMD APU runs this game at 16 FPS versus this Intel's 12 FPS! Take that, Intel! Who gives a damn! I can't play it lol
 
What I WOULD like to see (and it would probably increase your workload substantially) is an added 'settings for 60 minimum' section showing the settings one would have to use to maintain a minimum of 60 FPS at a given resolution. It's no secret that MANY members of [H] feel any frame rate under 60 FPS is unacceptable, and many times you hear people say something along the lines of 'my setup used to play everything locked at 60, now it's slowing down and I need an upgrade'

Or
'I bought that card, but I wasn't able to play the game maxed at 60, had some dips into the 40s and needed to turn down the AO, so I got a refund and bought another card'
It's a real [H] attitude, and I think it would be awesome to see what games/cards require to run at a 60 FPS minimum.
This would be very cool. Call it the "[H]60 Standard" or something. With a nice graphic.
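
A minimal sketch of how a "[H]60" style check could be scored from a frametime capture; the one-second bucketing is an assumption about how one might implement "never drops below 60 FPS", not an existing [H] method:

def worst_second_fps(frametimes_ms):
    # Walk the capture in one-second buckets and return the lowest frame count seen.
    worst, frames_in_bucket, elapsed = float("inf"), 0, 0.0
    for t in frametimes_ms:
        frames_in_bucket += 1
        elapsed += t
        if elapsed >= 1000.0:            # one second of game time elapsed
            worst = min(worst, frames_in_bucket)
            frames_in_bucket, elapsed = 0, 0.0
    return worst

if __name__ == "__main__":
    frametimes = [16.7] * 300 + [25.0] * 60 + [16.7] * 300   # hypothetical run with a dip
    floor = worst_second_fps(frametimes)
    verdict = "meets" if floor >= 60 else "misses"
    print(f"worst one-second FPS: {floor}, {verdict} the 60 FPS minimum")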
 
While I often review the benchmarks, I personally have felt they are not really telling me what I want to know about the hardware.

Reviews often shy away from media playback capabilities, standards conformance, and other errata that would sometimes actually influence my purchasing decision more than any frame rate data.
Often when buying a product you hope it works and continues working with new media and applications, and sometimes even old applications, and reviews today do little to reinforce that hope.

However, expecting that kind of in-depth review is pretty much unrealistic in this day and age with the complexities that exist in software and hardware configurations.
I still feel more can be done to demonstrate what the hardware is capable of, rather than whether it can play game X at Y speed.

Because when it comes to video cards, they do much more than play games.

They play back and process video.
They now send audio over HDMI.
They still render 2D.
They now accelerate our web browsers.
They act as special compute systems for various software.
And soon, VR rendering.

Some of these tasks should at least be tested on video cards these days so that we can get a better understanding of the capabilities of the cards.

Unfortunately, that's asking for quite a lot.
 
While I often review the benchmarks, I personally have felt they are not really telling me what I want to know about the hardware.

Reviews often shy away from media playback capabilities, standards conformance, and other errata that would sometimes actually influence my purchasing decision more than any frame rate data.
Often when buying a product you hope it works and continues working with new media and applications, and sometimes even old applications, and reviews today do little to reinforce that hope.

However, expecting that kind of in-depth review is pretty much unrealistic in this day and age with the complexities that exist in software and hardware configurations.
I still feel more can be done to demonstrate what the hardware is capable of, rather than whether it can play game X at Y speed.

Because when it comes to video cards, they do much more than play games.

They play back and process video.
They now send audio over HDMI.
They still render 2D.
They now accelerate our web browsers.
They act as special compute systems for various software.
And soon, VR rendering.

Some of these tasks should at least be tested on video cards these days so that we can get a better understanding of the capabilities of the cards.

Unfortunately, that's asking for quite a lot.
Maybe an important question: but what are the chances that there is ANY difference in these things among cards reviewed at [H]?
 
Maybe an important question: but what are the chances that there is ANY difference in these things among cards reviewed at [H]?


Well, that's what one would expect, but several times it has become apparent that all cards are not created equal when it comes to menial tasks, and for some reason there is a difference. Sure, they can play games and play them fine,
but in one particular instance a card could not pass surround over HDMI in iTunes; it had surround in other players but nada in iTunes.

In another instance, an entirely different chipset can't play Zulu Gems; it seems to play everything else fine except Zulu Gems.

So no, not all cards are created equal, nor do they all do what we would assume to be generic tasks equally well.

But as I mentioned, this kind of testing is quite difficult, and the void of not having such metrics leaves a lot to speculation and some costly trial and error.
 
But as I mentioned, this kind of testing is quite difficult, and the void of not having such metrics leaves a lot to speculation and some costly trial and error.
I guess I see what you are saying, but it sounds like individual program or driver issues, not card-specific issues. Plus, for the [H] guys, it seems to me that trying out all the kinds of hardware/software combinations is basically impossible. At least unfeasible. This IS a gaming site, not an iTunes site.
 
I guess I see what you are saying, but it sounds like individual program or driver issues, not card-specific issues. Plus, for the [H] guys, it seems to me that trying out all the kinds of hardware/software combinations is basically impossible. At least unfeasible. This IS a gaming site, not an iTunes site.

I wouldn't say [H] is a gaming site; it's a hardware enthusiast site. Games just happen to be one of the more impressive ways to demonstrate the hardware.

In regards to the reviews, video cards have become about so much more than just displaying graphics. They fill several roles now, and reviews that only test one subset of those capabilities may be missing key details that may be of interest to potential buyers.

And unfortunately, as much as I hate it, the hardware is only as good as its drivers. It doesn't matter how whiz-bang the hardware is; if the drivers cause issues with other things and/or prevent proper operation of other components, the hardware ends up not being all that whiz-bang after all.
 