What Sucks About HardOCP Video Card Reviews

FrgMstr

This thread is here for those of you that have issues with the FORMAT of HardOCP Video Card Reviews.

Discussion here can range from how we pick cards to compare, what cards we picked, or what games we use for those comparisons. This discussion can cover how we collect framerate data, or do not collect framerate data. You can talk about whatever YOU think is wrong with our GPU format and how our content does not serve your own personal needs when it comes to video card reviews. You can talk about HardOCP and its biases, be those Red or Green. Let us know what you think about the test setup, both in terms of hardware and software. Complain about the graphs and Highest Possible Settings used, please.

And by all means, complain about points laid out specifically in the review that you did not take the time to read.
 
Honestly, I can't say there is anything wrong with the testing you have done for the past decade of GPU reviews. For starters, you include both types of review: for those who care about "the experience" and for those who care about the "numbers". Everybody is happy. Bias? I believe you take testing seriously and only consider data that looks legit (anybody who has done any sort of testing knows when data gets fucked up; it happens), and thus I trust your numbers as my own. The fact that you keep reviewing cards that have been on the market for a long time is another thing to be thankful for, since, as everybody knows, drivers tend to get better, and a day-one review of a given product might show a different picture than you would get if you reviewed the card now, two years later.

But, imo, all of this gets thrown away if you skip the numbers and go directly to the experience. Why? Well, how do I know your "good" experience is the same as mine? Also, how do I know you weren't simply suggestible (I'm not talking about bias, since I consider bias to be on purpose)? Some time ago I truly wanted to compare some audiophile gear I had lying around. The thing is, when you compare a product that costs more than another and has a better "name", those impressions get into your head, especially when the differences are pretty small. To the point that I had to discard all the data I obtained, because I couldn't trust my own gut when I knew what I was testing. So I had to do some blind testing. And it is a HUGE pita, because you need somebody else to do the plugging and unplugging... but then, once you are free of suggestion and prior impressions, you truly analyze the product for exactly what you perceive. But still, what you feel might not be the same as what your readers feel... which makes the sharing of experiences on something that can be objectively tested somewhat... unpredictable.

For starters, I joined this forum long after discovering the site, and I'm still not sure what your "happy zone" is regarding framerate. That, and the fact that so many games have different "comfort zones" (which can be explained by how the engine works and the way it outputs frames, i.e. frame times), makes it a bit hard to understand how anybody's impressions will translate into our own.

But yeah, since DX12 fucked up the whole testing procedure (i.e. no more standard benching tools), I guess we will all have to adjust to it somehow. I just feel helpless without some hard data to look at.
 
But, imo, all of this gets thrown away if you skip the numbers and go directly into the experience. Why? Well, how do I know your "good" experience is the same as mine?
I agree.
Hopefully I am still on topic this time.
 
I'm all about how [H] does things currently.

As mentioned in another thread, a separate testing method of "settings for 60FPS minimum" would not be lost amongst MANY [H]ard enthusiasts. Essentially, showing how cards compare when you lower settings to make 60FPS the life-or-death no-compromise minimum FPS. Maybe you'll find that at the 'sweet spot' 45 FPS with minFPS in the 30s, card A is the clear winner. But when you demand 60FPS minimum and lower settings to achieve this, Card A requires everything to be turned down, but Card B requires less visual sacrifice and actually pulls ahead!

Just food for thought.
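That "settings for 60FPS minimum" method could be sketched as a simple search: for each card, walk the presets from lowest to highest and keep the highest one whose measured minimum FPS stays at or above the target. A minimal Python sketch, with made-up card names and FPS numbers purely for illustration (not real benchmark data):

```python
# Hypothetical data: measured minimum FPS per card per preset.
TARGET_MIN_FPS = 60
PRESETS = ["low", "medium", "high", "ultra"]  # ordered lowest to highest quality

measured_min_fps = {
    "Card A": {"low": 95, "medium": 70, "high": 52, "ultra": 38},
    "Card B": {"low": 90, "medium": 68, "high": 61, "ultra": 44},
}

def highest_preset_at_target(card, target=TARGET_MIN_FPS):
    """Return the highest preset whose minimum FPS meets the target."""
    best = None
    for preset in PRESETS:  # walk low -> ultra, keep the last passing preset
        if measured_min_fps[card][preset] >= target:
            best = preset
    return best

for card in measured_min_fps:
    print(card, "->", highest_preset_at_target(card))
# With these invented numbers, Card B sustains 60 FPS at a higher preset
# than Card A, even though Card A might win a raw average-FPS chart.
```

The interesting output isn't an FPS bar, it's a preset name per card, which is exactly the "how much visual sacrifice does 60FPS cost" comparison described above.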
 
There's actually something I've wanted to say right from the start, so I'm glad you're directly asking...


I always felt the idea of providing too much interpreted data might actually be less useful than dumb apples-to-apples testing, despite being technically superior. The problem is, as soon as you say 'comparable' settings, that's experimenter bias, and you immediately present a load of unknowns for the reader to consider: why 2xAA on this and 4xAA on that? Why HDR on for this and not on that? And each of those factors has to be considered both individually in the context of the specific game, and in the context of how the author feels about it, both of which are unknowns.

Graphs are meant to be intrinsic, but without intrinsic knowledge of the particular set of compromises you chose, they kinda lose that. And all because of that one word, 'comparable'.

Sorry for being completely unconstructive! But you did ask, so... :joyful:
 
Good to know you are always open to constructive criticism, but the reviews are near perfect as is. Keep up the great work.


OT... The only thing I would like to see is extensive testing of eGPU devices in the near future.
 
I'd suggest a standardized list of terms. It could help reduce people bitching about bias.

You gave the Nano a Gold award, but people still complained, IIRC.

Otherwise my favorite reviews are the OC vs OC reviews and game performance reviews.
 
I enjoy the reviews for the most part. One thing that may help is a condensed settings difference.

For example, in your DX12 Tomb Raider review there was a huge table (kind of a puke of data), and 99% of it didn't change between cards/DX versions. You summarized it in the text, but I found myself trying to do a diff on the tables to find the different settings. Maybe find some way to make it easier to see which settings you are [+]ing or [-]ing.
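That table-diff idea could even be automated: partition each settings table into the values that are identical everywhere (summarize those in a sentence) and the values that differ (show only those in the table). A rough Python sketch, with hypothetical config and setting names:

```python
# Sketch: condense a settings table by separating common settings from
# the ones that actually differ. Config/setting names are hypothetical.

def settings_diff(tables):
    """tables: dict of config name -> dict of setting -> value.
    Returns (common, diffs): settings identical in every config, and
    per-config values for settings that vary."""
    configs = list(tables)
    common, diffs = {}, {}
    for setting in tables[configs[0]]:
        values = {cfg: tables[cfg][setting] for cfg in configs}
        if len(set(values.values())) == 1:
            common[setting] = values[configs[0]]  # same everywhere
        else:
            diffs[setting] = values               # worth a table row
    return common, diffs

tables = {
    "Fury X DX11": {"Texture Quality": "Very High", "Shadows": "High", "VSync": "Off"},
    "Fury X DX12": {"Texture Quality": "Very High", "Shadows": "Medium", "VSync": "Off"},
}
common, diffs = settings_diff(tables)
print("Same everywhere:", common)
print("Differences:", diffs)
```

Only the `diffs` part would need cells in the published table; everything in `common` collapses to one line of text.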
 
There's actually something I've wanted to say right from the start, so I'm glad you're directly asking...


I always felt the idea of providing too much interpreted data might actually be less useful than dumb apples-to-apples testing, despite being technically superior. The problem is, as soon as you say 'comparable' settings, that's experimenter bias, and you immediately present a load of unknowns for the reader to consider: why 2xAA on this and 4xAA on that? Why HDR on for this and not on that? And each of those factors has to be considered both individually in the context of the specific game, and in the context of how the author feels about it, both of which are unknowns.

Graphs are meant to be intrinsic, but without intrinsic knowledge of the particular set of compromises you chose, they kinda lose that. And all because of that one word, 'comparable'.

Sorry for being completely unconstructive! But you did ask, so... :joyful:

I feel that, generally, [H] does a pretty good job of explaining why they picked a setting in each game. People get too caught up in pure framerate. Smoothness is more important than framerate, by far. If you ask me to pick between a perfectly smooth 30fps or a wildly uneven 60fps, I'll take the 30fps option any day of the week. Not only is that unevenness incredibly annoying, but it makes me feel ill if the framerate noticeably fluctuates too often. I feel that the framerate charts [H] provides in reviews are a lot more useful than the numbers themselves. I look at the numbers for the sake of curiosity; they're there, and so they get looked at. I just read the Tomb Raider DX12 review a few minutes ago, and I couldn't tell you a single one of the frame rate numbers they mentioned. I saw them and proceeded to completely forget them by the end of the page, as they're useless to me. I feel the same about the apples-to-apples charts. They're neat, but I find them ultimately useless outside of looking at smoothness.
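The smooth-30 vs. uneven-60 point can be made concrete with frame times: a run can post double the average FPS while being far less consistent, and the standard deviation of frame times is one rough stutter metric. A small Python sketch with made-up frame times (milliseconds), purely for illustration:

```python
# Sketch: why frame-time consistency can matter more than average FPS.
from statistics import mean, pstdev

smooth_30fps = [33.3] * 12               # every frame ~33.3 ms: steady 30 FPS
uneven_60fps = [10.0, 10.0, 30.0] * 4    # averages to 60 FPS, but stutters

def avg_fps(frame_times_ms):
    """Average FPS implied by a list of frame times."""
    return 1000 / mean(frame_times_ms)

def jitter(frame_times_ms):
    """Population standard deviation of frame times: a crude stutter metric."""
    return pstdev(frame_times_ms)

for name, ft in [("smooth 30fps", smooth_30fps), ("uneven 60fps", uneven_60fps)]:
    print(f"{name}: {avg_fps(ft):.0f} FPS avg, jitter {jitter(ft):.1f} ms")
# The 60 FPS run has twice the average framerate but a large frame-time
# spread; the 30 FPS run has zero spread.
```

A chart of frame times over a run carries this information directly, which is why the line charts can be more telling than the summary numbers.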
 
Very good reviews; I think the graphs and charts are great just the way they are. The reviews always give an overall 'feel' for what to expect. The only thing I would say is: I'll never own a 980 Ti while it's the cream of the crop, and it'd be nice to see some mid-range cards thrown in there when comparing across various games. (Depending on what's being reviewed, of course.)

But with that said, I can still get a 'feel' for what my GTX 950 is 'lacking', hehe. I just recently upgraded to my 950, and I'll tell you that GPUBoss and similar websites did not contribute to a good buying decision. That's why I love the [H] reviews. However, top-of-the-line cards are not in my budget at all, so it would be fun to see how more mid-range, budget-friendly cards compare to the top of the line, with [H] standards as far as highest playable settings at various resolutions and so forth.

It might give someone on a budget an idea for buying a lower-priced card that works for their needs, or someone with a mid-range card who has been holding out may decide that 'Said Game' really needs some more power under the hood and it's time to make the jump. :)


why 2xAA on this and 4xAA on that? Why HDR on for this and not on that? And each of those factors has to be considered both individually in the context of the specific game, and in the context of how the author feels about it, both of which are unknowns.

I hear what you're saying, but sometimes comparing is a research project. Sometimes I see a card that looks really close apples to apples and then notice "Damn, the settings are medium vs high! Interesting!"
I enjoy seeing what each card is capable of, and it really helps in deciding when buying new. If I wanted to see how an MSI and EVGA card of the same type varied by a few FPS, well, then, that would be very easy to read.

The way I see it is: "Ah! This card can get 'X' FPS at this resolution and these settings, and this one can get 'XX' FPS with these settings, and here are the results... Hmmm... Let's look at the price difference."

Just my 2 pennies.
 
I wouldn't mind some video reviews. I know it'd probably be hard to convey the quality of the GPU's video output, but we might get a nice idea of the gameplay experience you're describing versus other cards.
 
While I have a preference for hard benchmarks, I actually don't mind the transition to experiential reviews. I remember when the community was up in arms in the early 2000s when HardOCP and a handful of others were moving away from canned benchmarks. People are resistant to change, but at the end of the day, that change was for the better and adopted as the industry standard.

I guess we have to ask ourselves: what is the purpose of a video card review? If it is to evaluate the ability of a particular piece of hardware to play a range of games, then a qualitative assessment of performance by the reviewer can actually provide ADDITIONAL insights that may not be captured in benchmark data. Where benchmark data is useful to a guy like me is in providing a point of comparison to see how my system is performing on a relative scale. I have mentioned this before, but in a perverse way, I am more interested in squeezing every last ounce of performance out of my system than in actually playing the game (I am probably in the minority here).

Regardless of what direction HardOCP takes, I have implicit confidence in the quality of the content and the unbiased nature of the reviews. When it comes to making purchasing decisions, I know I can turn to a select number of sites to give me a recommendation that I can trust; if I want hard benchmark numbers, I will read through the XtremeSystem/Overclockers.net forums.

P.S.: If we are talking about "nice to have" features in the reviews, I would love to see how video card performance scales across different platforms (does Sandy Bridge still stand the test of time when paired with a modern GPU, or is it time to jump to Skylake?). Unfortunately, this is probably super time-consuming, so it is not a reasonable expectation for any review site to run the tests on multiple CPUs.
 
I enjoy the reviews for the most part. One thing that may help is a condensed settings difference.

For example, in your DX12 Tomb Raider review there was a huge table (kind of a puke of data), and 99% of it didn't change between cards/DX versions. You summarized it in the text, but I found myself trying to do a diff on the tables to find the different settings. Maybe find some way to make it easier to see which settings you are [+]ing or [-]ing.

Yes, we are going to be changing that some moving forward for sure. I was not very happy with how the information was presented. Keep in mind this was our first time using this format.
 
I think that, with so many cards compared, it looks a little messy. The new format was made more specifically for regular card reviews, which have at most 4 cards compared, not a game review with 10+ cards. I think in the future, separating the cards into price categories will help break up the tables so it won't look as messy. A lesson learned using the new format; it will evolve for the better as we use it in more game reviews.

I am open to people's suggestions here in this thread on how we can better present game settings and compare them among many cards. Typically we use about 10 cards or more in a game review, so that's a lot to compare side-by-side. I look forward to suggestions on how that can be made easier to compare and less messy; with a game like Tomb Raider, which has a lot of settings to show, it's just a lot of information in one area.
 
Keep the settings consistent among all cards in a market segment unless absolutely necessary. That sort of provides an apples-to-apples comparison. So if a GameWorks feature is locked to Nvidia cards, feel free to enable it and make note of it as a possible performance hit; the opposite for async compute on some titles. If you subjectively think tessellation is overboard, turn it down a notch or two. Just pick a resolution that segment targets (1080p@60Hz for midrange, VR@90Hz and 4K@60Hz minimum for high end) and pick settings that are practical for all cards in that segment. Maybe you have a 20fps difference, but the reader can see and understand that. Turn down some of the "Ultra" settings that are absolutely unnecessary as well. As I mentioned before, also be sure to include the average CPU utilization, because that's nice to know with the new APIs.
 
I am open to people's suggestions here in this thread on how we can better present game settings and compare them among many cards. Typically we use about 10 cards or more in a game review, so that's a lot to compare side-by-side. I look forward to suggestions on how that can be made easier to compare and less messy; with a game like Tomb Raider, which has a lot of settings to show, it's just a lot of information in one area.

Maybe utilize expandable text or table boxes (possibly with a scrolling function) that are collapsed by default in an article. This would allow you to cram a lot of information into the usual space without destroying the usual format.

For comparison, you could utilize some sort of CSS highlighting of differences/similarities between two or more of the collapsible boxes.
 
For what it's worth (not much), I almost ALWAYS go straight to the apples-to-apples comparison. The apples-to-oranges comparisons just don't interest me much. Just throwing that out there for my 2 cents.
 
I think that, with so many cards compared, it looks a little messy. The new format was made more specifically for regular card reviews, which have at most 4 cards compared, not a game review with 10+ cards. I think in the future, separating the cards into price categories will help break up the tables so it won't look as messy. A lesson learned using the new format; it will evolve for the better as we use it in more game reviews.

I am open to people's suggestions here in this thread on how we can better present game settings and compare them among many cards. Typically we use about 10 cards or more in a game review, so that's a lot to compare side-by-side. I look forward to suggestions on how that can be made easier to compare and less messy; with a game like Tomb Raider, which has a lot of settings to show, it's just a lot of information in one area.

Most of the text is common across settings, so, per block of comparison, just make the left column the most common settings and then list the cards to the right of it, only putting text (and color coding?) where there's a difference. Any cell that's identical to the common settings is simply left blank. For example, in DX11 vs. DX12 Highest Playable Settings - AMD - Rise of the Tomb Raider DX11 vs. DX12 Review, the top chart is identical across the board; just have a column to the left of AMD Fury X DX11 that has the whole list and make the rest of the cells empty (cuz there's no difference). For the 390 vs. 390X, copy over the 390X DX11 column to a "common" column, then highlight in red (any colour you like :D) the places where the 390 is different; e.g. the Sun Soft Shadows row would have the text "on" rather than blank.

Also, preset vs. custom is pretty much a waste of text. It doesn't really mean much when you're customizing things as much as you guys do, so that's one fewer row to worry about.

The top graph on pg. 5 is useless and very visually imposing. Not that you couldn't have an infographic to draw attention to the fact that they're all running SMAA at 1440p, but it's a huge block of repetitive, BIG text.
The bottom graph on page 2 is similar. You could just list all the cards in a row under their respective drivers.

No intent to tear down, so I hope that helps. Appreciate the work you guys put in. My old boss, whom I found an excellent presenter, harped on us to simplify, simplify, simplify, such that charts were stripped down to their essence, so that's where I'm coming from.
 
Keep the settings consistent among all cards in a market segment unless absolutely necessary. That sort of provides an apples-to-apples comparison. So if a GameWorks feature is locked to Nvidia cards, feel free to enable it and make note of it as a possible performance hit; the opposite for async compute on some titles. If you subjectively think tessellation is overboard, turn it down a notch or two. Just pick a resolution that segment targets (1080p@60Hz for midrange, VR@90Hz and 4K@60Hz minimum for high end) and pick settings that are practical for all cards in that segment. Maybe you have a 20fps difference, but the reader can see and understand that. Turn down some of the "Ultra" settings that are absolutely unnecessary as well. As I mentioned before, also be sure to include the average CPU utilization, because that's nice to know with the new APIs.

So, you want them to do the exact thing they got away from doing 13 years ago?
 
So, you want them to do the exact thing they got away from doing 13 years ago?
Not quite. Get away from averages in favor of minimums, and use settings that are still roughly comparable. Minimums are sort of what their "best playable" settings should be, but averages seem to be treated as more important than minimums. Maybe a lower-quartile average. And yeah, I know figuring out a good minimum can be rough. I'd be more interested in knowing what the framerate looked like if vsync was forced. In ROTR, for example, DX12 is slower overall, but it looks like there are bad areas which see significant improvements.
 
One thing I'd really like to see is the apples-to-apples comparisons benchmarking more GPUs, often ones from a previous generation. Let's say the new GPUs hit some time in the next few months; I'd like to be able to see how much of an improvement a new card will be over my GTX 970.
 
Since upgrading to G-Sync, my minimum playable fps is around 75. If I have to cut some settings, so be it.

It'd be nice to see highest playable settings for different preferences. 45, 60, 75, etc.

I'd REALLY like to see an emphasis on minimum or even the 'Nth percentile' frame rate. Minimums mean more for experience than averages, and maximums may as well be rubbish.

"Highest settings for X FPS" would be much more useful for today's gamers than 'highest playable'. Its much more scientific.
 
(snipped)
Also, preset vs. custom is pretty much a waste of text. It doesn't really mean much when you're customizing things as much as you guys do, so that's one fewer row to worry about.

The top graph on pg. 5 is useless and very visually imposing. Not that you couldn't have an infographic to draw attention to the fact that they're all running SMAA at 1440p, but it's a huge block of repetitive, BIG text.
The bottom graph on page 2 is similar. You could just list all the cards in a row under their respective drivers.

No intent to tear down, so I hope that helps. Appreciate the work you guys put in. My old boss, whom I found an excellent presenter, harped on us to simplify, simplify, simplify, such that charts were stripped down to their essence, so that's where I'm coming from.

THIS!

Simplify, but don't oversimplify. Keep it to presets plus or minus two major options, like shadows or hair FX (like The Witcher). If/when adaptive sync becomes a mainstream thing, the framerate game will change immensely.
 
The reviews are pretty damn good as they are - but to go to the next level . . .

Maybe switch up the games a little between the video card reviews so a wider selection of games/data is presented; over time that can give a better understanding of performance. Plus it may make it more fun to do as well, versus being stuck with the same game for a period of time.

Throw some old games in there once in a while (actually you have done this on the side); many of us do play older games as well. Speaking of older games, how do they perform at 4K, and does that resolution give new life to them? Reliving some of the best moments in PC gaming is not a bad idea and will also show how far hardware has come. You do have some comparative data on the older games, which can be contrasted with today's hardware performance.

I like the concept of concentrating just on the playability of a game on a given GPU, but without the numbers I think it will be hard to accept the conclusions; they complement each other and are vital for an overall review. Steam does streaming logs, but I'm not sure whether FPS logs alone can be extracted from them, since the FPS counter does work with DX12 in Steam.

Using the Steam In-Home streaming logs :: Steam In-Home Streaming

The settings needed for 60fps in a given game are also intriguing, as mentioned before.
 
I would like to see a card index/ranking-type system: some way to easily identify which cards perform better in certain games at X resolution.
 
What sucks? That they don't give away the cool stuff they get to play with mostly to me and some to the rest of you.
 
At the end of the day, I think you have to include FPS data for both apples-to-apples and the settings you feel are best. I know this isn't possible right now with DX12, but someone needs to find a way, or I'm not sure I'll read the reviews.

I would like to see you test more ranges of cards. I don't need 10+ GTX 970 reviews, but a 950 review would be nice. I'm not even sure you guys reviewed any 960 cards. How about passively cooled cards, or cards that don't require external power?

This doesn't apply to just video card reviews, but trying to search for old reviews is a pain in the ass.
 
Am I being forgetful, or has 1920x1080 become a thing of the past? A lot of us still use, and will continue to use, that resolution for a long time. If it's the collective wisdom that something like the 390X, Fury X, and 980 Ti are wasted on 1080p, let's see it, please. I don't want to have to split my review info by going to other, less analytical sites which don't follow the same methodology as you guys.
 
Am I being forgetful, or has 1920x1080 become a thing of the past? A lot of us still use, and will continue to use, that resolution for a long time. If it's the collective wisdom that something like the 390X, Fury X, and 980 Ti are wasted on 1080p, let's see it, please. I don't want to have to split my review info by going to other, less analytical sites which don't follow the same methodology as you guys.
Not least because HDR 1080p will likely be preferred over standard 4K here soon.
 
I think that, with so many cards compared, it looks a little messy. The new format was made more specifically for regular card reviews, which have at most 4 cards compared, not a game review with 10+ cards. I think in the future, separating the cards into price categories will help break up the tables so it won't look as messy. A lesson learned using the new format; it will evolve for the better as we use it in more game reviews.

I am open to people's suggestions here in this thread on how we can better present game settings and compare them among many cards. Typically we use about 10 cards or more in a game review, so that's a lot to compare side-by-side. I look forward to suggestions on how that can be made easier to compare and less messy; with a game like Tomb Raider, which has a lot of settings to show, it's just a lot of information in one area.



Let's see. It is pointless to show data that doesn't change, because it gives us no insight and makes reading the rest of the data much more difficult. So, what I would do is only write about the data that doesn't change (i.e. "we found that X, Y, and Z settings can be kept at W level throughout the cards and systems we tested") and graphically show the things that do change. This makes understanding the differences much easier, as your eyes get to see straight away what is what. In this particular instance the tables would be less than half the size. So not only do you reduce clutter, you also simplify the understanding part of the review.
 
Honestly, I don't go to [H] for any hardware reviews at all. The formatting and pages are hard to read/follow. The way pictures are linked and presented is annoying and hard to read. I go to PCPer for all of my reviews, but the forum community here is much better. Plus, as mentioned in the post above, there's the pointless repetition of text/features.
 
From where I see it, [H] differentiates itself from other sites by giving a first-hand account of how a game plays on hardware; in this, keep on keepin' on. Hard numbers are all well and good, but if we've learned anything over the past years, numbers don't completely represent how performance "feels", and worse, the numbers don't show performance quirks well, e.g. micro-stutter with multi-GPU. It's kind of like why car reviewers, e.g. Top Gear or Motor Trend, post a short table of specs/performance results that is there to complement the play-by-play driving experience. There's no way to get a feeling for how a Bugatti Veyron drives by looking at simple performance specs: you need the guy driving it to tell you.

That said, there are some stats that are key when comparing performance between cards, and you should keep them in: namely, minimum/maximum FPS and frame times.

As for including 1080p in reviews, I don't think this is needed, since a) 1440p and above has really become the standard for enthusiasts, and b) barring any disparity in VRAM requirements, it doesn't take a whole lot of mental gymnastics to figure out how a card is going to perform stepping down from 1440p to 1080p.

As for the format of the card reviews, it's fine; it shows you exactly what's important: highest playable settings and an apples-to-apples comparison, followed by a run-down of the experience.
 
Honestly, I don't go to [H] for any hardware reviews at all. The formatting and pages are hard to read/follow. The way pictures are linked and presented is annoying and hard to read. I go to PCPer for all of my reviews, but the forum community here is much better. Plus, as mentioned in the post above, there's the pointless repetition of text/features.
You don't read ANY reviews here at all? LOL, I'm not positive they give a fuck about your thoughts... Sure, they can tweak things to improve stuff, but DAMN, lol. You know they do have pictures in their reviews, and they send out crayons if you ask.
 
Personally, while I do read, enjoy, and am entertained by the reviews here, I find them next to useless from a buyer's/consumer's standpoint.

If I'm in the market for a GPU, I want to see how it stacks up against the competition. I want hard numbers/data taken in a controlled environment (AKA all settings used are the same across the board), because ultimately, that tells me exactly what I need to know. When settings are adjusted across tests to hit a target framerate, that really doesn't tell me anything. That's what I, as an enthusiast, already do to make a game perform at what I consider an acceptable level.

For example, I just built a 4K gaming box. I wanted to see how various GPUs stood against each other rendering games at 2160p. [H]'s reviews wouldn't really have helped me with that, unfortunately. I wanted the BEST-performing GPU at that resolution (all else being equal), and I'd adjust settings from there to get the framerates I wanted.
 
Personally, while I do read, enjoy, and am entertained by the reviews here, I find them next to useless from a buyer's/consumer's standpoint.

If I'm in the market for a GPU, I want to see how it stacks up against the competition. I want hard numbers/data taken in a controlled environment (AKA all settings used are the same across the board), because ultimately, that tells me exactly what I need to know. When settings are adjusted across tests to hit a target framerate, that really doesn't tell me anything. That's what I, as an enthusiast, already do to make a game perform at what I consider an acceptable level.

For example, I just built a 4K gaming box. I wanted to see how various GPUs stood against each other rendering games at 2160p. [H]'s reviews wouldn't really have helped me with that, unfortunately. I wanted the BEST-performing GPU at that resolution (all else being equal), and I'd adjust settings from there to get the framerates I wanted.

Now this I agree with. Matter of fact, I mentioned that I always skip the apples-to-oranges part myself. I wonder how many of us feel this way?
 