What Sucks About HardOCP Video Card Reviews

I would like to see some BOINC comparisons in your GPU testing. This would be especially useful even to [H] itself since you have your own BOINC team. Comparisons on PrimeGrid, Milkyway, GPUgrid, etc. between various GPUs would be great for the thousands of people who crunch for the various teams or solo. And since it's rare to find a review with this data, you would draw in viewers who otherwise may never have been to this site.
 
Here's a serious answer that's fairly generic:
I like to skip ahead and read the conclusions. If I want specs or scores or whatever, that's easy to find everywhere. What's harder to find is the reviewer's thoughts on the price, whether it's worth the cost, what the possible negatives might be, where it will shine and where it will have problems, and whether it's something they could see using for the next few years.
 
For example, I just built a 4K gaming box. I wanted to see how various GPUs stood against each other rendering games at 2160p. [H]'s reviews wouldn't have really helped me with that, unfortunately. I wanted the BEST performing GPU at the resolution (all else being equal), and I'd adjust settings from there to get the framerates I wanted.

My only request would be to have some comparison to older, high-end cards. Some of us skip generations between upgrades and it would be helpful to have more relevant data when we do make a purchase. I know it's not feasible to do multiple extra run-throughs of every game tested, but some fairly constant measure would be useful.

4K resolution numbers and comparisons against same-tier but last-generation cards would be a HUGE plus, as people move to upgrade a card and as we enter the era of possibly viable 4K PC gaming.
 
A Dollar/FPS chart. The majority of us don't have unlimited funds to spend on playing video games. A good portion of our paychecks goes towards our mortgage, utilities, groceries, etc. Getting the most bang for your buck is the primary goal for many people when shopping for a new video card. A Dollar/FPS chart for video cards (similar to your Dollar/Performance Ratio charts in your CPU cooler reviews) would be welcomed.
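
To be clear about the math I'm asking for, here's a minimal sketch in Python (the card names, prices, and FPS numbers are made up purely for illustration):

# Hypothetical example data: (card, street price in USD, average FPS in one game).
cards = [
    ("Card A", 329, 62.0),
    ("Card B", 649, 70.0),
]

# Dollars per frame: lower is better value.
for name, price, fps in sorted(cards, key=lambda c: c[1] / c[2]):
    print(f"{name}: ${price} / {fps:.0f} FPS = ${price / fps:.2f} per frame")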

That sounds like a good idea. There are some cards that are equal in performance but differ quite a bit in price (aside from the other specs - power efficiency, heat, etc.). Just a simple FPS-to-dollar comparison. If I can get 60 FPS on Card A for $300, why should I pay $600 for Card B that does 66 FPS? There are other things, but that part does come into play. The cost of something is a huge factor for me. I buy the best bang for the buck. Which brings me to the next point:

Here's a serious answer that's fairly generic:
I like to skip ahead and read the conclusions. If I want specs or scores or whatever, that's easy to find everywhere. What's harder to find is the reviewer's thoughts on the price, whether it's worth the cost, what the possible negatives might be, where it will shine and where it will have problems, and whether it's something they could see using for the next few years.

They usually bring up the cost/performance in comparison to other cards, so it's not a huge thing. Just something that sounds neat, but it probably adds a bit of time to the research... I always read the conclusion first. There have been some reviews with bad conclusions, or where there was a better card for less money...
 
I think it's really interesting the way you guys review cards. It's a completely new take on how to judge a card. The fact you guys talk about stuttering and smoothness is something that really resonates with me. I like the idea of the highest playable settings instead of ULTRA. I wouldn't want every review site to do it the way you guys do, but I like the variety.
 
I really like how reviews are handled at HardOCP. You guys are really honest in your evaluations and you constantly update the games themselves to keep up with the times, unlike some sites that are still using old games that do not push modern cards.

I'm the type of guy that prefers to lower the resolution (reasonably) instead of disabling effects. For example, my GTX 970 is slowly turning into a mid-range card, but it tends to be tested at 2560 x 1440, where some effects have to be disabled. I would really like for reviews to show how far down in resolution I would need to go to be able to keep all effects on. Of course, if I needed to go down to 640 x 480, that would be absurd, but 1080p or 1400 x 1050, etc. wouldn't be that bad, at least for me.
 
I'm thinking establish a baseline, say:

1080p @ 60fps average
What settings can such-and-such hardware manage before failing this requirement?

And then maybe contrast it against something a little more forward-looking, like:

4K, maxed settings
How far can such-and-such hardware take you in this regard? Is it possible to get this into "playable" ranges with tweaking?

I'd like to see frametimes alongside it, as it's possible for something to be a jittery, microstuttery shitshow with a good average framerate.
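
As a rough sketch of what I mean (Python, with synthetic numbers invented for illustration), compare the FPS implied by the average frametime against a high-percentile frametime:

import statistics

# Synthetic frametimes in milliseconds: mostly ~83 FPS pacing (12 ms)
# with periodic 60 ms hitches -- the "jittery shitshow" case.
frametimes = [12.0] * 95 + [60.0] * 5

avg_ms = statistics.mean(frametimes)
p99_ms = sorted(frametimes)[int(0.99 * len(frametimes)) - 1]

print(f"average: {avg_ms:.1f} ms (~{1000 / avg_ms:.0f} FPS)")          # looks healthy
print(f"99th percentile: {p99_ms:.1f} ms (~{1000 / p99_ms:.0f} FPS)")  # reveals the stutter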
 
Full disclosure: I purchased my current cards (2x AMD Sapphire 7950) based entirely on the box art. My friend was trying to convince me to get an R9 or GTX 960, but I chose the rendered marketing ploy.

As far as the graphics reviews here, I've always wished there was a section focusing on optimizing for high frame rates, as I'm more interested in hitting the human retina limit of ~83fps. I dislike how most gamers feel 60fps is acceptable as the benchmark of "playable" just because for so long they have been force-fed 60Hz LCDs and were sold the lie that frames above this level are indistinguishable. I can usually estimate the settings needed to reach a playable frame rate based on the data given in your reviews, but there is a lot of unwanted data, like 4K resolution and the sections where you talk about how the gameplay smoothness feels. 4K resolution will only matter when monitors supporting >80Hz refresh are available. To me it is just a waste of space to scroll over in the reviews.

Your hardware reviews are still, and have always been, the highest quality on the web, and you don't really need to change anything.
I do have to say the most recent Tomb Raider DX12 review was abysmal, and it left a sour taste in my mouth.

Personally, I think 4K is awesome, and 60 FPS is the standard, not the unanimous ideal. The only people who are worried about arbitrary framerates in between 60 and 120 are CRT truthers, and this is a hardware enthusiast forum, not an antiques dealers' forum.

I can personally tell the difference between 90 and 120FPS. Some people can probably tell the difference between 120 and 240. There is no 'retina limit', so demanding that one maintains such an imaginary number is a slippery slope.

60FPS is a standard because LCD refresh is set at 60Hz, yes. But many people find it to be quite smooth, and most people are happy with it.
 
I think when considering [H] reviews you have to go in answering a few questions, in order.
1- Who is the intended audience?
1.5- What does that reader want to know?
2- What information does [H] want to provide?
3- What information does [H] want to avoid?
4- What is the best way to show the differences between the cards?

Answering these questions will tell you what the review needs to contain to inform the intended reader of the information they want to know. IMHO, the answers should be as follows:

1- Enthusiast (~$1500+) gaming rig-builders who never put together a system that isn't overclocked; high-end gamers (systems up to $1500) who probably overclock some components, but not heavily (and only go water with an AiO solution); and last, members of those two groups interested in slightly non-standard or niche products (like low-profile high-end GPUs or mITX cases). Clearly, the second group will be the meat and potatoes of the [H] viewership. These three groups, who collectively I'll call the "audience," likely want a few things:
a- To read about the newest, shiniest hardware on the market.
b- To read about absolute performance and feedback regarding the newest shiniest hardware on the market.
c- To read about the relative performance and feedback about the newest and shiniest hardware on the market.
d- To learn about niche versions of the newest and shiniest hardware on the market.

2- Because the audience doesn't always know what they want to know about (e.g., the Phantom Console series, the Benchmarking the Benchmark series, etc.), [H] needs to understand the market and know what its audience needs to know before they need to know it. This includes some of the "softer" topics that allow [H] content producers to add editorial flavor and feedback based on their ability to spend far more time in the industry than us shmucks do. The recent Titan 4K article, for example.

But in terms of GPUs, I think that [H] at its core wants to educate the reader as to what products on the market represent the best value/performance in both absolute and relative terms.

3- I think it's clear that [H] generally is concerned with how well a GPU performs, not how well it spits out precalculated numbers and crunches raw numbers. [H] readers come here because, at the end of the day, the GPU-reviewing team has established a certain cachet, reputation and trust among readers and the industry, such that when Kyle says, "this GPU is fast but stuttery as hell because the drivers are borked," we can trust that it is true. Or when comparing cards, when he says, "A is better than B because it provides the best in-game experience," we know it's not a bought-and-paid-for review.

4- This, of course, is the key. How do you provide the necessary information in a palatable, consumable and understandable way, but also in a sufficient manner to give the whole picture?
The charts in the recent reviews were NOT very readable. I'm not sure how to fix them, but they were too convoluted, too data-dense, and not capable of easy consumption. I'm thinking you need to do some color coding or something to make it more readily apparent where the differences are and what they mean. So, color cells green where the max ("Max", "Ultra", etc.) setting is used, orange if a slightly lower setting is used ("High", "Very High", etc.), and red if anything below a "High" setting is used.
This will call out the text to the reader so that the differences can be grasped visually without having to actually parse the words of each cell (which should still be included for reference).
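
Something like this trivial mapping is all I'm picturing (a Python sketch; the tier labels are assumptions, since every game names its presets differently):

# Map a game's quality-setting label to a highlight color, per the scheme above.
# Tier names are assumptions -- games vary ("Epic", "Very High", etc.).
GREEN_TIERS = {"max", "ultra"}
ORANGE_TIERS = {"high", "very high"}

def cell_color(setting: str) -> str:
    tier = setting.strip().lower()
    if tier in GREEN_TIERS:
        return "green"   # highest available setting used
    if tier in ORANGE_TIERS:
        return "orange"  # one notch down
    return "red"         # anything below "High"

print(cell_color("Ultra"))   # green
print(cell_color("High"))    # orange
print(cell_color("Medium"))  # red
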
Also, I would switch to a horizontal format. Vertical is nice for mobile, I know. However, the human brain compares things better when they are laid out side by side rather than stacked as they are now.

All this is to say the CONTENT of the articles is there; it's just a matter of formatting.

And for the love of GOD do not change your evaluation methods.
 
A Dollar/FPS chart. The majority of us don't have unlimited funds to spend on playing video games. A good portion of our paychecks goes towards our mortgage, utilities, groceries, etc. Getting the most bang for your buck is the primary goal for many people when shopping for a new video card. A Dollar/FPS chart for video cards (similar to your Dollar/Performance Ratio charts in your CPU cooler reviews) would be welcomed.

I like that idea, in theory. There are a couple of problems with it. You're treating FPS as the only important thing for determining real-world performance, and [H] has been trying to say that that is not the case for the last 13 years. On top of that, you cannot directly compare FPS when using "best playable" settings as the basis of a review. I suppose you could use apples-to-apples for it, but that goes back to the issue of thinking that FPS is the only important thing. The other big issue is the prices themselves. GPU prices can change weekly, hell, sometimes daily. They could put all the work into a chart and then, within an hour of the review being published, a $20 shift in prices completely invalidates it. This is especially true when we're talking 4-6+ months from the launch of a card.
 
I like that idea, in theory. There are a couple of problems with it. You're treating FPS as the only important thing for determining real-world performance, and [H] has been trying to say that that is not the case for the last 13 years. On top of that, you cannot directly compare FPS when using "best playable" settings as the basis of a review. I suppose you could use apples-to-apples for it, but that goes back to the issue of thinking that FPS is the only important thing. The other big issue is the prices themselves. GPU prices can change weekly, hell, sometimes daily. They could put all the work into a chart and then, within an hour of the review being published, a $20 shift in prices completely invalidates it. This is especially true when we're talking 4-6+ months from the launch of a card.
Plus, with DX12, since the usual FPS capture tools don't work anymore, there's no way to do the math.
 
I love the reviews here and I check the site daily in order to compare the playable results vs just the raw numbers others throw up.

I'm not sure how to write this without it sounding confusing. :)

Sometimes I feel that too much weight is given to some fluff settings in the reviewer's thoughts/conclusions. Mainly GameWorks settings. With some of these GameWorks settings, card A may show an extra 5fps over card B. But if the game still can't be played with either card with the specific setting turned on, then I wouldn't expect the thoughts/conclusions to include it when stating why card A is better than card B. Heck, most of the time, on my 980, I can't even tell when some of these settings are turned on unless I'm standing in one spot, staring at the screen or taking screenshots. Standing around in one spot, staring and taking screenshots isn't playing the game.

I guess the confusing part of all this is the definition of a playable setting. If you have to stand in one spot and stare at the screen/take screenshots to see if a setting is working, then you aren't actually playing the game. If you aren't actually playing the game in order to see the difference, then is it truly worthy of making one card better than another?

We all love new tech and new settings and benchmarking those settings. I just think the weight that is applied to these settings for conclusion purposes is sometimes overdone, because in reality you can't tell some of these settings are even turned on while you are actually playing the game. :)
 
A Dollar/FPS chart. The majority of us don't have unlimited funds to spend on playing video games. A good portion of our paychecks goes towards our mortgage, utilities, groceries, etc. Getting the most bang for your buck is the primary goal for many people when shopping for a new video card. A Dollar/FPS chart for video cards (similar to your Dollar/Performance Ratio charts in your CPU cooler reviews) would be welcomed.

This may be asking a lot of the HardOCP staff, but a user-adjustable Dollar/FPS chart would be great, using stock or near-stock video card settings. That way, anyone could see just what kind of value they'd get from, say, a top-tier used card versus a slightly-under-top-tier new card. I know that Tom's Hardware (kinda) has this with their "Desktop GPU Performance Hierarchy Table", and TechPowerUp has the "Performance Per Dollar" section of their video card reviews, but combining the two, adding the MSRP for the stock cards, and adding a slider where the user can put in a price (and being able to add/input a stock card, or cards in the case of CrossFire/SLI) would be great. HardOCP could even put this feature in a separate section of their website that is updated with each new card review (as needed -- another slightly OC'd version of, say, a GTX 980 would probably not be needed).

The ultimate would be combining what I just proposed with GPUReview's "Compare Cards" feature and allowing users to plug in values for their stock or OC'd cards directly -- allowing very precise comparisons to be made. And if updated drivers/multiple driver versions could be factored into that, it's comparison heaven.

Such a user-adjustable chart should have a BIG caveat/sticky saying "YMMV," since the reviews will not always reflect actual performance/comfort levels (the user's rig may not be the reviewer's rig, and so on).

Probably all pipe dreams, but they would bring a lot of value to the HardOCP table.

HardOCP's current evaluation method is fantastic, however, and doesn't need to be downscaled. If anything, what I propose just adds a lot more work for the HardOCP staff -- and, hopefully, a big payoff in terms of greatly increased hits to the site and more ad clicks.
 
I read all HardOCP reviews but I find myself skipping through them a lot cause I don't feel a lot of the info relates to me. It might be good to see what the latest and greatest will do, but in the real world it doesn't apply much to me cause I'm not going to have that system anyway. I don't personally care for the apples-to-apples comparison. It might be pertinent to some to see what the latest and greatest system will do, but it doesn't really tell me how the game is going to perform on my system. Showing different resolutions and settings in direct comparisons leaves too much computing on my end to understand it. I always find myself going to other reviews that show more straight performance numbers on more video cards to find this info. HardOCP reviews seem more geared toward the Elite Gamer rather than gamers who still have good systems but don't have the bucks (or don't want to spend them) to be in the Elite crowd. Just my cent and a half.
 
My biggest problem is how you're always biased to whichever manufacturer is making the better performing video cards, instead of the manufacturer that's making the worse performing video cards.

<wait for it>

/s !
 
I generally love the format. The benchmarks are very detailed real-world breakdowns of performance.

What I don't love is the general lack of 1080p benching among high-end cards. Has everyone moved on from 1080p monitors? I get 1440 on some level, but 4k seems like a dumb upgrade given the minimal IQ improvement versus additional resources needed.

One other personal issue -- I'd rather know if I'm hitting 50-60+ as playable rather than 30-40 (I forget your specific cutoff). One reason I'll never go back to consoles is the 30fps madness. That's not an enjoyable experience.
 
I generally love the format. The benchmarks are very detailed real-world breakdowns of performance.

What I don't love is the general lack of 1080p benching among high-end cards. Has everyone moved on from 1080p monitors? I get 1440 on some level, but 4k seems like a dumb upgrade given the minimal IQ improvement versus additional resources needed.

One other personal issue -- I'd rather know if I'm hitting 50-60+ as playable rather than 30-40 (I forget your specific cutoff). One reason I'll never go back to consoles is the 30fps madness. That's not an enjoyable experience.

When I had a 1080p monitor, I found it pretty easy to look at the 1440p data and use it to make a pretty good guess about 1080p. Let's face it, it's super easy to run 1080p for the vast majority of games these days. There isn't a huge amount of value in taking a $600+ video card and going "Yep, it gets 100+ FPS at 1080p with settings maxed out." I'd find that rather boring, personally. For lower-end cards, like 970/390 grade and lower, I can see a purpose for 1080p numbers, but once you hit the high-end enthusiast and extreme high-end cards it's a bit silly. Heck, by the time the next generation of cards gets going we might see the 970/390 replacements being the sweet spot for 1440p and even lower-end cards pushing max or near-max settings at 1080p.
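
The kind of back-of-the-envelope extrapolation I mean is just pixel-count scaling (a Python sketch; real games rarely scale this linearly, so treat it as a ballpark only):

# Crude pixel-count extrapolation from 1440p results to 1080p.
# Real-world scaling depends on the game and where the bottleneck is.
def estimate_fps(fps_measured: float, res_measured: tuple, res_target: tuple) -> float:
    pixels = lambda r: r[0] * r[1]
    return fps_measured * pixels(res_measured) / pixels(res_target)

# e.g. a card doing 70 FPS at 2560x1440 lands somewhere near 124 FPS at 1920x1080
print(round(estimate_fps(70, (2560, 1440), (1920, 1080))))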

4K is the new Eyefinity/Surround. It's the next step in resolutions (even though I, personally, prefer ultrawide over 4K) and the next milestone for video cards. Nvidia and AMD keep pushing their cards as 4K cards because they realize there is excitement in the enthusiast community over it. Viable, single-card 4K is the thing people want right now.
 
On the target FPS subject:
60+ refresh rates can reveal CPU bottlenecks in a hurry with certain games; it'd be interesting to see driver comparisons there.


IMO on the frametime metrics:
numbers would be nice, but if a game is crap, you can see it. I'd take your word for it there.

I thought your state-of-drivers articles in 2012 were a neat idea, btw...
 
I see everyone talking about standard 16:9 resolutions. I wouldn't mind seeing ultrawide (21:9) resolutions like 2560x1080 being tested as well, but I understand it's not as popular (though enthusiasts do go after the not-so-popular standards).
 
Let's see. It is pointless to show data that doesn't change, because it gives us no insight and makes reading the rest of the data much more difficult. So, what I would do is only write about the data that doesn't change (i.e., "we found that X, Y and Z settings can be kept at W level throughout the cards and systems we tested") and graphically show the things that do change. This makes understanding the differences much easier, as your eyes get to see straight away what is what. In this particular instance the tables would be less than half the size. So not only do you reduce clutter, you simplify the understanding part of the review.
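
For what it's worth, splitting the table that way is trivial to automate; a sketch (Python, with an invented table just to show the idea):

# Sketch: split a settings table into the columns that never change
# (summarize those in prose) and the ones that do (show those graphically).
# The table contents here are invented for illustration.
rows = [
    {"card": "Card A", "Textures": "Ultra", "Shadows": "High",   "AA": "4x"},
    {"card": "Card B", "Textures": "Ultra", "Shadows": "Medium", "AA": "2x"},
    {"card": "Card C", "Textures": "Ultra", "Shadows": "High",   "AA": "2x"},
]

settings = [k for k in rows[0] if k != "card"]
constant = {s for s in settings if len({r[s] for r in rows}) == 1}

print("write as prose:", {s: rows[0][s] for s in constant})
print("chart these:", [s for s in settings if s not in constant])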


Totally supporting this: highlight the changes, avoid information that is repeated.

I know this is HardOCP, but I gotta say it... sometimes users won't upgrade the processor and have an old graphics card. In the cases where a relatively old CPU could bottleneck a more expensive GPU, the less expensive one would be the wiser buy. What I mean is that you could include a "what if I only upgrade the GPU on this 2-5 year old setup?" review section. Or how a mainstream PC would perform with a high-end GPU; we all know the GPU rules the FPS in games for the most part.

What I really like is the best playable settings for the most popular resolutions. It's something that really gives meaning to the consumer. Although in real life the consumer might have to tune down some settings, as they might not have the same CPU/RAM, we know that this does not matter much in some scenarios.

Overall, HardOCP has top-notch reviews; it is one of my best sources when looking for reviews.
 
Booth babes and alienated readers aside, it would be really convenient if there were a searchable collection of all your video card scores that could be compared across the wide range of video cards and games, since people really believe in your testing methodologies. I know that a lot of consumers would greatly benefit from a slightly streamlined way to shop for that shiny new GPU.
 
Your tested target resolutions are usually too high for the given video card. If a card can't crack 60fps with most settings maxed out (except AA) at a given resolution, then the card isn't good enough for that resolution with that game.
 
Your tested target resolutions are usually too high for the given video card. If a card can't crack 60fps with most settings maxed out (except AA) at a given resolution, then the card isn't good enough for that resolution with that game.

Unfortunately, resolution has more to do with the monitor's native resolution than the video card. I agree that a 60FPS minimum should be looked at, but if your monitor is 1440p, you should try to run the game in 1440p. If that means you've got to turn down settings, so be it.

Don't you even dare mention CRTs.
 
I would like to see the same video card tested at both 1080p and 1440p, as I feel these are the most commonly used. I know that you try to push cards to their max at 1440p and then assume that everything will work at 1080p, but I feel there may be some situations where that's not the case, or even that at 1080p things may not always be able to be "maxed out," so to speak. I hope that makes sense.

I really do appreciate the time and effort you guys take. Your reviews are always top notch and I always look to you guys first before a purchase.
 
I've never been able to embrace the "highest playable settings" reviews you guys adopted. Honestly, I read the [H] conclusions and summary, but I hit up another popular site for benchmark comparisons. It feels too subjective to me, and the graphs are confusing to interpret. I much prefer the "apples to apples" comparisons and bar graphs that other sites use, as it's much easier for me to make a simple comparison between multiple cards without having to also factor in exactly what settings you were using. That's not to say others don't appreciate the kind of testing you guys do, it just doesn't appeal to me. I hold your opinions and recommendations in high regard, and they hold a great deal of influence over which cards I choose to purchase, I just don't care for the testing methodology or trying to interpret the results.
 


Let's see. It is pointless to show data that doesn't change, because it gives us no insight and makes reading the rest of the data much more difficult. So, what I would do is only write about the data that doesn't change (i.e., "we found that X, Y and Z settings can be kept at W level throughout the cards and systems we tested") and graphically show the things that do change. This makes understanding the differences much easier, as your eyes get to see straight away what is what. In this particular instance the tables would be less than half the size. So not only do you reduce clutter, you simplify the understanding part of the review.

It's harder to do when there are a lot of video cards, but when the review has fewer video cards, I think some form of highlighting the highest settings when there is a change works well too:
[example image: settings table with the changed settings highlighted]
 
I wouldn't mind some video reviews. I know it'd probably be hard to convey the quality of the GPU's video output, but we might get a nice idea of the gameplay experience you're describing versus other cards.

I'm on board with this. I think it would be good even if it was just an overview and the meat of the article stayed where it belonged. Plus, if you want to show any kind of weird behavior (with FreeSync or G-Sync, or just anything really), it's much better to actually record it in video form than to explain it. For many people, seeing is believing.

Also, I think when a major driver update is released for Nvidia or AMD, something like AMD's Crimson drivers for instance, it might not be a bad idea to toss out an article about it now and again and how it compares to the company's previous driver overhauls. Updated articles to reflect how SLI/CrossFire support is being handled and whether it's scaling as it should, or whether either company is failing to keep updating profiles (looking at you, AMD). Considering how flaky SLI/CrossFire support can be, it would be great if you showcased it across the currently most popular games. Lots of people on the forums are always asking whether they should SLI/CrossFire, and the response changes now and again. More information for the consumer = better buying habits, the right messages sent to the companies responsible, and more traffic for you guys.

Otherwise your general review structure is pretty solid.
 
I enjoy reading the reviews here but this thread made me realize that they don't influence my opinion on GPU performance whatsoever. The methodology and conclusions are simply too subjective to be useful.

What exactly does "playable" mean and what guarantee is there that the reviewer's opinion will be shared by anyone else?
 
The toughest part about these tests is that they are virtually no help to anyone on a modest income.. who works more than 40 hours a week to pay the bills.. and really doesn't have the disposable income to continue playing the infantile "keeping up with the Joneses" proliferation game that BOTH major companies depend on to survive and profit.

I would like to know how much of an improvement, and how much it's going to cost me, to get a significant result by making an upgrade, and from what level it is recommended. I'm running on 4 year old tech, which in product cycles is practically stone knives and bearskins to this crowd.... so yeah.. these articles aren't terribly useful any more. Hmm.... let's see... drop $1200 on a pair of video cards to get an increase of 2% in frame rates... or get the thumping in the Jeep fixed and pay the mortgage this month.... decisions decisions....

Some of you guys know me.. I've been posting a long time... some of you know exactly what I'm talking about.
 
The toughest part about these tests is that they are virtually no help to anyone on a modest income.. who works more than 40 hours a week to pay the bills.. and really doesn't have the disposable income to continue playing the infantile "keeping up with the Joneses" proliferation game that BOTH major companies depend on to survive and profit.

I would like to know how much of an improvement, and how much it's going to cost me, to get a significant result by making an upgrade, and from what level it is recommended. I'm running on 4 year old tech, which in product cycles is practically stone knives and bearskins to this crowd.... so yeah.. these articles aren't terribly useful any more. Hmm.... let's see... drop $1200 on a pair of video cards to get an increase of 2% in frame rates... or get the thumping in the Jeep fixed and pay the mortgage this month.... decisions decisions....

Some of you guys know me.. I've been posting a long time... some of you know exactly what I'm talking about.

That would be monumental work, even if done for just one game. There are far too many upgrade paths (upgrading between the 750 Ti and 980 Ti already involves 25 combinations, e.g. 750 Ti to 950, 750 Ti to 960, etc., let alone AMD to Nvidia and vice versa).

I'd say one would be better off learning how to extrapolate using known benchmarks and rough relative performance to make that decision. That's how I ended up extrapolating from the 570 to the 970: using benchmarks from the 600 series (which usually include 500-series cards for comparison), then the 700 series (ditto for the 600), then the 900 series (ditto for the 700) to work out the rough relative performance between the 570 and 970.
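
The extrapolation is just chaining relative-performance ratios; a Python sketch with placeholder numbers (pull real ratios from reviews that overlap two generations):

# Chain rough relative-performance ratios across generations to estimate
# an upgrade you can't find benchmarked directly. All ratios below are
# invented placeholders, not real review numbers.
ratios = {
    ("570", "670"): 1.35,   # from a 600-series review that included the 570
    ("670", "770"): 1.05,   # from a 700-series review
    ("770", "970"): 1.40,   # from a 900-series review
}

estimate = 1.0
for step in ratios.values():
    estimate *= step

print(f"570 -> 970: roughly {estimate:.2f}x")  # ~1.98x with these numbers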
 
Personally, I think 4K is awesome, and 60 FPS is the standard, not the unanimous ideal. The only people who are worried about arbitrary framerates in between 60 and 120 are CRT truthers, and this is a hardware enthusiast forum, not an antiques dealers' forum.

I can personally tell the difference between 90 and 120FPS. Some people can probably tell the difference between 120 and 240. There is no 'retina limit', so demanding that one maintains such an imaginary number is a slippery slope.

60FPS is a standard because LCD refresh is set at 60Hz, yes. But many people find it to be quite smooth, and most people are happy with it.
Before dual-link DVI monitors capable of 120Hz to 144Hz came out, hardware enthusiasts were overclocking their LCDs to increase refresh rate. One of those enthusiasts is a member of this forum and even designed a device to help people do this: No-latency overclockable (up to 1080p@~270Hz) monitor controller board

I'm sure you can feel the difference between 90 and 120FPS, but human eyes are incapable of seeing any difference in frame rate above 80FPS. This is why 80FPS should be the standard, not 60FPS. It's not ideal because any increase in framerate can be utilized, especially in 3D applications and VR.

Many people find 30FPS to be quite smooth, and cinematic. Doesn't make them right.
 
You guys only use one CPU. I expect to see everything from Via C3 to AMD Geode to Intel Skylake @ 5GHz, for all video cards tested. Also, on WinXP, Win7, Win8.1, Win10, and Linux. Then re-test on low, low-med, medium, med-high, high, high-ultra, ultra, and ultra+texture-mod settings. With 1, 2, 3 and 4 cards installed. Vary the RAM in the machine with 4, 8, and 16GB, first with single channel, then with dual channel. But be sure to test the difference between 1333, 1600, 1866, 2133, 2400, 2866, and 3200. If you have a platform that can switch between DDR, DDR2, DDR3 and DDR4, that would be even better. Double-check to make sure there are no notable changes from a 450W to a 1500W power supply. Summarize all data in one easy-to-read chart, and post the review for free perusal, so we can whine, bitch, and moan.

No, but really, I suspect big-title games have "ultra" settings that are more for pushing hardware sales, without changing the visual on-screen quality much at all (or just applying random filters that make the game look different, but not necessarily any better). Any time I see a review that uses high versus the top settings, I check the charts with keen interest. This isn't really specific to [H], just my pet peeve in general with graphics reviews. [H] already varies the quality settings to achieve the desired gaming frame rate.
 
The only thing I have an issue with is that how I like a game to feel smooth and be a good experience may differ from whoever does the review. Changing settings based on how smooth the game feels to the reviewer might not match how it feels to me or other people.

So I think maybe move away from that model in the future. I want to see a review that maxes out details on all video cards regardless of settings (Crapworks) and tries to get 60FPS, period. Anything dipping below that is not smooth. That is [H]!!! I would also like to see more 120Hz testing as well.

I would also like to see older cards in the reviews as well. So many people say Kepler is now slower in newer games. Well, I think from now on the reviews need to include a flagship card from the previous generation to compare to the new generation.

I want to see if it's true that a 780 Ti or Titan really is faster or slower than a 290/290X in newer games.
 
Honestly? I think you folks have carved out a niche for yourselves as far as the reviews are written (occasional typos and grammar mistakes aside =P), and what they present... and you generally try to be impartial.

My main beef (not sure if this is what you want here, though):
DATA!!! RECORD IT ALL. PUT IT SOMEWHERE. LET ME QUERY IT!

Like, just take the data on all the various findings you collected throughout, put it somewhere (a database would be nice), and have some easy way to get to it. Yes, perhaps even the playable-settings graphs and such. Clock rates vs. settings... Maybe even link it to HoverHound; you're in a unique position because you have such a tool that YOU are supporting. Because at the end of the day, when I'm trying to figure out what card is right for someone, I need to do a performance comparison between many past and present options versus their price tags. Being able to query past results to do comparisons is something that keeps me coming back to Anand:
GPU 2015 Benchmarks - Compare Products on AnandTech

That being said, I think you folks can go further than they did. Honestly, to some extent I would prefer to be able to just take all of the raw data, exported in XLS format, and do various pivot tables on it to get to one answer. It'd also be really cool to see how performance changed on the same card with driver updates. Or maybe some kind of JSON-format data API for people that want to create their own views. Stuff like that, to me, would be awesome.
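
To be concrete, something as simple as one JSON record per benchmark run would do; the field names below are my own invention, not anything [H] actually publishes:

import json

# Hypothetical record format for such an API -- all fields are assumptions.
record = json.loads("""
{
  "card": "GeForce GTX 980 Ti",
  "game": "Example Game",
  "driver": "364.72",
  "resolution": "2560x1440",
  "settings": {"Textures": "Ultra", "Shadows": "High"},
  "avg_fps": 78.4,
  "review_url": "https://example.com/review"
}
""")

# With records like this, readers could pivot and filter however they like:
print(record["card"], record["avg_fps"], "FPS at", record["resolution"])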

That's just me though. I kind of just skim reviews and then jump to the numbers. From the looks of it, many people in this thread are of the same opinion. Giving people more of what they want (though I'm not sure if you'll go as far as what I want) can rarely be wrong. Also, maybe get someone to do some responsive JavaScript tables and charts for you. I'm not sure why people are still printing out images of bar charts and such when there are good plotting libraries out there. Well, I guess it reduces the number of people doing img tags and linking the chart from across various domains... I'm sure that can be fixed though.
 
First things first, [H] is the only place on the web that I trust when it comes to video card reviews.

I also know that [H] is a site for enthusiasts, but I'd still like to see comparisons to somewhat older cards, like a 770, to establish a baseline. I mean, 1080p is still the standard resolution for most people, so seeing the cards perform at this resolution would add immense value, I'd wager.

but human eyes are incapable of seeing any difference in frame rate above 80FPS.

Cease the peasantry. Eyes and brain are capable of perceiving frame differences in the range of 200+ fps. Anyone who games regularly will be able to tell 80fps and 144fps apart.
 
GPU 2015 Benchmarks - Compare Products on AnandTech

Not a carbon copy or anything, but something which can pull up an easy comparison would be nice. If you start a new list each time the base rig is updated, a table like this would be good for easily a year.

I can't be the only person who thinks that an at-a-glance comparison of at least SOME apples-to-apples numbers has value. Look, I just compared a 290X to a 980 Ti. Why? Because I can. No, drivers won't always be the same for a card released and reviewed in May 2016 versus November 2016, but that'll just have to be one of the asterisks down below. Each entry in the chart could be a link to the article where the test and review was completed, and users could see the exact methodology of each run.
 
Didn't read through all of the replies, so I apologize if this has been mentioned already. I've always wished that whenever [H] does a new-gen GPU review they would include a few older equivalent reference cards; that way peeps can have a more exact idea of what an upgrade will net them. I know the reviews generally exist and we can go find them separately, but it would be so fantastic if the data was right alongside the current gen.

Well then... Just read the post immediately above and had not seen that on Anand before. It's literally exactly what I was looking for.
 
GPU 2015 Benchmarks - Compare Products on AnandTech

Not a carbon copy or anything, but something which can pull up an easy comparison would be nice. If you start a new list each time the base rig is updated, a table like this would be good for easily a year.

I can't be the only person who thinks that an at-a-glance comparison of at least SOME apples-to-apples numbers has value. Look, I just compared a 290X to a 980 Ti. Why? Because I can. No, drivers won't always be the same for a card released and reviewed in May 2016 versus November 2016, but that'll just have to be one of the asterisks down below. Each entry in the chart could be a link to the article where the test and review was completed, and users could see the exact methodology of each run.

Considering I posted the very same link just two posts up, no...
 
I really like the power consumption section. It's always the first part of the review I go to. It's the biggest reason why I haven't used AMD CPUs and graphics cards in years.
 