FrgMstr
Just Plain Mean · Staff member
Joined: May 18, 1997 · Messages: 56,344
I was just reading what Ryan wrote over at PCPer yesterday and the following statement very much got my attention.

PC Gaming Shakeup: Ashes of the Singularity, DX12 and the Microsoft Store | PC Perspective

Benchmarking is likely going to see a dramatic shift with the move to these app-style games on Windows 10, as the sandboxed nature will keep anything from “hooking” into the executable as we have seen in the past. This means that overlays, like Fraps, EVGA Precision X, MSI Afterburner and even the FCAT overlay we would like to use for our capture-based Frame Rating testing, are kind of at a standstill. Measuring the performance of each game will necessitate the game developer writing an in-game benchmark mode that exports the kind of information that we want to see measured, and that we trust them to do it correctly, and that it will properly represent the experience the user sees.

I have to say that deep down, one part of me would love to see this happen industry-wide. It would basically mute a lot of voices in the industry that only bark about data points. It is sad to think that we NEED benchmark numbers to write video card and game performance reviews. While I understand that video card reviews with zero framerate data are something far left of where the industry is at now, it is interesting to see video card reviewers basically tell us that they cannot do their jobs without these tools. Framerate is not the gaming experience.

The fact is that it has been my contention for a long time that we at HardOCP could possibly write a video card review with zero framerate data and give you as good a conclusion, if not a better one, as it pertains to actually playing games with the video cards in question. However, drawing those conclusions without "facts" is not acceptable today. (I have suggested for years that motherboard benchmarks in reviews are worthless, and I still believe I am 100% correct in that.) The fact is that framerate data is not needed to figure out if a video card supplies you with a great gaming experience. In some ways, you could argue that framerate data tends to overshadow what we should really be focused on when buying a video card...the overall experience it is going to provide you with.
 
The problem is, framerates (plus frame timing, framerate consistency, etc.) are the only objective results a reviewer can show. Beyond that, everything is subjective. While that doesn't tell the whole story, for sure, it does give a pretty clear indication of relative performance from one card to the next.

And while subjective data is a wonderful thing, it doesn't mean a whole lot without the objective data to support it. The way you guys do video card reviews, with a balanced mix of both, is a uniquely beautiful thing. But if you just up and stopped providing the objective data, I guarantee the readership would drop off drastically.
 
The problem is, framerates (plus frame timing, framerate consistency, etc.) are the only objective results a reviewer can show.
Agreed. But some will tell you that 25fps in Game X is sufficient for a good experience where 60fps is needed in Game Y for a good gaming experience. And we have seen exactly those scenarios. At that point, what is the data worth when it truly comes down to the actual gaming experience? And this overall is exactly my point. (And I am glad that this was the very first sentence in the very first response on this.) Even "objective" data is somewhat meaningless. This is exactly one of the reasons (not the primary reason) why we did not publish FCAT data.
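
To put a number on that point, here is a minimal Python sketch (the frame-time values are hypothetical, invented purely for illustration): two one-second captures that produce the same average FPS yet would feel completely different to play.

```python
def avg_fps(frame_times_ms):
    """Average FPS over a capture: frames rendered divided by total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# Hypothetical one-second captures, 60 frames each, both summing to ~1002 ms.
smooth = [16.7] * 60                  # steady ~60 fps, every frame on budget
stutter = [11.0] * 55 + [79.4] * 5   # mostly fast frames plus five long hitches

print(round(avg_fps(smooth), 1))     # 59.9
print(round(avg_fps(stutter), 1))    # 59.9 -- identical "objective" number
print(max(smooth), max(stutter))     # worst frame: 16.7 ms vs 79.4 ms
```

Same chart bar, very different game feel, which is the whole argument in miniature.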
 
But if you just up and stopped providing the objective data, I guarantee the readership would drop off drastically.
Hehe, do you think I do not know this? Do you still see the data there? Even in motherboard reviews, where it is "useless." :)
 
Oh yeah, for sure. The subjective data is needed to determine what makes for a "quality experience," and the objective data is needed to determine how much better one card is than another.

But that is the point. The 25fps data supports our subjective conclusions if that is what we tell you is needed for a good gaming experience. So while that data is objective and can be measured and re-measured, it is still actually subjective as to its true value in the gaming experience.
 
I completely agree -- motherboard benchmarks are entirely worthless; 90% of the performance of a given board is going to be from its chipset, and that's almost completely controlled by Intel or AMD. Barring incredibly old drivers or some serious misconfiguration they're going to work similarly in the real world, with rarely more than 1% difference in performance across boards with a given chipset. I do not care that ASUS boards run ~20 FPS higher at 640x480 in Doom 2 (or whatever it is) than Gigabyte or MSI. It will never matter. Focusing on what actually matters to the end user and no longer wasting our time catering to the small but vocal subset of readers who will scream "but numbers tho!" is a completely positive thing. The only thing that worries me about DX12/sandboxing is that I'm afraid companies may decide to go the unscrupulous route and cook the books, so to speak. If a game reports 60FPS and you're getting 55, or 50... who's to know?
 
I'd be perfectly fine with the removal of framerate data. To be honest I have been ignoring it for some time now, going directly to the impressions. With G-Sync it really doesn't matter what the exact framerate is from second to second. All I want to know is if the game looks good, runs smoothly and plays well using a given set of configurations.

Extra thought
I can tell you one thing, though: people wouldn't take this change if it happened at [H] nearly as well as they did when the focus moved away from apples-to-apples benchmarks ;).
 
I would be kinda sad to not see framerate data. Sure, I want the game to run smooth. But anything less than 60FPS is not good enough for me. I know some single cards can't even run a game at 60fps at 1080p, but this is what I want to know.

To me 40fps is too damn slow. 60fps or bust.

canned benchmarks need to die.
 
The only thing that worries me about DX12/sandboxing is that I'm afraid companies may decide to go the unscrupulous route and cook the books, so to speak. If a game reports 60FPS and you're getting 55, or 50... who's to know?
Oh you KNOW that will happen. Just one more nail in the coffin of framerate data at that point.

To be honest I have been ignoring it for some time now, going directly to the impressions.
You are the first person I have seen say that. To be honest, I pay no attention to it when I edit documents outside of checking to make sure the data is correct. I have not used framerate data in a long time when it comes to formulating an opinion on the hardware we review.

I would be kinda sad to not see framerate data.
It would certainly be different.

canned benchmarks need to die.
AGREED! AoS is showing us just how badly these benchmarks need to go away. AMD is building drivers to win benchmarks again, and that is NOT a good direction. But we have a whole other thread about this, so let's keep this conversation on topic.

What you guys might feel is enough for a good gaming experience might not fit the bill for someone else, though.
Agreed, but that has NOTHING to do with benchmark numbers.

A hardcore FPS gamer that is used to a 144Hz setup might find 25fps choppy and unplayable.
We have had games that are "unplayable" at 60fps with certain configurations, but yes I get your point about the hardware twitch shooter guys, and quite frankly I do not think HardOCP content has been truly geared at that reader for 10 years.
 
Wrt motherboard data, I think it's just to spot any lemon boards.

Maybe video card drivers could log frame rates in the future.

My gsync screen has a built-in FPS bar on the left. There must be a way to scoop fps off the DP cable, at least for a variable sync setting that maintains the minimum ... which is not ideal of course.
 
This thread is NOT about AoS, please take that conversation somewhere else. Please stay on topic.
 
When I buy a game, the first thing that I do is to make a benchmark scenario with the game's opening (if the game doesn't have a benchmark). I then tune the settings to what's playable to me. At no time do I even consider the framerate (unless the in-game benchmark forces it on me). I'm looking for stuttering (turn off Vsync), then tearing (turn on Vsync, lower settings) until I get it right.

Of all the games that I played in the last year, Need for Speed Rivals was the one I sunk the most hours into (90+ according to Origin). At no time did the constant 30fps significantly bother me. It did surprise me at first, but I adapted real quick and simply enjoyed the game.
 
Of all the games that I played in the last year, Need for Speed Rivals was the one I sunk the most hours into (90+ according to Origin). At no time did the constant 30fps significantly bother me. It did surprise me at first, but I adapted real quick and simply enjoyed the game.

I've noticed that driving games are like that. Generally no sharp direction changes (like twitch gaming), so even with lower but constant fps, the game play still feels adequately smooth enough.
 
Wrt motherboard data, I think it's just to spot any lemon boards.

Maybe video card drivers could log frame rates in the future.

This. I don't expect a certain board to outperform others, but I would worry that there could be a lemon model.

I don't think we could trust the video card companies to be honest.
 
Wrt motherboard data, I think it's just to spot any lemon boards.

Maybe video card drivers could log frame rates in the future.

My gsync screen has a built-in FPS bar on the left. There must be a way to scoop fps off the DP cable, at least for a variable sync setting that maintains the minimum ... which is not ideal of course.
100% motherboard benchmarks are to find out what is wrong, rather than what is right. They still serve a purpose, but not really one worthy of making graphs about unless there are issues.
 
I haven't given a graph a serious look since the GPU release pace slowed from being a new product every six months...which has been a long time. As a few others have said, I mostly read the intro; the new tech in the hardware; and the conclusion. If it's a couple flagship cards that are comparatively close, I may check the apples-to-apples graphs and one or two best settings graphs but that's about it. I'm more interested in the subjective experiences of the staff since they have been doing this for a long time and know what 'feels' right and what doesn't. What about VR reviews, is that in the pipe at all?

What is AoS?
 
i'm not sure what the answer is here, but i can share what i have found helpful from reviews (not just here) and why:

side by side image quality has been very helpful. i often find the differences are very subtle and i have trouble finding them just from an image. maybe use video to this end? i can see split screen 1080p (one card on each half) being fun.

power, temperature, noise and overclocking will remain unchanged. the big thing is how smooth my gameplay is. some cards/drivers often give little hiccups in certain games. some focus on that end might help.

unfortunately framerates are important to some degree. i even find straight up 'apples to apples' helpful at times when cards are very close in performance. for myself at least, i'm not just looking for the best visual experience i can get but i'm also looking down the road. i like to keep my hardware until it's firmly beaten into the ground. for that reason, raw performance and not just say how one card handles AA better than the next is a factor. i'll happily turn down or turn off some detail settings if it means i can get my video card through another 6 months.

but given how things are changing with benchmarks...you can't trust the developers. someone will fake it. maybe some kind of external equipment could be devised? something along the lines of a high speed camera but i'm not too sure how.
 
Ha! Sorry, damn autocorrect.

*Frame Times in 2D graph-form are a FANTASTIC way to get a really good feel for how a game runs. It's not the whole story, but it sure is a HELL of a lot more than FPS alone. It's also a much MUCH more accurate representation, as 'FPS' measures how many frames are packed into one second, which seems silly if your graph has a resolution higher than 1 second: the frametimes are measured in native milliseconds and then converted into an FPS 'estimation'. A single frame is not rendered in '60FPS', it's rendered in 16 ms; it just so happens that you could fit 60 of those frametimes in one second. Frametimes, by contrast, measure the actual delay between frames, illustrating hitches, smoothness, and stuttering.
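
That arithmetic can be sketched in a few lines of Python (a toy illustration with made-up values, not anyone's actual tooling): converting a frame time into its FPS 'estimation', and flagging the individual hitches that a one-second average smooths over.

```python
def to_fps(frame_time_ms):
    """FPS 'estimation' for one frame: how many frames of this duration fit in a second."""
    return 1000.0 / frame_time_ms

def hitches(frame_times_ms, budget_ms=16.7):
    """Indices of frames that blew past the frame budget (~60 fps target here)."""
    return [i for i, t in enumerate(frame_times_ms) if t > budget_ms]

trace = [16.0, 16.5, 48.0, 16.2, 16.4]   # one 48 ms spike in an otherwise smooth run
print(to_fps(16.0))      # 62.5 -- a 16 ms frame "is" ~60 fps
print(hitches(trace))    # [2] -- the stutter a one-second average would hide
```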

I see it a LOT in [H] articles "The game just didn't feel right on X card, even though it benchmarked higher". Measuring and representing frametimes means that highly subjective statement can be quantified into an objective, scientific value: Which is really what benchmarking is ALL ABOUT, amiright?

No, playing a game is a FANTASTIC way to get a really good feel for how a game runs. You do not need a graph for that at all.

HardOCP and our reviews are the reason NVIDIA developed the FCAT frame time software. We were asked to be part of that from the get-go and we declined. Ryan at PCPer took up that mantle.

I would suggest that FCAT, much like framerates though to a lesser degree, is also worthless. FCAT is a good tool to use to show your readers the "problem" in data form...but at that point, if you use real gameplay, you ALREADY know there is a problem. (Now if you run canned benchmarks, you would have no idea, but I am not tackling that argument since it does not impact HardOCP.) My take on FCAT is, if I already know I have a problem, then I do not need a tool to quantify that for me, because my real world gaming experience has already told me there is an issue. Yes, some like a data point attached to that, but I decided against adding another data point to GPU reviews a long time ago. No one ever played a game and went, "This game played awesome, but these FCAT numbers are shit so this video card gets a bad review!"

I have always seen FCAT numbers as somewhat of a waste of time, and the fact is that they are extremely resource intensive to collect.
 
How could a game be tested for responsiveness and lag (input or video) other than as FPS? (I agree fps is largely useless)

What other things could we look at, point at, mark, score, or rationally and unbiasedly discuss?

Other than with artificial, controlled input to mouse/keyboard could we even test responsiveness?

I play a lot of MMO/RPG type games. After I jump into the world almost all of them will choke big time if I try to spin in place (the first time anyways) but give them some time and they are ok.

How would Kyle or anyone measure that and put it on a review?

People want numbers <==> Numbers Lie

I am curious what other systems we could employ, cause I agree that just posting a bunch of FPS graphs doesn't give much useful data.
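
One hedged answer to the "how would anyone measure that" question: log per-frame times while performing the test maneuver (the spin-in-place after zoning in, for instance) and report the longest run of over-budget frames. A rough Python sketch, with an entirely hypothetical capture:

```python
def worst_stall_ms(frame_times_ms, budget_ms=16.7):
    """Longest run of consecutive over-budget frames, summed in ms --
    a rough proxy for the worst visible choke in a capture."""
    worst = current = 0.0
    for t in frame_times_ms:
        current = current + t if t > budget_ms else 0.0
        worst = max(worst, current)
    return worst

# Hypothetical capture: spinning in place right after zoning into the world.
capture = [16.7] * 30 + [120.0, 95.0, 80.0] + [16.7] * 30
print(worst_stall_ms(capture))   # 295.0 -- roughly a third of a second of stall
```

A single number like "worst stall: 295 ms on the first spin" would communicate the choke far better than an FPS average ever could.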
 
You know, an interesting experiment might be to do some "blind" tests. Have a number of testers/reviewers play the selected games, but have a different individual handling configuration of both the hardware installed and software settings. The reviewers would record their impressions before finding out what the configuration and settings are. I imagine some of the testers might be surprised at the results without "data" to color their impressions.
 
Isn't the whole purpose of a benchmark to collect performance data and present it to your readers?

You can type whatever you want in your benchmarks, but if you don't provide the hard data or a source for your hard data you are providing your readers with nothing. It would be like writing a thesis paper without any sources. You DO need benchmark data to review a video card. That is your evidence, the evidence that you put a card through its paces. The evidence that this card is objectively better or worse than another card. The evidence that readers use to base purchasing decisions on. You would be eliminating the validity of your own benchmarks by omitting this information. You could play a game for 20 minutes and say, "Well, it ran fine". That means nothing to a reader. Hard data should be presented to the reader and it should be their decision if the frame rates shown are "fine" or not. When Batman was released some people had no issues with the low frame rate. Some people said it was an unplayable mess. If you did a Batman benchmark without frame rate data, no matter if you said it ran fine or it ran like crap, you can't decide that for individual people.

If you want to write articles about hardware similar to your "4 Weeks with NVIDIA TITAN X SLI at 4K Resolution" article, you have every right. But a benchmark without data is not a benchmark. It is not a "sad thing" as you put it, it is the reality and a requirement.
 
Framerate data is crucial for comparison purposes. It isn't everything, but without it a review is of limited value.

A review NEEDS an objective measure. Without it any review is just a subjective mess. Subjectivity is the devil.

If - however - you can turn subjective opinions into an objective measure (like frame time, jitter, etc.) then this information is very useful too, but conclusions NEED to be made on objective grounds, or the review isn't worth even the time spent reading it.

The editor's opinion on their experience IS valuable, but it is valuable in order to add to the objective data, not replace it.
 
Isn't the whole purpose of a benchmark to collect performance data and present it to your readers?

HardOCP has not run benchmarks for more than a decade. We have simply shown framerate data taken from real world gameplay and given our opinions on the video cards' performances in games based on the actual gameplay. The framerate data is just there to show support for our conclusions. Which is back to my point, do we really need that data at all? And is it meaningful?

It would be like writing a thesis paper without any sources.
This is sort of what I feel like we might be doing now. People know the references are there, but they don't look at those, as a couple people above have said, which is something I really did not think about till today as actually happening.

You could play a game for 20 minutes and say, "Well, it ran fine".
Well, we could, but we would never do that.

When Batman was released some people had no issues with the low frame rate. Some people said it was an unplayable mess. If you did a Batman benchmark without frame rate data, no matter if you said it ran fine or it ran like crap, you can't decide that for individual people.
I have never had any issues running Batman, so I could have shown you a fancy graph of it running great on my machine, and it still would not have changed the reality of everyone else's situation. Again, a graph that means nothing...
 
A review NEEDS an objective measure. Without it any review is just a subjective mess. Subjectivity is the devil.

If that's the case, why do reviews at all? Just post a chart with framerate data and call it a day. "Subjectivity" often includes the context needed to make sense of the data.

Also, if the original point regarding benchmarking being fundamentally changed in Win 10 is accurate, most users would be ill-equipped to determine whether or not the data provided is accurate.
 
This whole debate, for me, boils down to getting back to the basics.

The goal of more expensive video cards is better performance and features that allow me to turn up the image quality settings in my game at the resolution I play at and have a smooth and responsive experience as I play. I buy faster video cards to max out the graphics and play new demanding games at great performance, that is the goal.

If that is the goal, then the only way to quantify which video card is "better" is to evaluate which one allows me to turn on and up more graphics settings at a specific resolution. The video card that allows me to run the game with higher settings, and smoother responsiveness, wins.

In that way, you can then compare different video cards, find out the image quality that they allow in games, and compare them by price, the one with the lowest price, and highest image quality settings shows you the best value.
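
That comparison method can be expressed as a tiny sketch (the card names, prices, and "settings rank" below are invented for illustration): find the highest playable settings tier, then take the cheapest card that reaches it.

```python
# Hypothetical evaluation records: highest playable settings tier found per card, plus price.
cards = [
    {"name": "Card A", "price": 650, "settings_rank": 5},  # higher rank = more eye candy playable
    {"name": "Card B", "price": 450, "settings_rank": 5},
    {"name": "Card C", "price": 300, "settings_rank": 3},
]

# 'Best value' per the method above: among cards tied at the top settings tier, cheapest wins.
best_rank = max(c["settings_rank"] for c in cards)
best_value = min((c for c in cards if c["settings_rank"] == best_rank),
                 key=lambda c: c["price"])
print(best_value["name"])   # Card B
```

Note the framerate never appears; the "data" is the settings each card sustained during real gameplay.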

It is, in fact, what we've been doing for a very long time now. If that is the goal, and that is the answer, framerate numbers are irrelevant. It all depends on what you are trying to show and what your reasons are for purchasing a video card.

The catch is, this method of evaluation is among the hardest and most time consuming to do; it requires hands-on game playing and experience over time playing games on many graphics cards. The longer you play, and the more cards you test, the more that experience goes toward relating to readers which card is best for gaming and how they compare; you get a real good feel for how video cards compare. I personally have been doing video cards since the early-mid 90's, and that's a lot of experience that has taught me a lot about how video cards perform in games.

We've learned this over time: framerates lie. They just do. What you see as a framerate, and what you feel while playing (responsiveness and smoothness), are not the same thing. If framerates lie, then they are not as objective as people want them to be.

I think we all want some kind of empirical evidence, simply because that's how things have been done since the beginning, and most of us are very scientifically minded. However, science isn't necessarily the best "benchmark" for analyzing the gameplay experience.
 
While I understand and appreciate Kyle's perspective (and in many ways agree), benchmarking data is immensely useful for evaluating relative performance (my system at home vs the evaluation system). I am probably in a very small minority, but as an overclocker, I derive an inordinate pleasure from squeezing every last bit of performance out of my system. This is independent of my experience actually playing the game - as noted, a qualitative assessment by a trusted reviewer is more than enough to convince me that Card X is capable of playing Game Y.

I probably spend more time benchmarking than I do actually playing games. I know this is strange, but given that there are threads on this very forum devoted to the topic, I can only imagine that there is a desire to see FPS/benchmark data if only to provide you a point of reference.
 
VooDooPC - Do you read our reviews here now and if you do, do you let those influence your GPU purchasing decisions?
 
While I understand and appreciate Kyle's perspective (and in many ways agree), benchmarking data is immensely useful for evaluating relative performance (my system at home vs the evaluation system). I am probably in a very small minority, but as an overclocker, I derive an inordinate pleasure from squeezing every last bit of performance out of my system. This is independent of my experience actually playing the game - as noted, a qualitative assessment by a trusted reviewer is more than enough to convince me that Card X is capable of playing Game Y.

I probably spend more time benchmarking than I do actually playing games. I know this is strange, but given that there are threads on this very forum devoted to the topic, I can only imagine that there is a desire to see FPS/benchmark data if only to provide you a point of reference.

I agree, and sometimes I go through phases. Sometimes I get a bug-up-the-ass and I go serial-killer-crazy trying to clock the Hell out of my rig. Other times I REALLY just want to sit back and enjoy games.

It's like ANY hobby: People enjoy building and painting their RC planes as much as flying them.
 
I guess, for me, I've based most of my video card decisions on [H] reviews. I'd have a gut feeling that 1/3 of my choice is based on framerate charts and about 2/3 of it on the subjective 'feels' experience along with price. I also know I don't game higher, now, than 1600 and don't need all settings cranked. I don't think I'd really miss frame graphs. I suppose, if it's possible in the future with DX12 and tools developed that worked, you could pick out specific cases where drops occurred or anomalies showed up. Again, assuming tools are available that work. And, I suppose, someone out there somewhere will have graphs posted if one were inclined to see them. Just seems like DX12 might turn all of our high dollar hardware into Xboxes down the road.
 
If that's the case, why do reviews at all? Just post a chart with framerate data and call it a day. "Subjectivity" often includes the context needed to make sense of the data.

Also, if the original point regarding benchmarking being fundamentally changed in Win 10 is accurate, most users would be ill-equipped to determine whether or not the data provided is accurate.

No, as I stated, the experience and opinion of the reviewer IS important, especially as tests become more difficult to interpret.

I agree wholeheartedly with the [H]'s long standing review philosophy that real world performance is what matters. Canned benchmarks are useless at best, and outright deceiving at worst.

I just disagree with the sentiment that the data doesn't add value. The problem with subjective opinion is that everyone experiences things differently.

I like the methodology that exists, where framerate tests (and if significant, other tests, including frame time, jitter, etc) are the most significant, but if the reviewer runs these, and something seems off, that's when we need them to dig deeper.

Eliminating the objective basis of the review would - IMHO - just harm it.

Just because a difference in performance between two cards is not perceptibly different to the reviewer does not mean that this will be the case with all, or even most, of the readers. If these disagreements we have on here all the time conclude anything, it is that different people have different criteria, and different things they notice, are sensitive to, or completely miss. Having an objective basis for the review - even if that objective basis is imperfect - at least helps frame the realm of discussion. Without it we are just discussing the "soft" stuff you can't pin down.

Obligatory Lord Kelvin quote:

"In physical science the first essential step in the direction of learning any subject is to find principles of numerical reckoning and practicable methods for measuring some quality connected with it. I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind; it may be the beginning of knowledge, but you have scarcely in your thoughts advanced to the state of Science, whatever the matter may be."
 
You know, an interesting experiment might be to do some "blind" tests. Have a number of testers/reviewers play the selected games, but have a different individual handling configuration of both the hardware installed and software settings. The reviewers would record their impressions before getting to find out the configuration and settings are. I imagine some of the testers might be surprised at the results without "data" to color their impressions.

Not a bad idea, but probably cost prohibitive in many or most instances, I would guess.

Kyle, I might glance at the graphs but I pay most attention to the intro, features, and conclusion sections.

I have appreciated the stance you have taken on canned benchmarks and real-world experiences in hardware evaluations.

Have you ever considered video reviews, or including gameplay or other videos within the review? I'm sure it's been done, but not as an integral component. Perhaps it was not cost effective in the past, but that shouldn't be an issue now.

The biggest hangup about not including frame rates or other objective data seems to be trust. Trust that no one will lie or skew the review. Trust that my experience will be the same as yours even under the same circumstances. While you may have my trust, that trust is limited and is supported by the objective data. Others have lost my trust and still others have never had it.

Anyone who would buy solely on frame rates is not a very smart consumer in my book. It's like buying a TV based solely on resolution with no thought as to clarity, color, motion blur, and so on. Just not very smart.

When I read a review I want attention paid to more than just how many frames the hardware spits out. That can be manipulated anyway (and we've seen just that), sometimes at the expense of picture quality. Sure, I want to zoom in to see certain aspects of the card closely, but that overhead view at the whole is just as or more important.

Context matters. Data points and subjective impressions both have to be taken with a grain of salt. Having both helps, IMHO, to a discerning reader.
 
Personally, I find the framerate data very interesting. I appreciate that when it comes down to whether card x can play game y, frame rate (particularly average fps) really doesn't tell you the whole story and can be outright misleading, but everybody has different tastes. If you want a game to run at 144 fps, or whatever the number may be, that's interesting. Lots of people have specific targets in mind; there's playable, and there's silky smooth :). I think you'd lose people by omitting it.
 
When I decided to go with less than top of the line for my always on gaming computer, I NEEDED to know what was the very lowest card that would play at a decent framerate the games that I play. As my main computer is now in the living room, always on, and always connected to the big LCD TV screen as the second monitor, I needed to know if my games would run well at 1200 and 1080 resolutions for either screen. Having been a Radeon buyer for 2 decades because of the better quality image they used to have (I know, I know nVidia image is subjectively the same), I was very interested in a mid-range card and I didn't limit my choice to Radeon. I perused this site and a few others gobbling up benchmarks before I made my decision. Oh, I read the whole articles, and whether a game feels slow or didn't quite feel/run correctly in spite of the benchs was very useful info also. I still want both - sorry. I just like to see the numbers. I made a sound decision and for the first time in years, I am running a mid-range quieter, cooler card (not to mention cheaper) and am way more than satisfied. I am also one of the ones that had a really bad experience with Batman the first couple weeks, and I feel safe in saying that had I chosen one of the borderline benchmarking cards, my experience would have been way worse. If I had just ordered the standard $600 card this time instead of the $250-$350 category one, I am sure all things would have run just fine, but that is not what I was after this time and I feel that benchmarks showing specifically framerates more than helped me make my decision. I could be wrong but I really like seeing them.

Like many others that have read this board for a couple decades, I am tech-support and resident guru for tons of family and friends. I try to stay on top of what a certain build is going to need and exactly how much computer is needed for x and y jobs/uses. I have made many informed recommendations to the ones I serve in that capacity based on all of the available info/benchies/framerates etc. I really like having the whole article (especially from here that goes above and beyond the framerates) with the numbers to help make decisions. That's my 2 pennies anyways!
 
Imho FPS numbers are kinda like the RPM meter on a car's dashboard: provides a vague measurement of performance, but not very useful really.

I think frame times (ala techreport) are where it's at if you want objective numbers that may actually relate more closely to a gamer's actual experience. Absent frame time results, I try to make a poor approximation using minimum, maximum and average frames to get a vague impression of frame fluctuations, the stuff that can often make the experience jerky.

So basically I'm saying I go with the reviewer impressions as well when trying to get a feel for what the game's actually doing, with the FPS as condiments on the side.

In short:
I half agree about the FPS numbers simply being there to attract eyeballs - like laser pointers for cats - with little actual value, but I'm arriving at your conclusion from a different angle. I think objective numbers are useful, it's just that FPS counts are not the numbers I'm after personally.

I really wouldn't miss the FPS numbers that much, the one thing they do is help keep the review's reading material from feeling "dry" so to speak.

Just my 2 cents.
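
The min/max/average approximation mentioned above throws away the shape of the distribution; a percentile, or a "time spent beyond a threshold" figure in the style of techreport's frame-time articles, keeps it. A small sketch with made-up numbers:

```python
def percentile_ms(frame_times_ms, pct=99):
    """Simple nearest-rank percentile of frame times, in ms."""
    ordered = sorted(frame_times_ms)
    k = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[k]

def time_beyond_ms(frame_times_ms, threshold_ms=50.0):
    """Total frame time spent past the threshold, techreport-style."""
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

trace = [15.0] * 97 + [55.0, 70.0, 90.0]   # three bad frames out of a hundred
print(percentile_ms(trace, 99))   # 70.0 -- a tail that min/max/avg alone would blur
print(time_beyond_ms(trace))      # 65.0 ms beyond the 50 ms line (5 + 20 + 40)
```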
 
i like to look at the min fps data per game to see what kind of numbers it can dip to. As long as you all keep defining what you call "playable" in your test set i'll keep reading. But i hope the numbers don't ever go away.
 
I'm with a few others here. Framerates do not really matter to me. When I'm reading a video card review here, I read the Intro, the "Power, Temperature and Fan Noise" page, and the Conclusion. I will flip through the individual game pages, but that is more to see what features you enabled/disabled to get a good playable experience. I do not really even look at the framerate graphs. So, if you dropped them, I will still be coming to this site to read your reviews.
 
So in theory...

History of GPU reviewing in a nutshell: FPS, frametimes, architectural advantages, clock rates comparisons etc

Future in a nutshell: No fps or times, just an account of an overall experience which would continue to point out architectural fundamentals.

Mmm. I guess this would be okay for the less savvy/enthusiastic consumer, but then I'm in the boat of wanting to see MIPS ratings return to CPU reviews through some sort of standard.... Would probably be a blessing in disguise for the companies themselves and marketing divisions of respective sub-vendors, cos yeah the mentality of that 1% difference (example number) can sway a lot of minds.

Maybe one day, one day, GPUs and CPUs will be so powerful there will be no need to review anything!
 
I can definitely say that getting rid of the FRAPS overlay gives me more enjoyment out of a game. With it up I'll wonder why I'm only getting 50fps and I'll try to figure out what I can do to get that up. Without the meter I'm just playing the game and having a good time.
 