The simple fix is G-Sync. Ever since I purchased my G-Sync monitors I stopped caring about overall FPS reviews, etc. Everything looks amazing and fluid all of the time. Problem solved. :)
 
1.) There's definite value in [H]'s subjective reviews. E.g. "We jacked up the ____ detail on this card, and couldn't on that one. Experience-wise, they're still pretty much the same." Or the reverse. It gives us an idea of what's valuable settings-wise and whether or not the card will get there. Especially since you guys are consistent, so I can kinda calibrate my own values off your reviews.

2.) I still like the FPS graphs, but pretty much eyeball the median FPS and look for long stretches of FPS below that median and especially for huge dips. Pretty much don't care about above-median. Frame times would be even better, IMO, but don't know if that's harder/easier to capture. Shockingly, when those charts are bad, you guys tend to complain about gameplay. :)
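To make that concrete, here's a rough sketch of the kind of analysis I mean, assuming a plain per-frame log in milliseconds (the sort of file FRAPS-style tools can dump); the file name and the "10% below median" dip threshold are placeholders, not anyone's actual methodology:

```python
# Minimal sketch: summarize a per-frame frame-time log (one value in ms per line).
# The file name and the below-median dip threshold are placeholders for illustration.
import statistics

def summarize(frametimes_ms):
    fps = [1000.0 / ft for ft in frametimes_ms if ft > 0]
    median_fps = statistics.median(fps)
    # "1% low": average FPS of the worst 1% of frames, a common smoothness proxy
    worst = sorted(fps)[: max(1, len(fps) // 100)]
    one_pct_low = sum(worst) / len(worst)
    # Longest run of consecutive frames spent well below the median (the dips that hurt)
    threshold = median_fps * 0.9
    longest = run = 0
    for f in fps:
        run = run + 1 if f < threshold else 0
        longest = max(longest, run)
    return median_fps, one_pct_low, longest

with open("frametimes.csv") as fh:           # hypothetical log file
    times = [float(line) for line in fh if line.strip()]

median_fps, low1, longest_dip = summarize(times)
print(f"median {median_fps:.1f} fps, 1% low {low1:.1f} fps, longest dip {longest_dip} frames")
```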

The data complements the review and vice versa. I think we'd be losing something if we had one and not the other. I likewise would not want to head in the direction of audio, which has become a world of subjective gobbledygook, where "quality" is derived from the amount of $$ spent and how impressive your system looks rather than its performance (in a blinded format).
 
I'm not sure what the answer is here, but I can share what I have found helpful from reviews (not just here) and why:

Side-by-side image quality comparisons have been very helpful. I often find the differences are very subtle and I have trouble spotting them from a still image. Maybe use video to this end? I can see split-screen 1080p (one card on each half) being fun.

Power, temperature, noise and overclocking will remain unchanged. The big thing is how smooth my gameplay is. Some cards/drivers often give little hiccups in certain games. Some focus on that end might help.

Unfortunately, framerates are important to some degree. I even find straight-up "apples to apples" helpful at times when cards are very close in performance. For myself at least, I'm not just looking for the best visual experience I can get; I'm also looking down the road. I like to keep my hardware until it's firmly beaten into the ground. For that reason, raw performance, and not just how one card handles AA better than the next, is a factor. I'll happily turn down or turn off some detail settings if it means I can get my video card through another six months.

But given how things are changing with benchmarks... you can't trust the developers. Someone will fake it. Maybe some kind of external equipment could be devised? Something along the lines of a high-speed camera, but I'm not too sure how.


This explains my perspective, as well.

Framerate is unimportant when compared to smoothness.

As far as benchmarks not being worthwhile, the motherboard benchies showed the DPC latency issues with Gigabyte and MSI and how they were caused by BIOS settings. (Errr, I think.)

Benchmarks have a place, but it is secondary to image quality and smoothness of gameplay.
 
Well, here's how I kinda see it.

FPS by itself is pretty useless. Canned benchmarks, also pretty useless.

I do read reviews here, and I really like the comparison between the Highest Playable Settings and Apples-to-Apples tests - I think that tells a lot about a set of cards on a particular piece of software.

The issue with removing FPS altogether: right now, that's a key data point in the "Highest Playable Setting" analysis. I mean, let's go back to the last pure video card review - the XFX R9 380:

In GTAV the XFX R9 380 DD BLACK Ed. OC was playable with all settings at the "Ultra" or "Very High" level, but each option was at the highest setting. The grass quality was set to "Ultra" as well, and no advanced graphics were enabled. It averaged 50.8 FPS with these settings.

The NVIDIA GeForce GTX 960 was playable with the same graphics options enabled, but performed slightly slower than the XFX R9 380 DD BLACK Ed. OC. It averaged 48.4 FPS.

Once we overclocked the XFX R9 380 DD BLACK Ed., we found it was capable of sustaining quality gameplay with "High Resolution Shadows" enabled from the advanced graphics menu. It could not utilize any other advanced options. It averaged 49.6 FPS.

You're using FPS as one of the metrics to determine whether a title is playable or not with a particular feature set enabled. Now, maybe you're just doing that to pay lip service to the people that ~need~ quantitative data, and the entire "Highest Playable Settings" evaluation really is just 100% subjective - I don't know; it's not particularly clear to me just from reading the reviews without digging into the background of the testing methodology.

I know that personally, I will accept lower levels of performance than a lot of people will. 30 FPS is fine for me, varying FPS is fine for me (so long as it isn't extremely noticeable or induces tearing), and I don't mind turning a few features off or down to get a game to run. Not everyone is that way. What I would consider acceptable, if I were to write up a "Highest Playable Settings" for a review, would probably be drastically different from most people on this forum. The FPS may be useless by itself, but at least it provides a quantitative data point in an otherwise entirely subjective analysis.

I wouldn't be opposed to replacing FPS with something else that's quantitative, but I can't see a review holding any weight at all if it's 100% subjective, even if I trust the reviewers.

In full disclosure, I don't base my GPU purchases on [H] reviews (although I very much do for PSUs, specifically because of the empirical data - I admit I cheat and jump to the last page to see if the unit passed or not, but it's because of all that data that I trust the bottom-line analysis). GPUs are much more subjective: I usually start with the budget for the build and try to get the best bang for that buck. Reviews help determine that, but the reviews here are usually so limited, because you try to go into such deep detail, that they only present a handful of options, compared on a very limited selection of software.

I lean more on Anandtech's GPU benchmark database when comparing GPUs - it's just straight numbers across any pair of GPUs you want to compare, against dozens of canned scenarios. Sure, some particular ones may get gamed, but a company would have a hard time gaming all of them. And yes, there isn't really any subjectiveness to that analysis - I am just seeing some basic FPS numbers, and there could be serious issues with particular titles - but for the most part, with GPUs, I'm just looking for the best bang for the buck, and a wide swath of objective numbers against which I can compare any selection of cards (based on whatever is available that day) gets me there better than a really deep subjective review with a very limited scope does. Using the XFX R9 380 review again as an example: if I wanted to compare that card against anything besides a GTX 960, I'd have a hard time inferring that from just the [H] reviews, especially when the software selection gets changed up, or when trying to compare across generations of cards (like determining whether an upgrade is worthwhile or not).
 
It's true, you do go back and re-evaluate occasionally on new drivers and game patches.

But.

Let's say I was evaluating if I should upgrade from a 780Ti or not. Not an uncommon scenario, and just one "For-Instance" case because I had mentioned it previously.

Do you have a 780Ti review with the current nVidia driver, current games, on a current platform?

If you don't want to show old data, did you pull the previous reviews for the 780Ti?

Yes, you are absolutely right that the Anand and Tom's benchmark lists have out of date data. But at least it's data - I can, being an intelligent person, realize that a driver or patch may swing a certain benchmark +/- a few percent, and I certainly recognize that patches and updates are continuously being issued. But it usually won't change the overall picture much.

We RETEST every scenario when there is a new driver or patch. We hardly ever re-use data, and if we do, we are specific about it.

If ancient data on old drivers, old patches, and old hardware is good with you, you know where to get it... but it ain't going to be at HardOCP. :) You think it does not change the overall picture, but you do not KNOW that. Brent and I like to KNOW what we are talking about.
 
When I decided to go with less than top-of-the-line for my always-on gaming computer, I NEEDED to know the very lowest card that would play the games I play at a decent framerate. As my main computer is now in the living room, always on, and always connected to the big LCD TV as the second monitor, I needed to know if my games would run well at 1200 and 1080 resolutions on either screen. Having been a Radeon buyer for two decades because of the better-quality image they used to have (I know, I know, nVidia image quality is subjectively the same), I was very interested in a mid-range card, and I didn't limit my choice to Radeon. I perused this site and a few others, gobbling up benchmarks, before I made my decision. Oh, I read the whole articles too, and whether a game feels slow or doesn't quite feel/run correctly in spite of the benchmarks was very useful info as well. I still want both - sorry. I just like to see the numbers.

I made a sound decision, and for the first time in years I am running a mid-range, quieter, cooler (not to mention cheaper) card and am way more than satisfied. I am also one of the ones that had a really bad experience with Batman the first couple of weeks, and I feel safe in saying that had I chosen one of the borderline benchmarking cards, my experience would have been way worse. If I had just ordered the standard $600 card this time instead of one in the $250-$350 category, I am sure everything would have run just fine, but that is not what I was after this time, and I feel that benchmarks, specifically the ones showing framerates, more than helped me make my decision. I could be wrong, but I really like seeing them.

Like many others that have read this board for a couple decades, I am tech-support and resident guru for tons of family and friends. I try to stay on top of what a certain build is going to need and exactly how much computer is needed for x and y jobs/uses. I have made many informed recommendations to the ones I serve in that capacity based on all of the available info/benchies/framerates etc. I really like having the whole article (especially from here that goes above and beyond the framerates) with the numbers to help make decisions. That's my 2 pennies anyways!

I'll cast my opinion that I need numbers so that I can verify someone's opinion. Relying on just opinion in any circumstance is widely seen as stupid.

To all the people claiming that numbers are useless: why do we have science again? Oh yes, that thing about testing, validating, and being able to reproduce what someone else claims as true. It's called the Scientific Method.

I personally do not understand the comments along the lines of "like a speedometer, it only gives a small reading for a small amount of time, so it's useless." Really? So do you drive your car with the speedometer disabled, or what? I did drive a vehicle once with no speedometer, and I can tell you it's really rough to guess how fast you are actually going. Try it.

There are so many times I thought the performance was slow, so to make sure I used FRAPS and was able to dial in a good config. Without it there's much more guesswork, and why guess when you can actually know? The world today astounds me.
 
Do you have some evidence that MS won't be able to implement adaptive sync in the future since all the displays are going that way?
From how the information was relayed here, it looked a lot like MS was heading toward adaptive-sync-like uniformity. It looked like they were attempting to champion the issues that have been brought up in the last few years. I mean, good for them for trying to make the system better, I am all for that, but the road to hell is paved with good intentions.
 
From how the information was relayed here, it looked a lot like MS was heading toward adaptive-sync-like uniformity. It looked like they were attempting to champion the issues that have been brought up in the last few years. I mean, good for them for trying to make the system better, I am all for that, but the road to hell is paved with good intentions.
What about people that still perceive tearing with Adaptive vsync? It's way, way better than no vsync, but it's there. Also, MS should be flooded with complaints about what they are doing: not being able to disable vsync if you want to, no support for G-Sync, and no way to test an application's performance with FRAPS and the like. I guess from now on I will buy both graphics cards, test them myself, and decide which I like better if I can't get a review that I feel showcases a card's strengths and weaknesses. Be ready for lots of returns, retailers.
 
What about people that still perceive tearing with Adaptive vsync? It's way, way better than no vsync, but it's there. Also, MS should be flooded with complaints about what they are doing: not being able to disable vsync if you want to, no support for G-Sync, and no way to test an application's performance with FRAPS and the like. I guess from now on I will buy both graphics cards, test them myself, and decide which I like better if I can't get a review that I feel showcases a card's strengths and weaknesses. Be ready for lots of returns, retailers.
Unfortunately, I don't think any of this is concrete as far as information goes; it's mostly speculative. DX12 is more of a per-game thing than a per-driver thing, so I am not entirely sure how this will shake out. I think it is best we wait a bit, maybe hammer the reviewers to ask their contacts the questions we would like answered, but to base any kind of final decision on the information we have now would be far too premature.
 
I'm watching Ryan Shrout's feed tonight to see if they discuss his article about FCAT and AotS.
 
Unfortunately, I don't think any of this is concrete as far as information goes; it's mostly speculative. DX12 is more of a per-game thing than a per-driver thing, so I am not entirely sure how this will shake out. I think it is best we wait a bit, maybe hammer the reviewers to ask their contacts the questions we would like answered, but to base any kind of final decision on the information we have now would be far too premature.
Yep, I see that too. Just chiming in on the current situation. I hope it will change. I'm fine with my GPU right now, will probably be a while until I spend for a new one, luckily, so I will see how it pans out. Thanks too for your insights.
 
I'm watching Ryan Shrout's feed tonight to see if they discuss his article about FCAT and AotS.
Hopefully he will cover the interviews others were having where they might ask MS about all this. And double hope that MS gave straight answers.
 
What about people that still perceive tearing with Adaptive vsync? It's way, way better than no vsync, but it's there. Also, MS should be flooded with complaints about what they are doing: not being able to disable vsync if you want to, no support for G-Sync, and no way to test an application's performance with FRAPS and the like. I guess from now on I will buy both graphics cards, test them myself, and decide which I like better if I can't get a review that I feel showcases a card's strengths and weaknesses. Be ready for lots of returns, retailers.
We're talking about FreeSync and not Adaptive VSync - where all apps would get locked to a moving refresh rate with no tearing. FreeSync, unlike G-Sync, did get adopted into the DisplayPort standard and works over HDMI.
 
We're talking about FreeSync and not Adaptive VSync - where all apps would get locked to a moving refresh rate with no tearing. FreeSync, unlike G-Sync, did get adopted into the DisplayPort standard and works over HDMI.
Oh. He said "adaptive-sync-like" so that means g-sync and any other g-sync like tech? I didn't see a definition, and I thought he meant Adaptive Vsync, so sorry about that. I thought MS had really gone full special and was going to force Adaptive vsync on all of us, because "reasons". Thank you.
 
Hard numbers mean something.
Keep up with what you're doing. It might mean a lot more man-hours invested with the differences in technology, but I guarantee a lot of followers need the data the way it's currently being presented.
I certainly do and certainly appreciate it.
 
Did not read thread, only first post so this was probably already stated.

The problem with stating whether a game is playable or not is that it's very subjective. Some people claim that anything less than 90 fps is unplayable, others 30.

FPS data lets the readers draw their own conclusions. I do not agree, though, that the game makers need to have a built-in benchmarking tool to provide the data; that's just retarded. Yes, it's nice, but there are better dedicated tools out there. I'd rather use the same tool for all games, with the same data types, than the inconsistent data that game makers provide; hell, most games don't even have proper options for PC, let alone dedicated benchmarking tools.
 
While I agree that the data helps to back up the subjective part of a review, if I trust the reviewer then the numbers become less important. I understand that reviews are subjective. However, after coming here for years (a lot longer than my join date suggests), I have developed a level of trust in what the authors of reviews here are telling me. It is like my brother telling me that Game A sucks, or that Game B has technical issues, such as crashing or stuttering. I do not need any kind of numbers from my brother. If he is telling me Game A sucks, then I will probably avoid that game, as chances are I will find it sucks too. I cannot recall a single time when a reviewer here has steered me wrong. That track record means I am going to trust that the reviews here are accurate, even if they are completely subjective with no data backing them up. If they started to steer me wrong, it would break that trust and they would lose me as a long-time reader.
I'm just talking about basic human behavior. Nothing about Kyle in particular.
 
Oh. He said "adaptive-sync-like" so that means g-sync and any other g-sync like tech? I didn't see a definition, and I thought he meant Adaptive Vsync, so sorry about that. I thought MS had really gone full special and was going to force Adaptive vsync on all of us, because "reasons". Thank you.
  • Adaptive Sync is what the DisplayPort standard is called for enabling variable refresh rate.
  • FreeSync is the driver implementation by AMD that works with Adaptive Sync to get variable refresh rate with their hardware.
    • It seems Microsoft would like Adaptive Sync support to be handled by the operating system instead of video cards and their drivers. It sounds good because it would make it a standard feature for Windows that is hardware agnostic. Unfortunately in its current state this is the inferior technology compared to G-Sync.
  • Adaptive V-Sync is a NVIDIA driver feature that turns off V-Sync when the framerate is less than the refresh rate.
  • G-Sync is a proprietary variable refresh rate technology from NVIDIA that replaces the hardware scaler in a monitor, while drivers tell supported video cards to send frame information over the video output to the scaler when they detect a G-Sync display.
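As a toy illustration of why variable refresh feels smoother (a sketch only, not how any driver or panel actually works; the 60 Hz refresh and the 40-144 Hz VRR range are assumptions made up for the example):

```python
# Toy illustration (not how any driver actually works): with a fixed 60 Hz refresh,
# a frame that misses the ~16.7 ms deadline waits for the next scan-out, so render
# times between 16.7 and 33.3 ms all display at a 33.3 ms cadence (30 fps judder).
# With a variable refresh range (40-144 Hz assumed here), the panel simply waits
# for the frame.
import math

REFRESH_MS = 1000.0 / 60                        # fixed 60 Hz scan-out interval
VRR_MIN_MS, VRR_MAX_MS = 1000.0 / 144, 1000.0 / 40

def displayed_interval_fixed_vsync(render_ms):
    # Frame is shown on the first vertical refresh after it finishes rendering
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def displayed_interval_vrr(render_ms):
    # Panel refreshes when the frame is ready, clamped to its supported VRR range
    return min(max(render_ms, VRR_MIN_MS), VRR_MAX_MS)

for render_ms in (14.0, 18.0, 22.0, 25.0):
    fixed = displayed_interval_fixed_vsync(render_ms)
    vrr = displayed_interval_vrr(render_ms)
    print(f"render {render_ms:4.1f} ms -> v-sync shows every {fixed:4.1f} ms, VRR every {vrr:4.1f} ms")
```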
 
  • Adaptive Sync is what the DisplayPort standard is called for enabling variable refresh rate.
  • FreeSync is the driver implementation by AMD that works with Adaptive Sync to get variable refresh rate with their hardware.
    • It seems Microsoft would like Adaptive Sync support to be handled by the operating system instead of video cards and their drivers. It sounds good because it would make it a standard feature for Windows that is hardware agnostic. Unfortunately in its current state this is the inferior technology compared to G-Sync.
  • Adaptive V-Sync is a NVIDIA driver feature that turns off V-Sync when the framerate is less than the refresh rate.
  • G-Sync is a proprietary variable refresh rate technology from NVIDIA that replaces the hardware scaler in a monitor, while drivers tell supported video cards to send frame information over the video output to the scaler when they detect a G-Sync display.
This is good to know, thank you. Adaptive Vsync has helped my performance in several games also; people may want to enable it to see if it helps. Some games barely hit 30 fps for me otherwise, but with Adaptive Vsync they run at 60 and only drop as low as 45 occasionally.

A couple of people mentioned they prefer a smooth 30 fps vs. sometimes 60 and sometimes 45. Of course, sometimes it's better to just get used to it and let it run at 60. Another tip is to limit the framerate to 30 or so with nvidiainspector, and then you don't get that jarring faster/slower feel in your favorite 3D engine. So many options, although it pays off to test them.
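The idea behind a frame cap, roughly sketched below (nvidiainspector's limiter works at the driver level; this is only the concept, with made-up timings, not its implementation):

```python
# Conceptual sketch of what a 30 fps frame cap does: pace every frame to the same
# budget so delivery stays even instead of swinging between 45 and 60 fps.
# The render time below is a stand-in, not a real workload.
import time

TARGET_MS = 1000.0 / 30          # ~33.3 ms budget per frame

def render_frame():
    time.sleep(0.012)            # stand-in for variable render work (~12 ms here)

for _ in range(5):
    start = time.perf_counter()
    render_frame()
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    if elapsed_ms < TARGET_MS:
        time.sleep((TARGET_MS - elapsed_ms) / 1000.0)   # burn the leftover budget
    total_ms = (time.perf_counter() - start) * 1000.0
    print(f"frame delivered after {total_ms:.1f} ms")
```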
 
OK, this is going to sound like a stupid question... so FRAPS runs on Windows 10, I know, I have run it... it cannot hook into an exe... that makes sense... so when it runs with the default profile, or with the game profile that is the default setting most people will use, why is that an issue? Is it too expensive bandwidth-wise to post FRAPS captures of the tests they ran, with the counter at the top, so people can see an image with a number? I look at the graphs to see if it ever drops below thirty fps, because those drops are going to give me a splitting headache. But I have played games that were locked at forty to fifty frames that looked great, and footage locked at 24 fps that looked great. I have also seen footage of games at 150 fps that had tearing and a low-resolution mess that looked like it could have used the higher-resolution versions of the assets that were likely made but not used, in order to run on a console or hit a target frame count. I understand having a low-detail version of a game that runs on enough machines during development... you need to start high and remove edges until you lose the silhouette. If you do it the other way, you get triangles representing circles that only look good from one angle.
So why not just let them optimize the drivers, and when they make the game look bad, post that and highlight the higher numbers... and when the games run at 40 fps and look like they are running smooth as glass, highlight that.
I mean, really, do we need to look at a long bar graph to tell us what looks good and what does not? I have to be missing something...
 
  • Adaptive Sync is what the DisplayPort standard is called for enabling variable refresh rate.
  • FreeSync is the driver implementation by AMD that works with Adaptive Sync to get variable refresh rate with their hardware.
    • It seems Microsoft would like Adaptive Sync support to be handled by the operating system instead of video cards and their drivers. It sounds good because it would make it a standard feature for Windows that is hardware agnostic. Unfortunately in its current state this is the inferior technology compared to G-Sync.
  • Adaptive V-Sync is a NVIDIA driver feature that turns off V-Sync when the framerate is less than the refresh rate.
  • G-Sync is a proprietary variable refresh rate technology from NVIDIA that replaces the hardware scaler in a monitor, while drivers tell supported video cards to send frame information over the video output to the scaler when they detect a G-Sync display.
How is G-Sync superior?
 
OK, this is going to sound like a stupid question... so FRAPS runs on Windows 10, I know, I have run it... it cannot hook into an exe... that makes sense... so when it runs with the default profile, or with the game profile that is the default setting most people will use, why is that an issue? Is it too expensive bandwidth-wise to post FRAPS captures of the tests they ran, with the counter at the top, so people can see an image with a number? I look at the graphs to see if it ever drops below thirty fps, because those drops are going to give me a splitting headache. But I have played games that were locked at forty to fifty frames that looked great, and footage locked at 24 fps that looked great. I have also seen footage of games at 150 fps that had tearing and a low-resolution mess that looked like it could have used the higher-resolution versions of the assets that were likely made but not used, in order to run on a console or hit a target frame count. I understand having a low-detail version of a game that runs on enough machines during development... you need to start high and remove edges until you lose the silhouette. If you do it the other way, you get triangles representing circles that only look good from one angle.
So why not just let them optimize the drivers, and when they make the game look bad, post that and highlight the higher numbers... and when the games run at 40 fps and look like they are running smooth as glass, highlight that.
I mean, really, do we need to look at a long bar graph to tell us what looks good and what does not? I have to be missing something...
The Fraps/FCAT issue is with DX12 and how MS is using overlays. They will likely get updates that could fix the issue, albeit a time-consuming venture. Also, AMD claims they will add DirectFlip, which is what allows Nvidia to circumvent the 60 fps frame cap when V-Sync is off.
 
Well, I can see his point of view. Publishers make a good chunk of change, as they should; they are the ones that put the money down up front to get a game on the shelves, they are the ones that get press releases, advertisement spots, etc., and they are the ones that may help with development funds too. What Microsoft can do is force publishers and game developers to use their store and only their store if they start dropping Win32. All other revenue from other stores will pale in comparison, since 95% of PC gaming is on Windows. Also, piracy is a big problem with current games, even for Steam games, so if the Microsoft store stops that, it's an even bigger push for game developers to start using the MS store.
 
Epic Games has been best chums with Microsoft for quite a while, so if one of those good pals thinks they have to call out Microsoft publicly, I will give them the benefit of the doubt.
 