Sprayingmango
The simple fix is G-Sync. Ever since I purchased my G-Sync monitors I stopped caring about overall FPS reviews, etc. Everything looks amazing and fluid all of the time. Problem solved.
i'm not sure what the answer is here, but i can share what i have found helpful from reviews (not just here) and why:
side by side image quality has been very helpful. i often find the differences are very subtle and i have trouble spotting them from a still image. maybe use video to this end? i can see split screen 1080p (one card on each half) being fun.
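a rough sketch of how that split screen could be put together with ffmpeg's crop and hstack filters (the file names are placeholders, and this assumes both captures share the same resolution):

```python
# Sketch: put the left half of card A's capture next to the right half
# of card B's capture. Assumes both clips have the same resolution and
# that ffmpeg is on PATH; file names are placeholders.
import subprocess

def split_screen(card_a: str, card_b: str, out: str) -> None:
    subprocess.run([
        "ffmpeg", "-i", card_a, "-i", card_b,
        "-filter_complex",
        # keep input 0's left half and input 1's right half,
        # then stack them side by side at the original width
        "[0:v]crop=iw/2:ih:0:0[l];"
        "[1:v]crop=iw/2:ih:iw/2:0[r];"
        "[l][r]hstack=inputs=2[v]",
        "-map", "[v]", "-c:v", "libx264", out,
    ], check=True)

split_screen("card_a_1080p.mp4", "card_b_1080p.mp4", "side_by_side.mp4")
```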
power, temperature, noise and overclocking coverage can stay as it is. the big thing for me is how smooth gameplay is. some cards/drivers give little hiccups in certain games, and more focus on that end might help.
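as a sketch of what quantifying those hiccups might look like, assuming a simple log with one frame time in milliseconds per line (real tools like FRAPS write a similar frametimes file):

```python
# Sketch: summarize per-frame render times (milliseconds, one per line)
# into the numbers that describe "smoothness": average FPS, 1% lows,
# and the count of spike frames. The input format is an assumption.
import statistics

def summarize(path: str, spike_factor: float = 2.5) -> None:
    with open(path) as f:
        ms = sorted(float(line) for line in f if line.strip())
    avg = statistics.mean(ms)
    worst = ms[-max(1, len(ms) // 100):]   # the slowest 1% of frames
    spikes = sum(1 for t in ms if t > spike_factor * avg)
    print(f"avg FPS:    {1000 / avg:.1f}")
    print(f"1% low FPS: {1000 / statistics.mean(worst):.1f}")
    print(f"hiccups (> {spike_factor}x avg frame time): {spikes}")

summarize("frametimes.txt")   # hypothetical log file
```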
unfortunately framerates are important to some degree. i even find straight up 'apples to apples' numbers helpful at times when cards are very close in performance. for myself at least, i'm not just looking for the best visual experience i can get; i'm also looking down the road. i like to keep my hardware until it's firmly beaten into the ground. for that reason raw performance, and not just how one card handles AA better than the next, is a factor. i'll happily turn down or turn off some detail settings if it means i can get my video card through another 6 months.
but given how things are changing with benchmarks... you can't trust the developers. someone will fake it. maybe some kind of external equipment could be devised? something along the lines of a high speed camera, but i'm not too sure how.
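one way the high speed camera idea could be made concrete, sketched with OpenCV: count how many distinct frames actually reached the display each second, independent of anything the game or driver reports. the file name is hypothetical, and a real camera feed would need a noise tolerance instead of exact pixel comparison:

```python
# Sketch: count distinct delivered frames per second in a capture,
# trusting only the recording. "capture.mkv" is a placeholder; a noisy
# camera feed would need a difference threshold, not exact equality.
import cv2

cap = cv2.VideoCapture("capture.mkv")
rate = cap.get(cv2.CAP_PROP_FPS)          # recording rate, not game rate
prev, distinct, total = None, 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    total += 1
    if prev is None or (frame != prev).any():   # content actually changed
        distinct += 1
    prev = frame
cap.release()
seconds = total / rate
print(f"{distinct} distinct frames in {seconds:.1f}s "
      f"= {distinct / seconds:.1f} FPS actually delivered")
```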
In GTAV the XFX R9 380 DD BLACK Ed. OC was playable with every setting at "Ultra" or "Very High," whichever was the highest available for each option. Grass quality was set to "Ultra" as well, and no advanced graphics options were enabled. It averaged 50.8 FPS with these settings.
The NVIDIA GeForce GTX 960 was playable with the same graphics options enabled, but performed slightly slower than the XFX R9 380 DD BLACK Ed. OC. It averaged 48.4 FPS.
Once we overclocked the XFX R9 380 DD BLACK Ed., we found it was capable of sustaining quality gameplay with "High Resolution Shadows" enabled from the advanced graphics menu. It could not utilize any other advanced options. It averaged 49.6 FPS.
It's true, you do go back and re-evaluate occasionally on new drivers and game patches.
But.
Let's say I was evaluating if I should upgrade from a 780Ti or not. Not an uncommon scenario, and just one "For-Instance" case because I had mentioned it previously.
Do you have a 780Ti review with the current nVidia driver, current games, on a current platform?
If you don't want to show old data, did you pull the previous reviews for the 780Ti?
Yes, you are absolutely right that the Anand and Tom's benchmark lists have out of date data. But at least it's data - I can, being an intelligent person, realize that a driver or patch may swing a certain benchmark +/- a few percent, and I certainly recognize that patches and updates are continuously being issued. But it usually won't change the overall picture much.
When I decided to go with less than top-of-the-line for my always-on gaming computer, I NEEDED to know the very lowest card that would play the games I play at a decent framerate. As my main computer is now in the living room, always on, and always connected to the big LCD TV as the second monitor, I needed to know if my games would run well at 1200 and 1080 resolutions on either screen. Having been a Radeon buyer for 2 decades because of the better image quality they used to have (I know, I know, nVidia image is subjectively the same), I was very interested in a mid-range card and I didn't limit my choice to Radeon.

I perused this site and a few others gobbling up benchmarks before I made my decision. Oh, I read the whole articles, and whether a game feels slow or didn't quite feel/run correctly in spite of the benches was very useful info also. I still want both - sorry. I just like to see the numbers. I made a sound decision, and for the first time in years I am running a mid-range card that is quieter and cooler (not to mention cheaper), and I am way more than satisfied.

I am also one of the ones that had a really bad experience with Batman the first couple weeks, and I feel safe in saying that had I chosen one of the borderline benchmarking cards, my experience would have been way worse. If I had just ordered the standard $600 card this time instead of one in the $250-$350 category, I am sure everything would have run just fine, but that is not what I was after this time, and I feel that benchmarks showing framerates specifically more than helped me make my decision. I could be wrong, but I really like seeing them.
Like many others that have read this board for a couple decades, I am tech-support and resident guru for tons of family and friends. I try to stay on top of what a certain build is going to need and exactly how much computer is needed for x and y jobs/uses. I have made many informed recommendations to the ones I serve in that capacity based on all of the available info/benchies/framerates etc. I really like having the whole article (especially from here that goes above and beyond the framerates) with the numbers to help make decisions. That's my 2 pennies anyways!
> AGAIN these app style games like from the Windows store do NOT support things like Gsync.

Do you have some evidence that MS won't be able to implement adaptive sync in the future, since all the displays are going that way?
> Do you have some evidence that MS won't be able to implement adaptive sync in the future, since all the displays are going that way?

From how the information was relayed here, it looked a lot like MS was heading toward adaptive-sync-like uniformity. It looked like they were attempting to champion the issues that have been brought up in the last few years. I mean, good for them for trying to make the system better, I am all for that, but the road to hell is paved with good intentions.
> From how the information was relayed here, it looked a lot like MS was heading toward adaptive-sync-like uniformity. ... the road to hell is paved with good intentions.

What about people that still perceive tearing with Adaptive vsync? It's way, way better than no vsync, but it's there. Also, MS should be flooded with complaints about what they are doing: no way to disable vsync if you desire, no support for G-Sync, and no way to test an application's performance with Fraps and the like. I guess from now on I will buy both graphics cards, test them myself, and decide which I like better if I can't get a review that I feel showcases a card's strengths and weaknesses. Be ready for lots of returns, retailers.
> What about people that still perceive tearing with Adaptive vsync? ... Be ready for lots of returns, retailers.

Unfortunately I don't think any of this is concrete as far as information goes; it's mostly speculative. DX12 is more per-game than per-driver, so I am not entirely sure how this will shake out. I think it is best we wait a bit, and maybe hammer the reviewers to ask their contacts the questions we would like answered, but basing any kind of final decision on the information we have now would be far too premature.
> Unfortunately I don't think any of this is concrete as far as information goes; it's mostly speculative. ...

Yep, I see that too. Just chiming in on the current situation. I hope it will change. I'm fine with my GPU right now, and it will probably be a while until I spend on a new one, luckily, so I will see how it pans out. Thanks too for your insights.
> I'm watching Ryan Shrout's feed tonight to see if they have discussion on his article about FCAT and AotS.

Hopefully he will cover the interviews others were having where they might ask MS about all this. And double hope that MS gave straight answers.
> What about people that still perceive tearing with Adaptive vsync? ...

We're talking about FreeSync, not Adaptive VSync: with FreeSync all apps would get locked to a moving refresh rate with no tearing. FreeSync, unlike G-Sync, was adopted into the DisplayPort standard and also works over HDMI.
> We're talking about FreeSync, not Adaptive VSync: with FreeSync all apps would get locked to a moving refresh rate with no tearing. ...

Oh. He said "adaptive-sync-like," so that means G-Sync and any other G-Sync-like tech? I didn't see a definition, and I thought he meant Adaptive Vsync, so sorry about that. I thought MS had really gone full special and was going to force Adaptive Vsync on all of us, because "reasons". Thank you.
> While I agree that the data helps to back up the subjective part of a review, if I trust the reviewer then the numbers become less important. I understand that reviews are subjective. However, after coming here for years (a lot longer than my join date suggests), I have developed a level of trust with what the authors of reviews here are telling me. It is like my brother telling me that Game A sucks, or Game B has technical issues, such as crashing or stuttering. I do not need any kind of numbers from my brother. If he is telling me Game A sucks, then I will probably avoid that game, as chances are that I will find it sucks too. I cannot recall a single time where a reviewer here has steered me wrong. That track record means that I am going to trust that the reviews here are accurate, even if they are completely subjective with no data backing them up. If they started to steer me wrong, then it would break that trust and they would lose me as a long-time reader.

I'm just talking about basic human behavior. Nothing about Kyle in particular.
This is good to know, thank you. Adaptive Vsync has helped my performance in several games also; people may want to enable it to see if it helps. Some games barely hit 30 fps with regular vsync, but with Adaptive Vsync they run at 60, only dropping as low as 45 occasionally.
- Adaptive Sync is the name of the DisplayPort standard feature for enabling variable refresh rate.
- FreeSync is AMD's driver implementation that works with Adaptive Sync to get variable refresh rate on their hardware.
- It seems Microsoft would like Adaptive Sync support to be handled by the operating system instead of by video cards and their drivers. It sounds good because it would make variable refresh a standard, hardware-agnostic Windows feature. Unfortunately, in its current state this is the inferior technology compared to G-Sync.
- Adaptive V-Sync is an NVIDIA driver feature that turns off V-Sync when the framerate drops below the refresh rate.
- G-Sync is a proprietary variable refresh rate technology from NVIDIA that replaces the hardware scaler in a monitor; when the driver detects a G-Sync display, it tells supported video cards to send frame information over the video output to that scaler. (A toy sketch contrasting these policies follows below the list.)
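To make the distinctions above concrete, here is a toy model (not any real API, just illustrative logic) of when a finished frame reaches a 60 Hz screen under each policy:

```python
# Toy model only -- not a real API. Given how long a frame took to
# render, report when it would appear on screen under each policy.
import math

def display(render_ms: float, refresh_hz: int, policy: str) -> str:
    interval = 1000 / refresh_hz
    if policy == "vsync":
        # always wait for a refresh boundary; missing one interval
        # means waiting for the next (e.g. 60 Hz falls to 30 FPS)
        return f"{math.ceil(render_ms / interval) * interval:.1f} ms (synced)"
    if policy == "adaptive_vsync":
        # sync while the card keeps up, otherwise turn V-Sync off
        if render_ms <= interval:
            return f"{interval:.1f} ms (synced)"
        return f"{render_ms:.1f} ms (vsync off, may tear)"
    if policy == "variable_refresh":      # G-Sync / Adaptive Sync
        # the monitor refreshes when the frame is ready
        return f"{render_ms:.1f} ms (no tearing, no waiting)"
    raise ValueError(policy)

for p in ("vsync", "adaptive_vsync", "variable_refresh"):
    print(f"17 ms frame, {p}: shown at {display(17.0, 60, p)}")
```

A 17 ms frame on a 60 Hz panel misses one refresh, so plain V-Sync holds it a full extra interval (effectively 30 FPS), while the other two policies show it as soon as it is done.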
> Unfortunately in its current state this is the inferior technology compared to G-Sync.

How is G-Sync superior?
> ok this is going to sound like a stupid question... so fraps runs on windows 10, i know i have run it... it can not hook into an exe... that makes sense... so when it runs with the default profile, or with the game profile that is the default setting most people will use, why is that an issue? is it too expensive bandwidth-wise to post fraps captures of the tests they ran with the counter at the top, so people can see an image with a number? i look at the graphs to see if it ever drops below thirty fps... because, no, those are going to give me a splitting headache. but i have played games that were locked at forty to fifty frames that looked great, and footage that was locked at 24 fps that looked great. i have seen footage of games at 150 fps that had tearing and a low resolution mess that looked like it could have used the higher resolution versions of the assets that were likely made but not used, so it could run on a console or hit a target frame count... i understand having a low-detail version of the game that runs on enough machines for development... you need to start high and remove edges until you would lose the silhouette. if you do it the other way you get triangles representing circles that only look good from one angle.
> so why not just let them optimize the drivers, and when they make the game look bad, post that and highlight the higher numbers... and when the games run at 40 fps and look smooth as glass, highlight that.
> i mean really, do we need to look at a long bar graph to tell us what looks good and what does not? i have to be missing something...

The Fraps/FCAT issue is with DX12 and how MS is using overlays. They will likely get updates that fix the issue, albeit a lengthy venture. Also, AMD claims they will add DirectFlip, which is what allows Nvidia to circumvent the frame cap of 60 when Vsync is off.
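As a possible workaround for the Fraps limitation: Intel's PresentMon reads present events from ETW rather than hooking the game's exe, so it can log frame times for DX12/UWP titles. A minimal sketch of summarizing its CSV output (the file name is hypothetical; MsBetweenPresents is the column PresentMon logs):

```python
# Sketch: summarize a PresentMon log. PresentMon records present
# events via ETW, so it works where Fraps cannot hook the process;
# the file name below is hypothetical.
import csv

with open("presentmon_log.csv", newline="") as f:
    times = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

avg = sum(times) / len(times)
print(f"{len(times)} frames, avg {1000 / avg:.1f} FPS, "
      f"worst frame {max(times):.1f} ms")
```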
> Tim Sweeney from Epic is speaking openly against Microsoft's UWP.
> Microsoft wants to monopolise games development on PC. We must fight it

Sorry, but at no point does he give any proof that what we saw with one game is indicative of all games after. Too ranty for me. I get the concern, but an outright call to arms when no real evidence is present so far is a bit worrisome.