Blind Test - RX Vega FreeSync vs. GTX 1080 TI G-Sync

Nothing wrong with that but at the end of the day if I get a Pepsi when having asked for a Coke, my experience is lessened.

You just twisted the point which is if you can't tell the difference between Pepsi and Coke after tasting them, then what? How is the experience lessened if you cannot ascertain the difference between them? It feels like you're going in the direction of a bait and switch which this is not. This review encapsulates a very simple point. A point we all face in life when buying all sorts of products. For some strange reason when applied to gpus, some go ape and do not like this reality.

When I buy a car price vs quality/reliability/safety, and of course comfort all come into play. If I can get a similar experience in a cheaper Subaru vs a more expensive Volvo, I will take the Subaru and pocket the change.

Shrugs...
 
You just twisted the point which is if you can't tell the difference between Pepsi and Coke after tasting them, then what? How is the experience lessened if you cannot ascertain the difference between them? It feels like you're going in the direction of a bait and switch which this is not.

I CAN tell the difference over the myriad games that I play at a variety of settings.
 
Pretty neat testing idea; would not mind seeing more of this type of testing in the future. It is always pretty awesome to see the results of these types of tests. Yeah, Doom being the game tested, big shocker there :p. But thanks for putting the time and effort into making this happen.
 
How about a more taxing game? One that beats the GPU to death would show a lot more than a game that doesn't.
 
Nice bit of info to delve into, Kyle and team...thanks for putting this together!

My thoughts: I've never used G-Sync or FreeSync, but I do use adaptive Vsync, because I do get a noticeable benefit at higher refresh rates/FPS (my monitor is 144 Hz capable). That being said, I've always seen G-Sync as a gimmick...albeit one that works and gets a lot of praise, but also one that NVidia gouges for. Until G-Sync monitors are either priced the same or a minuscule amount more than the FreeSync equivalents, I will likely never pay the premium in price for NVidia's offering.

just remember freesync doesn't work w/ nvidia cards tho.
 
That's not a blind test because you have bias from knowing which is which.

*facepalm*

Your largest contribution to this thread is ass-kissing with "I think you did something right. Look at all the low post count posters coming out of the woodwork, dusted off from their hibernation to throw in their 2 cents" while I linked to Blur Busters (page 2) and answered people's questions directly and honestly from my own experience. You then made WRONG assumptions about what I was saying - if anybody was "twisting" the argument it was you - and then made a value judgment that has nothing to do with what I was talking about.

I purposefully avoided addressing this specific blind test in my posts because I find it completely irrelevant for me but you on multiple occasions defend it as if it needed defending. It stands on its own merits and was stated to be unscientific; that's good enough for me. I was addressing actual, individual points and questions with personal experience as well as scientific data gathered by Blur Busters users (which includes Arduinos set up to measure input latency), not conducting a study on climate change.
 
just remember freesync doesn't work w/ nvidia cards tho.

I know. Sorry...should have clarified: I meant when I upgrade to whatever next-gen GPU.

If I get AMD, then I may entertain getting a FreeSync monitor.

If I get another Nvidia, then I'll just keep using adaptive vsync, unless G-sync monitors are around the same price as the equivalent Freesync models.
 
Interested to see more testing with games like BF1. A potato can run Doom maxed. Currently have a 1080 Ti and 1440p G-Sync, but would like to see how Vega/FreeSync compares in some stressful games.
 
Interested to see more testing with games like BF1. A potato can run Doom maxed. Currently have a 1080 Ti and 1440p G-Sync, but would like to see how Vega/FreeSync compares in some stressful games.
Which games do you suggest? I would like to see or request:
  • Dirt 4 (racing games should really show differences well)
  • Prey
  • BF1
  • Sniper Elite 4
  • Watch Dogs 2
 
Which games do you suggest? I would like to see or request:
  • Dirt 4 (racing games should really show differences well)
  • Prey
  • BF1
  • Sniper Elite 4
  • Watch Dogs 2

Prey's graphics are subpar at best. I think the game list should be:
  • Doom
  • Fallout 4
  • BF1
  • Watch Dogs 2
  • Sniper Elite 4
  • Total War Warhammer
  • Deus Ex Mankind Divided
  • Rise of the Tomb Raider.

Every single title is either graphically intensive or CPU limited or both.
 
Prey's graphics are subpar at best. I think the game list should be:
  • Doom
  • Fallout 4
  • BF1
  • Watch Dogs 2
  • Sniper Elite 4
  • Total War Warhammer
  • Deus Ex Mankind Divided
  • Rise of the Tomb Raider.

Every single title is either graphically intensive or CPU limited or both.
I like that list; a blind test like that would be more telling than a single game and, as far as I know, would be the first on the internet. It would probably overnight make other types of reviews obsolete. I would really like to know if Ultra or max settings make a real experience difference to players in general, and also if mega FPS is even noticeable to 80% of the gaming folks. It could throw the whole old-fashioned review system totally out the window (not talking about HardOCP reviews, but other review sites' methods). Kyle definitely nailed the process on this one, keeping it simple, to the point, and focused on the experience of each player. I wonder if Kyle would accept volunteers, as in you buy your plane ticket, meet up, play some games, give feedback (drink a few beers or more), and go home with some fond memories.
 
I like that list; a blind test like that would be more telling than a single game and, as far as I know, would be the first on the internet. It would probably overnight make other types of reviews obsolete. I would really like to know if Ultra or max settings make a real experience difference to players in general, and also if mega FPS is even noticeable to 80% of the gaming folks. It could throw the whole old-fashioned review system totally out the window (not talking about HardOCP reviews, but other review sites' methods). Kyle definitely nailed the process on this one, keeping it simple, to the point, and focused on the experience of each player. I wonder if Kyle would accept volunteers, as in you buy your plane ticket, meet up, play some games, give feedback (drink a few beers or more), and go home with some fond memories.
Unfortunately, I don't think the conventional way can disappear, although this blind test does add some unique perspective and the all-elusive CONTEXT of the REAL WORLD. I have seen many of these tests, and they tend to get buried simply because they fly in the face of elitism. Over the last few years I have seen many of these tests done with AMD CPUs, and they tend to win these types of tests by a decent margin, just like you see with this one. But because the results lack epeen, they tend to garner little credence among the community. However, if people would look at this as another valid data point for users of said monitors, and for those interested in real-world performance as a metric to help further their decision, then I think it would make a great addition.
 
Except the wisdom of numbers just does not capture the experience like this one test for this one game. Way more comes into play here, and I do believe folks who are going to buy will be way more interested in real facts versus vague numbers, graphs, and opinions. It would also be different and way more interesting than the normal canned benchmarks where no one is even playing the game. The canned benchmark is easy; this would be divine in usefulness on first thought.

Potential buyers mean companies may pay more attention to sites that really point out the facts, giving them the nod for advertisement money. I know I would rather see a number of gamers using the hardware in a very controlled and accurate manner than a bunch of charts and graphs ad nauseam.
 
Again: not how it works.
As I said previously: every time you pass a multiple of the refresh rate, you get a new tear line.

~360 FPS at 60Hz = 6 torn frames per refresh:
[attached screenshot: tearingq0sy87oskh.jpg]

High framerates don't eliminate tearing. Syncing frames eliminates tearing.
VRR allows you to sync frames with negligible latency so long as you keep the framerate below the display's maximum refresh rate.

If you can get the framerate high enough (at least 10x the refresh rate, higher preferred) it stops looking like tearing and just looks like the image is warping, but nobody wants that either.
I would rather take a slower GPU and a VRR display than a faster GPU and a fixed refresh rate display; e.g. GTX 1070 + 144Hz G-Sync vs GTX 1080 + 144Hz display.
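
To put the arithmetic above in concrete terms, here is a minimal sketch (my own illustration, not from this post) of the two ideas: with V-Sync off, the number of frame boundaries per refresh, and therefore the number of potential tear lines, is roughly framerate divided by refresh rate; with VRR, the display stretches its refresh interval to match the frame time as long as the framerate stays at or below the panel's maximum. The function names and numbers are assumptions for illustration only.

```python
# Illustrative sketch only; names and numbers are hypothetical.

def tear_lines_per_refresh(fps: float, refresh_hz: float) -> float:
    """With V-Sync off, roughly fps / refresh_hz frames are scanned out per
    refresh; each extra frame boundary inside a refresh is a potential tear line."""
    return fps / refresh_hz

def vrr_refresh_interval_ms(frame_time_ms: float, max_refresh_hz: float) -> float:
    """With VRR, the display waits for the new frame, so the refresh interval
    tracks the frame time as long as it stays above 1 / max_refresh."""
    return max(frame_time_ms, 1000.0 / max_refresh_hz)

print(tear_lines_per_refresh(360, 60))          # ~6 frames per refresh -> up to ~6 tear lines
print(vrr_refresh_interval_ms(1000 / 90, 144))  # 90 FPS on a 144 Hz VRR panel: ~11.1 ms, no tearing
```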

The next time I play Watch Dogs 2 or PUBG at 360 FPS, I will post a screenshot and prove your point.
 
Except the wisdom of numbers just does not capture the experience like this one test for this one game................................................. I know I would rather see a number of gamers using the hardware in a very controlled and accurate manner than a bunch of charts and graphs ad nauseam.

The wisdom of numbers is never mistaken if you/we can use them properly.
The best example, in my opinion, of how numbers should be used is Gamer's Nexus reviews. ( http://www.gamersnexus.net/hwreviews/2973-amd-vega-frontier-edition-reviewed-too-soon-to-call/page-4 )
Besides the average FPS, it also measures the latency between frames (*in Gamer's Nexus charts, check the 1% low & 0.1% low), which basically measures the smoothness of the gameplay.
So, if I can have numbers & stats which are never mistaken, why prefer some subjective opinions from persons that I don't know? (*Kyle says he vouches for all of them, and I believe him, but nevertheless, these subjective opinions are not data that I can use anywhere, not even for myself.)
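
Since the 1% low / 0.1% low figures come up a few times in this thread, here is a rough sketch of one common way such numbers can be derived from raw frame times: average the slowest 1% (or 0.1%) of frames and convert back to FPS. This is my own illustration with made-up data, not necessarily the exact formula Gamer's Nexus uses.

```python
# Rough sketch; not necessarily the exact formula any specific site uses.

def low_percentile_fps(frame_times_ms: list, worst_fraction: float) -> float:
    """Average FPS over the slowest `worst_fraction` of frames
    (e.g. 0.01 for '1% low', 0.001 for '0.1% low')."""
    worst = sorted(frame_times_ms, reverse=True)      # slowest frames first
    n = max(1, int(len(worst) * worst_fraction))
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# Made-up run: mostly 7 ms frames (~143 FPS) with a handful of 40 ms hitches.
frames = [7.0] * 995 + [40.0] * 5
print(1000.0 / (sum(frames) / len(frames)))   # average FPS still looks high (~140)
print(low_percentile_fps(frames, 0.01))       # 1% low exposes the hitching (~43)
print(low_percentile_fps(frames, 0.001))      # 0.1% low is worse still (25)
```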
 
The wisdom of numbers is never mistaken if you/we can use them properly.
The best example, in my opinion, of how numbers should be used is Gamer's Nexus reviews. ( http://www.gamersnexus.net/hwreviews/2973-amd-vega-frontier-edition-reviewed-too-soon-to-call/page-4 )
Besides the average FPS, it also measures the latency between frames (*in Gamer's Nexus charts, check the 1% low & 0.1% low), which basically measures the smoothness of the gameplay.
So, if I can have numbers & stats which are never mistaken, why prefer some subjective opinions from persons that I don't know? (*Kyle says he vouches for all of them, and I believe him, but nevertheless, these subjective opinions are not data that I can use anywhere, not even for myself.)
some thoughts:
  • Higher numbers or even lower frame times may not mean a better experience: if this incurs screen tearing, judder, etc., the gaming experience could be worse.
  • What may appear to be better may not make a gaming difference
  • Color, artifacts, non-rendering, glitches on rendering, flashing etc. may not show up at all in the numbers but could be devastating to game play
There is much more than just frame times.
 
some thoughts:
  • Higher numbers or even lower frame times may not mean a better experience: if this incurs screen tearing, judder, etc., the gaming experience could be worse.
  • What may appear to be better may not make a gaming difference
  • Color, artifacts, non-rendering, glitches on rendering, flashing etc. may not show up at all in the numbers but could be devastating to game play
There is much more than just frame times.

I'm not a tech expert (*I want to hear what the experts have to say on this), but in my opinion, most of what you mentioned (glitches, artifacts, screen tearing, etc.) would show up as an impact on the frame latencies, which is exactly what Gamer's Nexus measures in its reviews.
 
I'm not a tech expert (*I want to hear what the experts have to say on this), but in my opinion, most of what you mentioned (glitches, artifacts, screen tearing, etc.) would show up as an impact on the frame latencies, which is exactly what Gamer's Nexus measures in its reviews.
In many cases it won't affect frame rates. For example, in Sniper Elite 4, 1070 SLI gave about 70% better frame rates and was very smooth, except that at times, depending upon location and direction, flashing textures would occur. Number-wise it probably looked much better with SLI; in reality it was an utter mess at times. Now I have a 1080 Ti, which actually gets lower frame rates (not by much), but it is a much better gaming experience.
 
Sniper Elite 4's SLI issues are an artifact of poor SLI support, and that's on the developer (Rebellion); notice that 'flashing textures' is not endemic to SLI (or Crossfire), or G-Sync/FreeSync, etc. Want to find out about that or "colors/artifacts/non-rendering/glitches on rendering/flashing" beforehand? Read a review of the game.

If the game also had poor/inconsistent performance, that'd show up in frametime analysis, and when it comes to game rendering performance, frametimes really are the end all/be all. If a GPU reviewer does their job right, they'll expose such issues.
 
I like that list; a blind test like that would be more telling than a single game and, as far as I know, would be the first on the internet. It would probably overnight make other types of reviews obsolete. I would really like to know if Ultra or max settings make a real experience difference to players in general, and also if mega FPS is even noticeable to 80% of the gaming folks. It could throw the whole old-fashioned review system totally out the window (not talking about HardOCP reviews, but other review sites' methods). Kyle definitely nailed the process on this one, keeping it simple, to the point, and focused on the experience of each player. I wonder if Kyle would accept volunteers, as in you buy your plane ticket, meet up, play some games, give feedback (drink a few beers or more), and go home with some fond memories.
No.

At 3440x1440, sync is certainly one item on a $500 card user's possible buy list, as are Eyefinity, 4K, super high refresh, and VR. Raw benchmarks are needed.
 
Man, as much as I like Nvidia's graphics cards, I'm not always a fan of their greediness. Like in this situation: why not let us use adaptive sync technology on FreeSync monitors? It's already built into the card, for God's sake!! That is so weak. So now they are expecting us to buy a certain monitor to get the best performance, instead of just buying one good monitor that will last through many upgrade cycles?

Word to Nvidia... you see what's happening to Intel, don't you? Some people are going to get tired of your cash-milking tactics, and once a comparable product is available from an honest, reputable company that actually cares for its customers, gives back to the community in almost priceless ways, and makes PC gaming a more accessible and overall better experience for everyone, those people may just jump ship and never look back.

So, as for me, for the foreseeable future I'm gonna be giving my money to AMD. I think they deserve it.

Rock on, AMD.
 
Usually AMD hypes the shit out of everything and tries to make it seem as good as possible...it must be really bad this time since they aren't doing anything public.
 
I just picked up an Oculus Rift bundle during its summer sale, so I will be looking forward to your VR review. Any chance you will test it on the Rift, or will it still be on the Vive for the time being?
I will still be testing on the Vive, but the performance profile in terms of framerate is identical, so the HMD is irrelevant.
 
Prey's graphics are subpar at best. I think the game list should be:
  • Doom
  • Fallout 4
  • BF1
  • Watch Dogs 2
  • Sniper Elite 4
  • Total War Warhammer
  • Deus Ex Mankind Divided
  • Rise of the Tomb Raider.

Every single title is either graphically intensive or CPU limited or both.


+ Wildlands
 
Should we upgrade to the Ti, or is an aftermarket OC 1080 good enough for 4K?
 
Sniper Elite 4's SLI issues are an artifact of poor SLI support, and that's on the developer (Rebellion); notice that 'flashing textures' is not endemic to SLI (or Crossfire), or G-Sync/FreeSync, etc. Want to find out about that or "colors/artifacts/non-rendering/glitches on rendering/flashing" beforehand? Read a review of the game.

If the game also had poor/inconsistent performance, that'd show up in frametime analysis, and when it comes to game rendering performance, frametimes really are the end all/be all. If a GPU reviewer does their job right, they'll expose such issues.
Personally, I'd rather see 5-10 seasoned gamers try it out and accurately extract what they experienced. Combining that with some empirical data for contrast would make for some very interesting and probably revealing information: getting down to real performance and the gaming conditions that give people the best experience, which is vital information/knowledge for better decisions. You could have great frame times and yet have a washed-out TN panel with severe tearing, or a panel whose color fades out at the edges due to the angle from your eye to the panel. I just see very short-sighted testing, assumptions, and conclusions.

Another test which I would love to see is what folks prefer, as in 1440p 144 Hz compared to 4K 60 Hz FreeSync/G-Sync on, say, a 28" monitor. I would bet most would like the 4K more if you can maintain the same color characteristics, yet some would indeed see the difference and prefer the 144 Hz. We just don't have a wide sampling of data other than average frame rates and frame times (which really only address one characteristic of gameplay).
 
some thoughts:
  • Higher numbers or even lower frame times may not mean a better experience: if this incurs screen tearing, judder, etc., the gaming experience could be worse.
  • What may appear to be better may not make a gaming difference
  • Color, artifacts, non-rendering, glitches on rendering, flashing etc. may not show up at all in the numbers but could be devastating to game play
There is much more than just frame times.

Hmm, nope. Frame times are the best possible method. Stutter, for example, is more visible in frame times than in a video, especially the way [H] does it, with a complete graph of the frame times; any issue with FPS will be directly and more quickly noticeable in a frame-time graph, as frame stability can be contrasted with the max/avg/min numbers. Not everyone can notice stutter: I know some people who, for the life of them, can't notice stuttering or a choppy game right in front of me, while I can perfectly see the choppy gameplay, and I know people who are less sensitive to screen tearing, etc. Actually, [H] has the best testing methodology, adding all the objective and subjective data. This was pretty evident in the "real world" Ryzen test done by Brent, which had a certain experience and final thoughts, but Kyle added his own experiences and thoughts, as they were different from what Brent felt. The data was the same but the experience was different, so who do you trust? See? That's my point: subjective experiences are never entirely reliable.
 
Man, as much as I like Nvidia's graphics cards, I'm not always a fan of their greediness. Like in this situation: why not let us use adaptive sync technology on FreeSync monitors? It's already built into the card, for God's sake!! That is so weak. So now they are expecting us to buy a certain monitor to get the best performance, instead of just buying one good monitor that will last through many upgrade cycles?

Word to Nvidia... you see what's happening to Intel, don't you? Some people are going to get tired of your cash-milking tactics, and once a comparable product is available from an honest, reputable company that actually cares for its customers, gives back to the community in almost priceless ways, and makes PC gaming a more accessible and overall better experience for everyone, those people may just jump ship and never look back.

So, as for me, for the foreseeable future I'm gonna be giving my money to AMD. I think they deserve it.

Rock on, AMD.

I have news for you.

The day AMD launches a card with a competitive advantage, they start "milking it" because they are a business; they are not your friend. When the Athlon 64 was the stuff, they sold it for $1000+. When they had the first 5 GHz CPU, they sold it at $799 or more, even though it was outperformed by $300 CPUs. Etc.

AMD and NVIDIA are for profit, publicly traded companies.

They will make money where they can. Cheering for either when they're behind the tech curve and forced to sell at lower prices makes no sense.
 
Hmm, nope. Frame times are the best possible method. Stutter, for example, is more visible in frame times than in a video, especially the way [H] does it, with a complete graph of the frame times; any issue with FPS will be directly and more quickly noticeable in a frame-time graph, as frame stability can be contrasted with the max/avg/min numbers. Not everyone can notice stutter: I know some people who, for the life of them, can't notice stuttering or a choppy game right in front of me, while I can perfectly see the choppy gameplay, and I know people who are less sensitive to screen tearing, etc. Actually, [H] has the best testing methodology, adding all the objective and subjective data. This was pretty evident in the "real world" Ryzen test done by Brent, which had a certain experience and final thoughts, but Kyle added his own experiences and thoughts, as they were different from what Brent felt. The data was the same but the experience was different, so who do you trust? See? That's my point: subjective experiences are never entirely reliable.
Frame times are one aspect and not the be-all and end-all testing method. HardOCP does much more than just frame times and was way ahead of the curve in finding issues such as items not rendering, hesitations or jerkiness, flashing textures, slow sections in a game, or shaders causing issues, etc. A higher number or lower frame times do not make me feel better about my purchase; a real upgrade in the experience does.

In other words, you need to play the damn game or games to really be able to analyze the whole performance.
 
"Hesitations or jerkiness"
"Slow sections in a game"

These items can be subjectively described by a reviewer, and a good reviewer should catch and expound on them- but they are also objectively quantified in frametime analysis.

I do not disagree that subjective analysis is important, but when it comes to performance, all of the performance issues you have mentioned can be both exposed and quantified in frametimes.
 
Prey's graphics are subpar at best. I think the game list should be:
  • Doom
  • Fallout 4
  • BF1
  • Watch Dogs 2
  • Sniper Elite 4
  • Total War Warhammer
  • Deus Ex Mankind Divided
  • Rise of the Tomb Raider.

Every single title is either graphically intensive or CPU limited or both.

He can add a couple of PC "optimised" games like:

- Quantum break
- Mafia III

These games will give G-Sync, and especially the usually narrower FreeSync Hz range, a hard time on 3440x1440 100 Hz monitors.
 
The next time I play Watch Dogs 2 or PUBG at 360 FPS, I will post a screenshot and prove your point.

This. Anyone who hasn't experienced G-Sync/FreeSync, I would highly recommend you do. It's mind-boggling to me how much better the gaming experience is with either. No tearing at 100+ FPS is amazing.
 
"Hesitations or jerkiness"
"Slow sections in a game"

These items can be subjectively described by a reviewer, and a good reviewer should catch and expound on them- but they are also objectively quantified in frametime analysis.

I do not disagree that subjective analysis is important, but when it comes to performance, all of the performance issues you have mentioned can be both exposed and quantified in frametimes.
You could have less performance and yet a better experience ;) - frame times are not going to show you that. So what is even the point of frame times if they don't even determine whether the experience is better or not? They are just one aspect of measurement and very limited in scope. I take a broader view of performance than just frame times or FPS: performance in rendering, color, latency, AA quality/performance, loading, and then of course smoothness.
 
The next time I play Watch Dogs 2 or PUBG at 360 FPS, I will post a screenshot and prove your point.
It seems like you are being derisive of my post because it's not possible to run many new games at high framerates. I agree with that.
I have been saying all along that VRR is required for new games since they can be so demanding to run - whether that is due to steep hardware requirements, or poor optimization.
It's just not possible to keep all games locked above 60 FPS any more.

The post that you quoted was demonstrating how high framerates do not eliminate screen tearing if you disable V-Sync though.

You could have less performance and yet a better experience ;) - frame times are not going to show you that. So what is even the point of frame times if they don't even determine whether the experience is better or not? They are just one aspect of measurement and very limited in scope. I take a broader view of performance than just frame times or FPS: performance in rendering, color, latency, AA quality/performance, loading, and then of course smoothness.
The standard usage of "game performance" refers to how well it runs.
Rendering quality or errors would not typically be grouped under "performance".
Outside of some rare exceptions, that has not been an issue in the past 15 years or so. Before then, it used to be a big deal though.

High framerates and smoothness are what matters for performance - and frame-time measurements are the best method we have of evaluating that.
Performance consistency - which is what a frame-time graph or percentile measurements show - is far more important than high average framerates.
It's why anyone that cares about smoothness rather than framerate would avoid SLI at all costs.
Look at how terrible The Witcher 3 appears to be running here, despite the framerate staying above 60 FPS.
That's why framerate is largely a useless metric without frame-time graphs.
Frame-pacing is also important, but that requires an FCAT setup to measure.
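
As a toy illustration of the point about averages hiding inconsistency (my own made-up numbers, not from this post): two runs can have the same average framerate while only the frame-time spread reveals which one stutters. That is the information a frame-time graph or percentile figure carries that a single average FPS number does not.

```python
# Made-up data; illustrates average FPS vs. frame-time consistency only.
import statistics

def summarize(label, frame_times_ms):
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    p99_ms = statistics.quantiles(frame_times_ms, n=100)[98]  # ~99th percentile frame time
    print(f"{label}: avg {avg_fps:.0f} FPS, 99th percentile frame time {p99_ms:.1f} ms")

smooth  = [12.5] * 1000                      # steady 80 FPS, every frame 12.5 ms
stutter = [10.0] * 950 + [60.0] * 50         # mostly 100 FPS with regular 60 ms spikes

summarize("smooth", smooth)    # avg 80 FPS, 99th percentile 12.5 ms
summarize("stutter", stutter)  # avg 80 FPS, 99th percentile 60.0 ms
```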
 
"So you woudn't pay $300 for one more than the other?" If I were going to blob out on just Doom for the next three years, that would be a meaningful question to ask.
 
OK, I went TL;DR on the comments, so I'll just throw in my opinion.

As far as I can determine, adaptive sync technology is much better than V-Sync technology. My problem is going from "System 1" (nVidia) to "System 2" (AMD). How about the reverse.... System 2 to System 1? Also, more games.... Doom is a very optimized game, and there are other, more challenging games out there.
 