Blind Test - RX Vega FreeSync vs. GTX 1080 TI G-Sync

I got a Poseidon 1080 Ti and it will boost to 1850 on friggin' air, lol... over 2k on water, but I would take it back in a heartbeat for 2x Vegas if the price-to-performance is there. Here's to hoping the review Kyle has already written, which we can't see yet, pans out. From the look of the side-by-side test, Kyle knows something he can't tell us, and I think the Vega cards may be impressing him somewhere.

But then again, if a 1080 Ti, a 1080, and a Fury X can deliver the same fps in a particular title, the end user probably wouldn't notice a difference. Y'all need to stop getting mad and bitching at Kyle. Y'all act like you pay monthly subscriptions and he owes you something.
 
Seriously, I get that you aren't interested in Vega, but is it necessary to post the exact same negative drivel over and over? It's a GPU that is stronger than the other cards in its own stack. It doesn't best Nvidia's lineup. If you need Nvidia's performance level, then buy Nvidia. If you hate paying for Nvidia, well, tough. Pick one.

This.

Obviously AMD is going to play up the Freesync aspect, it's an area where they have a HUGE advantage. $200 difference in a popular monitor feature is a big deal for a lot of people, especially when Freesync and GSync are very similar in functionality.

NVIDIA has done the same thing in the past, asking review sites to write about their GSync, 3d Vision, and PhysX features.

That's just how business works. If a person is considering a new monitor at all, the FreeSync vs. G-Sync question has to be part of the decision.

If Vega truly does offer near-GTX 1080 performance, that is enough for everything but 4K, which only accounts for <20% of TVs in the US and a tiny percentage of monitors.

http://store.steampowered.com/hwsurvey/

4K is only 0.86% of PC monitors - not a large market compared to the over 3% share held by the 2560x1080 to 3440x1440 resolutions this level of card would serve.

So in the PC-only market for $500 video cards, Vega should serve over 75% of users fine, and in the TV-as-a-monitor market, Vega serves over 80% of the current market well. When you add in the FreeSync advantage, I see a big potential market for Vega.
 
If you give any shit about performance in an FPS, you are running at whatever frame rate gives the lowest input lag - which is normally uncapped, or capped based on the monitor technology (like maximum refresh rate minus 2 on a G-Sync monitor).

If you can do that at maximum in-game settings, all the better. There should be no variable frame rate, which makes G-Sync/FreeSync pointless. The smoothness debate in multiplayer FPS games is pretty moot.

I know that isn't how HardOCP works or tests, but it is how to do it.

*Shrug* If all you care about is your fps in your FPS, then you are doing it wrong. (Games are supposed to be fun, which is why I could not care less about "modern" multiplayer games.)
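For anyone who wants the capping rule from the quoted post spelled out, here is a minimal sketch. The refresh rates are just example values and this is my own illustration, not [H]'s methodology:

```python
# A minimal sketch of the capping rule described above (my illustration, not [H]'s
# method): run uncapped on a fixed-refresh panel, or cap at max refresh minus 2
# when a G-Sync/FreeSync panel is in use, to stay inside the variable-refresh range.
from typing import Optional

def frame_cap(max_refresh_hz: int, variable_refresh: bool) -> Optional[int]:
    """Return an fps cap, or None for uncapped."""
    if variable_refresh:
        # Staying a couple of frames under the ceiling keeps the sync module engaged
        # instead of falling back to v-sync (or tearing) at the top of the range.
        return max_refresh_hz - 2
    return None  # fixed-refresh panel: leave it uncapped for the lowest input lag

print(frame_cap(100, variable_refresh=True))   # 98 on a 100 Hz VRR panel
print(frame_cap(144, variable_refresh=False))  # None -> uncapped
```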
 
Well, most people can tell from looking at my rigs that I'm mostly an Intel/NV person (presently), but over the decades the bottom line has been what works and what I can afford. People who know me have heard me tell them for the last 2 years to keep an eye on AMD because things are looking promising. I just hope they keep it up so that by the time I actually need to do a new build I can really consider trying them again (probably another 2-4 years).

Regardless of which side someone chooses or why, it's nice to see both reaching such heights with present display tech again. From 1440p to 4K it's felt like single-GPU solutions have been playing catch-up for a while now, too long in fact.

And thanks, Kyle, for the Amazon link. Done and bookmarked. I shop Amazon quite frequently so it'll get used. Most of my rigs were built between them and Newegg.
 
If AMD does not get greedy this could be a fantastic gaming experience for a great price.

As far as the experiment goes, I appreciate the effort and hearing from people who have a lot of experience when it comes to twitch-style gaming. I would have drawn a more definite conclusion if it had been a double-blind test with motion blur off and panels that were more comparable in quality. No experiment like this can be perfect, but I think Kyle did a very good job making it as fair as possible.
 
I have only 1 question about this test, and that question will remain unanswered properly until we know official MSRP:

Why the hell did they use monitors with vastly different panels? I know, I know, these Samsung VA panels are the shit and Samsung are magicians for making VA work at high refresh rates, but why the hell use an LG panel on the G-Sync monitor? Just to make sure you are using the most expensive G-Sync monitor on the market?
 
I have only 1 question about this test, and that question will remain unanswered properly until we know official MSRP:

Why the hell did they use monitors with vastly different panels? I know, I know, these Samsung VA panels are the shit and Samsung are magicians for making VA work at high refresh rates, but why the hell use an LG panel on the G-Sync monitor? Just to make sure you are using the most expensive G-Sync monitor on the market?

My guess... the GPU is lacking in the performance-per-watt metric vs. NVIDIA... so AMD PR is trying to get people to focus on something other than the GPU and performance/watt... simple as that.
 
"Totally unscientific and subjective testing is scorned by many, so if that gets your panties in a bunch, I suggest you stop reading"
Heh, I should've listened...
 
In fairness, you have just been proven wrong by Kyle's test. IMO.

Has he been, though? Chances are, both cards hit FPS higher than the refresh rate of these monitors, so the effects of G-Sync that most people are talking about were not really in play here.
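To put that in concrete terms, here is a rough sketch of the reasoning, with made-up frame rates and an assumed VRR window; adaptive sync only has work to do while the render rate sits inside the panel's range:

```python
# Hypothetical numbers for illustration only: if both cards render above the panel's
# VRR ceiling, the monitor just runs at its maximum refresh either way, so the
# G-Sync/FreeSync hardware isn't the thing being compared.

def sync_in_play(render_fps: float, vrr_min: int = 48, vrr_max: int = 100) -> bool:
    """True only when the frame rate falls inside the variable-refresh window."""
    return vrr_min <= render_fps <= vrr_max

for card, fps in [("Vega", 130.0), ("GTX 1080 Ti", 170.0)]:   # invented Doom frame rates
    print(f"{card}: sync in play = {sync_in_play(fps)}")      # both print False
```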
 
Nice, these are the type of videos that make me love this site. ..............................

Exactly the opposite!
I'm a person who uses numbers and stats. I cannot do anything without them!!!
This video had neither. At the beginning I thought that at least in this video we would witness some FPS measurements in DOOM. Instead, I found myself in shock when I realised that not only did this video have only subjective opinions, we also didn't have any info about the drivers on AMD's setup!!!
Can someone please tell me: what kind of conclusions can someone draw if we don't even know the driver version? :eek:
While it was entertaining to watch Kyle and the other guys express their opinions, this video didn't provide me with anything really useful.
 
Exactly the opposite!
I'm a person who uses numbers and stats. I cannot do anything without them!!!
This video had neither. At the beginning I thought that at least in this video we would witness some FPS measurements in DOOM. Instead, I found myself in shock when I realised that not only did this video have only subjective opinions, we also didn't have any info about the drivers on AMD's setup!!!
Can someone please tell me: what kind of conclusions can someone draw if we don't even know the driver version? :eek:
While it was entertaining to watch Kyle and the other guys express their opinions, this video didn't provide me with anything really useful.

My personal take is that AMD knows the card is slower, and they want to prove that the performance deficit doesn't matter because you can't really tell the difference in a Pepsi Challenge type of scenario. Moreover, a lower-priced setup on the AMD side will serve you just as well and put money in your pocket. As for the AMD drivers, what about them? They are most likely an unreleased version of the drivers. How does that make a difference in regard to the conclusions we can draw concerning Vega?

As for the panel differences, it's easy to ask why you didn't use X panel vs. Y panel. The reality is, sometimes you have to work with whatever is on hand. Sometimes you can't line up everything you want, and if you want to get an article out on time, you need to run with what you have. Lastly, FreeSync and G-Sync monitors can't use the same panel setup because a monitor is either one technology or the other, which prevents a perfect apples-to-apples comparison.
 
It's like if they don't like the result, attack the process. If ya can't attack the process, attack the source. What next?

LOL you hit the nail on the head here.

AMD won this test, simple as that. Imagine how different this thread would be if Nvidia had won.
 
............... As for the AMD drivers, what about them? They are most likely an unreleased version of the drivers. How does that make a difference in regard to the conclusions we can draw concerning Vega?..................

I will answer with a question: if the driver version is of no importance, then why does AMD feel the need to keep it secret?
 
I will answer with a question: if the driver version is of no importance, then why does AMD feel the need to keep it secret?

AMD's gotten pretty secretive over the last 10 years or so. I'm not surprised they did this. It might be a modified version of an existing driver, or a driver that might never see the light of day. I get what you are implying. The performance we saw with this driver may not be reproducible with whatever driver hits when the card officially launches. The implication being that this driver was massaged to put on a show in which image quality was somehow compromised in order to improve the speed of Vega during testing. Does that sound like it's in line with what you are thinking?

I'm not going to say that's impossible. We've seen NVIDIA and AMD pull shit like that in the past. I will say that we all discussed Vega and asked the AMD rep questions about Vega which can't be discussed at this time. None of us saw any image quality issues or differences between setups. These systems were right next to each other for easy visual comparison. I do think that Vega's amazing performance, which seemed to exceed the GeForce GTX 1080 Ti, is limited to Doom, and more specifically Doom under the Vulkan API.

The last thing I'll say, which Kyle alluded to in the video, is that AMD had a different playbook for this which was thrown out, as usual. Kyle said in the video that he pulled the AMD-supplied NVIDIA card and blew both OS installations away to ensure that the test couldn't be skewed by AMD. The only thing he didn't do was install the AMD Vega drivers.
 
Things I took away from this video:

NVIDIA needs to REALLY examine the premium cost of g-sync monitors.
NVIDIA needs to rethink the price of their card lineup at least some.
 
Things I took away from this video:

NVIDIA needs to REALLY examine the premium cost of g-sync monitors.
NVIDIA needs to rethink the price of their card lineup at least some.

G-Sync being locked to hardware that costs about $200 does suck. There is no getting around it. Earlier tests indicated that it was the superior solution between the two, and it might be. Those earlier reviews indicated that this was primarily due to G-Sync being better at lower refresh rates and frame rates. This past weekend, I wasn't able to discern any advantage between the G-Sync and FreeSync monitors specifically. Unfortunately, while FreeSync is an open standard, only AMD supports it, because we only have two GPU manufacturers to speak of. This locks you into one brand or the other unless you want to replace your monitor every time you change video cards. Most people don't do that.

As for pricing, NVIDIA may or may not need to do anything. We don't know what Vega's pricing will be and that card's success will depend on that pricing.
 
AMD's gotten pretty secretive over the last 10 years or so. I'm not surprised they did this. It might be a modified version of an existing driver, or a driver that might never see the light of day. I get what you are implying. The performance we saw with this driver may not be reproducible with whatever driver hits when the card officially launches. The implication being that this driver was massaged to put on a show in which image quality was somehow compromised in order to improve the speed of Vega during testing. Does that sound like it's in line with what you are thinking?

I'm not going to say that's impossible. We've seen NVIDIA and AMD pull shit like that in the past. I will say that we all discussed Vega and asked the AMD rep questions about Vega which can't be discussed at this time. None of us saw any image quality issues or differences between setups. These systems were right next to each other for easy visual comparison. I do think that Vega's amazing performance, which seemed to exceed the GeForce GTX 1080 Ti, is limited to Doom, and more specifically Doom under the Vulkan API.

The last thing I'll say, which Kyle alluded to in the video, is that AMD had a different playbook for this which was thrown out, as usual. Kyle said in the video that he pulled the AMD-supplied NVIDIA card and blew both OS installations away to ensure that the test couldn't be skewed by AMD. The only thing he didn't do was install the AMD Vega drivers.

No, I wasn't thinking of anything as complicated as what you described (*I was impressed by your thoughts, to be honest!!).
I was thinking more that AMD could have used a newer driver version (*compared to the EDIT: FRONTIER Edition, not FOUNDERS as I wrote by mistake :LOL:), but perhaps they were afraid of its performance compared to the 1080 Ti (*don't forget that at first Kyle was planning to test 5 games instead of just DOOM), and they didn't want to reveal to the public that their newest driver can't compete with the competition (*just a thought, but it's AMD's fault because they chose to keep the drivers secret).
 
My personal take is that AMD knows the card is slower, and they want to prove that the performance deficit doesn't matter because you can't really tell the difference in a Pepsi Challenge type of scenario. Moreover, a lower-priced setup on the AMD side will serve you just as well and put money in your pocket. As for the AMD drivers, what about them? They are most likely an unreleased version of the drivers. How does that make a difference in regard to the conclusions we can draw concerning Vega?

As for the panel differences, it's easy to ask why you didn't use X panel vs. Y panel. The reality is, sometimes you have to work with whatever is on hand. Sometimes you can't line up everything you want, and if you want to get an article out on time, you need to run with what you have. Lastly, FreeSync and G-Sync monitors can't use the same panel setup because a monitor is either one technology or the other, which prevents a perfect apples-to-apples comparison.

That was my take as well. The point they're making is still valid, in some respects, for a certain subset of users: if you're playing Doom (and likely a lot of other games), you'll be hard pressed to tell a difference AND you can save $greenbacks on your display. If you're playing something more intensive, the difference may become more clear.

Totally agree on the drivers. What does it matter what driver AMD installed? They surely weren't going to install duds, and no magic driver is going to push hardware beyond what it can do.
 
Totally agree on the drivers. What does it matter what driver AMD installed? They surely weren't going to install duds, and no magic driver is going to push hardware beyond what it can do.

And taking it a step further, magic driver or not, it wouldn't make much sense for them to be able to make a magic driver and not pass it on to the whole stack, especially considering their position.
 
Exactly the opposite!
I'm a person who uses numbers and stats. I cannot do anything without them!!!
This video had neither. At the beginning I thought that at least in this video we would witness some FPS measurements in DOOM. Instead, I found myself in shock when I realised that not only did this video have only subjective opinions, we also didn't have any info about the drivers on AMD's setup!!!
Can someone please tell me: what kind of conclusions can someone draw if we don't even know the driver version? :eek:
While it was entertaining to watch Kyle and the other guys express their opinions, this video didn't provide me with anything really useful.
This is the perfect example of real-world test results; numbers and figures don't tell you shit. The fact that a majority couldn't see the difference tells you that paying $300 for "numbers and figures" means nothing if you can't see a difference. So if you want to get that extra 0.05% and pay $300, that's your prerogative. This video tells me that the "on paper" results don't show the real-world results.
 
This is the perfect example of real-world test results; numbers and figures don't tell you shit. The fact that a majority couldn't see the difference tells you that paying $300 for "numbers and figures" means nothing if you can't see a difference. So if you want to get that extra 0.05% and pay $300, that's your prerogative. This video tells me that the "on paper" results don't show the real-world results.


The extra 300 bucks can make a difference, a difference that can't be seen in a subjective test based on one game. That is why you need both the subjective impressions and the hard data to make an informed decision.

On-paper results don't give you the subjective results, but if there were other games in the test suite, the results of a subjective test could be very different. That is why the hard data is extremely valuable. From the hard data you can extrapolate, to a certain degree, how other apps might perform based on raw performance, although it's not wise to rely solely on that.

Also, we are talking about 2K here; wasn't Vega marketed as a 4K card?

Many things are NOT aligned with AMD's early marketing efforts. Did something change in their view of Vega? Could be.
 
No, I wasn't thinking of anything as complicated as what you described (*I was impressed by your thoughts, to be honest!!).
I was thinking more that AMD could have used a newer driver version (*compared to the EDIT: FRONTIER Edition, not FOUNDERS as I wrote by mistake :LOL:), but perhaps they were afraid of its performance compared to the 1080 Ti (*don't forget that at first Kyle was planning to test 5 games instead of just DOOM), and they didn't want to reveal to the public that their newest driver can't compete with the competition (*just a thought, but it's AMD's fault because they chose to keep the drivers secret).

The system came with the GTX 1080, not the 1080 Ti. Yes, the original idea was to compare 5 games instead of 1, but again, AMD wasn't shooting for the 1080 Ti. I don't think the secret driver was anything other than a pre-release driver that probably doesn't have support for other cards. Companies have all kinds of alpha and beta drivers, often internal, that we'll never see, use, or even know about. I would know, as I've been a part of alpha testing hardware in the past. This driver may be fine for Vega, but it might brick anything else it touches. There could simply be issues with it that make it unsuitable for other cards. Drivers like this are often cobbled together and aren't good enough for production. I don't think it's something to be concerned about.

I've seen Intel do the same thing with chipset drivers by the way.
 
The perfect blind test would have removed the FreeSync vs. G-Sync BS and used the exact same panel on both machines with the exact same settings. That would have truly been a perfect blind test of GPU/system performance, and not a monitor A vs. monitor B test. All the rest of the hardware was actually irrelevant.
 
The perfect blind test would have removed the FreeSync vs. G-Sync BS and used the exact same panel on both machines with the exact same settings. That would have truly been a perfect blind test of GPU/system performance, and not a monitor A vs. monitor B test. All the rest of the hardware was actually irrelevant.

You mean they should not have done the test the way they did it at all, and thrown it out in favor of regular testing? The irony.
 
You mean they should not have done the test the way they did it at all, and thrown it out in favor of regular testing? The irony.


It's called a double-blind test. Not sure if that was done, but in a test like that the users pretty much don't know what they are testing for or what the test machines are. There would be no chance of users expecting differences between the two systems.
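For what it's worth, a double-blind pass could look something like the sketch below; this is only my assumption of how it might be run, not how [H] actually did it. The label-to-system key is generated up front and nobody handling the participants looks at it until all preferences are recorded:

```python
# Sketch of a double-blind assignment (hypothetical; not HardOCP's actual procedure).
# Participants and the person recording answers only ever see the labels "A" and "B".
import random

SYSTEMS = ["RX Vega + FreeSync", "GTX 1080 Ti + G-Sync"]

def blind_key(seed=None):
    """Randomly map the anonymous labels to the two test systems."""
    rng = random.Random(seed)
    order = SYSTEMS[:]
    rng.shuffle(order)
    return {"A": order[0], "B": order[1]}

key = blind_key()          # sealed away until the session is over
preference = "A"           # a participant only reports a label, never a brand
# After all results are in, the key is opened to see which GPU "A" actually was:
# print(key[preference])
```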
 
Personally, I hope AMD can come and kick some butt. I like seeing "it's not so simple" results, and the majority of the feedback in the test was great to hear. Most couldn't tell a difference, and that puts them on an even playing field for the majority of gamers. Some will go one way or the other, I guess, depending on other factors. For me, as I have a great monitor, the Acer XB271HU, going with the 1080 Ti was naturally the direction to go. If I were going to replace my monitor and GPU? Then AMD/FreeSync being competitive only helps widen my choices, not hinder them.

There are so many factors, such as certain games favoring AMD or NVIDIA, and discounts/prices/promotions, that the price difference isn't so cut and dried; my 1080 Ti/G-Sync monitor combo is excellent for me, even if the Vega/FreeSync combo is going to be excellent for someone else. All that matters to me is that there is more than one paved road to cruise on, down the road.
 
This is the perfect example of real-world test results; numbers and figures don't tell you shit. The fact that a majority couldn't see the difference tells you that paying $300 for "numbers and figures" means nothing if you can't see a difference. So if you want to get that extra 0.05% and pay $300, that's your prerogative. This video tells me that the "on paper" results don't show the real-world results.

Totally disagree. Exactly the opposite!!
Check out JosiahBradley's post #34 for why:
I like the video style and the testing being done in a blind manner, but as has been mentioned before, Doom is never going to dip below 100 fps on at least a 1080 Ti. For me it stays closer to 200 fps. So even if Vega were half as powerful, both systems would display the equivalent of a locked 100 Hz, as if v-sync were on, with the only difference being possible input lag between FreeSync and G-Sync but no visual differences. If the experience was the same, that's because it was the same, not because one was faster, etc. Without a frame-time graph we only know that the limiting factor was the slow panels. I do, however, enjoy the format regardless of this and like hearing from real gamers. Hopefully a full review will be out soon, if AMD doesn't delay Vega until 2019.
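To illustrate the frame-time point in that quote, here is a tiny sketch with invented frame times (a real capture would come from a frame-time logging tool such as PresentMon or OCAT, not from this script). The average fps can sit well above the panel's 100 Hz while the per-frame data is what shows whether either card ever dipped below it:

```python
# Invented frame times (ms) purely for illustration; real data would come from a
# frame-time capture tool, not from this script.
frame_times_ms = [5.2, 5.4, 6.1, 5.0, 11.8, 5.3, 5.1]

panel_hz = 100
budget_ms = 1000 / panel_hz                              # 10 ms per refresh at 100 Hz

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
slow_frames = [t for t in frame_times_ms if t > budget_ms]

print(f"average: {avg_fps:.0f} fps")                     # ~159 fps, comfortably above 100
print(f"frames slower than the {panel_hz} Hz panel: {len(slow_frames)}")  # the dip the average hides
```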
 
So that was with default settings on both machines.

AFAIK*, on Nvidia the default is still v-sync OFF (and probably the same with AMD), so when you go over the VRR range, smoothness suffers (and there is tearing too, but it's hard for most people to see it at high fps/Hz, especially in a game like Doom). I'm pretty sure the 1080 Ti must have been going past 100 fps a lot of the time there, much more often than Vega anyway. I would have preferred Vega too.

*This is what happens for me whenever I upgrade my drivers using the "clean install" button or when I set up a new system with G-Sync.
 
So that was with default settings on both machines.

AFAIK, on Nvidia the default is still v-sync OFF (and probably the same with AMD), so when you go over the VRR range, smoothness suffers (and there is tearing too, but it's hard for most people to see it at high fps/Hz, especially in a game like Doom). I'm pretty sure the 1080 Ti must have been going past 100 fps a lot of the time there, much more often than Vega anyway. I would have preferred Vega too.

We're back to discrediting the test or the process? The settings were per Nvidia, so if you think they messed up...?
 
So that was with default settings on both machines.

AFAIK*, on Nvidia the default is still v-sync OFF (and probably the same with AMD), so when you go over the VRR range, smoothness suffers (and there is tearing too, but it's hard for most people to see it at high fps/Hz, especially in a game like Doom). I'm pretty sure the 1080 Ti must have been going past 100 fps a lot of the time there, much more often than Vega anyway. I would have preferred Vega too.

*This is what happens for me whenever I upgrade my drivers using the "clean install" button or when I set up a new system with G-Sync.
V-Sync is on by default when you install the drivers with a G-Sync display attached, and G-Sync is enabled for full-screen applications. I always install the drivers with the clean install option, and this is the behavior I see.
 
V-Sync is on by default when you install the drivers with a G-Sync display attached, and G-Sync is enabled for full-screen applications. I always install the drivers with the clean install option, and this is the behavior I see.

OK, not sure why it doesn't for me then. It also doesn't for some other people; remember on the Nvidia forums when people were up in arms because they were seeing a bit of tearing with G-Sync monitors? The issue was exactly this: v-sync off with G-Sync on and high-framerate games (and of course those people had not read the driver changelog).
 
The extra 300 bucks can make a difference, a difference that can't be seen in a subjective test based on one game. That is why you need both the subjective impressions and the hard data to make an informed decision.

On-paper results don't give you the subjective results, but if there were other games in the test suite, the results of a subjective test could be very different. That is why the hard data is extremely valuable. From the hard data you can extrapolate, to a certain degree, how other apps might perform based on raw performance, although it's not wise to rely solely on that.

Also, we are talking about 2K here; wasn't Vega marketed as a 4K card?

Many things are NOT aligned with AMD's early marketing efforts. Did something change in their view of Vega? Could be.
Totally disagree. Exactly the opposite!!
Check out JosiahBradley's post #34 for why:

That's why this test was a great example of "was it worth $300 extra?" In most cases here, people said it wasn't worth the difference. I'm not a fanboi here; I'm just going off the results of what real-world players said. Now, I'm not going to say YES, one is better than the other, based on 1 game and one instance, but Kyle took it upon himself to set the test up this way. Kyle did this, not Nvidia, not AMD. Numbers are numbers on paper; I look at both the raw data and the real-world results... like I said before, I'm glad they did this.


Sith'ari - I'm not sure how you can say this has zero merit at all. If that's the case, then I'm sorry, but you are completely wrong. Like Kyle stated at the start, this isn't the only test they are going to run, but it sure as hell is a good start at showing real-world results. You can't take raw numbers and say "here you go, this is what's better" solely off canned results. This type of test shows the authenticity of how people feel.
 
OK, not sure why it doesn't for me then. It also doesn't for some other people; remember on the Nvidia forums when people were up in arms because they were seeing a bit of tearing with G-Sync monitors? The issue was exactly this: v-sync off with G-Sync on and high-framerate games (and of course those people had not read the driver changelog).
Could be the .inf isn't consistent among the varying hardware, but I believe you. I'm sure that if there was tearing on one machine and not the other it would have been brought up by the people testing them.
 
Thanks for the effort, but it has absolutely no value for me. I was excited to hear the opinions of seasoned gamers, but they didn't have a chance to form one, tbh. I understand the restrictions you were under, but it all just looked like a reality-TV kind of ad for AMD. 1 game? Vulkan only? Motion blur??? Really? Not [H]ard enough for me.
 
Could be the .inf isn't consistent among the varying hardware, but I believe you. I'm sure that if there was tearing on one machine and not the other it would have been brought up by the people testing them.

Yea but at 150-200fps in Doom I think most people won't notice tearing :p
 
Correct me if I am wrong here:

The $200 G-Sync tax we have all been assuming doesn't seem correct for ultrawides. In fact, the G-Sync tax on an ultrawide monitor is actually closer to $500...

That changes the entire argument.
 
Correct me if I am wrong here:

The $200 G-Sync tax we have all been assuming doesn't seem correct for ultrawides. In fact, the G-Sync tax on an ultrawide monitor is actually closer to $500...

That changes the entire argument.
Well you would have to also factor in other variables, like the cost of the actual panel being used. VA panels are typically cheaper than IPS to begin with.
 