Blind Test - RX Vega FreeSync vs. GTX 1080 TI G-Sync

G-Sync being locked to hardware that costs about $200 does suck. There is no getting around it. Earlier tests indicated that it was the superior solution of the two, and it might be. Those earlier reviews indicated that this was primarily due to G-Sync being better at lower refresh rates and frame rates. This past weekend, I wasn't able to discern any advantage between the G-Sync and FreeSync monitors specifically. Unfortunately, while FreeSync is an open standard, only AMD supports it because we only have two GPU manufacturers to speak of. This locks you into one brand or the other unless you want to replace your monitor every time you change video cards. Most people don't do that.

As for pricing, NVIDIA may or may not need to do anything. We don't know what Vega's pricing will be and that card's success will depend on that pricing.


Isn't Intel going to start supporting freesync?
 
Isn't Intel going to start supporting freesync?


They are supposed to with their next-gen IGP, but how useful that will be is the question, since it's an IGP ;)
 
Would you rather have a 120 FPS card with a "good" picture, or a 100 FPS card with a "great" picture? FPS is only a small part of the whole; it seems like many treat it as the be-all and end-all of stats.
 
Eh, I'd say the opposite: the worse the GPU, the more useful VRR is.


Up to a certain point; below 60 FPS the benefits of adaptive sync tech really diminish. Most gaming monitors right now natively run at 60 Hz. With an IGP at resolutions above 1080p, you won't be able to get 60 FPS for the most part. Of course, if someone is going to spend money on a monitor above 1080p and run games at that resolution, I sure hope they spend the 100 or 150 bucks to get a discrete GPU.
 
Would you rather have a 120 FPS card with a "good" picture, or a 100 FPS card with a "great" picture? FPS is only a small part of the whole; it seems like many treat it as the be-all and end-all of stats.

This argument only makes sense if you plan to play one game for the entirety of your ownership of that card. Even then, there wasn't a unanimous verdict on whether one *-sync was better than the other. I saw the end result as more evidence that both solutions are viable alternatives that improve the experience.
 
This argument only makes sense if you plan to play one game for the entirety of your ownership of that card. Even then, there wasn't a unanimous verdict on whether one *-sync was better than the other. I saw the end result as more evidence that both solutions are viable alternatives that improve the experience.
I was just making a blanket statement about FPS not always being what it seems. Image quality is king for me, and anything over a certain FPS/settings threshold is without merit. "All settings being equal" (like that ever happens), I will take better image quality over higher FPS any time, as long as it won't affect gameplay.
This holds true no matter what game you use it on. Usually, while the FPS may change from game to game, the image quality will generally stay the same. Card A might be faster in FPS in one game while card B may be faster in others, but if card A has better image quality, it will always have the better image regardless of FPS.
 
I actually was pleasantly surprised by the test, but I must say I think the [H] influence really made it what it was. I would not have been so impressed with AMD's approach to it.

Overall I thought it was very well put together by Kyle, thanks for this video.

I think for what it was, it was explained very well by Kyle and should just be taken as it is; it's also just one game, and it's open to interpretation. Considering the pressure some companies can put on review sites, I think it was very level-headed and not influenced in any way. Total hats off on that one. That takes some character.

I think the end result was quite interesting, but I'm wondering if this wasn't more of a G-Sync vs. FreeSync result. Which is entirely defensible, as that's what I would expect; the best gaming experience from both camps would not have been achieved on traditional monitors.

I don't think it says much about RX Vega, except that it games well enough at a fairly standard resolution, in a single game. I think there's some good news and bad news in this test. I don't think RX Vega will hold a candle to NVIDIA this time around, and we'll get yet another card that can't rival the other team's flagship, which is a bit of a shame, and I'd consider myself a pretty avid NVIDIA fan, as my main gaming GPU has always been from that camp.

Looking forward to the full review (later this month, perhaps?).
 
Only one thing is certain. With all this marketing smoke and mirrors the actual reviews are going to be hilarious.
 
Only one thing is certain. With all this marketing smoke and mirrors the actual reviews are going to be hilarious.

Yeah, it's going to be interesting.

I'm expecting them to be burned at the stake honestly, after all the frustration especially.
 
Yea but at 150-200fps in Doom I think most people won't notice tearing :p
That's not how it works.
The higher the framerate, the more tearing you get. For every multiple of the refresh rate that you cross, you get an extra tear line added on-screen.
What does change is that the gap between tears gets smaller at higher framerates though.
Eventually when the framerate gets high enough (maybe 10x the refresh rate) it just looks like the image is warping when you look around, instead of tearing.
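
To put some rough numbers on that, here's a back-of-the-envelope sketch (my own illustration, not from the post; the 100 Hz refresh and 1440-line panel height are assumptions matching the monitors in the test):

[CODE=python]
# Rough illustration of how tear count and tear spacing scale with framerate
# when V-Sync is off. Refresh rate and panel height are assumed values.

REFRESH_HZ = 100      # assumed: the 100 Hz ultrawides used in the blind test
PANEL_LINES = 1440    # assumed: vertical resolution of a 3440x1440 panel

for fps in (100, 150, 200, 1000):
    tears_per_refresh = fps / REFRESH_HZ           # roughly one more tear per refresh-rate multiple crossed
    gap_lines = PANEL_LINES / tears_per_refresh    # average spacing between tear lines shrinks as fps rises
    print(f"{fps:4d} fps -> ~{tears_per_refresh:.1f} tears per refresh, ~{gap_lines:.0f} lines apart")
[/CODE]

At ten times the refresh rate that works out to a tear roughly every 144 lines, which is why it starts to read as warping rather than distinct tears.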

I'm not sure why anyone here is suggesting that VRR and V-Sync should be disabled.
How are you supposed to make a good comparison when the game is stuttering and the screen is tearing?

The flaw with this test was not choosing settings demanding enough to keep the fastest GPU below 100 FPS at all times (since they were 100 Hz displays), not that VRR was being used.
 
Yeah, it's going to be interesting.

I'm expecting them to be burned at the stake honestly, after all the frustration especially.

They've hidden any kind of performance metrics and are spinning it [H]ard before it even launches. It's going to be an epic crash and burn.
 
Yeah, it's going to be interesting.

I'm expecting them to be burned at the stake honestly, after all the frustration especially.
The Vega FE cards weren't even sampled to the press. It was unprecedented. It will be very interesting to see how AMD handles the media with this launch. They tend to be *very* controlling about who gets to review their cards.
 
This is important info, Kyle: the 1080Ti is constrained with V-Sync on. With ultra settings at 3440x1440 it can do 150+ FPS, but with V-Sync on it only does 100 FPS, which is on par with Vega and a 1080. Essentially the 1080Ti is being tested with the brakes on.
100Hz panel with G-Sync. That is how NVIDIA wanted it done. I understand your issue. And to be quite frank, you guys are making mountains out of molehills.

I suggest you wait for reviews.
 
Seriously, who cares about 150 FPS on a 100Hz monitor? You want V-sync on to avoid screen tearing above the max refresh.
 
I really like this style of review. It sheds some light on what people are looking for in a pleasant gaming experience. It looks like a PITA to do, but that's why we read [H]: doing the stuff the [H]ard way for more meaningful results.
 
Just FYI, there is increased input delay when V-Sync is on and the framerate is above the max refresh rate of the monitor, for both G-Sync and FreeSync; that's why most people cap their framerate below the max refresh rate.
I wasn't aware of this, thanks. I just capped my FPS in MSI Afterburner.
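
A minimal sketch of that trade-off, assuming a 100 Hz panel and a two-frame render queue (the queue depth is my assumption; actual behaviour varies by game and driver):

[CODE=python]
# Illustration only: with V-Sync on and the GPU outpacing the panel, finished
# frames sit in a render queue, so each displayed frame is older than it needs
# to be. Capping the framerate just below the refresh rate keeps you inside
# the VRR window and empties the queue.

REFRESH_HZ = 100
REFRESH_MS = 1000 / REFRESH_HZ        # 10 ms per refresh on a 100 Hz panel
QUEUE_DEPTH = 2                       # assumed pre-rendered frame queue depth

def extra_lag_ms(fps):
    if fps < REFRESH_HZ:              # below the ceiling: VRR paces every frame, nothing queues
        return 0.0
    return QUEUE_DEPTH * REFRESH_MS   # at or above it: each queued frame waits a full refresh

for fps in (150, 100, 97):
    print(f"{fps:3d} fps -> ~{extra_lag_ms(fps):.0f} ms of added input lag")
[/CODE]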
 
The impression I got was that AMD and Vega are better at handling the Vulkan API than NVIDIA is. FreeSync vs. G-Sync are so close that you wouldn't really be able to tell them apart when the frame rates you are getting are sufficient. The difference between the two systems was minimal, but clear: the Vega system felt snappier, but you couldn't say "hey, it gets xx frames more than the NVIDIA system."



Yes he did. I was shocked to learn that Vega was in System 2.

That was awesome. I, for one, never stare at the frame rate counter when I play games. I check out reviews of hardware from different review sites, then make my purchase. So far I have been happy with the systems I have had.
 
Vega will be fine, although I am sticking with my 2x Sapphire Furies that work well at 4K. (One works pretty well at 4K on its own too, but it depends on the game.) It is incredible to have full 10-bit hardware support right out of the box; everything looks sharp, bright, and clear with AMD. I do not have a FreeSync monitor, but I am content with the Samsung 4K 28-inch monitor I bought in 2015, before they released the FreeSync-enabled model.
 
Well you would have to also factor in other variables, like the cost of the actual panel being used. VA panels are typically cheaper than IPS to begin with.

It's not that I am surprised at a $500 difference between the monitors instead of the usual $200.

What I am surprised at is that AMD claims their setup is cheaper by $300 while the monitors themselves are cheaper by $500; there is $200 missing somewhere.

I am guessing that it all went into the GPU, and that could change the argument quite dramatically. It may give us lower and upper bounds on possible Vega pricing ($700 ~ $899, depending on what 1080 price you use).
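
Spelling that arithmetic out as a quick sketch (the monitor and GTX 1080 prices are figures I'm assuming; only the $300 setup-savings claim comes from AMD):

[CODE=python]
# Back-of-the-envelope bound on Vega pricing from AMD's "$300 cheaper setup" claim.
# Setup = GPU + monitor. If the FreeSync monitor is ~$500 cheaper but the whole
# setup is only $300 cheaper, the remaining $200 has to sit on the GPU side.

CLAIMED_SETUP_SAVINGS = 300          # AMD's claim for the FreeSync/Vega setup
MONITOR_GAP = 1300 - 800             # assumed G-Sync vs. FreeSync monitor prices

vega_premium = MONITOR_GAP - CLAIMED_SETUP_SAVINGS   # $200 lands on the GPU

for gtx_1080_price in (500, 699):    # assumed street price vs. launch MSRP
    print(f"1080 at ${gtx_1080_price} -> implied Vega at ${gtx_1080_price + vega_premium}")
[/CODE]

Which lines up with the $700 ~ $899 range above.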
 
Using monitors with different panels, it would make sense to leave the monitor price out of the comparison.

Yeah, but AMD tries too hard to put the monitor into the comparison along with their GPUs!!
Why is something like that being done now, with the imminent RX VEGA GPUs? I'm wondering... during the Fury X's period, weren't there any FreeSync monitors on the market for AMD to combine with their GPUs? :p
What has changed now, besides the obvious? :yawn: (a.k.a. AMD's GPUs cannot compete directly with the competition by themselves anymore, so new marketing tricks have to be invented.)
 
Yeah, but AMD tries too hard to put the monitor into the comparison along with their GPUs!!
Why is something like that being done now, with the imminent RX VEGA GPUs? I'm wondering... during the Fury X's period, weren't there any FreeSync monitors on the market for AMD to combine with their GPUs? :p
What has changed now, besides the obvious? :yawn: (a.k.a. AMD's GPUs cannot compete directly with the competition by themselves anymore, so new marketing tricks have to be invented.)

Everyone uses marketing tricks. Nvidia does it as well, all the damn time.
 
Tell you what, I'll buy RX Vega and eat all my words if it wins a blind test against a 1080Ti on the same panel type (monitor) with G-Sync/FreeSync and V-Sync disabled. Kyle's work is pure entertainment and I like that; it reminds me of the new Chevy Silverado commercials, when the truck is in camo and the guy rips the front cover off and it's a Chevy beneath. It doesn't mean the product is better than the competitor's; it's purely perception.
 
Yeah, but AMD tries too hard to put the monitor into the comparison along with their GPUs!!
Why is something like that being done now, with the imminent RX VEGA GPUs? I'm wondering... during the Fury X's period, weren't there any FreeSync monitors on the market for AMD to combine with their GPUs? :p
What has changed now, besides the obvious? :yawn: (a.k.a. AMD's GPUs cannot compete directly with the competition by themselves anymore, so new marketing tricks have to be invented.)


No, I believe the whole FreeSync thing was just getting off the ground with the Fury X at the time. As far as marketing goes, it's pretty important for all the companies involved, including AMD, that FreeSync shows no difference to G-Sync, and quite frankly, as a consumer, it should be just as important to everyone to get rid of the G-Sync price premium when FreeSync has been shown to be just as good.

I'd love to buy a new monitor that has these features, but I refuse to be locked into a specific brand or pay a premium for a product that is unnecessary, be it NVIDIA or AMD.


It's not that I am surprised at a $500 difference between the monitors instead of the usual $200.

What I am surprised at is that AMD claims their setup is cheaper by $300 while the monitors themselves are cheaper by $500; there is $200 missing somewhere.

I am guessing that it all went into the GPU, and that could change the argument quite dramatically. It may give us lower and upper bounds on possible Vega pricing ($700 ~ $899, depending on what 1080 price you use).

They most likely went by the MSRP prices, since the FreeSync monitor shows a normal price of $800 on Amazon and Newegg.
 
Tell you what, I'll buy RX Vega and eat all my words if it wins a blind test against a 1080Ti on the same panel type (monitor) with G-Sync/FreeSync and V-Sync disabled. Kyle's work is pure entertainment and I like that; it reminds me of the new Chevy Silverado commercials, when the truck is in camo and the guy rips the front cover off and it's a Chevy beneath. It doesn't mean the product is better than the competitor's; it's purely perception.

Disabling vsync would be a bad idea. Introducing screen tearing would alter people's perceptions of the systems.
 
No, I believe the whole FreeSync thing was just getting off the ground with the Fury X at the time. As far as marketing goes, it's pretty important for all the companies involved, including AMD, that FreeSync shows no difference to G-Sync, and quite frankly, as a consumer, it should be just as important to everyone to get rid of the G-Sync price premium when FreeSync has been shown to be just as good.
I'd love to buy a new monitor that has these features, but I refuse to be locked into a specific brand or pay a premium for a product that is unnecessary, be it NVIDIA or AMD.
They most likely went by the MSRP prices, since the FreeSync monitor shows a normal price of $800 on Amazon and Newegg.

I don't remember, back in the Fury X's period, AMD stating that "we won't tell you the price of our GPU, BUT IF you combine a Fury X GPU with a FreeSync monitor, the cost will be $200...$300...$400 cheaper than the competition." That's what they are telling us now with RX VEGA.
So, what does this mean?
1) Either FreeSync monitors were priced equally with the G-Sync ones during the Fury X's period (which of course isn't true),
2) or AMD did horrible marketing during the Fury X era,
or...
3) something is not right (price, performance, a combination of the two... I don't know yet) with the current line of AMD's GPUs, so they are trying to divert attention away from the GPU itself as a stand-alone product.
At last we will find out what is going on... a few days left, but I'm running out of patience to be honest. We've spent an entire year discussing rumours & hype about RX VEGA!! Can't stand it anymore!! :wacky:
 
They most likely went by the MSRP prices, since the FreeSync monitor shows a normal price of $800 on Amazon and Newegg.

That's actually where I got the $500 from: the G-Sync version is $1300, barring AOC's version at $900. The $1300 monitor leaves $200 missing, and the $900 G-Sync would put Vega at $300 using the current 1080 MSRP, an EXTREMELY unlikely proposition, and I don't think AMD would use that particular model either.

PS: The AOC G-Sync is actually a bit of a shocker. I found the $500 G-Sync tax to be completely absurd; $100 is much less so...
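
And the same hedged arithmetic with the $900 AOC panel swapped in (assumed prices again), which is where that $300 figure comes from:

[CODE=python]
# Same sketch as before, swapping in the $900 AOC G-Sync monitor.
monitor_gap = 900 - 800                      # only $100 between the two monitors
implied_vega = 500 + (monitor_gap - 300)     # = $300 with a $500 GTX 1080, hence "extremely unlikely"
print(f"Implied Vega price: ${implied_vega}")
[/CODE]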
 
It's hard to predict where G-Sync prices will go; each monitor has its own microcosm of supply and demand affecting prices. However, G-Sync should get cheaper (relative to the lightweight hardware used in FreeSync monitors) over time as both a) more modules are manufactured and b) improvements that make the modules less expensive to manufacture are implemented.

At least manufacturers are half-assing FreeSync monitor sync ranges less often lately.
 
The largest problem with G-Sync, though, is the variety of monitors that use it.

Currently the selection is pitiful compared to FreeSync. The poor FreeSync ranges only seem to plague the 60~75 Hz panels; the 120 Hz+ ones still look pretty good.
 
Well, I'd guess that monitor manufacturers' market research indicates that many/most gamers using higher-end GPUs aren't interested in monitors using crappy panels, outside of twitch gamers who sacrifice everything for the lowest persistence possible (otherwise, even crappier TN panels).

That would fit manufacturers limiting G-Sync to the higher-end monitors.
 
I was talking to Dan/Rix tonight and he made a point I think I agree with: 100Hz is the sweet spot for most enthusiast consumers. That, in league with a frame-delivery technology, is a huge step in gaming immersion. And losing yourself in the game is really what it is all about. Of course, room-scale VR will put all that to shame.
 