AMD RX Vega FreeSync vs. NVIDIA GTX 1080 Ti G-Sync Coming Soon

Not in a direct way at least.

The reason I said what I said is that, if I were to find a perfect monitor for my gaming and everyday needs, that monitor currently only exists with FreeSync, not G-Sync, meaning that if I wanted that technology, I'd have to choose AMD right now. But their most powerful GPU was EOL'd a while ago, the rest of their current GPUs are hard to come by due to the mining craze, and their next top dog is only matching what USED to be nV's top dog.

I am using [H]'s TW3 benchmark as a basis for the kind of performance I am looking for: 60 fps minimum at ultra settings. The 1080 manages 60 fps average, and the 1080 Ti is the GPU that comes closest to fitting the bill, but AMD has had nothing close to it, and this Vega doesn't look like it's going to cut it any time soon.

Hence the extra disappointment in Vega. My ideal monitor is in their camp.

It would be nice for nVidia to support FreeSync, but with this current state of affairs, it could very well be the last nail in RTG's coffin because that's the only remaining advantage they have (abundance of cheaper FreeSync monitors). Once that's gone, there is nothing going for them.

And it'd probably be in nV's best interest NOT to support FreeSync.

I'll admit, I'd be much more surprised at nV supporting FreeSync than at RX Vega beating a 1080 Ti.
 
Not in a direct way at least.

The reason I said what I said is that, if I were to find a perfect monitor for my gaming and everyday needs, that monitor currently only exists with FreeSync, not G-Sync, meaning that if I wanted that technology, I'd have to choose AMD right now. But their most powerful GPU was EOL'd a while ago, the rest of their current GPUs are hard to come by due to the mining craze, and their next top dog is only matching what USED to be nV's top dog.

I am using [H]'s TW3 benchmark as a basis for the kind of performance I am looking for: 60 fps minimum at ultra settings. The 1080 manages 60 fps average, and the 1080 Ti is the GPU that comes closest to fitting the bill, but AMD has had nothing close to it, and this Vega doesn't look like it's going to cut it any time soon.

Hence the extra disappointment in Vega. My ideal monitor is in their camp.
My Fury X Crossfire experience on three 1440p FreeSync HP Omen 32" monitors has been nothing short of excellent over the last nine months or so (since I bought my second card).

I think it's fair to expect the same crossfire experience from Vega. That's an option you might consider.


Also consider that the whole point of FreeSync (and G-Sync) is that you don't need 60 FPS or more to "feel" smooth. In my experience with my Omen monitors, even down to the FreeSync minimum of 48 Hz it feels buttery smooth.
 
AIO on Fury X was one of the primary reasons why I didn't go near it. I don't trust water.

AIO on Vega will probably be the same, but after using SLI, I am now also wary of using mGPU again, so I'd like to avoid that as much as possible...

If I still trusted water and had more free time to tinker with mGPU setups, I'd probably have made the switch. I came close to Crossfiring Furys instead of going with a single 1080, but I decided on the 1080 because it is a single GPU.
 
Or implement freesync in their gpu driver software. So Nvidia cards would work with freesync monitors.
Yeah that too could happen....but probably never will.

NV has a lot of mindshare tied up in G-Sync, and they won't let something like that go easily. Monitor manufacturers will have to start abandoning G-Sync before they do, I believe.

it could very well be the last nail in RTG's coffin because that's the only remaining advantage they have (abundance of cheaper FreeSync monitors). Once that's gone, there is nothing going for them.
Nah. Even if Vega turns into a total bust, they're going to try to stick it out with GCN and ho-hum market share until Navi comes out. If Navi turns into a bust... yeah, RTG might be finished in the dGPU market. They'll maybe only do APUs.

AIO on Fury X was one of the primary reasons why I didn't go near it. I don't trust water.
Reliability seems to be OK for AIOs these days. The bigger issue is making one fit in some cases, IMO. Air is still simplest though, so it's understandable if you'd still want air only.
 
Also, do you plan to do some benchmarking with the Beyond3D suite? If you don't have it, why not ask for it, since AnandTech and PCGH got it.
So, a wonderfully attractive wrapper, awesome theoretical performance, but it blows up when used?

All sound and no Fury (hhahhhaha!!)

I do like me some RX-7s though, to be honest; built 3 SX13s from the ground up: one CA18, one NA, and one KA24DET. Damn, I love Japan!
 
Given that this is a FreeSync vs. G-Sync test, and given the demonstrations they've been doing so far, I have the sinking suspicion that AMD is pushing the total cost vs. gaming experience angle. Check out Adored's latest video on that. My guess is that AMD wants us to see that we can't tell the difference between the two while playing, and given the cheaper cost of FreeSync, you'll save a little money while getting the same "experience". So we can kind of guess that the price of an average FreeSync monitor plus Vega is less than the price of a comparable G-Sync monitor plus a GTX 1080.

Using fuzzy math here, say that a G-Sync 4K monitor is $300 more than a comparable FreeSync model. Then Vega can be priced $200 more than a GTX 1080 at, say, $700, and you still save $100 while getting the same experience. Therefore, my guess on Vega reference pricing is somewhere in the neighborhood of $649.99 to $699.99. I hope I'm wrong.
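Just to make that fuzzy math explicit, here's a quick sketch of the totals. All the prices are the same hypothetical numbers from above, nothing official:

```python
# Rough platform-cost comparison using the hypothetical figures above.
gtx_1080 = 500        # assumed GTX 1080 price
gsync_premium = 300   # assumed G-Sync premium over a comparable FreeSync 4K monitor
rx_vega = 700         # guessed RX Vega price ($200 over the GTX 1080)

nvidia_total = gtx_1080 + gsync_premium  # GPU + monitor premium
amd_total = rx_vega                      # FreeSync premium treated as zero

print(nvidia_total - amd_total)          # 100 -> the "$100 saved" above
```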

They did it with Fury, and they'll do it again. Ryzen has nothing to do with Vega, people; the margins are not close.
 
and if it mines digital currency efficiently....o_O

I know FE is not RX, but someone posted the 1k+ FE hashing between 30 and 32 MH/s at 300 W+. It would be an inefficient mining card; hell, the 1070 beats it hands down in efficiency. But maybe RX will mine well... I guess we shall wait and see.
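For what it's worth, here's a back-of-the-envelope efficiency comparison. The Vega FE figure is from the post above; the GTX 1070 numbers are just ballpark values I'm assuming here, so take them loosely:

```python
# Rough hashrate-per-watt comparison (not measured by me).
def mh_per_watt(hashrate_mh, board_power_w):
    return hashrate_mh / board_power_w

vega_fe = mh_per_watt(31, 300)    # ~30-32 MH/s at 300 W+, per the post above
gtx_1070 = mh_per_watt(27, 150)   # assumed ballpark figures for a GTX 1070

print(f"Vega FE : {vega_fe:.2f} MH/s per watt")   # ~0.10
print(f"GTX 1070: {gtx_1070:.2f} MH/s per watt")  # ~0.18
```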
 
The 1080 is $500.
The 1080 Ti is $700.

Both sell well.

RX Vega will be $600 in my estimation.

They don't have to price it at 1070 levels for it to sell, contrary to some of the comments I've read.

There are those of us with freesync monitors that are eagerly awaiting release.

Or people that flat out hate team Green and want Red to win at any cost.
 
Even if AMD sponsored it, Kyle wouldn't pull any punches or skew anything in their favor.

Honestly, it was the first thing that crossed my mind after the backlash TPU got for their AMD-sponsored review. Anyway, the results should speak for themselves if the testing methodology is fair and sound. Actually, I'm more interested in whether people can tell the difference between G-Sync and FreeSync in its current state.
 
Given that this is a FreeSync vs. G-Sync test, and given the demonstrations they've been doing so far, I have the sinking suspicion that AMD is pushing the total cost vs. gaming experience angle. Check out Adored's latest video on that. My guess is that AMD wants us to see that we can't tell the difference between the two while playing, and given the cheaper cost of FreeSync, you'll save a little money while getting the same "experience". So we can kind of guess that the price of an average FreeSync monitor plus Vega is less than the price of a comparable G-Sync monitor plus a GTX 1080.

Using fuzzy math here, say that a G-Sync 4K monitor is $300 more than a comparable FreeSync model. Then Vega can be priced $200 more than a GTX 1080 at, say, $700, and you still save $100 while getting the same experience. Therefore, my guess on Vega reference pricing is somewhere in the neighborhood of $649.99 to $699.99. I hope I'm wrong.

The most obvious logical explanation for AMD promoting the comparison of (Vega + FreeSync total cost) vs. (1080 + G-Sync total cost) is that their GPU as a standalone will be priced similarly to the competition (perhaps even higher due to the HBM2).
So apparently it's not beneficial for them, from a marketing point of view, to compare only GPU vs. GPU.
 
I would have also compared the list of 4K FreeSync-capable monitors to the puny-in-comparison list of G-Sync monitors, just to show that there is a much greater variety of FreeSync monitors than G-Sync ones.

(of course, some of those FreeSync ranges are horrible, like 56-61 Hz)
 
Hey, this is awesome, Kyle. I always look forward to the two camps going at it and duking it out. Looking forward to what you have to show us all. I gotta say the last 3 or 4 cards have all been Nvidia for me, but I hold no allegiance. I'm always gonna go with the best bang for the buck when it comes to hardware. Now if we could only have some monitors that do both FreeSync and G-Sync so we don't have to choose. Are there any?
 
Hi Kyle, long-time reader here; I check in every day and love the site. I thought I had a forum account from some years ago, but for the life of me I couldn't figure out the username/email, so I just made a new account.

Anyway, I can appreciate the desire to play with unreleased hardware even before a product launch, but I can't help but feel (no disrespect intended) that AMD is trying to pull the wool over people's eyes with these blind tests. I don't think it's right for review sites to go along with this charade. It's awesome that in some way you are still sticking it to them by throwing a 1080 Ti into the mix, but I can't help but feel that this blind test is not in the tradition of hard facts that we have come to love from [H], especially when it comes to a level playing field or full disclosure of the hardware used by a competitor. Again, I understand it might be a fine line with AMD, perhaps especially so given previous dealings with them, but I personally don't like the very dubious route AMD has taken with these tests.

Perhaps I've got this completely wrong, and I look forward to the video to perhaps alleviate some concerns. Again, no disrespect meant; it's just something I wanted to put to paper, so to speak.
 
I must confess to being disappointed that the tests were not done at 4K, but maybe next time.
 
Wake up, damnit, I need information! You can sleep when you're dead!!

Seriously though, bated breath and all that.
 
For transparency, is this an independent test or is this sponsored in any way, shape or form by AMD?
Currently, AMD is not an advertiser on HardOCP. It did not pay for an article. It did buy the equipment. Equipment is going back. AMD is a sponsor of Inside VR, which gets almost zero views through HardOCP. So I will let you make your mind up on what you want to label things. I found out a long time ago that it really makes no difference what I say.

Truth hurts.
 
I really hope you have the newest drivers from AMD that actually treat it as Vega and not just a high-clocked Fury: tile rasterization, etc.
 
Where that video at?

You're awake, you're posting. Who cares about your first cup of coffee? Let's put that video up!
 

Old, OLD article. The problem is that when you drop below a certain refresh rate, you have to frame-double on the LCD. G-Sync has a frame buffer to do the doubling. FreeSync falls back to tearing. But at such low rates, your gameplay is going to suffer anyway.

If your monitor complies with the 2.5x rule for FreeSync 2, your experience will be much more fluid at higher rates. And those monitors are now available.
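For anyone curious how that frame doubling works, here's a minimal conceptual sketch. The 48-144 Hz window is just an example range I picked, and the 2.5x ratio is the rule of thumb mentioned above:

```python
# Conceptual sketch of low-framerate compensation (frame multiplication).
def supports_lfc(min_hz, max_hz, ratio=2.5):
    # Rule of thumb: the VRR window must span at least 2.5x for frame
    # multiplication to work without falling out of range.
    return max_hz / min_hz >= ratio

def effective_refresh(fps, min_hz, max_hz):
    # Repeat each frame enough times that the panel stays inside its window.
    if fps >= min_hz:
        return fps                 # already inside the VRR window
    multiplier = 1
    while fps * multiplier < min_hz and fps * (multiplier + 1) <= max_hz:
        multiplier += 1
    return fps * multiplier        # each frame is shown `multiplier` times

print(supports_lfc(48, 144))           # True (144 / 48 = 3.0)
print(effective_refresh(30, 48, 144))  # 60 -> frames doubled
print(effective_refresh(20, 48, 144))  # 60 -> frames tripled
```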
 
I really hope you have the newest drivers from AMD that actually treat it as Vega and not just a high-clocked Fury: tile rasterization, etc.

There's some debate as to whether or not these are hacked Fury drivers. Looking at how the GPU is broken up, you could assume a high degree of similarity between the two. And from an engineering perspective, trying to save both time and money, a refresh is a lot cheaper than a ground-up implementation.

But claiming it's just a refresh doesn't sound nearly as exciting. So until someone puts a decompiler on the driver and examines the code, it's hard to say.
 
There's some debate as to whether or not these are hacked Fury drivers. Looking at how the GPU is broken up, you could assume a high degree of similarity between the two. And from an engineering perspective, trying to save both time and money, a refresh is a lot cheaper than a ground-up implementation.

But claiming it's just a refresh doesn't sound nearly as exciting. So until someone puts a decompiler on the driver and examines the code, it's hard to say.
AMD made a big deal of how the next-gen compute unit is different from and better than GCN:

Wiki:

AMD began releasing details of their next generation of GCN architecture, termed the 'Next-Generation Compute Unit', in January 2017.[34][37][38] The new design is expected to deliver higher instructions per clock and higher clock speeds, support for HBM2, a larger memory address space, and the High Bandwidth Cache Controller. Additionally, the new chips are expected to include improvements in the rasterisation and render output units. The stream processors are heavily modified from the previous generations to support packed-math Rapid Packed Math technology for 8-bit, 16-bit, and 32-bit numbers. With this there is a significant performance advantage when lower precision is acceptable (for example, processing two half-precision numbers at the same rate as a single single-precision number).

Nvidia introduced tile-based rasterization and binning with Maxwell,[39] and this was a big reason for Maxwell's efficiency increase. In January, AnandTech assumed that Vega would finally catch up with Nvidia regarding energy efficiency optimizations due to the new "Draw Stream Binning Rasterizer" to be introduced with Vega.[40]

It also added support for a new shader stage - primitive shaders.[41][42]
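To make the packed-math point above a bit more concrete, here's a tiny illustration of the data layout. It only shows that two half-precision values fit in the footprint of one single-precision value; it's not how the GPU actually schedules the work:

```python
import struct

# Two FP16 values occupy exactly the 4 bytes of one FP32 value, which is what
# lets packed-math hardware process two of them per 32-bit lane.
half_pair = struct.pack("<ee", 1.5, -2.25)   # 'e' = IEEE 754 half precision
single = struct.pack("<f", 3.14159)          # 'f' = single precision

print(len(half_pair), len(single))           # 4 4 -> same register footprint
print(struct.unpack("<ee", half_pair))       # (1.5, -2.25) round-trips exactly
```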
 
I really hope you have the newest drivers from AMD that actually treat it as Vega and not just a high-clocked Fury: tile rasterization, etc.

I'm excited for sure, and I really do hope something is hidden in the AMD drivers.

Old, OLD article. The problem is that when you drop below a certain refresh rate, you have to frame-double on the LCD. G-Sync has a frame buffer to do the doubling. FreeSync falls back to tearing. But at such low rates, your gameplay is going to suffer anyway.

If your monitor complies with the 2.5x rule for FreeSync 2, your experience will be much more fluid at higher rates. And those monitors are now available.

Regarding G-Sync:
A horrible hack to create sync.
Using general-purpose hardware to do it is just plain wrong, and that's the reason it's pricey...
If they had used a more optimized ASIC, it might have been better.

The end result would have been the same experience for us, but without the $300 premium :)

But maybe I'm just interested in the how and not the result at times :)
 
AMD made a big deal of how the next-gen compute unit is different from and better than GCN:

Wiki:

[redacted for brevity]

Okay let me explain it to you this way.

The last HUGE architecture change from Intel was from the NetBurst P4 to the Core 2.

Everything past Sandy Bridge has been minor improvements, i.e. better scheduling, out-of-order look-ahead steps, better branch prediction, better loop unrolling, better cache storage, bigger caches, better energy usage and states, a better iGPU, AVX extensions, etc.

Together these give you a 20% IPC improvement. But make no mistake, this is not ground-up. This is a revision, or an evolution.

Now, any software written specifically for Sandy Bridge or later will work on Skylake. But the timings and features may not be exploited for the best speed until the software is rewritten to take advantage of the new instructions and timings, like AVX512.

So even though there are INTERNAL improvements, the layout and resource allocation are pretty much the same. Same ROPs, etc. Just as a Sandy Bridge i3 has 2 + 2 cores, a Skylake i3 has 2 + 2 cores.

Until we analyze the code from Fury to Vega, we won't know for sure how similar they are. But getting an initial driver out the door is relatively easy, since it looks like just an upgrade.
 
WHY are these still competing standards? This is something which should be a single ISO or RFC or whatever applies to it, with a committee behind it. When are AMD and nVidia going to stop this type of bullshit?
 
WHY are these still competing standards? This is something which should be a single ISO or RFC or whatever applies to it, with a committee behind it. When are AMD and nVidia going to stop this type of bullshit?

Blame Nvidia. AMD is using the open standard defined by VESA. Nvidia decided to go their own, closed ecosystem, route.
 
WHY are these still competing standards? This is something which should be a single ISO or RFC or whatever applies to it, with a committee behind it. When are AMD and nVidia going to stop this type of bullshit?

Adaptive-Sync is actually an optional VESA standard.
 

Well that's Swedish pricing. You poor Europeans and Aussies get screwed on pricing for consumer electronics.

But I will say, if it really is that much off the mark compared to its performance, then the AMD Radeon division would have been better off not releasing anything at all and sticking to the professional market. Recovering from looking like a fool is much harder than not saying anything at all.
 
Well that's Swedish pricing. You poor Europeans and Aussies get screwed on pricing for consumer electronics.

I didn't post this for RX Vega's price itself. What I care about in this thread is the price difference (still rumours, of course) between RX Vega and the GTX 1080.
Even if the price is in Swedish currency, the price difference between these two is a large one (5,600 SEK for the GTX 1080 vs. the rumoured 9,000 SEK for RX Vega!!!).

EDIT: What I mean by that is that there is a strong possibility that RX Vega's US pricing will be more expensive than the competition's, just as I said in my previous post #114.
 
Too much for the article icon?

(attached image: upload_2017-7-26_13-7-27.png)
 