Blind Test - RX Vega FreeSync vs. GTX 1080 Ti G-Sync

It wasn't running at 4K, though. It was running at the lower 21:9 resolution of 3440x1440; 4K has about 67% more pixels. By extrapolation alone, a GTX 1080 should average over 70 fps with the same settings.
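As a quick sanity check on that pixel math (assuming "4K" means 3840x2160 UHD, and using a purely hypothetical 4K framerate for the extrapolation, not a figure from this thread):

[CODE]
# Pixel counts for the two resolutions (assuming "4K" = 3840x2160 UHD).
uw_pixels = 3440 * 1440    # 21:9 ultrawide: 4,953,600 pixels
uhd_pixels = 3840 * 2160   # 4K UHD:         8,294,400 pixels

ratio = uhd_pixels / uw_pixels
print(f"4K has {ratio - 1:.0%} more pixels")  # ~67%

# Naive extrapolation: if performance scaled perfectly with pixel count,
# a hypothetical 43 fps average at 4K (illustrative only) would become
# roughly 43 * 1.67 = ~72 fps at 3440x1440.
fps_4k = 43
print(f"Estimated 3440x1440 average: {fps_4k * ratio:.0f} fps")
[/CODE]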

3440x1440

I think that's where this comment comes from.

But yeah, next time I'd suggest finding a pair of monitors that use the same panel. With the FreeSync display being a Samsung LTM340YP03 and the G-Sync display an LG LM340WU2-SSA1, the panel is a variable that could have been controlled. That assumes the objective is to look only at FreeSync and G-Sync operation, not necessarily implementation and product line.

It's all a matter of objective. Seeing as this is a Vega preview, it makes me wonder if Vega handles FreeSync in a different manner from current cards. It's hard to say. Seeing as this idea is an extension of what AMD was doing, maybe AMD just thought FreeSync needed promotion, and the versus format is a popular promotion format.

Finding similar panels is a valid point, but does it really matter when one panel is almost twice the price and gives a similar experience?

The impression I got was that AMD and Vega are better at handling the Vulkan API than NVIDIA is. FreeSync and G-Sync are so close that you wouldn't really be able to tell them apart when the frame rates you are getting are sufficient. The difference between the two systems was minimal, but clear: the Vega system felt snappier, but you couldn't say "hey, it gets xx frames more than the NVIDIA system."

Yes he did. I was shocked to learn that Vega was in System 2.

I'd love to see this applied to more games.
 
A while ago, I took the "Pepsi challenge" comparing a GeForce GTX 1080 vs. a GTX 1070 on an Asus G-Sync monitor @ 1600p. The systems were not identical, but comparable.
I couldn't tell a difference.
On a normal monitor (actually an LCD TV with V-Sync on), the 1080 seemed faster.
 
So, an ASUS MX34VQ FreeSync display ($720) and an ASUS PG348Q G-Sync display ($1,300): a $580 difference. Is Vega really going to be $280 more than a 1080?
The MX34VQ uses a Samsung SVA panel, while the PG348Q uses an LG S-IPS panel. A G-Sync monitor using a VA panel at the same resolution and size is closer to $900.
 
Kyle,

Did you run the test as System 1 then System 2 for all testers, or did you have some start with System 2 and then move to System 1? There is a known bias towards the last-experienced option among generally favorable options.

http://faculty.chicagobooth.edu/nicholas.epley/Li&Epley.pdf

Our procedure is very simple. We set the systems up side by side. We let one player start gaming on "System 1," play until comfortable, then move to "System 2." Once they felt they had a good handle on gaming on each system, we ran them back through the level on each machine again.
 
Enjoyed the video and liked the real world results. Can't wait for the lab work.

Damn it, Li & Epley. Well, Kyle did say it was not scientific. ;)
 
I seem to recall a similar outcome during an HOCP AMD event back in 2013 or so. (I was not there, but I watched a video on it.)



It does matter how smooth a game plays; that is definitely not subjective but objective. (This test is not the be-all, end-all, but it is a good example of what is important to folks.)


If you give any shit about performance in an FPS, you are running at whatever gives the lowest input lag, which is normally uncapped, or capped based on the monitor technology (like maximum refresh rate minus 2 on a G-Sync monitor).

If you can do that at maximum in-game settings, all the better. At that point there is no variable framerate, which makes G-Sync/FreeSync pointless. The smoothness debate on FPS multiplayer games is pretty moot.

I know that isn't how HardOCP works or tests, but it is how to do it.
 
Neither of those cards probably ever dipped below 100 fps @ 1440p, which defeats the purpose of G-Sync/FreeSync.

Fact is, when you approach the monitor's refresh rate with G-Sync, you need to make sure you not only properly cap the FPS (in-game > RTSS > NV driver, and any/all must be <100 here) but also have V-Sync set up appropriately (NVCP: on, game: off) a la the Blur Busters guide. Having any of those settings wrong can add tearing or input latency.
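For anyone setting this up, here's a minimal sketch of that checklist in Python; the 100 Hz refresh matches the monitors used here, but the function and names are purely illustrative, not any real tool's API:

[CODE]
# Sketch of the Blur Busters-style G-Sync checklist described above,
# assuming a 100 Hz panel. Names are illustrative, not a real API.
REFRESH_HZ = 100

def gsync_setup_ok(fps_caps, nvcp_vsync_on, ingame_vsync_on):
    """All active FPS caps (in-game, RTSS, driver) must sit below the
    refresh rate; V-Sync should be on in NVCP and off in the game."""
    caps = [c for c in fps_caps if c is not None]
    caps_below_refresh = bool(caps) and all(c < REFRESH_HZ for c in caps)
    return caps_below_refresh and nvcp_vsync_on and not ingame_vsync_on

# In-game cap at 97 fps, RTSS and driver caps unused:
print(gsync_setup_ok([97, None, None], True, False))   # True
# Cap at or above refresh reintroduces tearing/latency risk:
print(gsync_setup_ok([120, None, None], True, False))  # False
[/CODE]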
 
I would use Nvidia to get a 90-120 fps minimum for ULMB in as many games as possible and use G-Sync in every game that can't manage that, but I know I'd pay considerably more for it. I wish 1440p 144 Hz monitors weren't still inflated in price after two years.
 
I have no experience with G-Sync or FreeSync. I'm using an Asus 1920x1200 TN panel, and I just upgraded to a 1080 when my 290 (flashed to 290X) died a couple of months ago. However, all of my reading suggests that Doom is an AMD title, especially if using Vulkan. Can't a 480/580 hang with a 1080 fps-wise when Vulkan comes into play? I understand this was a lot to put together in a short amount of time, and I appreciate the effort. It will be nice to get a better idea across a wider range of titles in DX11, DX12, and Vulkan once this actually releases for testing by reviewers. That will be quite a bit of review work, but I can't help but wonder what playing BF1, ROTR, or a game like Witcher 3 would have done to the results. Keep up the good work, and congrats on being back in AMD's good graces!
 
Whichever way it goes, I'm pleased there is at least a modicum of competition at the (mid?) high end. I mean, it's been nearly four years since nV had competition in this segment.
 
I have no experience with G-Sync or FreeSync. I'm using an Asus 1920x1200 TN panel, and I just upgraded to a 1080 when my 290 (flashed to 290X) died a couple of months ago. However, all of my reading suggests that Doom is an AMD title, especially if using Vulkan. Can't a 480/580 hang with a 1080 fps-wise when Vulkan comes into play? I understand this was a lot to put together in a short amount of time, and I appreciate the effort. It will be nice to get a better idea across a wider range of titles in DX11, DX12, and Vulkan once this actually releases for testing by reviewers. That will be quite a bit of review work, but I can't help but wonder what playing BF1, ROTR, or a game like Witcher 3 would have done to the results. Keep up the good work, and congrats on being back in AMD's good graces!

Witcher 3 is more than likely not played much anymore, and its engine is only used for Witcher 3; nothing else uses it. Vulkan, on the other hand, has gained a good amount of support, with Unity now including it. Frostbite, UE4, and CryEngine titles would be beneficial for judging performance of upcoming titles as well.
 
Nobody would call Doom an AMD title when it was running like crap with OpenGL only. Now all of a sudden it's an AMD title because it's optimized for both manufacturers. I guess the norm is crap optimization for AMD and optimized for Nvidia, huh? That's what passes for a neutral game to people now.

It happens to make good use of AMD cards, end of story. It also makes good use of Nvidia cards; it's just that Nvidia cards on pure specs are weaker, so there is less to make use of beyond DX11.


LOL, no, it's optimized more for AMD cards; it uses AMD intrinsics. Now, is it the fault of the developers or AMD? No. nV didn't have their intrinsics exposed in drivers when the developer was putting them in, so it's nV's fault id didn't optimize for them. But with all that said, the game runs just fine on both vendors' hardware.

"Beyond DX11" is BS, you know that, everyone knows that; nV has no problem with the newer APIs. It's very hard to tailor APIs to a hardware IHV unless that IHV locks the API to their hardware, and that isn't happening with DX or Vulkan, so.....
 
DOOM is just interesting because it is an ungodly (pun intended) well-optimized title for both platforms. Calling it an AMD title is silly.

I would of course look forward to seeing how this card runs other things when not locked to 100 Hz and without things like motion blur, but it is good to know that with a good title it does have the chops.
 
Nobody would call Doom an AMD title when it was running like crap with OpenGL only. Now all of a sudden it's an AMD title because it's optimized for both manufacturers. I guess the norm is crap optimization for AMD and optimized for Nvidia, huh? That's what passes for a neutral game to people now.

It happens to make good use of AMD cards, end of story. It also makes good use of Nvidia cards; it's just that Nvidia cards on pure specs are weaker, so there is less to make use of beyond DX11.

LOL, no, it's optimized more for AMD cards; it uses AMD intrinsics. Now, is it the fault of the developers or AMD? No. nV didn't have their intrinsics exposed in drivers when the developer was putting them in, so it's nV's fault id didn't optimize for them. But with all that said, the game runs just fine on both vendors' hardware.

"Beyond DX11" is BS, you know that, everyone knows that; nV has no problem with the newer APIs. It's very hard to tailor APIs to a hardware IHV unless that IHV locks the API to their hardware, and that isn't happening with DX or Vulkan, so.....

You would be remiss to suggest that either company has optimized for DOOM more than the other. The biggest shooter title of 2017 was not overlooked.
 
Great video and very interesting results. Thanks.

A number of them said one system was closer to what they were used to. I'd be interested to know whether they use FreeSync or G-Sync on their home systems and, if so, whether the system they said was closer to what they were used to matched the one at home.
 
Next topic: who's dropping $1,000 on a monitor? Not this guy.


You're missing the point.

If you'd consider one of these $700 cards and don't have FreeSync or G-Sync, you're leaving a positive experience on the table.

And it'll be a positive experience as long as you are in the synced range! You don't need 90+ fps to feel smooth. The sync tech makes it feel butter smooth all the way down to the minimum of the sync range.

The sync tech changes the rules of the game.
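To make the "synced range" idea concrete, here's a rough Python sketch using a hypothetical 48-100 Hz FreeSync window; the LFC behavior is simplified, but it's the mechanism that keeps things feeling smooth even below the range minimum:

[CODE]
# Rough model of a variable-refresh window, assuming a hypothetical
# 48-100 Hz FreeSync range. LFC (Low Framerate Compensation) is
# simplified here; AMD generally needs the max to be roughly 2x the
# min (often quoted as 2.5x) for LFC to engage.
RANGE_MIN, RANGE_MAX = 48, 100

def effective_refresh(fps):
    if fps > RANGE_MAX:
        return RANGE_MAX              # above range: capped/tearing territory
    if fps >= RANGE_MIN:
        return fps                    # in range: refresh tracks fps exactly
    if RANGE_MAX >= 2 * RANGE_MIN:    # LFC available
        mult = 2
        while fps * mult < RANGE_MIN: # repeat each frame until back in range
            mult += 1
        return fps * mult
    return RANGE_MIN                  # no LFC: falls back to fixed refresh

for fps in (120, 75, 48, 30):
    print(f"{fps} fps -> panel refreshes at {effective_refresh(fps)} Hz")
[/CODE]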


It'd be interesting to repeat the test next week with the sync tech turned off and see if opinions stay the same!
 
I am actually shocked the play is comparable to a 1080 Ti's. I wonder what the results would have been if you had gone with the 1080 like AMD wanted?
 
I am actually shocked the play is comparable to a 1080 Ti's.

I'm not. Doom doesn't stress the GPU as much as some other games, and twitch games like Doom benefit from high FPS. I'm looking forward to tests that do stress the GPUs.
 
A good tease, but this deserves a much more full-fledged test where you also account for low framerates vs. high (unlimited fps cap) vs. the in-between range. Whichever handles low fps better (<60 fps) would be a huge deal to me.
 
A good tease, but this deserves a much more full-fledged test where you also account for low framerates vs. high (unlimited fps cap) vs. the in-between range. Whichever handles low fps better (<60 fps) would be a huge deal to me.
You mean like doing a REVIEW when AMD drops the information embargo/NDA? Good idea. ;)
 
The thing about this is that it makes the 1080 Ti's value moot if you get a better experience out of a weaker GPU with FreeSync than out of a more powerful one without.

This is why I'd like to see a weaker AMD GPU like a 480 go up against Vega, with FreeSync on both.
 
It's like, if they don't like the result, attack the process. If ya can't attack the process, attack the source. What next?

Don't assume that every question is an attack; a lot of the time they're just questions.

Kyle, are you going to have the time to do any VR testing? If not, can I request a follow-up article?

Looking forward to the full review and sincerely hoping to have ATI/AMD/RTG back in the game at the high end.
 
That first picture in the review is a little misleading...



Great review, can't wait to see how it performs in other games.
 
Adaptive Framerate monitors are the shit.

The End.

Seriously, I will never game on something that doesn't have FreeSync or G-Sync again. It's amazing.

I wish there was a way to show people what they're missing! It's a bit hard to grasp never having owned one, lol!
 
I wish there was a way to show people what they're missing! It's a bit hard to grasp never having owned one, lol!

It's really true. I was playing WoWS last night and it was just buttery smooth; I happened to look up at RTSS and saw I was getting 48 FPS. Before I got this monitor, I could feel when it dropped from 60 to 58.
 
Thanks for this experiment, Kyle. However, it's easy to figure out the direction AMD and Kyle are going with this piece, and that's to showcase that you can still have a great gaming experience without the higher frame rates and the higher cost.

But there is still a major hurdle here that will be almost insurmountable: frame rates are king, cost be damned.

Had this been any game other than Doom with Vulkan, I might have taken notice, but it wasn't. The reason Doom was picked is very clear: AMD hardware pushes this title exceedingly well. Had it been any other title, I think the difference would have gone in favor of System 1, which held the Nvidia card.

In fact, a warning to all: in the coming weeks, be very, very wary of reviews done specifically with Doom as a reference game. It's just not a good title for getting an accurate picture of what the new AMD cards can do. To me, this is a misleading piece. BUT, that's my own personal opinion, and I am still excited to see this experiment.
 
I believe Kyle will cover all the metrics. AMD just needs to give us the PRICE. With those two, we can then decide which combos work best for our needs. :)

Can't wait for the full review...ssss!
 