Question: how was FreeSync/G-Sync configured?

FreeSync was set up with AMD instruction. G-Sync was set up with instruction from NVIDIA.
G-Sync being locked to hardware that costs about $200 does suck. There is no getting around it. Earlier tests indicated that it was the superior solution between the two, and it might be. Those earlier reviews indicated that this was primarily due to G-Sync being better at lower refresh rates and frame rates. This past weekend, I wasn't able to discern any advantage between the G-Sync and FreeSync monitors specifically. Unfortunately, while FreeSync is an open standard, only AMD supports it because we only have two GPU manufacturers to speak of. This locks you into one brand or the other unless you want to replace your monitor every time you change video cards. Most people don't do that.
As for pricing, NVIDIA may or may not need to do anything. We don't know what Vega's pricing will be, and the card's success will depend on it.
Isn't Intel going to start supporting FreeSync?
Isn't Intel going to start supporting FreeSync?

Intel was said to start supporting Adaptive-Sync, but that has yet to come to fruition.
They are supposed to with their next-gen IGP, but how useful that will be is the question, since it's an IGP.
Eh, I'd say the opposite: the worse the GPU, the better VRR is.
Would you rather have a 120 FPS card with a "good" picture, or a 100 FPS card with a "great" picture? FPS is only a small part of the whole; many seem to treat it as the be-all and end-all of stats.
Deeply sorry to bring this up again, but were you using any kind of Vsync/Fast Sync/Enhanced Sync/Adaptive Sync/frame limiters with either AMD or NVIDIA?

As per NVIDIA and AMD instruction, vsync on.
This argument only makes sense if you plan to play one game for the entirety of your ownership of that card. Even then, there wasn't unanimous delineation on whether one *-sync was better than the other. I saw the end result as more evidence that both solutions are viable alternatives that improve the experience.

I was just making a blanket statement about FPS not always being what it seems. Image quality is king for me, and anything over a certain FPS/settings threshold is without merit. "All settings being equal" (like that ever happens), I will take better image quality over higher FPS any time, as long as it won't affect gameplay.
Only one thing is certain. With all this marketing smoke and mirrors the actual reviews are going to be hilarious.
Yea but at 150-200fps in Doom I think most people won't notice tearing

That's not how it works.
Yeah, it's going to be interesting.
I'm expecting them to be burned at the stake honestly, after all the frustration especially.
Yeah, it's going to be interesting.
I'm expecting them to be burned at the stake honestly, after all the frustration especially.

The Vega FE cards weren't even sampled to the press. It was unprecedented. It will be very interesting to see how AMD handles the media with this launch. They tend to be *very* controlling about who gets to review their cards.
This is important info, Kyle: the 1080Ti is constrained with vsync on. With ultra settings at 3440x1440 it can do 150fps+, but with vsync on it only does 100fps, which is on par with Vega and a 1080. Essentially the 1080Ti is tested with the brakes on.

100Hz panel with G-Sync. That is how NVIDIA wanted it done. I understand your issue. And to be quite frank, you guys are making mountains out of molehills.
Just FYI, there is increased input delay with vsync on when framerates go above the monitor's max refresh rate, for both G-Sync and FreeSync; that's why most people cap framerates below the max refresh rate.

I wasn't aware of this, thanks. I just capped my FPS in MSI Afterburner.
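For what it's worth, the capping practice described above is simple to express; here is a minimal sketch, assuming the common community rule of thumb of staying a few fps under the panel's maximum refresh (the 3 fps margin is an assumption, not an official AMD or NVIDIA figure):

```python
# Hedged sketch: cap the frame rate slightly below the panel's max refresh so
# VRR (G-Sync/FreeSync) stays engaged and vsync back-pressure latency is avoided.
# The 3 fps margin is a community rule of thumb, not an official number.
def suggested_fps_cap(max_refresh_hz: int, margin: int = 3) -> int:
    return max_refresh_hz - margin

print(suggested_fps_cap(100))  # 100 Hz panel -> cap at 97 fps
print(suggested_fps_cap(144))  # 144 Hz panel -> cap at 141 fps
```

You would then enter the resulting value into whatever limiter you use (MSI Afterburner, an in-game limiter, etc.).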
The impression I got was that AMD and Vega are better at handling the Vulkan API than NVIDIA is. FreeSync and G-Sync are so close that you wouldn't really be able to tell them apart when the frame rates you are getting are sufficient. The difference between the two systems was minimal, but clear: the Vega system felt snappier, but you couldn't say "hey, it gets xx frames more than the NVIDIA system."
Yes he did. I was shocked to learn that Vega was in System 2.
Well you would have to also factor in other variables, like the cost of the actual panel being used. VA panels are typically cheaper than IPS to begin with.
Since the monitors use different panels, it would make sense to leave the monitor price out of the comparison.
Yeah, but AMD tries too hard to put the monitor into the comparison along with their GPUs!!
Why is something like that being done now, with the imminent RX Vega GPUs? I'm wondering: during the Fury X's period, weren't there any FreeSync monitors on the market for AMD to combine with their GPUs?
What has changed now, besides the obvious? (a.k.a. AMD's GPUs cannot compete directly with the competition by themselves anymore, so new marketing tricks have to be invented.)
It's not that I'm surprised at a $500 difference between the monitors instead of the usual $200.
What I am surprised at is that AMD claims their setup is cheaper by $300 while the monitors themselves are cheaper by $500; there is $200 missing somewhere.
I am guessing that it all went into the GPU, and that could change the argument quite dramatically; it may give us a lower and upper bound on possible Vega pricing ($700 to $899, depending on which GTX 1080 price you use).
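Sketching out that back-of-envelope arithmetic (all figures are the thread's assumptions, not official prices):

```python
# Back-of-envelope from the numbers in this thread (assumed, not official pricing).
monitor_savings = 500        # FreeSync monitor cheaper than the G-Sync one
claimed_setup_savings = 300  # AMD's claimed whole-setup advantage

# The $200 gap must be hiding in the GPU price if AMD's claim holds.
implied_gpu_premium = monitor_savings - claimed_setup_savings  # $200 over a GTX 1080

for gtx1080_price in (500, 699):  # rough low/high GTX 1080 street prices
    print(f"GTX 1080 at ${gtx1080_price} -> implied Vega at ${gtx1080_price + implied_gpu_premium}")
```

With a $500 GTX 1080 that implies roughly $700 for Vega; with a $699 GTX 1080, roughly $899, which is where the bounds above come from.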
Tell you what, I'll buy RX Vega and eat all my words if it wins a blind test against a 1080Ti on the same panel type (monitor) with G-Sync/FreeSync and vsync disabled. Kyle's work is pure entertainment and I like that; it reminds me of the new Chevy Silverado commercials, when the truck is in camo and the guy rips the front cover off and it's a Chevy beneath. It doesn't mean the product is better than the competitor's; it's purely perception.
Disabling vsync would be a bad idea. Introducing screen tearing would alter people's perceptions of the systems.
No, I believe the whole FreeSync thing was just getting off the ground with the Fury X at the time. As far as marketing goes, it's pretty important for all the companies involved, including AMD, that FreeSync shows no difference to G-Sync. And quite frankly, as a consumer it should be just as important to everyone to get rid of the G-Sync price premium when FreeSync has shown to be just as good.
I'd love to buy a new monitor that has these features, but I refuse to be locked into a specific brand or pay a premium for a product that is unnecessary, be it NVIDIA or AMD.
They most likely went by MSRP, since the FreeSync monitor shows a normal price of $800 on Amazon and Newegg.