AdoredTV: NVIDIA’s Performance Improvement Has Dropped from 70% to 30%

If you can show me any evidence that AMD was planning to do adaptive sync BEFORE Gsync was announced, then sure, I would call it innovation. Otherwise, it's responding to your competitor and doing something similar to what they're doing, except in a different way. I don't call that innovation.

Another way to look at it: if Nvidia had never decided to make G-Sync, FreeSync probably wouldn't exist. That should tell you all you need to know about how innovative it is right there. I mean, Christ, the technology behind the concept had been around long enough even prior to G-Sync. For something to be innovative, you have to do something new. I'm not trying to pick on Nvidia or AMD; again, AMD has a few examples of pioneering graphics tech that Nvidia followed after the fact. Both companies have had different innovations over time.
It is easily googled; no time at the moment. AMD and Intel were pushing for a form of adaptive sync as far back as 2009. It is related to VESA standards and can be found by searching for AMD's push for adaptive sync after G-Sync (they reference the 2009 push). All these backdoor-loving Intel/Nvidia fanatics are ill-informed and don't care to KNOW the facts, only to create them.
 
It is easily googled; no time at the moment. AMD and Intel were pushing for a form of adaptive sync as far back as 2009. It is related to VESA standards and can be found by searching for AMD's push for adaptive sync after G-Sync (they reference the 2009 push). All these backdoor-loving Intel/Nvidia fanatics are ill-informed and don't care to KNOW the facts, only to create them.

Are you making baseless claims without documentation again?
 
Are you not reading and comprehending again? Looks like it. I told you what to look for as I don't have the time at the moment.

The double standards :D

I await your documentation that you always demand.
 
It is easily googled; no time at the moment. AMD and Intel were pushing for a form of adaptive sync as far back as 2009. It is related to VESA standards and can be found by searching for AMD's push for adaptive sync after G-Sync (they reference the 2009 push). All these backdoor-loving Intel/Nvidia fanatics are ill-informed and don't care to KNOW the facts, only to create them.


Adaptive Sync is not Variable Adaptive Sync, JustReason...

Nvidia stated they would support adaptive sync if the demand was there, and it wasn't at that time. Variable Adaptive Sync tech has many differences.

This is AMD talking about adaptive sync

"Adaptive-Sync is the underlying standard, but makes no qualitative demands of adopters. As far as the spec is concerned, a 2Hz range is Adaptive-Sync, for example. Obviously that is useless for gaming. We have specific tests for backlight bleed, DRR range, motion blur, backlight flicker, pixel persistence, etc. We want monitors that bear our brand and logo to clear a certain quality threshold for the users that might buy them. To put it bluntly: the rubric is not available because this is a competitive industry and we're not interested in having our rubric needlessly and pettily nitpicked."

FreeSync improvements over Adaptive Sync:

Low Framerate Compensation (LFC)
Certification (loosely regulated, but still there); certain features must be present, and not all Adaptive Sync monitors have all the features needed to be FreeSync

G-Sync improvements:
Certification; certain features must be present, and not all Adaptive Sync monitors have the features needed to qualify
A software solution added to a hardware solution. Programmable. Certification. And many others.

Because of the certification process, many features that are available on both FreeSync and G-Sync are not available on plain Adaptive Sync monitors, most importantly the features gaming needs.
 
The double standards :D

I await your documentation that you always demand.
Seriously! You claim to know everything, but you don't really know anything; you just create your own truth.

http://www.nag.co.za/2014/01/22/lets-discuss-variable-refresh-rates/

Intel and AMD, on the other hand, have included support for changing the VBLANK interval rate already and have been technically capable of this for some time. The VBLANK interval determines how long a monitor must wait before a new frame is ready for delivery. AMD calls their implementation FreeSync, owing to the fact that no customised hardware is required to support variable VBLANK intervals. Altering VBLANK intervals is a feature that monitors conforming to the new VESA and Displayport 1.3 standards will be required to do and happily, AMD says they’ve been able to do this since the HD5000 series.
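To make that VBLANK idea concrete, here's a toy sketch (Python, purely illustrative; the panel limits are made-up numbers and this isn't any vendor's actual driver logic) of how holding or cutting the blanking interval lets the refresh follow the GPU within the panel's range:

```python
# Illustrative only: a toy model of variable VBLANK, not real driver code.
# The display scans out a frame, then sits in the blanking interval until either
# the GPU finishes the next frame or the maximum blanking time (the panel's
# minimum refresh rate) forces a repeat of the old frame.

PANEL_MIN_HZ = 40                     # assumed panel limits for the example
PANEL_MAX_HZ = 144
MIN_FRAME_TIME = 1.0 / PANEL_MAX_HZ   # can't refresh faster than this
MAX_FRAME_TIME = 1.0 / PANEL_MIN_HZ   # must refresh at least this often

def next_refresh_time(gpu_frame_time):
    """Return how long the panel actually waits before the next scan-out."""
    if gpu_frame_time < MIN_FRAME_TIME:
        return MIN_FRAME_TIME          # GPU outruns the panel's max refresh
    if gpu_frame_time > MAX_FRAME_TIME:
        return MAX_FRAME_TIME          # VBLANK can't stretch forever: repeat the old frame
    return gpu_frame_time              # refresh exactly when the new frame is ready

for fps in (200, 90, 60, 30):
    wait = next_refresh_time(1.0 / fps)
    print(f"GPU at {fps:3d} fps -> panel refreshes every {wait*1000:.1f} ms (~{1/wait:.0f} Hz)")
```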

https://www.dvhardware.net/article48223.html

VESA announced Embedded DisplayPort (eDP) version 1.3. This new standard includes a new Panel Self-Refresh (PSR) feature that promises to save power and enhance battery life in portable PC systems. The first portable computers with eDP 1.3 and PSR are expected to arrive in 2012.

...

Industry market trends indicate that eDP is slated to replace LVDS (Low-Voltage Differential Signaling), the previous standard for LCD panel inputs. In December 2010, Intel and AMD announced that by 2013, the two companies will no longer support LVDS in favor of scalable and lower power digital interfaces, such as DisplayPort. Leading PC manufacturers Dell, Lenovo, Samsung and LG Display announced similar plans to phase out legacy technologies in the coming years.

Getting there. This shows that in 2009 an early FORM of Adaptive Sync, PSR, was created.

http://www.vesa.org/wp-content/uploads/2010/12/DisplayPort-DevCon-Presentation-eDP-Dec-2010-v3.pdf

Description of Panel Self-refresh
•  Frame Buffer in TCON can maintain display image without receiving video data from GPU.
•  For a still video image, this allows the GPU to enter a low power state and the eDP main link to turn off.
•  Allowing the GPU to power down between display updates will save significant power and extend battery life.
•  Except when watching a movie or playing a game, there are many times when the video does not change for multiple frames.

As you can see from the 4th point, it wasn't originally intended for gaming, or rather wasn't set up for that implementation yet.
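Here's a rough sketch of the PSR idea from those bullet points (Python, illustrative only; not the real eDP protocol, and the names are made up): the TCON holds the last frame in its own buffer, and the link only wakes up when the image actually changes.

```python
# Toy model of Panel Self-Refresh as described in the VESA slides above.
# Purely illustrative; not the actual eDP protocol.

def drive_display(frames):
    last_frame = None
    link_active = True
    for frame in frames:
        if frame == last_frame:
            if link_active:
                print("image unchanged -> enter PSR, power down eDP link and GPU")
                link_active = False
            # TCON keeps refreshing the panel from its local frame buffer
        else:
            if not link_active:
                print("image changed -> wake link, resume normal scan-out")
                link_active = True
            print(f"send frame {frame!r} over the link")
            last_frame = frame

drive_display(["desktop", "desktop", "desktop", "game frame 1", "game frame 2"])
```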

https://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.

Thus far, after hitting about 200 articles and PDFs, I can't seem to locate the discussion of AMD and Intel pushing for PSR in 2009, although it is implied in the eDP 1.3 links and quotes above. I also can't be sure which, if either, of AMD or Intel is behind the idea. I did come across an article from 2008 by some random guy talking about VRR and power savings and why it would be a boon to gaming with fluctuating frame rates, i.e. 24-48 fps.

At the least I proved you are full of it, that Nvidia DID NOT invent the technology, and that at the very least AMD and Intel had a hand in its infancy.
 
Adaptive Sync is not Variable Adaptive Sync, JustReason...

Nvidia stated they would support adaptive sync if the demand was there, and it wasn't at that time. Variable Adaptive Sync tech has many differences.

This is AMD talking about adaptive sync

"Adaptive-Sync is the underlying standard, but makes no qualitative demands of adopters. As far as the spec is concerned, a 2Hz range is Adaptive-Sync, for example. Obviously that is useless for gaming. We have specific tests for backlight bleed, DRR range, motion blur, backlight flicker, pixel persistence, etc. We want monitors that bear our brand and logo to clear a certain quality threshold for the users that might buy them. To put it bluntly: the rubric is not available because this is a competitive industry and we're not interested in having our rubric needlessly and pettily nitpicked."

FreeSync improvements over Adaptive Sync:

Low Framerate Compensation (LFC)
Certification (loosely regulated, but still there); certain features must be present, and not all Adaptive Sync monitors have all the features needed to be FreeSync

G-Sync improvements:
Certification; certain features must be present, and not all Adaptive Sync monitors have the features needed to qualify
A software solution added to a hardware solution. Programmable. Certification. And many others.

Because of the certification process, many features that are available on both FreeSync and G-Sync are not available on plain Adaptive Sync monitors, most importantly the features gaming needs.
You've really got to stop being the word/term Nazi. My point was about the tech and whether Nvidia is the inventor, as you can see above, not the specific terms and meanings.
 
You've really got to stop being the word/term Nazi. My point was about the tech and whether Nvidia is the inventor, as you can see above, not the specific terms and meanings.


Nvidia was the inventor of adaptive sync technology; it's different. Taking something and adding features to make something new is not the same thing as what it was before. It's like saying Windows was the same thing as Mac OS: Microsoft took the mouse and visual OS schemes from Apple, created their own OS with more features, and made something new.

If you think that is being a word Nazi, you have issues with the US patent office, not me.

Not only that, adaptive sync and G-Sync work COMPLETELY differently.

http://rebrn.com/re/lets-talk-about-v-sync-free-sync-g-sync-adaptive-sync-and-fast-s-2639948/
 
Nvidia was the inventor of adaptive sync technology; it's different. Taking something and adding features to make something new is not the same thing as what it was before. It's like saying Windows was the same thing as Mac OS: Microsoft took the mouse and visual OS schemes from Apple, created their own OS with more features, and made something new.
That's not inventing, it is adapting. This is Nvidia. If they had invented it, they would have a patent on a napkin and be suing the hell out of everybody. And again, they can only make claims to the module inside the monitor as far as their implementation goes. If you go back and look at what PSR does and its implementations, you can see it doesn't take any huge changes to get to the VRR we have today.
 
That's not inventing, it is adapting. This is Nvidia. If they had invented it, they would have a patent on a napkin and be suing the hell out of everybody. And again, they can only make claims to the module inside the monitor as far as their implementation goes. If you go back and look at what PSR does and its implementations, you can see it doesn't take any huge changes to get to the VRR we have today.


It is inventing, because they don't work the same way!

G-Sync doesn't work like FreeSync or adaptive sync AT ALL; the latter two work similarly, though.

https://www.gamersnexus.net/guides/2373-op-ed-unproductive-battle-of-freesync-and-gsync

[image: gsync11-2.jpg]

The extra chip for G sync synchronizes the timings of the frames from the GPU. There is actually communication going on between the GPU and the FPGA to do this.

That is not a feature of adaptive sync nor will it ever be.
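For what it's worth, here's a toy model of that two-way handshake (Python; this is just my reading of the description above and the linked article, definitely not Nvidia's actual protocol): the module actively tells the GPU when the panel can take a frame, rather than the GPU just stretching VBLANK at a dumb scaler.

```python
# Toy model of a GPU <-> module handshake. Purely illustrative; my own guess at
# the shape of the interaction, not Nvidia's real protocol.

import queue, threading, time

panel_ready = threading.Event()
frames = queue.Queue()
NUM_FRAMES = 3

def gsync_module():
    # Module (FPGA) side: owns the panel timing, signals when a frame can be sent,
    # then scans out whatever the GPU delivers.
    for _ in range(NUM_FRAMES):
        panel_ready.set()            # "panel is ready, send me a frame"
        frame = frames.get()         # block until the GPU delivers one
        panel_ready.clear()
        print(f"module: scanning out {frame}")
        time.sleep(0.005)            # pretend scan-out time

def gpu():
    # GPU side: renders at a variable rate, but only hands a frame over once the
    # module has signalled that the panel is ready for it.
    for i in range(NUM_FRAMES):
        time.sleep(0.01)             # pretend (variable) render time
        panel_ready.wait()           # the two-way part: wait for the module's go-ahead
        frames.put(f"frame {i}")

module_thread = threading.Thread(target=gsync_module)
module_thread.start()
gpu()
module_thread.join()
```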
 
It is inventing, because they don't work the same way!

G-Sync doesn't work like FreeSync or adaptive sync AT ALL; the latter two work similarly, though.
Now you're splitting hairs. It is the same thing; just the implementation is different. All solutions match the monitor's refresh rate to the output of the GPU, within their supported range. Simply put... the same thing. How each does it is different, but not that different.
 
Now you're splitting hairs. It is the same thing; just the implementation is different. All solutions match the monitor's refresh rate to the output of the GPU, within their supported range. Simply put... the same thing. How each does it is different, but not that different.


Implementation is the basis of patents, dude; that is an invention.

You can't tell me this is like

Graham Bell vs. Meucci and who invented the telephone first, man.

What AMD did with FreeSync was a cheaper alternative with almost the same feature set. That is it. G-Sync wasn't built on adaptive sync. The method of implementation is different, just as you stated, and that is why it's patented.

So it goes back to who made VRR tech first. nV did, and FreeSync came after; adaptive sync is different and doesn't have the features of VRR tech. So you can't compare it to G-Sync, though you can compare it to FreeSync, because FreeSync is built on top of adaptive sync.

G-Sync was THE FIRST VRR tech, that is it. FreeSync used software and adaptive sync to get most of the features of G-Sync, which made it THE SECOND VRR TECH. Adaptive Sync is not VRR tech; it lacks the features needed to be considered VRR tech.
 
It's the same from the AMD side as well. It might have something to do with both companies focusing on other avenues with their cards, and it also might have to do with the death of Moore's Law. Lack of solid competition might be a factor for Nvidia, but I doubt it is the largest one.


Or perhaps it has something to do with faulty research. I don't have the most powerful GPU available by far, but when I read and realized that the new GTX 1070 cards were not only faster but used far less power than previous GeForce cards, I knew it was a superior product and the bar had been raised again. I think these guys have simply done a poor job quantifying performance and failed to take in all the performance factors involved.

Furthermore, his presentation references comments from others about whether certain cards are a good buy or not, but these are taken out of context. The context I mean is the existence and offerings of the competition. When he's talking about the GeForce cards from the early 2000s, it's done without noting that the best-performing card wasn't a GeForce at all; the Radeon cards were clearly the superior product. Radeon was the best choice for several years in the early 2000s, so of course there were people saying that $500 for a GeForce is a tough sell.

Furthermore, the story makes a big deal right from the start of pointing out that NVidia doesn't make graphics cards and that they really don't even make graphics chipsets; he says all they really do is design graphics chipsets. But I seem to remember that they write software as well: along with designing the chipsets, NVidia writes the reference software, firmware, drivers, and utilities that make these cards function.

Yes, so along with new cards being released for performance benefits, they also get released for feature improvements. Who remembers the first DX9 cards? Yes, people bought the new cards not just because they were faster than the previous cards but because newer cards sometimes supported entirely new graphics capabilities.

Speed is not the only measure of improvement and performance is not just about speed or even efficiency, but image quality as well.
 
And we are going round and round about Adaptive Sync, with people making frankly unbelievable claims again.

I have some questions: explain how FreeSync is an open-source, free adaptive sync product/feature/technology... whatever it is. I want to know why you say it's free of charge.
 
And we are going round and round about Adaptive Sync, with people making frankly unbelievable claims again.

I have some questions: explain how FreeSync is an open-source, free adaptive sync product/feature/technology... whatever it is. I want to know why you say it's free of charge.
Not free in the sense most make it sound. FreeSync is AMD's connection to, or use of, Adaptive Sync, the VESA standard. Being a standard means that anyone who wants to use it, anyone being manufacturers, only needs to meet the standard set forth by VESA. I am not sure that in and of itself is free, as these types of standards at least have some buy-in to even get the certification. Anyway, the free part is mostly in regards to the cost of implementing it for a manufacturer of displays. G-Sync requires the purchase of a board and certification by Nvidia before getting that label. That is usually attributed as a $200 cost above what the same panel may cost without G-Sync. FreeSync/Adaptive Sync generally have negligible costs, keeping relatively equal monitors $200 apart in favor of the FreeSync/Adaptive Sync monitor. Of course, AMD choosing the name FREEsync was initially a stab at Nvidia's implementation and cost to enter, and possibly the closed nature of G-Sync as well. That helped propel the FREE moniker.
 
Not all adaptive sync monitors are FreeSync monitors, JustReason.

That is what you are not understanding, and what I have been trying to explain to you. FreeSync is a superset of adaptive sync. It has additional features that make it a VRR tech. ALL VRR tech is adaptive sync, but not all adaptive sync is VRR.
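One concrete example of those additional features is the Low Framerate Compensation mentioned earlier in the thread. A toy sketch of the idea (Python, illustrative only; the real driver heuristics are more involved and the panel numbers are assumptions): when the game drops below the panel's minimum refresh, each frame just gets shown more than once so the panel stays inside its VRR window.

```python
# Toy illustration of Low Framerate Compensation (LFC). Illustrative only;
# the panel limits are made-up numbers, not any specific monitor.

PANEL_MIN_HZ = 48
PANEL_MAX_HZ = 144

def lfc_refresh(fps):
    """Return (refresh_hz, repeats_per_frame) for a given game frame rate."""
    multiplier = 1
    while fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1                      # double/triple each frame as needed
    return min(fps * multiplier, PANEL_MAX_HZ), multiplier

for fps in (100, 60, 40, 25, 15):
    hz, n = lfc_refresh(fps)
    print(f"{fps:3d} fps -> panel runs at {hz} Hz, each frame shown {n}x")
```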
 
Seriously! You claim to know everything, but you don't really know anything; you just create your own truth.

At the least I proved you are full of it, that Nvidia DID NOT invent the technology, and that at the very least AMD and Intel had a hand in its infancy.

I can see your issue: you don't understand that you are trying to compare two different technologies and claiming they're the same.
 
I can see your issue: you don't understand that you are trying to compare two different technologies and claiming they're the same.
Not man enough to admit defeat? Seriously? You and even Razor1 are splitting hairs just to make your point seem valid. G-Sync and FreeSync are VRR technologies with the exact same conclusion; only the means to the end differ, and even then only slightly. If what you are trying to prove, and that is indeed what you implied, is that NVidia created VRR tech, you are WRONG completely. Again, there is proof galore just by googling; hell, there are articles as far back as the '90s about the idea and how it could work, and, as a few interfaces came into existence, how each would make it possible, though it never really came to fruition until eDP 1.3. PSR is a form of VRR, though more akin to v-sync than adaptive sync. But the manipulation of the VBLANK interval is what made it possible. NVidia was first to market, albeit in the slowest race ever; no one here denies that. However, they didn't invent VRR, just their interface medium. And again, I am unsure who exactly pushed the original idea of PSR; I only know that the only two companies ever mentioned were AMD and Intel.

None of this changes the fact that you and the others creating this false reality are wrong, and unlike you, I linked proof. What do you have? And sorry, Nvidia's board, the G-Sync module, does not count, as it is not inventing VRR, just a means of achieving it.
 
Not man enough to admit defeat? Seriously? You and even Razor1 are splitting hairs just to make your point seem valid. G-Sync and FreeSync are VRR technologies with the exact same conclusion; only the means to the end differ, and even then only slightly. If what you are trying to prove, and that is indeed what you implied, is that NVidia created VRR tech, you are WRONG completely. Again, there is proof galore just by googling; hell, there are articles as far back as the '90s about the idea and how it could work, and, as a few interfaces came into existence, how each would make it possible, though it never really came to fruition until eDP 1.3. PSR is a form of VRR, though more akin to v-sync than adaptive sync. But the manipulation of the VBLANK interval is what made it possible. NVidia was first to market, albeit in the slowest race ever; no one here denies that. However, they didn't invent VRR, just their interface medium. And again, I am unsure who exactly pushed the original idea of PSR; I only know that the only two companies ever mentioned were AMD and Intel.

None of this changes the fact that you and the others creating this false reality are wrong, and unlike you, I linked proof. What do you have? And sorry, Nvidia's board, the G-Sync module, does not count, as it is not inventing VRR, just a means of achieving it.


This is not fucking splitting hairs, man; you just don't understand logic.

Math class: if some A's are B's, and if some C's are B's, are all C's and A's the same?

No, C's are not A's, nor do B's have to be A's either.

Are you serious? This is 8th-grade logic in math.
 
Well, it is accurate to say that the performance increase is far less than it once was. One reason is probably that it is harder to accomplish than before. However, I would imagine another reason is that there is no reason for them to bother if they can make more money per generation while not offering as much as they once did in the past. No biggie, it happens; I am sticking with my Sapphire R9 Fury Nitro for a long while anyway, nothing worth upgrading to for the money.

Edit: Lol on the change in their name area Kyle. Can you change mine to AMD Lover, Always! :)
 
It is interesting that somehow the term adaptive sync started getting used interchangeably with VRR at some point.
It would be a fine term for the technology (as the words are basically correct from an English standpoint), except that, just as razor1 points out, the term is already used to describe something different. And sadly it's tangentially related, just to make sure it's maximally confusing.

So yeah, when razor1 and JustReason emerge sweaty from the hotel room, let's all use the term VRR for Gsync/FreeSync style techs if we can. It just avoids a whole raft of potential confusion.
 
Although I think that is very funny and got a laugh out of it, it doesn't change the fact that Adaptive Sync is not FreeSync, nor is it G-Sync, though G-Sync and FreeSync tech can still do adaptive sync, lol. What JustReason is trying to say is that Adaptive Sync is the first VRR tech. It is not, as explained below; did it give birth to VRR tech, yes, but that doesn't mean it's the same. You can't even categorize them in the same league.

Adaptive Vsync doesn't change the monitor refresh rate to sync with the GPU; both FreeSync and G-Sync do this, and that is what made VRR, VRR.

VRR gives the GPU control over the monitor's refresh rate; with Adaptive Sync that is not possible. The problems that we had with games tearing and whatnot, adaptive sync did not take care of them; it was better than before, but the problems were still there, and it also introduced input lag, which VRR tech still has, but minimized.
I don't understand. Could you try to state that again, but clearly this time?
 
It is inventing, because they don't work the same way!

G-Sync doesn't work like FreeSync or adaptive sync AT ALL; the latter two work similarly, though.

https://www.gamersnexus.net/guides/2373-op-ed-unproductive-battle-of-freesync-and-gsync

[image: gsync11-2.jpg]

The extra chip for G sync synchronizes the timings of the frames from the GPU. There is actually communication going on between the GPU and the FPGA to do this.

That is not a feature of adaptive sync nor will it ever be.

Yep. G-Sync costs more $, but it's a more sophisticated solution. I'll gladly pay extra, especially because I keep my monitors for 5+ years.
 
Gaming technology channel AdoredTV has uploaded two videos detailing the history of NVIDIA's GeForce graphics cards: a graph charting the performance of the company's GPUs per generation suggests that improvement has been significantly reduced, dropping from 70% (pre-2010) to 30% (post-2010). DSOGaming claims that this is all due to AMD failing to compete.

Prior to 2010, and with the exception of the NVIDIA GeForce 9800 GTX, PC gamers could expect a new high-end graphics card to be about 70% faster than its predecessor (on average). Not only that, but in some cases PC gamers got graphics cards that were more than 100% faster than the previous generation's offerings. Two such cards were the NVIDIA GeForce 6800 Ultra and the NVIDIA GeForce 8800 GTX. However, these days the performance gap has been severely reduced.

Or it could be due to economic and technical realities.
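For perspective, here's what those per-generation averages compound to over a few generations (my own quick arithmetic, not from the videos):

```python
# Back-of-the-envelope compounding of those averages (not from the videos):
# how a 70% vs. a 30% per-generation uplift stacks up over several generations.

for per_gen in (0.70, 0.30):
    total = 1.0
    print(f"\n{per_gen:.0%} per generation:")
    for gen in range(1, 5):
        total *= 1 + per_gen
        print(f"  after {gen} generation(s): {total:.2f}x the original card")
```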
 
If each release were 100% faster, I would be buying cards more often. Nvidia's milking is only shooting themselves in the foot with people like me.

Stay one generation behind and buy discounted or used cards.
You'll save a lot of money and get more value.

You can also stick to buying *70 cards, since they have the performance of the last-gen top card at a lower power draw.
 
I think Phasenoise said it the best way.

G Sync and Free Sync are VRR technologies.

Adaptive Sync is not VRR tech.

And the difference between the two types of tech is that G-Sync and FreeSync allow the GPU to change the refresh rate of the monitor to sync with the game's frame rate. That is why, with G-Sync and FreeSync, when it goes above the max refresh rate of the monitor things get wonky: it goes back to tearing and ghosting and all the other issues.
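A toy sketch of that edge-of-the-window behaviour (Python, illustrative only; the panel limit and the "vsync above range" setting are assumptions I picked for the example):

```python
# Toy model of behaviour at the top of the VRR window. Illustrative only.

PANEL_MAX_HZ = 144
VSYNC_ABOVE_RANGE = True   # assumed user setting for the example

def present(fps):
    if fps <= PANEL_MAX_HZ:
        return f"{fps} fps: inside VRR window, panel matches the GPU, no tearing"
    if VSYNC_ABOVE_RANGE:
        return f"{fps} fps: above window, frames wait for scan-out -> capped at {PANEL_MAX_HZ} Hz, added input lag"
    return f"{fps} fps: above window, frames pushed mid-scan-out -> tearing again"

for fps in (90, 144, 200):
    print(present(fps))
```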
I was being sarcastic. But can you explain it again?
 
He is probably a 20-25-year-old kid who doesn't even have a degree in any tech field, or was never exposed to tech as a young kid, and finally, when he got his first forum account somewhere, started getting into tech, listening to all the BS over at that forum, and started up a YouTube account thinking he knows what he is talking about. But as we all know, most forums don't have a lot of in-depth info about tech; the ones that do are few and far between.


He's actually in his late 30s and unemployed. Last year there was a controversy where, in one of his vids, he said that he's living with his pregnant girlfriend, who works while he's at home doing these tech speculation vids. As you can imagine, it didn't go well on Reddit, and he had to delete his account because someone doxed him. I started watching his vids with his GimpWorks vid, and the more I watched, the more I came to the realization that he's actually an AMD fanboy. You can pretty much tell by listening to his voice while he's criticising AMD: his tone is completely different from when he's going at Intel or Nvidia. He sounds like a fanboy rooting for his team to do better, while when it's the competition he goes into full swing berating them like they're the most evil thing out there. Then there's his other stuff, like someone already mentioned: how he bigged up Polaris as a Titan killer in one of his two-part vids and then went back on his word, insulted other tech reviewers, did shady benchmarks (2500K vs FX-8370), insulted Nvidia buyers as "nvidiots", claimed Nvidia gimps older cards with drivers - this guy made a benchmark - , ...
There's lots more of these.
 
Ouch. You'd think that by 30 a person would be more inclined to look into the things they talk about more in depth. And he has the time to do it, too, if he is unemployed.
 
I'm convinced these people own AMD stock and are trying to manipulate public opinion, while simultaneously fishing clicks from rabid fanboys.
It seems too easy.
 
Not free in the sense most make it sound. FreeSync is AMD's connection to, or use of, Adaptive Sync, the VESA standard. Being a standard means that anyone who wants to use it, anyone being manufacturers, only needs to meet the standard set forth by VESA. I am not sure that in and of itself is free, as these types of standards at least have some buy-in to even get the certification. Anyway, the free part is mostly in regards to the cost of implementing it for a manufacturer of displays. G-Sync requires the purchase of a board and certification by Nvidia before getting that label. That is usually attributed as a $200 cost above what the same panel may cost without G-Sync. FreeSync/Adaptive Sync generally have negligible costs, keeping relatively equal monitors $200 apart in favor of the FreeSync/Adaptive Sync monitor. Of course, AMD choosing the name FREEsync was initially a stab at Nvidia's implementation and cost to enter, and possibly the closed nature of G-Sync as well. That helped propel the FREE moniker.


Very good, but it's time to mince some words, and if you need to rephrase something I'll understand. Sometimes for brevity's sake we take shortcuts when explaining something and aren't expecting to have to get down in the weeds on what was said. So feel free to make corrections; I'm not keeping score.

Anyway, the free part is mostly in regards to the cost of implementing it for a manufacturer of displays.
Now here is the tricky part. See, the standard might be free: the monitor end is set and ready for the display device's calls to the code. But that is the other part: the manufacturer, say AMD, has to write their end of the code to make the calls to the display. AMD calls what they developed FreeSync, and it cost AMD to develop it, and it costs buyers when they buy it. There is nothing free about it. If NVidia wanted to make use of adaptive sync in VESA, they could as well, and they would have to write their code and we as buyers would pay for it.

Now I can see that you aren't completely fooled by AMD's marketing on this issue. I bought my 1070 not long after they released because I had my sights set on an Acer X34 monitor, and I figured I might need G-Sync for that display and card combination. As soon as I saw the numbers, in particular the lower power requirements of the 1070 for increased performance over its predecessors, I knew it was a better-engineered product.

At the time I bought my 1070, shortly after release, what would it have cost for me to go with an AMD solution instead, and a FreeSync 34" curved 21:9 display at 3440x1440? Bear in mind my PSU was a 600W model from Seasonic.

I'm going to carry on with what I believed at the time, and what I believe you should come up with as well: that I would have needed two AMD cards, a new power supply, and the monitor. I believe you will find that the total cost for each option is roughly equivalent, but that purchasing NVidia and going with G-Sync at that time provided a more efficient and long-term cost-effective solution with better performance overall. The G-Sync "tax" is easily spent on a new PSU and an additional card in order to reach comparable performance, and the AMD route is no bargain unless you already have a sufficient PSU on hand. Either way, you'll still be paying more every month on your power bill, you'll have a warmer room, or you'll be running your A/C more. For some people this may not matter, but for me, at that time, it did. And anyone looking at the entire picture would face the same situation.

Furthermore, I would argue that AMD simply can't afford to charge much for their adaptive sync solution because frankly they have problems selling what they have and remaining competitive. AMD couldn't sell their cards at a higher price point anyway.
 
haha this guy :facepalm:
He just released an MSI GTX 1080 Ti Gaming X video review, and he just can't help himself defending the AMD badge of honour. Doing a comparison benchmark vs a Vega LC at 4K in a couple of games, he comes to Hitman, where the 1080 Ti has 17 fps more on average, and then goes on to say that he doesn't see any difference in the gameplay because of FreeSync. The next one is Mass Effect Andromeda, and again the Ti has a massive lead in average fps - 94 vs 66. He says gameplay felt similar and FreeSync for the win... I'm speechless at how somebody can be so biased.
 