AMD Radeon FreeSync 2 vs NVIDIA G-Sync @ [H]

FreeSync seems to be the winner. Come on AMD, give us a GPU with Tesla/Volta/Ampere Ti parity and fuck up GPP... I can dream.

One thing I find interesting: as far as I can tell, Vega 56 and 64 can compete with the 1070, 1070 Ti and 1080 cards, but not the 1080 Ti and Titans, so what more do most people really need?

Per the Steam survey, 1080p is king, which both Vega cards can handle nicely, so why is the comment always that AMD simply can't compete?

I get it, 4K is the new hotness, and if you are playing at that resolution, then OK, go get a Titan. But if you're gaming at 1080p, then I say we have options from AMD.
 

Unfortunately I game at 3440x1440 and 4K.

Adding to that: you, I, and most others here also know that the Halo effect is real.
 

I'm very certain the problem isn't that Vega is not competitive. The problem is that Nvidia delivered that performance 1 year before Vega and everyone bought that performance 1 year before Vega was released.
 
He's kind of right, though, in regard to the monitors. While there are FreeSync versions of the G-Sync monitors at a cheaper price, they often lack LFC, have frame-skipping issues at high refresh rates on NV cards, and they always lack ULMB (which is a better feature than either FreeSync or G-Sync, IMO). Then some, like the Samsung panel, advertise HDR support but actually have an awful HDR implementation. QA in the current display market is pretty awful, and FreeSync has no real standards, whereas G-Sync has specific requirements for certification (all panels must have ULMB, a sync range of 30 Hz to max refresh (some FreeSync monitors have ranges as narrow as 48-75 Hz), high contrast ratios, etc.). Barring the normal QA issues, you can go to the store and blindly buy a G-Sync panel knowing that it will be good; not so with a FreeSync panel. Whether it's worth the extra $100-$200 is debatable, and I'd argue that you're better off spending extra on something that you're going to keep for as long as most people keep monitors.

The fact that FreeSync was only equal in votes to G-Sync despite the inclusion of HDR (and on a VA panel compared to a TN, no less) shows that it's not exactly equal and probably deserves a lower price point.
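
Side note on the LFC point, since it keeps coming up: as far as I know, a driver can only do low framerate compensation when the panel's maximum refresh is at least roughly double its minimum, which is exactly why a narrow 48-75 Hz FreeSync range can't have it. A rough sketch of that check, in Python, with made-up example ranges rather than anything from the review:

```python
def lfc_capable(vrr_min_hz, vrr_max_hz, ratio=2.0):
    """LFC works by showing each frame two or more times, so the doubled
    refresh rate has to fit back inside the panel's VRR window."""
    return vrr_max_hz >= ratio * vrr_min_hz

# Hypothetical ranges like the ones mentioned above (illustrative only)
for lo, hi in [(30, 144), (48, 144), (48, 75), (40, 60)]:
    print(f"{lo}-{hi} Hz -> LFC possible: {lfc_capable(lo, hi)}")
# Prints True for 30-144 and 48-144, False for 48-75 and 40-60: below 48 fps
# on the narrow panels you simply fall out of the VRR window (v-sync or tearing).
```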

You do know that HardOCP used a FreeSync 2 monitor? FreeSync 2 is a G-Sync competitor in terms of tighter quality controls. That should mean FreeSync 2 monitors will have to be high-end monitors to get FreeSync 2 certification. From the AMD press release: "Qualifying FreeSync™ 2 monitors will harness low-latency, high-brightness pixels, excellent black levels, and a wide color gamut to display High Dynamic Range (HDR) content. In addition, all FreeSync™ 2 monitors will have support for Low Framerate Compensation (LFC)."
 
the Halo effect is real.

I know, that's pretty much what I disputed in my comment.

I'm very certain the problem isn't that Vega is not competitive. The problem is that Nvidia delivered that performance 1 year before Vega and everyone bought that performance 1 year before Vega was released.

True, but if someone is a fan of AMD, or is just pissed off at Nvidia's bullshit and needs a card at certain performance levels, AMD does have an option now.

In my case, I have a GTX 970 and play at 1080p, so any Vega card is a valid upgrade for me. If I move to 4K, then I'm screwed, because I am not paying for a 1080 Ti or Titan; they are just too much money (even at MSRP).

Except, of course, nobody can buy shit because of the crypto crap.
 
You do know that HardOCP used a FreeSync 2 monitor? FreeSync 2 is a G-Sync competitor in terms of tighter quality controls.

This monitor has an HDR implementation that is so bad it's frequently recommended to keep it disabled. The fact that it, as a VA panel, was panned by half of the users as having worse colors than a TN panel should tell you something about its quality.
 
FreeSync 2 being an improvement and similar to G-Sync doesn't mean shit if you own an Nvidia card and don't even intend to change GPU brand, like many people actually do :meh:
There are tons of people who cheap out on the monitor and buy a FreeSync panel that gives them none of the variable refresh rate goodness this tech is all about.

And this is ridiculous, especially when they live in rich countries and own a 1080 Ti. What is the point of saving 200 bucks? Resale prices of G-Sync monitors are much higher than FreeSync monitors, so the price difference is mostly virtual...

On the other hand, if AMD ends their slumber and releases more competitive products, many people will be more compelled to switch to AMD, which is a good thing! (y)
 
In my case, I have a GTX 970 and play at 1080p, so any Vega card is a valid upgrade for me. If I move to 4K, then I'm screwed, because I am not paying for a 1080 Ti or Titan; they are just too much money (even at MSRP).

I upgraded from a pair of GTX 970s, not bad cards at all, still have one, and I'm on 1440p 165Hz. I could easily use more GPU power without upgrading the monitor, but 27" was actually a downgrade from my ancient 30" panel, so I'm waiting for something like 35" 4K 120Hz with a functional HDR implementation.


I'd really, really like AMD to be more competitive; my issues have been that they simply have not had a product that I'd be interested in recently, and before that, their drivers were trash. And yes, I speak from experience.

You do know that HardOCP used a FreeSync 2 monitor? FreeSync 2 is a G-Sync competitor in terms of tighter quality controls.

G-Sync is missing one feature: HDR, which is coming in monitors soon, though it is still very likely to be poorly implemented by the OS and games/applications [i.e., not particularly worth the effort on its own].
FreeSync 2 is still missing the wide sync range; this is perhaps the most egregious omission from FreeSync 2, because it means that gamers with AMD GPUs must be more cognizant of game settings to keep framerates in the VRR range. LFC helps, but it still only enforces a relative minimum versus G-Sync's absolute minimum.
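
To put rough numbers on that "relative minimum versus absolute minimum" point (my own illustration, not from the article): with LFC the driver repeats frames so the panel never has to refresh below its floor, but the low-framerate experience still depends on where that floor sits. A quick sketch, again in Python with hypothetical panel ranges:

```python
def lfc_refresh(fps, vrr_min, vrr_max):
    """Smallest frame-repeat multiplier that lands the refresh rate inside
    the panel's VRR window; None means the frame rate falls out of VRR."""
    k = 1
    while fps * k <= vrr_max:
        if fps * k >= vrr_min:
            return fps * k, k
        k += 1
    return None

# Two hypothetical panels: a 30-144 Hz floor vs a 48-144 Hz FreeSync range
for fps in (90, 40, 25):
    print(f"{fps} fps -> 30-144: {lfc_refresh(fps, 30, 144)}, "
          f"48-144: {lfc_refresh(fps, 48, 144)}")
# 90 fps maps 1:1 on both; 40 fps is native on the 30 Hz floor but has to be
# doubled to 80 Hz on the 48 Hz floor; 25 fps needs doubling on both.
```

Same feed in every case; the lower the absolute floor (or the wider the range), the less the driver has to juggle multipliers to keep you inside it.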
 
I had a Radeon HD 7950 for a long time and had no issues with it whatsoever.
The only real driver issues I saw with ATI/AMD cards were around the time of the HD 2000/3000 cards, with glitches happening, then very minor issues with the HD 4000, mostly related to things like forcing v-sync, no glitches. Since the HD 5000 I would say there have not been that many issues.
At that time I remember NV cards had blurry-as-hell mountain textures in Crysis and you could not force AF on many textures, overall making the game look 'meh' compared to Radeons. Also, on GeForce 8000/200 series cards I remember being unable to force AA in Unreal 3 engine games and some others with post-processing enabled, while Radeons of the time had no such issues, or at least forcing AA and AF worked more reliably in more games. There was also a major issue with terrible performance on 2-core systems with Fermi-based cards. When you consider how many people were affected by it without knowing it (benchmarks were usually done on 4-core systems...), you might want to reconsider your driver-bashing priorities :nailbiting:


The only issues with AMD cards that I can think of today are much higher power consumption and a year-long game of catch-up with the competition... and due to power consumption it feels more like competition for the 980 Ti and not the 1080...
AMD can correct the gamut of any display to sRGB in hardware without performance loss and never shows any banding when calibrating a display, even in 8-bit mode. The inability of NV cards to do that is, for me, a bigger 'driver issue' than anything I can remotely think of on AMD's side.
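
On the banding point, the trick is dithering: if a calibration/gamut curve is squeezed back into plain 8-bit steps, adjacent shades collapse into visible bands, while dithering the rounding error hides them. A toy illustration of that effect in Python/NumPy (my own sketch, nothing to do with how either vendor's display pipeline is actually implemented):

```python
import numpy as np

rng = np.random.default_rng(0)

# A smooth grey ramp, 0..1 over 4096 "pixels"
grad = np.linspace(0.0, 1.0, 4096)

levels = 63  # pretend only ~6 bits of precision survive the correction step

plain = np.round(grad * levels) / levels                                       # hard quantization
dither = np.round(grad * levels + rng.uniform(-0.5, 0.5, grad.size)) / levels  # dithered quantization

def blur(a, w=33):
    """Crude stand-in for the eye averaging neighbouring pixels."""
    return np.convolve(a, np.ones(w) / w, mode="same")

inner = slice(64, -64)  # ignore convolution edge effects
print("residual banding, plain:   ", np.abs(blur(plain)[inner] - grad[inner]).mean())
print("residual banding, dithered:", np.abs(blur(dither)[inner] - grad[inner]).mean())
# The dithered ramp tracks the intended gradient far more closely once it is
# averaged out, which is why dithering the output of a calibration LUT hides
# the banding that plain 8-bit rounding would show.
```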
 
I've historically bought more AMD cards than Nvidia, but have no particular allegiance to any hardware manufacturer other than the one making the best product that meets my requirements. My last two cards were Nvidia cards, with two G-Sync monitors.

I give Nvidia credit for inventing variable refresh rate monitors, promoting them, and creating a full infrastructure out of the gate with G-Sync. Unfortunately the first generation of FreeSync monitors had issues with low frame rates, but with a VESA/DisplayPort (and now HDMI) standard for variable refresh rates, and improvements with FreeSync 2, I do have to question why Nvidia won't just support the standards. Support G-Sync, by all means, but also support the standards, Nvidia, please.
 
Nvidia won't support any open standards unless they absolutely have no choice.

Every tool, format, or tech they release is for one purpose only: to keep you locked to them.

And they have taken extreme actions that have ended up affecting their own customers.

And yes, my last two GPUs have been Nvidia, but it won't happen again if I can avoid them.
 
It’s kicking ass, next to my Kyro 2 card.

So, I guess you are OK with observing the severe anti-consumer actions on Nvidia's part, and instead of helping take them down a couple of notches (by not giving them your money), you prefer to make fun of someone who is willing to do what is right for us, the consumers?

Oh well, sheep don't fight back on their way to the slaughter either.

You are guessing wrong and I have said so in the GPP consumer choice thread. I don't like what NVidia is pushing with the GPP. I have also stated that while I don't see anything wrong with someone boycotting NVidia, I think boycotting GPP partners is unfair as they really don't have a choice. An individual does, but a company's leadership who has to meet board member and shareholder needs does not.

Still, that doesn't make your comment correct, nor does it make the reverse unlikely. If AMD were in a commanding situation I wouldn't put it past them to do the same.

And as I said, reality means a big "So what" when it comes to choices. You can talk open standards and all that when it comes to FreeSync and G-Sync, but you have only two choices. It's like "I don't want to play NVidia's bullshit and be locked into a proprietary standard ..... so I'll stay with Freesync which is an open standard and that way I can choose between AMD and ..... AMD."

Like I said, so what?

Oh and the "slaughter", I bought my EVGA 1070 SC a year and a half ago for the purpose of driving a 34" Ultrawide at 3440x1440 with adaptive sync. Tell me please, how many options did I have for that set up? If you don't remember, it was none. No AMD solution was capable, no Freesync monitor had the same capability at that time as the Acer Predator X34 and ASUS PG348Q. There was no comparable setup in existence. And if you want to upgrade a video card to something newer from that point, are you going to buy another NVidia card and go AMD with $1,000+ sunk into one of these monitors?

Everyone's situation is different; it depends on many factors, including timing. Calling anyone foolish (not that you called anyone foolish) for making the choice that they find is best for their needs isn't really right, is it?

Bhaaaaahhhhh ! Bleeeeett. I spent a lot of money, I got a lot for my money, and it was what I believed I wanted at that time. I'm not going to change the way I make these decisions because they are largely based on what I believe I will get for my money. And it is my money after all so ......
 
You're hurtin ma feelins now

So around 6/28/2016, when I ordered my EVGA 1070 SC card for the purpose of driving a 3440x1440 100Hz Adaptive Sync display, there was a comparable Freesync monitor available at that time, and a single AMD card that could drive that monitor at these settings without requiring Crossfire and a larger PSU than what a 1070 would require?
OK.
 
You are guessing wrong and I have said so in the GPP consumer choice thread...

Soo much to reply to, but to be honest, I prefer to use my time on a more productive endeavor.
 
Soo much to reply to, but to be honest, I prefer to use my time on a more productive endeavor.
He is in love with hearing himself, over and over and over. Did you not get that his opinion is fact the first 10 times you read it? ;)
 
Ain't heard no rebuttals, just dodges.
If I wanted to argue with someone that just wants the opportunity to hear themselves rattle on more saying the same thing over and over, I would go talk to my mother-in-law.
 
Nvidia won't support any open standards unless they absolutely have no choice.

Every tool, format, or tech they release is for one purpose only: to keep you locked to them.

*sigh* I'm totally at risk of sounding like a fanboy today, given what I have posted today in a couple of threads.

I have to say you are being very unfair to Nvidia here. Can you list all the open standards that they don't support?

With CUDA, they developed the API because there were no viable computing APIs available in the market. CUDA appeared on the market at least 1-2 years before any open computing APIs appeared.
When G-Sync hit the market, there was nothing similar in the market.
They supported OpenGL better than AMD.
They support OpenCL, albeit AMD's support is better. Oh, and guess who created OpenCL? Everyone's favorite monopoly, Apple.

Now, just to say something about "open source" and "open standards".
Regardless of whether anything is open, the most important thing is whether the manufacturer supports it. DirectX is the most closed-source API on the market, yet it's the best-supported API available.
AMD created a whole bunch of open-source standards, and so many of them are deprecated today.
On Linux, AMD has been far more open and has contributed far more to the drivers. Nvidia is famously closed and hardly opens anything to the open source community. Yet Nvidia's performance on Linux is superior to AMD's.
 
I wasn't looking for an argument; I asked for your confirmation: was I right or wrong in Post #102?

Did you miss the two question marks, Kyle?

That was me, asking what you thought.
I do not care if you are right or wrong.
 
*sigh* I'm totally at risk of sounding like a fanboy today, given what I have posted today in a couple of threads.

I have to say you are being very unfair to Nvidia here. Can you list all the open standards that they don't support?

Off topic, but you really didn't refute what he said: "Nvidia won't support any open standards unless they absolutely have no choice. Every tool, format, or tech they release is for one purpose only: to keep you locked to them." You cited examples where Nvidia used open standards, but he did say Nvidia will only support open standards if they have to. If Nvidia chooses not to support VRR in HDMI 2.1, that further supports the opinion. I think Nvidia knows they made the mistake of thinking they could make VRR a high-end-only feature. In reality, VRR lowers the GPU performance needed to get a smooth experience, as this FreeSync 2 vs G-Sync test somewhat proves. It probably scares Nvidia that in 2-3 years' time an APU could deliver a smooth 4K experience when paired with a 4K VRR monitor, so if I were Nvidia I would definitely lock up the high end by any means possible.
 

I asked which open standards Nvidia does not support. Arguably FreeSync, but that's not something every other GPU maker supports either. I'll check with Matrox.

So unless there's a list of open standards which Nvidia doesn't support, I don't see evidence that Nvidia won't support open standards unless they absolutely have no choice.
 
Sadly, you are ignoring my comment and the post from the gentleman above.

Let's point out some things that should be obvious from what I said.

OpenGL support: forced, since everyone, including Microsoft, used and supported it.

CUDA: they never bothered making it an open standard, since it locks you to them.

OpenCL: forced. Even though Apple created it, they donated it to Khronos and it is properly supported by everyone except Nvidia; their implementation is weaker, perhaps to make CUDA look better.

Mantle, which is now Vulkan: forced, since it is pushed to replace OpenGL. Even then, they bitched and moaned and dragged their feet on it. By the way, AMD offered free access to Mantle and they ignored it.

Then we have GameWorks and HairWorks, which, interestingly, ran like shit on AMD cards.

Lastly, they bought PhysX, locked it behind their cards, and would disable it on their own cards if an AMD card was detected in the same system, which is a big FU to a customer who already paid for a GPU. Granted, my understanding is they stopped that, but I think it was because other options like Havok made PhysX largely irrelevant.

There are more, but hopefully you will understand what I originally said.
 

I completely understand what you said. What I'm saying is that I don't think what you are saying is very well substantiated.

OpenGL, Nvidia's drivers were the gold standard for the longest time. Try using ATI drivers with OpenGL if you want to know grief. And I'm saying this in the nicest possible way. Was this forced? How do you prove it was forced? Did Nvidia resist or reject the use of OpenGL? Also, OpenGL is still AMD's weakness today. But I am not about to turn around and say AMD was forced into supporting OpenGL. No, AMD's poor OpenGL is due to lack of resources. They have to divide their resources between DirectX and OpenGL, and I think it's clear where their resources should go: to the majority market share that is DirectX.

CUDA is a fair point because they have not been interested in opening it up and I haven't seen any attempts to. However, CUDA came into a market where nothing for the compute scene existed. Today, CUDA is a major API for the compute scene and that is due to Nvidia's support and execution. They put their money where their mouth was. However, it does create a conflict of interest with OpenCL, in my opinion.

OpenCL, forced? They were part of the consortium in the first place. However, as much as OpenGL is AMD's weakness, OpenCL is a glaring weakness for Nvidia. I don't see anything forced, but rather a clear-cut case of Nvidia prioritizing CUDA and neglecting OpenCL. Their OpenCL 2.x drivers are perpetually in beta.

Mantle, from where I work, I don't see Mantle the same as you do. Was Mantle popular? Did Nvidia kill it? My attitude to Mantle is the same as DX12: developers were not looking to code close to the metal. APIs exist to abstract the metal from the software. That was the whole original goal of DirectX in the first place, to abstract the hardware and allow you to focus on the software without thinking about the hardware. DX12, along with Mantle, went in the opposite direction. I can't say Mantle was at all interesting; the studio I worked for had zero interest in it.

GameWorks is an interesting library. It is reviled because it is optimized for Nvidia hardware. However, they did this by using DirectX and not through any special extensions to it. You bring up HairWorks and I agree that HairWorks shows AMD in a very bad light. Was that due to HairWorks running poorly on AMD, or is it due to AMD's inferior tessellation engine? As many people throughout this forum have continually pointed out, AMD can often perform better than Nvidia with GameWorks. It all depends on how the DX libraries were used in the GameWorks library and which favor AMD or Nvidia. Scattergun blaming does nothing good for AMD, as all we do is excuse them for their failures. Now, like it or not, the GameWorks library is popular. It's a huge timesaver and time is money.

PhysX, I think this is the first genuinely fair anti-competitive point. My studio doesn't use PhysX so forgive me if I was ignorant of this one. I never paid much attention to PhysX because of the need for an Ageia card and when it was bought by Nvidia, it still was never interesting to me.

So, to address your point that Nvidia does not support open standards unless forced: for OpenCL and OpenGL, I can't see any evidence of that. Nvidia is a very strong supporter of the Khronos Group and I believe it's an Nvidia representative leading the group at the moment. Claiming they won't support a standard unless forced is too much of a stretch.

As for tools locking you into an ecosystem, isn't that the whole point? Tools are all about lock-in. Does a Phillips screwdriver turn a slotted screw? DirectX was a massive toolbox to keep everyone locked into Windows. Nvidia definitely succeeded with a lot of their tools, but that doesn't mean AMD never tried. Mantle was definitely an attempt, albeit one that failed, so nobody can judge it any further. 3DNow! was another AMD instruction set, deprecated today. TruForm?

Now that said, I am not interested in further defending Nvidia or AMD. As I said before, I have antipathy for both companies due to the grief they give me in my daily job. You have the right to reply but I think we will have to agree to disagree beyond that.
 
In reality, VRR lowers the GPU performance needed to get a smooth experience, as this FreeSync 2 vs G-Sync test somewhat proves. It probably scares Nvidia that in 2-3 years' time an APU could deliver a smooth 4K experience when paired with a 4K VRR monitor

Good luck with that. Regardless of VRR tech, 40 fps is still 40 fps.
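
True that 40 fps is still 40 fps, but the smoothness argument is about delivery, not frame count: on a fixed 60 Hz v-synced display a 25 ms frame has to wait for the next refresh tick, so frames land at alternating ~16.7/33.3 ms intervals, while a VRR panel refreshes when the frame is ready. A small sketch of that difference (Python, idealized timings, my own illustration):

```python
import math

def present_times(frame_ms, n, refresh_hz=None):
    """On-screen time of each frame: immediately when rendered (VRR),
    or at the next fixed refresh tick (classic v-sync)."""
    tick = 1000.0 / refresh_hz if refresh_hz else None
    times, t = [], 0.0
    for _ in range(n):
        t += frame_ms                                   # frame finishes rendering
        shown = t if tick is None else math.ceil(t / tick - 1e-9) * tick
        times.append(shown)
    return times

def paced(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

# A steady 40 fps game (25 ms per frame)
print("VRR        :", paced(present_times(25.0, 8)))        # [25.0, 25.0, ...]
print("60 Hz vsync:", paced(present_times(25.0, 8, 60)))    # alternates ~16.7 and ~33.3
```

Same 40 fps either way; one is an even 25 ms cadence, the other a 16.7/33.3 ms judder (or tearing with v-sync off), which is presumably the difference people feel in this kind of blind test.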
 
OpenGL, Nvidia's drivers were the gold standard for the longest time.

That one was forced, since the gold standard in those days was Glide, which was proprietary to 3dfx. Nvidia was small and didn't have an option but to support whatever was Glide's enemy.

OpenCL, forced? They were part of the consortium in the first place.

Forced. Apple is a big customer and contributed the standard; Nvidia had to join if it wanted to keep Apple's business and not look like the monopolistic bastards they always dream of being.

Mantle, from where I work, I don't see Mantle the same as you do. Was Mantle popular? Did Nvidia kill it?

Mantle wasn't popular, and given that only AMD supported it at the time, Nvidia didn't have a reason to support it, hence not forced. Interestingly enough, Mantle was donated to Khronos and became Vulkan. Guess who is dragging their feet on providing proper support? Yep, Nvidia.

As for tools locking you into an ecosystem, isn't that the whole point? Tools are all about lock-in.

Perhaps, but then, as a customer, it is up to you whether you want to become a victim of lock-in, and trust me, many companies do not like that. Ironically, Apple has been actively refusing to include any Nvidia hardware for a while now, and it seems to be for that reason: avoiding lock-in.

Now that said, I am not interested in further defending Nvidia or AMD.

No problem, I just wanted to provide the info that you kind of requested in your answer.

Have a nice day.
 
Mantle was ultimately replaced by DX12, which was the whole intent of AMD releasing Mantle in the first place, since Microsoft wasn't going to make close-to-metal support available on PC even though it had been available on Xbox consoles for years. Eventually Microsoft caved and released DX12, albeit locked behind Windows 10, so Mantle became redundant and they moved on to Vulkan.
 
Yes, AMD donated Mantle, or parts of it (not sure), to the Khronos Group and it became Vulkan.
 

Microsoft had been working on DX12 for quite some time. They didn't just magically spin it out as a competitor to Mantle.
 
So the takeaway from this is what? AMD users made smarter choices because they have access to better monitors? Or Nvidia users should regret having faster hardware because they don't win the Pepsi challenge?

I wish I hadn't settled on a $500 1080. If I were making a purchase today, I'd buy a Vega 64 at $700-800 to get access to one of these sweet HDR monitors.
 
Good video.

No matter what some people say, I love this kind of testing methodology alongside the classical "number crunching" / "oh, this one is 5 FPS more so it is superior".

I know this video has basically just been done, and I imagine the effort of organizing such a blind test run is rather large. But if the author should think of doing something like it again in the far (or not so far) future, maybe I could suggest the following:

- Take one NVIDIA and one AMD employee / affiliate / fan, or simply a trusted, unbiased system builder (someone known and trusted to assemble efficient gaming rigs) and give each a virtual budget of $1000 (basically people who will not waste $500 on RAM, if it makes no difference at all for a blind gaming test, only to be left with $500 for CPU, motherboard and GPU combined; who gets to do this could also be a community vote).

- Let these guys build a gaming rig with that $1000 that they think will offer the best gaming experience on modern games (then no one can blame the author for choosing a TN panel over OLED, etc.; the two parties are completely responsible, which makes who is eligible for this job a very important choice though...).

- Make sure the configuration of the systems is supervised by the author, but every possible tweak a party suggests is considered and integrated. Supervision should make sure that things like HPET, OS choice, driver choice, etc. are set for the best possible performance. It doesn't necessarily have to be streamlined, especially if one of the parties can prove to the author/community that a certain setup runs way faster if you tweak a certain otherwise unpopular setting.

- Perform a blind test across a range of 10-20 modern triple-A titles covering all representative APIs; if a game offers multiple APIs, the parties are allowed to decide which API will be enabled on their rig for that game. For example, if Nvidia runs Battlefield 1 better on DX11 in the system builder's opinion, he may use that instead of DX12, or the AMD builder can use Vulkan in DOOM; it is open to them, since it is open to the user in the end as well.

- For the blind testers there should be a streamlined set of categories they should talk about, so that there is something like consistent microstutter, normal stutter/intermittent freezes, tearing, graphical glitches, and input lag. Of course they are free to address all the pros and cons they experienced, but categories that cannot be attributed 100% to the hardware choice (like a confirmed gameplay bug, possible "unlucky dead pixels" on monitors, etc.) should be filtered out of the results, since they tend to distort the picture.

This, alongside the classical reviews you already deliver, would offer great buying help for people who want the best for a certain amount, and not maybe pay $200 too much because they have heard that "company xyz" makes slow products... while they could have the same performance for $200 less...
 
Hehe, yeah, I guess if I had a couple months to work on one video, there is a whole lot of stuff there that would be awesome. :)
 