Big Navi is coming

Why does Nvidia always tend to make up the bulk of AMD threads? Do people really want AMD to turn into Nvidia, or just to compete hands down on a playing field Nvidia almost exclusively laid down?
If that's the case, AMD will never win, but maybe that's what some people want, even outside of Nvidia itself, which I am sure would like to be the Google of computer graphics.

I don't need RT; I just want a good deal on a card for my 1080p / 144Hz monitor, and the 5700 XT seems like one. As I said, I could even talk myself into an even larger model, though I think timing will prevent that this time around.

I feel a little like people are saying RT is the be-all and end-all of computer graphics, and to me that's like saying wind turbines are the only thing in renewable energy, which is of course a silly thing to say.

Oh, and then I want someone to make a bloody game I can play on my fine new computer / GFX card... but I somehow doubt that, so the rest of my gaming career will probably be one letdown after another, like it has been for the past 10 years.

No reason for you to buy anything but AMD, since RT doesn't matter to you.

However in this world of ours many think and want differently. Luckily we have choices.
It's not about who "wins". It's about who can give people what they want.

Right now Nvidia can offer more. Simple as that.
 
Why does Nvidia always tend to make up the bulk of AMD threads?

<snip>

Because there are two behemoths in the GPU race and you can't talk about one without a comparison to the other. It's obvious you don't want RTX. That's cool. I don't want to pay for it either, but I'm also not looking to upgrade my current card anytime soon. If I were, it looks like the RX 5700 XT is a mighty fine card and would find its way into my system.

What people in this thread are indicating is something that I've also stated: if AMD and nVidia cards perform the same, are in the same price bracket, and use the same power, but one has RTX and one doesn't, who is going to turn down a 'free' feature? AMD needs to consider DXR support simply for long-term viability. If they don't, then they need to be cheaper than the competition, as AMD won't have support for a DX12 feature that their competitor has.

We don't want AMD to become nVidia. That would be the worst possible outcome short of AMD ceasing to exist.
 
"Nvidia is bringing support." (Do those cards magically now do hardware raytracing?)

Yes, they do, albeit not magically, but via shaders. Shaders are hardware; there is no emulation going on. From what you say, I take it you mean that RTX hardware is unnecessary to run DXR code, and on that you would be correct, because AMD could also run DXR on shaders if they wanted to. That, however, is a very different statement from claiming that there's any kind of "emulation" going on.
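
For what it's worth, the game-facing API doesn't even distinguish the two: DXR is exposed through the normal D3D12 feature check, and whether the rays end up on dedicated RT cores or on the shader cores is the driver's business. A rough sketch of that query (standard D3D12, device creation omitted; just my illustration, not any engine's actual code):

[code]
// Minimal sketch: ask a D3D12 device whether it reports DXR support.
// A Pascal card running DXR on its shaders and a Turing card with RT
// cores both report a raytracing tier here; the API doesn't say how
// the rays are actually executed underneath.
#include <d3d12.h>

bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier != D3D12_RAYTRACING_TIER_NOT_SUPPORTED;
}
[/code]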

Why does Nvidia always tend to make up the bulk of AMD threads? Do people really want AMD to turn into Nvidia? *snip* I don't need RT; I just want a good deal on a card for my 1080p / 144Hz monitor, and the 5700 XT seems like one. As I said, I could even talk myself into an even larger model, though I think timing will prevent that this time around. I feel a little like people are saying RT is the be-all and end-all of computer graphics, and to me that's like saying wind turbines are the only thing in renewable energy, which is of course a silly thing to say.

A few things to point out:

1) You're right, and I apologize - I honestly thought I was in an Nvidia subforum thread when I responded to the RTX stuff; that's how much the point of the thread had changed. I got curious about the "emulation" comment (which is nonsense). Now, to return to Navi:

2) I certainly don't want AMD to transform into Nvidia; in some ways they already have, like market segmentation and pricing: the RX 5700 cards were meant to be the RX 600 series, but AMD saw they could get away with marketing them at $100 more and successfully did so. That's a very Nvidia approach to the market. I'm not saying this is wrong; they're a company and are meant to make money. It does suck for consumers, however, when neither player is offering products across the whole stack, and both have the benefit of seeing a void in an area and upping the price just because they can. That said, I trust exactly zero of the things that Nvidia or AMD tell me, so I do my research.

3) As I said earlier, I'm also mainly looking for a good 1080p card deal. I'm on a 1440p 32" monitor and I mostly game at 2560x1080 75Hz because I like the ultrawide aspect ratio (but need the extra vertical space when I'm working), so essentially I'm a 1080p+ gamer. Sadly, there have been no new $200-250 cards in the past 2 years. I expected there to be a "small" Navi already, say an RX 5500, for me to upgrade to, but nothing's out yet, only the higher-tier RX 5700. Sure, Nvidia released the 1660, but that's a garbage upgrade for a 1060 owner; it's designed for 960 owners. Let's keep in mind the 1060 is now a 3-year-old card - so I'm stuck with no upgrade from either Nvidia or AMD for now. That's why I'm much more interested in a $200 Navi than a $350 one - I don't need 1440p/4K-lite performance when I'm gaming at 1080p ultrawide.

4) Make no mistake: raytracing is a big deal. This is not PhysX garbage. It's not Nvidia WhateverWorks bullcrap. It's a lighting implementation that actually does what real physical light does. The implications for graphical realism are tremendous, and it's feasible now (if not done very well yet), but give it 2-5 more years and you'll be blown away. The change is equivalent to the monumental improvements pixel shaders brought to graphics nearly 20 years ago - if you were around gaming at that time, you know exactly what that change felt like in terms of image quality; and if you were around 30 years ago, you'll remember the change from 8-bit to 16-bit games - earth-shattering. Raytracing is exactly that, but like all those generational improvements, it needs a few years to really hit its stride. Nvidia chose to strike first with a very gen1 release, a premium product for those who enjoy the bleeding edge, but not super practical yet. AMD chose to skip to gen2, when both they and Nvidia will have quite decent DXR acceleration. 2020 is going to be a good year to buy a GPU, with good development of new DXR tech and both companies competing on it.
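
To make the "does what real light does" point concrete, here's a deliberately tiny sketch (plain CPU-side C++ of my own, not any engine's or DXR's actual code) of the core idea behind a traced shadow: from the point being shaded you fire a ray at the light and ask the scene whether anything blocks it, instead of approximating the answer with a shadow map.

[code]
// Toy example of a traced hard shadow against a single sphere occluder.
// Real engines use acceleration structures and many rays per pixel; this
// only shows the principle: visibility is answered by intersecting geometry.
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Does the segment from 'point' to 'light' hit a sphere (center c, radius r)?
bool InShadow(Vec3 point, Vec3 light, Vec3 c, float r)
{
    Vec3 d = sub(light, point);          // ray direction (unnormalized)
    Vec3 oc = sub(point, c);             // ray origin relative to the sphere
    float A = dot(d, d);
    float B = 2.0f * dot(oc, d);
    float C = dot(oc, oc) - r * r;
    float disc = B * B - 4.0f * A * C;   // quadratic discriminant
    if (disc < 0.0f) return false;       // the ray misses the sphere entirely
    float t = (-B - std::sqrt(disc)) / (2.0f * A);
    return t > 0.0f && t < 1.0f;         // hit lies between the point and the light
}
[/code]

Rasterizers have to fake that visibility answer with shadow maps, screen-space tricks and baked lighting, which is exactly the "fake" lighting being talked about here.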
 
Because there are two behemoths in the GPU race and you can't talk about one without a comparison to the other. It's obvious you don't want RTX. That's cool. I don't want to pay for it either, but I'm also not looking to upgrade my current card anytime soon. If I were, it looks like the RX 5700 XT is a mighty fine card and would find its way into my system.

What people in this thread are indicating is something that I've also stated: if AMD and nVidia cards perform the same, are in the same price bracket, and use the same power, but one has RTX and one doesn't, who is going to turn down a 'free' feature? AMD needs to consider DXR support simply for long-term viability. If they don't, then they need to be cheaper than the competition, as AMD won't have support for a DX12 feature that their competitor has.

We don't want AMD to become nVidia. That would be the worst possible outcome short of AMD ceasing to exist.

I agree with that, but if there is a price difference right now for the low/mid range, I'm taking cheaper. The low/mid cards right now *most likely* won't ever be able to do DXR reasonably well. If some nice AAA games come out and can run it well on lower-spec hardware, then I would consider the price difference vs. benefit. It's not that nobody wants DXR, it's just that we don't want a half-assed version of it released on hardware that can't take advantage of it. I hope this changes, but for this generation, I doubt it. I don't care if AMD releases DXR if it's crap; I would just as easily buy previous gen and save some money over useless features. It's not that I care whether it's Nvidia or AMD, I care about the results. You take people saying RTX isn't good as "it's an Nvidia feature, so they must be AMD fanboys." That may be true for some, but the reality is it's a cool concept that isn't fully fleshed out yet. If AMD doesn't implement it well and Nvidia gets a better handle on it, then there IS a reason to charge/pay more for a better experience.
 
I agree with that, but if there is a price difference right now for the low/mid range, I'm taking cheaper. The low/mid cards right now *most likely* won't ever be able to do DXR reasonably well. If some nice AAA games come out and can run it well on lower-spec hardware, then I would consider the price difference vs. benefit. It's not that nobody wants DXR, it's just that we don't want a half-assed version of it released on hardware that can't take advantage of it. I hope this changes, but for this generation, I doubt it. I don't care if AMD releases DXR if it's crap; I would just as easily buy previous gen and save some money over useless features. It's not that I care whether it's Nvidia or AMD, I care about the results. You take people saying RTX isn't good as "it's an Nvidia feature, so they must be AMD fanboys." That may be true for some, but the reality is it's a cool concept that isn't fully fleshed out yet. If AMD doesn't implement it well and Nvidia gets a better handle on it, then there IS a reason to charge/pay more for a better experience.

This is pretty much why I think AMD made the right choice with the 5700/5700XT. Although, if they come out with a flagship "big navi" with 2080ti rasterized perf at $800-900, it's a much harder choice for me to go with a card without RT, since I'd like to keep it for a few years and it's at a performance level that can handle it.

I am also pretty excited for Cyberpunk 2077 which will have RT.
 
Not sure why anyone gives a shit about RT.

No card on the market today including the 2080ti can do RT justice. Your only option, even with the 2080ti, is to turn it on at the expense of hitting a high-end monitor's refresh rate.

In Battlefield, even with all the updates, the fact is the 2080ti drops at 1440p from 125 fps to 60 fps. Tomb Raider is a different DXR implementation, and again at 1440p it goes from 131 down to 80 fps. That isn't a feature anyone should care about right now, much less base a purchasing decision on.

Cyberpunk? Yeah, it will probably have some really cool ray tracing IQ... but the same problem will be there. Is anyone really OK with going from 100+ fps to 60? The current RTX generation is not really capable of DXR either; it can tech-demo it.

As for the other things Nvidia was going to do with their RTX stuff... DLSS. It's a bad joke that requires developer support and more often than not makes your game look like a bad oil painting with cel-shading highlights when things move too fast. AMD trumped them with RIS... it does everything Nvidia promised with DLSS, with zero developer input and a 1% performance cost. It's a mid-range feature on a mid-range card... that just works.
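
If it helps, the reason RIS can be that cheap is that it's just a small post-process pass over the finished frame. Here's a generic 3x3 sharpen over a grayscale image to show the kind of filter we're talking about - my own toy illustration, not AMD's actual CAS algorithm (RIS uses a contrast-adaptive variant of the same idea):

[code]
// Generic unsharp-mask style 3x3 sharpen over a grayscale image, purely to
// illustrate what a cheap post-process sharpening pass looks like. This is
// NOT RIS/CAS itself; AMD's filter adapts the strength to local contrast.
#include <vector>
#include <algorithm>

std::vector<float> Sharpen(const std::vector<float>& img, int w, int h, float amount)
{
    std::vector<float> out(img);
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            float center = img[y * w + x];
            float neighbors = img[(y - 1) * w + x] + img[(y + 1) * w + x] +
                              img[y * w + x - 1] + img[y * w + x + 1];
            // Push the center pixel away from its local average to boost edges.
            float sharpened = center + amount * (4.0f * center - neighbors);
            out[y * w + x] = std::clamp(sharpened, 0.0f, 1.0f);
        }
    }
    return out;
}
[/code]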

It's nice to have competition. I hope big navi brings something interesting to the $600+ video card market. But in the mid range, AMD's 5700 stomps the 2070/2060, because price does matter... and if NV fans are going to lean on DXR, the numbers using DXR on the 2070 and 2060 are horrid. It is not a usable feature on those cards and never will be. DXR and DLSS are both complete crap today. Perhaps both will improve... and I expect DXR may look a lot better around the Ampere launch, and NV will likely just stop talking about DLSS and let it die.
 
This is pretty much why I think AMD made the right choice with the 5700/5700XT. Although, if they come out with a flagship "big navi" with 2080ti rasterized perf at $800-900, it's a much harder choice for me to go with a card without RT, since I'd like to keep it for a few years and it's at a performance level that can handle it.

I am also pretty excited for Cyberpunk 2077 which will have RT.

Yes, I just want to make sure people understand I'm not bashing DXR in general, just that we don't have a viable ecosystem at the moment. I agree that at the higher end (2080ti-type speeds) it is almost viable, and future games, after developers figure out how to better work with it, have a great chance of being playable on that hardware. I just have doubts about a 2060 (Super) ever getting to the point of being viable, which is what the 5700 competes with. Good on Nvidia for trying to push the technology forward and being the first to implement a feature. But just because you are the first to do something doesn't make it the best implementation. Maybe future games will have better LOD options for raytracing and better implementations, allowing you to see some benefit without too drastic a hit to performance and allowing lower-level hardware to use it. I hope this is the case, but I'm a skeptical mind :).
 
Yes, I just want to make sure people understand I'm not bashing DXR in general, just that we don't have a viable ecosystem at the moment. I agree that at the higher end (2080ti-type speeds) it is almost viable, and future games, after developers figure out how to better work with it, have a great chance of being playable on that hardware. I just have doubts about a 2060 (Super) ever getting to the point of being viable, which is what the 5700 competes with. Good on Nvidia for trying to push the technology forward and being the first to implement a feature. But just because you are the first to do something doesn't make it the best implementation. Maybe future games will have better LOD options for raytracing and better implementations, allowing you to see some benefit without too drastic a hit to performance and allowing lower-level hardware to use it. I hope this is the case, but I'm a skeptical mind :).

Raytracing is computationally expensive as BEEEEEEEEEEEEP.
I thought it would be ~10 years before we would start seeing it in games, until Microsoft created the ECOSYSTEM for it (DXR).
The ecosystem IS here... NOW!


You must have forgotten how features like textures, T&L, AA and all the other stuff started?!

And there is a "limit" to how low you can go with raytracing before the FAKING via rasterization has better I.Q. and the whole point of raytracing is lost.

People these days...*sigh*
 
Not sure why anyone gives a shit about RT.

No card on the market today including the 2080ti can do RT justice. Your only option, even with the 2080ti, is to turn it on at the expense of hitting a high-end monitor's refresh rate.

In Battlefield, even with all the updates, the fact is the 2080ti drops at 1440p from 125 fps to 60 fps. Tomb Raider is a different DXR implementation, and again at 1440p it goes from 131 down to 80 fps. That isn't a feature anyone should care about right now, much less base a purchasing decision on.

Cyberpunk? Yeah, it will probably have some really cool ray tracing IQ... but the same problem will be there. Is anyone really OK with going from 100+ fps to 60? The current RTX generation is not really capable of DXR either; it can tech-demo it.

As for the other things Nvidia was going to do with their RTX stuff... DLSS. It's a bad joke that requires developer support and more often than not makes your game look like a bad oil painting with cel-shading highlights when things move too fast. AMD trumped them with RIS... it does everything Nvidia promised with DLSS, with zero developer input and a 1% performance cost. It's a mid-range feature on a mid-range card... that just works.

It's nice to have competition. I hope big navi brings something interesting to the $600+ video card market. But in the mid range, AMD's 5700 stomps the 2070/2060, because price does matter... and if NV fans are going to lean on DXR, the numbers using DXR on the 2070 and 2060 are horrid. It is not a usable feature on those cards and never will be. DXR and DLSS are both complete crap today. Perhaps both will improve... and I expect DXR may look a lot better around the Ampere launch, and NV will likely just stop talking about DLSS and let it die.


DLSS is a major disappointment for me, given that you can get much better results with AMD RIS, plus you can use it on Nvidia hardware too, even Pascal. There have been numerous tests comparing DLSS vs. upscaling, and more recently vs. RIS, and RIS comes out on top in both IQ and performance. DLSS looks almost as good (or bad?) as simple upscaling.

I keep wondering why they didn't teach the AI how to do AA, instead of "adding pixel detail" or something like that to an upscaled render.
 
Heh, "viable" these days means 4K @ 144Hz. Just a few years ago 1080p 60fps was the bee's knees; now it's unacceptable. That's progress for you.

The first RT generation will be properly defined by next gen consoles but it’s clear that AMD can’t sit on the sidelines much longer. Would love to see their take on DXR and get some competition going.
 
Not sure why anyone gives a shit about RT.

No card on the market today including the 2080ti can do RT justice.

Ugh... well, this has been pointed out again, and again, and again, and... but here's once more. The reason why people give a shit is the main flaw in your argument:

"No card on the market today including the 2080ti can do RT justice."

Give it 2-4 years and you'll see how relevant this becomes. To me, it's a gigantic step forward, like 16-bit was over 8-bit, like 3D was over 2D, like shaders were over, well, not having shaders. DXR is bringing us from fake-as-shit lighting to actual, physically accurate lighting. If you can't see the benefit in that, well, it'll probably become self-evident to you in 4 years.
 
Raytracing is computationally expensive as BEEEEEEEEEEEEP.
I thought it would be ~10 years before we would start seeing it in games, until Microsoft created the ECOSYSTEM for it (DXR).
The ecosystem IS here... NOW!


You must have forgotten how features like textures, T&L, AA and all the other stuff started?!

And there is a "limit" to how low you can go with raytracing before the FAKING via rasterization has better I.Q. and the whole point of raytracing is lost.

People these days...*sigh*
What are you ranting about? Slow down, process, then respond. The ecosystem is getting started; it's not here yet. I can't go play more than a handful of games, and those with so-so support. The building blocks are in place, and within the next two to three refreshes it will probably be viable to some extent. I don't call a single $1200 card that can kind of run stuff at low resolutions a viable ecosystem. I never said it won't improve, nor did I mean to imply that there is nothing at all, just that it's not there yet (but much closer than it has ever been).
 
Heh, "viable" these days means 4K @ 144Hz. Just a few years ago 1080p 60fps was the bee's knees; now it's unacceptable. That's progress for you.

The first RT generation will be properly defined by next gen consoles but it’s clear that AMD can’t sit on the sidelines much longer. Would love to see their take on DXR and get some competition going.
No, viable to me means I can buy a mid-range card and play a game at >40 fps (depending on game type; some are OK down to 30, others crap until over 50)... not a $1200 card running 60fps @ 1080p with minimal gain in IQ (in most current games; I can't speak to the future, nor can you). Where did your standards go? Up to the stratosphere with this ridiculous pricing structure? I am down here on the ground waiting for it to become a reality, and it seems we're at/near that tipping point.
 
Heh, "viable" these days means 4K @ 144Hz. Just a few years ago 1080p 60fps was the bee's knees; now it's unacceptable. That's progress for you.

The first RT generation will be properly defined by next gen consoles but it’s clear that AMD can’t sit on the sidelines much longer. Would love to see their take on DXR and get some competition going.

Agreed, we need some competition in that space for it to really move forward. Intel hopefully will have a solution of their own as well. I sort of expect AMD is going to go a 100% shader route... NV may well build another monolithic chip like Turing, just shrunk. Intel might be the dark horse... they may well release higher-end Xe cards with a 3D-stacked RT core/tensor/matrix math chiplet.

And on 1080p 60fps being enough: you make a good point... but yeah, high-end monitors these days are 1440p and 4K. I don't expect a mid-range card today to push 4K ultra settings at 100+ fps... but if I spend $1200 on a GPU, yeah, it better fucking be able to push the best monitors on the market to their specs.

With a 5700 XT you can happily game at 1440p ultra/high settings on a high-refresh high-end monitor... and push mid-range monitors to their refresh spec nicely. You can even game at 4K using RIS and 1800p upscaling + sharpening.

So yeah, if I spend mid-range money and I have a 4K monitor, I can expect to use upscaling and have a great experience. If I spend double that, I expect to drive the monitor at its native spec.

No one is spending $1200 on a 2080ti to play at 1080p. If they are, I would strongly suggest going 5700 XT plus a new monitor for the same spend.

I have said before, kudos to NV for ninja'ing AMD and the Sony/MS console RT stuff next year by doing the math on their matrix co-processors, which had zero real game use. But the real hybrid RT stuff is still down the road. NV is still king today... perhaps big navi can beat them at pure raster, perhaps not. NV holds the crown today and they may for a while. The current 5700 Navi is a winner in the market it's in... guess we'll see if AMD has something high-end worthy of long back-and-forths when they actually get something out the door.
 
What are you ranting about? Slow down, process, then respond. The ecosystem is getting started; it's not here yet. I can't go play more than a handful of games, and those with so-so support. The building blocks are in place, and within the next two to three refreshes it will probably be viable to some extent. I don't call a single $1200 card that can kind of run stuff at low resolutions a viable ecosystem. I never said it won't improve, nor did I mean to imply that there is nothing at all, just that it's not there yet (but much closer than it has ever been).

I run BFV at 3440x1440 (about 2.4x the pixels of 1080p) at ultra / DXR low and 80 fps average. That's fine for me. And Metro does even better; many reviews used the Metro built-in benchmark, which is way harsher than the actual game. Also, DXR low is vastly better than off. People have to stop exaggerating the impact.
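
(For reference, the pixel math works out to roughly 2.4x:)

\[ \frac{3440 \times 1440}{1920 \times 1080} = \frac{4\,953\,600}{2\,073\,600} \approx 2.39 \]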

I don't know where people get this idea that RT can't have any visual impact either; I see it mentioned in this thread. If you think about what is actually happening, it's quite amazing.

That being said, I completely agree we need a $500 card that can do 1440p above 60 fps, and if you go into the 2080ti owners thread, many people take higher fps over RT right now. Granted, BFV was a terrible title to show off RT with... a twitch shooter...

I still stand by my initial thoughts: if big Navi can't do DXR in any capacity, I am going to skip it. My Radeon VII (HTPC) is fine for now. I'll probably wait for Cyberpunk 2077 to actually launch and reassess then.
 
No, viable to me means I can buy a mid-range card and play a game at >40 fps (depending on game type; some are OK down to 30, others crap until over 50)... not a $1200 card running 60fps @ 1080p with minimal gain in IQ (in most current games; I can't speak to the future, nor can you). Where did your standards go? Up to the stratosphere with this ridiculous pricing structure? I am down here on the ground waiting for it to become a reality, and it seems we're at/near that tipping point.

That’s fair as you’re free to set a price of entry that works for you. Obviously enough people are happy to buy in at $1200 otherwise we wouldn’t be here.

This is no different from the days when people paid $1000+ for SLI and CrossFire setups so they could run triple monitors or the latest games at max settings. Just because you couldn't do those things on a $200 midrange card at the time doesn't mean they weren't real and viable options for consumers.

Options are good. Bottom line is AMD isn’t even giving people the option and trying to spin that as a positive thing is a losing argument every time. We need RT to take off as fast as possible as it’s the first real improvement in rendering tech in over a decade.
 
I run BFV at 3440x1440 (about 2.4x the pixels of 1080p) at ultra / DXR low and 80 fps average. That's fine for me. And Metro does even better; many reviews used the Metro built-in benchmark, which is way harsher than the actual game. Also, DXR low is vastly better than off. People have to stop exaggerating the impact.

I don't know where people get this idea that RT can't have any visual impact either; I see it mentioned in this thread. If you think about what is actually happening, it's quite amazing.

That being said, I completely agree we need a $500 card that can do 1440p above 60 fps, and if you go into the 2080ti owners thread, many people take higher fps over RT right now. Granted, BFV was a terrible title to show off RT with... a twitch shooter...

I still stand by my initial thoughts: if big Navi can't do DXR in any capacity, I am going to skip it. My Radeon VII (HTPC) is fine for now. I'll probably wait for Cyberpunk 2077 to actually launch and reassess then.
Fair enough, but BFV isn't a great title to compare with (from what I've heard; I don't have it, just read reviews), as the experience with it on is sometimes OK, sometimes distracting. DXR low in one game makes it look better in some scenes... from the screenshots (again, I don't have one, just read through a bunch of benchmarks with side-by-sides) I just don't see that much improvement in general, but I'm not a hi-fi type of person (probably because my target is low end, so my bar is so low anything looks good!). I'm not saying you should run out and buy Navi if it doesn't have DXR, nor am I going to run out and buy a 2080ti just because it has it. I was simply explaining my position; not everyone has to share it, but I wanted to say that it comes from logical reasoning and actual constraints, not just "I love a company and everyone else is stoopid." Until it is more usable and more AAA games come out, I put very little thought into it. Once it's more mainstream, I'm sure I'll pick up a midrange card that supports it just to play.
I know exactly what's going on; I've written more than one raytracing engine in software ;). That was full tracing, not rendering geometry and then tracing from those pixels (texels). That doesn't make it better or worse, just different targets (and that was like... 10+ years ago, when programmable GPUs were just coming out and I was getting pre-release hardware from both ATI and Nvidia at the time for development). I have a great appreciation for what it's doing, but appreciating something and being able to use it are not equal.
 
That’s fair as you’re free to set a price of entry that works for you. Obviously enough people are happy to buy in at $1200 otherwise we wouldn’t be here.

This is no different from the days when people paid $1000+ for SLI and CrossFire setups so they could run triple monitors or the latest games at max settings. Just because you couldn't do those things on a $200 midrange card at the time doesn't mean they weren't real and viable options for consumers.

Options are good. Bottom line is AMD isn’t even giving people the option and trying to spin that as a positive thing is a losing argument every time. We need RT to take off as fast as possible as it’s the first real improvement in rendering tech in over a decade.
Yes, enough people are willing to pay that much; however, the mainstream market is still in the low/mid range. What's viable for me isn't necessarily the same for everyone; some can afford more, some can't afford as much. If it's a few % of people overall that can afford it, and even then a very few % of games, and of those only a few done decently, it's just not there yet for me. If it had come out and improved ALL games currently in existence, awesome feature. When it supports a handful of games on expensive hardware, cool, some can afford it and even fewer utilize it (most use the 2080ti without it, just for the raw framerate); I don't consider that a viable solution. Again, I'm not trying to knock the technology, because it's pretty awesome to see how close it is to becoming mainstream. Maybe viable isn't the right term; maybe I should be saying mainstream instead. I really do hope some awesome games come out that implement it nicely, but in the mid range I don't think the current gen is going to be usable, which (more than likely) makes AMD's choice not to use it in their mid range a good one. They know they will have to implement it at some point, since it will become mainstream at some point; I just hope they don't wait too long, but at this point I would easily give up DXR for faster performance in my price range.
 
Ugh... well, this has been pointed out again, and again, and again, and... but here's once more. The reason why people give a shit is the main flaw in your argument:

"No card on the market today including the 2080ti can do RT justice."

Give it 2-4 years and you'll see how relevant this becomes. To me, it's a gigantic step forward, like 16-bit was over 8-bit, like 3D was over 2D, like shaders were over, well, not having shaders. DXR is bringing us from fake-as-shit lighting to actual, physically accurate lighting. If you can't see the benefit in that, well, it'll probably become self-evident to you in 4 years.

You missed the premise of his post.

All gamers will be getting ray tracing in their games in the near future. It has been known for 5-6 years that it was coming to the market in 2020. This is nothing new (it has all been a collaboration across the industry), but 2020 isn't here yet, and developers and game engines just are not ready yet. (See: the year 2020.)

Chad was saying that no hardware today (even the 2080ti) exists that can do raytracing without a performance hit.

Currently, Turing's architecture takes a massive performance hit, so in the games where DXR is "on", there has been massive coding work (by Nvidia) to only use parts of ray tracing, or only so many rays, so as not to choke the gaming experience. Nvidia is trying (spending millions) to apply their ray tracing to the areas of the scene you'd most notice visually, etc.

It is not true environmental ray tracing. That is coming with the Xbox Scarlett and the PlayStation 5, on RDNA2, and with Nvidia's Ampere.




So I agree with you as I think does everyone here.

Ray tracing gives a natural feel to certain games and really makes for some immersion. The general populace isn't going to pay a premium for a "taste" of it; they will want the full monty.
 
Yep, turning on settings in a game causes a performance hit. Who would have thought? I guess we should hope that game devs never add new features; that way we never have to upgrade. What an awesome idea.
Troll much? No kidding there are performance hits with options, some bigger than others. The question always has been and always will be: is the performance hit worth the IQ? That's honestly up to the individual; some people don't mind slide shows, others are bothered if they can't hold 144 fps... with the majority somewhere in the middle (and what's acceptable depends on the game as well). The overall point is, I would not waste extra money in the 2060 range of cards for DXR, since it would bring my fps too low to be useful. To each his own, though. We all know AMD is bringing DXR support, but not in the mid-range, and I don't blame them one bit. Just as I don't think Nvidia testing the waters and getting a head start on development is bad, just that it's premature for most people.
 
AMD did not "win" financially, but they were able to lead with a feature that is now becoming standard, so I still would take that as a win.

AMD mostly lost- they failed to enforce standards at the outset, and now Nvidia sets the standard for what a 'good Freesync monitor' is.

And for any enthusiast paying attention- that's a good thing, because Nvidia bestowing their 'G-Sync Compatible' certification means that the manufacturer didn't completely butcher the Freesync implementation.

Otherwise, how would you know? It's not like AMD bothered.
 
They just wanted to milk G-Sync as long as possible. Granted, I wish FreeSync defined minimum standards better, but G-Sync requiring licenses and custom Nvidia hardware was a crappy solution.

I wish Nvidia was more aggressive with G-Sync- they have the superior solution, especially when you get to the edge of performance (which is what Nvidia defines, because well, we're still waiting for 'Big Navi'), and the G-Sync module is very small in cost when you get to monitors that are themselves 'at the edge'.


To be clear: there are still newer, better monitors that are being released with G-Sync first. The feature is far from dead, and likely won't die until there is a consistent performance level from Freesync that can objectively match it. To be further clear: that doesn't exist yet.

They won by not having to compete against a proprietary feature. If they went with a proprietary freesync, they would have had to compete with Nvidia for monitor makers to adopt.

...and then lost because they refused to refine their standard.

Which I assume had quite a bit to do with the 'uptake' of Freesync, because building a Freesync monitor that can compete with the performance of a G-Sync monitor ain't cheap ;).
 
I wish Nvidia was more aggressive with G-Sync- they have the superior solution, especially when you get to the edge of performance (which is what Nvidia defines, because well, we're still waiting for 'Big Navi'), and the G-Sync module is very small in cost when you get to monitors that are themselves 'at the edge'.


To be clear: there are still newer, better monitors that are being released with G-Sync first. The feature is far from dead, and likely won't die until there is a consistent performance level from Freesync that can objectively match it. To be further clear: that doesn't exist yet.



...and then lost because they refused to refine their standard.

Which I assume had quite a bit to do with the 'uptake' of Freesync, because building a Freesync monitor that can compete with the performance of a G-Sync monitor ain't cheap ;).

https://www.hardocp.com/article/2018/03/30/amd_radeon_freesync_2_vs_nvidia_gsync

If we are talking about average-Joe PC newbie gamers, OK, FreeSync has a lot of variance, which is perhaps AMD's fault. Of course, for those people you say "buy a FreeSync 2 monitor" and that isn't an issue. Seeing as that brings the pricing more in line with G-Sync, it's a wash.

For people on an extreme budget, at least there are some decent low-cost FreeSync options... for those that know how to buy budget gear.

As Kyle pointed out not that long ago FreeSync vs Gsync is about even... and Freesync 2 supports HDR.
 
As Kyle pointed out not that long ago FreeSync vs Gsync is about even

It was 'close enough' on the displays used, and that was with dissimilar panels. As pointed out at the time and apparently still required to be repeated: it wasn't a good Freesync vs. G-Sync test. Nice to have? Absolutely. Appreciate Kyle taking the time. But not definitive for the parts used, let alone for the technologies themselves.

and Freesync 2 supports HDR.

You got a hard 'lol' out of me for that one. Yes, G-Sync supports HDR- on Freesync 2 monitors. Please do not blindly repeat AMD mis-marketing.
 
You got a hard 'lol' out of me for that one. Yes, G-Sync supports HDR- on Freesync 2 monitors. Please do not blindly repeat AMD mis-marketing.

Fair... but it's not really relevant. The bottom line is experienced gamers can't tell the difference. I hear people say G-Sync is superior, blah blah; not my experience. High-end monitors with either FreeSync 2 or G-Sync are damn near identical. Or at the very least, any difference has nothing to do with the refresh tech being used.

Yes, we all know AMD lets people slap "freesync" on just about anything... and the range and quality of some of the cheapest $100 FreeSync monitors are crap. Still, there are some very, very good 1080p FreeSync monitors around for new PC gamers and/or folks with tight budgets. As with most stuff, you have to do a bit of research; just seeing "freesync" isn't enough. IMO I prefer having a few stinkers on the market over having zero budget options on offer. And if that new or lazy gamer with money to burn just wants a solid, guaranteed-good experience, going FreeSync 2 is on par with going G-Sync in terms of quality.
 
Fair... but it's not really relevant. The bottom line is experienced gamers can't tell the difference. I hear people say G-Sync is superior, blah blah; not my experience. High-end monitors with either FreeSync 2 or G-Sync are damn near identical. Or at the very least, any difference has nothing to do with the refresh tech being used.

Yes, we all know AMD lets people slap "freesync" on just about anything... and the range and quality of some of the cheapest $100 FreeSync monitors are crap. Still, there are some very, very good 1080p FreeSync monitors around for new PC gamers and/or folks with tight budgets. As with most stuff, you have to do a bit of research; just seeing "freesync" isn't enough. IMO I prefer having a few stinkers on the market over having zero budget options on offer. And if that new or lazy gamer with money to burn just wants a solid, guaranteed-good experience, going FreeSync 2 is on par with going G-Sync in terms of quality.

I'll meet you at 'close enough' ;). I'll also unequivocally agree that any VRR is better than no VRR. I'm not going back to no VRR either!

Personally, I'm good with Freesync 2 as a technology if that's what's available. I'll also pay up for G-Sync when available unless the Freesync monitor is clearly and provably an equal, because the cost in that range just isn't that much different and stuff just working right is worth it.

And the main reason I'm fine with G-Sync: AMD hasn't bothered competing with Nvidia's top-end consistently since they bought ATI. I used ATI mostly before that, exclusively with the release of the 9700 Pro, and well, that company / division has simply not returned to where it was pre-acquisition. Given the release of Navi, I'm not expecting them to return with Big Navi either.

As always, absolutely happy (ecstatic in this case) to be proven wrong- I'm not dogging on AMD, this is just what they've shown us.
 
G-Sync may be less of a gamble than FreeSync, but FreeSync can absolutely perform well on the right monitor.

I have owned two G-Sync monitors, two FreeSync monitors, and one FreeSync TV. My latest is a Nixeus 240Hz FreeSync, and it has better properties than any other I've tried; even 30 fps on this thing is playable.

However, I also had a FreeSync 4K monitor and TV and, while they did work and were enjoyable, the range was too limited. Meaning it worked from 48Hz to 60Hz, but with the window that narrow it was easier to lower settings to stay at 60 rather than walk that line and dip under.
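
That narrow window is also exactly the case frame repetition (LFC-style) can't rescue: with a 48-60Hz range there's no whole-number multiple of, say, 40 fps that lands back inside the range, so the panel just drops out of VRR. A little sketch of the arithmetic (my own illustration, not AMD's or Nvidia's actual low-framerate logic):

[code]
// Can a frame rate below the panel's minimum VRR rate be repeated
// (doubled, tripled, ...) back into the supported range? Illustration
// only; not any vendor's actual low-framerate-compensation code.
#include <cstdio>

bool CanRepeatIntoRange(double fps, double minHz, double maxHz, int* factorOut)
{
    for (int n = 1; n <= 10; ++n) {
        double repeated = fps * n;
        if (repeated >= minHz && repeated <= maxHz) { *factorOut = n; return true; }
    }
    return false;
}

int main()
{
    int n = 0;
    // 48-60Hz panel: 40 fps has no multiple inside the range (80, 120, ... all > 60).
    std::printf("40 fps on a 48-60Hz panel: %s\n",
                CanRepeatIntoRange(40.0, 48.0, 60.0, &n) ? "ok" : "falls out of VRR");
    // 48-144Hz panel: 40 fps can simply be shown twice per refresh (80Hz).
    if (CanRepeatIntoRange(40.0, 48.0, 144.0, &n))
        std::printf("40 fps on a 48-144Hz panel: repeat x%d\n", n);
    return 0;
}
[/code]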
 
I wish Nvidia was more aggressive with G-Sync- they have the superior solution, especially when you get to the edge of performance (which is what Nvidia defines, because well, we're still waiting for 'Big Navi'), and the G-Sync module is very small in cost when you get to monitors that are themselves 'at the edge'.


To be clear: there are still newer, better monitors that are being released with G-Sync first. The feature is far from dead, and likely won't die until there is a consistent performance level from Freesync that can objectively match it. To be further clear: that doesn't exist yet.



...and then lost because they refused to refine their standard.

Which I assume had quite a bit to do with the 'uptake' of Freesync, because building a Freesync monitor that can compete with the performance of a G-Sync monitor ain't cheap ;).
Like I said, I wish they had better defined their standard, but I understand their reasoning. They did it because it allowed current monitors and chips to simply get a firmware update and have support, versus manufacturing new parts. This meant adoption would be much simpler and higher. When competing against someone with more resources, you need to get manufacturers on board. If they had gone for high standards and proprietary AMD hardware, I believe most would have just made G-Sync monitors. AMD stuck with the public protocol defined by VESA (Adaptive-Sync) rather than creating yet another fragmented segment. I think there either needs to be a better-defined standard or a simple way to define low/mid/high end, i.e. standards categories.
It's not them defining the standard; it's the ones that define the HDMI specs... maybe that's the disconnect here. G-Sync made a new standard vs. supporting an existing one. AMD supported existing standards and has worked with them to help make them better (which is why FreeSync 2 had some better definitions, but could still use work). AMD works with other standards while Nvidia tries to work against open solutions. If Nvidia had helped VESA better define the standard, they couldn't have charged extra for their product, made money off the manufacturers who have to use their proprietary chips, or locked people into their ecosystem.
The technology is great, and as I mentioned, the standards they hold help ensure everyone gets a good experience, but it slows down the industry as a whole and tries to strong-arm the competition (which is their right as a company; it doesn't mean I as a customer have to buy into it).
 
https://www.hardocp.com/article/2018/03/30/amd_radeon_freesync_2_vs_nvidia_gsync

If we are talking about average-Joe PC newbie gamers, OK, FreeSync has a lot of variance, which is perhaps AMD's fault. Of course, for those people you say "buy a FreeSync 2 monitor" and that isn't an issue. Seeing as that brings the pricing more in line with G-Sync, it's a wash.

For people on an extreme budget, at least there are some decent low-cost FreeSync options... for those that know how to buy budget gear.

As Kyle pointed out not that long ago FreeSync vs Gsync is about even... and Freesync 2 supports HDR.
I think this is where most are getting things wrong, or maybe it's not clear. It's not AMD's standard; it's their implementation of the VESA Adaptive-Sync standard first released in HDMI 1.2a. They have since helped better define things in FreeSync 2 (the next version of adaptive sync). Again, AMD didn't make the standard; VESA made it, AMD implemented it. They *could* have told monitor manufacturers that they couldn't put the FreeSync label on a monitor if it didn't meet a specific low-end spec, but they wanted adoption (for better or for worse). I think them helping move the industry forward is good, and not being locked into a specific brand is good too. I can buy a nice high-end FreeSync monitor and switch between AMD and Nvidia at will. You can't do it the other way around.
 
It's not them defining the standard; it's the ones that define the HDMI specs...

HDMI VRR was an AMD invention that was ported from DisplayPort- DisplayPort Adaptive-Sync was a protocol that existed before either company's solution. HDMI VRR works because HDMI has moved to the same signaling type that DP uses, as initially HDMI was just DVI with a smaller plug. Now it's basically DisplayPort but with more DRM and licensing required.

In developing these into commercial solutions for desktop gaming, as the tech was designed first as a mobile power-saving feature, Nvidia created G-Sync to address the legion of shortcomings right out of the gate, and did so successfully. Like, every problem that could be imagined with a bare 'freesync' implementation, they fixed.

And when Nvidia released G-Sync, AMD showed up with a hacked-together demo of said bare solution, and then hundreds of demonstrably shitty Freesync implementations hit the market with zero specifications from manufacturers to identify all of the corners cut.

I don't blame Nvidia for taking the initiative to solve the problem right out of the gate. I blame AMD for failing to compete. Even now, Freesync 2.0 falls short of G-Sync in technical VRR terms. Monitors have to exceed Freesync 2.0 specs to equal G-Sync spec for spec.

Instead, AMD lost the initiative (again), and now Nvidia has a certification program to certify those Freesync monitors that are 'close enough'.
 
It's not AMD's standard; it's their implementation of the VESA Adaptive-Sync standard first released in HDMI 1.2a. They have since helped better define things in FreeSync 2 (the next version of adaptive sync).

To be clear, this applies to DisplayPort, not HDMI. And even then FreeSync requires implementations of DP Alternate Mode that are not standard. It was just easy to implement in the barest fashion.

The HDMI version was absolutely defined by AMD as HDMI had no provision for variable syncing- it was never designed to be used as such. AMD (at least) got their HDMI VRR spec added, we should very much give them credit here!
 
HDMI VRR was an AMD invention that was ported from DisplayPort- DisplayPort Adaptive-Sync was a protocol that existed before either company's solution. HDMI VRR works because HDMI has moved to the same signaling type that DP uses, as initially HDMI was just DVI with a smaller plug. Now it's basically DisplayPort but with more DRM and licensing required.

In developing these into commercial solutions for desktop gaming, as the tech was designed first as a mobile power-saving feature, Nvidia created G-Sync to address the legion of shortcomings right out of the gate, and did so successfully. Like, every problem that could be imagined with a bare 'freesync' implementation, they fixed.

And when Nvidia released G-Sync, AMD showed up with a hacked-together demo of said bare solution, and then hundreds of demonstrably shitty Freesync implementations hit the market with zero specifications from manufacturers to identify all of the corners cut.

I don't blame Nvidia for taking the initiative to solve the problem right out of the gate. I blame AMD for failing to compete. Even now, Freesync 2.0 falls short of G-Sync in technical VRR terms. Monitors have to exceed Freesync 2.0 specs to equal G-Sync spec for spec.

Instead, AMD lost the initiative (again), and now Nvidia has a certification program to certify those Freesync monitors that are 'close enough'.
Sorry, my memory wasn't quite right on this; you are correct, it was DP. Since they use the same signaling, it was a no-brainer to extend support. Yes, I admit that AMD did it to get exposure and market share. I don't blame them for allowing a lower spec, as it allows a lower cost of entry; I do however wish they had more defined criteria and possibly levels of support, like low/mid/high, so you could more easily determine what you were getting even if you're not a techy person.
 
To be clear, this applies to DisplayPort, not HDMI. And even then FreeSync requires implementations of DP Alternate Mode that are not standard. It was just easy to implement in the barest fashion.

The HDMI version was absolutely defined by AMD as HDMI had no provision for variable syncing- it was never designed to be used as such. AMD (at least) got their HDMI VRR spec added, we should very much give them credit here!
Yes sir, sorry about that. They did it in a way that a lot of manufacturers that already supported it on DP could support with just a firmware update. And yes, as I mentioned, they worked with the industry instead of against it.
 
And yes, as I mentioned, they worked with the industry instead of against it.

In truth, neither were industry standards. Freesync is based upon an industry standard, but it is a unique implementation of such.

G-Sync took the idea and perfected it out the gate.

Both require special hardware.

Nvidia didn't open up G-Sync because they put significant effort into perfecting it before they announced- AMD 'opened up' Freesync because there was nothing else they could do. They had to cobble together a laptop to demonstrate their modification of DP when Nvidia announced; the damn thing didn't even have discrete graphics.

Freesync was cheaper at the outset because it was so fucking half-assed. I cannot reiterate that point enough. You need a page of tests to show that a Freesync display is as close to a G-Sync display as Freesync can get. With G-Sync, you just buy G-Sync. You're done.

AMD needs to define a 'Freesync Premium' standard with a testing regime so that they can certify those Freesync displays that are functionally equivalent to G-Sync.


Absent that, well, even potential Big Navi buyers will want to look for 'G-Sync Compatible' on monitors for the best experience :D.
 