What exactly did AMD get by “winning”?
More like the AMD customers won
^^^^^^^^^^^^ Good thread, lots of good talk, both criticism and praise.
My problem is that I would build a new system with a 3900X and new Navi, but there are simply no good games right now. So I just don't see the point of building a new system with nothing good to play on it.
AMD did not "win" financially
AMD promoted an open standard. They don't get licensing fees. They don't get to sell more cards because of any exclusivity. It doesn't even significantly burden their competition or make AMD more competitive in the field.
Maybe it's a moral win, if you want to look at it as something positive in the news for AMD, but it doesn't move the needle on anything else for AMD.
Or DXR is like PhysX and goes the way of the dodo, and AMD is betting on something that isn't proprietary.
I can't see how DXR or something else along those lines would make my gaming better / more enjoyable. I think just getting to DX12 will be fine for me, and I will leave the "megapixel race" to the few elitist players.
To me it seems like AMD won the FreeSync / G-Sync battle, and I think they can still win the next level of graphics too, and next-level graphics don't absolutely have to be DXR.
It wasn't taken away, they offered it up for free to everyone, including Nvidia, like most of their tech... Nvidia was the one that chose not to support it for so long, until they were running out of choices. It was just a driver update to enable, and it had been available in hardware for a long time. They just wanted to milk G-Sync as long as possible. Granted, I wish FreeSync defined minimum standards better, but G-Sync requiring licenses and custom Nvidia hardware was a crappy solution.

I'd agree, Oldmodder. AMD did basically win the FreeSync/G-Sync battle. You can obviously spin it either way (since Nvidia did gain an advantage by taking away AMD's last big feature), but I think in the long run you will see standards-based features winning out over proprietary ones (all else being equal).
With DXR it is a little different. While Nvidia's RTX hardware is proprietary, they were smart to work with Microsoft and incorporate it into DX12. This basically forces AMD's hand; they will be left behind even further if they can't come up with a viable RT solution.
And DXR does look nice. I actually chose to "upgrade" from a 1440p monitor to 1080p so I can experience DXR with higher framerates. And I'm happy I made that trade-off. It is obviously the future, and it will only get better as developers learn to use it.
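(Rough back-of-envelope math on that trade-off, just my own arithmetic rather than anything from a benchmark: 1440p pushes about 78% more pixels per frame than 1080p, so dropping down frees a lot of frame time for the RT workload.)

\[ \frac{2560 \times 1440}{1920 \times 1080} = \frac{3{,}686{,}400}{2{,}073{,}600} \approx 1.78 \]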
They won by not having to compete against a proprietary feature. If they had gone with a proprietary FreeSync, they would have had to compete with Nvidia to get monitor makers to adopt it. Now that Nvidia has had to fold and start supporting FreeSync, people are free to switch from Nvidia to AMD or AMD to Nvidia without having to purchase another monitor.
You keep saying Tensor cores (and RT cores, I assume) are not a long-term solution. Based on what? I mean, it's only the first generation of RTX.
Based on transistor count. They hardly have enough room at 12-15 billion transistors to jam enough shader cores in as it is. If Ampere has full tensor cores (or RT cores, it doesn't matter what we call them), it's going to push 20 billion transistors. At that point they are again going to have a chip that is much larger than the competition's... and larger chips are harder to fabricate. It's just physics: every wafer is going to have defects, and if your die size is large, the chance of getting fully functioning chips goes down. This is why yields on NV's highest-end Turing chips suck.
At 7nm, physics again becomes a problem with voltage leakage. Super high clock speeds become a problem with that many transistors packed in that tight. It's why Ryzen 2 isn't pushing crazy 5 GHz clocks... Intel's won't either. Once we move past 7nm to 5nm that becomes even more of a problem. Voltage leakage is one reason why everyone is designing chiplets. Ampere may be a chiplet design... I guess it's possible NV is planning to fab two chips, one shader and one tensor. At that point, though, they are going to be looking hard at how they can perform that math on shaders to avoid having to ship tensor hardware on consumer parts. If it's not chiplets, they will have to choose between ultimate shader performance and accepting lower clocks and fewer shaders so they can dedicate 3-5 billion transistors to tensor/RT cores.
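To put rough numbers on the die-size/yield point, here is a quick sketch using the classic Poisson yield model, yield ≈ e^(−defect density × die area). The defect density and die areas below are illustrative assumptions, not published figures for any particular node or chip:

```cpp
// Poisson yield model sketch: probability a die has zero fatal defects.
// All numbers here are assumptions chosen only to illustrate the trend.
#include <cmath>
#include <cstdio>

int main() {
    const double defects_per_cm2 = 0.2;  // assumed defect density for the process
    const double small_die_cm2   = 2.5;  // assumed ~250 mm^2 die (Navi-10-ish size)
    const double large_die_cm2   = 7.5;  // assumed ~750 mm^2 die (big-Turing-ish size)

    const double yield_small = std::exp(-defects_per_cm2 * small_die_cm2);
    const double yield_large = std::exp(-defects_per_cm2 * large_die_cm2);

    std::printf("small die yield: ~%.0f%%\n", yield_small * 100.0);  // ~61%
    std::printf("large die yield: ~%.0f%%\n", yield_large * 100.0);  // ~22%
    return 0;
}
```

Same wafer and same defect rate, and the big die still ends up with roughly a third of the small die's yield, which is the "larger chips are harder to fabricate" point in a nutshell.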
They definitely are. It was confirmed PS5 has ray-tracing, so AMD must be in the later stages or almost finished with their implementation.
Well, if they release a 5800 and 5900 that pretty much makes the 5700 cards their lowest offering, if we don't count old inventory. I don't think the low-end Nvidia cards support RT either, so I would not say it's sad.
I will agree that if the 5800, or at least the 5900, don't have RT then it will be a bit sad, but then again did anyone say (in regard to future consoles and so on) that it would be hardware RT?
Didn't they demo some form of RT already on older Vega cards? Not hardware RT, of course.
Option A: Big Navi lacks hardware ray-tracing --> it's dead on arrival

-IF- AMD can deliver a card by the holiday season that can compete with the 2080 Ti in NON ray-tracing performance for $100 less, it will be a huge success.
Well they took the easy path with the 5700. Sad.
AMD needs to have courage and push the future with RT like nVidia is.
AMD has Radeon Rays, which is a limited ray-tracing SDK, so they have some experience in the field, though this is not a competitor to DXR.
https://www.amd.com/en/technologies/radeon-prorender-developers
They were marketing it as a solution for professional CAD-like applications, and it has been around since before Nvidia announced their RTX plans.
Yeah, RIS works well and gives some extra performance boost when you need it. I only have a 1080p monitor on that rig, but running at 900p it is very good and only on close inspection can you tell it's not native res. At 1800p w/ a 4K screen I'm sure it would look even better.
I know you can do it on Reshade now, but only because AMD did it first and released it for free. Doesn't seem like it was heavily marketed, but it's a pretty huge feature. It may make 4K viable on this mid-range card.
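Rough math on why rendering below native and then sharpening helps so much, just my own arithmetic (assuming 1600x900 and 3200x1800 render targets): both cases shade only about 69% of the native pixel count, i.e. roughly 30% fewer pixels per frame.

\[ \frac{1600 \times 900}{1920 \times 1080} \;=\; \frac{3200 \times 1800}{3840 \times 2160} \;=\; \left(\tfrac{5}{6}\right)^{2} \approx 0.69 \]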
More like the AMD customers won
If you own an AMD card nothing really changes, so it's not clear what they won. Nvidia customers can now use FreeSync displays.
Nvidia and the monitor makers lose, as it'll be harder to extract that G-Sync premium now.
NVIDIA's RTX came after Microsoft's DX12 DirectX Raytracing (DXR) plans, FYI.
Somehow people got this idea that DXR is all NVIDIA's doing, when in fact it is Microsoft doing the enabling:
https://devblogs.microsoft.com/directx/announcing-microsoft-directx-raytracing/
This says it all:
"3D Graphics is a Lie"
This part is also worth noting:
"In addition, while today marks the first public announcement of DirectX Raytracing, we have been working closely with hardware vendors and industry developers for nearly a year to design and tune the API. "
AMD, Intel, NVIDIA... they all knew DXR was coming way in advance... this was no surprise to them.
Final quote:
"This is the power of DirectX Raytracing: it lets developers have their cake and eat it."
Yes, they released a mainstream card that has been selling extremely well... Indeed sad.
They also gave 5700 users things like Radeon sharpening for nothing... and it doesn't need developer implementation. The 5700 XT is actually very, very good at 4K with downsampled rendering and sharpening. I saw an A/B test the other day at the local geek shop and I could tell which one was downsampled... only because it was running a lot smoother. The IQ looked identical to my eye. NV's solution is what, DLSS? Which is horrid.
AMD is offering gamers features they can use today... not promising them features that won't work for real for another few generations of hardware.
Still at it?
I think you are still confused about Nvidia's RTX cards.
Turing doesn't do native DXR; there is a team that works with each game to get "RTX On" to work, because Nvidia needs to write a specific API for each game to get RTX to emulate DXR calls. "Interfaces with..." = emulation.
Turing was built for the enterprise world, not gaming... it doesn't matter if Nvidia knew DXR was coming; they have not released a chip that can natively do DirectX Raytracing. As such, Nvidia's AI/server cards take a 30% performance hit... so essentially Turing can't do DXR either.
When you have a true DXR card, it will do DX12 ray tracing natively. Those cards are coming in the next 13 months, both from Nvidia and AMD.
Already dropped.... plz follow along.
It is as if.... Some of you people who don't know things... purposely do so.
Turing doesn't do native DXR; there is a team that works with each game to get "RTX On" to work, because Nvidia needs to write a specific API for each game to get RTX to emulate DXR calls. "Interfaces with..." = emulation.
Really, post the source that says RTX is not utilizing DXR.
"Nvidia is bringing support." (Do those cards magically now do hardware raytracing?)
Plz understand you are being fooled by Nvidia's marketing and subtle play on words. You are missing the distinction between software support and hardware support. Navi can already do ray tracing... but there is no reason for it now, because it would impact the game's performance.
Jensen just hid really well behind all his convoluted "It Just Works" BS.
Turing & ray tracing is not a selling point (it's moot). It is ONLY a marketing point. You are trying to market Nvidia, but gamers are knowledgeable folk and are able to see through your gimmicks as well.