AMD on Ray Tracing, NVIDIA Embracing FreeSync: "No Benefit Today," "We Were Right"

Megalith

24-bit/48kHz
Staff member
Joined
Aug 20, 2006
Messages
13,000
AMD CEO Lisa Su participated in a roundtable Q&A after her CES keynote, touching upon ray tracing and NVIDIA’s support for Adaptive Sync. For ray tracing, Su said AMD is “deep into development” but doesn’t feel the need to hype it up yet: “The consumer doesn’t see a lot of benefit today because the other parts of the ecosystem are not ready.” She also said NVIDIA proved FreeSync was “the right answer for a couple of years” following their adoption of Adaptive Sync.

Look, we knew FreeSync was the right answer. We’ve known FreeSync was the right answer for a couple of years. The fact that others have decided that FreeSync is the right answer, I think, says that we made the right choice a few years ago. We believe in open standard. You know, we believe in open ecosystems. That’s been a mantra. So we have no issue with our competitors about FreeSync.
 
She's not wrong... someone should have told Jensen that a successful hardware launch with no software is a feat only Nintendo can pull off.

Hopefully AMD has a bunch of stuff to wow us with all year... open standards to push adoption are good, and launches supported by software are good. Hmm, makes sense; guess someone has to be the adult CEO.
 
I know you're being funny with that but nVidia's CEO really came off like a whiny little bitch when he said what he said.

AMD is in a renaissance as a company. They are about to become market leaders in CPUs at every consumer tier, and possibly in the mid and low range of the entire GPU market with 7nm later this year. Nvidia has lost about half its value in a short amount of time, and it's painfully obvious, with Jensen going out of his way to badmouth the competition at every opportunity. Typically, as a CEO, one tries to stay classy and let the products speak for themselves. If you have to resort to slander and petty insults, it says a lot about your current mindset of defeat.
 
If you have to resort to slander and petty insults, it says a lot about your current mindset of defeat.
Basically this. nVidia's CEO said that ray tracing was ten years in the making, and here we are and it's pretty much... meh. Toss in the fact that many early-release cards had the "space invaders" issue and you have a product launch that's been very much lackluster.
 
Ray tracing as an open standard will be pretty damn cool. I think it basically is... considering DirectX 12. I still wonder why DirectX 11 is still a faster API than DirectX 12. I thought 12's memory allocation and use would give it an edge over 11.
 
I've listened to Su speak several times now. I think she has class and comes across as very professional. I applaud her for taking the high road in how she has responded to Nvidia recently.
AMD was in a very bad place a few years ago - if Zen had been a flop, they were probably dead. Now that it is doing well and they have a killer pipeline (IMHO), I hope they start boosting their GPU R&D. They are obviously behind Nvidia.
I'm curious about their five-year plans for GPUs. AMD has parts in Xbox, PlayStation, Mac, and maybe a few other areas Nvidia is not competing in right now. If that's your bread and butter, maybe you don't try to compete with Nvidia on the high end? Consoles will never ship with a $1000 GPU - too expensive, and it would not sell in sufficient quantities. The $499 number is probably the high end of the magic price range in the console world. Is that enough budget for a $199 GPU? What if you are AMD and can sell an APU to cover both the compute and graphics horsepower of a console? The same chip should do well in laptops as well.
 
Real-time ray tracing will always be a unicorn. It will, at best, become a small part of the raster pipeline: even in BFV they added screen-space reflections (a current raster effect) as an overlay to actually improve on the RTX.
 
She's not wrong... someone should have told Jensen that a successful hardware launch with no software is a feat only Nintendo can pull off.

Hopefully AMD has a bunch of stuff to wow us with all year... open standards to push adoption are good, and launches supported by software are good. Hmm, makes sense; guess someone has to be the adult CEO.

The Wii U would disagree... and AMD does the same exact thing: all that support for TrueAudio, and the other great features that never really were supported, just like when they claimed the Holy Grail was their version of async compute, which hardly any developer used.
 
Real-time ray tracing will always be a unicorn. It will, at best, become a small part of the raster pipeline: even in BFV they added screen-space reflections (a current raster effect) as an overlay to actually improve on the RTX.

Uhm, ray tracing is the next actual evolution of rendering, in a similar way to how vector graphics shifted to rasterization. It's not about being a unicorn; it's about running it when we have the compute to actually run it in real time, that is all. NVIDIA got the ball rolling because they believe we are getting near a point where a hybrid approach is acceptable for the time being, and they are right. If you can run lighting and reflections in real time, that is more than acceptable for now. It will be a slow change, but the point is they found a place to start, which is the most important step.

I think people are both understating and overstating RTX by a large margin; it is what it is. I doubt many of you remember the FPS drop we had switching from 2D sprites to 3D rendering. A lot of games were terrible for frame drops at random intervals because the power really wasn't there yet; Forsaken was a great example of this. The point is, people act like this is a new situation. Every time we have an advancement of any major nature, people start crying about how much it sucks: "I wanted 200+ frames at the top resolution." It doesn't work that way.

Personally, I'd rather have graphics with context than the same old same old. Lighting, reflections, and shadows are just god-awful in rasterized games, with huge penalties for shadows due to their compute cost; that will now shift and behave more naturally, giving context to a scene rather than just blurry layers of black and grey. Sure, resolution is important, but when we hit 8K we hit a wall where the panel exceeds what your eyes can perceive. At that point, 4K with eye candy and a step closer to the uncanny valley would most likely be a better choice.
 
Even a blind dog finds a nut every once in a while.
Go AMD!
But if Dr. Su wants to be considered a real CEO, she is gonna hafta slander the green team a whole lot more. I mean really get dirty.

Honestly, I rather like her as a CEO. I'd rather not have all the pointless trash talking and lip service. Sure, there's going to be some, and that's expected.

But I have noticed that AMD seems to have more of a solid plan since she's been at the helm. They only have so many resources for R&D, and choices must be made.
 
You know, it's my belief that it doesn't matter if the game is beautiful if the game concept sucks. Case in point, Fallout 76; the game sucks.
 
You know, it's my belief that it doesn't matter if the game is beautiful if the game concept sucks. Case in point, Fallout 76; the game sucks.
While overall true, and even true in 76's case... as a player with a level 42 and a level 26 character, I don't know if I can say that FO76 "sucks".

It definitely has its shortcomings, and obviously has its fair share of bugs and oversights (in terms of the latter, such as a lot of weapon mods clearly not being labeled or specced correctly in comparison to others). A lot of elements in gameplay are lacking as well, and it seems like there's a clear lack of engine optimization (in terms of VRAM usage). Some of the changes they make are weird too, like increasing the nighttime lighting, and there are some ambient occlusion issues in buildings that make you wonder how they were never noticed. I do wish I had console access JUST to do as I would in FO4: fix immersion-breaking errors in textures or object placements.

On the flip side, there's still a lot to enjoy, and also a lot to look forward to. They do take player feedback into consideration, and I expect that we're going to eventually see NPCs. Flesh-and-blood NPCs, I mean, not robots. It seems odd that there were clearly people who lived just fine after the nukes, but suddenly died. Anyway, I don't know if he's been in the game the entire time, but just yesterday I came across a Traveling Vendor (as they were in the previous Fallout games), with a Pack Brahmin in tow... except it was a Super Mutant NPC!! Living, breathing, non-robotic. He had well-voiced "idle" dialogue too, comical and thought-provoking, heh. Granted, there were no dialogue options with him, which overall is something clearly lacking in the game... but I do have hope that the game will improve.


There are a lot of things games could learn from 76, and that Bethesda could learn from other games, too... But again, you're right that FO76 does look beautiful (a much better engine vs. FO4), yet it does in turn drop the ball in other regards.

I'd personally like to see ray tracing backported into older games. Granted, it'd be difficult given it's tied to the DX12 API, but surely there's a way to still access the RT cores, or whatever method AMD cooks up, to integrate it into older games. The benefit there is that, due to their age, they run better on the newer hardware, so you're actually showing off the tech in a better way... At least that's my opinion, heh.
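For what it's worth, any such backport would at least start at the same gate new titles use: asking the D3D12 runtime whether the driver exposes hardware ray tracing at all. A minimal, hypothetical sketch (the function name is my own, and it assumes an already-created ID3D12Device and a Windows 10 SDK with the DXR headers):

```cpp
#include <d3d12.h>

// Returns true if the device reports DXR tier 1.0 or better.
// Wiring this into an older engine's renderer is the hard part
// this sketch deliberately leaves out.
bool SupportsHardwareRayTracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

Passing that check is the easy part; rebuilding an old engine's lighting path to feed acceleration structures is where the "difficult" really lives.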
 
The question is, how many real customers of Nvidia actually care about this kind of stuff? I suspect a lot of folks that buy Nvidia don't even know or care.
 
Ray tracing as an open standard will be pretty damn cool. I think it basically is... considering DirectX 12. I still wonder why DirectX 11 is still a faster API than DirectX 12. I thought 12's memory allocation and use would give it an edge over 11.

DX jumps have always been a bit of a crapshoot. I remember Crysis supposedly being DX10, but you could just patch the game to enable all the DX10 effects on a DX9 card. DX11 was the advent of the infinite particle effects. And now we have a wonky DX12 that some of the largest studios on planet Earth can't get to run on par with DX11 without any perceivable visual benefit (I'm not talking about enabling ray tracing; that's a different metric for comparison). In the end it often comes down to the engine a game runs on and how optimized it is, not the DX level.
 
DX jumps have always been a bit of a crapshoot. I remember Crysis supposedly being DX10, but you could just patch the game to enable all the DX10 effects on a DX9 card. DX11 was the advent of the infinite particle effects. And now we have a wonky DX12 that some of the largest studios on planet Earth can't get to run on par with DX11 without any perceivable visual benefit (I'm not talking about enabling ray tracing; that's a different metric for comparison). In the end it often comes down to the engine a game runs on and how optimized it is, not the DX level.
IIRC, DX11 was the same... more a big extension to 10 where, again, most effects were possible in 10 with a bit of work.

I think 12 is finally a "new animal" compared to the previous versions, which is why some games' performance falters when running under 12. I think it's for two reasons: first, it's this different beast, which takes time to hone your optimizations for. Second, a lot of engines are ad-hocking DX12 support onto themselves, instead of being specifically written for it. Just a guess, though.
 
IIRC, DX11 was the same... more a big extension to 10 where, again, most effects were possible in 10 with a bit of work.

I think 12 is finally a "new animal" compared to the previous versions, which is why some games' performance falters when running under 12. I think it's for two reasons: first, it's this different beast, which takes time to hone your optimizations for. Second, a lot of engines are ad-hocking DX12 support onto themselves, instead of being specifically written for it. Just a guess, though.

Yup. DX11 is just DX10.2 (or whatever you like). It's not a bad addition, and it made some useful stuff mandatory, but it's really just a DX10 enhancement.

DX12 is a from-the-ground-up rewrite, but it's also a paradigm change on the PC side to better resemble the low-level capabilities of consoles. Unfortunately, it's also a bit of a catch-22; performance is increased without question, but implementation is on the developers, and well, they're not all great at that. SLI and Crossfire have the same dependency now, and we've seen the GPU makers back off of them significantly for the moment. Ray tracing is similar too, but it's also a bit of a paradigm shift, and while supporting it is relatively easy on its own, making it work with current game engines is going to be hit or miss.

On the upside, developers like Epic already have this stuff more or less figured out on the engine side. It's the engine licensees and those like EA that have their own engine (DICE's Frostbite) that will have to work a bit harder.
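To give a feel for why the burden lands on developers, here's a minimal, hypothetical sketch of the DX12 explicit submission model (the SubmitFrame function and its pre-created objects are assumptions for illustration; under DX11 the driver did this bookkeeping for you):

```cpp
#include <windows.h>
#include <d3d12.h>

// Under DX12 the application, not the driver, resets allocators, records and
// closes command lists, and synchronizes the CPU with the GPU via fences.
void SubmitFrame(ID3D12CommandQueue* queue,
                 ID3D12CommandAllocator* allocator,
                 ID3D12GraphicsCommandList* cmdList,
                 ID3D12Fence* fence,
                 UINT64& fenceValue,
                 HANDLE fenceEvent)
{
    // Safe only if the GPU has finished with this allocator; tracking that
    // is also the application's job.
    allocator->Reset();
    cmdList->Reset(allocator, nullptr);

    // ... record resource barriers and draw calls here ...

    cmdList->Close();
    ID3D12CommandList* lists[] = { cmdList };
    queue->ExecuteCommandLists(1, lists);

    // Explicit CPU/GPU synchronization: signal a fence, then wait on it.
    const UINT64 signalValue = ++fenceValue;
    queue->Signal(fence, signalValue);
    if (fence->GetCompletedValue() < signalValue)
    {
        fence->SetEventOnCompletion(signalValue, fenceEvent);
        WaitForSingleObject(fenceEvent, INFINITE);
    }
}
```

Multiply that by resource residency, barriers, and pipeline state objects, and it's easy to see how a rushed DX12 path ends up slower than the DX11 one.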
 
Yup. DX11 is just DX10.2 (or whatever you like). It's not a bad addition, and it made some useful stuff mandatory, but it's really just a DX10 enhancement.

DX12 is a from-the-ground-up rewrite, but it's also a paradigm change on the PC side to better resemble the low-level capabilities of consoles. Unfortunately, it's also a bit of a catch-22; performance is increased without question, but implementation is on the developers, and well, they're not all great at that. SLI and Crossfire have the same dependency now, and we've seen the GPU makers back off of them significantly for the moment. Ray tracing is similar too, but it's also a bit of a paradigm shift, and while supporting it is relatively easy on its own, making it work with current game engines is going to be hit or miss.

On the upside, developers like Epic already have this stuff more or less figured out on the engine side. It's the engine licensees and those like EA that have their own engine (DICE's Frostbite) that will have to work a bit harder.

I don't think we'll see normalized performance until the next-gen consoles come out and it becomes the gold standard for all development. One more year and it'll finally make sense to buy a new video card for all those GTX 970 owners still crushing 60fps maxed out at 1080p in every game.
 
I don't think we'll see normalized performance until the next-gen consoles come out and it becomes the gold standard for all development. One more year and it'll finally make sense to buy a new video card for all those GTX 970 owners still crushing 60fps maxed out at 1080p in every game.

With ray tracing, that's likely true; DX12 is going to take some doing too.
 
She's not wrong... someone should have told Jensen that a successful hardware launch with no software is a feat only Nintendo can pull off.


LOL!! Nintendoes what Nvidon't


Anyways, this is how a professional takes childish critiques. You go, Dr. Su.
 
In a related story, AMD has been stating the obvious for what they say is "years." Jensen Huang commented, but little was understood of what he rambled on about.

"She got me." Huang said as he wore his footy pajamas and curled up in the corner. "That mother fucker Su boomed me."

Huang added, "She's so good," repeating it four times.

Huang then said that he wanted to add VSR as another feature on Nvidia GPUs this summer.
 
The funny thing about all of these games with beautiful graphics is that in years past we didn't have high end graphics yet we were happy. We had side scrollers with 16-bit graphics and we were happy. WE WERE HAPPY I TELL YOU!

Fast forward to today, and unless a game has gee-whiz graphics it isn't considered to be a good game. How about making a game with great graphics and great gameplay? There's a reason why so many of the so-called triple-A game companies have seen a huge drop in stock prices: they've forgotten how to make great games.

It's not the graphics that make the game, it's the game that makes the game.
 
The funny thing about all of these games with beautiful graphics is that in years past we didn't have high end graphics yet we were happy. We had side scrollers with 16-bit graphics and we were happy. WE WERE HAPPY I TELL YOU!

Fast forward to today, and unless a game has gee-whiz graphics it isn't considered to be a good game. How about making a game with great graphics and great gameplay? There's a reason why so many of the so-called triple-A game companies have seen a huge drop in stock prices: they've forgotten how to make great games.

It's not the graphics that make the game, it's the game that makes the game.
Nah, I'm happy with whatever graphics, really.
 
The pic in the top news feed threw me for a moment; it almost looked like Nvidia's CEO with his leather jacket holding up a GPU.
 
I don't think we'll see normalized performance until the next-gen consoles come out and it becomes the gold standard for all development. One more year and it'll finally make sense to buy a new video card for all those GTX 970 owners still crushing 60fps maxed out at 1080p in every game.


This post made me lol... There are ZERO, I repeat ZERO, 970 owners that are crushing even 90% of the games on the market "maxed out @ 1080p and 60fps average." Nvidia's 1060, 1070, even 1080 cannot do that in EVERY game on the market. They can do it in a lot of games, but not every one. There is so much FUD spread here it is crazy. I have one of the fastest Vega 64s on the planet (1750/1100 MHz are my normal gaming clocks) and there are a few games out there that will stress it. Just maxing GTA V out with every possible option can cause crazy fps dips that drag things down. Crysis 3 can do the same thing. Those are 4-5-year-old games. Forget about new games like KCD, Ark, etc.
 
................... They are about to become market leaders in CPUs at every consumer tier .............
Wait, what do you mean by "every consumer tier"?

I just want to confirm that you aren't talking about in the server rooms of data centers.
 
We had side scrollers with 16-bit graphics and we were happy. WE WERE HAPPY I TELL YOU!
I remember thinking in the NES days that I wouldn't be fully satisfied until we had voxel-based graphics, where I could break into a house and cut people up any way I want and see all their bones and insides in photorealistic virtual reality... still a long way to go for that kind of thing.
 
Wait, what do you mean by "every consumer tier"?

I just want to confirm that you aren't talking about in the server rooms of data centers.

'Consumer' usually refers to the individuals who buy their CPUs + GPUs for home use through retail or OEM channels for personal utility. I'm not talking about niche server spaces or cell phone CPUs, only the news topic linked in this thread.
 
'Consumer' usually refers to the individuals who buy their CPUs + GPUs for home use through retail or OEM channels for personal utility. I'm not talking about niche server spaces or cell phone CPUs, only the news topic linked in this thread.


OK, thanks for confirming your comment. Now let me pose one for your own consideration: the "niche" server market, data centers, etc. That's on one side of the fence; on another side is the PC/consumer side you mentioned, and there's a third, which you also mentioned: mobile. With the growth of virtualization, the centralization of data centers, and the expansion of mobile, the PC market is shrinking, with the exceptions of enthusiast/gaming and the workplace workstation, and even that workstation space is giving way to virtualized VDI.

So I'm asking, how much market is there left to lead in? It's rapidly looking like a Pyrrhic victory, or an abdication. People can beat on MS for not pushing real advances in the PC/workstation CPU market, but maybe they are just going to ride what they have as long as they can because they see it as a market without legs.
 