[PCper] Richard Huddy, AMD Gaming Scientist, Interview - Mantle, GameWorks, FreeSync

Final8ty (Gawd, joined Jun 13, 2006, 1,001 messages)
AMD recently brought back Richard Huddy in the role of Gaming Scientist, acting as the information conduit between hardware development, the software and driver teams, and the game developers that make our industry exciting.

Richard stopped by the offices of PC Perspective to talk about several subjects including his history in the industry (including NVIDIA and Intel), Mantle and other low-level APIs, the NVIDIA GameWorks debate, G-Sync versus FreeSync and a whole lot more.

https://www.youtube.com/watch?v=8uoD8YKwtww

http://www.pcper.com/news/General-T...ew-AMDs-Richard-Huddy-June-17th-4pm-ET-1pm-PT
 
That was a really long and rather boring bitchfest about Nvidia.

Sadly, not much else there.
 
Gotta say, good on Mr. Huddy for a live interview in less-than-amicable territory.
 
Cool, thanks for posting. Interesting claim he's making on how Mantle will adapt faster than DX12 to hardware innovations. Sounds plausible but AMD will have to stay on the ball with their driver development.
 
First rule of GameWorks: you don't talk about GameWorks.

Heads gonna roll... (so much for AMD not having close developer relations)
 
So it is his fault Ryan was asking those questions?

I don't know, maybe there was a contract with an NDA clause?

Richard just kept answering, and the answers were not at all surprising to me. But it took way too much time and put way too much emphasis on it.
 
Yeah, I don't have time to watch a 1:30 interview. I know AMD's stance on GameWorks; at what point in the video do they talk about FreeSync? That's the ONE indefensible sin I see Nvidia committing. GameWorks? Yeah, it's a locked-tight product, but I don't think it is as bad as some people make it out to be. Mantle? Does not affect me YET. I have an AMD card, but BF4 was a disaster and Thief was a joke. Once Mirror's Edge comes out, then I'll comment. For now, I'm not worried. But G-Sync? I remember Nvidia blatantly saying (something to the effect of) "yes, this can easily be ported over to the DisplayPort standard. No, we aren't going to do that." Then AMD was like "lolz, okay, we'll do it."
 
I anticipate much bitching and fighting in this thread by crazy people.
 
We haven't had an appearance by PRIME1, Unknown-one, or Xoleras.

It's not a party till one of those jokers shows up.

lol
 
AMD's issue with GameWorks is Nvidia being able to run purposely unoptimized code on AMD hardware, with nothing AMD could do about it. Their red flag is that devs aren't allowed to optimize the code to run more efficiently on AMD hardware and can't show the code to AMD. If the source code is available and devs, or whoever, are allowed to do their own optimizations, then they can counter any shenanigans like that. As Huddy said, they actually made the TressFX source available (before release), and devs or Nvidia can optimize the code as they see fit; they just have to make their optimized code available too. That serves two purposes: transparency and advancing the code.
 
Yeah, I don't have time to watch a 1:30 interview. I know AMD's stance on GameWorks; at what point in the video do they talk about FreeSync? That's the ONE indefensible sin I see Nvidia committing. GameWorks? Yeah, it's a locked-tight product, but I don't think it is as bad as some people make it out to be. Mantle? Does not affect me YET. I have an AMD card, but BF4 was a disaster and Thief was a joke. Once Mirror's Edge comes out, then I'll comment. For now, I'm not worried. But G-Sync? I remember Nvidia blatantly saying (something to the effect of) "yes, this can easily be ported over to the DisplayPort standard. No, we aren't going to do that." Then AMD was like "lolz, okay, we'll do it."

The last bit, I thought, was about FreeSync.
Monitors will need to support certain refresh-rate ranges for FreeSync (which won't be the official designation when they're sold) to work. Monitors should be available near 2015, some of the cards can support it through a BIOS upgrade, and he said something about the APUs supporting it as well.

FreeSync has one major advantage over G-Sync: it doesn't use a frame buffer to store frames, it allows communication between the devices. That was one of the key differences between them.

Coming back to monitors, he said something about some of the models produced around September 2014 being able to work via a flashable upgrade.
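
He didn't get into the mechanics, but the basic idea behind adaptive sync is that the panel advertises a supported refresh window and the source times each frame presentation inside it, instead of buffering the frame on the monitor side the way the thread says G-Sync's module does. A rough sketch of that scheduling idea is below; the 30-144 Hz window, the frame times, and the frame-repeat handling are illustrative assumptions on my part, not anything AMD has published.

[CODE]
// Minimal sketch (not any vendor's actual API): how a variable-refresh
// scheme can map an arbitrary frame time onto a panel's supported range.
// The 30-144 Hz window and the frame times below are made-up examples.
#include <cstdio>

struct RefreshRange {
    double min_hz;   // slowest refresh the panel supports
    double max_hz;   // fastest refresh the panel supports
};

// Returns the interval (ms) the display is asked to hold the frame,
// plus how many times the frame must be repeated when the game runs
// slower than the panel's minimum refresh rate.
void schedule(double frame_ms, const RefreshRange& r,
              double* hold_ms, int* repeats) {
    const double min_interval = 1000.0 / r.max_hz;  // ~6.9 ms at 144 Hz
    const double max_interval = 1000.0 / r.min_hz;  // ~33.3 ms at 30 Hz

    *repeats = 1;
    double interval = frame_ms;
    if (interval < min_interval) {
        interval = min_interval;            // faster than the panel: wait
    }
    while (interval > max_interval) {       // slower than the panel:
        ++*repeats;                         // re-show the frame so each
        interval = frame_ms / *repeats;     // refresh stays inside the range
    }
    *hold_ms = interval;
}

int main() {
    RefreshRange panel{30.0, 144.0};        // hypothetical FreeSync range
    for (double frame_ms : {5.0, 12.5, 40.0}) {
        double hold; int reps;
        schedule(frame_ms, panel, &hold, &reps);
        std::printf("frame %.1f ms -> hold %.1f ms, shown %d time(s)\n",
                    frame_ms, hold, reps);
    }
}
[/CODE]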
 
Honestly, there are so many trolls in this section, I'm not sure.

But those three are the most active...
 
In the video he mentions "iterating more quickly" in the context of being something developers want... not sure I follow his line of reasoning.

Game developers tend to put consoles first, and consoles don't really iterate at all. It's a fixed hardware set for an entire generation (and a generation can be around for many years). A dev can standardize and learn to get the most out of that generation's hardware and software over many years.

In fact, consoles are often cited as the reason why PC games have been mostly stuck at DX9-level graphics for all these years (both the 360 and PS3 use largely DX9-era hardware). Didn't matter how good the graphics APIs were on the PC, most games were still targeted at the current consoles' limitations. Least-common-denominator wins out.

Edit 1: he also claims Battlefield 4 is a "terrific, terrific piece of code" and that in his personal opinion "it's the best piece of code out there in a game engine at the moment"... yeah, the endless bug reports surrounding that game don't really agree with him there.

Edit 2: He complains about Batman's cape being so heavily tessellated that it's inefficient on both AMD and Nvidia hardware, using the fact that the efficiency drop-off on AMD hardware is worse with gobs of tessellation applied to increase the performance gap between the two vendors' cards... except AMD enforces tessellation settings on a per-game basis at the driver level. How is the dev's tessellation choice relevant when they can elect to ignore it?

We haven't had an appearance by PRIME1, Unknown-one, or Xoleras.

It's not a party till one of those jokers shows up.
Can't speak for PRIME1, or Xoleras, but I'm not joking in any of these threads.
Be careful, they will report you for just saying their names lol
I haven't reported anyone on these forums in years, actually.
"That's not what I said"
Hey, up the reading comprehension and you won't see that quote as often.
Ha, I can't wait 'till they show up.
You guys would rather bitch about opinionated users than stay on-topic, and they're somehow the trolls?

Riiiiiight....
 
The tessellation option in Catalyst came after the fact :eek:

There's absolutely no reason to tessellate an underwater layer that sits under the map in Crysis 2.
When devs can't change the .dll that Nvidia provides, it's hardly the devs' choice to gimp performance.
At some point too much tessellation outweighs the visual benefit, and in the cape example you're tenfold past that threshold.
 
Edit 2: He complains about Batman's cape being so heavily tessellated that it's inefficient on both AMD and Nvidia hardware, using the fact that the efficiency drop-off on AMD hardware is worse with gobs of tessellation applied to increase the performance gap between the two vendors' cards... except AMD enforces tessellation settings on a per-game basis at the driver level. How is the dev's tessellation choice relevant when they can elect to ignore it?

He's trying to say that if you don't go full retard with tessellation, both parties can benefit more from it. Beyond the cost of actually crunching the triangles, very small triangles aren't good for rasterizer efficiency since modern GPUs like to shade in 2x2 pixel squares.

The tessellation option in Catalyst came after the fact :eek:

There's absolutely no reason to tessellate an underwater layer that sits under the map in Crysis 2.
When devs can't change the .dll that Nvidia provides, it's hardly the devs' choice to gimp performance.
At some point too much tessellation outweighs the visual benefit, and in the cape example you're tenfold past that threshold.

They overkilled the absolute fuck out of tessellation in Crysis 2, but there is method to the apparent madness. The water not being culled away is an intentional design decision, so even though it still gets submitted for drawing, it should be quite cheap. They rely on the pixel shader (probably the heaviest part) being skipped on occluded bits, and the geometry load of it being negligible.
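
To put a rough number on the 2x2-quad point above: here's a toy model of how the useful fraction of shading work falls off as triangles shrink. The constants (triangle shape, edge-quad estimate) are invented purely for illustration and are not measurements of any real GPU.

[CODE]
// Back-of-the-envelope illustration of the 2x2-quad point, not a
// measurement of any real GPU. Assume a compact triangle of area A pixels
// touches roughly A/4 fully covered quads plus extra partially covered
// quads along its edges; the smaller the triangle, the larger the fraction
// of shaded lanes that get thrown away.
#include <cmath>
#include <cstdio>

int main() {
    // Edge length scales with sqrt(A); assume ~1 partial quad per 2 pixels
    // of perimeter. These constants are illustrative only.
    for (double area : {10000.0, 400.0, 16.0, 2.0, 0.5}) {
        double perimeter      = 4.5 * std::sqrt(area);   // ~equilateral
        double interior_quads = area / 4.0;
        double edge_quads     = perimeter / 2.0;
        double shaded_pixels  = 4.0 * (interior_quads + edge_quads);
        double efficiency     = area / shaded_pixels;    // useful / shaded
        std::printf("triangle area %8.1f px -> ~%.0f%% of shaded lanes useful\n",
                    area, 100.0 * efficiency);
    }
}
[/CODE]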
 
::the red phone rings::

Unknown-one answers the phone.

"What is that, Mr. President? Positive words said or written about AMD?? I'm on it!"
 
He's trying to say that if you don't go full retard with tessellation, both parties can benefit more from it.
Sure, but AMD has per-game profiles that control tessellation, so his complaint is baseless. A game can't go "full-retard" with it unless AMD allows it.

::the red phone rings::

Unknown-one answers the phone.

"What is that, Mr. President? Positive words said or written about AMD?? I'm on it!"
What are you even talking about?

The portions of the video I was responding to had nothing to do with "positive words said about AMD." Dude in the video made nonsensical claims and cited problems that aren't actually problems on AMD hardware; I pointed out that the BS-o-Meter should be tripping for anyone watching said portions.

Do you actually have some kind of point? Do you agree, disagree, or are you just blindly trolling people now?
 
Do you actually have some kind of point? Do you agree, disagree, or are you just blindly trolling people now?

I agree with some of your claims, I disagree with others. But mostly I just love to watch you search-and-destroy anything positive about AMD.
 
In the video he mentions "iterating more quickly" in the context of being something developers want... not sure I follow his line of reasoning.

Game developers tend to put consoles first, and consoles don't really iterate at all. It's a fixed hardware set for an entire generation (and a generation can be around for many years). A dev can standardize and learn to get the most out of that generation's hardware and software over many years.

In fact, consoles are often cited as the reason why PC games have been mostly stuck at DX9-level graphics for all these years (both the 360 and PS3 use largely DX9-era hardware). Didn't matter how good the graphics APIs were on the PC, most games were still targeted at the current consoles' limitations. Least-common-denominator wins out.

Edit 1: he also claims Battlefield 4 is a "terrific, terrific piece of code" and that in his personal opinion "it's the best piece of code out there in a game engine at the moment"... yeah, the endless bug reports surrounding that game don't really agree with him there.

Edit 2: He complains about Batman's cape being so heavily tessellated that it's inefficient on both AMD and Nvidia hardware, using the fact that the efficiency drop-off on AMD hardware is worse with gobs of tessellation applied to increase the performance gap between the two vendors' cards... except AMD enforces tessellation settings on a per-game basis at the driver level. How is the dev's tessellation choice relevant when they can elect to ignore it?

You are confused about what programmers want and what companies do to save money. They save money on PC development because management tells them where the priorities lie. This has nothing to do with why games on PC are stuck at DX9. They are stuck because that is the largest common denominator. If that were DX11, then it would be DX11.

Makes sense if you want to sell games?

BF4 pushes boundaries, so why don't you name 5 games that have more features than what is being used in the Frostbite 3 engine? More code = more chances of problems; normal way of life.

The tessellation issue says something about people doing stupid shit just because they can. It does not make any sense, and what you typed about it doesn't make any sense either.
 
I agree with some of your claims, I disagree with others. But mostly I just love to watch you search-and-destroy anything positive about AMD.
Except what I was responding to wasn't positive about AMD. Seriously, go watch the video, he flat-out says that high levels of tessellation are handled worse by AMD cards, and that fact could be used by a developer to sabotage performance.

Pointing out that that's BS (because AMD can tweak tessellation on a per-game basis right from their driver) is being positive towards AMD.

You really need to read more closely. You seem to think I'm being negative towards AMD when I'm simply calling-out clear nonsense from the video.

You are confused about what programmers want and what companies do to save money. They save money on PC development because management tells them where the priorities lie. This has nothing to do with why games on PC are stuck at DX9. They are stuck because that is the largest common denominator.
Not confused, that backs up my point: "Least-common-denominator wins out."

Even if it's what devs want, it doesn't matter, because they're generally not in control. Doesn't matter how good the APIs are on PC, because consoles are put first.

End result? Exactly what I said earlier: "Least-common-denominator wins out."

BF4 pushes boundaries, so why don't you name 5 games that have more features than what is being used in the Frostbite 3 engine? More code = more chances of problems; normal way of life.
Irrelevant to what I said. Being a bug-filled feature-bucket doesn't make it a "fantastic piece of code."

Even you said more code causes more problems, and that's the "normal way of life." So it's normal... not fantastic.

The tessellation issue says something about people doing stupid shit just because they can. It does not make any sense, and what you typed about it doesn't make any sense either.
What I typed made perfect sense. The tessellation "issue" isn't an issue at all for AMD. They control the level of tessellation using per-game profiles baked into the video driver.

A dev can't sabotage AMD by using ridiculous quantities of tessellation, because AMD can simply lower the level again.

All a dev would gain by over-tessellating their models is lower performance on Nvidia hardware (as Nvidia drivers have no way to limit this feature, it has to be an in-game option).
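
For what it's worth, the effect of such a driver-side cap is simple to picture: whatever tessellation factor the game requests gets clamped to a per-profile maximum before the hardware tessellator sees it. A minimal sketch of that idea follows; the profile name and factor values are invented, and this is in no way AMD's actual driver code.

[CODE]
// Illustration only (not AMD's driver code): the effect of a driver-side
// tessellation cap is simply that whatever factor the game requests gets
// clamped to a per-profile maximum before the hardware tessellator sees it.
#include <algorithm>
#include <cstdio>

struct GameProfile {
    const char* name;
    float max_tess_factor;   // hypothetical per-game cap (e.g. "AMD Optimized")
};

float apply_cap(float requested, const GameProfile& p) {
    return std::min(requested, p.max_tess_factor);
}

int main() {
    GameProfile batman{"batman_ac.exe", 16.0f};   // made-up profile values
    // A cape tessellated with factor 64 would reach the tessellator as 16,
    // so the extreme request costs the game's look, not AMD's frame rate.
    for (float requested : {8.0f, 32.0f, 64.0f}) {
        std::printf("%s: requested %.0f -> used %.0f\n",
                    batman.name, requested, apply_cap(requested, batman));
    }
}
[/CODE]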
 
The tessellation "issue" isn't an issue at all for AMD. They control the level of tessellation using per-game profiles baked into the video driver.

A dev can't sabotage AMD by using ridiculous quantities of tessellation, because AMD can simply lower the level again.

All a dev would gain by over-tessellating their models is lower performance on Nvidia hardware (as Nvidia drivers have no way to limit this feature, it has to be an in-game option).

Speaking as a 3D artist, in Unreal Engine (the engine I use primarily) the artist has little control over tessellation, and it's implemented in a very clumsy way (controlled entirely by camera distance). On my 7970, if I start with a mostly high-res mesh and enable tessellation, the FPS drops like crazy because it over-tessellates the mesh. A wireframe model quickly becomes a white silhouette because there is more than 1 triangle per pixel. This is even with 'AMD Optimised' tessellation enabled in the driver.

Point is that the engine coders have much more control over the tessellation commands than the video driver.
 
Speaking as a 3D artist, in Unreal Engine (the engine I use primarily) the artist has little control over tessellation, and it's implemented in a very clumsy way (controlled entirely by camera distance). On my 7970, if I start with a mostly high-res mesh and enable tessellation, the FPS drops like crazy because it over-tessellates the mesh. A wireframe model quickly becomes a white silhouette because there is more than 1 triangle per pixel. This is even with 'AMD Optimised' tessellation enabled in the driver.

Point is that the engine coders have much more control over the tessellation commands than the video driver.

This behavior reminds me of a character from the First Law series named Sand Dan Glokta. He was an embittered cripple filled to bursting with spite for all those around him. He was a disfigured mess of a human being, and every step he took was an exercise in pain. And he HATED all those around him who were healthier and better off than he was.

One day, one of his underlings was severely injured and barely escaped with her life, she would recover eventually, but at that moment it was one of the few times in his life where someone would be in worse shape and more pain than he was. He tasked her to follow him somewhere. At one point he looked back and saw her struggling to keep up in her broken state, and so, he picked up his pace. It hurt him to do so, but as he said to himself, "but it hurts her more!"

Such a delightfully spiteful creature was a joy to read about; he was darkly hilarious. Of course, you don't want to see that type of attitude and behavior in a GPU company.
 
Speaking as a 3D artist, in Unreal Engine (the engine I use primarily) the artist has little control over tessellation, and it's implemented in a very clumsy way (controlled entirely by camera distance). On my 7970, if I start with a mostly high-res mesh and enable tessellation, the FPS drops like crazy because it over-tessellates the mesh. A wireframe model quickly becomes a white silhouette because there is more than 1 triangle per pixel. This is even with 'AMD Optimised' tessellation enabled in the driver.

Point is that the engine coders have much more control over the tessellation commands than the video driver.

I think Huddy's point too was that the tessellation in HairWorks is controlled by the DLL and is not allowed to be changed by the dev. There was a lot of info given, but IIRC that was what he said. Do you know anything about this?
 
BF4 pushes boundaries, so why don't you name 5 games that have more features than what is being used in the Frostbite 3 engine? More code = more chances of problems; normal way of life.
QuickBooks has a lot of features. Having many features means little if the features are inherently unreliable. (For reference, QuickBooks can stumble on very simple arithmetic operations, violating its invariants.)

Performance is a quality of a codebase, especially in a firm real-time system like a game, but so is reliability. And BF4 holds up very badly when it comes to reliability.
 
I'm pretty sure BF4's issues were to do with netcode, servers, and gameplay bugs.

Did you notice textures not loading, animation issues, shadows disappearing? Those sorts of things are engine-specific, not game-specific.
 
This is where I am going to call you out: you make up answers and don't even reply to what is being written. It is so sad that you are just typing nonsense non-stop.
What did I make up? What was nonsense? You didn't point anything out.

Where didn't I reply to what was being written? I quote-blocked and replied individually on a point-by-point basis.

On my 7970, if I start with a mostly high-res mesh and enable tessellation, the FPS drops like crazy because it over-tessellates the mesh. A wireframe model quickly becomes a white silhouette because there is more than 1 triangle per pixel. This is even with 'AMD Optimised' tessellation enabled in the driver.
Sounds like AMD doesn't have a limiter enabled for UDK. Kinda makes sense, since it's a dev tool.

You can force a lower level (that setting is a slider, after all) for non-profiled programs. You should be able to force much lower levels of tessellation than you're seeing there.
 
What did I make up? What was nonsense? You didn't point anything out.

Where didn't I reply to what was being written? I quote-blocked and replied individually on a point-by-point basis.


Sounds like AMD doesn't have a limiter enabled for UDK. Kinda makes sense, since it's a dev tool.

You can force a lower level (that setting is a slider, after all) for non-profiled programs. You should be able to force much lower levels of tessellation than you're seeing there.

Tried that as well; it's an issue with how tessellation is handled at the shader level. I'm pretty sure AMD's tessellation (or any driver-level tessellation) restriction limits a single input variable, usually for a basic "tessellate surface x by Y amount".

If the shader is a bit more complex, such as tessellation based on angle or texture, where is the limiter put?

And in the editor, the tessellation has an excuse to be as excessive as it wants, but running a game test launches a separate application.
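
To make that concrete, here's a toy version of a distance-driven factor like the one described, with an external cap applied on top. Every number in it is invented; the point is only that a single cap bounds the final factor but can't compensate for a base mesh that's already dense or for more elaborate factor logic inside the shader.

[CODE]
// Sketch of the point above: when the tessellation factor is computed inside
// the shader from things like camera distance, a driver cap can only bound
// the final number; it can't know the mesh was already dense to begin with.
// Everything here is a simplified stand-in, not Unreal's or AMD's actual code.
#include <algorithm>
#include <cstdio>

// Hypothetical per-patch factor: high up close, falling off with distance.
float distance_based_factor(float distance_m) {
    float f = 64.0f / std::max(distance_m, 1.0f);
    return std::max(1.0f, std::min(f, 64.0f));
}

int main() {
    const float driver_cap   = 16.0f;    // assumed external limit
    const long  base_patches = 200000;   // an already high-res source mesh

    for (float dist : {1.0f, 5.0f, 25.0f}) {
        float f      = distance_based_factor(dist);
        float capped = std::min(f, driver_cap);
        // Triangle count grows roughly with the square of the factor.
        long tris = static_cast<long>(base_patches * capped * capped);
        std::printf("distance %5.1f m: factor %4.1f (capped %4.1f) -> ~%ld tris\n",
                    dist, f, capped, tris);
    }
    // Even capped at 16, a 200k-patch mesh explodes to tens of millions of
    // triangles; the cap bounds the damage but can't substitute for a
    // lower-resolution base mesh or smarter factor logic in the shader.
}
[/CODE]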
 
QuickBooks has a lot of features. Having many features means little if the features are inherently unreliable. (For reference, QuickBooks can stumble on very simple arithmetic operations, violating its invariants.)

Performance is a quality of a codebase, especially in a firm real-time system like a game, but so is reliability. And BF4 holds up very badly when it comes to reliability.

And you know this how? Are you developing software using the Frostbite 3 engine? If it didn't hold up, why is the engine used in multiple games to date? Why are there no cancellations of games using the Frostbite 3 engine?
 
Where didn't I reply to what was being written? I quote-blocked and replied individually on a point-by-point basis.

All of it. There is simply no point in telling you what is wrong; you keep persisting. When I or others say things, you make up stuff and run with it.
 