Shouldn't the big game engine developers be creating GameWorks-like GPU libraries?

tybert7

Unity?
CryEngine?
Unreal Engine 4?
Frostbite 3?

etc?


I get that Nvidia has a lot of money to burn and wants to capitalize on its market position and profits to make its own products stand out, but why aren't the game engine developers creating their own in-engine tools to make creating effects easier and more streamlined for game developers?

Do they not have the resources to take on that task? Is optimizing such effects across AMD/Nvidia/Intel graphics so much more difficult that it's not worth the trouble?

The answer may be obvious, but you would think the creators and architects of the big game engines would want a suite of tools like GameWorks and beyond to make their engines more attractive for game developers to use. So why is it not happening? Or maybe it is and has been, and I am just not aware of it because all we ever hear about is GameWorks.
 
The answer is obvious: time and money. All these companies care about now is making their shareholders happy.
 

The reason you hear about GameWorks is that you likely follow news about graphics hardware and related technology, so those are the sources covering the topic. And since Nvidia has just launched a new product, that's what's in the news.

But many games and game engines (including the ones you've listed) have used third-party middleware for decades (definitely as far back as the '90s), not just for graphics but also for other aspects, like sound. If you pay attention to a game's title loading screen (or even some of the splash screens/videos on startup) and look at all the trademark/copyright/legal information, you'll see they license a lot of third-party technology.

Also, GameWorks is more end-user-facing, both in terms of marketing and to some extent direct impact, so it's more noticeable to an end user and gets more coverage.

Game engines and games (using in-house engines) often do have alternatives to what GameWorks offers. However, those companies are in the business of either licensing the engine as a whole or selling a game directly, so you won't see the parts repackaged and licensed out separately.

In terms of business economics and practicality, it also isn't just a question of capability; you have to weigh the benefit against the opportunity cost.
 
The vast majority of games use private, in-house engines; the majority of commercial games use the engines listed above; and Nvidia is more or less a pure graphics company.

So,
part 1) they don't have the time, the money, or both to do this
part 2) they want to ensure they get the lion's share of profits
part 3) they want to keep everything theirs and charge big money for it, and even when their way of doing it isn't the best, they cripple others to make it seem like it is.

It's hard to optimize because, generally speaking, Nvidia and AMD design their GPUs differently, so effects need to be tuned and coded differently for each. CPU-wise, Intel and AMD also handle their number crunching differently.

I think if they all had to have a minimum of x shaders, for example, or follow specific guidelines on certain required features, that could change things and make it easier to set standards as you mentioned. But that isn't happening anytime soon; the best we get so far seems to be the middle grounds like Blender, DirectX, OpenGL, and so forth.
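To make the per-vendor tuning point concrete, here's a minimal C sketch of the kind of branching engines end up doing: query the OpenGL driver strings and pick a tuned code path. The enum, function name, and the comments about what each path would do are illustrative, not any engine's real API.

```c
/* Minimal sketch: pick a render path from the OpenGL vendor string.
 * Requires a current GL context; the names below are hypothetical. */
#include <string.h>
#include <GL/gl.h>

typedef enum { PATH_GENERIC, PATH_NVIDIA, PATH_AMD, PATH_INTEL } RenderPath;

RenderPath choose_vendor_path(void)
{
    const char *vendor = (const char *)glGetString(GL_VENDOR);
    if (!vendor)
        return PATH_GENERIC;
    if (strstr(vendor, "NVIDIA"))
        return PATH_NVIDIA;  /* e.g. lean on NV-specific extensions */
    if (strstr(vendor, "ATI") || strstr(vendor, "AMD"))
        return PATH_AMD;     /* e.g. tune shaders for GCN register pressure */
    if (strstr(vendor, "Intel"))
        return PATH_INTEL;   /* e.g. budget for integrated-GPU bandwidth */
    return PATH_GENERIC;
}
```

Multiply that branching by every effect and every driver generation, and the maintenance cost described above adds up fast.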
 
Resources could be a legitimate reason, and that's why I think UE4's open-source model is going to win out over time. Now they can take contributions from interested parties and game developers all around the world. Long-term it's a good strategy. It would be controversial, but it might even be better for them to switch to a GPLv2-type license, though I think game developers would howl.
 

As far as the resources question goes, it just seems strange to me. There are plenty of people creating games on mobile platforms on Qualcomm/Exynos/A7/etc. chips that are not Nvidia-specific hardware.

Same goes for tens or hundreds of thousands of game developers contributing to different engines.

It's not like a powerful and efficient toolset for creating realistic physics wouldn't be useful to a large chunk of these people, or more advanced lighting effects, or wave mechanics, or any number of graphical techniques. Streamlining would help everyone who needs to create a game where such effects add to the visuals.
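For concreteness, here's a tiny C sketch of the kind of reusable building block such a toolset would bundle: a damped spring stepped with semi-implicit Euler integration. It's an illustration of the idea, not any engine's actual API; all names and constants are made up.

```c
/* Illustrative only: a damped spring advanced with semi-implicit
 * Euler, the sort of small, reusable physics primitive an engine-side
 * effects toolkit could ship for every GPU vendor alike. */
#include <stdio.h>

typedef struct { float pos, vel; } Particle;

static void spring_step(Particle *p, float k, float damping, float dt)
{
    float accel = -k * p->pos - damping * p->vel; /* Hooke's law + drag */
    p->vel += accel * dt;   /* update velocity first... */
    p->pos += p->vel * dt;  /* ...then position with the new velocity */
}

int main(void)
{
    Particle p = { 1.0f, 0.0f };  /* start displaced, at rest */
    for (int i = 0; i < 10; ++i) {
        spring_step(&p, 10.0f, 0.5f, 1.0f / 60.0f);
        printf("t=%.3f pos=%.4f\n", (i + 1) / 60.0f, p.pos);
    }
    return 0;
}
```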


But maybe it has to do with Nvidia just outright paying people to embed themselves with specific game developers to enhance certain effects, with that set of effects of course restricted to newer Nvidia hardware.

It just seems like it would be so much healthier for the entire industry to have game engine liaisons embed themselves with certain game devs to help implement the more advanced and exotic flourishes and effects that have been streamlined within a given game engine.

That should be one of the perks of using an engine for games of a certain size/audience. It's hard for me to believe that Nvidia is the ONLY company on the planet solvent enough to engage in that kind of development assistance and outreach. But maybe they are.

And even if you are an Nvidia fanboy, that is bad news for those very lovely and useful GameWorks effects, because a lot of developers who might have wanted to use some of the more advanced effects won't, since they only work on Nvidia cards. Two-thirds of the market is great, until you consider that it is only one among several gaming GPU markets.

GameWorks isn't on consoles at all, Nvidia is dwarfed in mobile, and it doesn't play at all in the hybrid CPU/GPU market the way Intel and AMD do.

This is just not an ideal avenue for such a library if the goal is to increase the adoption of better effects. But then I suppose that was never the goal.
 
Historically the consoles weren't that adept at handling a lot of these special effects, and game developers only cared about consoles. Porting the game to the PC was more of an afterthought than anything else... and to be honest, I don't really think that mindset has substantially changed.
 
This is what real game developers think of Nvidia these days: http://yosoygames.com.ar/wp/2014/09/is-everything-ok-nvidia/

There has been a growing distance between NVIDIA and their developers over the past 6 years. I used to log in to their website, the “developer zone” as they called it, at least once a month to see their new hotness.
They’ve created Cg, FX Composer, PerfHUD. And oh dear, those were useful back then.
Not only that. Their demos were very good, and their slides section from SIGGRAPH and GDC were always really good. Anyone remember the Human Skin demo? Mesmerizing.

We would often see those colour-coded slides with optimization tips, green for NVIDIA-only cards, red for AMD-only cards. But it's been a long time since I came to their dev website and got something useful, or felt they've contributed anything meaningful (except for the Tegra Android Pack, which is an awesome tool), whereas I keep looking at AMD's website much more often: useful slides, useful tips, technical documentation, their demos (oh! the Leo demo is still banging around in the heads of many devs), and their tools! (CodeXL, PerfStudio).

NVIDIA kept moving away from its devs, while AMD kept moving closer. I know so much about the GCN architecture that I can even predict the next AMD-specific GL extensions (like GL_AMD_vertex_shader_viewport_index) or the lifting of useless restrictions (like the 64kb UBO limit). Why? Because AMD keeps being open, with lots of documentation, GCN performance tips, and more. Not to mention their full spec docs are open for open-source driver implementations.

What do I know about the Kepler architecture? Nothing. Zero. Zip. Nada. It's like it's all a secret. I'd better not find out how it works. I know when targeting GCN that I should optimize for register pressure. I know I can use random access to UBOs without worrying about shader constant waterfalling. I know how much a divergence costs. I know that 32-bit integer multiplication is expensive, but that I can use bit-shifting tricks or 24-bit multiplication.
I know that packing attributes doesn't make much difference. I know the export cost of each render target format. I know the sampling speed of each texture format for each filtering type.
And I can optimize accordingly. Does any of this apply to Kepler? I have no idea. NVIDIA doesn't tell (I've been informed you get some docs after signing some NDAs, though written with CUDA development in mind).

The rest is well worth the read.
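As an aside, the "bit-shifting tricks" the quote mentions are easy to illustrate: a multiply by a constant can be decomposed into shifts and adds on hardware where full 32-bit integer multiplies are slow. A minimal C sketch; the constant 320 is just an example.

```c
/* Sketch of decomposing a constant multiply into shifts and adds,
 * as in the quoted GCN tips. 320 = 256 + 64, so
 * x * 320 == (x << 8) + (x << 6). Constant chosen for illustration. */
#include <assert.h>
#include <stdint.h>

static uint32_t mul_by_320(uint32_t x)
{
    return (x << 8) + (x << 6);
}

int main(void)
{
    for (uint32_t x = 0; x < 100000; ++x)
        assert(mul_by_320(x) == x * 320u);  /* sanity-check the identity */
    return 0;
}
```

Whether this actually wins on a given GPU depends on the architecture, which is exactly the quoted author's complaint: on GCN you can know, on Kepler you couldn't.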
 
Well, there you have it. There's a wall between Nvidia and real game developers, and it's AMD that's in bed with the devs. Well, at least that clears up all the speculation.
 