Farewell to DirectX?

HardOCP News

Bit-Tech has posted an editorial / interview with AMD's developer relations manager of the GPU division called "Farewell to DirectX?" Interesting stuff, definitely worth reading.

Despite what delusional forum chimps might tell you, we all know that the graphics hardware inside today's consoles looks like a meek albino gerbil compared with the healthy tiger you can get in a PC. Compare the GeForce GTX 580's count of 512 stream processors with the weedy 48 units found in the Xbox 360's Xenos GPU, not to mention the ageing GeForce 7-series architecture found inside the PS3.
 
DirectX is the glue that holds PC graphics/gaming together. The last damn thing we need is to go back to the days of each card manufacturer rolling their own (3DFX, etc...). What a nightmare that would be.

It would all but kill the Gaming market on the PC. There's no way in hell gaming companies would put precious resources into writing specific versions for specific chipsets.

Then again, this may be Microsoft's plan to kill PC gaming once and for all to sell more Xboxes.

I'd rather game on my iPad than an XBOX.
 
I think what AMD is hoping here is that, with the merger of CPU and GPU, your PC will basically be a glorified console.
 
So in essence, every software studio will have to develop their own driver for every piece of current and future hardware? Ouch.
 
The last damn thing we need is to go back to the days of each card manufacturer rolling their own (3DFX, etc...). What a nightmare that would be.

In the long term it is needed. As with all technologies, at some point you reach a dead end where it can't be developed any further without getting slower, at least not if it also has to stay backwards compatible, so new technologies are needed: a generation shift now and then.
 
So in essence, every software studio will have to develop their own driver for every piece of current and future hardware? Ouch.

That's how it used to be done... that's never going to happen again.
 
If MS was slick, they would make an add-in board when their next system comes out that would basically be an Xbox on a board.
Then your PC could play Xbox 1080 (or whatever they call it) games without having to do anything else.
THIS would be the best way to do it.
 
Does anybody remember the printer menu on WordStar?
 
Ummm no, sorry AMD. While I do think dedicated graphics chips will die, that is still a long time away. However, even when they do, there'll be something like DirectX. Why? Because writing graphics primitives sucks. Things like DirectX will still exist to provide a common framework to build your apps on. They won't be required, but then, with today's programmable GPUs they aren't strictly required now either. They'll still get used, though.

I mean, when you look at DirectX, it is way more than just 3D, and most of it is just software. DirectSound, XAudio2, all of that is done on the CPU in Vista and 7, but it is still there and current games use it. Why? Because it provides a good basic audio setup. No need to write your own from scratch.
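For instance, getting a basic audio device up through XAudio2 is just a couple of calls. This is only a minimal sketch, assuming the XAudio2 headers from the DirectX/Windows SDK; error handling and the actual source voices are left out.

[CODE]
#include <windows.h>
#include <xaudio2.h>

int main() {
    // XAudio2 is COM-based, so initialize COM first.
    CoInitializeEx(nullptr, COINIT_MULTITHREADED);

    IXAudio2* xaudio = nullptr;
    if (FAILED(XAudio2Create(&xaudio)))          // default flags and processor
        return 1;

    // The mastering voice wraps the default output device and mix format for you.
    IXAudio2MasteringVoice* master = nullptr;
    if (FAILED(xaudio->CreateMasteringVoice(&master))) {
        xaudio->Release();
        return 1;
    }

    // ...create source voices and submit audio buffers here...

    master->DestroyVoice();
    xaudio->Release();
    CoUninitialize();
    return 0;
}
[/CODE]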
 
On consoles, you can draw maybe 10,000 or 20,000 chunks of geometry in a frame, and you can do that at 30-60fps. On a PC, you can't typically draw more than 2-3,000 without getting into trouble with performance, and that's quite surprising - the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.

Err, maybe in DX9, but draw calls in OpenGL were always fairly cheap, and in DX10 they have much less overhead as well.

If NV/ATI ever got around to fixing their fucking drivers, there would be threading support too. Threaded command building is still broken in the DX11 drivers from both parties.
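The batching issue the quote is talking about looks roughly like this in D3D11. A minimal sketch only: buffers, shaders and per-instance data are assumed to be set up elsewhere, and the counts are placeholders.

[CODE]
#include <d3d11.h>

// One draw call per object: CPU/driver cost scales with the object count.
void DrawForestNaive(ID3D11DeviceContext* ctx, UINT numTrees, UINT indexCount) {
    for (UINT i = 0; i < numTrees; ++i) {
        // ...update a per-object constant buffer here...
        ctx->DrawIndexed(indexCount, 0, 0);
    }
}

// One draw call for all objects: per-instance data lives in a second vertex
// buffer, so thousands of trees cost roughly one batch instead of thousands.
void DrawForestInstanced(ID3D11DeviceContext* ctx, UINT numTrees, UINT indexCount) {
    ctx->DrawIndexedInstanced(indexCount, numTrees, 0, 0, 0);
}
[/CODE]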
 
DirectX SAVED the PC game industry. However, he's right in the sense that here we are in a DirectX 11 world and developers are STILL making DirectX 9 games. Hell, there aren't a lot of DirectX 10 games, and THAT technology has been out for years. THIS is not Microsoft's fault... but the fault of LAZY developers.
 
It's funny to see the Crytek quotes; they're undoubtedly capable of squeezing the most out of PC hardware, but that's cancelled out by their new cross-platform vision, so they're actually going backwards rather than forwards.
It's also interesting that the APIs only exist because of the difficulties of developing engines at the "metal" level, which I'd imagine would also be a much more complex proposition today than it was in the glory days of Doom etc.
Ultimately this is only an issue because of chronically ageing console hardware and devs being desperate to wring more performance out of them so that their shiny new games don't look like they're 5 years old.
 
What if developers could agree to have a certain portion of their hardware be standardized?

Every graphics card could have a section with a standard architecture, identical across all cards, that developers could use for the parts of their engines that are normally slowest, like lighting and geometry.

There would be classes of this standardized GPU section. Class 1 would be for HTPC-level hardware, class 2 would be for midrange, class 3 for high range. As new GPUs are unveiled, new classes would be set. So ten years down the road, you don't have to pore over system requirements. On the box it would say "Class 10 GPU required, Class 14 recommended".

AMD and Nvidia could still add their proprietary architecture goodies, but there would be a certain part of each GPU dedicated to the functions that need the most speed.
 
What if developers could agree to have a certain portion of their hardware be standardized?

Every graphics card could have a section with a standard architecture, identical across all cards, that developers could use for the parts of their engines that are normally slowest, like lighting and geometry.

There would be classes of this standardized GPU section. Class 1 would be for HTPC-level hardware, class 2 would be for midrange, class 3 for high range. As new GPUs are unveiled, new classes would be set. So ten years down the road, you don't have to pore over system requirements. On the box it would say "Class 10 GPU required, Class 14 recommended".

AMD and Nvidia could still add their proprietary architecture goodies, but there would be a certain part of each GPU dedicated to the functions that need the most speed.

I believe MS tried something like that, using the Windows Experience Index score as a basis for determining whether your system could play a certain game or not. Seems to have been lost in the shuffle, and we still use the good ol' fashioned system spec format.
 
What I think this gets down to is that Microsoft has been neglecting DirectX performance just like it neglected Internet Explorer performance. There are certainly things that Microsoft can do hand in hand with Nvidia and AMD to make those APIs execute much more quickly. They should also take a look at what developers really need and make new APIs that are more suitable.
 
Nice of them to make a comparison of GPUs from today (e.g., the GTX 580) vs. the older hardware in systems that MS and Sony "plan" to keep alive for ~10 years (e.g., the Xbox's Xenos chip, designed by ATI as a modified version of the GPU from the 3xxx(?) line, and the PS3's GPU, which is from Nvidia and based on the old 7950(?) design). Yeah, we get it... console hardware is holding PC games back (if they're multiplatform), but if it wasn't for the PC, where would console GPUs be? Dragging their dick in the sand, probably.
 
So in essence, every software studio will have to develop their own driver for every piece of current and future hardware? Ouch.
Drivers? No. What he's suggesting is that GPUs become a bit more like CPUs, where you don't need to interface to it through a high-level intermediary (an API) and can instead write direct to metal (or at least very close to it). Direct3D handles setup, states, transforms, drawing, buffering, occlusion culling, rasterizing and so forth, and there's a fair degree of flexibility there in terms of how those things can be implemented, but the extent of flexibility isn't the same as it would be if developers could handle those things through their own low-level implementations.

For people who want to handle themselves what Direct3D currently handles for them, because of the flexibility that provides, this could be beneficial. We could still have common APIs, but there could also be a more 'direct-to-metal' route. If this were a Microsoft-developed or open standard, there'd be basic interoperability between hardware. The problem is we'd eventually start to see more hardware divergence, and that could be a real problem.
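To give a sense of what "handled for you" means in practice, a typical D3D11 frame looks something like the sketch below; it's only an outline, and everything passed in (context, swap chain, views, shaders, buffers) is assumed to have been created at startup.

[CODE]
#include <d3d11.h>

void RenderFrame(ID3D11DeviceContext* ctx, IDXGISwapChain* swap,
                 ID3D11RenderTargetView* rtv, ID3D11DepthStencilView* dsv,
                 ID3D11InputLayout* layout, ID3D11Buffer* vb,
                 ID3D11VertexShader* vs, ID3D11PixelShader* ps,
                 UINT stride, UINT vertexCount) {
    const FLOAT clear[4] = { 0.0f, 0.0f, 0.0f, 1.0f };
    UINT offset = 0;

    ctx->OMSetRenderTargets(1, &rtv, dsv);                    // buffer/target setup
    ctx->ClearRenderTargetView(rtv, clear);
    ctx->IASetInputLayout(layout);                            // state setup
    ctx->IASetVertexBuffers(0, 1, &vb, &stride, &offset);
    ctx->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    ctx->VSSetShader(vs, nullptr, 0);                         // transforms run in the VS
    ctx->PSSetShader(ps, nullptr, 0);
    ctx->Draw(vertexCount, 0);                                // drawing/rasterizing
    swap->Present(1, 0);                                      // buffer swap
}
[/CODE]

Each of those steps is a place where the driver decides how things are actually implemented on the hardware; "direct to metal" would mean the engine making those decisions itself.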

What if developers could agree to have a certain portion of their hardware be standardized?
That would probably stifle innovation. The solution is to not standardize hardware but to standardize a much lower-level means of accessing the hardware, and that's almost purely a software solution.
 
If MS was slick, they would make an add-in board when their next system comes out that would basically be an Xbox on a board.
Then your PC could play Xbox 1080 (or whatever they call it) games without having to do anything else.
THIS would be the best way to do it.

Creative Labs did this with the 3DO. Too bad that never really took off.
 
Solution:
1. A programming language that is standardized (not an API), something like ... OpenCL, but with video capabilities as well.
2. Each GPU manufacturer writes their own compiler and/or libs for the OpenCL-like language.
3. Developers write one set of code, or even multiple different sets of code to take advantage of the different architectures, and just compile it with each compiler. They would also need a simple launch app to detect what brand of video card you have so it would select which executable to run.

4. Everybody except MS is happy... well, maybe MS would be happy as well, since they would no longer need to maintain DX.
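As a rough illustration of the launcher part of step 3, something like the sketch below could pick a per-vendor build by reading the primary adapter's PCI vendor ID through DXGI. The executable names are made up for the example, and error handling is minimal.

[CODE]
#include <windows.h>
#include <shellapi.h>
#include <dxgi.h>

int main() {
    const char* exe = "game_generic.exe";        // hypothetical fallback build

    IDXGIFactory* factory = nullptr;
    if (SUCCEEDED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory))) {
        IDXGIAdapter* adapter = nullptr;
        if (factory->EnumAdapters(0, &adapter) == S_OK) {
            DXGI_ADAPTER_DESC desc;
            adapter->GetDesc(&desc);
            switch (desc.VendorId) {             // PCI vendor IDs
                case 0x10DE: exe = "game_nvidia.exe"; break;
                case 0x1002: exe = "game_amd.exe";    break;
                case 0x8086: exe = "game_intel.exe";  break;
            }
            adapter->Release();
        }
        factory->Release();
    }

    // Hand off to whichever vendor-specific build was selected.
    ShellExecuteA(nullptr, "open", exe, nullptr, nullptr, SW_SHOWNORMAL);
    return 0;
}
[/CODE]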
 
Idiot pad vs. 360. Um, OK... Very trendy, kinda like the hula hoop... Angry Birds over COD any day for me!
 
OpenGL. DX3. Now those were the days. Would the game run? Would you get sound? What about those random artifacts? Yep, I want AMD to take us back to paradise city.
Unreliable games on half-baked code for their half-baked products. The PC could be great, gaming-wise. It should be. But the consoles have the code reliability to themselves. Too bad really, because Battlefield 3 is gonna rock on the PC. Once they get the right drivers and everybody can afford that $450 video card, along with the CPU and RAM upgrade. I wonder if PS3 and Xbox 360 owners have to buy more RAM and such. Or will the game run correctly? I know the consoles receive patches too, but far fewer than the PC. And when do you have to wait for new video card drivers just to fire a game up on a console? Oh, never... So APIs have a wonderful unified structure that lets people play games instead of tweak, hack, cuss and wait to play games. Long live DirectX, or the next best thing!
 
Sure, drop OpenGL and DirectX; while you're at it, let's also drop every programming language and OS and do it all in machine code. :rolleyes::rolleyes:
 
What I got out of that is:

We at ATI can't write drivers worth a damn so we'd like to pass that responsibility on to the game developers
 
This is stupid. The whole reason we have the DirectX API in the first place is because having an abstraction layer is better, overall, than having a developer have to code, "direct-to-metal," for every piece of hardware that exists out there.

Who wants to go back to the days of wondering if your sound hardware, keyboards, mice, and video cards may or may not work with a game, depending upon whether the developer coded for that hardware target? Anyone feel like messing with IRQs and DMAs again? Or creating clean boot disks? Ah, the memories...

DirectX has its share of problems, sure, but they can certainly be addressed, and throwing the baby out with the bathwater by eliminating a useful API is incompetent at best and disingenuous at worst.

This sounds like an even worse AMD version of Nvidia's already intellectually dishonest TWIMTBP program. I'm sure AMD would like nothing better than for developers to start creating games that eschewed DirectX entirely, and coded "direct-to-metal" for a specific class of Radeons. Want to play Crysis 3? Hmm, it's been coded only for the AMD 7xxx GPU. No other video card hardware will run it, I'm afraid. Even your new AMD 6xxx GPU that you bought 3 months prior isn't new enough. Buy another card.
 
DirectX is the glue that holds PC graphics/gaming together. The last damn thing we need is to go back to the days of each card manufacturer rolling their own (3DFX, etc...). What a nightmare that would be.
That's not what developers want. Too many people here are making this assumption. They don't want ANY APIs. No DirectX, no OpenGL, no Glide.

They want how they had it in the DOS days, when there wasn't any API. Programming "direct to metal" is what they call it.
It would all but kill the Gaming market on the PC. There's no way in hell gaming companies would put precious resources into writing specific versions for specific chipsets.
The idea is that all CPUs will have built-in graphics capabilities, like Larrabee. Everything will be general purpose, and capable of doing graphics and normal x86 code.

I'd rather game on my iPad than an XBOX.
That's pretty sad. For you I mean.
 
I forgot to mention (damn, no edit function):

It'd be one thing if this interview had been conducted with a developer who had no ties to either major GPU maker. Hell, John Carmack of id just came out the other day and said that DirectX was pretty good (and hopefully he didn't receive any kickbacks from MS to make him say it). But the fact that it's an AMD quasi-PR person making these statements just hurts the credibility of his argument even more. What's more responsible for the poor state of graphics in PC games today? The existence of the DirectX API, or the fact that most developers code to the lowest common denominator (console XXX), and have little interest in optimizing code for PCs?
 
What I got out of that is:

We at ATI can't write drivers worth a damn so we'd like to pass that responsibility on to the game developers

Yet, AMD got requests from developers to help eliminate the APIs.
 
I think you guys missed one of his main points. I think he is saying that GPU architecture will become standardized sometime soon (GPGPUs), like with x86 or x86_64 on CPUs. Writing machine code/assembly is the same for an Intel chip or an AMD chip; I think he is saying this will soon be the case with GPUs.
 
I think you guys missed one of his main points. I think he is saying that GPU architecture will become standardized sometime soon (GPGPUs), like with x86 or x86_64 on CPUs. Writing machine code/assembly is the same for an Intel chip or an AMD chip; I think he is saying this will soon be the case with GPUs.

LOLerz!

I strongly suspect Nvidia will take issue with AMD's proposed standard :rolleyes:
 
I can understand the desire to revamp DX.
Just like anything else Microsoft, I'm sure it's become far more bloated than it needs to be.

If they truly are talking about completely eliminating APIs, however, I find this issue ironic.

The whole reason for the development of Windows was to remove the need for direct-to-metal programming.

As sly already alluded to with regard to WordStar, the attraction of Windows was pseudo-multitasking and a standardized abstraction layer to eliminate the need for program-specific drivers.

I still remember the nightmare that was WordPerfect's early Windows versions. They insisted on continuing to use their own printer drivers.

Then you have HAL and its successors, to remove the possibility of a program exerting too much control over the hardware.

You take all that away, and you are left with a game developer creating what amounts to a DOS console, with a custom driver for every series of graphics card, sound card, network card, CPU and chipset.

Instead of buying Windows, you'd have to buy what amounts to an entire OS, simply to run a game.

You watch, someone will come up with the idea of developing a gaming-specific OS...
 
Unreliable games on half-baked code for their half-baked products. The PC could be great, gaming-wise. It should be. But the consoles have the code reliability to themselves. Too bad really, because Battlefield 3 is gonna rock on the PC. Once they get the right drivers and everybody can afford that $450 video card, along with the CPU and RAM upgrade. I wonder if PS3 and Xbox 360 owners have to buy more RAM and such. Or will the game run correctly?

That's inherently the problem. We are fairly consistently waiting on hardware upgrades for consoles in order to see the full potential of PC gaming.

The Xbox 360/PS3 are still left with only 256MB of memory for the CPU and 256MB for the GPU. It's somewhat shareable, in that the Xbox 360 uses a unified memory architecture (UMA), and the PS3 has a workaround for using CPU memory as GPU memory, albeit a costly one performance-wise. If I told you I bought a new computer and it had 0.25GB of memory, you'd laugh at me. If I said my video card had 256MB of video RAM, you'd comment that even cheaper, lower-end video cards come with 512MB to 1GB of video memory.

It becomes harder and harder to develop a game that can use a PC's 4GB to 16GB of system RAM and still scale down to a mere 256MB.
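Just as a rough sketch of what that scaling decision looks like on the PC side, a game can query the dedicated VRAM through DXGI and pick an asset tier from it; the thresholds below are made up for illustration.

[CODE]
#include <cstdio>
#include <dxgi.h>

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    if (factory->EnumAdapters(0, &adapter) == S_OK) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        SIZE_T vramMB = desc.DedicatedVideoMemory / (1024 * 1024);

        // Made-up tiers: console-level 256MB, mid-range 512MB, 1GB and up.
        const char* tier = (vramMB >= 1024) ? "high"
                         : (vramMB >= 512)  ? "medium"
                         : "low (console-like)";
        printf("%u MB of VRAM detected -> %s texture set\n",
               (unsigned)vramMB, tier);
        adapter->Release();
    }
    factory->Release();
    return 0;
}
[/CODE]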
 
Ahh yes, let's return to the good old days of direct hardware access. I can't wait to replace hardware because it's not well supported again... Developers coding 16-bit apps immediately dropped it and started coding 32-bit. Just like 64-bit... It only took 3 days from the release of the first 64-bit processor for all developers to completely switch... Everyone pulled 64-bit drivers out of their asses and there was no pain or incompatibility. Hell, nobody codes anything in 32-bit now... This sounds like the best possible option.
 
It makes sense that this should happen. I mean, with CPUs we ended up on the x86 standard so that code could just access the 'metal' without any kind of interpreter, and with the performance of GPUs now so high, and so useful beyond just moving pixels, being able to get straight to them would be a sane step.

But it will need a standard comparable to x86, which would only come about either from one of the GPU makers finally proving dominant (unlikely in the current three-horse race, as they all have their own solid market share and only a small demographic is actually up for grabs through competition), or from MS, Intel, Nvidia and AMD all sitting down and agreeing on a standard architecture for a GPU instruction set, which won't happen because MS doesn't want to be made redundant by a virtualised system whereby games could be played through any hypervisor; they want you needing DirectX for games.
 
Did [H]ardOCP just link to an article daydreaming about Microsoft suddenly giving up a huge part of its home computer gaming monopoly for no advantage to itself? And the readers did more than point and laugh?

Note: OpenCL is a programming language, not an API. The ability to "write to the metal" depends only on the OpenCL compiler and the hardware driver (which will likely fail MS's certification if it allows non-DirectX calls).

The other point is that somehow DirectX is holding software companies back. Even if OpenCL included enough OpenGL to handle graphics, the limiting factor has to be cost. Check the names in the credits sometime and figure out the cost to pay them all. I'm one of Microsoft's biggest critics on the [H]ardforums, but I don't see how you are going to save money by ditching DirectX. I suspect it's possible that a 15+ year old API is crufty enough to be more expensive than supporting two separate graphics card vendors (and don't forget the difference between VLIW4 and VLIW5 at AMD) plus a wannabe (it's only been 10 years; by the time DirectX dies, maybe Intel will have decent 3D performance), but it would be a tough sell.
 