Here comes DirectX 11

I doubt they have inside info; after the shitstorm over the first Xbox's GPU, I don't think MS really likes them all that much.

I cherish my original Xbox and its soft-modded goodness, and my PS2. Those were the days when I had a shitty-ass computer, lol.
 
Ah, so you have DirectX 10 with the .1 extensions and DirectX 10 with .nvidia extensions? That's pretty worthless. Nvidia supports some extensions, but they would probably need to be supported outside of DX10, since we don't have DX10.nvidia. Nvidia would have to support this themselves through CUDA or something.

You REALLY need DirectX 10.1 support to get these features through DirectX at the required speed. Rendering a single cube map per pass in DX10, versus the multiple, scalable, indexable cube map arrays of DX10.1, makes a lot of difference.
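For context, the sampling side of this in DX10.1 is the TextureCubeArray resource (Shader Model 4.1), which lets a shader select one of many cube maps by index. A minimal HLSL sketch, names hypothetical:

TextureCubeArray gEnvMaps : register(t0); // many cube maps in one resource (SM4.1 only)
SamplerState gLinear : register(s0);

float4 SampleEnvProbe(float3 dir, uint probeIndex)
{
    // The fourth coordinate selects which cube map in the array to sample.
    return gEnvMaps.Sample(gLinear, float4(dir, (float)probeIndex));
}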

nVidia supports the multisample buffer readback in hardware. No performance penalty.
http://arstechnica.com/reviews/hardware/nvidia-geforce-gx2-review.ars/2
nVidia said:
NVIDIA's official position reads as follows: "DirectX 10.1 includes incremental feature additions beyond DirectX 10, some of which GeForce 8/9/200 GPUs already support (multisample readback for example). We considered DirectX 10.1 support during the initial GPU design phase and consulted with key software development partners. Feedback indicated DirectX 10.1 was not important, so we chose to focus on delivering better performance and architectural efficiency."
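For reference, "multisample readback" means a shader can fetch the individual samples of an MSAA surface rather than only a resolved copy, which is what custom AA resolves need. A minimal HLSL sketch of a custom resolve, resource names hypothetical:

Texture2DMS<float4, 4> gSceneMS : register(t0); // 4x MSAA color buffer

float4 CustomResolve(uint2 pixel)
{
    float4 sum = 0;
    [unroll]
    for (int s = 0; s < 4; s++)
        sum += gSceneMS.Load(pixel, s); // read each sample individually
    return sum / 4.0;
}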

So in a way, there is DX10.nVidia.
That is, if you run DX10 on an nVidia card, there are some extra features enabled.
ATi did the same back when they didn't have SM3.0, but they did have geometry instancing. Developers also made use of that, so I don't see why they wouldn't make use of this extension.

That image has everything to do with global illumination; it is an example of it. Global illumination has been possible for many years already, but not fast enough for game usage. The same goes for ray tracing. DirectX 10.1, and most likely also DirectX 11, will give the speed needed.

I'm just saying that this is a completely different kind of rendering than how you'd apply global illumination in DirectX 11.
Trust me, I know.
I've implemented Henrik Wann Jensen's photon mapping during my master's:
http://scali.scali.eu.org/caustic/demo.html
 
nVidia supports the multisample buffer readback in hardware. No performance penalty.
http://arstechnica.com/reviews/hardware/nvidia-geforce-gx2-review.ars/2


So in a way, there is DX10.nVidia.
That is, if you run DX10 on an nVidia card, there are some extra features enabled.
ATi did the same back when they didn't have SM3.0, but they did have geometry instancing. Developers also made use of that, so I don't see why they wouldn't make use of this extension.

If Nvidia's multisample buffer readback is to be utilized, developers have to implement a query against the Nvidia drivers for games to gain access to it. We saw what happened in Assassin's Creed when AA was applied under DX10.1 and multisample buffer readback was missing: Nvidia cards took a big performance penalty.

I haven't read anything about Nvidia cards supporting multiple indexable cube map arrays, which should give a big performance increase when global illumination is applied.


I'm just saying that this is a completely different kind of rendering than how you'd apply global illumination in DirectX 11.
Trust me, I know.
I've implemented Henrik Wann Jensen's photon mapping during my master's:
http://scali.scali.eu.org/caustic/demo.html

It was not the rendering technique that was the point of this illustration, but rather direct illumination vs. global illumination. If you did your master's using this, then surely you can agree that he illustrates those differences well in that video?

Global illumination is something I really want in games, since it's currently a faster way to get ray-tracing-like results. I know that ATI cards have a 100% path to ray tracing, but if you knew me better, you'd know that I don't care about ATI vs. Nvidia. I care about gaming and eye candy. Which card I buy depends on what I get, and I couldn't care less about red team vs. green team. They both want to make money and are not in it for anything else. I want to get the most out of my money for my needs and the most out of the games I buy (which is why I want DX10.1 and then later DX12).

Cool pictures btw! :D
 
If Nvidia's multisample buffer readback is to be utilized, developers have to implement a query against the Nvidia drivers for games to gain access to it.

Yes, I don't see a problem with that, do you?
There are plenty of examples of this in the past.

We saw what happened in Assassin's Creed when AA was applied under DX10.1 and multisample buffer readback was missing: Nvidia cards took a big performance penalty.

Yes, but now that they support it, this no longer has to be the case.

It was not the rendering technique that was the point of this illustration, but rather direct illumination vs. global illumination. If you did your master's using this, then surely you can agree that he illustrates those differences well in that video?

Yes and no.
Yes, in theory it demonstrates the difference between direct illumination and global illumination.
No, in practice games already use various techniques to approximate global illumination (one common trick is sketched below), so the difference will not be as extreme as demonstrated here.
Secondly, the technique used to generate these images is not the same as what is used by hardware, so the results will not be identical.
So it is by no means a demonstration of how games would look with DirectX 10.1 or 11.
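One ubiquitous example of such an approximation is a hemisphere ambient term that blends sky and ground colors by the surface normal, giving a cheap impression of indirect light. A minimal HLSL sketch, names hypothetical:

float3 HemisphereAmbient(float3 N, float3 skyColor, float3 groundColor)
{
    // Blend between ground and sky ambient based on how far the normal
    // points up: a crude stand-in for indirect light arriving at the surface.
    float t = N.y * 0.5 + 0.5;
    return lerp(groundColor, skyColor, t);
}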

Global illumination is something I really want in games, since it's currently a faster way to get ray-tracing-like results. I know that ATI cards have a 100% path to ray tracing

I think you don't know what you're talking about, to be honest.
Firstly, global illumination is the name of a lighting phenomenon, not of a rendering technique. Raytracing is one possible technique that can perform global illumination, but there are various others.
Secondly, ATi cards don't have a "100% path to raytracing". They have a programming model which allows raytracing to be implemented.
But nVidia's hardware can do this as well. In fact, nVidia has supported raytracing for years, in their Gelato rendering software.

Cool pictures btw! :D

They're not 'pictures', they're a Java applet doing the photon mapping and raytracing in realtime.
 
He seems to be focused on consoles in his interview, and that's why he's so focused on eliminating APIs. For consoles, you have one general hardware configuration that tons of people own. With PCs, you need an API that can talk to almost any hardware without being specific to any brand or model. I feel that he's become way too focused on MS's and Sony's possible next-gen consoles and no longer cares how PCs are configured. To me, that shows he's become a lazy developer. It must be an Epic thing... one dev wants to do away with graphics APIs, and Cliff B wants to go with super-simplistic game controllers. Over-opinionated shenanigans, or truth molded from experience? I really don't know... but I disagree with both devs on a lot of what they say.

QFT,
I was looking forward to UT3 so much, and when I got it I was so disappointed at how many rough edges and quality issues there were. It's really unfortunate, because the core gameplay is there.

I compared it to Gears on PC, and how refined that experience was, and I became convinced that Epic no longer wants anything to do with PC gaming.

/sorry for thread hijack.
 
nVidia supports the multisample buffer readback in hardware. No performance penalty.
http://arstechnica.com/reviews/hardware/nvidia-geforce-gx2-review.ars/2


So in a way, there is DX10.nVidia.
That is, if you run DX10 on an nVidia card, there are some extra features enabled.
ATi did the same back when they didn't have SM3.0, but they did have geometry instancing. Developers also made use of that, so I don't see why they wouldn't make use of this extension.



I'm just saying that this is a completely different kind of rendering than how you'd apply global illumination in DirectX 11.
Trust me, I know.
I've implemented Henrik Wann Jensen's photon mapping during my master's:
http://scali.scali.eu.org/caustic/demo.html

They did have geometry instancing; it was a back door that ATi implemented in the SM2.0b profile, but it never really took off, as with most hardware vendor extensions: developers use the DirectX API and that's it. It doesn't expose cap bits, and the developer needs knowledge of this type of vendor extension and driver query to support it. That didn't stop or delay SM3.0 adoption, so why do we have to stop DX10.1 adoption? Because nVidia thinks it's unnecessary? Microsoft wouldn't have bothered to create it if it wasn't needed or useful in the first place. Even if nVidia currently supports multisample buffer readback, it still doesn't support the DX10.1 method of global illumination, the higher register count, etc.
 
I don't understand all the hype about DirectX. I remember DX9 taking a fraction of a second to install on my system, and DX10 taking not much longer. I don't know why DirectX has so much influence over the GPU market when DirectX isn't a very big app (size-wise).
 
I don't understand all the hype about DirectX. I remember DX9 taking a fraction of a second to install on my system, and DX10 taking not much longer. I don't know why DirectX has so much influence over the GPU market when DirectX isn't a very big app (size-wise).

Clearly, you have no idea what DirectX is or does -- Wikipedia/Google is your friend. Come back when you realize how uninformed your post sounds.
 
hahaha @ above

I just wish that DX would be replaced by something more universal rather than being tied to MS OSes, so that I wouldn't have to purchase Windows stuff anymore and could just stick with Linux. But that likely won't happen, would it?
 
hahaha @ above

I just wish that DX would be replaced by something more universal rather than being tied to MS OSes, so that I wouldn't have to purchase Windows stuff anymore and could just stick with Linux. But that likely won't happen, would it?

*cough, OpenGL, cough* ;)
 
What I like about OpenGL is that it's very easy to implement hardware vendor extensions, and it works on a lot of different platforms: Mac, Windows, Linux, even cellphones!! OpenGL needs a boost, like the boost that openSUSE and Ubuntu gave Linux.
 
AndréRocha said:
I was thinking of getting a new card. What do you guys think? Now or wait? I can wait.
Wait for a DX11 card? I hope you are joking...
 
They did have geometry instancing; it was a back door that ATi implemented in the SM2.0b profile, but it never really took off, as with most hardware vendor extensions: developers use the DirectX API and that's it. It doesn't expose cap bits, and the developer needs knowledge of this type of vendor extension and driver query to support it. That didn't stop or delay SM3.0 adoption, so why do we have to stop DX10.1 adoption? Because nVidia thinks it's unnecessary? Microsoft wouldn't have bothered to create it if it wasn't needed or useful in the first place. Even if nVidia currently supports multisample buffer readback, it still doesn't support the DX10.1 method of global illumination, the higher register count, etc.

I know that at least Far Cry used geometry instancing on ATi cards:
http://www.xbitlabs.com/articles/video/display/farcry20b_3.html

And there is no DX10.1 method of Global Illumination. ATi just implemented a demo with Global Illumination for marketing reasons. It's not like it's a specific feature or anything.
 
What I like about OpenGL is that it's very easy to implement hardware vendor extensions, and it works on a lot of different platforms: Mac, Windows, Linux, even cellphones!!

This is exactly why developers prefer DirectX over OpenGL.
They don't have to worry about different platforms, extensions or anything. DirectX just works.
 
Indeed,
"directx just works (slower ?)"

First OpenGL was faster, and then with the newer versions (5, 7, ...) DirectX was faster,
but now, with the new shaders, I am not so sure anymore whether DirectX is still faster.
An example is maybe Crysis; I wonder what the frame rate would have been with an OpenGL implementation.

Also, with OpenGL you can do more optimization (probably even more with the upcoming version 3 of OpenGL).

I am glad to read (see?) that the id Tech 5 engine still uses OpenGL.
 
These days the speed of the API isn't very relevant, as the hardware is very powerful and 'autonomous'.
You just upload your code and let the GPU execute it. Therefore the speed depends almost entirely on the GPU used (and the quality of the code/driver), not on the API that was used to put the code there.

And I don't see how OpenGL will allow you to do more optimization. Since DirectX 7, the hardware is designed to Direct3D specifications, not the other way around.
In the last few years, 'new' OpenGL functionality was mostly cloned from Direct3D. There is no real active development going on with OpenGL anymore, it's just following whatever Direct3D does.
 
What's with the people complaining that they're releasing a new DX already? Do you get pissed off when Intel releases a faster processor too? Too bad we can't all still be using Windows 98. Why can't computers be more stagnant? Fuck upgrades.

I'm not really complaining just about DirectX, it's technology in general... I just spent $5000 in 2 months on technology. For example, right now my M-Audio recording studio gear will not function on my Windows Vista 64 machine. Gotta wait for the goddamn drivers, and it's taking too long. Now that Windows 7 is coming out... FUCK. Also, I just purchased a 4870, and now that DirectX 11 is releasing I gotta buy another card. No one puts a fucking gun to my head and tells me to buy shit... I'm a technology-happy motherfucker and I buy shit with no problem... it ain't the money. It's the little shit here and there that gets in the way that's annoying. I'm not materialistic; I actually use all the shit I buy, I just don't let it sit there. Even PC game developers complain about all this hardware/software technology moving too fast. I also hear console gamers complain about having to upgrade video cards.
 
You could have checked it out first... E-mu supports Vista.

Screw that... I don't want to have to check for shit. I spend my money on shit, I expect it to be updated and working within a reasonable time. I call M-Audio up and they cry about having to make drivers for Vista 32-bit, Vista SP1, and Vista 64-bit. They have full support for Vista 32-bit on most of their products now. Half of the shit doesn't work on SP1, and they barely have 64-bit in beta. The hardware I purchased from them is a year old.

I have no problem with DirectX 11... but goddamn, give me 50 games and a little more than a year to enjoy my video card.
 
Screw that... I don't want to have to check for shit. I spend my money on shit, I expect it to be updated and working within a reasonable time. I call M-Audio up and they cry about having to make drivers for Vista 32-bit, Vista SP1, and Vista 64-bit. They have full support for Vista 32-bit on most of their products now. Half of the shit doesn't work on SP1, and they barely have 64-bit in beta. The hardware I purchased from them is a year old.

Yea I know, I had the same problem with XP x64 back in the day... Some manufacturers just take too long developing decent drivers for new OSes, or just ignore them altogether.

Technically there should be no difference between pre-SP1 and post-SP1 Vista. If it no longer works in SP1, then chances are there were already issues in the pre-SP1 driver... they were just getting away with it.
 
I know that at least Far Cry used geometry instancing on ATi cards:
http://www.xbitlabs.com/articles/video/display/farcry20b_3.html

And there is no DX10.1 method of Global Illumination. ATi just implemented a demo with Global Illumination for marketing reasons. It's not like it's a specific feature or anything.

Your nVidia bias is blinding you. Why don't you do a little research on the net about DX10.1 and see what global illumination means?

Global illumination is a rendering technique that combines the benefits of light/shadow mapping with indirect lighting and support for practically unlimited dynamic light sources, as well as realistic reflections and soft shadows. With DirectX 10.1, developers can use cube map arrays and geometry shaders to implement global illumination efficiently in real time, even with thousands of physically modeled objects in a complex, interactive scene.

Why is it now possible to implement it with DX10.1?

Cube Map Arrays - Allow reading and writing of multiple cube maps in a single rendering pass.

Benefits: Efficient global illumination in real time for complex, dynamic, interactive scenes. Enables many ray-trace-quality effects, including indirect lighting, color bleeding, soft shadows, refraction, and high-quality glossy reflections.

That's something that couldn't be done in real time before. And developers who did want to implement a sort of global illumination had to fake it or simulate it, which is not the same.
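To make the mechanism concrete: a geometry shader can route each triangle to any face of any cube map bound as a render target array via SV_RenderTargetArrayIndex, so a whole set of probes can be updated in one pass. A minimal HLSL sketch; the per-face matrices and cube count are assumptions:

#define NUM_CUBES 4

cbuffer PerScene
{
    float4x4 gFaceViewProj[6 * NUM_CUBES]; // hypothetical view-proj matrix per cube face
};

struct GSOut
{
    float4 pos   : SV_Position;
    uint   slice : SV_RenderTargetArrayIndex; // selects cube face in the array
};

[maxvertexcount(72)] // 3 verts * 6 faces * NUM_CUBES
void RenderToCubesGS(triangle float4 verts[3] : SV_Position, // world-space positions from the VS
                     inout TriangleStream<GSOut> stream)
{
    for (uint cube = 0; cube < NUM_CUBES; cube++)
        for (uint face = 0; face < 6; face++)
        {
            uint slice = cube * 6 + face;
            for (uint v = 0; v < 3; v++)
            {
                GSOut o;
                o.pos = mul(verts[v], gFaceViewProj[slice]);
                o.slice = slice;
                stream.Append(o);
            }
            stream.RestartStrip();
        }
}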
 
Your nVidia bias is blinding you. Why don't you do a little research on the net about DX10.1 and see what global illumination means?

I've already shown that I've implemented global illumination years before anything such as DX10 existed, yet you want to say I don't know what it is?

Global illumination is a rendering technique that combines the benefits of light/shadow mapping with indirect lighting and support for practically unlimited dynamic light sources, as well as realistic reflections and soft shadows. With DirectX 10.1, developers can use cube map arrays and geometry shaders to implement global illumination efficiently in real time, even with thousands of physically modeled objects in a complex, interactive scene.

Why is it now possible to implement it with DX10.1?

Cube Map Arrays - Allow reading and writing of multiple cube maps in a single rendering pass.

Benefits: Efficient global illumination in real time for complex, dynamic, interactive scenes. Enables many ray-trace-quality effects, including indirect lighting, color bleeding, soft shadows, refraction, and high-quality glossy reflections.

That's something that couldn't be done in real time before. And developers who did want to implement a sort of global illumination had to fake it or simulate it, which is not the same.

Excuse me, but you are just reiterating the ATi marketing material. Don't accuse people of being biased or not knowing what they're talking about when this is the best you can do (marketing material is always biased).
You're talking to an actual developer here, with hands-on experience. So show me your implementation of global illumination if you want to convince me. I've already shown mine.
Then we can talk about implementation details and then we'll see what is and isn't possible with or without DX10.1.
Until then I strongly suggest you refrain from calling someone biased or claiming he doesn't know what he's talking about.
 
I've already shown that I've implemented global illumination years before anything such as DX10 existed, yet you want to say I don't know what it is?



Excuse me, but you are just reiterating the ATi marketing material. Don't accuse people of being biased or not knowing what they're talking about when this is the best you can do.
You're talking to an actual developer here, with hands-on experience. So show me your implementation of global illumination if you want to convince me. I've already shown mine.
Then we can talk about implementation details and then we'll see what is and isn't possible with or without DX10.1.
Until then I strongly suggest you refrain from calling someone biased or claiming he doesn't know what he's talking about.

I never said that it is not possible to do real global illumination under DX10 or lower; it's just that it isn't possible to do it in real time in a game, unless deferred rendering is used (which would run slowly anyway). A developer? lol, I'm not a developer, but I find it hard to believe someone who calls himself a developer with such childish bragging. That page of yours just shows simple light rendering using some kind of program like 3dsmax or Maya, which uses OpenGL and has some global illumination implementations that have nothing to do with the global illumination implementation in DX10.1. Just go and eat some Chef Boyardee spaghetti.
 
I never said that it is not possible to do real global illumination under DX10 or lower; it's just that it isn't possible to do it in real time in a game, unless deferred rendering is used (which would run slowly anyway). A developer? lol, I'm not a developer, but I find it hard to believe someone who calls himself a developer with such childish bragging. That page of yours just shows simple light rendering using some kind of program like 3dsmax or Maya, which uses OpenGL and has some global illumination implementations that have nothing to do with the global illumination implementation in DX10.1. Just go and eat some Chef Boyardee spaghetti.

That page of mine is a Java applet. If you bothered to check the HTML code you could see that:
<applet code="RayTracerApplet.class" width=800 height=600>
<param name="antialias" value="1">
</applet>

It has nothing to do with 3dsmax, Maya, OpenGL or anything. It's just some procedurally generated scene rendered in realtime with 2 different camera positions, and with/without textures to emphasize the lighting.
 
That page of mine is a Java applet. If you bothered to check the HTML code you could see that:
<applet code="RayTracerApplet.class" width=800 height=600>
<param name="antialias" value="1">
</applet>

It has nothing to do with 3dsmax, Maya, OpenGL or anything. It's just some procedurally generated scene rendered in realtime with 2 different camera positions, and with/without textures to emphasize the lighting.

I knew it was Java, but I didn't know it was ray tracing. Rasterization and ray tracing are completely different, of course; rasterization has a hard time creating convincing light effects etc., ray tracing doesn't. For me, ray tracing is the future, so it would be great if there were a way to combine ray tracing and rasterization, like some sort of hybrid rendering.
 
I knew it was Java, but I didn't know it was ray tracing.

I already said it implemented Henrik Wann Jensen's photon mapping (incidentally, at the time I wrote this applet, no major raytracer supported this yet).
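For the curious, the heart of Jensen's method is the radiance estimate: gather the k nearest photons around the shading point x within a disc of radius r and sum their contributions. In LaTeX:

L_r(x, \omega) \approx \frac{1}{\pi r^2} \sum_{p=1}^{k} f_r(x, \omega_p, \omega) \, \Delta\Phi_p(x, \omega_p)

where f_r is the BRDF and \Delta\Phi_p is the power carried by photon p.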

Rasterization and ray tracing are completely different, of course; rasterization has a hard time creating convincing light effects etc., ray tracing doesn't. For me, ray tracing is the future, so it would be great if there were a way to combine ray tracing and rasterization, like some sort of hybrid rendering.

Nonsense. For the most part, rasterizers and raytracers share the exact same lighting formulas.
In fact, prior to Cars, Pixar movies used virtually no raytracing whatsoever, and were nearly entirely rendered with a rasterizer (a REYES renderer to be exact).
This also goes for all other movies done with Pixar's RenderMan software, such as Terminator 2/3, The Matrix and tons of others.
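To illustrate: the classic diffuse/specular terms are the same whether the shaded point comes from a rasterized fragment or from a ray-surface intersection. A minimal HLSL-style sketch, names hypothetical:

float3 ShadePoint(float3 N, float3 L, float3 V,
                  float3 albedo, float3 lightColor, float shininess)
{
    // Lambert diffuse: identical for rasterizer and raytracer.
    float ndotl = saturate(dot(N, L));
    // Blinn-Phong specular: also identical.
    float3 H = normalize(L + V);
    float spec = pow(saturate(dot(N, H)), shininess);
    return lightColor * (albedo * ndotl + spec);
}
// A rasterizer calls this per pixel; a raytracer calls it per ray hit.
// Only how you find the surface point differs.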

Also, hybrid rendering has existed for years, and is used by many popular 3d modeling/rendering applications, such as 3dsmax, Maya, Lightwave etc.

Raytracing is more hype than future at this moment.
 
they tried that approach with "Games for Windows"

and that was EPIC FAIL

That's just it - they didn't try. We got a couple of crappy 360 ports. Where is the equivalent of Halo, developed exclusively for the PC? I don't mean a port of Halo 3, I mean a real new game, a real new IP that would only work on a PC because it is beyond what consoles can do. Something like Crysis, except, you know... an actual playable, fun game, optimized so that it blows the barn doors off a PC running a top-of-the-line GPU like a 4870 / GTX 2xx.
 
I already said it implemented Henrik Wann Jensen's photon mapping (incidentally, at the time I wrote this applet, no major raytracer supported this yet).



Nonsense. For the most part, rasterizers and raytracers share the exact same lighting formulas.
In fact, prior to Cars, Pixar movies used virtually no raytracing whatsoever, and were nearly entirely rendered with a rasterizer (a REYES renderer to be exact).
This also goes for all other movies done with Pixar's RenderMan software, such as Terminator 2/3, The Matrix and tons of others.

Also, hybrid rendering has existed for years, and is used by many popular 3d modeling/rendering applications, such as 3dsmax, Maya, Lightwave etc.

Raytracing is more hype than future at this moment.

Mmmm, but I was talking about real-time rendering, not offline rendering; games and such usually can't afford the luxury of running at 0.00001 fps. AFAIK, in DirectX games, to store light they have to use lightmaps or cubemaps and that's it; light can't be created or destroyed like it can with ray tracing (or deferred rendering). Since I don't know much about this matter, if I said anything senseless please enlighten me :)
 
Mmmm, but I was talking about real-time rendering, not offline rendering; games and such usually can't afford the luxury of running at 0.00001 fps. AFAIK, in DirectX games, to store light they have to use lightmaps or cubemaps and that's it; light can't be created or destroyed like it can with ray tracing (or deferred rendering). Since I don't know much about this matter, if I said anything senseless please enlighten me :)

Well, even I (being no developer) know well enough that what you're saying isn't true -- it is perfectly possible to create and destroy lights in DirectX applications without altering lightmaps.
 
Mmmm, but I was talking about real-time rendering, not offline rendering; games and such usually can't afford the luxury of running at 0.00001 fps. AFAIK, in DirectX games, to store light they have to use lightmaps or cubemaps and that's it; light can't be created or destroyed like it can with ray tracing (or deferred rendering). Since I don't know much about this matter, if I said anything senseless please enlighten me :)

What do you think a dynamic light is? Of course you can have lights that are created/destroyed/moved (all the same category). What do you think turning on/off a flashlight in a game does? It creates/destroys a light in a game.
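A minimal sketch of how a shader might consume such a dynamic light list (all names hypothetical); the application simply adds or removes entries each frame:

#define MAX_LIGHTS 8

struct Light
{
    float3 position;
    float3 color;
};

cbuffer Lights
{
    uint  gNumLights;          // app-side: changes whenever a light appears or disappears
    Light gLights[MAX_LIGHTS];
};

float3 AccumulateLighting(float3 worldPos, float3 N, float3 albedo)
{
    float3 result = 0;
    for (uint i = 0; i < gNumLights; i++) // only currently "alive" lights contribute
    {
        float3 L = normalize(gLights[i].position - worldPos);
        result += albedo * gLights[i].color * saturate(dot(N, L));
    }
    return result;
}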
 
Mmmm, but I was talking about real-time rendering, not offline rendering; games and such usually can't afford the luxury of running at 0.00001 fps. AFAIK, in DirectX games, to store light they have to use lightmaps or cubemaps and that's it; light can't be created or destroyed like it can with ray tracing (or deferred rendering). Since I don't know much about this matter, if I said anything senseless please enlighten me :)

A photon mapper also stores light in lightmaps, as does the RenderMan offline renderer that Pixar uses.
And as already mentioned, there's no problem in supporting dynamic lights. You can just enable/disable/recalc lightmaps.
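One common pattern, sketched here with hypothetical names: render each active light's contribution into the lightmap with additive blending, and re-run (or skip) a light's pass when it changes:

// Pixel shader run once per active light over the lightmap texels.
// The app binds it with additive blending (ONE + ONE), so toggling a
// light just means re-running (or skipping) its accumulation pass.
float3 gLightPos;
float3 gLightColor;

float4 AccumulateLightPS(float3 texelWorldPos : TEXCOORD0,
                         float3 texelNormal   : TEXCOORD1) : SV_Target
{
    float3 L = gLightPos - texelWorldPos;
    float  atten = 1.0 / (1.0 + dot(L, L));     // simple distance falloff
    float  ndotl = saturate(dot(texelNormal, normalize(L)));
    return float4(gLightColor * ndotl * atten, 1);
}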
 
Some things I always read about DirectX no matter what the version is:

Version X is in the making? But we haven't used version X-1 yet!
We don't even use version X to its full potential, why do we need version X+1?
Version X is all we need, version X+1 has no new features

DirectX is like... a fine wine. It takes years to ferment properly. You can't have a business model where you ferment one season of wine, wait 2-3 years, sell it all, and then restart the process, because then you have a 2-3 year gap before your next batch. You start work on your next batch before the first one is out.

DirectX is the same: it needs to be out for a year or so for developers to get hold of it, learn it, experiment and implement it. By the time they've done that, the hardware to run it at good speeds becomes available and you're ready to go.

It's good that DX11 is in the works now, because by the time it's been fully developed and tested, released to developers, and they've got their heads around it, learnt it, introduced it into game design and actually shipped games with it, DX10 will be in common use and we'll be about ready for a new version.
 
Indeed, the way things work these days is that DirectX first defines the featureset of the next-generation hardware, then that hardware is built. The new DirectX software and hardware then arrive on the market more or less at the same time, and with guaranteed compatibility.
I'm sure that some people still recall the early days of 3d acceleration in PC gaming, where there were various APIs, and even if your card supported an API, it wasn't guaranteed to actually run all the games, because some features were missing or didn't work according to spec.

We've come a long way since then, and now we can at least assume that most games will just work out-of-the-box on any AMD or nVidia card.
 
I'm not sure why people still think in terms of 'effects'.
DirectX is not an 'effect-library'. It's an API/programming model for GPUs.
Things like "DX10 effects" or "DX11 effects" simply don't exist. Effects are whatever you program.
A more elaborate programming model, together with a more powerful GPU simply allows you to implement certain effects more efficiently.

It's more like going from a Pentium 4 to a Core2. Core2 is more powerful but it doesn't have "Core2 effects" or anything. From a programming point-of-view, the Pentium 4 can do everything the Core2 can. The Core2 can just do some things more efficiently, and is faster overall, which enables new possibilities.
 