I doubt they have inside info, after the shitstorm of the Xbox1 GPU I don't think MS really likes them all that much.
I cherish my original Xbox and its soft-moddedness, and my PS2. Those were the days when I had a shitty-ass computer lol.
Ah, so you have DirectX10 with the .1 extensions and DirectX10 with .nvidia extensions? That's pretty worthless. Nvidia supports some extensions, but they would probably need to be supported outside of DX10, since we don't have DX10.nvidia. Nvidia would have to support this themselves through CUDA or something.
You REALLY need DirectX 10.1 support to have these features through DirectX at the speed required. The single cube map rendering per pass in DX10 versus the multiple, scalable and indexable cube map rendering in DX10.1 makes a lot of difference.
nVidia said: NVIDIA's official position reads as follows: "DirectX 10.1 includes incremental feature additions beyond DirectX 10, some of which GeForce 8/9/200 GPUs already support (multisample readback, for example). We considered DirectX 10.1 support during the initial GPU design phase and consulted with key software development partners. Feedback indicated DirectX 10.1 was not important, so we chose to focus on delivering better performance and architectural efficiency."
That image has everything to do with global illumination; it is an example of it. Global illumination has been possible for many years already, but not fast enough for game usage. Same goes for ray tracing. DirectX 10.1, and most likely also DirectX 11, will give the speed needed.
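To make the direct vs. global illumination distinction concrete, here is a minimal CPU sketch (plain Python; all function names and numbers are illustrative assumptions, not a real renderer): the direct term only counts light arriving straight from the source, while a one-bounce global term also gathers light reflected off a nearby wall.

```python
# Minimal sketch: direct vs. one-bounce global illumination on two patches.
# All numbers and names are illustrative, not a real renderer.

def lambert(light_intensity, cos_angle):
    """Direct term: light arriving straight from the source."""
    return light_intensity * max(cos_angle, 0.0)

def one_bounce(light_intensity, cos_at_wall, wall_albedo, cos_at_floor, form_factor):
    """Indirect term: light that first hits a wall, then reaches the floor."""
    reflected = lambert(light_intensity, cos_at_wall) * wall_albedo
    return reflected * form_factor * max(cos_at_floor, 0.0)

light = 1.0
direct = lambert(light, cos_angle=0.0)            # floor faces away from the light
indirect = one_bounce(light, cos_at_wall=1.0,
                      wall_albedo=0.5,            # wall reflects half the light
                      cos_at_floor=0.8,
                      form_factor=0.25)           # how much of the wall the floor "sees"

print(direct)             # 0.0 -> pure direct lighting leaves the floor black
print(direct + indirect)  # 0.1 -> the global term adds the bounced light
```

The point of the sketch is only that the indirect term exists at all; making it fast enough for games is the hard part the posts above are arguing about.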
nVidia supports the multisample buffer readback in hardware. No performance penalty.
http://arstechnica.com/reviews/hardware/nvidia-geforce-gx2-review.ars/2
So in a way, there is DX10.nVidia.
That is, if you run DX10 on an nVidia card, there are some extra features enabled.
ATi did the same back when they didn't have SM3.0, but they did have geometry instancing. Developers also made use of that, so I don't see why they wouldn't make use of this extension.
I'm just saying that this is a completely different kind of rendering than how you'd apply global illumination in DirectX 11.
Trust me, I know.
I've implemented Henrik Wann Jensen's photonmapping during my masters:
http://scali.scali.eu.org/caustic/demo.html
If Nvidia's multisample buffer readback is to be utilized, developers have to implement a query against the Nvidia drivers in order for games to gain access to it.
We saw what happened in Assassin's Creed with the lack of multisample buffer readback when AA was applied in DX10.1: Nvidia cards took a big performance penalty.
It was not the rendering techniques that were the point of this illustration, but rather direct illumination vs. global illumination. If you did your master's using this, then you can agree that he illustrates those differences well in that video?
Global illumination is something I really want in games, since it's currently a faster way to approximate ray-traced results. I know that ATI cards have a 100% path to ray tracing.
Cool pictures btw!
He seems to be focused on consoles in his interview, and that's why he's so focused on eliminating APIs. For consoles you have one general hardware configuration that tons of people have due to owning a specific console. With PCs, you need an API that can talk to almost any hardware without worrying about being specific to any brand or model. I feel that he's become way too focused on MS's and Sony's possible next-gen consoles and no longer has a care for how PCs are configured. To me, that shows he's become a lazy developer. It must be an Epic thing... one dev wants to do away with graphics APIs and Cliff B wants to go with super-simplistic game controllers. Over-opinionated shenanigans or truth molded from experience? I really don't know... but I disagree with both devs on a lot of what they say.
I don't understand all the hype about DirectX. I remember DX9 taking a fraction of a second to install on my system, and DX10 taking not much longer. I don't know why DirectX has so much influence over the GPU market when it isn't a very big app (size-wise).
hahaha @ above
I just wish DX would be replaced by something more universal rather than being tied to MS OSes, so that I wouldn't have to purchase Windows stuff anymore and could just stick with Linux. But that likely won't happen, would it?
AndréRocha said: i was thinking on getting a new card. what do you guys think? now or wait, i can wait.
Wait for a DX11 card? I hope you are joking..
They did have Geometry Instancing; it was a back door that ATi implemented in the SM2.0b profile, but it never really took off, as with most hardware vendor extensions: developers use the DirectX API and that's it. It doesn't expose cap bits, and the developer needs knowledge of this type of vendor extension and the driver query to support it. That didn't stop or delay SM3.0 adoption, so why do we have to stop DX10.1 adoption? Because nVidia thinks it's unnecessary? Then Microsoft wouldn't have bothered to create it if it wasn't needed or useful in the first place. Even if nVidia currently supports the multisample buffer readback, it still doesn't support the DX10.1 method of global illumination, the higher register count, etc.
What I like about OpenGL is that it is very easy to implement hardware vendor extensions, and it works on a lot of different platforms like Mac, Windows, Linux, even cellphones!!
What's with the people complaining that they're releasing a new dx already? Do you get pissed off when intel releases a faster processor too? Too bad we can't all be using windows 98 still. Why can't computers be more stagnant? Fuck upgrades.
You could have checked it out first... E-mu supports Vista.
Screw that.. I don't want to have to check for shit. I spend my money on shit, I expect it to be updated and working within a reasonable time. I call M-Audio up and they cry about having to make drivers for Vista 32-bit, Vista SP1, and Vista 64-bit. They have full support for Vista 32-bit on most of their products now. Half of the shit doesn't work on SP1 and they barely have 64-bit in beta. The hardware I purchased from them is a year old.
I know that at least Far Cry used the Geometry Instancing on ATi cards:
http://www.xbitlabs.com/articles/video/display/farcry20b_3.html
And there is no DX10.1 method of Global Illumination. ATi just implemented a demo with Global Illumination for marketing reasons. It's not like it's a specific feature or anything.
Your nVidia bias is blinding you. Why don't you do a little research on the net about DX10.1 and see what global illumination means?
Global illumination is a rendering technique that combines the benefits of light/shadow mapping with indirect lighting and support for practically unlimited dynamic light sources, as well as realistic reflections and soft shadows. With DirectX 10.1, developers can use cube map arrays and geometry shaders to implement global illumination efficiently in real time, even with thousands of physically modeled objects in a complex, interactive scene.
Why is it now possible to implement it with DX10.1?
Cube Map Arrays - Allow reading and writing of multiple cube maps in a single rendering pass.
Benefits: Efficient global illumination in real time for complex, dynamic, interactive scenes. Enables many ray-trace-quality effects including indirect lighting, color bleeding, soft shadows, refraction, and high-quality glossy reflections.
That's something that couldn't be done in real time before. Developers who did want to implement a sort of global illumination had to fake or simulate it, and that is not the same.
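As a rough illustration of the scalability argument (plain Python, just a toy cost model; nothing here is a real API): under DX10, only one cube map can be the render target of a pass, so updating N cube maps means traversing the scene N times, while a DX10.1 cube map array is indexable from the geometry shader and can be filled in a single pass.

```python
# Toy cost model, not real rendering: count scene traversals needed to
# update N cube maps (e.g. one per point light) each frame.

NUM_CUBEMAPS = 16  # e.g. 16 dynamic lights, each with its own cube map

def passes_dx10(num_cubemaps):
    # DX10: one cube map bound per pass, so the scene geometry is
    # traversed once per cube map.
    return num_cubemaps

def passes_dx10_1(num_cubemaps):
    # DX10.1: cube map arrays are indexable, so a single pass can emit
    # geometry into every face of every cube map in the array.
    return 1

print(passes_dx10(NUM_CUBEMAPS))    # 16 scene traversals per frame
print(passes_dx10_1(NUM_CUBEMAPS))  # 1 scene traversal per frame
```

The gap grows linearly with the number of lights, which is why the feature matters for the "thousands of objects, many lights" scenes the marketing text describes.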
I've already shown that I've implemented global illumination years before anything such as DX10 existed, yet you want to say I don't know what it is?
Excuse me, but you are just reiterating the ATi marketing material. Don't accuse people of being biased or not knowing what they're talking about when this is the best you can do.
You're talking to an actual developer here, with hands-on experience. So show me your implementation of global illumination if you want to convince me. I've already shown mine.
Then we can talk about implementation details and then we'll see what is and isn't possible with or without DX10.1.
Until then I strongly suggest you refrain from calling someone biased or claiming he doesn't know what he's talking about.
I never said that it's not possible to do real global illumination under DX10 or lower, just that it isn't possible to do it in real time in a game, unless deferred rendering is used (which would run slowly anyway). A developer? lol, I'm not a developer, but I find it hard to believe someone who calls himself a developer with such childish bragging. That page of yours just shows simple light rendering using some kind of program like 3dsmax or Maya, which use OpenGL and have some global illumination implementations, which have nothing to do with the global illumination implementation in DX10.1. Just go and eat some Chef Boyardee spaghetti.
That page of mine is a Java applet. If you bothered to check the HTML code you could see that:
<applet code="RayTracerApplet.class" width=800 height=600>
<param name="antialias" value="1">
</applet>
It has nothing to do with 3dsmax, Maya, OpenGL or anything. It's just some procedurally generated scene rendered in realtime with 2 different camera positions, and with/without textures to emphasize the lighting.
I knew it was Java, but I didn't know it was ray tracing.
Rasterization and ray tracing are completely different. Of course, rasterization has a hard time creating convincing light effects etc.; ray tracing doesn't. For me, ray tracing is the future, so it would be great if there were a way to combine ray tracing and rasterization, like some sort of hybrid rendering.
They tried that approach with "Games for Windows"
and that was EPIC FAIL
I already said it implements Henrik Wann Jensen's photon mapping (incidentally, at the time I wrote this applet, no major raytracer supported this yet).
Nonsense. For the most part, rasterizers and raytracers share the exact same lighting formulas.
In fact, prior to Cars, Pixar movies used virtually no raytracing whatsoever, and were nearly entirely rendered with a rasterizer (a REYES renderer to be exact).
This also goes for all other movies done with Pixar's RenderMan software, such as Terminator 2/3, The Matrix and tons of others.
Also, hybrid rendering has existed for years, and is used by many popular 3d modeling/rendering applications, such as 3dsmax, Maya, Lightwave etc.
Raytracing is more hype than future at this moment.
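The point above about shared lighting formulas can be sketched like this (plain Python, purely illustrative; the names and the scene are made up): the Lambert term doesn't care whether the surface normal came from a rasterizer's interpolated vertex attributes or from a raytracer's ray-sphere hit.

```python
# Sketch: the same Lambert lighting formula applied to the same surface
# point, once as a rasterizer would (interpolated vertex attribute) and
# once as a raytracer would (normal derived from a ray-sphere hit).
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert(normal, to_light, albedo):
    # The shading formula itself knows nothing about how 'normal'
    # was produced -- that is the whole point.
    return albedo * max(dot(normal, to_light), 0.0)

light_dir = normalize((0.0, 1.0, 1.0))
albedo = 0.8

# Rasterizer path: the normal arrives as an interpolated vertex attribute.
raster_normal = normalize((0.0, 0.0, 1.0))
raster_color = lambert(raster_normal, light_dir, albedo)

# Raytracer path: the normal comes from a ray hitting a unit sphere at
# the origin; for a sphere, the normal is just the normalized hit point.
hit_point = (0.0, 0.0, 1.0)  # a ray from +z hits the sphere's front pole
trace_normal = normalize(hit_point)
trace_color = lambert(trace_normal, light_dir, albedo)

print(raster_color == trace_color)  # True: identical formula, identical result
```

Where the two approaches really differ is visibility (what the shading point can "see"), which is why effects like reflections and refraction are where raytracing earns its keep.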
Mmmm, but I was talking about real-time rendering, not offline rendering; games and stuff usually can't afford the luxury of running at 0.00001 fps. AFAIK, in DirectX games, to store light they have to use lightmaps or cubemaps and that's it; lights can't be created or destroyed like is possible with ray tracing (or deferred rendering). Since I don't know much about this matter, if I said anything senseless please enlighten me!
Mmmm, but if that's true, why do we need deferred rendering?
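To sketch the usual answer (plain Python; the structure and numbers are purely illustrative assumptions, not any real engine's API): deferred rendering traverses the geometry once to fill a G-buffer of per-pixel surface attributes, then accumulates lights per pixel, so lights can be added or removed every frame without re-rendering the scene.

```python
# Illustrative sketch of deferred shading. Pass 1 writes surface
# attributes to a G-buffer; pass 2 accumulates lighting per pixel, so
# geometry is traversed once no matter how many dynamic lights exist.

# Pass 1: "geometry pass" -- one G-buffer entry per pixel, storing the
# attributes the lighting pass will need (integers for clarity).
gbuffer = [
    {"albedo": 2, "n_dot_l": {"key": 1, "fill": 1}},  # pixel lit by both lights
    {"albedo": 3, "n_dot_l": {"key": 0, "fill": 1}},  # pixel facing only the fill light
]

# Dynamic lights: these can change every frame without touching pass 1.
lights = {"key": 5, "fill": 2}

# Pass 2: "lighting pass" -- loop over stored pixels, accumulate every light.
def shade(pixel, lights):
    return sum(
        pixel["albedo"] * intensity * pixel["n_dot_l"][name]
        for name, intensity in lights.items()
    )

frame = [shade(p, lights) for p in gbuffer]
print(frame)  # [14, 6]
```

In a forward renderer, each extra light would mean shading (or re-drawing) the geometry again; here it's just one more entry in `lights`.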