Larrabee Doing Real-Time Ray Tracing


Agreed on the water bit. They should have spent at least 20 lines of code on it. For crying out loud.

Since they were apparently afraid to give out any details on the underlying hardware, and the demo was this crappy, I don't think that Intel's stockholders will be very impressed either, let alone any potential Larrabee customers.

Larrabee will be either vapourware or a big flop. As for i740, it wasn't even an Intel-designed chip, but a design they bought from elsewhere. X4500 is the best Intel has come up with in a commercial product so far.
 
To all of you saying this is unimpressive, do you have any idea WHAT RTRT (real-time ray tracing) is, or how intense it is?

Oh yes. It is pretty crazy stuff to run in real time. I would often bake it plus fake it for software-rendered scenes, because it's a killer on render times. The fact that it is able to run at 5 fps is impressive for sure, and potentially holds promise for future products.

What matters the most, though, is how the result looks, and hence how the immersion and overall visual gaming experience will turn out. I feel thoroughly underwhelmed by this demo. It's nice to see true reflections, sure, but I'd rather have them improve other aspects of the visual field, like better real-time shaders (which still exist in a very narrow domain compared to software-rendered shaders) and the lovely new DX11 tessellation technology. That, in my mind, will bring the look of real time closer to photorealism (or whatever style you would want to adopt) and make us buy the illusion of being there.

Go Intel, bring us your best and tickle the competition! But even though this stuff has been worked on for years (8 years?), it's way too insignificant in terms of its visual impact for me to get excited about right now.
 
Not impressed. Needs more tech specs and information on the demo.

Also, Ray-Tracing is a buzzword (tm)(c). No sensible developer would solely use RT to render a scene, as it has a lot of limitations, including an inability to do lighting/shadows properly. What one needs is Photon Mapping, which is what happens in Real Life (tm) too. What we call ray tracing is just the inverse of photon mapping, an ugly shortcut with a lot of limitations.

Most modern engines, including my company's, use a simplified version of photon mapping to get realistic lighting (global illumination) and shadows (dynamic, soft shadows). Ray-tracing can't do any of this. Please stop the hype :)

Um, I'm not sure what you're talking about. Ray tracing, when implemented properly, can most definitely do realistic shadows and lighting; that's the whole premise behind the technology and what makes it so desirable. Photon mapping is nothing more than RT, but casting rays from the light source and drawing the image, as opposed to RT, where it's casting the rays from the camera/eyes (and saving valuable resources in the process). If they cannot get real-time ray tracing to a usable performance level, there is no way that real-time photon mapping will become any more than a pipe dream (if you're looking for realistic IQ w/caustics and such), since PM is just an extension of/different approach to RT.

I do agree though, we'll never see it used 100% by itself in real time as there are limitations to the technology, or at least not for a very long time (or some effects will have to be faked). I don't design game engines, but I do use a raytracer almost every day of my life :)
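
For anyone curious, here is a rough Python sketch of the "cast from the camera" vs. "cast from the light" distinction; everything in it (Ray, eye_ray, photon_ray) is made up for illustration, not taken from any real renderer:

    import random

    class Ray:
        def __init__(self, origin, direction):
            self.origin = origin          # (x, y, z)
            self.direction = direction    # (x, y, z), not normalized here

    def eye_ray(cam_pos, px, py, width, height):
        """Backward ('eye') tracing: one ray per pixel, from the camera into the scene."""
        u = (px + 0.5) / width - 0.5      # pixel centre mapped onto an image plane
        v = (py + 0.5) / height - 0.5
        return Ray(cam_pos, (u, v, -1.0)) # camera looks down -Z

    def photon_ray(light_pos):
        """Forward ('photon') tracing: rays start at the light and scatter randomly."""
        d = tuple(random.uniform(-1.0, 1.0) for _ in range(3))
        return Ray(light_pos, d)

    print(eye_ray((0, 0, 0), 320, 240, 640, 480).direction)   # roughly straight ahead
    print(photon_ray((0, 5, 0)).direction)                    # some random direction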
 
I doubt ray tracing will be of importance for another 5 years or so... at least.
 
Whoa, a lot of people are not impressed, even though this is fairly impressive.

"With Intel’s latest quad-socket systems—equipped with a 2.66 GHz Dunnington processor in each socket—we can achieve approximately 20 to 35 fps at a resolution of 1280x720. Nonetheless, this represents a significant improvement over the experiments in 2004 that required 20 machines to render a simpler game more slowly and at a lower resolution. The greatest performance gains result from research efforts around the world that improve efficiency and the new, many-core hardware platforms that use parallelism to accelerate graphics operations."

http://software.intel.com/en-us/articles/quake-wars-gets-ray-traced/?cid=sw:graphics081
 
Real-Time Ray Tracing is nice and all, but I think there are much more 'important' things to do in the realm of making games seem more real.

Destructible environments for one.
 
Um, I'm not sure what you're talking about. Ray tracing, when implemented properly, can most definitely do realistic shadows and lighting; that's the whole premise behind the technology and what makes it so desirable. Photon mapping is nothing more than RT, but casting rays from the light source and drawing the image, as opposed to RT, where it's casting the rays from the camera/eyes (and saving valuable resources in the process). If they cannot get real-time ray tracing to a usable performance level, there is no way that real-time photon mapping will become any more than a pipe dream (if you're looking for realistic IQ w/caustics and such), since PM is just an extension of/different approach to RT.

I do agree though, we'll never see it used 100% by itself in real time as there are limitations to the technology, or at least not for a very long time (or some effects will have to be faked). I don't design game engines, but I do use a raytracer almost every day of my life :)

RT is generally understood to be tracing rays from the camera along the FOV using N rays which are each bounced or refracted at most X times depending on the materials encountered (opaque, reflective, transparent). PM traces each photon emitted by each light source (emitter) until it is absorbed or otherwise can no longer be tracked. A light map is thus created with the ambient light level and every light gradient determined. Some of the photons may end up at the camera.

PM and RT are thus quite different in their approach and will give wildly differing results. Sometimes a hybrid RT/rasterization approach is used to fill in information either method misses (transparency, reflections, shadows, &c).
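
To make the "N rays, at most X bounces" description concrete, here is a tiny runnable Python sketch of the camera-side recursion; the single mirror sphere and flat shading are stand-ins, it handles only reflection (no refraction or real materials), and it is purely illustrative:

    import math

    MAX_BOUNCES = 4                      # the "at most X times" from the post above

    def dot(a, b):   return sum(x * y for x, y in zip(a, b))
    def add(a, b):   return tuple(x + y for x, y in zip(a, b))
    def sub(a, b):   return tuple(x - y for x, y in zip(a, b))
    def scale(a, s): return tuple(x * s for x in a)

    def intersect_sphere(origin, direction):
        """Distance to a unit sphere at the origin along the ray, or None on a miss."""
        a = dot(direction, direction)
        b = 2.0 * dot(origin, direction)
        c = dot(origin, origin) - 1.0
        disc = b * b - 4.0 * a * c
        if disc < 0:
            return None
        t = (-b - math.sqrt(disc)) / (2.0 * a)
        return t if t > 1e-4 else None

    def trace(origin, direction, depth=0):
        t = intersect_sphere(origin, direction)
        if t is None:
            return (0.2, 0.3, 0.8)                 # "sky" colour for rays that escape
        point  = add(origin, scale(direction, t))
        normal = point                             # unit sphere: normal == hit point
        local  = (0.3, 0.3, 0.3)                   # flat stand-in for direct shading
        if depth >= MAX_BOUNCES:
            return local                           # bounce budget spent: stop here
        # mirror bounce: r = d - 2(d.n)n; a real tracer would branch on the material
        refl = sub(direction, scale(normal, 2.0 * dot(direction, normal)))
        return add(scale(local, 0.5), scale(trace(point, refl, depth + 1), 0.5))

    print(trace((0.0, 0.0, 3.0), (0.0, 0.0, -1.0)))  # one primary ray from the camera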
 
Who is impressed? The water looks more like a lava lamp than real water. Those motion picture frames that hang on the wall look better than this. They should focus on making it look better, like maybe some perspective movement or something other than distant helicopter-like pixels on a fixed path. If they only used 10 lines of code, somebody was really lazy. The tech is cool, but I was expecting a bit more than this.
 
They're moving forward, at least. I guess I can see why they wanted to fully raytrace the scene, but I wouldn't be surprised to see a hybrid (RT/rasterization) approach in any "real" games that take advantage of the capability.
 
The problem: Intel has zero track record with GPUs. Zero. Year after year they produce shit IGP solutions. But wait! Next year! It will be AWESOME! Really!

I do believe Intel is the #1 provider of GPUs in the world at the moment. No matter how you spin the dice.
 
I do believe Intel is the #1 provider of GPUs in the world at the moment. No matter how you spin the dice.

Yeah, sure. If you call an IGP a GPU. They are last in the world in AIBs (add-in boards), which is all that matters in the graphics world.
 
If anything, the amount of processing being done is impressive, but that's it. What chance does "pure" RTRT have so long as software rendering is still trying to fake as much as possible?
 
They're moving forward, at least. I guess I can see why they wanted to fully raytrace the scene, but I wouldn't be surprised to see a hybrid (RT/rasterization) approach in any "real" games that take advantage of the capability.

The hybrid approach is actually already being applied. Modern GPUs are more than up to the task of using RTRT to spice up some transparency effects and such. RT, rasterization and PM all come together, in a sense, to compensate for weaknesses in each technology :)
 
All the naysayers are forgetting one thing: Competition is good.

Does the demo impress me? Not really, but overall I do hope Intel brings good competition to the market. It only makes things better for us, the consumers. If Larrabee is mediocre or semi-crappy when first released, who cares? Don't buy it. Let Intel improve on it. Let them push Nvidia and AMD. Let's get more price wars going. :D
 
All the naysayers are forgetting one thing: Competition is good.

Does the demo impress me? Not really, but overall I do hope Intel brings good competition to the market. It only makes things better for us, the consumers. If Larrabee is mediocre or semi-crappy when first released, who cares? Don't buy it. Let Intel improve on it. Let them push Nvidia and AMD. Let's get more price wars going. :D

So far Intel's own contributions to the GPU wars have been the GMA IGPs and one GPU (the i740) they bought from another company. The i740 fell flat due to some issues with how it fetched textures. We all know how appreciated GMA is. It's only natural to assume that Intel won't have some kind of super GPU in the pipeline :)

Of course we'd love more competition in the GPU market, but the harsh reality is that even X4500 doesn't come close to threatening AMD/nVidia IGPs, let alone their discrete GPUs.
 
I think everyone is getting off topic. The point of this demo was not to wow you with visuals, or to show you the merits of ray tracing over other lighting methods. It was illustrating that there has been progress in developing a chip that can do ray tracing more efficiently. Most notably, a single-chip solution, i.e. something we might have in our houses for a gaming rig. Now, this is not saying the machine used in the demo signifies RT is here and we should all get it; no, it's simply showing the progress.

So let me reiterate: Intel is showing us they have made progress. They are showing us that RT might be here in the next 5 years. They are showing us that they are at the forefront of R&D in silicon-based graphics rendering.

The current debate about how good/crappy the demo looked, or which method of rendering is best, IS NOT THE ISSUE.

Sorry to rant, this topic just went off on more tangents than rays in the demo.
 
I agree, but you have to remember this is the FIRST demo'd RTRT silicon (that I know of). 5 fps looking like ass may not be much, but it's SOMETHING, and that's impressive.
 
I think everyone is getting off topic. The point of this demo was not to wow you with visuals, or to show you the merits of ray tracing over other lighting methods. It was illustrating that there has been progress in developing a chip that can do ray tracing more efficiently. Most notably, a single-chip solution, i.e. something we might have in our houses for a gaming rig. Now, this is not saying the machine used in the demo signifies RT is here and we should all get it; no, it's simply showing the progress.

So let me reiterate: Intel is showing us they have made progress. They are showing us that RT might be here in the next 5 years. They are showing us that they are at the forefront of R&D in silicon-based graphics rendering.

The current debate about how good/crappy the demo looked, or which method of rendering is best, IS NOT THE ISSUE.

Sorry to rant, this topic just went off on more tangents than rays in the demo.

This completely. It's not the visuals that are important in this demo. It's the fact that here we have silicon that can do raytracing in real time, a first.
 
The real-time ray tracing thing has been a holy grail of sorts for quite a long time. If this new chip can run Realstorm at 1600x1200 at a solid 30 fps, then it can truly be called impressive. :D
 
I agree, but you have to remember this is the FIRST demo'd RTRT silicon (that I know of). 5 fps looking like ass may not be much, but it's SOMETHING, and that's impressive.

There's actually a company which sells expansion cards with vector processors precisely aimed at solving the RTRT issue and doing it on a far more massive scale than Larrabee just showed. In fact, any modern GPU can do the same thing Larrabee did in this video, as the latter is after all aimed at being released as a budget/mainstream card.
 
We've all seen how successful such products have been though, Elledan; for example, the PhysX card, the Killer network card, etc.

I doubt raytracing will justify a new card, and if it does, ATI and/or Nvidia will most likely simply swoop in and include such hardware on their boards. Of course, we are talking commercial desktops, as was the demo. In no way am I referencing industry cards which will never be sold, or adopted, mainstream. As such, we shouldn't really be talking about such products in this thread.
 
RT is generally understood to be tracing rays from the camera along the FOV using N rays which are each bounced or refracted at most X times depending on the materials encountered (opaque, reflective, transparent). PM traces each photon emitted by each light source (emitter) until it is absorbed or otherwise can no longer be tracked. A light map is thus created with the ambient light level and every light gradient determined. Some of the photons may end up at the camera.

PM and RT are thus quite different in their approach and will give wildly differing results. Sometimes a hybrid RT/rasterization approach is used to fill in information either method misses (transparency, reflections, shadows, &c).

Uh, no. Photon mapping simply adds a first pass before doing ray tracing. It builds a photon map of the scene (for surface radiance from caustics and soft indirect lighting), and then uses *ray tracing* to render it.

So PM and RT are literally the exact same thing when it comes to rendering, with PM just adding a first pass to do Global Illumination. PM is a superset of RT. Direct illumination and specular reflections of an object aren't affected by PM and use standard RT algorithms.

Also, transparency, reflections, and shadows are all things RT excels at and does extremely well.
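
A toy Python sketch of that two-pass structure, for illustration only; the flat floor, the downward-only photons and the crude density estimate are stand-ins, not how a real photon mapper works:

    import math, random

    def emit_photons(light_pos, count=1000):
        """Pass 1: fire photons from the light at a floor plane y = 0, store hit points."""
        photon_map = []
        for _ in range(count):
            dx, dz = random.uniform(-1, 1), random.uniform(-1, 1)
            dy = -1.0                               # photons aimed roughly downwards
            t = -light_pos[1] / dy                  # parameter where the photon meets y = 0
            hit = (light_pos[0] + dx * t, 0.0, light_pos[2] + dz * t)
            photon_map.append(hit)                  # per-photon power omitted for brevity
        return photon_map

    def indirect_light(point, photon_map, radius=0.5):
        """Pass 2 helper: crude density estimate from photons landing near the shaded point."""
        near = sum(1 for p in photon_map if math.dist(p, point) < radius)
        return near / len(photon_map)

    # Pass 2 proper would be the usual eye-ray trace() loop, adding indirect_light()
    # at each diffuse hit on top of the direct-lighting term computed by ray tracing.
    pmap = emit_photons((0.0, 5.0, 0.0))
    print(indirect_light((0.0, 0.0, 0.0), pmap))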
 
Don't know what ray tracing is, but how many fps would Crysis run at with Larrabee?

The problem is Intel hasn't really shown off Larrabee running any standard DirectX or OpenGL games. I suspect that out of the gate Larrabee will be a terrible card for gamers that will be very slow and buggy in games. I really have nothing to base that on, it's just what I expect.
 
Those IGP solutions don't target the high-end market. They're purpose-built to provide the best performance in the most efficient way possible at the lowest price point possible.

Their IGP bits don't even target what they say they target. Intel claims awesome media support in a low power package but in reality you're lucky to get it, and even so their DXVA support is not fully implemented. And then, if you do get it working, better hold on to that driver package, because the next one will likely not work. Those video options in the control panel? Not hooked up. Keep clicking the inverse telecine checkbox, it'll be unchecked the next time you look at it.

I deal with users weekly who try to build media machines using Intel solutions. They look awesome on paper. True Video! Wooooo! Then they come back: "I can't get HDCP to work," "My black levels are expanded and clipped," "My CPU loading is high, I thought this had accelerated decode support," etc. They always end up going to an AMD or NVIDIA low-end card.

So, baseline IMHO: Intel never delivers on their claims w.r.t. graphics. The demo is always better than the real thing. I hope this next iteration is awesome; it sure looks good on paper.
 
The problem is Intel hasn't really shown off Larrabee running any standard DirectX or OpenGL games. I suspect that out of the gate Larrabee will be a terrible card for gamers that will be very slow and buggy in games. I really have nothing to base that on, it's just what I expect.

I concur. Echoing my previous post about video: applications that use Intel's private APIs do video fairly well. Intel's support of public APIs like DXVA is pretty weak and in some cases incomplete.

So this part can be the most bad-ass ray tracer out there, but if it cannot rock the DX and OGL support then it's DOA for gaming. And based on their preview info, it will probably fold like a mofo.
 
Uh, no. Photon mapping simply adds a first pass before doing ray tracing. It builds a photon map of the scene (for surface radiance from caustics and soft indirect lighting), and then uses *ray tracing* to render it.

So PM and RT are literally the exact same thing when it comes to rendering, with PM just adding a first pass to do Global Illumination. PM is a superset of RT. Direct illumination and specular reflections of an object aren't affected by PM and use standard RT algorithms.

Also, transparency, reflections, and shadows are all things RT excels at and does extremely well.

Isn't that basically what I said? :confused:
 
If they can get a chip that can play Batman at max into every new laptop 1 year down the line, then welcome to the new golden age of PC gaming. A massive increase in baseline specs + millions more untapped hardcore-virgins = a lot more reason to start developing something substantial for the PC again.

Until then we will wither and slowly die.
 
Raytracing is cool, but very, very "old hat". Crysis looks much better, and runs much better, without ray tracing.

Remember when the first car came out... couldn't go in reverse, was loud... etc. Blah blah blah. Your argument is the same.
 
There's actually a company which sells expansion cards with vector processors precisely aimed at solving the RTRT issue and doing it on a far more massive scale than Larrabee just showed. In fact, any modern GPU can do the same thing Larrabee did in this video, as the latter is after all aimed at being released as a budget/mainstream card.

I always thought that company was just talk and had nothing to actually show? At least I never saw anything.
 
Isn't that basically what I said? :confused:

No? The photons emitted from light sources never make it to a camera, as the camera doesn't exist at that point. PM builds a lightmap in advance of rendering; it doesn't render anything itself. PM and RT aren't wildly different in their approach like you claimed; they have the same approach, just with PM adding a pre-render first pass.

Remember when the first car came out... couldn't go in reverse, was loud... etc. Blah blah blah. Your argument is the same.

Not really, no. Ray tracing and rasterisation are both equally old techniques. It isn't like ray tracing is just taking off. Remember ray casting (think Wolfenstein 3D)? That was a simpler version of ray tracing (initially they both start with the same algorithm, just the rays stop as soon as they hit something, whereas RT continues tracing to determine shadows, etc.). People have been working on improving and making the two algorithms faster for an equally long time. RT isn't just getting its start or any crap like that.

This isn't anything like car vs. horse. This is Intel reminding everyone why real-time rendering doesn't use ray tracing: it's slow. RT is simply too expensive. Algorithmic simplicity doesn't make up for raw speed. And RT is going to stay too slow for easily 5+ years.
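
To illustrate the "stops at the first hit" vs. "keeps tracing" point, here is a rough 2D Python sketch; the grid map, the single point light and the brute-force ray march are all made up for illustration, nothing like a real Wolfenstein or RT implementation:

    WALL_MAP = [
        "#####",
        "#...#",
        "#.#.#",
        "#...#",
        "#####",
    ]

    def first_hit(x, y, dx, dy, step=0.05, max_t=10.0):
        """March along the ray; return (t, cell) at the first wall, or None."""
        t = 0.0
        while t < max_t:
            cx, cy = int(x + dx * t), int(y + dy * t)
            if WALL_MAP[cy][cx] == "#":
                return t, (cx, cy)
            t += step
        return None

    def cast(x, y, dx, dy):
        """Ray casting (Wolfenstein-style): the first hit is the whole answer;
        its distance just sets how tall to draw the wall slice."""
        hit = first_hit(x, y, dx, dy)
        return 1.0 / hit[0] if hit else 0.0

    def trace(x, y, dx, dy, light=(1.5, 1.5)):
        """Ray tracing keeps going: fire a second (shadow) ray from the hit point
        toward the light to decide whether that point is lit."""
        hit = first_hit(x, y, dx, dy)
        if hit is None:
            return 0.0
        t, _ = hit
        hx, hy = x + dx * (t - 0.01), y + dy * (t - 0.01)  # back off so we don't start inside the wall
        lx, ly = light[0] - hx, light[1] - hy
        dist = (lx * lx + ly * ly) ** 0.5
        in_shadow = first_hit(hx, hy, lx / dist, ly / dist, max_t=dist) is not None
        return 0.1 if in_shadow else 1.0

    print(cast(1.5, 1.5, 1.0, 0.0), trace(1.5, 1.5, 1.0, 0.0))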
 