Raytracing on Video Games

Enduring_Warrior said:
Maybe the Geforce X900 or even the GTX Geforce 9800/9900 would run it...
Keep in mind that video cards are specifically designed for traditional rasterisation. I don't know how similar the calculations are to those needed for ray tracing, but I'm guessing that if a video card could do it efficiently then that guy would probably be using one instead of a CPU (especially considering how much better suited GPUs are to parallel computation).
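As an aside (my own illustration, not from the thread): the core per-pixel work in ray tracing is an intersection test, e.g. ray vs. sphere, and every pixel's ray is independent of the others, which is exactly the kind of parallel workload GPUs excel at. A minimal sketch, assuming a normalised ray direction (function and variable names are mine):

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return distance t along the ray to the nearest sphere hit, or None.

    Assumes `direction` is normalised, so the quadratic's a-term is 1.
    """
    # Vector from sphere center to ray origin
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t > 0 else None
```

For example, a ray from the origin along +z hits a unit sphere centred at (0, 0, 5) at t = 4. Rasterisation hardware of the time was built around triangle setup and interpolation rather than this kind of per-ray search through the scene.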

So I think we'll have to wait for either dedicated hardware or much faster CPUs before ray tracing takes off.
 
It looks appealing... I wonder if DX10 and unified shaders will allow enough programmability to do the processing on GPUs one day?
 
I'm wondering if the stream processors on the G80 could be used for this somehow... very interesting nonetheless!
 
I love hearing silly things like this; it makes me laugh. Film CG still takes ages to render, and although game shader tech is getting more advanced, we won't get film-like visuals in games for years. This reminds me of the things Sony marketing comes out with when they launch a new console: "The magical rainbow engine can render the whole world with real physics and lighting in full HD etc. etc."... Ahahahaha
 
artmonkey said:
I love hearing silly things like this; it makes me laugh. Film CG still takes ages to render, and although game shader tech is getting more advanced, we won't get film-like visuals in games for years. This reminds me of the things Sony marketing comes out with when they launch a new console: "The magical rainbow engine can render the whole world with real physics and lighting in full HD etc. etc."... Ahahahaha


Well, my friend, then you need to check this site:

www.tacc.utexas.edu

where Greg S. Johnson's research on real-time global illumination (which is nothing more than ray tracing taken a whole step further) has won him NVIDIA's research-of-the-year award three times in a row.

Oh, and next time read the whole article before posting: there is actual specialized hardware for ray tracing. I'd give it 5 to 6 years to happen; of course it wouldn't look like LOTR, but it'll surely be better than Z-buffering.
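For anyone unfamiliar with the comparison: Z-buffering is how today's rasterisers resolve visibility; at each pixel, the nearest fragment drawn so far wins. A toy sketch of the idea (my own illustration, not from the article; names are mine):

```python
def rasterise_depth(fragments, width, height):
    """Minimal z-buffer: keep only the nearest fragment at each pixel.

    `fragments` is a list of (x, y, z, colour) tuples, smaller z = nearer.
    """
    INF = float("inf")
    zbuf = [[INF] * width for _ in range(height)]    # depth per pixel
    colour = [[None] * width for _ in range(height)]  # winning colour per pixel
    for x, y, z, col in fragments:
        if z < zbuf[y][x]:  # nearer than what's stored -> overwrite
            zbuf[y][x] = z
            colour[y][x] = col
    return colour
```

Note what it can't do: the depth test only decides which surface is in front, whereas ray tracing naturally extends to shadows, reflections and refractions by firing secondary rays, which is why rasterisers fake those effects with extra passes and tricks.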
 
And when you refer to ages to render: sure, you need a whole render farm to make a film like Shrek. But never say never; heck, cell phones started on the Star Trek series until a guy figured out a way to bring them to actual life. Another thing is what AMD is doing, using a specialized card made up of, ahem, "stream processors" (Xenon anyone?) for HPC. So what do you think? That it can't be done?
It's suspicious when a guy who posts on a tech-related site says never to something like this.

I actually think it's well underway.
 