NVIDIA to Announce RTX Technology Next Week

Megalith · 24-bit/48kHz · Staff member · Joined Aug 20, 2006 · Messages: 13,000
NVIDIA is expected to detail its RTX technology on Monday, the culmination of a decade of work toward real-time cinematic rendering. The company will also demonstrate Ray Tracing for GameWorks, which "enables real-time Area Shadows, Glossy Reflections, and Ambient Occlusion." Major game engines such as Unreal, Unity, and Frostbite have already pledged support for the new API, which was created in partnership with Microsoft.
 
Interesting; getting closer to a new graphics card that will include this RTX technology, I assume. I just hope NVIDIA doesn't pin us to using GameWorks for the feature. That would suck.
 
Maybe they can make a new GPU that is good at games and not at mining. (I know this is a contradiction)
 
For the low price of $3,000?
Be nice to see this, if reasonable, while I still have enough reflexes left to game.
 
Ray Tracing X? Are they bringing a new age GPU like they did 19 or so years ago?
 
If it was a halo gaming card that was 30% faster than 1080ti 100 % scaling sli for $3000, I'd probably buy it. :(
 
Raytracing has always been the holy grail, because it solves all the issues of shadows, lighting, etc. It will just make everything look so much better instead of kinda faking it like we do now.
 
Raytracing has always been the holy grail, because it solves all the issues of shadows, lighting, etc. It will just make everything look so much better instead of kinda faking it like we do now.

Very much this. Also, they should be able to do 'audio shaders' the same way.

And I'll bet that whatever solution NVIDIA has come up with for this on the PC will be fairly portable due to Microsoft's involvement; it has to run on AMD hardware in the next Xbox too, right?
 

Intel hasn't been a viable competitor in GPUs since...well, ever.

Funny story: I worked for Intel in the late nineties, and really tried to push the idea of getting serious about discrete GPUs. They weren't having it...management thought GPUs were a "niche product" and that they would never offer substantial revenue. Whoops.
 
Well, maybe the next Xbox will have the dream GPU. AI done on the GPU (Tensor Cores), making today's AI look as smart as a brick wall. Physics all done on the GPU, which NVIDIA has been doing for a while. Lighting that's far more physically based, which has also been the trend for a while, plus 3D sound driven by ray tracing the environment, which could be done on the GPU too. You won't need much of a CPU; it will be a slave to the GPU handling minor I/O and such. An ARM processor would probably be perfect, and very low power too.

All of the above would bring VR much closer to reality. For example, taking a hammer to any object in the scene and having it react realistically to whatever you do to it.

I do believe NVIDIA can become a super mega corporation if they don't mess up and keep ahead of everyone else.
 
I don't see this happening anytime soon. Real raytracing, without all the bullshit approximation tricks, has to be the goal for all VR development moving forward, and to date, this has barely started. Imagine a game model that doesn't use baked maps for its details, just raw polys, in the millions. Now, if you raytrace it with any amount of GI added for realism, it takes a while even on some pretty beefy setups. For a GPU to do this in real time at 24 fps at 2K/4K would be an insane leap.
 
I thought at first they bought Matrox or their tech, lol

I still have these, a Matrox RT.X100 and RT.X2 Real-Time video effects editing card.

[Image: rtx2.jpg]

[Image: rt.x100.jpg]
 
Countdown until AMD complains about RTX being unfair or creating an “open source” knock-off that nobody will use.
 
Been saying for a while we needed Ray-tracing. I think it'll be like this, but accelerated.

 
John Carmack was talking about this ages ago, before he left id Software and before the Doom 4 project turned into DOOM; the Doom 4 engine was intended to pull this off using the GPU tech of the time.
 

I had a Video Toaster 4000 card back in the early 90's when I had my Amiga 3000. Awesome card, but kinda pricey at $2500.
I bought it only for Lightwave 3D and traded the card for a PC when the pirate program called LightRave came out and allowed Lightwave to run without a Toaster installed in the Amiga.
I eventually got Lightwave for the PC since it rendered much faster on a Pentium 100 than on the 68030 processor I had.
 
Raytracing has always been the holy grail, because it solves all the issues of shadows, lighting, etc. It will just make everything look so much better instead of kinda faking it like we do now.

Pretty much all 3D rendering for games is a bucket of workarounds for trying to achieve something visually similar to raytracing in the absence of the computational power to do the real thing in real time.

I'm super glad to see some green shoots on getting there, but unless there's been some big breakthrough that everyone has been keeping secret, I feel like we are still a long way from consumer cards that can really pull it off. Hopefully we start to see some minor implementations, though.
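To make the "workarounds vs. the real thing" point concrete, here's a toy Python sketch (a hypothetical illustration, not anything from NVIDIA's actual RTX API) of the core raytracing primitive: a ray-sphere intersection test plus a shadow ray. Notice that hard shadows fall out of the intersection test for free, which is exactly the effect rasterizers have to fake with shadow maps.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance t along a unit-length ray to the nearest sphere hit, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))  # a == 1 since direction is unit length
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0                     # nearer of the two roots
    return t if t > 0.0 else None

def in_shadow(point, light, center, radius, eps=1e-4):
    """Shadow ray: is the segment from a surface point to the light blocked by the sphere?"""
    to_light = tuple(l - p for l, p in zip(light, point))
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = tuple(v / dist for v in to_light)
    # Nudge the origin off the surface to avoid self-intersection ("shadow acne").
    origin = tuple(p + eps * d for p, d in zip(point, direction))
    t = ray_sphere(origin, direction, center, radius)
    return t is not None and t < dist

# A unit sphere at the origin blocks the light for a point directly below it:
print(in_shadow((0.0, -2.0, 0.0), (0.0, 10.0, 0.0), (0.0, 0.0, 0.0), 1.0))  # True
print(in_shadow((5.0, -2.0, 0.0), (5.0, 10.0, 0.0), (0.0, 0.0, 0.0), 1.0))  # False
```

A real renderer fires millions of these rays per frame against millions of triangles, which is why it needs acceleration structures (BVHs) and, per the article, dedicated hardware to hit real-time frame rates.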
 
Is the NewTek of Video Toaster fame the same company that made Lightwave?
 
I had a Video Toaster 4000 card back in the early 90's when I had my Amiga 3000. Awesome card, but kinda pricey at $2500.
I bought it only for Lightwave 3D and traded the card for a PC when the pirate program called LightRave came out and allowed Lightwave to run without a Toaster installed in the Amiga.
I eventually got Lightwave for the PC since it rendered much faster on a Pentium 100 than on the 68030 processor I had.
Pricey? It rivaled professional equipment costing $15,000+
 
Is the NewTek of Video Toaster fame the same company that made Lightwave?
Yes. Lightwave 3D was part of the Video Toaster bundle from NewTek, and then became available as a standalone product starting with version 4, I believe.
I still have my Lightwave 3D parallel-port security dongle.
 
Pricey? It rivaled professional equipment costing $15,000+
Pricey for the home hobbyist at the time who wasn't using it to make money.
I did end up getting a job at a TV station since I knew how to use the Toaster, and the Toaster operator at the station was moving to another state to teach.
 
Countdown until AMD complains about RTX being unfair or creating an “open source” knock-off that nobody will use.

+ low fps that nobody can use + expensive cards and so on. :D

This is actually good news moving forward. Now let's see how well it takes off.
 
How well will an RTX card mine?
Maybe it won't be capable of mining? The thing about specialized hardware acceleration is that it doesn't do general computing. Somehow our graphics cards went from being specialized to general computing. If ray tracing is to be pulled off, it may need to be specialized again.
 