cageymaru

Crytek has released a new video demonstrating the results of a CRYENGINE research and development project called Neon Noir. Neon Noir is based on an advanced version of CRYENGINE's Total Illumination, showcasing real-time ray tracing. The demo shown below is running on an AMD Vega 56 and demonstrates how real-time mesh ray-traced reflections and refractions can deliver highly realistic visuals for games. This feature is expected to be available to developers using CRYENGINE in 2019. The demo was created on a bespoke version of CRYENGINE 5.5, and the experimental ray tracing feature is both API and hardware agnostic. This means that it will run on AMD and NVIDIA GPUs. Crytek will optimize the technology to benefit both the current generation of graphics cards and supported APIs like Vulkan and DX12. Thanks, Gideon!

Neon Noir follows the journey of a police drone investigating a crime scene. As the drone descends into the streets of a futuristic city, illuminated by neon lights, we see its reflection accurately displayed in the windows it passes by, or scattered across the shards of a broken mirror while it emits a red and blue lighting routine that will bounce off the different surfaces utilizing CRYENGINE's advanced Total Illumination feature. Demonstrating further how ray tracing can deliver a lifelike environment, neon lights are reflected in the puddles below them, street lights flicker on wet surfaces, and windows reflect the scene opposite them accurately.
 
That's some impressive shit. Honestly, don't buy into the Nvidia horseshit. It can be done, smoothly, on any modern video card. Nifty. The only reason I have a 2080 Ti is for the horsepower. Never used the ray tracing.
 
I just wish they had a demo version I could download and run on my machine to see how well it does. Granted, I am running a 1080, not a Vega 56, but it would be interesting. It also shows what I knew all along: tensor cores are unneeded.
 
It took a while and I bet it will improve; there is no need for raw ray tracing imo.
This wouldn't matter if Moore's law had kept on track, but with costs starting to spiral per extra MHz, we need shortcuts.
Sweet.
 
I just wish they had a demo version I could download and run on my machine to see how well it does. Granted, I am running a 1080, not a Vega 56, but it would be interesting. It also shows what I knew all along: tensor cores are unneeded.

Tensor cores have nothing to do with ray tracing; they are for things like AI, etc. Ray tracing is handled by yet another architecture (the RT cores).
 
This makes Nvidia look like they are scamming everyone; I for one will not touch an RTX card. This whole ray tracing chip thing is now exposed as an unnecessary fraud.....
I've been saying for a while that you don't need an ASIC for ray tracing. I've shown demos from 2011 with real-time ray tracing. But Nvidia has reasons, like not depending on the CPU for consistent performance. More likely, it's to lock ray tracing to their hardware and therefore their brand.

BTW, Battlefield V did ray tracing on a GTX 1080 Ti first.
 
So ray tracing can be achieved without "RT cores"? Did Nvidia lie to us?

The tech never said it couldn't be achieved with a non-RTX card. It also shows your ignorance of path tracing. Let me follow the same guidelines you gave RTX and Nvidia: "any games, spent $800 for a card and no ray-traced games, so garbage"... that's pretty much how you guys sound.
 
Tensor cores have nothing to do with ray tracing; they are for things like AI, etc. Ray tracing is handled by yet another architecture (the RT cores).
Frankly speaking, some of these idiots are spouting nonsense. They clearly show they are ignorant of how the technology works and why Nvidia used RT cores and tensor cores. As cool as this is, there just isn't enough compute right now to do it with proper path tracing, which is why this is using "mesh" ray tracing. I am more interested in what they did and how they did it; from the looks of it they are cheating it in a way that makes it work with maybe hardly any impact? I don't know. It will be interesting to dissect this.
 
So this video is saying that Crytek will be implementing ray tracing into their games without the need for a hardware-specific RTX card?

With the limitations and performance relative to using dedicated RT hardware (and whether the engine can leverage RT hardware at all) being anyone's guess.
 
I've been saying for a while that you don't need an ASIC for ray tracing. I've shown demos from 2011 with real-time ray tracing. But Nvidia has reasons, like not depending on the CPU for consistent performance. More likely, it's to lock ray tracing to their hardware and therefore their brand.

BTW, Battlefield V did ray tracing on a GTX 1080 Ti first.

I've known from the beginning that ray tracing doesn't need any special cores to be achieved and that the RT cores in RTX cards are there to accelerate it. I feel people are getting overly excited by this canned demo. It's impressive and all but we need to see how it looks and performs in a game before declaring RTX cards a complete scam when it comes to ray tracing.
 
I've known from the beginning that ray tracing doesn't need any special cores to be achieved and that the RT cores in RTX cards are there to accelerate it.

Well, you and anyone else with half a brain who actually read the details on the technology. Twenty years ago we had ray tracing, when neither RT cores nor GPUs existed. Of course the RT cores are only meant to accelerate ray tracing calculations (currently done at 1 ray per pixel or less, then just denoising the mess they output). You could do the same on a CPU, just slower.

Seems like Crytek has designed a way to use shaders to ray trace more efficiently. Again, this is nothing new, as Microsoft has always said that DXR would use shaders in the absence of ray tracing hardware (which is all that RTX does: accelerate the calculations, which are then exposed through DXR so they work via DX12).

The interesting thing will be whether Crytek's approach enables at least 1080p30 rendering with a degree of quality similar to the RTX 20 series.
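To put the "you could do the same on a CPU, just slower" point in concrete terms, here's a minimal, hypothetical C++ sketch of the core math every ray tracer shares: intersect a ray with some geometry, then bounce it with the standard reflection formula. All the names here are just illustrative, and a real renderer would trace millions of these per frame through a BVH, but nothing in it requires RT cores; dedicated hardware only makes the intersection/traversal step faster.

```cpp
// Minimal sketch: the core math of a ray tracer, run on the CPU.
// RT cores only accelerate this kind of work; they are not required for it.
#include <cmath>
#include <cstdio>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 normalize(Vec3 v) { float len = std::sqrt(dot(v, v)); return v * (1.0f / len); }

// Standard reflection formula: R = D - 2(D.N)N
Vec3 reflect(Vec3 d, Vec3 n) { return d - n * (2.0f * dot(d, n)); }

// Ray/sphere intersection: solve |o + t*d - c|^2 = r^2 for the nearest t > 0.
bool hitSphere(Vec3 o, Vec3 d, Vec3 c, float r, float& t) {
    Vec3 oc = o - c;
    float b = dot(oc, d);
    float disc = b * b - (dot(oc, oc) - r * r);
    if (disc < 0.0f) return false;
    t = -b - std::sqrt(disc);
    return t > 0.0f;
}

int main() {
    Vec3 origin{0, 0, 0};
    Vec3 dir = normalize(Vec3{0, 0, 1});   // primary ray straight ahead
    Vec3 center{0, 0, 5};                  // a unit sphere 5 units away
    float t;
    if (hitSphere(origin, dir, center, 1.0f, t)) {
        Vec3 hit = origin + dir * t;
        Vec3 normal = normalize(hit - center);
        Vec3 bounce = reflect(dir, normal); // direction of the reflection ray
        std::printf("hit at t=%.2f, reflected dir (%.2f, %.2f, %.2f)\n",
                    t, bounce.x, bounce.y, bounce.z);
    }
    return 0;
}
```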
 
Well, you and anyone else with half a brain who actually read the details on the technology. Twenty years ago we had ray tracing, when neither RT cores nor GPUs existed. Of course the RT cores are only meant to accelerate ray tracing calculations (currently done at 1 ray per pixel or less, then just denoising the mess they output). You could do the same on a CPU, just slower.

Seems like Crytek has designed a way to use shaders to ray trace more efficiently. Again, this is nothing new, as Microsoft has always said that DXR would use shaders in the absence of ray tracing hardware (which is all that RTX does: accelerate the calculations, which are then exposed through DXR so they work via DX12).

The interesting thing will be whether Crytek's approach enables at least 1080p30 rendering with a degree of quality similar to the RTX 20 series.

The sad thing is that it seems a lot of people think RT cores are required for ray tracing, or that Nvidia said they were, so they are getting way too excited about this demo and declaring RT cores a scam. If the technique Crytek is using can get good results and run decently on all hardware, that would be fantastic, but it's also reasonable to think that it won't look as good as what RTX cards can produce and will still cause a performance hit.
 
Looks good. If I bought an RTX card I might start to fume. Still glad I didn't, one way or the other.
 
The tech never said it couldn't be achieved with a non-RTX card. It also shows your ignorance of path tracing. Let me follow the same guidelines you gave RTX and Nvidia: "any games, spent $800 for a card and no ray-traced games, so garbage"... that's pretty much how you guys sound.

I'm very aware of what path tracing is. My post was full of questions, not statements. Get the sand out of your vagina and your BS out of here! *Statement.
 
If a game uses this and AMD does as well as nVidia, then RTX is a scam.

If a game uses this and AMD is at a disadvantage, then nVidia paid off Crytek.

That’s the narrative, I’m calling it now. :D

It’s been known forever, and nVidia even said RT can be done on other vendors’ cards since it’s based on DirectX or Vulkan.
 
If a game uses this and AMD does as well as nVidia, then RTX is a scam.

If a game uses this and AMD is at a disadvantage, then nVidia paid off Crytek.

That’s the narrative, I’m calling it now. :D

It’s been known forever, and nVidia even said RT can be done on other vendors’ cards since it’s based on DirectX or Vulkan.

I don't believe it's a scam since the technology actually works & theirs is the first card on the market to do it, BUT they sure as heck touted it like they created the technology, which is kind of deceiving to the public.
 
"RT cores" is rather a misnomer, as I don't believe there is any such thing. There are FP32, FP16, and tensor cores. The FP units may be used for anything; the tensor cores are used for DLSS and denoising.
 
I don't believe it's a scam since the technology actually works & theirs is the first card on the market to do it, BUT they sure as heck touted it like they created the technology, which is kind of deceiving to the public.
They never said anything about it being invented by them.
 
If the technique Crytek is using can get good results and run decently on all hardware, that would be fantastic, but it's also reasonable to think that it won't look as good as what RTX cards can produce and will still cause a performance hit.

Technically speaking, physics is physics, light is light, and math is math. Whatever Crytek is doing to calculate their ray tracing should look exactly the same as what DXR or RTX calculates. The only difference should be in the performance impact, where Nvidia will probably have the advantage for now because their RT cores are designed to accelerate these calculations. That does not mean, however, that other vendors won't get there with software in time, as I would fully expect them to do.
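For what it's worth, the "math is math" point can be made precise: whether the rays are traced by RT cores, compute shaders, or a CPU, every one of these approaches is approximating the same rendering equation; the hardware only changes how many samples per second you can afford. For reference, the standard form of that equation:

```latex
% Rendering equation (Kajiya, 1986): outgoing radiance at point x in
% direction omega_o equals emitted radiance plus incoming radiance,
% integrated over the hemisphere and weighted by the BRDF and cosine term.
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\,
    (\omega_i \cdot n)\, \mathrm{d}\omega_i
```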
 