Digital Viper-X- · [H]F Junkie · Joined: Dec 9, 2000 · Messages: 15,116
I've been excited for ray tracing for years now... Intel was pushing it pretty hard with their GPU venture as well, weren't they? Is this NV trying to get ahead of Intel? :O
The hardest thing for me is grasping the trade-offs made to call this ray tracing. The whole bottom-level/top-level acceleration structure makes me feel this is still in the environmental-light arena compared to recursive sampling from the camera's point of view.
Tracing a ray from one source to another is technically ray tracing; this may just be a bit different from a camera-centric algorithm.
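For anyone who hasn't seen the camera-centric side of this spelled out: the "recursive sampling from the camera" idea is just firing a ray per pixel, intersecting the scene, and recursing along the bounce. Here's a minimal toy sketch against a single hypothetical sphere with one mirror bounce (the scene, constants, and attenuation are all made-up assumptions, nothing to do with NVIDIA's actual BVH traversal):

```python
import math

# Toy sketch of camera-centric recursive ray tracing: one hard-coded
# sphere, one mirror bounce, grayscale "radiance". All values illustrative.
SPHERE_CENTER = (0.0, 0.0, -3.0)
SPHERE_RADIUS = 1.0

def intersect_sphere(origin, direction):
    """Nearest positive hit distance along a normalized ray, or None."""
    oc = tuple(o - c for o, c in zip(origin, SPHERE_CENTER))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c  # quadratic with a == 1 for a unit direction
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def trace(origin, direction, depth=0):
    """Recursive trace: background radiance 0.2, sphere is a 50% mirror."""
    if depth > 1:                      # cap recursion at one bounce
        return 0.0
    t = intersect_sphere(origin, direction)
    if t is None:
        return 0.2                     # ray escaped to the background
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    n = tuple((h - c) / SPHERE_RADIUS for h, c in zip(hit, SPHERE_CENTER))
    d_dot_n = sum(d * ni for d, ni in zip(direction, n))
    refl = tuple(d - 2.0 * d_dot_n * ni for d, ni in zip(direction, n))
    return 0.5 * trace(hit, refl, depth + 1)  # attenuated mirror bounce

# Primary ray straight down -z: hits the sphere, bounces back, escapes.
print(trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0)))  # 0.1
```

The TLAS/BLAS (top/bottom acceleration structure) business is orthogonal to this: it just replaces the single `intersect_sphere` call with a fast hierarchical search over many objects, so the recursion itself looks the same either way.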
The first iteration of T&L on the GeForce 256 is why the RTX 2080 Ti is called a GPU today. This will evolve, perhaps like you've said, 3-5 years from now into the true game changer. The evolution of this card will undoubtedly change the game, IMO, whether it's 3, 5, or 8 years from now, once ray tracing is fully capable on mid-grade GPUs and up. I personally (and don't flame me for this, lol) think that consoles like the Xbox held us back visually on PC. I think we should be at least 3-5 years into ray tracing already... which is where we were heading. At least Nvidia got the ball rolling.
Yeah, and we'll need a 60-month loan to buy one.
Maybe true. I do think this first gen is very expensive... but I think it goes in line with the economics. The economy and consumer confidence are way up at the moment, and there are lots of jobs. So companies are charging more, simple as that, since there's more potential cash flowing around.
Since all we get now are console ports, until AMD gives the consoles RT it will not go mainstream.
You do make some good points. It just depends on how many enthusiasts want to adopt early tech. I know a lot of computer guys are clamoring for something like this because it gives them an edge over consoles and other GPUs at the moment.
I'll stick with my 1080 Ti Strix and wait for AMD and 7nm.
I'm not PRE-paying for their R&D.
RT is a toy at this point, and it won't be used in any more games than they've already announced by the time this is replaced in a year with 7nm.
Not really looking at the tech, but going by the implementations we've been shown, I say overhyped. Every demo I've seen looks like they went over every surface with a high-gloss wax finish. That breaks whatever "realism" RT is supposed to bring to the scene, and IMHO it looks incredibly tacky.
Could you post pictures/videos? You're seeing something different than I am.
Same, I thought it looked extremely good.
IIRC the Titan V doesn't have the guts to natively ray trace. I could be wrong...
As far as CPU vs. GPU, you're not getting real-time ray tracing from generic hardware; the hardware doing this is specialized.
I need to see some side-by-side gameplay with and without ray tracing, to see if it really is worthwhile.
The first iteration of T&L on the GeForce 256 is why the RTX 2080 Ti is called a GPU today. This will evolve, perhaps like you've said, 3-5 years from now as the true game changer.
Except you didn't have to go back to 320x200 to enjoy HW T&L on it. T&L was a game changer then and there, not 3-5 years later. At this point this seems like a distraction. From what, you may ask? We'll find out soon when independent benchmarks on public drivers start to come.
Except you didn't have to go back to 320x200 to enjoy HW T&L on it. T&L was a game changer then and there, not 3-5 years later. At this point this seems like a distraction. From what, you may ask? We'll find out soon when independent benchmarks on public drivers start to come.
What? You seem to have missed the point entirely. Where does raytracing seem like a "distraction"? Videos/images please?
Let's assume that's true: Nvidia invested years of research and millions of dollars, and spent a bunch of transistors on RT hardware, just to distract from poor performance in today's games. Why didn't they save themselves a ton of trouble and just throw some more shader cores in there instead of wasting time on RT? Granted, we don't really know how much transistor budget RT actually uses; it could be very cheap since it's custom logic.
Either way, I really hope Turing delivers the rasterization goods and AMD quickly follows with RT hardware of their own. Then we can move on from all the mental gymnastics required to see the development of proper real-time GI as anything but awesome, and start arguing over which company has "real" RT.
RT seems to like cores/threads for seamless gameplay. The demands should be much more profound in VR space.
The mention was targeting your basic pre-Coffee Lake i7 at 4C/8T; while ray tracing is inherently parallel, it's hard to imagine the CPU requirements growing much. All the heavy work should be accelerated on the GPU.
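The "inherently parallel" part is worth spelling out: every pixel's ray is independent, with no shared state between them, which is exactly why the heavy lifting maps so well onto GPU hardware rather than CPU threads. A toy CPU sketch of that independence, farming image rows out to workers (the `shade` function is a stand-in assumption, not real per-ray work, and threads are used here only for simplicity):

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative sketch: per-pixel work in a ray tracer is independent, so
# rows can be distributed to workers with no locking or shared state.
WIDTH, HEIGHT = 8, 4

def shade(x, y):
    # Placeholder "radiance" (a simple gradient) standing in for a real
    # camera-ray trace at pixel (x, y).
    return (x + y) / (WIDTH + HEIGHT)

def render_row(y):
    # One unit of parallel work: shade every pixel in row y.
    return [shade(x, y) for x in range(WIDTH)]

# Threads shown for brevity; a real renderer would use the GPU (or at
# least processes) for the same embarrassingly parallel split.
with ThreadPoolExecutor(max_workers=4) as pool:
    image = list(pool.map(render_row, range(HEIGHT)))

print(len(image), len(image[0]))  # 4 8
```

Since each row is self-contained, adding workers scales the pixel work without touching the rest of the program, which is consistent with the point above: the CPU side mostly just dispatches, and the per-ray cost lives wherever the `shade` work is accelerated.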
What? You seem to have missed the point entirely.
A distraction from the actual performance of the new cards without it. Why do you think I mentioned independent benchmarks?
Is it so hard to grasp that raytracing is THE holy grail of graphics, and that NVIDIA is proud to be bringing it to market, so they focus on that... really?
7nm is in production and around the corner, and that's where the real benefits lie.