I have this theory that ray tracing may eventually bring back PhysX-type processing. Baked lighting, shadows, etc. take a lot of pre-calculation and storage, and there's only so much of that you can stuff into a game. It's one of the reasons we don't have fully destructible game environments: devs have to predict and bake every possible lighting and shadow state or it won't look right, and that's pretty much undoable if players can destroy anything in the game.

With real-time ray tracing you don't have to pre-bake light and shadow maps for every possible spot a jet might fly over, drop a 2000lb bomb on, and leave an appropriately sized crater. Blow a big hole in something and the lighting just follows the new geometry instead of relying on baked maps that were never made for it.

Now if we're dropping 2000lb bombs wherever we want, blowing holes in buildings with tank guns, or whacking trees in a fantasy RPG with an enchanted axe of splintering, we might need some PhysX-type processing to deal with all the bits that go flying. And if the game is using RT, all that debris might actually not look goofy.
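To make the baked-vs-real-time distinction concrete, here's a minimal toy sketch (my own illustration, not anything from an actual engine): a 1-D "scene" where a wall occludes a light. Baked visibility is computed once against the static geometry, so it goes stale the moment the wall is destroyed, while a per-query ray trace stays correct. The function and variable names are all hypothetical.

```python
def trace_visibility(scene, light_x, point_x):
    """Return True if nothing solid lies between the light and the point."""
    lo, hi = sorted((light_x, point_x))
    return all(scene[x] == 0 for x in range(lo + 1, hi))

# Scene cells: 0 = empty, 1 = solid. A wall sits at index 5.
scene = [0, 0, 0, 0, 0, 1, 0, 0, 0, 0]
light = 0

# "Bake": precompute visibility for every point against the static scene.
baked = {x: trace_visibility(scene, light, x) for x in range(len(scene))}

# The 2000lb bomb goes off: the wall at index 5 is destroyed.
scene[5] = 0

point = 8
print(baked[point])                           # stale baked answer: False
print(trace_visibility(scene, light, point))  # re-traced answer: True
```

A real renderer is obviously vastly more complex, but the failure mode is the same: the baked table has no entry for geometry that no longer exists, while the per-frame trace only ever looks at the scene as it is right now.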