cageymaru

Metro Exodus developer 4A Games has set a performance goal of 1080p at 60fps when enabling Nvidia's new RTX features, according to a company representative. Ben Archard, a rendering programmer at the studio, had this to say when interviewed by Rock Paper Shotgun.

"We're always going to be pushing 60[fps]," rendering programmer Ben Archard told me when I asked if 4A have any kind of performance targets when it came to implementing Nvidia's uber lighting ray tracing tech. "But we'll see what we get. Obviously, there are three cards there and we'll see what profiles we can get for each."
He also confirmed that 4A's 60fps target was for a 1920x1080 resolution as well. "It's 1080p, yes," he said. "That's the goal, but we'll see how it goes."

The article repeatedly stresses that the game code is not finished, as the game has a February 2019 release date and optimizations are ongoing. An extended interview with the developer is coming toward the end of the week, according to Hardware Editor Katharine Castle.
 
So, sounds like 60Hz ray tracing at 4K is still at least a couple of product cycles away. The math: you need 4x the raw number of pixels/rays; figure roughly a 70% improvement per generation (since it's new it might improve fast), plus maybe the HW DNN, software optimization, and leaning on G-Sync to get some more.

So, we might get 4K 60Hz RTX in Q4'2022 using two RTX 2280Ti. Lots of unknowns though, especially fab process tech.
But that might be worth upgrading to get.
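For anyone who wants to sanity-check that back-of-the-envelope math, here's a minimal C++ sketch. The 4x ray count is just the 1080p-to-4K pixel ratio, and the 70% per-generation gain is purely the assumption above, not anything Nvidia has promised:

#include <cmath>
#include <cstdio>

int main() {
    // 4K has ~4x the pixels of 1080p, so roughly 4x the rays for the same quality.
    const double ray_ratio = (3840.0 * 2160.0) / (1920.0 * 1080.0);
    const double gain_per_gen = 1.70;  // assumed ~70% ray-tracing speedup per generation
    // Solve gain_per_gen^n >= ray_ratio for n.
    const double gens_needed = std::log(ray_ratio) / std::log(gain_per_gen);
    std::printf("~%.1f generations to go from 1080p60 to 4K60 with RTX on\n", gens_needed);
    return 0;
}

That works out to about 2.6 generations, which is roughly where the Q4'2022 guess comes from; the HW DNN denoising and software optimization mentioned above are what could pull it in sooner.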
 
Feels like this is one of those features that everyone gets worked up about but won't actually be usable until 3 generations from now. Just hope the 2080ti is able to provide a significant performance uplift in non-raytracing situations.
Holy hell you're a lurker, hahaha.

Totally agree though. I just want a framerate boost in my current games really. Maybe enough to max them out again and get 144+
 
I was expecting..... Yes EXPECTING a new card to push 60fps at 1080p regardless of the new feature.

It feels like they just keep hitting reset on the fps counter.

I have a 144Hz 1080p monitor and expect a config to hit 144..... IF ray tracing means I can't hit that number then it's dead to me.

I have strong feels.
 
As others have said, this is a bit of a concern if they're saying "1080P, 60FPS" as a target for a $1200 card in 2018. How much better will it REALLY look using this tech to justify what seems to be a possibly incredible performance hit? Furthermore, if you're launching brand new cards that are supposedly especially built with cores for this sort of raytracing and AI, upcharging an arm and a leg compared to previous-generation MSRPs, shouldn't the quality and viability of the features in question be exceptional, as opposed to struggling to reach what used to be an old baseline with them enabled?

Lastly, I'm even more concerned about how different the "RTX" raytracing and AI tech will be. Is this another GSync/PhysX/GameWorks style thing where they are pushing yet another proprietary way to do things? If so, it means that the game experience for those with AMD cards or whatnot may be very different - assuming of course these features really look/feel different - in titles that make use of them, forcing developers to make a choice. If this does come to pass, I'm more interested in seeing what sort of ideally open alternative AMD and others will embrace instead and hope that the industry prefers that interpretation to whatever proprietary crap that Nvidia may decide to push yet again.
 
Lot of people seem to think RTX is just an eye-candy upgrade. It isn't.

As the Battlefield demo shows, with RTX, you'll see things in reflective surfaces that you won't see without it.
That's not just eye candy: that means you'll see someone hiding behind a barrier, see them as a reflection in a shiny object a little bit farther away.
That's a pretty nice advantage to have if you have the situational awareness to notice it.

So I won't be surprised to see top-tier competitive FPS players running at 1080p on dual RTX 2080TI with ray-tracing, just for the reflections.

And then there's professional game streamers. If you're one of those and your stream is mainly being watched at 1080p 60Hz anyway, why not run at that res and give the audience all the photo-real eye candy you can? So again, I won't be surprised to see dual RTX 2080TI with ray-tracing in that market, either.

At the top end, the people playing in these markets don't care about the mere $2500 such a configuration is going to cost.
 
They're probably just tuning the visuals with 60hz@1080p as a target, weighing performance vs appearance. If they get good results after optimizations, they may target a higher resolution for release or in an update.

I'm not expecting that this generation, but I just want to keep this in perspective.
 
Lot of people seem to think RTX is just an eye-candy upgrade. It isn't.

As the Battlefield demo shows, with RTX, you'll see things in reflective surfaces that you won't see without it.
That's not just eye candy: that means you'll see someone hiding behind a barrier, see them as a reflection in a shiny object a little bit farther away.
That's a pretty nice advantage to have if you have the situational awareness to notice it.

So I won't be surprised to see top-tier competitive FPS players running at 1080p on dual RTX 2080TI with ray-tracing, just for the reflections.

And then there's professional game streamers. If you're one of those and your stream is mainly being watched at 1080p 60Hz anyway, why not run at that res and give the audience all the photo-real eye candy you can? So again, I won't be surprised to see dual RTX 2080TI with ray-tracing in that market, either.

At the top end, the people playing in these markets don't care about the mere $2500 such a configuration is going to cost.

I don't care how good ray tracing is if it sets resolution/FPS standards back 15 years. Holy hell.
 
Furthermore, if you're launching brand new cards that are supposedly especially built with cores for this sort of raytracing and AI, upcharging an arm and a leg compared to previous-generation MSRPs, shouldn't the quality and viability of the features in question be exceptional, as opposed to struggling to reach what used to be an old baseline with them enabled?

"As the company name is stylized, "nVIDIA" is pronounced invidia, a Latin word for envy. This sense is a "looking upon" associated with the EVIL EYE, also derived from the Latin invidere (to look in a HOSTILE manner)." - Quora website

Seems many folks just don't get it ... evil has difficulty recognizing evil because to evil, evil looks normal. You know, "just smart business", "clever marketing", and evil finds telling lies to be the norm (new series release still A LONG WAY OFF, GTX 2080 Ti will be $999, etc)

nVIDIA is just being who they are
 
What is the comparison of 1080p with RTX versus 4K without it in terms of visuals? Since developers seem to be embracing it, will the lighting using RTX at 1080p make the overall scene visually more appealing than a high-res 4K image would?

I have seen demos of it, but of course they are just videos. From a distance will you be in awe of the lighting, but then as you come closer to an object soon realize you're looking at a low-res texture? Or will better lighting/reflections actually make that low-res texture look better?
 
"As the company name is stylized, "nVIDIA" is pronounced invidia, a Latin word for envy. This sense is a "looking upon" associated with the evil eye, also derived from the Latin invidere (to look in a hostile manner)." - Quora website

Seems many folks just don't get it ... evil has difficulty recognizing evil because to evil, evil looks normal. You know, "just smart business", "clever marketing", etc

Does that mean it blows a lot of smoke?
 
As others have said, this is a bit of a concern if they're saying "1080P, 60FPS" as a target for a $1200 card in 2018. How much better will it REALLY look using this tech to justify what seems to be a possibly incredible performance hit? Furthermore, if you're launching brand new cards that are supposedly especially built with cores for this sort of raytracing and AI, upcharging an arm and a leg compared to previous-generation MSRPs, shouldn't the quality and viability of the features in question be exceptional, as opposed to struggling to reach what used to be an old baseline with them enabled?

Lastly, I'm even more concerned about how different the "RTX" raytracing and AI tech will be. Is this another GSync/PhysX/GameWorks style thing where they are pushing yet another proprietary way to do things? If so, it means that the game experience for those with AMD cards or whatnot may be very different - assuming of course these features really look/feel different - in titles that make use of them, forcing developers to make a choice. If this does come to pass, I'm more interested in seeing what sort of ideally open alternative AMD and others will embrace instead and hope that the industry prefers that interpretation to whatever proprietary crap that Nvidia may decide to push yet again.

I don't think it's a concern at all. It is exactly what I expected watching the presentation. This is brand new technology. They're enabling features that require render farms when used by movie studios. Jensen himself said that you'd need to talk in terms of petaflops in order for true ray tracing to be viable in games. Of course it's going to be incredibly demanding; this is first-gen tech. Remember early GPU PhysX? You needed an entire separate card dedicated to PhysX processing to even hope to have a decent framerate. Nowadays GPUs are powerful enough to easily handle PhysX and game rendering on a single card. It will be YEARS before this tech is mature. No one should even think about buying 1st-gen RTX cards for all those RTX features.
 
And they're likely using the 2080 Ti for the target, so you're going to be royally fucked with a plain 2080 or the 2070. I mean, really, why even waste die space on something that's not even going to give you what most people consider playable performance at a measly 1080p? And pretty much anyone buying the high-end 2080 Ti has at least a high-refresh-rate 1440p monitor, if not a 4K monitor, so running well below native resolution will look like pure shit, negating the visual improvements that ray tracing brings anyway.

That said, once they go to 7 nanometers they can go buck wild adding many more RT cores, and that could change everything.
 
I don't think it's a concern at all. It is exactly what I expected watching the presentation. This is brand new technology. They're enabling features that require render farms when used by movie studios. Jensen himself said that you'd need to talk in terms of petaflops in order for true ray tracing to be viable in games. Of course it's going to be incredibly demanding; this is first-gen tech. Remember early GPU PhysX? You needed an entire separate card dedicated to PhysX processing to even hope to have a decent framerate. Nowadays GPUs are powerful enough to easily handle PhysX and game rendering on a single card. It will be YEARS before this tech is mature. No one should even think about buying 1st-gen RTX cards for all those RTX features.
And that is not true, just due to the simple fact that hardware PhysX is unoptimized shit and always has been. Even at 1080p, a 1080 Ti cannot hold 60fps in Borderlands: The Pre-Sequel in areas that use heavy PhysX. And as usual, the GPU usage actually drops instead of going up, which has always been a problem with the shittily coded PhysX.
 
I don't care how good ray tracing is if it sets resolution/FPS standards back 15 years. Holy hell.
So you're not their market for RTX.
I doubt they care, you'll probably buy the card anyway for the faster non-RTX performance.
Or for epeen, 'cause it's faster than the card you currently brag about in your sig.
 
I don't care how good ray tracing is if it sets resolution/FPS standards back 15 years. Holy hell.

I don't really agree here. Too often, especially in computing, things get pushed off to the side to accommodate speed. And in the end, it leads to a mess. Current 3D graphics for gaming are trickery compared to ray tracing. I pretty much expected another feature that wouldn't be used for a few generations. And I can understand why they pushed it. Yeah, it's going to be completely useless right now, but it might lead to a few interesting indie games, if done correctly.
 
And that is not true, just due to the simple fact that hardware PhysX is unoptimized shit and always has been. Even at 1080p, a 1080 Ti cannot hold 60fps in Borderlands: The Pre-Sequel in areas that use heavy PhysX. And as usual, the GPU usage actually drops instead of going up, which has always been a problem with the shittily coded PhysX.

Fair point. Let me rephrase: In most cases, a single card can handle both PhysX and game rendering.
 
So you're not their market for RTX.
I doubt they care, you'll probably buy the card anyway for the faster non-RTX performance.
Or for epeen, 'cause it's faster than the card you currently brag about in your sig.
It's faster than his Titan? You have the full review for the 2080ti already? Please share! How much faster is it?
 
I don't really agree here. Too often, especially in computing, things get pushed off to the side to accommodate speed.

Please allow me to put things into proper perspective using a car as an example. I'll sell you my brand-new, just-released-to-market speedster car for an ultra-premium price because it has an option that is going to revolutionize the car driving experience, but ... that revolutionary option won't do much for your driving experience for the next 2-3 years. Not to worry, though ... at least you have it there NOW!
 
Please allow me to put things into proper perspective using a car as an example. I'll sell you my brand-new, just-released-to-market speedster car for an ultra-premium price because it has an option that is going to revolutionize the car driving experience, but ... that revolutionary option won't do much for your driving experience for the next 2-3 years. Not to worry, though ... at least you have it there NOW!

Tessellation, Bump Mapping, etc. Companies always hype the new feature. I had a Vérité V1000 back in 1996 (Sierra Screamin' 3D), which hyped anti-aliasing. It came with Indy Car Racing 2 with that feature on, and the game ran at like 20 fps. It's always been this way for new technologies.
 
Feels like this is one of those features that everyone gets worked up about but won't actually be usable until 3 generations from now. Just hope the 2080ti is able to provide a significant performance uplift in non-raytracing situations.

I completely agree. The Mrs. and I both game on 1080P screens still, but with this laptop and its GTX 1080 I was hoping to jump to an ultrawide monitor. So even though 60FPS at 1080P sounds super terrible for those with high-resolution displays, it's still remarkable that this technology is even ready for prime time at that resolution. With that said, I probably won't have the means to jump to a new laptop when the 2080 reaches the notebook market, so I'll just take this as the writing on the wall to venture into new areas (ultrawide screens) instead.
 
Please allow me to put things into proper perspective using a car as an example. I'll sell you my brand-new, just-released-to-market speedster car for an ultra-premium price because it has an option that is going to revolutionize the car driving experience, but ... that revolutionary option won't do much for your driving experience for the next 2-3 years. Not to worry, though ... at least you have it there NOW!

Hurrah for shitty car analogies. Computers are not cars, car analogies do not work. Would you rather new tech and features never get introduced? First gen tech ALWAYS has trouble with new features. You act like RTX is the first to ever demand sacrifices to enable fancy new stuff.
 
So people paying up to $2,000 for a new RTX card may not be able to use the feature above 1080p@60FPS? Wow, just.. wow.

It's not really a feature yet. It's going to be slow; anyone expecting 4K with it on needs to google what it's doing.

Not gonna happen for years. Of course, by 2022 we'll have real-time caustics and global illumination to keep everyone at 1080p.
 
As others have said, this is a bit of a concern if they're saying "1080P, 60FPS" as a target for a $1200 card in 2018. How much better will it REALLY look using this tech to justify what seems to be a possibly incredible performance hit? Furthermore, if you're launching brand new cards that are supposedly especially built with cores for this sort of raytracing and AI, upcharging an arm and a leg compared to previous-generation MSRPs, shouldn't the quality and viability of the features in question be exceptional, as opposed to struggling to reach what used to be an old baseline with them enabled?

Lastly, I'm even more concerned about how different the "RTX" raytracing and AI tech will be. Is this another GSync/PhysX/GameWorks style thing where they are pushing yet another proprietary way to do things? If so, it means that the game experience for those with AMD cards or whatnot may be very different - assuming of course these features really look/feel different - in titles that make use of them, forcing developers to make a choice. If this does come to pass, I'm more interested in seeing what sort of ideally open alternative AMD and others will embrace instead and hope that the industry prefers that interpretation to whatever proprietary crap that Nvidia may decide to push yet again.
RTRT is part of the DX12 specification (DXR), so any hardware can run it; the question will be how fast, and when ATI and Intel decide to bring their answers, probably later in 2019 for ATI.

It has happened plenty of times that new technologies take a while to catch on. There are lots of good discussions going on here on [H] as to what is likely to happen; the 7nm process is not too far off, possibly boosting Nvidia's performance, and ATI must be working on something, but we can only guess on lots of stuff right now.

The Radeon 3870 was a great card for DirectX 9; it supported 10.1, but was so slow that you ran a game and thought, "Wow, this will be cool when a card can actually run this," and set it back to DX9.
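Since DXR is exposed through the standard D3D12 feature query rather than through a vendor-specific API, any vendor's driver can advertise support for it. Here's a minimal sketch of the capability check, assuming a Windows 10 SDK new enough to ship the DXR additions to d3d12.h:

#include <windows.h>
#include <d3d12.h>

// Returns true if this D3D12 device's driver exposes DXR tier 1.0 or better.
bool SupportsHardwareDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;  // query not recognized on older runtimes/SDKs
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

Whether that tier is backed by dedicated RT hardware or by a much slower compute path is entirely up to the driver, which is exactly the "how fast" question.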
 
Lot of people seem to think RTX is just an eye-candy upgrade. It isn't.

As the Battlefield demo shows, with RTX, you'll see things in reflective surfaces that you won't see without it.
That's not just eye candy: that means you'll
see someone hiding behind a barrier, see them as a reflection in a shiny object a little bit farther away.
That's a pretty nice advantage to have if you have the situational awareness to notice it.

[attached screenshot: upload_2018-8-23_18-55-35.png]


Like this? That is a pretty good advantage, yeah.
 