New Avatar PC Game Has a Hidden 'Unobtainium' Graphics Setting Targeting Future GPUs

Marees

Anyone without an RTX 4090 or Radeon 7900 XTX should not even attempt it.

According to a Redditor with an RTX 4090, this mode consumes 18.5GB of VRAM at 4K and runs at 30fps using DLAA and no frame generation. In the same thread, another RTX 4090 owner said they're getting about 50fps at 4K with DLSS set to "quality," roughly half of what they were getting before enabling this hidden mode.


Btw, there is a way to unlock "overdrive" aka "Unobtanium" settings: https://www.ubisoft.com/en-us/game/...ar-frontiers-of-pandora-pc-features-deep-dive

This hidden mode might soon become the new "Can it run Crysis?" for PC gamers. Since the vast majority of games these days are designed for consoles, very few push modern PC gaming hardware. It appears the new Avatar game is the exception, and we're here for it, especially as it includes ray tracing. Avatar: Frontiers of Pandora also includes more benchmarking tools than we've seen in a game in a long time, including the ability to automate the entire process, indicating the developers want it to become a new benchmark for PC games. With this hidden mode being difficult to run even for the RTX 4090, we can see it becoming the new gold standard for testing GPUs.

https://www.extremetech.com/gaming/...hidden-unobtainium-graphics-setting-targeting
 
Well, any game can arbitrarily set LoD values to be more demanding, especially where ray tracing is concerned. Crysis wasn't considered the graphics king and a new standard because it was demanding; it was a new standard because it utilized a lot of new tech at the time.

Very clickbaity. If the game has amazing graphics, why not just say so? Why couch it in a pretense about hidden settings?
 
Specular Reflections at Unobtanium max settings is the main FPS killer by far. I'm losing 28 FPS (80 to 52) going from very high to max. It seems to increase the resolution of the reflections; you can see this when you stare at the water and look for plant reflections, for example. I don't see a difference anywhere else, although it kills performance everywhere lol. As far as I can tell, the second biggest impact comes from Diffuse Reflections, at about 10 FPS.

I only use Unobtanium settings for Shadow Quality, Spot Shadows, Diffuse Reflections and Environment Reflection Quality.
 
All games that can, should have that. I've always found it inexplicable when a game's PC version's max graphics settings fall short of unlocking everything - like how Elder Scrolls: Oblivion's max grass rendering distance is only a few metres from the player, and the player can see everything fading into view beyond that. That's always been incredibly stupid design that defies any sense. If the developer is concerned about people not realising what's causing their PC to become a slideshow (which itself is a bit of an irrational fear), they can just add a warning on certain settings levels saying they're intended for future PC hardware.
 
I'm all in favor of this.

It seems silly that they are hiding the options though, but I understand why.

If a game launches these days, and the kids can't run everything at Ultra they start crying and review-bombing the title as "unoptimized garbage", even though what constitutes medium, high or ultra is just arbitrary.

The true measure of how well coded a game or engine is should be a comparison of how good it looks, and how hard it hits the hardware, but the problem here is that this is hopelessly subjective, so kids cling to "my medium to high end system can't run ultra, burn the witch"

I remember a time 20 years ago when on a pretty high end system, at launch most people couldn't run things at much above medium settings. The higher settings were there for future hardware. While I applaud the developers for bringing this thought process back, if I am going to be honest I was never going to play a blue pocahontas in space game anyway.

Come to think of it, I have never once enjoyed a game that was based on a film/tv series franchise that I had watched. Games are always better when they are original (or at least based on some obscure book I haven't read) so I have nothing to compare them to.
 
I'm all in favor of this.

It seems silly that they are hiding the options though, but I understand why.

If a game launches these days, and the kids can't run everything at Ultra they start crying and review-bombing the title as "unoptimized garbage", even though what constitutes medium, high or ultra is just arbitrary.
I don't think what constitutes medium, high, or ultra is arbitrary.

It's the developer's job to properly align the settings with what people can capably run at the time. Deciding which settings impact performance, the scale of adjustment, and where diminishing returns set in - these things are all important.

If the developer includes settings that are intended for future hardware but fails to make that clear...well, that's on the developer.
 
I don't think what constitutes medium, high, or ultra is arbitrary.

It's the developer's job to properly align the settings with what people can capably run at the time. Deciding which settings impact performance, the scale of adjustment, and where diminishing returns set in - these things are all important.

If the developer includes settings that are intended for future hardware but fails to make that clear...well, that's on the developer.

IMHO when a game launches, the most capable hardware on the market should only barely be able to handle "high". "Very high" and "ultra" should be for the future only.
 
Is it just me or when I saw short clips of this game, I immediately thought I was looking at Crysis?
 
This thread is worthless without... video of Unobtainium mode:

https://www.youtube.com/watch?v=jUDUgkhhjLg

Specular Reflections at Unobtanium max settings is the main FPS killer by far. I'm losing 28 FPS (80 to 52) going from very high to max. It seems to increase the resolution of the reflections; you can see this when you stare at the water and look for plant reflections, for example. I don't see a difference anywhere else, although it kills performance everywhere lol. As far as I can tell, the second biggest impact comes from Diffuse Reflections, at about 10 FPS.
Yar, the only things that really look outstanding are the water and shadows. Watched the video before reading your posts. Checks out.
 
Videos really don't do the game justice IMO. Yes, you can see how technically good it can look, but what is more impressive is just how smooth/responsive the game is while looking that good. Also, most of these videos aren't HDR - and the HDR in this game is incredible on OLED.
 
I'm all in favor of this.

It seems silly that they are hiding the options though, but I understand why.

If a game launches these days, and the kids can't run everything at Ultra they start crying and review-bombing the title as "unoptimized garbage", even though what constitutes medium, high or ultra is just arbitrary
I'm sure that's why they did it. It looks like Unobtanium really is that, even for a 4090. It is how they would like it to look, but we just don't have the GPU power today.

But we saw what happened with Alan Wake 2: because you couldn't max its settings without using DLSS, even on a 4090, people threw a massive fit. Never mind that the game looks great at "medium" and that's what the consoles run - nope, if PC gamers can't crank everything all the way up, they scream. So here's a way to stick the higher settings in and let people play around with them if they want, but keep them out of the way so hopefully there's less screaming.

Also, that aside, it does help with confusion for less technical users. If settings really kill FPS to an unacceptable degree even on high-end hardware, it's probably best to keep them hidden from normal users so they aren't confused.

I really like when games do things like this. Like, you can't design for the future because technology will change, but you can have the ability to crank things way up because in the future those limits may not be an issue. One example I remember was Doom 3 having the option to do uncompressed textures. No real point, it is a very minor quality increase and needed more VRAM than cards of the day had... but there was also not a reason to NOT have it and it didn't take long before you could have a card that could use that.
 
I'm all in favor of this.

It seems silly that they are hiding the options though, but I understand why.

If a game launches these days, and the kids can't run everything at Ultra they start crying and review-bombing the title as "unoptimized garbage", even though what constitutes medium, high or ultra is just arbitrary.

The true measure of how well coded a game or engine is should be a comparison of how good it looks, and how hard it hits the hardware, but the problem here is that this is hopelessly subjective, so kids cling to "my medium to high end system can't run ultra, burn the witch"

I remember a time 20 years ago when on a pretty high end system, at launch most people couldn't run things at much above medium settings. The higher settings were there for future hardware. While I applaud the developers for bringing this thought process back, if I am going to be honest I was never going to play a blue pocahontas in space game anyway.

Come to think of it, I have never once enjoyed a game that was based on a film/tv series franchise that I had watched. Games are always better when they are original (or at least based on some obscure book I haven't read) so I have nothing to compare them to.
QFT
 
IMHO when a game launches, the most capable hardware on the market should only barely be able to handle "high". "Very high" and "ultra" should be for the future only.
It's not like the game is easy to run. TechPowerUp has the 7800 XT only doing 60 fps at 1080p/Ultra.
 
I'm all in favor of this.

It seems silly that they are hiding the options though, but I understand why.

If a game launches these days, and the kids can't run everything at Ultra they start crying and review-bombing the title as "unoptimized garbage", even though what constitutes medium, high or ultra is just arbitrary.

The true measure of how well coded a game or engine is should be a comparison of how good it looks, and how hard it hits the hardware, but the problem here is that this is hopelessly subjective, so kids cling to "my medium to high end system can't run ultra, burn the witch"

I remember a time 20 years ago when on a pretty high end system, at launch most people couldn't run things at much above medium settings. The higher settings were there for future hardware. While I applaud the developers for bringing this thought process back, if I am going to be honest I was never going to play a blue pocahontas in space game anyway.
I agree with everything...

Come to think of it, I have never once enjoyed a game that was based on a film/tv series franchise that I had watched. Games are always better when they are original (or at least based on some obscure book I haven't read) so I have nothing to compare them to.
...except for this.
Go play RoboCop: Rogue City. It breaks that mold entirely: it's faithful to the original 1987 film while being highly original at the same time.
 
Come to think of it, I have never once enjoyed a game that was based on a film/tv series franchise that I had watched. Games are always better when they are original (or at least based on some obscure book I haven't read) so I have nothing to compare them to.
Turtles in Time, GoldenEye, Alien Isolation, Ghostbusters (PS3/XB360), Knights of the Old Republic, Star Trek: Elite Force, numerous Spider-Man games, Hulk: Ultimate Destruction, the recent Hogwarts game, Star Wars Republic Commando, Tron 2.0... many games, for decades.
 
Turtles in Time, GoldenEye, Alien Isolation, Ghostbusters (PS3/XB360), Knights of the Old Republic, Star Trek: Elite Force, numerous Spider-Man games, the recent Hogwarts game, Star Wars Republic Commando, Tron 2.0... many games, for decades.
For sure. My rule is "Any game based on a TV or movie sucks until proven otherwise," and also vice versa for TV/movies based on games. There are good ones, I just don't assume they'll be good until I see reviews to that effect. Also, the quality of the game is often decoupled from the quality of the movie.
 
Avatar: Frontiers of Pandora also includes more benchmarking tools than we've seen in a game in a long time, including the ability to automate the entire process, indicating the developers want it to become a new benchmark for PC games. With this hidden mode being difficult to run even for the RTX 4090, we can see it becoming the new gold standard for testing GPUs.
Does the built-in benchmark support Unobtanium mode 🤔

Here is a positive comment on the benchmark by a Brazilian Google engineer on Twitter/X:

The built-in bench takes 90s, split into 3 sections that represent the game's rendering EXTREMELY well. The first section shows a large RDA installation, soldiers in AMP suits, a gunfight and explosions.

Section two is a flyby from high altitude; very limited animation but very deep draw distance. You can notice significant pop-in of approaching detail - the engine struggles to handle LOD. It's a rare place where UE5 would beat this fantastic Snowdrop.

The third section walks through the jungle: incredibly dense vegetation with lots of animation, plus heavy rain and atmospheric effects, water, reflections. If you have played this for a few hours and you don't consider these 3 scenes a perfect distillation of the graphics... well.

People should understand that a good benchmark doesn't have to show "real gameplay". It needs the right mix of workload that's a good sampling of the game's models, animations, level structure, particle effects, etc. A bias towards heavy scenes is good. This bench does that 💯%

https://twitter.com/opinali/status/1733742735960502715?s=20
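Purely as an illustration of what "automating the entire process" could look like, here's a rough sketch of a repeat-run wrapper. The executable path, the "-benchmark" flag and the results CSV layout are all hypothetical placeholders, not the game's actual interface - check Ubisoft's deep-dive for the real hooks.

```python
# Hypothetical sketch: launch a game benchmark several times and collect results.
# The executable path, the "-benchmark" flag and the "frame_time_ms" CSV column
# are placeholders, NOT the game's real interface.
import csv
import statistics
import subprocess
from pathlib import Path

GAME_EXE = Path(r"C:\Games\AFOP\afop.exe")               # placeholder path
RESULTS_CSV = Path(r"C:\Games\AFOP\bench_results.csv")   # placeholder output file

def run_benchmark(runs: int = 3) -> list[float]:
    """Launch the benchmark `runs` times and return the average FPS of each run."""
    averages = []
    for _ in range(runs):
        subprocess.run([str(GAME_EXE), "-benchmark"], check=True)  # hypothetical flag
        with RESULTS_CSV.open(newline="") as f:
            frame_times_ms = [float(row["frame_time_ms"]) for row in csv.DictReader(f)]
        averages.append(1000.0 / statistics.mean(frame_times_ms))
    return averages

if __name__ == "__main__":
    fps = run_benchmark()
    print(f"runs: {[f'{v:.1f}' for v in fps]}, mean: {statistics.mean(fps):.1f} fps")
```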
 
Turtles in Time, GoldenEye, Alien Isolation, Ghostbusters (PS3/XB360), Knights of the Old Republic, Star Trek: Elite Force, numerous Spider-Man games, Hulk: Ultimate Destruction, the recent Hogwarts game, Star Wars Republic Commando, Tron 2.0... many games, for decades.
You forgot Street Fighter: The Movie: The Game.
 
When devs release games that won't run @ "HIGH" on my 2-year-old hardware (and I only upgrade every 4-5 years), I won't be buying that game for 2-3 years. But lucky me, I have a HUGE backlog & the game will be 1/2 price in 2-3 years.
 
I think when the next Epic sale comes in December, you might be able to get a big discount on this game and transfer it to your UBI launcher.
 
I'm sure that's why they did it. It looks like Unobtanium really is that, even for a 4090. It is how they would like it to look, but we just don't have the GPU power today.

But we saw what happened with Alan Wake 2: because you couldn't max its settings without using DLSS, even on a 4090, people threw a massive fit. Never mind that the game looks great at "medium" and that's what the consoles run - nope, if PC gamers can't crank everything all the way up, they scream. So here's a way to stick the higher settings in and let people play around with them if they want, but keep them out of the way so hopefully there's less screaming.

Also, that aside, it does help with confusion for less technical users. If settings really kill FPS to an unacceptable degree even on high-end hardware, it's probably best to keep them hidden from normal users so they aren't confused.

I really like when games do things like this. Like, you can't design for the future because technology will change, but you can have the ability to crank things way up because in the future those limits may not be an issue. One example I remember was Doom 3 having the option to do uncompressed textures. No real point, it is a very minor quality increase and needed more VRAM than cards of the day had... but there was also not a reason to NOT have it and it didn't take long before you could have a card that could use that.

I think it's the scaling people really complain about. Consoles being able to code to the metal (I'm not even sure that's a thing with the APIs out now). There are probably hundreds of cross-platform games that only marginally run better on a 4090 compared to a PS5 or Xbox Series X.

The Last of Us Part 1, Ratchet and Clank, Spider-Man, Alan Wake 2, Cyberpunk, heck even Madden 24, etc.

All things considered, you'd like to see a 4090 be 8x faster, but that's not how it works. Games are way more complicated. The only games I know of that I would say are expertly coded, stable and "optimized" are anything made/coded by John Carmack and id Software.

Doom and Doom Eternal look insane and run over 200 fps on weaker hardware sans RT.
 
All things considered you’d like to see a 4090 be 8x faster but that’s not how it works
Depends what we mean by "faster".

Alan Wake 2 on PS5 runs at around 1270p for around 30 fps, and at 847p for around 60 fps.

A 4090 at 4K medium quality, with DLAA as an expensive AA solution, can do around 76 fps.

That's about 2.9x the pixel count running at about 2.5x the speed, or roughly 7.3x "faster". I'm not sure how similar medium is to the PS5 quality mode and so on, and obviously the test was done on a way faster CPU, but the gap between a 4090 and a PS5/Xbox Series X can be quite something.

I think we tend to forget about it because of how normal 1080p/30 fps is on console, and because resolution generally matters less at couch viewing distance, even though almost everyone is playing on a 4K screen.
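Spelling that back-of-the-envelope math out (both resolutions assumed 16:9, using the 30 fps and 76 fps figures above):

```python
# Rough pixel-throughput comparison: PS5 quality mode vs. a 4090 at 4K "medium" + DLAA,
# using the numbers quoted above (1270p/30fps vs. 2160p/76fps), both assumed 16:9.
def pixels(height: int, aspect: float = 16 / 9) -> int:
    return round(height * aspect) * height

ps5_throughput = pixels(1270) * 30   # PS5 quality mode
gpu_throughput = pixels(2160) * 76   # 4090 at 4K medium

print(f"pixel ratio : {pixels(2160) / pixels(1270):.2f}x")     # ~2.9x
print(f"fps ratio   : {76 / 30:.2f}x")                          # ~2.5x
print(f"throughput  : {gpu_throughput / ps5_throughput:.1f}x")  # ~7.3x
```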
 
I think it's the scaling people really complain about. Consoles being able to code to the metal (I'm not even sure that's a thing with the APIs out now). There are probably hundreds of cross-platform games that only marginally run better on a 4090 compared to a PS5 or Xbox Series X.

The Last of Us Part 1, Ratchet and Clank, Spider-Man, Alan Wake 2, Cyberpunk, heck even Madden 24, etc.

All things considered, you'd like to see a 4090 be 8x faster, but that's not how it works. Games are way more complicated. The only games I know of that I would say are expertly coded, stable and "optimized" are anything made/coded by John Carmack and id Software.

Doom and Doom Eternal look insane and run over 200 fps on weaker hardware sans RT.
You are basically making things up, here. A PS5 isn't only marginally worse than a 4090, in actual game performance. And a 4090 isn't 8x faster even on paper specs, let alone actual game performance.
 
Depends what we mean by "faster".

Alan Wake 2 on PS5 runs at around 1270p for around 30 fps, and at 847p for around 60 fps.

A 4090 at 4K medium quality, with DLAA as an expensive AA solution, can do around 76 fps.
FPS and resolution always seem to be among the first things people ignore when getting angry about PC performance being "unoptimized" compared to consoles. Consoles have long been willing to trade those for shiny graphics, and if you want to compare on equal ground on a PC, you need to as well. I remember that shit with Oblivion on the Xbox 360. People screamed about the shitty PC port because you "needed" an 8800 GTS to get good performance... but what they meant was that you needed that to get 1080p60 (or around 60; it wasn't quite enough to hold it solid). The Xbox 360 was running it at 720p30 and couldn't even hold that - it dipped plenty.

So what people were complaining about as optimization was really just a very different level of demand. You can't demand higher-than-console settings, res, and fps and then be surprised when the hardware needed for it is massive.
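To put rough numbers on that Oblivion example (assuming 16:9 at both resolutions), 1080p60 versus 720p30 is about 4.5x the raw pixel throughput before you even touch settings:

```python
# 1080p @ 60 fps vs. 720p @ 30 fps: how much more raw work was the PC target?
pc      = 1920 * 1080 * 60   # what PC players demanded
console = 1280 * 720 * 30    # what the Xbox 360 actually targeted (and dipped below)

print(f"{pc / console:.2f}x the pixels per second")  # 4.50x
```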
 
I agree with everything...


...except for this.
Go play RoboCop: Rouge City, it breaks that mold entirely and is both faithful to the original 1987 film while being highly original at the same time.

Thank you for that recommendation. I'll have to try it!
 
You are basically making things up, here. A PS5 isn't only marginally worse than a 4090, in actual game performance. And a 4090 isn't 8x faster even on paper specs, let alone actual game performance.

Yep, I sure did, but my point wasn't to be factually accurate; I was conveying how others tend to describe it.

EDIT: Hold the phone - the PS5 is something like 10.3 TFLOPS and the 4090 is what, 82.5 TFLOPS?

82.5 / 10.3 = 8.00970874

And I was pulling the 8x figure out of my rear lol.
 
from here:

Avatar took 50-100 computer hours per frame to Render. That's 1,200-2,400 hours a second at 24 frames per second.

Now they try something like this in real time on PC.
Anyway, it's nice to compare the original Avatar's cinema graphics with PC graphics today and see how far things have come.
 
Yep, I sure did, but my point wasn't to be factually accurate; I was conveying how others tend to describe it.

EDIT: Hold the phone - the PS5 is something like 10.3 TFLOPS and the 4090 is what, 82.5 TFLOPS?

82.5 / 10.3 = 8.00970874

And I was pulling the 8x figure out of my rear lol.
AMD and Nvidia TFLOPS paper specs are not directly comparable for gauging actual gaming performance.

The RX 6750 XT (RDNA 2 - the PS5 is based on RDNA 2, but not exactly the same) and the RTX 4060 Ti (Ada Lovelace, like the 4090) are roughly the same in gaming performance.

The RX 6750 XT is 16.20 TFLOPS FP32.
The RTX 4060 Ti is 22.06 TFLOPS FP32.

There isn't a lot of coverage on the RX 6700, which is the closest desktop GPU to the PS5's GPU. And TechPowerUp's relative performance chart on the RX 6700 page is flat-out wrong (they put it behind the 6650 XT, which is incorrect). The point being, there isn't an Ada GPU which fits exactly in this comparison. IMO the 4060 is probably less performant and the 4060 Ti is more performant. If we strike somewhere in the middle, with a little benefit to Ada for being a newer arch, that would make the 4090 roughly 5x more powerful for actual gaming.
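To follow that reasoning with the numbers in this post: treat the 6750 XT ≈ 4060 Ti equivalence as a rough RDNA2-to-Ada "exchange rate", convert the PS5's paper TFLOPS into Ada-equivalent TFLOPS, and compare against the 4090. Napkin math built entirely on the figures above, not a benchmark:

```python
# Napkin math: express the PS5's RDNA2 TFLOPS in "Ada-equivalent" TFLOPS using
# the 6750 XT ~= 4060 Ti gaming-performance equivalence quoted above.
ps5_tflops       = 10.3    # RDNA2 paper spec
rtx4090_tflops   = 82.5    # Ada paper spec
rx6750xt_tflops  = 16.20   # RDNA2
rtx4060ti_tflops = 22.06   # Ada, roughly equal gaming performance to the 6750 XT

ada_per_rdna2 = rtx4060ti_tflops / rx6750xt_tflops   # ~1.36 Ada TFLOPS per RDNA2 TFLOP
ps5_in_ada    = ps5_tflops * ada_per_rdna2           # ~14 Ada-equivalent TFLOPS

print(f"paper ratio    : {rtx4090_tflops / ps5_tflops:.1f}x")   # ~8.0x
print(f"adjusted ratio : {rtx4090_tflops / ps5_in_ada:.1f}x")   # ~5.9x
```

That lands in the same 5-6x ballpark as the estimate above.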
 
from here:

Avatar took 50-100 computer hours per frame to Render. That's 1,200-2,400 hours a second at 24 frames per second.

Now they try something like this in real time on PC.
Anyway, it's nice to compare the original Avatar's cinema graphics with PC graphics today and see how far things have come.
Avatar: The Way of Water is 48 FPS, not 24.
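For what it's worth, that correction also changes the per-second figure from the quote; at the quoted 50-100 compute-hours per frame:

```python
# Compute-hours of offline rendering per second of film, at the quoted
# 50-100 hours per frame, for both frame rates mentioned in the thread.
for fps in (24, 48):
    low, high = 50 * fps, 100 * fps
    print(f"{fps} fps: {low:,}-{high:,} compute-hours per second of footage")
# 24 fps: 1,200-2,400
# 48 fps: 2,400-4,800
```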
 
I think all games should have settings for future hardware, and they shouldn't even be hidden.

Same, but just to stop morons from complaining about optimization, they should have obvious names, with a description that clearly states "these settings are not intended to be playable today at launch, and are included for the benefit of future players."

Call them Future+, Future++ and Future+++ or something.

Of course, one might argue they could just launch without them present and patch the game when the time comes, but we all know major patches are unlikely more than a year or two after launch.

Also, presets are just presets. One could just leave the custom settings for those who really want to crank things up down the line, and not mess with presets at all.
 
Same, but just to stop morons from complaining about optimization, they should have obvious names, with a description that clearly states "these settings are not intended to be playable today at launch, and are included for the benefit of future players."

Call them Future+, Future++ and Future+++ or something.

Of course, one might argue they could just launch without them present and patch the game when the time comes, but we all know major patches are unlikely more than a year or two after launch.

Also, presets are just presets. One could just leave the custom settings for those who really want to crank things up down the line, and not mess with presets at all.
That's probably the best thing to do. But people still complain. Doom 3 had all sorts of warnings and people still complained - and that was back in the day, when it wasn't uncommon for games to have settings that current hardware couldn't come close to running at 60 fps maxed out.
IMO just do it anyway; stupid people are going to be stupid and entitled people are going to be entitled.
 
That's probably the best thing to do. But people still complain. Doom 3 had all sorts of warnings and people still complained - and that was back in the day, when it wasn't uncommon for games to have settings that current hardware couldn't come close to running at 60 fps maxed out.
IMO just do it anyway; stupid people are going to be stupid and entitled people are going to be entitled.

I think this is the crux of it.

"THIS BARELY LOOKS BETTER AND WILL RUN LIKE SHIT" warnings in the menu aren't enough. It needs to take conscious effort to enable, or someone will do it anyway and subsequently bitch about it.

This is violently exacerbated by anyone playing on PC seeing not-max settings as a personal attack. And no game is going to call your computer a pussy little bitch. So at that point the only real choice is to tank your performance and complain about it on Reddit.
 
I think this is the crux of it.

"THIS BARELY LOOKS BETTER AND WILL RUN LIKE SHIT" warnings in the menu aren't enough. It needs to take conscious effort to enable, or someone will do it anyway and subsequently bitch about it.

This is violently exacerbated by anyone playing on PC seeing not-max settings as a personal attack. And no game is going to call your computer a pussy little bitch. So at that point the only real choice is to tank your performance and complain about it on Reddit.
Ya, I think hiding it behind a command line is actually the right way to go. It still won't eliminate all the screaming, as there are some dumb people out there, but it helps: you have to go out of your way to enable it. It can always be patched into the mainline settings later if they want, but this way it's there even if there aren't patches, or for people who want to fiddle around with it.
 
Crysis was fucked because it was designed for 5-6 GHz single-threaded CPUs that never eventuated, and it was hella CPU-bound for a long time.

If I want to break my system I use Fortnite; it gets a new unobtanium setting every three months with Epic showing off the latest engine improvements.
 
I remember this being done quite well before, in an 'Experimental Graphics - For Future Hardware' section with a written explanation/warning; I don't remember any complaining, maybe it wasn't mainstream enough.

Did people complain a lot about Cyberpunk's path tracing or extreme RT?

I feel that gamers are quite in the know now, and if it's stuff that isn't included when you set the game's graphics preset to ultra by default, then with a little warning it can go well enough - especially if we're talking about the level of stuff that people with a 4090 can easily run, as this seems to be, so it's not really exclusive to future GPUs.
 
I remember this being done quite well before, in an 'Experimental Graphics - For Future Hardware' section with a written explanation/warning; I don't remember any complaining, maybe it wasn't mainstream enough.

Did people complain a lot about Cyberpunk's path tracing or extreme RT?
Yes. Some gamers get like personally offended when their system can't max a game.
 
I think this is the crux of it.

"THIS BARELY LOOKS BETTER AND WILL RUN LIKE SHIT" warnings in the menu aren't enough. It needs to take conscious effort to enable, or someone will do it anyway and subsequently bitch about it.

This is violently exacerbated by anyone playing on PC seeing not-max settings as a personal attack. And no game is going to call your computer a pussy little bitch. So at that point the only real choice is to tank your performance and complain about it on Reddit.
The right kind of game could totally rip on a user's system, and it would be funny and might get the point across. An edgy FPS type of game like Duke Nukem or Borderlands: "Your garbage GPU won't be able to get 30 FPS with ultra ray tracing, you noob."

IMO the best game settings menus show you screenshots, or even have the game show live changes as you make them, and describe what each setting does and its performance impact - whether it hits your GPU or CPU harder, or uses more memory. I think there was a Call of Duty or Battlefield game that did something like that a few years ago. But if you can completely max everything out and still hit your target FPS, all that is kind of pointless.

There are a lot of easy settings that can be added to make a game look better beyond what current hardware can handle.
Lighting has always been something with a lot of settings and a big impact on performance and look. Ray-traced lighting seems like it should be just as easily scalable and go far beyond what current hardware can do: just keep adding more bounces and let users set how many they want. UE5's Nanite also seems like it could be easily scalable with a simple slider or number entry, very similar to setting draw distance and supersampling. Easy stuff to add if you aren't afraid of the wrath of the plebeians.
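As a toy illustration of that idea (every name here is hypothetical - nothing Snowdrop or UE5 actually exposes), an open-ended setting is really just a clamped number the engine reads, so there's little technical reason to cap it at what today's hardware can handle:

```python
# Toy sketch of an open-ended, future-proof graphics setting: a plain numeric
# value with a wide allowed range and a warning threshold, instead of a hard
# cap at what today's GPUs can manage. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class ScalableSetting:
    name: str
    value: int
    minimum: int
    maximum: int          # deliberately far beyond current hardware
    sane_today: int       # above this, show a "future hardware" warning

    def set(self, requested: int) -> int:
        self.value = max(self.minimum, min(requested, self.maximum))
        if self.value > self.sane_today:
            print(f"[warning] {self.name}={self.value} is intended for future hardware")
        return self.value

ray_bounces = ScalableSetting("rt_max_bounces", value=2, minimum=1, maximum=32, sane_today=4)
ray_bounces.set(8)   # allowed, but warned about - nothing has to be hidden or patched in later
```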
 