Alan Wake 2

Also more fine detail in areas, even in non-Ray Reconstruction cases. I've seen a number of games where DLSS Quality reveals fine details in distant geometry (like wires and such) that you don't see as clearly at native res. Of course it introduces some minor artifacts too, but I find that at Quality it ends up being a tradeoff: there may be some minor downgrades versus native, but there are often some minor upgrades too.
DLSS in AW2 trashes rain effects. And it has noticeable ghosting, especially when you activate Ray Reconstruction.

There are certainly some positives, such as good anti-aliasing and temporal stability for the edges of objects or thin objects, such as fencing, tree branches, etc. But for me, the above-mentioned issues ruin it.
 
DLSS in AW2 trashes rain effects. And it has noticeable ghosting, especially when you activate Ray Reconstruction.

There are certainly some positives, such as good anti-aliasing and temporal stability for the edges of objects or thin objects, such as fencing, tree branches, etc. But for me, the above-mentioned issues ruin it.
So what are you using instead? FSR?
 
FSR seems relatively good, or at least usable, for me. At 4K I get more frame stability with FSR Quality than at native? HUH. DLSS has its pluses and minuses, but overall it's the superior upscaling tech.

As for frame generation, unless Remedy adds FSR 3 frame gen I won't see it in action. Can consoles support frame gen? As for AMD Fluid Motion Frames, that looked terrible and I see no use for it at this stage.

The game is spectacular looking, the atmosphere and so on, and I'm glad they included the RT stuff even though I can't really use it right now. More options are better, even if they don't work well for you.

Enjoy the game.
 
The game is spectacular looking, the atmosphere and so on, and I'm glad they included the RT stuff even though I can't really use it right now. More options are better, even if they don't work well for you.

Enjoy the game.
Agreed on the game being spectacular looking. With regard to RT, even if you have RT disabled in the options, the renderer does its own software RT for GI. This makes the game look amazing even with RT disabled.
 
FSR seems relatively good, or at least usable, for me. At 4K I get more frame stability with FSR Quality than at native? HUH. DLSS has its pluses and minuses, but overall it's the superior upscaling tech.
DF says that native-res FSR is busted at the moment. I would expect they'll patch it soon, but the issue is a bug in the implementation, not something inherent to native-res FSR2.
 
I'll never understand the folks who are absolutely obsessed with "native" resolution. Dead Space remake is the only game I can think of that I have played in years where native actually looked better than DLSS Quality (or FSR Quality for crappy games without DLSS support).

People with a 4090 need to focus on DLDSR. I played Dead Space with DLDSR + DLSS quality.

DLDSR + DLSS always looks better than native resolution + TAA.
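For anyone wondering why that combo works, the arithmetic is simple. A minimal sketch, assuming the commonly quoted scale factors (DLDSR 2.25x of total pixel count, DLSS Quality roughly 66.7% per axis) and a 1440p panel as the example; actual factors can vary per game:

```python
# Rough arithmetic for the DLDSR + DLSS combo on a 1440p monitor.
# Scale factors are the commonly quoted defaults, treated as approximations.

MONITOR = (2560, 1440)          # native panel resolution
DLDSR_FACTOR = 2.25             # DLDSR multiplies the *pixel count*
DLSS_QUALITY_AXIS = 2 / 3       # DLSS Quality renders ~66.7% per axis

def dldsr_target(native, factor):
    """Output resolution DLDSR asks the game to render/present at."""
    scale = factor ** 0.5       # per-axis scale from a pixel-count factor
    return round(native[0] * scale), round(native[1] * scale)

def dlss_internal(target, axis_scale):
    """Internal resolution DLSS actually renders before upscaling."""
    return round(target[0] * axis_scale), round(target[1] * axis_scale)

target = dldsr_target(MONITOR, DLDSR_FACTOR)          # ~3840 x 2160
internal = dlss_internal(target, DLSS_QUALITY_AXIS)   # ~2560 x 1440

print(f"DLDSR target:  {target[0]} x {target[1]}")
print(f"DLSS internal: {internal[0]} x {internal[1]}")
# The GPU still shades roughly a native-1440p pixel count, but the image is
# reconstructed to 4K and then downsampled back to the panel, which is where
# the claimed quality win over native + TAA comes from.
```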
 
Reminds me of Crysis, at release the best hardware available couldn't maintain a playable framerate in that either. The big difference is that Crysis looked massively better than most other games when it released. Alan Wake 2 is not such a big step forward.

The fact that a $1600 GPU can only manage about 30 FPS at native 4K tells me that we're still a long way from mainstream 4K gaming though. While TVs have decidedly moved on to 4K, it's too much for all but the most expensive PCs. If you're rendering at 1440p anyway, it'll probably look the same or better on a native 1440p panel than upscaled to a 4K one.

^"just fine" rolf

People don't spend that kind of money to turn settings down...
And the point about Crysis isn't even comparable; flagship cards back then were a fraction of the price they are these days, around $500 or so, which was easy enough for just about anyone to afford.
I'm sorry if I sound harsh, but this reasoning is absurd.
The game's *lowest* settings look better than the *highest* settings in some of the finest-looking games available.
Let's say a developer "A" made a game "A1" that looks amazing and runs great on current hardware. A1's "highest" setting runs great (average 120fps) on RTX 4090 at 3840x2160.
Another developer "B" made a game "B1" that looks just as good as A1. But the developer "B" made B1's renderer to scale further into the future. Hence, its "lowest" settings while looking as good as A1's *highest* settings, runs just as well on RTX 4090 at 4K. However, B1 has more demanding settings which graphical fidelity scales further than what is offered in A1: While B1's "lowest" settings runs and looks as good as A1's highest settings, B1 has Lowest - Low - Midium - High - Highest settings to scale further and the with the highest settings, while the game provides improved graphical fidelity compared to the *lowest* (and that of the highest settings in A1), there's enough penality in performance that current RTX 4090 does not provide playable/smooth framerate at 4K.
Now, do you guys realize how stupid your posts sounded?
 
Because some of us can see and feel the difference, and it's crap.
Well, is someone pointing a gun at your head forcing you to use DLSS? I don't get it. DLSS is provided as an option. Period. You can choose to use it or ignore it. If you don't like the 1080p image with DLSS, run it at 540p native. It's your choice. No technology is perfect.
 
Reminds me of Crysis, at release the best hardware available couldn't maintain a playable framerate in that either. The big difference is that Crysis looked massively better than most other games when it released. Alan Wake 2 is not such a big step forward.

The fact that a $1600 GPU can only manage about 30 FPS at native 4K tells me that we're still a long way from mainstream 4K gaming though. While TVs have decidedly moved on to 4K, it's too much for all but the most expensive PCs. If you're rendering at 1440p anyway, it'll probably look the same or better on a native 1440p panel than upscaled to a 4K one.
Crysis looked great on high settings and above, but medium and low looked worse than Far Cry on max while being significantly more demanding. Granted, Crysis on max settings with DX10 was light years ahead of anything else, and that lasted for a few years. In Crysis you were mostly stuck on a mix of medium and low to get 40-ish fps unless you had an 8800 series card, and the 8800 series cards were about the same price as the 4080 and 4090 when you adjust for inflation. Alan Wake 2 at least looks good on older hardware, but I don't think it scales that well frame-rate-wise.

The big issue nowadays is that games are starting to rely on upscaling, and upscaling on a 1080p monitor probably isn't very good; at the least, anything below DLSS Quality is a letdown on a 1440p monitor, and even DLSS Quality is noticeably worse on my 1440p monitor compared to DLAA (blurrier, with less detail).
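For concreteness, here's a quick sketch of the internal render resolutions those modes correspond to on a 1440p panel. The per-axis scale factors are the commonly quoted DLSS defaults and can vary per game, so treat the numbers as approximate:

```python
# Approximate internal render resolutions per DLSS mode on a 2560x1440 panel.

PANEL = (2560, 1440)

DLSS_MODES = {
    "DLAA":              1.0,    # native-res AA, no upscaling
    "Quality":           2 / 3,  # ~66.7% per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

for mode, scale in DLSS_MODES.items():
    w, h = round(PANEL[0] * scale), round(PANEL[1] * scale)
    print(f"{mode:>17}: {w} x {h}")

# Quality lands around 1707x960, which is why even it can look soft next to
# DLAA on a 1440p screen, and the lower modes drop well below 1080p.
```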
 
Well, is someone pointing a gun at your head forcing you to use DLSS? I don't get it. DLSS is provided as an option. Period. You can choose to use it or ignore it. If you don't like the 1080p image with DLSS, run it at 540p native. It's your choice. No technology is perfect.
Your alternatives are DLAA or FSR, as there is no menu option for turning them off. AFAIK FSR native is broken, so you basically need to start config editing if you want to run native res without an Nvidia card.
Because some of us can see and feel the difference, and it's crap.
DLSS Quality is passable on a 1440p monitor, but native is certainly better. I wonder if a lot of the people who can't see the difference spent all their money on the GPU to run the game at max on a small monitor with so-so picture quality, rather than getting a good medium or large monitor with good picture quality and colors.
 
DLSS Quality is passable on a 1440p monitor, but native is certainly better. I wonder if a lot of the people who can't see the difference spent all their money on the GPU to run the game at max on a small monitor with so-so picture quality, rather than getting a good medium or large monitor with good picture quality and colors.
A sweeping generalization. I've seen DLSS provide better image quality than native (due to a poor TAA implementation) and I've also seen DLSS provide worse image quality than native. There are various factors in how AA and/or scaling contribute to image quality.
 
So what are you using instead? FSR?
I have a 27-inch 1080p 240 Hz monitor from Monoprice Dark Matter (an IPS-like AHVA panel made by KTC; it gets really bright in SDR and HDR, I believe Tom's tested it at over 500 nits, and contrast is notably higher than most IPS as well) and a 1440p 75 Hz Asus ProArt monitor. My hardware is good enough that I don't generally need to use upscaling. If something runs poorly at 1440p, I will simply play it on the 1080p monitor. And actually, I like the look of that 1080p monitor so much that I use it often anyway. Great contrast and minimal glow.
What analysis? I am all for critique and constructive opinion. All I saw was whining and poor reasoning.
I don't consider my comments to be whining or non-constructive.

DLSS in AW2 trashes rain effects. And it has noticeable ghosting, especially when you activate Ray Reconstruction.

There are certainly some positives, such as good anti-aliasing and temporal stability for the edges of objects or thin objects, such as fencing, tree branches, etc. But for me, the above-mentioned issues ruin it.
Ruining rain effects is an old problem for DLSS, which I first observed in Death Stranding. That's a game I prefer to play with just the FXAA setting: no DLSS and no FidelityFX CAS.

And now I have responded to myself.
 
A sweeping generalization. I've seen DLSS provide better image quality than native (due to a poor TAA implementation) and I've also seen DLSS provide worse image quality than native. There are various factors in how AA and/or scaling contribute to image quality.
I have yet to see it look better than native, and I have played quite a few games with DLSS support, but I've only used it in Cyberpunk and Control, as it was needed to get a smooth experience with RT on. DLAA, on the other hand, is a really nice feature.
 
^"just fine" rolf

People don't spend that kind of money to turn settings down...
And the point about Crysis isn't even comparable; flagship cards back then were a fraction of the price they are these days, around $500 or so, which was easy enough for just about anyone to afford.
I had a pair of 8800 GTX cards (the flagship at the time) that I bought to play Crysis, which cost $1,600 in 2023 money (only $100 less than my RTX 4090). I had no issue lowering graphics settings to get to 75 Hz on my 1600x1200 monitor.
 
I had a pair of 8800 GTX cards (the flagship at the time) that I bought to play Crysis, which cost $1,600 in 2023 money (only $100 less than my RTX 4090). I had no issue lowering graphics settings to get to 75 Hz on my 1600x1200 monitor.
A pair of 8800 GTX cards would be about $1900-$2000 in today's money, while a single Ultra would be in the $1250-$1300 range.
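For anyone who wants to sanity-check those figures, here's a rough back-of-envelope calculation. The launch MSRPs are the commonly cited ones and the ~1.5x CPI multiplier (2006/2007 to 2023) is an approximation, so expect the results to land a bit below or above the numbers quoted above:

```python
# Back-of-envelope inflation check for the 8800-era prices mentioned above.

CPI_FACTOR_2006_TO_2023 = 1.5   # approximate cumulative US CPI inflation

launch_msrp = {
    "GeForce 8800 GTX (Nov 2006)": 599,
    "GeForce 8800 Ultra (May 2007)": 829,
}

for card, usd_then in launch_msrp.items():
    usd_now = usd_then * CPI_FACTOR_2006_TO_2023
    print(f"{card}: ${usd_then} then ~= ${usd_now:.0f} in 2023 dollars")

pair_now = 2 * launch_msrp["GeForce 8800 GTX (Nov 2006)"] * CPI_FACTOR_2006_TO_2023
print(f"Pair of 8800 GTX for SLI: ~${pair_now:.0f} in 2023 dollars")
```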
 
I was there thinking my 8800 GTS 320MB could handle Crysis. It wasn't even a few months old when Nvidia dropped the 8800 GT, which blew my card out of the water in Crysis, so I ended up buying an EVGA 8800 GT Superclocked. Those years of gaming were rough for me; I had to leave my SLI 7900 GTO rig behind.

Alan Wake 2 would have been a better place to fit FSR 3 into the mix. I finally got to see the Arc A770 using frame gen in Forspoken and it runs with no issues, which makes me wonder what may be next from Intel, knowing their video cards can use that technology just as well as AMD's or Nvidia's.
 
Well, is someone pointing a gun at your head forcing you to use DLSS? I don't get it. DLSS is provided as an option. Period. You can choose to use it or ignore it. If you don't like the 1080p image with DLSS, run it at 540p native. It's your choice. No technology is perfect.
thanks, tips.
 
I had a pair of 8800 GTX cards (the flagship at the time) that I bought to play Crysis, which cost $1,600 in 2023 money (only $100 less than my RTX 4090). I had no issue lowering graphics settings to get to 75 Hz on my 1600x1200 monitor.
I kinda just pulled that number from an orifice, tbh, lol. I didn't recall cards being that pricey back then. I think I was running an HD 3870, or maybe it was a 4870, at the time (I did have both), which were nowhere near that price IIRC. I remember it running pretty well for the most part at 1280x1024, but I also remember the perf going to shit once you got to the aircraft carrier.
 
The Northlight game engine has always produced a very slightly blurry image. DLSS also (usually) results in a very slightly blurry image. This game combines both, so I'm not surprised that some people will be put off by the blurriness.

The sharpening value in the ini file defaults to zero; you can raise it to something like 0.8, which will help a fair bit.
 
The Northlight game engine has always produced a very slightly blurry image. DLSS also (usually) results in a very slightly blurry image. This game combines both, so I'm not surprised that some people will be put off by the blurriness.

The sharpening value in the ini file defaults to zero; you can raise it to something like 0.8, which will help a fair bit.
Turning off the lens distortion effect via the .ini config file is the single biggest improvement to visual clarity.
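If you want to poke at those config values yourself, here's a small sketch that just lists the candidate entries. The renderer.ini path is an assumption about where the game keeps its config, the script only reads and doesn't modify anything, and you should verify the actual key names on your own install before editing:

```python
# Helper to locate the sharpening / lens-distortion entries in the renderer
# config before editing them by hand. The path is an assumption; adjust it
# to wherever your install keeps renderer.ini. Read-only, nothing is changed.

import os
from pathlib import Path

# Assumed location; verify on your own system.
cfg = Path(os.environ.get("LOCALAPPDATA", "")) / "Remedy" / "AlanWake2" / "renderer.ini"

if cfg.exists():
    for line in cfg.read_text(encoding="utf-8", errors="replace").splitlines():
        if any(k in line.lower() for k in ("sharp", "lens", "distort")):
            print(line.strip())   # candidate keys to raise (~0.8) or disable
else:
    print(f"Config not found at {cfg}; check your install path.")
```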
 
I remember purchasing the 8800 GT just to play Crysis, and it ran like shit. There was literally no card that could max out Crysis at that time.

I guess you need the 4090 if you wanna max out at 4K; there's no way around it. At anything lower than 4K it's decent on the 4080/4070. That's about it.
 
I remember purchasing the 8800 GT just to play Crysis, and it ran like shit. There was literally no card that could max out Crysis at that time.

I guess you need the 4090 if you wanna max out at 4K; there's no way around it. At anything lower than 4K it's decent on the 4080/4070. That's about it.
yes
 
GeForce Hotfix Driver Version 546.08

This hotfix addresses the following issues:

-[Alan Wake 2] Addressing gradual stability and performance degradation over extended periods of gameplay [4334633]
-Windows 10 transparency effects are not displaying correctly after driver update [4335862]
-Random Bugcheck may be observed on certain systems [4343844]

https://nvidia.custhelp.com/app/answers/detail/a_id/5492?=&linkId=100000224743375
 
GeForce Hotfix Driver Version 546.08

This hotfix addresses the following issues:

-[Alan Wake 2] Addressing gradual stability and performance degradation over extended periods of gameplay [4334633]
-Windows 10 transparency effects are not displaying correctly after driver update [4335862]
-Random Bugcheck may be observed on certain systems [4343844]

https://nvidia.custhelp.com/app/answers/detail/a_id/5492?=&linkId=100000224743375
What is a bugcheck? BSOD? Also, when is the official driver coming?
 
What is a bugcheck? BSOD? Also, when is the official driver coming?
Bug Check is the official name for a BSOD, yes. The hotfix driver is an official driver. If you mean a "Game Ready" driver, it will release whenever the next one is scheduled to be released.

This is put on every page when a hotfix driver is released:

A GeForce driver is an incredibly complex piece of software. We have an army of software engineers constantly adding features and fixing bugs. These changes are checked into the main driver branches, which are eventually run through a massive QA process and released.
Since we have so many changes being checked in, we usually try to align driver releases with significant game or product releases. This process has served us pretty well over the years but it has one significant weakness. Sometimes a change that is important to many users might end up sitting and waiting until we are able to release the driver.
The GeForce Hotfix driver is our way of trying to get some of these fixes out to you more quickly. These drivers are basically the same as the previously released version, with a small number of additional targeted fixes. The fixes that make it in are based in part on your feedback in the Driver Feedback threads and partly on how realistic it is for us to quickly address them. These fixes (and many more) will be incorporated into the next official driver release, at which time the Hotfix driver will be taken down.
To be sure, these Hotfix drivers are beta, optional and provided as-is. They are run through a much abbreviated QA process. The sole reason they exist is to get fixes out to you more quickly. The safest option is to wait for the next WHQL certified driver. But we know that many of you are willing to try these out. As a result, we only provide NVIDIA Hotfix drivers through our NVIDIA Customer Care support site.
 
Bug Check is the official name for a BSOD, yes. The hotfix driver is an official driver. If you mean a "Game Ready" driver, it will release whenever the next one is scheduled to be released.

This is put on every page when a hotfix driver is released:

A GeForce driver is an incredibly complex piece of software. We have an army of software engineers constantly adding features and fixing bugs. These changes are checked into the main driver branches, which are eventually run through a massive QA process and released.
Since we have so many changes being checked in, we usually try to align driver releases with significant game or product releases. This process has served us pretty well over the years but it has one significant weakness. Sometimes a change that is important to many users might end up sitting and waiting until we are able to release the driver.
The GeForce Hotfix driver is our way of trying to get some of these fixes out to you more quickly. These drivers are basically the same as the previously released version, with a small number of additional targeted fixes. The fixes that make it in are based in part on your feedback in the Driver Feedback threads and partly on how realistic it is for us to quickly address them. These fixes (and many more) will be incorporated into the next official driver release, at which time the Hotfix driver will be taken down.
To be sure, these Hotfix drivers are beta, optional and provided as-is. They are run through a much abbreviated QA process. The sole reason they exist is to get fixes out to you more quickly. The safest option is to wait for the next WHQL certified driver. But we know that many of you are willing to try these out. As a result, we only provide NVIDIA Hotfix drivers through our NVIDIA Customer Care support site.
A ok thx.
 

lol, rant. Glad he doesn't make games where every character is predictable or only does what's logical. For me that makes the game very fun: the characters do weird, unexpected things that I find illogical at times, or more exactly, they put you in a situation way out of your comfort zone in the story. Still having fun with the game.
 
lol, rant. Glad he doesn't make games where every character is predictable or only does what's logical. For me that makes the game very fun: the characters do weird, unexpected things that I find illogical at times, or more exactly, they put you in a situation way out of your comfort zone in the story. Still having fun with the game.
Whether you agree with his take or not, at least it's mildly amusing. :)
 