Well, based on TechPowerUp's reviews, Anno 1800 at 4K on its highest settings (Ultra?) averages in the low 60fps range on the 3080 and in the high 60fps range on the 3090. For comparison, the older 2080 Super and 2080 Ti hover around the low-to-mid 40fps range at the same resolution and graphics settings.
Guru3D did a video memory consumption test for Anno 1800, and usage never went much above 5GB at 4K.
Looking around on YouTube, ARK at 4K with Epic settings was producing average framerates in the low-to-mid 40s on the 3080, while the 3090 was averaging in the high 40s to low 50s.
So unless you are willing to drop some image quality, don't expect either card to push those games to 120fps.
For the games you mentioned and for ones that are not memory limited, I would guess the 3090 will give you about 10% higher framerates than the 3080 when both are running at stock clocks.
3080 owner here. My main display is an LG CX running at 4K120.
If you absolutely have to have 4K120, the RTX 3090 is the closest thing you're going to get to that in modern titles. The truth is that the RTX 3090, as powerful as it is, is still not enough to do native 4K120 in the latest games.
If you're not running anything crazy (as you said), the RTX 3080 is more than enough. I have yet to run into a situation where the 10GB of VRAM on the RTX 3080 is a limiting factor at 4K. I'm still waiting for someone to name a title that is actually VRAM limited.
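If you want to check for yourself whether a title is actually VRAM limited, you can poll `nvidia-smi` while the game is running and watch used vs. total memory. A minimal sketch (the function names here are my own; it assumes `nvidia-smi` is on your PATH and an NVIDIA driver is installed):

```python
import subprocess

def parse_smi_line(line):
    """Parse one 'used, total' CSV line from nvidia-smi, e.g. '5123, 10240' (MiB)."""
    used, total = (int(x) for x in line.strip().split(", "))
    return used, total

def gpu_memory_usage():
    """Return (used_mib, total_mib) for the first GPU via nvidia-smi's CSV output."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    # One line per GPU; take the first one.
    return parse_smi_line(out.splitlines()[0])
```

Run it in a loop (or just watch `nvidia-smi -l 1`) while playing: if usage sits well below the card's total, as it does for Anno 1800 at 4K, VRAM isn't your bottleneck.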
Well, considering the 3080 and 3090 are both basically impossible to get, I would honestly say grab whichever one you can get your hands on first. The difference between the two isn't big. Pushing 120fps will require settings reductions or DLSS, both of which also make the game less memory intensive. Hitting 120fps at 4K will not be a memory-limited problem for the foreseeable future--unless you are playing old RPGs and modding them to the moon with texture upgrades, in which case more VRAM will be better.