GeForce RTX 2080 Ti in NVLink: 4K 120FPS Gaming Is Now Here

cybereality

[H]F Junkie
Joined
Mar 22, 2008
Messages
8,789
https://www.tweaktown.com/articles/...i-nvlink-4k-120fps-gaming-now-here/index.html

[Chart: NVIDIA GeForce RTX 2080 Ti NVLink enables 4K 120 Hz gaming]
 
That's some very impressive scaling, but I'm going to stick to a single card to see just how well NVLink is supported by the rest of the industry.
 
Does NVLink still introduce a full frame of input lag like SLI did? Going from 2x970 to 1x980 Ti made a clear difference in responsiveness.
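(For scale, AFR's traditional one-frame penalty is easy to put in milliseconds; a rough back-of-envelope assuming a steady frame rate, since alternate frame rendering queues roughly one extra frame:)

```python
# Rough estimate of the extra input lag AFR (alternate frame rendering)
# adds: approximately one frame time, since frames are rendered one GPU ahead.
def frame_time_ms(fps: float) -> float:
    """Duration of a single frame in milliseconds at the given frame rate."""
    return 1000.0 / fps

for fps in (60, 120):
    print(f"{fps} fps -> ~{frame_time_ms(fps):.1f} ms of added AFR latency")
```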
 
Yes, in well supported titles a 1080 Ti SLI rig could outpace a single 2080 Ti, but not in many games.

Usually the games I play use SLI. Will BFV use SLI? With CoD it's been a no the last time through.

The last Tomb Raider game I played through, I used a 1080 Ti (borrowing my son's card). Thinking of doing the same if needed.
 
Ah, 4K/120 is finally possible for well supported mGPU games and it only costs a cool $2500 for the GPUs and another $1500-$2000 for a capable monitor. So it's only about $4-5k in just upgrades to an existing decent gaming PC to turn it into a 4K/120 machine...

I'll wait a few more years when the pricing is more realistic. :)
 
Yeah, new name for SLI. The main difference is that it has 50x the transfer bandwidth of SLI.
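(A quick sanity check of that "50x" figure, using commonly cited approximate numbers; exact bandwidth varies by source and bridge generation, so treat these as rough assumptions:)

```python
# Commonly cited figures (approximate; vary by source):
sli_bridge_gbs = 1.0   # classic SLI bridge, roughly 1 GB/s
nvlink_gbs = 50.0      # RTX 2080 Ti NVLink, roughly 50 GB/s per direction

ratio = nvlink_gbs / sli_bridge_gbs
print(f"NVLink is roughly {ratio:.0f}x the classic SLI bridge")
```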
 
Enjoy 4:2:2. “But it doesn’t make a difference in games!!”
Yes it actually does.
DisplayPort 1.4 has enough bandwidth for full RGB at 4K 120 Hz. It's only limited with HDR.
AA doesn't really matter. I find myself just using FXAA these days since SMAA is inconsistent depending on the game for some reason, and don't get me started on TAA. But AO off? Then what is the fucking point?
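(The DP 1.4 claim checks out on a back-of-envelope: HBR3 carries about 25.92 Gbit/s of payload after 8b/10b encoding. The sketch below counts active pixels only; real links also carry blanking overhead, so the margins are tighter than shown:)

```python
# DP 1.4 HBR3 effective payload after 8b/10b encoding, in Gbit/s.
DP14_PAYLOAD_GBPS = 25.92

def video_gbps(w: int, h: int, fps: int, bits_per_channel: int) -> float:
    """Active-pixel bandwidth for RGB (3 channels), ignoring blanking."""
    return w * h * fps * bits_per_channel * 3 / 1e9

for bpc in (8, 10):  # 8-bit SDR vs 10-bit HDR
    need = video_gbps(3840, 2160, 120, bpc)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "does NOT fit"
    print(f"4K120 RGB {bpc}-bit: {need:.1f} Gbit/s -> {verdict}")
```

So 8-bit RGB squeezes in, while 10-bit HDR exceeds the link and forces chroma subsampling, matching the claim above.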
 
Eh, glad they list them. I'll gladly sacrifice AO or just about anything to get the FPS to a more consistent ~144. I personally don't like TAA much either and might not turn it on regardless of performance. Though in SP games very high FPS isn't a big concern of mine, and I lean more on G-Sync there than in MP games.
 
TAA or FXAA wouldn't impact the FPS much, but disabling the AO is just stupid.

"hey, look at this amazing 4k monitor! It has 8 million pixels to clearly see all the areas where the missing AO would make the game look better!"
 
Again that's the problem, you're always going to have to make cuts somewhere in most games. Now if I get a 2080 ti I might have to make very few if any cuts @ 1440p. I guess the conclusion we might draw from tomb raider is that you won't be playing that at 4k 120 fps unless you want to spend a ton of cash.
 
As a 4K gamer, I don't see the point in completely removing features that add detail just to get high FPS. Some features, like shadows and AA, can be SLIGHTLY reduced without much noticeable effect, but I just don't get the idea of having such a high resolution only to render N64 levels of image quality...
 
Yeah I see your point. If I was a 4K/60hz gamer I wouldn't care about getting higher than 60 fps either and of course could tolerate some dips as long as it doesn't go too low.
 
I think a lot of this is all bullshit at 4K and is made up unless you magnify your screenshots beyond what you would actually perceive in high-speed gaming, especially on all the 27- to 40-inch 4K monitor screens where 4K makes minimal difference. I had a 43" LG 4K 60 Hz with HDR (waiting on a replacement projector) and could not see a damn difference with any form of AA on or off. Sure, take a screenshot, put your head against the monitor, and you barely see a difference. AA at 4K is a joke at best IN MY OPINION, and some of it factual. You went from 2 or 3 million pixels to 8.3 million on that same small screen, correct?

I sit here with this opinion because I play at 4K on a 100" screen. I turn that shit off. I have tried an HT2550 DLP 4K, Optoma UHD 60, Vivtek 2288, and this Epson faux 4K and not one damn PJ has shown a major difference with AA on or off on 100" screen. Only a useless performance hit. And before anyone says anything about the Shadow of Tomb Raider test I did, that was a standard test everyone else was running. I turn that crap off in everyday gaming.

Now JVC has an expensive ass 8K eshift projector coming out and Samsung a Qled 85" 8K tv coming out. LG also. Sony by CES 2019. Once 8K hits relevance by 2020, are you seriously going to sit here and tell everyone the need for AA????

Please don't quote me. Nothing is going to change my opinions and facts that I have seen and tested. And if you tell me you can see the difference, without screenshoting and zooming in GREAT!! I am calling the government because one of their super cyborgs greater 20/05 vision has escaped!!
 
^^ LOL

Ya must be blind man. Don’t know what else could be wrong. Anybody that can’t tell AA on/off on a 43 has eye problems.
That’s what I game on and it’s night/day. I’ve had both a 40 and a 43 and also a 75.

PJs are blurry messes to begin with - one of many reasons not to own a PJ setup. You’ve got blurry AA built in, lol. Just like a rear-projection setup.
Free blur for your enjoyment. No thanks.
 
SnowBeast, you're probably right about that on a projector. I have a 125" 1080p projector, and the image is soft even without AA, so no problems there.

But on my 55" 4K TV, I can clearly see aliasing if no AA is enabled. I never really liked MSAA, as I felt the performance cost was too high, but post-process AA is acceptable to me and does improve the look of the image. Granted, I sit about 2 feet away from my 55" TV (used as a monitor), so the pixel density is probably similar to a 24" 1080p monitor at that distance. So AA is still very much of use.
 
Yep aliasing has been perfectly visible for me on 4k panels from 27 to 32 inches at about 2 feet away. A projector is not the same thing sitting 12 feet away.
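(The "55-inch 4K up close is like a 24-inch 1080p monitor" comparison above is easy to check with a pixel-density calculation; the screen sizes below are the ones mentioned in the thread:)

```python
import math

# Pixel density (pixels per inch) from diagonal size and resolution.
def ppi(diagonal_inches: float, w: int, h: int) -> float:
    return math.hypot(w, h) / diagonal_inches

print(f'55" 4K:    {ppi(55, 3840, 2160):.0f} PPI')   # ~80 PPI
print(f'24" 1080p: {ppi(24, 1920, 1080):.0f} PPI')   # ~92 PPI
print(f'43" 4K:    {ppi(43, 3840, 2160):.0f} PPI')   # ~102 PPI
```

At the same ~2-foot viewing distance the 55" 4K panel is actually a bit coarser than a 24" 1080p monitor, which is consistent with aliasing still being visible there.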
 
Remember, still-frame image quality is a TOTALLY different beast to motion quality. Jagged, aliased images look MUCH worse in motion. So even though you may not be able to see much difference still-to-still, in motion, on an uncompressed video or source output, it is obvious. This is why TAA is so popular even though "it's just a blur filter lololol": it looks SO much better in motion.
 
I would never turn AO off. Doing so would be an immersion killer. I'd turn the quality down to basic SSAO to save frames, but I'd never turn it all the way off. SSAA is a nice benefit to have if there are frames to spare, but it's not going to make or break the experience.
At 4k do you even need AA?
No matter the resolution, you at least need a post process AA method, otherwise you're going to get shimmering on all of those wonderful specular effects made in the pixel pipeline. SMAA is usually a happy medium between barebones FXAA and aggressive TAA, but like I said it seems the algorithm for it isn't consistent between developers.
 
Different strokes I guess. No setting is really an immersion killer for me.
 