Will we ever need more than 1 top end card from now on? Will SLI ever be seamless hardware wise?

Right. VHS tapes were around 352x240 resolution, and they still looked real.

There are probably a lot of things that can be done to improve graphics beyond just bumping the resolution.

I mean, I've gone from 1080p -> 1440p Surround -> 4K -> back to 1080p ultrawide, and I actually like my current setup best.

Resolution is not everything; compare a 1080p game from 10 years ago to a 1080p game today and it's a world of difference.
 
Went from 4K to 1440p HDR 144Hz -> HDR is definitely better, and 144Hz keeps everything pretty much in FreeSync range; when you're above it, it just doesn't matter that much. I'd think the next step up would be 4K+ at 144Hz+ with HDR at 1000 nits. The problem with HDR is that not enough games support it well, but when they do, it is amazing.
 
Yeah, but it runs ridiculously well on just about anything already, so I'm not sure what we'd get out of it if it did support it. In a recent video, id was talking about how they've seen 400 FPS in Doom Eternal on some of their internal hardware, with an upper limit (obviously in the future) of 1000. Let's assume for a moment that on top-end consumer hardware (a single GPU) we can get 200 FPS. What more could you ask for in a game that looks that good?
4K with ray tracing in the future?
 
I just went 4K, HDR1000 @ 120Hz here... the color quality and smoothness of 4K is amazing. I can't go back, no way in hell. The 2080 Ti does a decent job of driving most of the games I have, maintaining 75~90 FPS fully maxed. The only exception is RDR2, where I had to compromise on a few settings, but the HDR still makes it look way better than it did on my 2K IPS monitor.

Went 1080P 120Hz --> 2K 144Hz IPS Gsync --> 4K 120Hz IPS Gsync HDR1000
 
You must not own very many demanding games if Red Dead Redemption 2 is the only one you have that can't "maintain 75 to 90 FPS fully maxed". There are plenty of games that can't even average anywhere near that, much less maintain it.

Edit: In fact, why don't I name plenty of games that most certainly don't maintain 75 to 90 FPS fully maxed at 4K. Even though you said fully maxed, I'll be sensible and leave off ray tracing and any demanding forms of anti-aliasing. Other than that, I do mean fully maxed and not just the highest preset, because many times you will have to manually adjust a few options. I doubt any of these games can even maintain 60 FPS, as most can't even average it. Heck, a few can't even average 50 FPS and will have minimums in the 40s and 30s.

Hellblade: Senua's Sacrifice
Borderlands 3
Quantum Break
Rage 2
Gears 5
The Outer Worlds
Assassin's Creed Odyssey
Deus Ex: Mankind Divided
Mafia III
The Surge 2
Final Fantasy XV
Ghost Recon Wildlands
Ghost Recon Breakpoint
Monster Hunter: World
Control
Watch Dogs 2
Kingdom Come: Deliverance
 

Yeah, TBH, I don't own any of those games on that list except Control. I generally play 1 SP game at a time and only keep 1 or 2 MP games to fill in the gaps.

Active games I have installed:
- RDR2
- BFV
- SOTTR
- Control (forgot this one, very demanding if using indirect diffuse lighting)
- Wolfenstein: Youngblood
- Metro Exodus

But, from my perspective, everything runs great and smooth for me.

Don't get me wrong, I wish SLI was widely supported; I'd buy another 2080 Ti for sure. But sadly, support dropped off, which is why I went from two 1080s to a single-card solution. That, and I absolutely wanted to experience RT and DLSS.
 
I didn't know "fully maxing" image quality was a prerequisite to running a game in 4K resolution. But out of the games I've played in that list:
  • Hellblade - all in-game settings, never experienced any performance issues
  • Rage 2 - all in-game settings, runs mostly around 60-70 FPS peaking in the 90s and never dropping below 50
  • Gears 5 - all in-game settings, has occasional drops into the 50s but stays above 70 FPS 90% of the time
  • Kingdom Come: Deliverance - rendering distances turned down by around half, more of a CPU bottleneck than a GPU
I have FFXV and Deus Ex, but haven't played them since getting my 4K monitor and 2080 Ti. Previously played them on a Titan X with a PG278Q.
 
I was quoting him, as he was the one who mentioned fully maxed. And not experiencing any issues has nothing to do with what I said about actual frame rate: he named a specific frame rate, and I addressed that in the games I mentioned, since they were not capable of it.
 
Just FYI, LG's current OLED TVs (B9 and up) actually support 4K 120Hz (with VRR/G-Sync/FreeSync as well). Same deal with Samsung's Q90R, just with no G-Sync, only FreeSync. There are just no graphics cards out now with HDMI 2.1 on them to test them with. Dunno why they don't just put DisplayPort on high-end TVs at this point.
I just got one! The C9, since the B9 was out of stock. I upgraded from a circa-2008 Vizio we got as a hand-me-down from my wife's grandparents. It looks bonkers good by comparison, and it's my first high-FPS screen, so I am loving it. But I am waiting with bated breath for the DisplayPort-to-HDMI 2.1 cables announced recently, so I can hit that sweet 4K 120Hz, at least for older games and some FPS games like Destiny 2.
 
I got a 65" B9 last Black Friday on sale for $1,600, so I couldn't pass that up since I'd just moved and needed a new TV for the last half of last year, hah. It's definitely been an awesome TV so far, and I can't wait for the next-gen consoles to come out to take advantage of the VRR capabilities. I've been tempted to drag my RTX 2080 rig in there to play around with G-Sync at 1440p 120Hz and HDR (even though I already have all that on my UW PC monitor). The blacks and contrast constantly amaze me, so much that any other LCD/LED TV I've seen pales in comparison, and from what I've seen the B9 gets plenty bright enough for me as well. I'd like to see the C9 in person to see if you can tell much of a difference in bright HDR scenes, since that's pretty much the biggest difference between the B9 and C9.
 
When I first turned it on, it was too bright for the room and the image I had on it, which was a relief, since I'd heard brightness is the only drawback of OLED. I got the 55" with the 5-year Best Buy warranty, since it covers burn-in. I really just didn't want to wait for an order to come in, since I had just completed my 3900X / 2080 Ti build, and the price bump wasn't enough to make me wait; even the Best Buy rep was saying there was almost no discernible difference. But yeah, I bought the C9 almost exclusively for the G-Sync, and it is such a difference from 1080p 60Hz with no G-Sync. I am not even really playing games much on it yet, just gawking, and seeing the eye candy in my games is distracting enough. I am having a little trouble with all the options, like figuring out what has HDR, what doesn't, and when it doesn't matter. Not to mention the ARC passthrough is being finicky on my older receiver, but playing games at 1440p 120Hz with G-Sync is just awesome.
 
I’m also fine with it going away. If id can make the Doom games fly like they do on modest(ish) hardware and scale up from there, then other devs should work a bit harder to get their games looking good on a similar range of hardware. I guess they all can’t be id though can they :D

Checkerboard... hmm... sounds like PowerVR's tile rendering. :D That was always a good idea, just never implemented well until the fancier mobile chipsets, really. It never hit its stride on PC platforms. Of course, I need to read the article; it could be nothing like it. I just thought of PowerVR when I read that.
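For anyone who hasn't seen it: the rough idea of checkerboard rendering is to shade only half the pixels each frame in an alternating checker pattern and fill in the rest from the previous frame (real engines also reproject those pixels with motion vectors and do smarter filtering). A toy sketch of just the alternating-mask idea, not any engine's actual implementation:

```python
def checker_mask(width, height, frame):
    # Shade a pixel this frame only when (x + y + frame) is even,
    # so the shaded half alternates between the two checker phases
    # on consecutive frames.
    return [[(x + y + frame) % 2 == 0 for x in range(width)]
            for y in range(height)]

def resolve(prev_frame, shaded, mask):
    # Keep freshly shaded pixels where the mask is set; reuse the
    # previous frame's pixel everywhere else (this is where a real
    # engine would reproject and filter instead of copying).
    return [[s if m else p
             for p, s, m in zip(prow, srow, mrow)]
            for prow, srow, mrow in zip(prev_frame, shaded, mask)]
```

Over any two consecutive frames every pixel gets freshly shaded exactly once, which is where the roughly-half shading cost comes from.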
I hope checkerboard rendering goes somewhere. I sprung for an X570 Taichi so I could have NVLINK support in the future. I’m hoping to grab another 2080ti instead of a newer model
 
I'd love to see them come up with something that works reliably. I know they're capable of doing it. I'm guessing mGPU just hasn't been their main focus for a long time; they've sort of kept it alive and supported as a by-product, maybe (on the gaming side, anyway).

Personally, I'll probably just keep following the same rule I have for the last few years: buy the new x70 card and be happy. :D I haven't dumped money into a top-end card, let alone multiple top-end GPUs, for a long time. My GTX 295 and pair of GTX 285s were probably the last time. Since then, I typically use a single x70. It's kind of the sweet spot for how I play games.

That said, I love seeing people build insane multi-GPU rigs. I just have too many expensive hobbies to devote that kind of time and money to games these days.
 
I am in the same boat. My rule is to buy the best used card around $200 when my current one is chugging along and I am not getting enough performance. Currently using an R9 Fury Nano I picked up for $100 last year, and it's serving me well since I went SFF.

I am hoping that this next gen of AMD cards, along with the next console gen will drive the used market way down.
 
I was actually very happy with my GTX-1070 still. It played everything I threw at it perfectly at the settings I like to play at. I could have probably run it for another year or two actually. :D However, I got an itch to dabble with RTX, and got a decent deal on a 2070 Super, so I got one. If it wasn't for me being RTX-curious I would have just stayed with the 1070 for a lot longer. I am glad that I got the 2070S though, as Doom will be nice on it, and I'll eventually start grabbing more RTX titles. I'm actually looking forward to more source ports and updates of old games using RTX than new ones if I'm honest. I'd love to see it implemented in Quake, Q3A, Doom 3, Quake 4, maybe System Shock 2, etc. (wishful thinking I know, but you never know... I really enjoyed the Q2 RTX update.)
 
Are you gaming at 720p?

just kidding :D :D

I had a GTX1070, nice card.
 
Close. I play all my games at 1080. I have an 86" 4K TV in the living room, but still play at 1080 on that, and at my desk. I just like to crank up the eye candy features in games, I like synced frames at 60 or 120, and that's pretty much it. I'll make a resolution jump when I can comfortably do it on an xx70 class card. I'm much more sensitive to time-related artifacts than I am pixels. I still play a ton of pixel-art games, classic games, retro games, etc. That's not to say that I don't like a nice crisp image, but I'm more into what's being done on the screen than the resolution it's being done at. (within reason of course) I'm also into effects, so things like Doom and Atomic Heart are very interesting to me, where the latest-best-realism-whiz-bang-military-simulation-game-plusplus are not quite as interesting.
 
I'm still mostly on my beater: a GTX 970 backed by an 8700K.

I remain impressed, despite having to turn nearly every setting down to medium or low (or off!).
 
I have never really felt the need for SLI, TBH. One card has always been enough to enjoy my games.
 