Let's put this shit into perspective. I was gaming at 2560x1600 in 2009.
And how do those 2009 pixels compare to 2020 pixels? Increased resolution is the worst way to use graphics power.
We don’t need more pixels. We need better ones.
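For scale (my arithmetic, not the poster's): that 2009 resolution is about half the pixel count of 4K, so the jump roughly doubles the shading work before anything else improves.

```python
# Pixel counts for the resolutions under discussion (simple arithmetic)
pixels_2009 = 2560 * 1600   # WQXGA, a high-end monitor in 2009
pixels_4k = 3840 * 2160     # UHD "4K"
ratio = pixels_4k / pixels_2009
print(pixels_2009, pixels_4k, round(ratio, 3))  # 4096000 8294400 2.025
```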
This. More pixels and better ones, like OLED.

No, actually we need more pixels. Real ones, obviously.

That's easy. Turn on DSR. Voila: 4x more pixels that look 10% better and crater performance.
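The DSR point, sketched: "4x DSR" renders at twice the width and height (4x the pixels) and filters the result back down to native resolution. A toy box-filter version (real DSR uses a Gaussian-style filter; the function name here is mine, not NVIDIA's):

```python
import numpy as np

def dsr_downscale(img, factor=2):
    """Average each factor-by-factor block of the supersampled image
    down to one native-resolution pixel (a plain box filter)."""
    h, w = img.shape
    assert h % factor == 0 and w % factor == 0
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# "4x DSR": render internally at 2x each dimension, display at native res
supersampled = np.random.rand(8, 8)   # stand-in for a frame rendered at 2x scale
native = dsr_downscale(supersampled)  # 4x the pixels rendered, 4x fewer shown
```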
DLSS can take the load off that. 4K with ray tracing in the future?

Yeah, but it runs ridiculously well on just about anything already. I'm not sure what we'd get out of it if it did support it. In a recent video, id was talking about how they've seen 400 fps in Doom Eternal on some of their internal hardware, with an upper limit (obviously in the future) of 1000. Let's assume for a moment that on top-end consumer hardware (single GPU) we can get 200 fps. What more could you ask for in a game that looks that good?
I just went 4K, HDR1000 @ 120 Hz here... the color quality and smoothness of 4K is amazing. I can't go back, no way in hell. The 2080 Ti does a decent job of driving most games I have, maintaining 75~90 FPS fully maxed. The only exception is RDR2, where I had to compromise on a few settings, but the HDR still makes it look way better than it did on my 2K IPS monitor.
Went 1080P 120Hz --> 2K 144Hz IPS Gsync --> 4K 120Hz IPS Gsync HDR1000
You must not own very many demanding games if Red Dead Redemption 2 is the only one you have that can't "maintain 75 to 90 FPS fully maxed". There are plenty of games that can't even average anywhere near that, much less maintain it.
Edit: In fact, why don't I name plenty of games that most certainly don't maintain 75 to 90 fps fully maxed at 4K. Even though you said fully maxed, I'll be sensible and leave off ray tracing and any demanding forms of anti-aliasing. Other than that, I do mean fully maxed and not just the highest preset, since many games make you manually adjust a few options beyond it. I doubt any of these can even maintain 60 fps, as most can't even average it. A few can't even average 50 fps and will hit minimums in the 40s and 30s.
Hellblade: Senua's Sacrifice
Borderlands 3
Quantum Break
Rage 2
Gears 5
The Outer Worlds
Assassin's Creed Odyssey
Deus Ex: Mankind Divided
Mafia III
The Surge 2
Final Fantasy XV
Ghost Recon Wildlands
Ghost Recon Breakpoint
Monster Hunter World
Control
Watch Dogs 2
Kingdom Come: Deliverance
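The average-vs-minimum distinction is worth making concrete. With a hypothetical frame-time trace (made-up numbers, not a benchmark), a run can average above 60 fps while the worst frames sit at 40:

```python
# Hypothetical frame times in milliseconds for a short run
frame_ms = [14, 15, 16, 14, 25, 15, 14, 22, 15, 14]

avg_fps = 1000 * len(frame_ms) / sum(frame_ms)  # frames over total time
min_fps = 1000 / max(frame_ms)                  # worst single frame
print(round(avg_fps), round(min_fps))           # 61 40
```

This is why "averages 60 fps" and "maintains 60 fps" are very different claims.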
I didn't know "fully maxing" image quality was a prerequisite to running a game in 4K resolution. But out of the games I've played in that list:
I have FFXV and Deus Ex, but have not played them since getting my 4K monitor and 2080 Ti. Previously played them on a Titan X with a PG278Q.
- Hellblade - all in-game settings maxed, never experienced any performance issues
- Rage 2 - all in-game settings maxed, runs mostly around 60-70 FPS, peaking in the 90s and never dropping below 50
- Gears 5 - all in-game settings maxed, has occasional drops into the 50s but stays above 70 FPS 90% of the time
- Kingdom Come: Deliverance - rendering distances turned down by around half, more of a CPU bottleneck than a GPU one

I was quoting him, as he was the one who mentioned "fully maxed". Not experiencing any issues has nothing to do with what I said about actual frame rates: he named a specific frame rate, and the games I listed are not capable of it.
Just FYI, LG's current OLED TVs (B9 and up) actually support 4K 120 Hz (with VRR/G-Sync/FreeSync as well). Same deal with Samsung's Q90R, just with no G-Sync, only FreeSync. There are just no graphics cards out now with HDMI 2.1 on them to test them with. Dunno why they don't just put DisplayPort on high-end TVs at this point.

I just got one! The C9, since the B9 was out of stock. I upgraded from a circa-2008 Vizio we got as a hand-me-down from my wife's grandparents. It looks bonkers good by comparison, and it's my first high-FPS screen, so I'm loving it. But I'm waiting with bated breath for the DisplayPort-to-HDMI 2.1 cables announced recently so I can hit that sweet 4K 120 Hz, at least for older games and some FPS games like Destiny 2.
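Back-of-envelope on why those cables matter (my numbers, ignoring blanking intervals and link-layer overhead): 4K 120 Hz at 10-bit color needs roughly 30 Gbit/s of pixel data, which is more than HDMI 2.0 can carry but fits within HDMI 2.1's 48 Gbit/s link.

```python
# Raw pixel data rate for 4K 120 Hz, 10 bits per channel RGB
width, height, refresh, bits_per_pixel = 3840, 2160, 120, 3 * 10
gbit_per_s = width * height * refresh * bits_per_pixel / 1e9
print(round(gbit_per_s, 2))  # 29.86
```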
I got a 65" B9 last Black Friday on sale for $1,600, so I couldn't pass that up since I had just moved and needed a new TV for the last half of last year, hah. It's definitely been an awesome TV so far, and I can't wait for the next-gen consoles to come out to take advantage of the VRR capabilities. I've been tempted to drag my RTX 2080 rig in there to play around with G-Sync at 1440p 120 Hz and HDR (even though I already have all that on my UW PC monitor). Those blacks and that contrast constantly amaze me, so much that any other LCD/LED TV I've seen pales in comparison. From what I've seen the B9 gets plenty bright enough for me as well, so I'd like to see the C9 in person to see if you can tell much of a difference in bright HDR scenes, since that's pretty much the biggest difference between the B9 and C9.

When I first turned it on it was too bright for the room and the image I had on it, which made me relieved, since I'd heard brightness is the only drawback of OLED. I got the 55" with the 5-year Best Buy warranty since it covers burn-in. I really just didn't want to wait for an order to come in since I had just completed my 3900X/2080 Ti build, and the price bump wasn't enough to make me wait; even the Best Buy rep said there was almost no discernible difference. But yeah, I bought the C9 almost exclusively for the G-Sync, and it is such a difference from 1080p 60 Hz with no G-Sync. I'm not even really playing games much on it yet, just gawking; the eye candy in my games is distracting enough. I'm having a little trouble with all the options, like figuring out what has HDR and what doesn't and when it doesn't matter. Not to mention ARC passthrough is being finicky on my older receiver, but playing games at 1440p 120 Hz with G-Sync is just awesome.
I'm also fine with it going away. If id can make the Doom games fly like they do on modest(ish) hardware and scale up from there, then other devs should work a bit harder to get their games looking good on a similar range of hardware. I guess they can't all be id, though, can they?

I hope checkerboard rendering goes somewhere. I sprung for an X570 Taichi so I could have NVLink support in the future. I'm hoping to grab another 2080 Ti instead of a newer model.

Checkerboard... hmm... sounds like PowerVR's tile rendering. That was always a good idea, just never implemented well until the fancier mobile chipsets, really. It never hit its stride on PC platforms. Of course, I still need to read the article; it could be nothing like it. I just thought of PowerVR when I read that.
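As usually described (the article may differ), checkerboard rendering shades only half the pixels each frame in an alternating checker pattern and reconstructs the rest; real implementations also reproject the previous frame. A toy single-frame reconstruction, all names mine:

```python
import numpy as np

def checkerboard_fill(rendered, mask):
    """Fill unrendered pixels with the average of their rendered
    4-neighbors. rendered holds zeros wherever mask is False."""
    vals = np.pad(rendered, 1)                  # zero-pad the borders
    cnt = np.pad(mask.astype(float), 1)
    neigh_sum = vals[:-2, 1:-1] + vals[2:, 1:-1] + vals[1:-1, :-2] + vals[1:-1, 2:]
    neigh_cnt = cnt[:-2, 1:-1] + cnt[2:, 1:-1] + cnt[1:-1, :-2] + cnt[1:-1, 2:]
    return np.where(mask, rendered, neigh_sum / np.maximum(neigh_cnt, 1))

# Shade only pixels where (x + y) is even this frame, then reconstruct
h, w = 6, 8
ys, xs = np.mgrid[0:h, 0:w]
mask = (xs + ys) % 2 == 0
scene = np.full((h, w), 5.0)                    # pretend scene: flat value
frame = checkerboard_fill(scene * mask, mask)   # holes come back as 5.0 here
```

Only half the pixels get shaded per frame, which is where the performance win comes from; the hard part in practice is hiding the reconstruction artifacts in motion.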
I'd love to see them come up with something that works reliably. I know they're capable of doing it. I'm guessing it (mGPU) just hasn't been their main focus for a long time. Just sort of keeping it alive and supported as a by-product maybe (on the gaming side anyway).
Personally, I'll probably just keep following the same rule I have for the last few years: buy the new x70 card and be happy. I haven't dumped money into a top-end GPU, let alone multiple top-end GPUs, for a long time. My GTX 295 and pair of GTX 285s were probably the last time, really. Since then I typically use a single x70; it's kind of the sweet spot for how I play games.
That said, I love seeing people build insane multi-GPU rigs. I just have too many expensive hobbies to devote that kind of time and money to games these days.
I am in the same boat. My rule is to buy the best used card around $200 when my current one is chugging and I'm not getting enough performance. Currently using an R9 Fury Nano I picked up for $100 last year, and it's serving me well since I went SFF.
I am hoping that this next gen of AMD cards, along with the next console gen will drive the used market way down.
I was actually very happy with my GTX 1070 still. It played everything I threw at it perfectly at the settings I like to play at. I probably could have run it for another year or two, actually. However, I got an itch to dabble with RTX, and got a decent deal on a 2070 Super, so I got one. If it wasn't for me being RTX-curious I would have stayed with the 1070 a lot longer. I'm glad I got the 2070S though, as Doom will be nice on it, and I'll eventually start grabbing more RTX titles. Honestly, I'm looking forward to source ports and updates of old games using RTX more than new ones. I'd love to see it implemented in Quake, Q3A, Doom 3, Quake 4, maybe System Shock 2, etc. (wishful thinking, I know, but you never know... I really enjoyed the Q2 RTX update.)
Are you gaming at 720p?
just kidding
I had a GTX1070, nice card.