Discussion in 'General Gaming' started by Comixbooks, Nov 19, 2018.
Nice vid. Pretty much what my own conclusions are.
At sub-27" sizes 4K makes no sense for me, but I went from a 42" 1080p to a 50" 4K and the difference is night and day. Not just for games but also for work.
I'll take 4K over high refresh any day of the week.
Linus is dumb.
I agree that the current cost-to-benefit ratio of 4K gaming is dumb for the average gamer.
A resolution bump is always beneficial to image quality but at current costs for decent monitors and even a graphics card (or cards) that can game enjoyably at 4K puts 4K gaming squarely in the 'dumb' category for now.
I went G-Sync for a while on an ultrawide, and I consider 21:9 the most "dumb" in terms of overall adoption rates; plus there are older games that don't and won't ever support it (without going to find fixes/patches on your own). I honestly thought I was going to hate going to 4K and losing G-Sync, but I never noticed it. Samsung 43" curved 4K with HDR, and I'll never look back. No garbage Acer/Asus monitor at this point looks as sharp as a nice 4K TV. I should also say I don't game competitively any longer, so I always choose what's "prettiest" at this point.
I tend to agree. Hell, I'm still rocking a 1080p 27" and honestly have no complaints. 144Hz, near-zero lag, and the 1080p resolution mean I can stay awfully close to that 144 even with all the eye candy cranked up.
I do admit that 1440p is perfect for 27" and probably even 32".
Games aren't getting much more demanding and monitor image quality isn't getting that much better, so manufacturers of both monitors AND GPUs need a reason to make you upgrade. Thus 4K.
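[Editor's note: the size-vs-resolution argument in these posts comes down to pixel density. A minimal sketch using the standard PPI formula (diagonal pixel count over diagonal inches) for the 27" panels being debated; the specific sizes/resolutions are just the ones mentioned in the thread:]

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 27" panels at the three resolutions discussed in the thread
print(round(ppi(1920, 1080, 27)))  # ~82 PPI
print(round(ppi(2560, 1440, 27)))  # ~109 PPI
print(round(ppi(3840, 2160, 27)))  # ~163 PPI
```

At typical desk distances, ~109 PPI is roughly where scaling-free desktop use stays comfortable, which is one reason 27" 1440p keeps coming up as the "sweet spot" here.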
Anything over 60 fps is also dumb
I have an LG 4K OLED for my TV/Blu-ray watching but I'm satisfied with my 1440p G-Sync monitor for gaming
OK, I just watched the entire video... he's right about a lot. Here at [H] there's always the subset of members who jump at every new video card release and every new technology because they assume that newer equals greater, without really knowing why. 1440p 144Hz is indeed still the sweet spot.
And why is that?
Even with how old I'm getting, the difference between 60FPS and 120FPS on a 120Hz-capable panel is fairly noticeable.
Only because it can be driven. I have two 27'' panels: one is the Dell 1440p 144Hz panel and the other is the Acer X27. 4K @ 120Hz in full 8-bit RGB looks incredible. The issue is that you generally can't run any demanding AAA title at 120fps, let alone at 98fps, which is full 10-bit RGB maxing out DP 1.4.
That being said, I'd obviously rather the X27 had been a 1440p panel. I feel that 4K is just overkill for 27''. Unfortunately, if you want high refresh rate / G-Sync / HDR1000, they just don't give you a choice. You're stuck with 4K.
The X27 pretty much forced me to also get a 2080 Ti to drive it properly, and even then it's just not enough for the most demanding titles. Still, most games generally run anywhere between 60-100 FPS, and with G-Sync it's not terrible.
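[Editor's note: the 120Hz/98Hz figures in that post line up with DisplayPort 1.4's bandwidth. A rough sketch counting active pixels only — real video timings add blanking overhead, so actual margins are tighter than shown:]

```python
# DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s, minus 8b/10b encoding overhead
DP14_PAYLOAD_GBPS = 32.4 * 8 / 10  # 25.92 Gbit/s usable

def data_rate_gbps(w: int, h: int, hz: int, bits_per_channel: int) -> float:
    """Uncompressed RGB data rate for active pixels, in Gbit/s."""
    return w * h * hz * bits_per_channel * 3 / 1e9  # 3 channels (R, G, B)

print(data_rate_gbps(3840, 2160, 120, 8))   # ~23.9 -> 8-bit 120Hz fits
print(data_rate_gbps(3840, 2160, 120, 10))  # ~29.9 -> 10-bit 120Hz does not
print(data_rate_gbps(3840, 2160, 98, 10))   # ~24.4 -> 10-bit ~98Hz squeezes in
```

Which is why the X27 tops out at 98Hz for full 10-bit RGB but manages 120Hz at 8-bit.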
I'm of the opinion that it depends on what type of games you're playing. It's also worth noting that this is talking about desk setups. Once you start PC gaming on a massive TV with surround sound, it's tough to go back to a desk.
That video was hard to watch. I barely made it a minute into it, because of that guy's over-acting personality and because of the people looking at the different screens: yep, this one's nicer; no, this one is, but this one isn't bad. I mean, were the panels the same? Same color gamut, exact same brightness, saturation, etc.?
I don't agree with this. I have a 240Hz monitor and there is no way I'd go back to 144Hz, let alone the 60Hz that most TVs are. I also have a 65" 4K TV, and while it does look glorious, I can't stand its lower refresh rate and higher latency.
Anyway, I really think there are two types of "game" experiences when you look at it. There is online play with/against people, where there is an element of competition, and then there is the offline grand cinematic type of experience.
If you're playing online games against other people, 4K is stupid and does not make sense. And it won't make sense until 144Hz-240Hz 1ms monitors at 4K resolution become reasonably priced and can be driven solidly with one card. On the other hand, if your idea of gaming is jumping into a single-player RPG and losing yourself for a couple of hours, then yeah, a large 4K screen with surround speakers might be a bit more up your alley.
Pretty sure his post was sarcasm.
I think we can all agree that a 27" 1440p is the current "sweet spot".
How many times have we heard this? We can put it up on the shelf right next to "your eyes can't see more than 24 fps".
No, we can't. There is no such thing as a sweet spot. The more resolution the better. It's an entirely different question whether our GPUs can produce enough FPS at a certain resolution. But that doesn't mean 4K is dumb; it just means we don't have GPUs that are powerful enough for 4K yet.
.... I never implied 4k is dumb.
I'm merely stating that for the price/performance and the current state of graphics cards, if you want the best image quality FOR THE MONEY, a 27" 1440p display is the best size/pixel count.
Now, if you have the disposable income, get dual 2080 ti's and a 50" 4K curved display or whatever.
But what some of you guys need to realize is: most [H] sig rigs are in the top 0.01%.
Since SLI seems to be dead, you can't even have it for any money currently.
Users on this forum, or anyone for that matter, that can drive 4K games at enjoyable frame rates will see the headline "4k gaming is dumb" and not even think twice about why most people would consider it dumb.
This is exactly how I feel, but I'm fine with 100Hz-144Hz. The 1ms latency of monitors is important too as far as I can tell. I know not all TVs are the same, but the latency on my OLED sucks compared to my Acer monitor. The end result is I only enjoy single-player cinematic games on the TV; anything fast or competitive I would rather play on the PC with a fast monitor.
"For the money" is exactly the point. A 27" 1440p FreeSync setup with a Vega 64 is maybe $700 out the door (thanks to $400 Vega 64s becoming available recently). A 27" 4K G-Sync experience is in the $2000+ range, and you'll get better framerates with the other setup. Is 4K really a $1300 better experience? I can think of a lot of things I could do with $1300.
I LOL'd at that because it's true.
27" G-Sync 1440p 144Hz mated to a 7700K/1080 Ti. I consider this an ideal configuration.
27" 4k is dumb. 40"+ 4k is awesome.
4k gaming is so dumb, I wouldn't give up my 43" 4k monitor for anything less.
How is this moron popular, I refuse to click any links because I don't want to contribute to his views.
Eh, I game on a 1440p 32 inch 75HZ display. Fine with me.
Running a 28" 4K Monoprice monitor and an OC'd Vega 64. I no longer care about competitive FPS, so I don't need 1000FPS in CS:GO or 144FPS in Battlefield. What I do enjoy is the graphical fidelity and cinematic experience 4K gaming can bring. I'm ecstatic if I get 4K 60FPS. If my Vega 64 isn't close, then I drop to 1440p and it still looks great.
Between work and family life, my time for gaming is occasional. It's the experience that matters most to me.
43” 4K @ 2.5ft + 2080 TI is where it’s at!
I have to agree with him. While I'd take 4k and high framerates, I can't have that. However 2.5k at high framerates is quite doable. I'd much rather go above 60 fps than go above 2.5k. Now of course there is going to be a limit to that. At 120 fps, then I'd probably rather go to 4k rather than higher framerates. I'd have to see and judge for myself, but I think at that point I'd rather have a crisper image than a more fluid one. Likewise if I was at 4k 120 fps, I'd probably rather go for higher fps instead of higher rez. Basically I find a balance gives the best experience. I don't want to cut down heavily on rez to get high fps, or cut down heavily on fps to get high rez. I find 2.5k to be quite good, but 60 fps to be ok. So instead of doubling the pixels, let's double the fps.
I think that's what he means: it is the sweet spot of rez vs. cost and rez vs. performance. 2.5k monitors are cheap and widely available. If you only want 60Hz you can have them for as little as $200 for a cheap TN panel and $300 for IPS. For $650ish you can have IPS, 165Hz and G-Sync. That's just way more economical than you'll find 4K monitors. Likewise, the GPU power needed to drive it is less than half what it takes at 4K; 1440p is well under half the pixels (4K has 2.25x as many). So a GPU that can only pull 30fps at 4K will do 60fps at 2.5k no problem. Hence it is a sweet spot. The cost jump from 2.5k to 4K is fairly large; the jump from 1080p to 2.5k is much smaller.
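[Editor's note: a quick sanity check of the raw pixel counts behind that post, assuming — optimistically — that frame rate scales inversely with pixel count:]

```python
# Pixel counts for 4K vs. 1440p ("2.5k")
pixels_4k = 3840 * 2160      # 8,294,400
pixels_1440p = 2560 * 1440   # 3,686,400

print(pixels_4k / pixels_1440p)              # 2.25 -> 4K is 2.25x the pixels
print(round(30 * pixels_4k / pixels_1440p))  # ~68 fps at 1440p from 30 fps at 4K,
                                             # under ideal inverse scaling
```

Real scaling is rarely that clean (CPU limits, memory bandwidth, post-processing costs), but it shows why "30fps at 4K becomes 60fps at 2.5k" is a safe rule of thumb rather than a stretch.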
Should be close enough for the fire extinguisher. Safety first!
I think you mean 30 fps.
Linus's rant on laptop monitor resolutions is right on. Anything more than 1080p on a 17'' or smaller screen is silly, and it just wastes $2-4k on a machine that will be obsolete faster.
I think even 1080p is too much for a 15" screen; on 17" it's OK. But that's from a desktop-use perspective, not a gaming one. If we had proper scaling, it wouldn't matter.
4K gaming on a 48" Samsung JS9000 is glorious.
Yeah, I understand what he meant now. Meanwhile, I think my 27" 1440p -> 38" 4K-wide switch was the best upgrade I made in a decade, even if I'm stuck with 40fps in games without SLI support.
You know you can get a 4K monitor for $200 these days. You can get a very large 4K TV for $300.