Dell Alienware AW3423DW 34″ QD-OLED 175Hz (3440 x 1440)

Input lag is in line with other displays of this class. A lot of the 34" ultrawides people are coming from have the same or worse input lag than this one. It has lower lag than the LG 34GN850 I see a lot of people use. It's not the best for sure, but not bad either.

If you do play competitively, won't you be worse off anyway versus other people using even higher-Hz displays? If it's a major concern, it seems you would want to pick a monitor that is easier to drive and higher Hz. Of course lower lag is always nice, but I don't see it being a major concern for most people with this monitor.
 
As an aside, it is strange to me how much time and stress people spend on display latency for competitive shooters while nobody pays any attention to audio latency. We react to audio stimuli faster than visual stimuli on average, so if the absolute fastest reactions were a concern, I'd think low audio latency would be of interest. It is something that differs too: different sound cards have different amounts of buffering on the USB interface, in processing, and in the DAC itself. There is quite a bit of information on it out there, as it matters a lot in the world of pro audio and stage reproduction (sound covers roughly a foot per millisecond in air, so you deal with it when synchronizing sound sources).

Ideally, if a game were all about fastest reactions you'd want it to support ASIO output so you can bypass Windows sound processing, which has its own buffering, and communicate directly with the sound card. With ASIO and a good pro card you can get latencies under 1ms from the time the card receives the signal to the time the DAC is generating voltage output.
Some of us aim train and play competitive FPS like it's our career... even at the age of 40 I can't let it go.

I 100% agree with your comments about audio latency; my reaction time to audio cues is very slow compared to my visual reaction time.

The problem with ASIO is that most games don't support ASIO output, so you're still stuck with Windows sound processing.
 
Some of us aim train and play competitive FPS like it's our career... even at the age of 40 I can't let it go.

100% fine and if that's your thing I'll never tell you that you are wrong. When it comes to games, you do what is fun. Also in the event you are good enough that it might matter, I would advocate getting extremely fast equipment as it could help. Even if you aren't good enough for it to matter I can certainly see preferring faster stuff and getting it when available.

I just don't like how some (not you) act like latency is the be-all end-all or that 10ms of latency (which it isn't even clear this has) is unplayably bad for everyone.

The problem with ASIO is that most games don't support ASIO output, so you're still stuck with Windows sound processing.

It is something I feel like competitive gamers should harp on game devs for. If latency matters, why not audio as well? That aside, even with the 20ms of Windows buffer, different soundcards have their own internal buffer sizes and can be faster or slower.
 
And in a game like Apex you might get no audio at all since it's still buggy, and the servers only operate at around 20 Hz... so 1ms vs 4ms isn't going to matter :p
 
The reason that they may stay around isn't G-Sync, but brightness. OLEDs use a lot of power to get as bright as they do and there is interest in making them brighter still. For that to work well, they will need cooling to prevent issues and image retention/burn-in. It can be done with a large enough heatsink, of course, but a fan may be the solution of choice. Sometimes the heat output and size of something is going to make engineering go for a fan. My Denon X4400 is like that, the first receiver I've ever had from them with a fan. They always tried to passively cool. However, they would need either fewer amp channels, less DSP, or a larger case to go passive, and none of those were things they were willing to do, so it has fans.



I wouldn't hold your breath; MS seems to really not be interested in working on their font rendering these days. They haven't done any updates to ClearType in years as far as I know.
I agree with all of that. Adding a fan is often cheaper than adding a bigger heatsink. One of those "beancounters said we can't do it in a way that would be the best solution" things.

I think the only thing that would make MS improve ClearType is if OLEDs with unusual pixel structures become more common in the coming years. Not holding my breath either for any updates to happen.
 
Yeah I'm not even gonna bother trying to explain why a few ms of input lag isn't going to matter. You can read elvn's essay if you want.

The last time RTINGS tested the 48CX, its input lag numbers were pretty low:

1080p @ 120Hz: 6.9 ms
1440p @ 120Hz: 6.9 ms
4k @ 120Hz: 6.7 ms
1080p with VRR: 5.9 ms
1440p with VRR: 6.2 ms
4k with VRR: 5.9 ms

. . . . . . . . . . . . . . . . . . . . . .

~ 6ms at 120fpsHz 4k VRR.

With online gaming you are essentially acting on interpolated frames rolled back from: typically 25ms to 40ms of transmission time (ping) + two frames, your current one and a buffered one (interp_2). So for a *solid* 120fps on a 120Hz screen you'd have 25ms to 40ms + an 8.3ms frame + an 8.3ms interp_2 frame. That's ~42ms to 57ms depending on whether the ping is 25ms or 40ms: 42ms is 5 frames at a solid 120fpsHz, and 57ms is nearly 7 frames.

Your human reaction time to act after being delivered the interpolated frame is also around 150ms at best, up to 180, 200, or 250ms, which also spans a ton of local frames: 8.3ms frames at 120fpsHz (18 frames at a 150ms reaction time, usually worse), 4.2ms frames at a solid 240fpsHz (36 frames at a 150ms reaction), etc. Again, that's if you are capable of supplying those frame rates to the Hz reliably in the first place. Your reaction time also spans a lot of ticks on the server end.
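Here's a quick back-of-the-envelope version of that math as a sketch in Python, using the same assumed figures as above (25-40ms ping, one rendered frame plus one interp_2 buffered frame, 150ms reaction time); nothing in it is measured:

```python
# Rough rollback-window math from the paragraphs above. All inputs are the
# same assumed figures used there (ping, one rendered frame, one interp_2
# buffered frame, human reaction time); nothing is measured.

def frame_time_ms(hz: float) -> float:
    """Duration of one frame when the frame rate solidly matches the refresh."""
    return 1000.0 / hz

def rollback_window_ms(ping_ms: float, hz: float) -> float:
    """Ping + your current frame + one buffered frame (interp_2)."""
    return ping_ms + 2 * frame_time_ms(hz)

for hz in (120, 240):
    ft = frame_time_ms(hz)
    for ping in (25, 40):
        window = rollback_window_ms(ping, hz)
        print(f"{hz} Hz, {ping} ms ping: ~{window:.0f} ms window "
              f"= ~{window / ft:.1f} local frames")
    reaction = 150  # optimistic human reaction time, ms
    print(f"{hz} Hz: a {reaction} ms reaction spans ~{reaction / ft:.0f} frames")
```

At 120fpsHz that prints the ~42ms / ~57ms windows (5 to nearly 7 local frames) described above.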

So as stated, the frame state you send and the biased, manufactured/imagined, rolled-back frame you receive span a lot of local frames on a high-Hz monitor. This is true even at 60fpsHz locally on a 128-tick server, since the server is still imagining and manufacturing a frame state to send back after processing your many frames' worth of ping time (e.g. 25ms to 40ms) + your 17ms actual frame + a 17ms buffered frame (interp_2).

It is done in a biased way relative to all players' time+state packets based on the netcode of whichever game you might be playing (e.g. some games are incapable of simultaneous hits while others allow differing gap times for it, some suffer a "peeker's advantage", some have high fire rate weapons that end up doing more damage in fewer hits because the tick rate, ping interpolation, and overall netcode dynamics can't keep up with the fire rate -> aka 'super bullets', etc.).

So there is a big rubber band, with an interpolated/imagined action-state frame sent back to you based on 128-tick servers, or a magnitude worse tick rate on many (most) other games. Extremely high screen Hz (outside of the blur reduction benefit if/when your frame rate matches that Hz) and small differences in already low input lag won't really matter competitively once averaged out across 150ms to 180ms gamer reaction times, especially online with heavily interpolated results and netcode compromises, as opposed to a competitive LAN match, which is pretty rare now for non-professionals. Online games are also rife with thousands and thousands of cheaters, including those who set up "low key" cheats to make it believable in order to carry themselves and their teammates up ranking ladders while appearing to do so legitimately. Some even use a second streaming-rig cheat system so there is no detectable code on the gaming machine. However, that is another discussion.
 
Last edited:
As an aside, it is strange to me how much time and stress people spend on display latency for competitive shooters while nobody pays any attention to audio latency. We react to audio stimuli faster than visual stimuli on average, so if the absolute fastest reactions were a concern, I'd think low audio latency would be of interest. It is something that differs too: different sound cards have different amounts of buffering on the USB interface, in processing, and in the DAC itself. There is quite a bit of information on it out there, as it matters a lot in the world of pro audio and stage reproduction (sound covers roughly a foot per millisecond in air, so you deal with it when synchronizing sound sources).

Ideally, if a game were all about fastest reactions you'd want it to support ASIO output so you can bypass Windows sound processing, which has its own buffering, and communicate directly with the sound card. With ASIO and a good pro card you can get latencies under 1ms from the time the card receives the signal to the time the DAC is generating voltage output.
Audio latency honestly won't become an actual issue until you are above the latencies that the Windows sound system has. To me it has only become noticeable with wireless headphones where e.g. videos don't have audio in sync. Of course, I would prefer if Windows provided an actual low-latency system where you don't need to deal with separate ASIO drivers. macOS is way more straightforward in this sense and surprisingly better for audio setup, whereas macOS handling of external displays is shit.
 
Audio latency honestly won't become an actual issue until you are above the latencies that the Windows sound system has. To me it has only become noticeable with wireless headphones where e.g. videos don't have audio in sync. Of course, I would prefer if Windows provided an actual low-latency system where you don't need to deal with separate ASIO drivers. macOS is way more straightforward in this sense and surprisingly better for audio setup, whereas macOS handling of external displays is shit.
Windows does, it is called WASAPI exclusive mode. I've seen latencies in the 1-2ms range with it. The main reason I'd push ASIO is that it provides easier, clearer user control of buffers and the absolute lowest latency for professional interfaces. Games could easily offer a choice between WASAPI shared, which has latency but lets you do things like use Discord as well, WASAPI exclusive for the lowest latency on consumer cards, and ASIO for the lowest latency on pro cards.
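For anyone curious what that looks like in code, here's a minimal sketch using the python-sounddevice wrapper around PortAudio, which can request WASAPI exclusive mode on Windows. The buffer sizes and latency you're actually granted depend on the device and driver, so treat this as an illustration of the API, not a benchmark:

```python
import numpy as np
import sounddevice as sd

fs = 48000  # sample rate in Hz

# WASAPI exclusive mode bypasses the Windows shared-mode mixer, so output
# latency comes down mostly to the buffer the device grants. Only takes
# effect when the selected host API is WASAPI (i.e. on Windows).
wasapi_exclusive = sd.WasapiSettings(exclusive=True)

# One second of a 440 Hz test tone, mono, shaped (frames, channels).
t = np.arange(fs) / fs
tone = (0.2 * np.sin(2 * np.pi * 440 * t)).astype(np.float32).reshape(-1, 1)

with sd.OutputStream(samplerate=fs, channels=1, dtype='float32',
                     latency='low', extra_settings=wasapi_exclusive) as stream:
    print(f"reported output latency: {stream.latency * 1000:.1f} ms")
    stream.write(tone)
```

Whether you actually land in the 1-2ms range still depends on the interface's own buffering, same as with ASIO.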
 
Windows does, it is called WASAPI exclusive mode. I've seen latencies in the 1-2ms range with it. The main reason I'd push ASIO is that it provides easier, clearer user control of buffers and the absolute lowest latency for professional interfaces. Games could easily offer a choice between WASAPI shared, which has latency but lets you do things like use Discord as well, WASAPI exclusive for the lowest latency on consumer cards, and ASIO for the lowest latency on pro cards.
Waste of time? The latency is not high enough to notice or cause a problem. Plenty of games rely on audio cues and all of it is fast enough to be a non-issue, including games that rely on precise timing of those elements. If it weren't good enough, games like Beat Saber, beatmania, etc. would be unplayable on PC.
 
Last edited:
Waste of time? The latency is not high enough to notice or cause a problem. Plenty of games rely on audio cues and all of it is fast enough to be a non-issue, including games that rely on precise timing of those elements. If it weren't good enough, games like Beat Saber, beatmania, etc. would be unplayable on PC.
It is fine for normal gaming... I'm just saying it seems silly to me that people get worked up about a couple of ms on the monitor, but not on audio, when we actually react quicker to audio cues. Personally I think as long as latency is reasonable, it is fine. I don't fuss over a monitor with 5ms of lag vs 10ms; both are low enough to keep me happy. However, some people get really worked up over that on a display, arguing it matters a lot for competitive gaming. I then argue that if they are concerned about that, they should also be concerned about audio lag and trying to shave a few ms off of that.
 
It is fine for normal gaming... I'm just saying it seems silly to me that people get worked up about a couple of ms on the monitor, but not on audio, when we actually react quicker to audio cues. Personally I think as long as latency is reasonable, it is fine. I don't fuss over a monitor with 5ms of lag vs 10ms; both are low enough to keep me happy. However, some people get really worked up over that on a display, arguing it matters a lot for competitive gaming. I then argue that if they are concerned about that, they should also be concerned about audio lag and trying to shave a few ms off of that.
Input lag on gaming monitors is already so low as to be inconsequential. I am sure some will chime in about how they notice every little millisecond, but for me personally input lag needs to be over 20ms before it becomes an actual problem. I used to have the Samsung KS-8000 TV, which had about 20ms of input lag in SDR mode, good for TVs at the time. I had no complaints playing games at that input lag and could not really tell a difference compared to my ASUS PG278Q with a combined input + response time lag in the 4ms region. But the KS-8000's HDR mode was over 30ms and that became noticeably worse.

With the OLED input lag being somewhere in the 5-8ms region, I really doubt anyone can notice that. To me a bigger issue with gaming displays has been pixel response time which means added motion blur on moving content, with some displays not even being able to keep up with their max refresh rate. OLED has largely solved this too.

Competitive players are always looking for that extra edge that they believe will help and it's like a cold war arms race. I would really like to see some top tier player who plays on some shitty LCD at 60 Hz because that's all they could afford but they just got really good at the game. Let's not forget that people used to play a lot of competitive shooters on early LCDs (ignoring CRTs for sake of discussion) back in the day that were many times worse than any gaming monitor you can buy today.

I still say that audio lag is not enough of a concern to worry about. Audio lag matters a lot for things like playing instruments in real time but pretty much all games out there, even ones that rely on audio (e.g dance games etc) generally do fine. I can't think of a single instance in a game where I felt that the audio was not in sync with on screen content.
 
Yeah, I've never had an issue with audio latency with traditional setups on a computer. The only time I've ever noticed audio latency is when my USB DAC broke and I was using BT headphones; the latency of BT was noticeable.
 
Competitive players are always looking for that extra edge that they believe will help and it's like a cold war arms race. I would really like to see some top tier player who plays on some shitty LCD at 60 Hz because that's all they could afford but they just got really good at the game. Let's not forget that people used to play a lot of competitive shooters on early LCDs (ignoring CRTs for sake of discussion) back in the day that were many times worse than any gaming monitor you can buy today.
Basically Shroud when he first started, snipped below lol
 
I mean, that video shows me what I already knew by myself. Generally, 120hz is going to be enough for most people. For me personally, once it's above 100hz I can't really utilize whatever advantage anything higher would give.
 
I mean, that video shows me what I already knew by myself. Generally, 120hz is going to be enough for most people. For me personally, once it's above 100hz I can't really utilize whatever advantage anything higher would give.
I'm not even going to argue that it doesn't make a difference as it's easy to see that 240 Hz/fps is smoother than 120. It feels more responsive and is nicer, provided you can keep up that framerate consistently. But it's not going to be a make or break thing for performing well in competitive games.
 
IMO it's in LAN scenarios that such a thing would make the biggest difference. Online gaming, ping is still king and then there's just other variables. That being said, I understand those who want to have the best of the best but usually those people still complain about anything regardless lmao

That being said, love the AW for all my gaming uses so far. Be it adventure games or games such as COD. I'm basically more or less where I am FPS performance =P
 
I don't recall any details but back when I originally watched that LTT video my takeaway was that great players are great no matter what, and mediocre players are helped out by better equipment.

And that's why I have this monitor and an expensive mouse
 
I don't recall any details but back when I originally watched that LTT video my takeaway was that great players are great no matter what, and mediocre players are helped out by better equipment.

And that's why I have this monitor and an expensive mouse
The video pretty clearly shows, even with average people, that the difference between 60Hz and 120Hz is pretty big when it comes to hitting certain types of shots.

However, from 120Hz to 240Hz the difference is very small, even with a pro player.
 
I mean, that video shows me what I already knew by myself. Generally, 120hz is going to be enough for most people. For me personally, once it's above 100hz I can't really utilize whatever advantage anything higher would give.
It's weird to me how people look at what seems like 100% clear data to me, and then just come out with completely random opposite conclusions.

IMO this video shows that increasing past 144Hz provides a statistical advantage to players of all skill levels, but actually makes the biggest difference the less skilled you are. That actually makes far more sense; it's not about "utilizing" (it's a display, not an input device). Highly skilled players are much better at predicting positions, so they don't need to SEE those positions as much. But if you're not skilled at predicting, seeing is everything.

Linus showed big improvements on tests where the pro gamers showed little or even negative improvement.

You can't tell me that doubling your hits on the double-door test is nothing. That basically means in any situation where you only see the enemy for a fraction of a second, 240Hz helps you a huge amount unless you're already skilled at prediction.

[attached: screenshots of the test results]


Now personally, I'm not all about competitive FPS play, it's not that high a priority to me. But if you play a lot of them, then higher refresh rates offer a clear and proven advantage. FPSes aren't incremental games. Either you hit or you miss. That means even small improvements can lead to significant changes in hit rates.
 
Those tests are usually done on LAN and/or vs bots locally, so they aren't really a fair comparison. Online gaming is a whole different animal, and that is what these monitors are marketed for and where their usage overwhelmingly is (outside of single player games).

The tick rates of online games, combined with interpolation guesswork and rollback over ping times plus the interp_2 buffered frame to a biased result (which varies based on the netcode), likely average out or make insignificant the small differences in input lag between already low input lag screens, especially once you factor in the 150ms-180ms (even up to 200-250ms) variance in human reaction time for your input after you got that biased result. The result is also based on, or "balanced" using, those same factors across many other players, so it is nothing like playing on LAN or vs bots. E.g. things like peeker's advantage happen, so you could shoot at a head popping out of a doorway in an online game and see it hit on your screen but have it not register on the server. Simultaneous shots are also impossible in some games, for example. So you could have very high fpsHz locally, but what you see is not what you get from the server.

If you set up a local/lan quick draw shooting gallery you are going to get much different results than actual online gameplay in a gameworld/arena on a server.

High Hz ranges, when combined with equivalent FPS, are definitely much better in general though. It's making more dots per dotted line or dotted curved line movement-wise, aka increasing articulation (or "smoothness"). The tick rates and biased interpolation of the rollback times and results muddy online gaming though. There are two sides going on: local, and what the server manufactures as the result. This mismatch is happening all the time, but it's only in the worst, most obnoxious occurrences, or when the man behind the curtain shows more obviously, that people call it out as "rubberbanding", "peeker's advantage", and things like high fire rate weapons ending up across fewer ticks resulting in "super bullets", among other issues. The thing is, in online gaming it's always rubberbanding and throwing manufactured results back at you; it's just a matter of how long or short that rubber band is and what kind of guesswork biases are coded in. You might only be receiving a new tick every 17ms, or even every 45ms on a really low tick game, while your gaming monitor is running 4.x ms or 8.x ms frames. However, high Hz with high FPS will always cut sample-and-hold blur down more and more the higher that ceiling is, provided you supply equivalent FPS. That alone can be an advantage for tracking and identifying moving targets, versus the whole viewport smearing during mouse-looking/movement-keying and the more molasses-like movement of 60fpsHz and lower frame rate graphs.

For those like me who are interested in overall game aesthetics balanced with performance, we are often at 4k resolutions or near-4k ultrawide ones with a lot of graphics eye candy settings enabled, dialing in those settings and relying on VRR/G-Sync/FreeSync to find a good frame rate graph range that still lives in the higher fpsHz ranges (100-120fpsHz at least). Extremely high Hz is great, but for overall game aesthetics rather than stripped-down/minimalist rock-em sock-em robot arena shooters we probably won't get 500fpsHz or 1000fpsHz (1000 is essentially "zero" sample-and-hold blur) without some kind of frame duplication tech in the future.
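To put rough numbers on the sample-and-hold blur side of that: the usual rule of thumb (popularized by Blur Busters) is that perceived motion blur is approximately eye-tracking speed multiplied by frame persistence. A tiny sketch, assuming persistence equals the full frame time (no strobing/BFI), fps matching Hz, and an assumed 960 px/s pan speed:

```python
# Perceived motion blur on a sample-and-hold display is roughly
# tracking speed (px/s) * persistence (s). Persistence is assumed to be
# the full frame time (no strobing/BFI), and fps is assumed to match Hz.

def blur_px(speed_px_per_s: float, hz: float) -> float:
    return speed_px_per_s / hz  # speed * (1 / hz) seconds of persistence

speed = 960  # assumed eye-tracking speed of a panning target, in px/s
for hz in (60, 120, 240, 1000):
    print(f"{hz:4d} Hz: ~{blur_px(speed, hz):4.1f} px of smear")
```

That's ~16 px of smear at 60fpsHz down to ~1 px at 1000fpsHz, which is why 1000 is treated as essentially "zero" blur.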
 
Last edited:
IMO it's in LAN scenarios that such a thing would make the biggest difference. Online gaming, ping is still king and then there's just other variables. That being said, I understand those who want to have the best of the best but usually those people still complain about anything regardless lmao

That being said, love the AW for all my gaming uses so far. Be it adventure games or games such as COD. I'm basically more or less where I am FPS performance =P
If you shoot someone faster, you shoot someone faster. Ping has nothing to do with anything in this conversation. Same thing to the guy that brought up tickrate earlier.
 
If you shoot someone faster, you shoot someone faster. Ping has nothing to do with anything in this conversation. Same thing to the guy that brought up tickrate earlier.

https://www.reddit.com/r/Overwatch/comments/3u5kfg/everything_you_need_to_know_about_tick_rate/

=============================
Lag Compensation

Lag compensation is a function on the server which attempts to reduce the perception of client delay.

Without lag compensation (or with poor lag compensation), you would have to lead your target in order to hit them, since your client computer is seeing a delayed version of the game world. Essentially what lag compensation is doing, is interpreting the actions it receives from the client, such as firing a shot, as if the action had occurred in the past.

The difference between the server game state and the client game state or "Client Delay" as we will call it can be summarized as: ClientDelay = (1/2*Latency)+InterpolationDelay

An example of lag compensation in action:

  • Player A sees player B approaching a corner.
  • Player A fires a shot, the client sends the action to the server.
  • Server receives the action X ms later, where X is half of Player A's latency.
  • The server then looks into the past (into a memory buffer), of where player B was at the time player A took the shot. In a basic example, the server would go back (Xms+Player A's interpolation delay) to match what Player A was seeing at the time, but other values are possible depending on how the programmer wants the lag compensation to behave.
  • The server decides whether the shot was a hit. For a shot to be considered a hit, it must align with a hitbox on the player model. In this example, the server considers it a hit. Even though on Player B's screen it might look like he's already behind the wall, the time difference between what Player B sees and the time at which the server considers the shot to have taken place is equal to: (1/2 Player A latency + 1/2 Player B latency + TimeSinceLastTick).
  • In the next "Tick" the server updates both clients as to the outcome. Player A sees the hit indicator (X) on their crosshair, Player B sees their life decrease, or they die.
Note: In an example where two players shoot each other and both shots are hits, the game may behave differently. In some games, e.g. CS:GO, if the first shot arriving at the server kills the target, any subsequent shots by that player that arrive at the server later will be ignored. In this case, there cannot be any "mutual kills", where both players shoot within 1 tick and both die. In Overwatch, mutual kills are possible. There is a tradeoff here.

  • If you use the CSGO model, people with better latency have a significant advantage, and it may seem like "Oh I shot that guy before I died, but he didn't die!" in some cases. You may even hear your gun go "bang" before you die, and still not do any damage.
  • If you use the current Overwatch model, tiny differences in reaction time matter less. I.e. if the server tick rate is 64 for example, if Player A shoots 15ms faster than player B, but they both do so within the same 15.6ms tick, they will both die.
  • If lag compensation is overtuned, it will result in "I shot behind the target and still hit him"
  • If it is undertuned, it results in "I need to lead the target to hit them".
What this all means for Overwatch

Generally, a higher tick-rate server will yield a smoother, more accurate interaction between players, but it is important to consider other factors here. If we compare a tick rate of 64 (CSGO matchmaking), with a tick rate of 20 (alleged tick rate of Overwatch Beta servers), the largest delay due to the difference in tick rate that you could possibly perceive is 35ms. The average would be 17.5ms. For most people this isn't perceivable, but experienced gamers who have played on servers of different tick rates, can usually tell the difference between a 10 or 20 tick server and a 64 tick one.

Keep in mind that a higher tickrate server will not change how lag compensation behaves, so you will still experience times where you ran around the corner and died. 64 Tick servers will not fix that.

===================================
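Here's a rough sketch of the rewind logic described in that quote, just to make the mechanism concrete. The names and structure are hypothetical (real engines rewind interpolated hitbox snapshots per tick), but it follows the ClientDelay = 1/2 * latency + interpolation delay idea above:

```python
import bisect
from dataclasses import dataclass

@dataclass
class Snapshot:
    time_ms: float   # server time when the snapshot was recorded
    positions: dict  # player_id -> hitbox position at that time

class LagCompensator:
    """Keeps a short history of world snapshots and rewinds hit checks."""

    def __init__(self):
        self.history = []  # list of Snapshot, ordered by time_ms

    def record(self, snap):
        self.history.append(snap)

    def rewind_time(self, now_ms, shooter_latency_ms, interp_delay_ms):
        # ClientDelay = 1/2 * latency + interpolation delay: the shot is
        # evaluated against where targets were when the shooter saw them.
        return now_ms - (0.5 * shooter_latency_ms + interp_delay_ms)

    def check_hit(self, now_ms, shooter_latency_ms, interp_delay_ms,
                  target_id, hit_test):
        t = self.rewind_time(now_ms, shooter_latency_ms, interp_delay_ms)
        times = [s.time_ms for s in self.history]
        i = max(bisect.bisect_right(times, t) - 1, 0)  # snapshot at/just before t
        past = self.history[i]
        return hit_test(past.positions[target_id])

# Toy example: shooter with 60 ms ping and 50 ms interp delay fires at
# server time 1100 ms; the server rewinds to ~1020 ms to judge the shot.
comp = LagCompensator()
comp.record(Snapshot(time_ms=1000.0, positions={"B": (10.0, 2.0)}))
comp.record(Snapshot(time_ms=1050.0, positions={"B": (12.0, 2.0)}))
print(comp.check_hit(1100.0, 60.0, 50.0, "B",
                     hit_test=lambda pos: pos == (10.0, 2.0)))  # True
```

This is also why "I ran behind the wall and still died" happens: the server judges the shot against your past position, regardless of tick rate.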
 
Last edited:
If you shoot someone faster, you shoot someone faster. Ping has nothing to do with anything in this conversation. Same thing to the guy that brought up tickrate earlier.
Weakest link in the chain matters. All I'm saying though is we shouldn't be worrying about single digit differences in input lag when dealing with such low numbers.
 
It's weird to me how people look at what seems like 100% clear data to me, and then just come out with completely random opposite conclusions.

IMO this video shows that increasing past 144hz provides a statistical advantage to all players of all skills levels, but actually makes the biggest difference the less skilled you are. That actually makes far more sense, it's not about "utilizing"(it's a display, not an input device). Highly skilled players are much better at predicting positions, so they don't need to SEE those positions as much. But if you're not skilled at predicting, seeing is everything.

Here's the thing: That is about FRAMERATE, not INPUT LAG, which is what we were discussing. What started this tangent was some people and reviewers getting worked up because it seems this monitor has maybe 10ms of input lag (though that isn't clear, because it isn't clear how good the testing methodology is and others have different results) vs some similar LCDs that supposedly had about 5ms. Some were making that out to be a huge, game-breaking kind of deal. The point is that others, like me, don't really think it should be unless you are already a top competitor, in which case you probably don't want this monitor at all because you want something with a higher frame rate.

Now if you have data that shows that a small decrease in input lag equals big performance improvements in gameplay outcomes for low-to-moderate skill players, then I'd be very interested in seeing it. But this is not it; this is about higher framerate.

Finally, as elvn notes, what you see in a contrived test is different than what you'll see in actual gameplay. I'd love to see a long term test done to see if actual gameplay outcomes change with 144Hz vs 240Hz. That would be very hard and expensive to do though, so it isn't likely to happen. Just keep in mind that what you see mattering in a small test case might not generalize to better results. As a kinda related example:

People work on tuning lossy audio and video codecs all the time to try and maximize the data rate reduction while keeping the perceptual quality to humans the same. This isn't just done with new codecs, but by optimizing existing encoders. Lots of work has been done on the LAME MP3 encoder since it is good, open source, and MP3 is popular. Well, a community online became fixated on one particular test case, silk I think it was called, that was heavy bass with short high-frequency ticks. It was something that gave encoders fits. So lots of messing around was done, optimizing, an alternate preset that became the default for a bit, all to try and make it better with that test, which it did... But then when a wider listening test was done on actual musical material, it had degraded the perceptual quality on that. The test case wasn't a good representation of what was actually going on.
 
Here's the thing: That is about FRAMERATE, not INPUT LAG which is what we were discussing.
Dude it's a forum. I replied to a user talking about a video about framerate, with the words "once it's above 100hz I can't really utilize whatever advantage anything higher would give" in his post.

That is what I was replying to. What I was quoting. Not your imaginary argument that I didn't reply to. I didn't say anything about input lag, and I don't think the input lag on this monitor is an issue.

Not sure where you got the idea that every reply is going to be in whatever bucket you want it to be.
 
That previous example I replied with also works in the opposite scenario. If their ping times were reversed, player A on his client would see his shot hit player B but the server would interpolate it as a miss and player B would be safely behind the wall even though player A "shot first" on his machine.

You can shoot someone faster on your client in an online game and still miss due to interpolation as well as other biased netcode decisions. In other games' netcode, shots fired within the same tick will both hit, even if one or both kill the other shooter (and even if one of the players "shot first" within that tick). Higher tick rate makes the results a bit less loose the faster it is, so it is appreciable on higher tick servers, but it's still not as fast as it could be currently. Especially on low tick servers, things like high fire rate weapons can fire what are essentially double damage (or more) "super bullets" per tick since the fire rate far exceeds the tick rate. And these things don't just apply to shooting; they affect all movement and activity - turning a corner/doorway/window peek, starting to run or stopping, free-falling off of a building or drop-off, driving vehicles, picking up ammo or resources, reload initiation and rate, healing/reviving initiation and rate, everyone on the server's location and state data, etc. This is further complicated by the fact that some games have lower tick rates for certain abilities and spells than for their player location/state data. Considering these factors among other online game/code dynamics and decisions, shooting (and everything else) in an online game isn't processed as accurately as in a local game.
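To put a rough number on the "super bullets" point, here's a tiny sketch counting how many shots land inside each server tick. The fire rate and tick rate are assumed figures, purely for illustration:

```python
# Count how many shots fall inside each server tick when the fire rate
# exceeds the tick rate. All figures are assumed, purely to show how
# multiple shots can get bundled into a single tick's damage update.

fire_rate_rpm = 1800                      # assumed weapon fire rate
tick_rate_hz = 20                         # assumed server tick rate
shot_interval = 60_000 / fire_rate_rpm    # ms between shots -> ~33.3 ms
tick_interval = 1000 / tick_rate_hz       # ms between ticks -> 50 ms

shots = [i * shot_interval for i in range(30)]   # one second of sustained fire
per_tick = {}
for t in shots:
    tick = int(t // tick_interval)
    per_tick[tick] = per_tick.get(tick, 0) + 1

print(per_tick)  # alternates 2, 1, 2, 1, ...: some ticks carry two shots' worth
```

On a higher tick server the same weapon spreads its shots over more ticks, which is part of why tick rate shapes how damage lands.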

---------------------------------------------------------------------------

Here's the thing: That is about FRAMERATE, not INPUT LAG which is what we were discussing.

My point is that you can "bench test" system specs locally (tiny differences in already low input lag, much higher fpsHz than 120fpsHz, etc.), but playing online, due to its limitations, more than muddies the results, making what can be very appreciable local increases less meaningful in online gaming compared to LAN, single player games, or vs local bots. It's just the nature of online gaming and server code. These specs are marketed in advertising as if there's a 1:1 relationship when playing competitive games online. That is false.

There are two sides going on for every player's machine: local, and what the server manufactures as the result at its own rate. This mismatch is happening all the time, but it's only in the worst, most obnoxious occurrences, or when the man behind the curtain shows more obviously so to speak, that people call it out as "rubberbanding", "peeker's advantage", and things like high fire rate weapons ending up across fewer ticks resulting in "super bullets" (like time slips in movies), among other issues. The thing is, in online gaming it's always rubberbanding and throwing manufactured results back at you; it's just a matter of how long or short that rubber band is and what kind of guesswork biases are coded in one way or another.
 
Last edited:
You can grab custom Windows ClearType files now to help alleviate the issue a bit. Can't remember where I saw that posted though.
Any idea where you saw that man? Can't find anything myself and would like to give it a try
 
I'm just looking at the difference in light of concerns from some reviews that it is noticeably slow in its response. From the look of those comparisons it isn't, and it should feel fast and responsive, which is all that really matters.
..

Who can actually tell a difference of 4.7ms input lag? Unless you have the reflexes of a fly I don't think a 4.7ms is a meaningful amount of input delay.
Just because you're unable to consciously perceive the input delay doesn't mean it doesn't matter. If you can get an advantage over someone in a competitive game it can be a pretty big deal if all other things are equal.
Say you're playing counter strike against someone, you both come around the corner at the same time and see each other, you both react at exactly the same speed but the other person had a 10 ms input lag advantage. You just died and they lived.

I doubt we can. However it seems like lag measurements from reviews are all over the place, I think in part because it is very hard to do well (accurate detection of things over that short a timescale is hard) and there is no standard. Some people seem to think this monitor is on the slower side, zio's results seem to show that is not the case.

Ya but let's not pretend like that is a big issue for a few reasons:

1) Not everyone plays competitive twitch games. I see far too much of the assumption that the be-all, end-all for gaming is fast response, as though all anyone does is play online twitch shooters. Yet when we look at what is being sold and played, we see that there are all kinds of games it doesn't apply to. So while it might be valid IF you are the kind of person who does play that, it isn't something everyone needs to get worked up about.

2) Even if you do play those games, you have to ask yourself if your skill is at such a level that it matters. Skill and tactics are far and away more important than response time in how well you'll do. So if you are a CSGO player sitting at Silver 3, the thing keeping you from Global Elite is you, not an extra 6ms of response time in your monitor. Only if you are already performing at extremely high levels is it a huge deal.

3) You are still going to be the biggest factor in this. Human response time is garbage. Usually in the realm of 200ms for a visual stimulus. There's also a lot of variability in it for an individual. So even if your reaction time averages, say, 150ms you will discover it is not precisely 150ms but rather a range of values from something like 120-170ms.


I'm not saying it isn't something to consider, particularly if competitive shooters are what you do and you are good at them. But if that isn't what you do it isn't such a huge concern, and if you are pretty bad, don't assume better gear will make a huge difference; it is you that you need to work on.


The video pretty clearly shows, even with average people, that the difference between 60Hz and 120Hz is pretty big when it comes to hitting certain types of shots.

However, from 120Hz to 240Hz the difference is very small, even with a pro player.
It's weird to me how people look at what seems like 100% clear data to me, and then just come out with completely random opposite conclusions.

IMO this video shows that increasing past 144Hz provides a statistical advantage to players of all skill levels, but actually makes the biggest difference the less skilled you are. That actually makes far more sense; it's not about "utilizing" (it's a display, not an input device). Highly skilled players are much better at predicting positions, so they don't need to SEE those positions as much. But if you're not skilled at predicting, seeing is everything.

Linus showed big improvements on tests where the pro gamers showed little or even negative improvement.

You can't tell me that doubling your hits on the double-door test is nothing. That basically means in any situation where you only see the enemy for a fraction of a second, 240Hz helps you a huge amount unless you're already skilled at prediction.

[attached: screenshots of the test results]

Now personally, I'm not all about competitive FPS play, it's not that high a priority to me. But if you play a lot of them, then higher refresh rates offer a clear and proven advantage. FPSes aren't incremental games. Either you hit or you miss. That means even small improvements can lead to significant changes in hit rates.

If you shoot someone faster, you shoot someone faster. Ping has nothing to do with anything in this conversation. Same thing to the guy that brought up tickrate earlier.

======================================

I never imagined in a million years that the first QD-OLED monitor thread would devolve into a debate over what's important in competitive play.

I'm impressed.


...........................

When a new device hits the scene people pick apart its specs. Gaming monitors' specs are typically heavily marketed toward online gaming performance and scoring advantage, so people will make comparisons to other screens and split hairs.

However, after replies like those, including people posting LOCAL reaction tests at different fpsHz thresholds, claiming whoever shoots first on the local client always shoots first on the game server and that it's WYSIWYG online, or that very high fpsHz and small differences in already low input lag give you an appreciable reaction advantage in *online* games, etc., I thought I'd speak up with some general info about how online game servers are very different from local spec testing, "reaction benching", etc. Similarly to some of the replies in this thread, "extreme" gaming monitor specs are typically marketed by companies as a 1:1 advantage in online gameplay. That is false. If I ever see things like that thrown around I'll always speak up in whatever thread it happens to be in, even if it's the first consumer holographic MR rig :ROFLMAO:
 
Last edited:
I never imagined in a million years that the first QD-OLED monitor thread would devolve into a debate over what's important in competitive play.

I'm impressed.
I love the guy who thinks tick rate and ping don't matter in online play, just who clicks faster. :ROFLMAO:. elvn provides good insight as always. I'm often too lazy these days to explain tech stuff :p.
 
I love the guy who thinks tick rate and ping don't matter in online play, just who clicks faster. :ROFLMAO:. elvn provides good insight as always. I'm often too lazy these days to explain tech stuff :p.
No one said tick rate and ping don't matter, they're irrelevant in a conversation about hardware input lag. Learn to read.
 
I'm glad the only thing to argue about is a pitifully small input lag. It tells me I'm going to love this monitor if it ever ships.
 
No one said tick rate and ping don't matter, they're irrelevant in a conversation about hardware input lag. Learn to read.
You were quoted as saying so. No, they're not irrelevant in a discussion about online game play and input lag's effects on it. Learn to stop being a compulsive liar.
 
I'm glad the only thing to argue about is a pitifully small input lag. It tells me I'm going to love this monitor if it ever ships.

This isn't even worth arguing about. Maybe the triangle subpixel structure is, but certainly not the input lag.



Lol, so if my monitor back then, the Asus VG248QE, did not have 5.1ms of input lag, then maybe that wouldn't have happened to me, right? :ROFLMAO:

[attached: screenshot]
 
You were quoted as saying so. No, they're not irrelevant in a discussion about online game play and input lag's effects on it. Learn to stop being a compulsive liar.
I said it has nothing to do with this conversation. Your inability to differentiate between relevancy and importance is not my problem. https://www.dictionary.com/

Hardware input lag is about how fast you can interface with your own physical components and is something you have complete control over. Tickrate and ping are about how accurately and how fast those actions are received and interpreted by a server which you have no control over*. None of these things have any bearing on the other. Yes, they all affect gameplay. Lots of things affect gameplay, that doesn't mean all of those things affect each other or are all relevant or meaningfully related.

This is why when reviewers test for input lag they don't have charts with ping and tickrate-induced delay added to them, because that would be incredibly stupid and makes no sense.

*Yes, you can control tickrate in some games that support player-hosted dedicated servers, is that relevant for most gamers today playing things like CS:GO and Apex Legends on first party servers? No, so don't try to "gotcha" me there. Yes, you can physically move across the country to significantly reduce your ping. Is that a practical solution for most gamers? No, so don't try to "gotcha" me there either.

It's a shame I have to waste my time and effort on shit like this but that's what happens when you're talking to someone that isn't interested in having an honest discussion. Back to the block list you go, don't know why I ever took you off of it.
 
Back to the block list you go, don't know why I ever took you off of it.
Oh no, not the block list of some random insulting guy who can't understand basic tech and rarely posts! However shall I survive? P.s. You literally just explained how the factors are intertwined. You're obviously a very confused person who's hypocritical to the point of absurdity to try to "win" an imaginary argument.
 
Oh no, not the block list of some random insulting guy who can't understand basic tech and rarely posts! However shall I survive? P.s. You literally just explained how the factors are intertwined. You're obviously a very confused person who's hypocritical to the point of absurdity to try to "win" an imaginary argument.
Odellus is on point though while you're derailing the thread. Nothing constructive to add? Please G T F O.
 