Dell Alienware AW3423DW 34″ QD-OLED 175Hz (3440 x 1440)

This is the workbench light on
A10D0A8F-416E-4D0F-B9B8-1F8910E5D4FD.jpeg


And this the workbench and overhead light on
CF4F6DD0-8AFD-4ACF-98C7-279387E27D9F.jpeg


I took the shots at an angle since it's like a mirror with an all-black screen and the lights on.
But it still looks black to me (the photos make it look a bit grey, but not in real life), not like the much lighter grey I've seen posted.
 
Seems like the majority of owners here are just enjoying their monitors/panels. Looking at all the reviews, primarily on YT, it's interesting to see how others perceive the technical data and the subjective portions.
 
But Hardware Unboxed showed us that if you put a 1000W spotlight 12 inches in front of your monitor and point it at it, it will look worse than an IPS panel; clearly your images have been edited to deceive us.
To be fair, that is a benefit of IPS, and actually lets FALD IPS perform better. If it's in a bright room it masks the FALD blooming.
 
To be fair, that is a benefit of IPS, and actually lets FALD IPS perform better. If it's in a bright room it masks the FALD blooming.
I'm not sure it's IPS specifically so much as LCDs having a polarization layer. LCDs inherently need polarized light to function, so they have polarization layers (two of them, actually). That is also going to help with glare on the screen, the same reason we have polarized sunglasses. However, it is going to decrease viewing angles some, just because of how light works. So while Dell/Samsung could put a polarization layer on this OLED, it would hinder the viewing angles a touch. Since OLED doesn't need a polarizer (you can have one, my phone does), and since QD-OLED has the advantage of very wide dispersion of light and thus wide viewing angles, they seem to have decided to leave it off and maximize that.
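For intuition on the light cost of a polarizer (just textbook numbers, nothing measured on this panel): an ideal linear polarizer passes about half of unpolarized ambient light, and Malus's law describes what already-polarized light does at a second polarizer. A tiny sketch:

```python
import math

def malus(i_polarized: float, angle_deg: float) -> float:
    """Malus's law: transmitted intensity of polarized light through an ideal
    linear polarizer whose axis is angle_deg away from the light's polarization."""
    return i_polarized * math.cos(math.radians(angle_deg)) ** 2

ambient = 100.0                       # arbitrary units of unpolarized room light
after_front_polarizer = ambient / 2   # an ideal polarizer passes ~50% of unpolarized light

print(after_front_polarizer)             # 50.0
print(malus(after_front_polarizer, 0))   # 50.0 -> aligned polarizers pass it all
print(malus(after_front_polarizer, 45))  # 25.0 -> 45 degrees passes half
print(malus(after_front_polarizer, 90))  # ~0.0 -> crossed polarizers block it
```

That roughly 50% hit on light is the sort of trade-off Samsung presumably weighed against the wide-viewing-angle benefit of leaving the polarizer off.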
 
I just ordered mine a few minutes ago. Sometime around June 18th for delivery. I currently have the ASUS ROG Swift 32" which is great and will still be used as my secondary monitor. I love the clarity and black levels of OLED and cannot wait to experience this on my PC, in a size that fits on my desk. 42" and 48" are just too large for me for sitting on my desk and gaming. I ended up moving the 43" Aorus to my work setup as it is much easier to keep track of code, PRs, and GitHub issues on a monitor that large :)

Also, thanks for all the fairly in-depth info on the monitor in this thread! Sadly HardOCP went away (mostly), but this forum is still a godsend when it comes to nitty-gritty information. Reading through the sheer volume of info in this thread, the pros of this monitor outweighed the cons for me.
 
I placed an order the other day. I expect to receive mine sometime in June.
I'm currently using AW3821DW with LG OLED E9. Once AW3423DW arrives, AW3821DW will be moved to second PC.
 
Both the Sony A95K and Samsung S95B QD OLED TVs have the exact same polarisation traits as this monitor, I wonder if they'll catch the same flack.
 
Likely not as bad, since it's not as large as a TV, and a TV is typically sitting in a room with a lot more light.
 
I also placed my order the other day. Thanks for all the info in this thread. Mine says late June also. If anyone gets theirs sooner, please let us know. I can't wait to check this thing out.
 
Here is Techspot's review of it. https://www.techspot.com/review/2445-alienware-aw3423dw-oled/
Quick Summary:
We think a lot of people were hoping this would be the ultimate display to use for everything: HDR gaming, watching movies, web browsing, desktop apps, spreadsheets, you name it. But in our opinion, we're not there yet.

First-gen QD-OLED panels don't offer that level of versatility, so we feel there's a little bit of early adopting pain going on with the AW3423DW. The active fan with audible fan noise, the odd screen coating and layer setup leading to grey blacks at times, the lack of HDMI 2.1, weird gamma calibration, the teething issues with a non-standard subpixel layout… we reckon many of these problems will be resolved in future iterations.
 
Here is Techspot's review of it. https://www.techspot.com/review/2445-alienware-aw3423dw-oled/
Quick Summary:
We think a lot of people were hoping this would be the ultimate display to use for everything: HDR gaming, watching movies, web browsing, desktop apps, spreadsheets, you name it. But in our opinion, we're not there yet.

First-gen QD-OLED panels don't offer that level of versatility, so we feel there's a little bit of early adopting pain going on with the AW3423DW. The active fan with audible fan noise, the odd screen coating and layer setup leading to grey blacks at times, the lack of HDMI 2.1, weird gamma calibration, the teething issues with a non-standard subpixel layout… we reckon many of these problems will be resolved in future iterations.
Something to keep in mind: both Tim and Steve from Hardware Unboxed are feature editors & reviewers for Techspot.com. So this is basically Tim's written version of his Hardware Unboxed YouTube review.
 
First-gen QD-OLED panels don't offer that level of versatility, so we feel there's a little bit of early adopting pain going on with the AW3423DW. The active fan with audible fan noise, the odd screen coating and layer setup leading to grey blacks at times, the lack of HDMI 2.1, weird gamma calibration, the teething issues with a non-standard subpixel layout… we reckon many of these problems will be resolved in future iterations.
The subpixel layout may be something here to stay, much like the RGBW layout of LG panels. I can't imagine Samsung did it glibly and with no reason, there is probably a very good technical reason for why they chose it and that may not change. Fans are something I wonder about, they may end up becoming a feature that hangs around for desktop OLEDs to allow greater brightness, even for monitors without a Gsync module. We'll have to see. HDMI 2.1 is for sure something we'll get with future displays, though I'm not quite sure why it is such a big deal to these guys.
 
Does it really take that long to get the screen from ordering to receiving? I don't really want to wait until June... I heard BestBuy will be getting them soon.
 
Does it really take that long to get the screen from ordering to receiving? I don't really want to wait until June... I heard BestBuy will be getting them soon.
Seems that they had a ton of orders from the get go and just depleted inventory before the monitor was actually released.
 
Glad I decided to wait. Sounds like there are some teething issues with QD-OLED. I’ll jump in with the next models.
 
Both the Sony A95K and Samsung S95B QD OLED TVs have the exact same polarisation traits as this monitor, I wonder if they'll catch the same flack.
It could, but if you're getting an OLED, bright lighting scenarios are already a known thing to keep an eye on.
Does it really take that long to get the screen from ordering to receiving? I don't really want to wait until June... I heard BestBuy will be getting them soon.
Is this from an official source or just hearsay? If Dell is having trouble getting these out, surprised BB would have any. Which makes me wonder when and if Samsung will release their version.
Glad I decided to wait. Sounds like there are some teething issues with QD-OLED. I’ll jump in with the next models.
Hopefully some of those things are fixed by then, but I am sooooo glad to have this now (replaced an X34 from 7 years ago). The cons so far are negligible for my use case and this is far better than the IPS/VA alternatives out right now at the same resolution.
 
I heard from a BestBuy employee when I was there a week ago. Could be hearsay but he seemed to know exactly what I was talking about and said they should have them "in a couple of weeks". I, too, am skeptical but I'm willing to wait it out and see if it's true. <shrug>
 
Dell has sent at least some allocation to other retailers, because I know a few people who ordered from Amazon when it popped up in stock on March 23rd. Since that initial batch was available to order, Amazon has had no confirmed ETA, probably because Dell themselves are clueless as to when they'll have more.
 
The subpixel layout may be something here to stay, much like the RGBW layout of LG panels. I can't imagine Samsung did it glibly and with no reason, there is probably a very good technical reason for why they chose it and that may not change. Fans are something I wonder about, they may end up becoming a feature that hangs around for desktop OLEDs to allow greater brightness, even for monitors without a Gsync module. We'll have to see. HDMI 2.1 is for sure something we'll get with future displays, though I'm not quite sure why it is such a big deal to these guys.
For Samsung maybe, for LG maybe not according to this article: https://t.co/y3ldCXsA0u

Nvidia has adamantly refused to upgrade the G-Sync module. It has a miserably low number of input options, and while that's fine for gamers, it's not for those of us who want to connect multiple computers. The lack of HDMI 2.1 comes from there. Similarly, the only reason these things have a fan is the G-Sync module. The Freesync displays meanwhile need no such thing while performing similarly (no, not identically, before someone starts frothing at the mouth), which leads me to believe the G-Sync module is a design that just happens to run hot. I really dislike the idea of fans in a desktop display especially when replacing them is not going to be easy. Mechanical moving parts are often the first to go bad.

Microsoft could actually solve the subpixel issues, either by providing ClearType modes better suited to the odd QD-OLED subpixel layout, or by improving the grayscale font smoothing option to be closer to what macOS does, in which case there would be no issue: my LG OLED works without problems on macOS but not on Windows, where I instead use ClearType adjustments on RGB font smoothing combined with MacType and DPI scaling to mitigate the problem. If Samsung starts making 4K 27", 32" or 42" panels in the next few years, those should have no real issues with text rendering, as DPI scaling can mitigate it pretty decently.

I agree with that Techspot review that the Dell is the early adopter option and it's going to have some growing pains. I can understand people jumping on it left and right when the monitor market is garbage.
 
You can grab custom Windows ClearType files now to help alleviate the issue a bit. Can't remember where I saw that posted though.
 
Someone was asking about ambient lighting making the screen grey on another forum so I took these two photos earlier.
I have my really bright LED shop light on behind me and the screen still looks fine:
View attachment 462789

View attachment 462790
Nice setup and I'm really tempted to place an order for that monitor. Sidenote, seeing your keyboard sure makes me miss my old Logitech with the lcd screen.
 
I really dislike the idea of fans in a desktop display especially when replacing them is not going to be easy. Mechanical moving parts are often the first to go bad.

The reason that they may stay around isn't G-Sync, but brightness. OLEDs use a lot of power to get as bright as they do and there is interest in making them brighter still. For that to work well, they will need cooling to prevent issues with image retention/burn-in. It can be done with a large enough heatsink, of course, but a fan may be the solution of choice. Sometimes the heat output and size of something is going to make engineering go for a fan. My Denon X4400 is like that, the first receiver I've ever had from them with a fan. They always tried to cool passively. However, they would either need fewer amp channels, less DSP, or a larger case to go passive, and none of those were things they were willing to do, so it has fans.

Microsoft could actually solve the subpixel issues, either by providing ClearType modes better suited to the odd QD-OLED subpixel layout, or by improving the grayscale font smoothing option to be closer to what macOS does, in which case there would be no issue: my LG OLED works without problems on macOS but not on Windows, where I instead use ClearType adjustments on RGB font smoothing combined with MacType and DPI scaling to mitigate the problem. If Samsung starts making 4K 27", 32" or 42" panels in the next few years, those should have no real issues with text rendering, as DPI scaling can mitigate it pretty decently.

I wouldn't hold your breath; MS seems to really not be interested in working on their font rendering these days. They haven't done any updates to ClearType in years, as near as I know.
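For anyone curious what their Windows install is set to right now, here's a small read-only sketch (assuming the standard HKCU\Control Panel\Desktop values that back the font-smoothing settings: 1 = standard/grayscale, 2 = ClearType). I'd still make actual changes through the ClearType Text Tuner rather than by hand:

```python
import winreg

# Read-only peek at the current Windows font smoothing configuration.
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop") as key:
    enabled, _ = winreg.QueryValueEx(key, "FontSmoothing")            # "2" = smoothing on, "0" = off
    smoothing_type, _ = winreg.QueryValueEx(key, "FontSmoothingType")  # 1 = standard, 2 = ClearType

print("font smoothing enabled:", enabled == "2")
print("mode:", {1: "standard (grayscale)", 2: "ClearType (subpixel)"}.get(smoothing_type, smoothing_type))
```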
 
I really dislike the idea of fans in a desktop display especially when replacing them is not going to be easy. Mechanical moving parts are often the first to go bad.

For what it's worth, my Acer X27's fan is still working even after 4 years. Although I don't know how much longer it will continue to work, and surely it has already failed on other people's units.
 
Imagine if more monitor reviewers included this information (Thank goodness for Rtings):
True, but I think the publications need to be contacted, otherwise they wouldn't know. And how about things such as VRR/G-Sync/FreeSync on/off during these tests, among other things? Rtings just recently included these, but moving forward I think it needs to be a staple for others as well. That being said, I'm going to contact them to include HDR on/off input lag results as well =P

For those wondering:
1649880868951.png
 
Hopefully some of those things are fixed by then, but I am sooooo glad to have this now (replaced an X34 from 7 years ago). The cons so far are negligible for my use case and this is far better than the IPS/VA alternatives out right now at the same resolution.
I too am still using the X34 from 7 years ago; I'm typing this message on it now.
So you feel the move from the X34 to the AW3423DW was a worthwhile upgrade, even though it's the same res and size?
 
True, but I think the publications need to be contacted, otherwise they wouldn't know. And how about things such as VRR/G-Sync/FreeSync on/off during these tests, among other things? Rtings just recently included these, but moving forward I think it needs to be a staple for others as well. That being said, I'm going to contact them to include HDR on/off input lag results as well =P

For those wondering:
View attachment 463472
I've been begging (contacted directly) Tim from Techspot/HW unboxed and literally everyone else that has reviewed this monitor to include more detailed input lag testing. No Bueno.
 
I too am still using the X34 from 7 years ago; I'm typing this message on it now.
So you feel the move from the X34 to the AW3423DW was a worthwhile upgrade, even though it's the same res and size?
For my use case, yes. I am still happy with 3440x1440 along with wanting high refresh. My original setup of a 4790 & 980 Ti with the X34 was just fine, but obviously even back then I would need to turn down settings. In 2020 I updated to a 5900X/3080 combo, which was finally able to utilize the X34 to its potential.

I have an OLED TV but have always wanted that HDR capability along with wonderful blacks/contrast ratio in monitor format and it's finally here.

That being said, the cons are real. The question is, how much is it a compromise to you? Below I think are the important things that will vary per individual.
  • Fan noise - Two different types; when it kicks in it may be audible in silent scenarios, but I wear headphones the majority of the time. Even when on, it does not bother me.
  • Text fringing - I notice it only at extremely close distances, so it's a non-issue for me.
  • Blacks - May appear gray depending on lighting. I have a controlled lighting scenario, so it's not a big issue for me.
  • Brightness - I use mine at 55-70 brightness for SDR (100-150? nits), so it's not an issue, along with having a controllable lighting scenario.
X34 vs AW image comparison. Have the AW right above X34 and made it as close as possible to real viewing condition
Album Comparison

x34 (2015) vs AW at native 3440x1440p resolution
G7 1440p vs AW at 2560x1080p resolution to keep both even(GPU Scaling). If anyone can actually figure out how to have the AW natively(Active Signal Resolution) output 2560x1440p, let me know so I can record again.
 
Nice setup and I'm really tempted to place an order for that monitor. Sidenote, seeing your keyboard sure makes me miss my old Logitech with the lcd screen.
I too liked the LCD screens on the old Logitechs. This is actually a Corsair iCue Nexus touch screen just sitting on my Logitech G815.
The screen is kinda pricey though at $100.
IMG_0851.JPEG
 
Brightness - Have a controlled situation so totally not a problem.

This I think is less likely to be a problem than most people think. I guess it depends on how bright you like your monitor versus your background, but at work, with overhead fluorescents, I measured my LCD at under 200 nits when I had set it to a comfortable level. I'm not saying you can't find really bright scenarios where you want more (my laptop goes up to 400+ nits max and I've used that in airports when there is sunlight streaming in the windows), but I think even when you are in a lit room you probably aren't going to exceed the cap on this thing.

X34 vs AW image comparison. Have the AW right above X34 and made it as close as possible to real viewing condition
Album Comparison

x34 (2015) vs AW at native 3440x1440p resolution
G7 1440p vs AW at 2560x1080p resolution to keep both even(GPU Scaling). If anyone can actually figure out how to have the AW natively(Active Signal Resolution) output 2560x1440p, let me know so I can record again.

So correct me if I'm wrong but it looks like the AW is noticeably faster than the x34, a frame or maybe more and I'm having trouble seeing any difference between it and the G7. That being the case I'd say the lag is not a big deal on it.
 
This I think is less likely to be a problem than most people think. I guess it depends on how bright you like your monitor versus your background, but at work, with overhead fluorescents, I measured my LCD at under 200 nits when I had set it to a comfortable level. I'm not saying you can't find really bright scenarios where you want more (my laptop goes up to 400+ nits max and I've used that in airports when there is sunlight streaming in the windows), but I think even when you are in a lit room you probably aren't going to exceed the cap on this thing.

So correct me if I'm wrong but it looks like the AW is noticeably faster than the x34, a frame or maybe more and I'm having trouble seeing any difference between it and the G7. That being the case I'd say the lag is not a big deal on it.
For brightness, I'll update my post, but it's mostly for people who like to blast theirs at 250+ nits by preference and/or due to their room lighting scenario.

That video was taken with an N20 at 960 FPS, which makes the differences more visible. In real-time use, I'd say the bigger difference is in feel and how much clearer the AW is vs the X34.

Between the G7 & AW, I'd say the G7 is faster in response (negligible IMO) while the AW looks much clearer to me. It's when you start going into lower refresh rates that the clarity of the AW pulls ahead of the G7. If you can maintain 200+ fps on the G7, clarity is great of course. As a pure gaming monitor, since having the AW I have appreciated the G7 even more considering it is VA tech.
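As a side note on reading that 960 FPS footage (my arithmetic, not zio's methodology): each captured frame spans about 1.04 ms of real time, so a frame count between the two screens converts straight to milliseconds:

```python
CAMERA_FPS = 960                          # slow-motion capture rate
MS_PER_CAMERA_FRAME = 1000 / CAMERA_FPS   # ~1.04 ms of real time per captured frame

def offset_ms(frames_counted: int) -> float:
    """Convert a frame offset counted in the 960 fps footage to milliseconds."""
    return frames_counted * MS_PER_CAMERA_FRAME

for frames in (1, 5, 13):
    print(f"{frames:>2} camera frames ~ {offset_ms(frames):.1f} ms")
# 1 ~ 1.0 ms, 5 ~ 5.2 ms, 13 ~ 13.5 ms
```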
 
For brightness, I'll update my post, but it's mostly for people who like to blast theirs at 250+ nits by preference and/or due to their room lighting scenario.
For sure, just noting that 250 is more than most people think. I think there is a worry from some that they need more brightness than they actually do, since they don't know what the actual brightness output of their monitor is at a given setting, as most people don't have (or need) a meter. It is for sure something worth noting, since compared to modern LCDs it is low: they are usually 400-ish nits for cheaper ones, and higher for ones that do HDR.

It is just an issue I feel that some reviews make more of a deal of than they should, and that individuals are more worried about than they should be.

That video was taken with an N20 at 960 FPS, which makes the differences more visible. In real-time use, I'd say the bigger difference is in feel and how much clearer the AW is vs the X34.

Between the G7 & AW, I'd say the G7 is faster in response (negligible IMO) while the AW looks much clearer to me. It's when you start going into lower refresh rates that the clarity of the AW pulls ahead of the G7. If you can maintain 200+ fps on the G7, clarity is great of course. As a pure gaming monitor, since having the AW I have appreciated the G7 even more considering it is VA tech.

I'm just looking at the difference in light of concerns from some reviews that it is noticeably slow in terms of response. From the look of those comparisons, it isn't, and it should feel fast and responsive, which is all that really matters.
 
I'm just looking at the difference in light of concerns from some reviews that it is noticeably slow in terms of response. From the look of those comparisons, it isn't, and it should feel fast and responsive, which is all that really matters.

Who can actually tell a difference of 4.7ms of input lag? Unless you have the reflexes of a fly, I don't think 4.7ms is a meaningful amount of input delay.
 
Who can actually tell a difference of 4.7ms of input lag? Unless you have the reflexes of a fly, I don't think 4.7ms is a meaningful amount of input delay.

Just because you're unable to consciously perceive the input delay doesn't mean it doesn't matter. If you can get an advantage over someone in a competitive game it can be a pretty big deal if all other things are equal.
Say you're playing counter strike against someone, you both come around the corner at the same time and see each other, you both react at exactly the same speed but the other person had a 10 ms input lag advantage. You just died and they lived.
 
Who can actually tell a difference of 4.7ms of input lag? Unless you have the reflexes of a fly, I don't think 4.7ms is a meaningful amount of input delay.
I doubt we can. However, it seems like lag measurements from reviews are all over the place, I think in part because it is very hard to do well (accurate detection of things over that short a timescale is hard) and there is no standard. Some people seem to think this monitor is on the slower side; zio's results seem to show that is not the case.
Just because you're unable to consciously perceive the input delay doesn't mean it doesn't matter. If you can get an advantage over someone in a competitive game it can be a pretty big deal if all other things are equal.
Say you're playing counter strike against someone, you both come around the corner at the same time and see each other, you both react at exactly the same speed but the other person had a 10 ms input lag advantage. You just died and they lived.
Ya but let's not pretend like that is a big issue for a few reasons:

1) Not everyone plays competitive twitch games. I see far too much of the assumption that the be-all, end-all for gaming is fast response, as though all anyone does is play online twitch shooters. Yet when we look at what is being sold and played, we see that there are all kinds of games that it doesn't apply to. So while it might be valid IF you are the kind of person who does play that, it isn't something everyone needs to get worked up about.

2) Even if you do play those games, you have to ask yourself if your skill is at such a level that it matters. Skill and tactics are far and away more important than response time in how well you'll do. So even if you are a CSGO player, if you are Silver 3, the thing keeping you from Global Elite is you, not an extra 6ms of response in your monitor. Only if you are already performing at extremely high levels is it a huge deal.

3) You are still going to be the biggest factor in this. Human response time is garbage. Usually in the realm of 200ms for a visual stimulus. There's also a lot of variability in it for an individual. So even if your reaction time averages, say, 150ms you will discover it is not precisely 150ms but rather a range of values from something like 120-170ms.
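To put rough numbers on that point (a quick back-of-the-envelope sketch; the 150 ms mean and 15 ms spread are illustrative, not measured), here's a simulated duel where both players react identically on average but one carries an extra ~4.7 ms of display lag:

```python
import random

MEAN_REACTION_MS = 150.0    # illustrative average human visual reaction time
REACTION_SD_MS = 15.0       # illustrative trial-to-trial variability
EXTRA_DISPLAY_LAG_MS = 4.7  # the display-lag difference being argued about
TRIALS = 200_000

slower_monitor_wins = 0
for _ in range(TRIALS):
    fast_monitor_player = random.gauss(MEAN_REACTION_MS, REACTION_SD_MS)
    slow_monitor_player = random.gauss(MEAN_REACTION_MS, REACTION_SD_MS) + EXTRA_DISPLAY_LAG_MS
    if slow_monitor_player < fast_monitor_player:
        slower_monitor_wins += 1

print(f"player with the extra lag still wins {slower_monitor_wins / TRIALS:.1%} of duels")
# With these assumptions it lands around 41%: the lag shifts the odds a bit,
# but the player's own variability is a far bigger term.
```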


I'm not saying it isn't something to consider, particularly if competitive shooters are what you do and you are good at them, but if that isn't what you do it isn't such a huge concern; and if you are pretty bad, don't assume better gear will make a huge difference, it is you that you need to work on.


As an aside, it is strange to me how much time and stress people spend on display latency for competitive shooters while nobody pays any attention to audio latency. We react to audio stimuli faster than visual stimuli on average, so if absolute fastest reactions were the concern, I'd think low audio latency would be of interest. It is something that differs too: different sound cards have different amounts of buffering on the USB interface, processing, and the DA itself. There is quite a bit of information on it out there, as it matters a lot in the world of pro audio and stage reproduction (sound travels roughly a foot, about 1.1 ft, per millisecond in air, so you deal with it when synchronizing sound sources).

Ideally if a game were all about fastest reactions you'd want it to support ASIO output so you can bypass windows sound processing, which has its own buffering, and communicate directly with a soundcard. With ASIO and a good pro card you can get latencies under 1ms from the time the card receives the signal to the time the DA is generating voltage output.
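Putting numbers on the foot-per-millisecond rule of thumb (assuming ~343 m/s in room-temperature air):

```python
SPEED_OF_SOUND_M_PER_S = 343.0   # dry air at roughly 20 °C
METERS_PER_FOOT = 0.3048

def acoustic_delay_ms(distance_feet: float) -> float:
    """Propagation delay through air for a given speaker-to-ear distance."""
    return distance_feet * METERS_PER_FOOT / SPEED_OF_SOUND_M_PER_S * 1000

for feet in (1, 3, 10):
    print(f"{feet:>2} ft -> {acoustic_delay_ms(feet):.2f} ms")
# 1 ft ~ 0.89 ms, 3 ft ~ 2.67 ms, 10 ft ~ 8.89 ms: sitting a metre or two from
# your speakers already adds more delay than a sub-millisecond ASIO output path.
```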
 
Just because you're unable to consciously perceive the input delay doesn't mean it doesn't matter. If you can get an advantage over someone in a competitive game it can be a pretty big deal if all other things are equal.
Say you're playing counter strike against someone, you both come around the corner at the same time and see each other, you both react at exactly the same speed but the other person had a 10 ms input lag advantage. You just died and they lived.

Yeah I'm not even gonna bother trying to explain why a few ms of input lag isn't going to matter. You can read elvn's essay if you want.

Rollback, aka rubberbanding, is attempting to keep you up to the tick rate to compensate for ping times; it's not compensating to keep everyone else up to your local fps/Hz rate. At best you would theoretically get the tick rate of the server, which with interp_2 on a 128-tick server is 15.6ms, but that is not how the formula works once your ping time is factored in. More likely, in the end you are getting several frames of your local game between each server state update. For example, the game server could be sending you game action slices only every ~55ms (interpolation time + ping) even if you are running at 120fps/Hz or 360fps/Hz. On servers with worse tick rates than 128-tick, which is most games, the interp time itself is much worse to start with before your ping time is factored in.

So you are running ~15.6ms of interp + your ping time on the best 128-tick servers, with some averaging done by the client across 3 frames (current plus 7.8ms x 2 at interp_2) that have been "buffered". By comparison, a LAN game typically has a latency of 3ms to 7ms.

.... So you are loading multiple ticks sent (with your latency), which on the best tick servers means a new packet + 7.8ms x 2 = 15.6ms of tick frames, then figuring out what happened between them. This is like buffering a few frames and making a frame out of the difference between them.


Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)

So you pull the trigger or peek around a corner = current server time (say 100min:00sec:00ms) minus packet latency (25ms to 40ms typical) + client view interp (15ms) =

128-tick at interp_2 = 15ms + (25-40ms) ping = 40ms to 55ms  →  5 to 7 frames before new action/world state data is shown / a new action registers (at a solid 117fps)
64-tick at interp_2 = 31.2ms + (25-40ms) ping = 56ms to 71ms  →  7 to 8 frames
22-tick at interp_2 = 90ms + (20-40ms) ping = 110ms to 130ms  →  13 to 15 frames
12-tick at interp_2 = 166ms + (20-40ms) ping = 186ms to 206ms  →  22 to 24 frames
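If you want to re-run that arithmetic yourself, here's a small sketch of the same Command Execution Time idea in code (nothing new, just the interp window + ping numbers quoted above converted to local frames at a solid ~117fps; it uses the exact two-tick interpolation windows, so the millisecond totals land a touch above the rounded figures while the frame counts come out the same):

```python
FRAME_MS = 1000 / 117   # local frame duration at a solid ~117 fps cap

# (tick rate, interp_2 window in ms = two ticks buffered, assumed ping range in ms)
SCENARIOS = [
    (128, 2 / 128 * 1000, (25, 40)),
    (64,  2 / 64 * 1000,  (25, 40)),
    (22,  2 / 22 * 1000,  (20, 40)),
    (12,  2 / 12 * 1000,  (20, 40)),
]

for tick, interp_ms, (ping_lo, ping_hi) in SCENARIOS:
    lo, hi = interp_ms + ping_lo, interp_ms + ping_hi
    print(f"{tick:>3}-tick: {lo:5.1f} to {hi:5.1f} ms before a new world state shows "
          f"= {lo / FRAME_MS:.0f} to {hi / FRAME_MS:.0f} local frames")
```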





... So the rollback you are talking about:
  • starts from the current server time (the server time logged after your latency-delayed packet was received)
  • minus your ping time (25ms - 40ms) plus your interpolation time (128-tick = 15ms, 64-tick = 31ms, 22-tick = 90ms, 12-tick = 166ms)

There's no factoring in of 120fps/Hz's 8.3ms frames or 360fps/Hz's 2.7ms frames. You are bound by the tick rate. The best tick servers effectively deliver new states at closer to 60fps/Hz, so higher fps/Hz isn't doing anything to better that. If anything, your latency is the factor (at least until there are someday higher-tick servers).





--------------------------

https://www.reddit.com/r/Overwatch/comments/3u5kfg/everything_you_need_to_know_about_tick_rate/



So you are never being fed new information, or having your new actions register (compensated across [[ two received 7.8ms tick action slices (15ms) + your local current frame state ]] on the client), fast enough to make 13.x ms of input lag a competitive factor in online gaming IMO, even if you had a ping of zero, and especially with online latency factored in.

Anyone testing would have to make sure that they are running 120fps (115 or 117fps capped) as a minimum, no averages, in order to get 8.3ms per frame at 120fps (or 8.7ms at 115fps) for the upper limit of what the monitor can do. However, a more realistic test would be a 90 to 100fps average, where people are using higher graphics settings on demanding games and relying on VRR to ride a roller coaster of frame rates.

-------------------------------------------
If someone is going overboard on graphics settings, or has a modest GPU and cranks up the graphics at 4K on a game so that they are getting, say, a 75fps average, their frame durations would then be somewhere in these ranges:

frame time:  25ms / 16.6ms  <<<  13.3ms  >>>  11.1ms / 9.52ms
at:          40fps / 60fps  <<<  75fps   >>>  90fps / 105fps
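(Those frame durations are just 1000 / fps; a quick sketch if you want other points on the curve:)

```python
for fps in (40, 60, 75, 90, 105, 120, 144, 175):
    print(f"{fps:>3} fps -> {1000 / fps:5.2f} ms per frame")
```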

------------------------------------------
https://win.gg/news/4379/explaining-tick-rates-in-fps-games-difference-between-64-and-128-tick
  • CSGO official matchmaking: 64-tick
  • CSGO on FACEIT: 128-tick
  • CSGO on ESEA: 128-tick

Valorant tick rates:
  • Valorant official matchmaking: 128-tick

Call of Duty: Modern Warfare tick rates:
  • COD multiplayer lobbies: 22-tick
  • COD custom games: 12-tick
While that sounds fast, many CSGO players have monitors capable of running at 144Hz. In simple terms, the monitor can show a player 144 updates per second, but Valve's servers only give the computer 64 frames total in that time. This mismatch in the server's information getting to the computer and leaving the server can result in more than a few issues. These can include screen tearing, a feeling like the player is being shot when protected behind cover, and general lag effects.
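Put concretely with the numbers from that quote (144Hz panel vs a 64-tick server), just to show the mismatch:

```python
REFRESH_HZ = 144
TICK_RATE = 64

print(f"new frame every {1000 / REFRESH_HZ:.1f} ms, new server update every {1000 / TICK_RATE:.1f} ms")
print(f"~ {REFRESH_HZ / TICK_RATE:.2f} monitor refreshes per server update")
# 6.9 ms vs 15.6 ms, about 2.25 refreshes per update: most refreshes just
# re-show interpolated state rather than genuinely new server data.
```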
---------------------------------------------------

You'd think that a tick of 128 would be 7.8ms and a tick of 64 would be 15.6ms, but it's not that simple... (see the article linked below)

----------------------------------------------------


http://team-dignitas.net/articles/b...-not-the-reason-why-you-just-missed-that-shot

Combining all of these factors, these tiny ms differences on the LG are really moot, especially if arguing latency regarding online (rather than LAN) gameplay and without using solid frame rates that don't dip below the max Hz of the monitor.
 