Latency: Why Typing on Old Computers Just Feels Better

Megalith

It seems that modern computers are so complex that there are just some things they can't do quickly. Dan Luu (a hardware/software engineer at Microsoft) had a nagging feeling that the computers of today felt slower than those he used as a kid, and decided to document the latency of an assortment of devices from 1977 to the present.

Almost every computer and mobile device that people buy today is slower than common models of computers from the 70s and 80s. Low-latency gaming desktops and the iPad pro can get into the same range as quick machines from thirty to forty years ago, but most off-the-shelf devices aren’t even close.
 
I have a stack of C64Cs and a 128D in my basement workroom, all in working order. However, with the state of C64 emulation (VICE is amazing) I haven't really bothered to hook them up lately. That said, I just saw an announcement that Limbo is being made for the C64, so I may hook one up when that gets released someday, just to see it.

Everything on those old machines was programmed in assembly or straight machine language (with the exception of some BASIC and such). The hardware was all tightly synchronized, and games were written in sync with the vertical refresh of the CRTs of the time. Joysticks were mostly un-multiplexed, and I'm sure keyboards were multiplexed through logic ICs or something like that too (I'd have to look up my old schematics, though). That's all pretty immediate.
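The keyboard multiplexing mentioned above can be sketched in a few lines. This is a hypothetical simulation, loosely modeled on the C64's 8x8 key matrix (the `read_columns` callback stands in for the hardware register read and is my own invention), not real hardware access:

```python
# Hypothetical sketch of a matrix keyboard scan, loosely modeled on the
# C64's 8x8 key matrix (the real machine drives one row at a time via a
# CIA port and reads the columns back on the other port).

def scan_matrix(read_columns):
    """Drive each row in turn and record which columns read back low.

    `read_columns(row)` stands in for the hardware register read; it
    returns an 8-bit value with a 0 bit for every pressed key in that row.
    """
    pressed = []
    for row in range(8):
        cols = read_columns(row)          # one register read per row
        for col in range(8):
            if not (cols >> col) & 1:     # active-low: 0 = key down
                pressed.append((row, col))
    return pressed

# Fake hardware: only the key at row 1, column 7 is held down.
fake = lambda row: 0b01111111 if row == 1 else 0xFF
print(scan_matrix(fake))                  # -> [(1, 7)]
```

The point is how little sits between the key and the CPU: one loop, a handful of register reads, no bus stack in the way.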
 
Low-latency gaming desktops and the iPad pro can get into the same range as quick machines from thirty to forty years ago, but most off-the-shelf devices aren’t even close.

Screw FPS-analyses. I want to see a typing latency breakdown for an i9/Titan V system....


Seriously though, I type for a living and this problem is exactly why I prefer to do most of my work from my gaming desktop rather than my laptop.
 
I still use a CRT as my primary competitive gaming monitor, so I can understand the fondness for tech that just gave better performance despite being replaced.

For those that didn't read the actual guts of the article: he's simply measuring the time from pressing a key to the keystroke being fully rendered on that device's display. I would be interested in an even more in-depth follow-up to this article with more of a gaming focus that explored the real latency of current setups. I bet a lot of people would be pretty blown away by the lackluster performance of their 'high end' gaming monitors.
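To make the "key press to fully rendered" idea concrete, here's a toy breakdown of where a modern machine's end-to-end latency could come from. Every stage figure below is an assumed round number for illustration, not a measurement from the article:

```python
# Illustrative (not measured) breakdown of keystroke-to-photon latency on
# a modern desktop. Every number is a rough assumption for the sake of
# the arithmetic, not data from the article.

stages_ms = {
    "keyboard matrix scan":  4,   # firmware scans the matrix periodically
    "USB polling interval":  4,   # ~8 ms worst case at 125 Hz, ~4 average
    "OS input + app event":  5,   # input stack, event loop, text layout
    "compositor + vsync":    12,  # waiting on the next refresh at 60 Hz
    "display response":      10,  # panel processing + pixel transition
}

total = sum(stages_ms.values())
print(f"rough end-to-end: {total} ms")   # -> rough end-to-end: 35 ms
```

The striking part is that no single stage is the villain; it's the stack of small waits that adds up.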
 
I can understand the fondness for tech that just gave better performance despite being replaced.

Unfortunately, mostly out of necessity in a lot of cases. Joysticks used to have one or two buttons and either two analog axes or four additional on-off directions. Not much to monitor there, and most of the time not multiplexed: one connection per control. Now controllers have so many controls that they have to be multiplexed, or managed by a microcontroller, in order to sort out all the I/O (as one example). Others include half-duplex serial buses (for convenience, à la USB). Monitor tech: well, aside from the obvious push for thin, flat displays, there were dangerous (and now much more regulated) chemicals and voltages involved in CRTs. Unfortunately, all of this progress, convenience, and additional control also has side effects like latency, additional layers of processing, etc.
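The multiplexing point can be illustrated with a tiny simulation of the classic latch-and-shift scheme, loosely styled after the NES pad's shift register (the button order and the `read_pad` helper are my own hypothetical stand-ins):

```python
# Sketch of how a multiplexed pad serializes many buttons over few wires,
# in the style of the NES controller's shift register: the console pulses
# LATCH to snapshot all buttons, then clocks out one bit per button.

def read_pad(buttons):
    """Simulate latch-then-shift readout of an 8-button controller.

    `buttons` is a dict like {"A": True, ...}; returns the 8-bit word the
    console would assemble after eight clock pulses.
    """
    order = ["A", "B", "Select", "Start", "Up", "Down", "Left", "Right"]
    word = 0
    for i, name in enumerate(order):      # one clock pulse per button
        if buttons.get(name, False):
            word |= 1 << i
    return word

state = {"A": True, "Right": True}
print(bin(read_pad(state)))               # -> 0b10000001
```

Eight buttons over three signal wires instead of eight dedicated lines: that's the trade the post describes, convenience for a few extra clock cycles of readout.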
 
He didn't explicitly state what keyboard he was using with the new computer, but does mention the ErgoDox. I looked it up, and it appears to be USB only. I wonder if PS/2 would make any difference?
 
Interesting reading, but honestly, that is all it is to me.

I used Apple IIe machines back in the day and used older IBM DOS machines when I was growing up. To me, those older machines are still wickedly slow when typing. The only time I notice lag or delays when typing on modern laptops/desktops is when you have autocorrect turned on and it fixes a word for you. But other than that, the keystrokes are near instant on just about every machine I have used.

If anything, it is the monitor technology that he is actually coming into contention with as far as typing speed goes. Comparing a CRT to an LCD, the LCD almost always has higher input lag. Many monitor sites specifically talk about these types of issues.
 
Aren't CRTs still superior when it comes to color depth as well? I remember viewing high-bit-depth images, along with some movies, and seeing obvious color banding on LCDs (this problem is easy to see in, say, sky shots). I still see a lot of color banding on today's LCDs, whereas a decent CRT was able to reproduce the colors perfectly without any banding. Now LCDs are finally getting, what, 10-bit color, whereas a 15-year-old CRT monitor could display 24-32-bit color? All of the other benefits of LCDs have obviously made CRTs a relic, but damn, the picture quality did look better.
 
Aren't CRTs still superior when it comes to color depth as well? I remember viewing high-bit-depth images, along with some movies, and seeing obvious color banding on LCDs (this problem is easy to see in, say, sky shots). I still see a lot of color banding on today's LCDs, whereas a decent CRT was able to reproduce the colors perfectly without any banding. Now LCDs are finally getting, what, 10-bit color, whereas a 15-year-old CRT monitor could display 24-32-bit color? All of the other benefits of LCDs have obviously made CRTs a relic, but damn, the picture quality did look better.

Depends on your interpretation of superiority. CRT monitors are constantly fading in brightness, which the eyes are more sensitive to than color.
This also means the brightness from top to bottom is never the same, so a picture is never "correct" in brightness: a yellow at the top of a CRT screen displays differently than the same yellow at the bottom of the screen.
 
Interesting reading, but honestly, that is all it is to me.

I used Apple IIe machines back in the day and used older IBM DOS machines when I was growing up. To me, those older machines are still wickedly slow when typing. The only time I notice lag or delays when typing on modern laptops/desktops is when you have autocorrect turned on and it fixes a word for you. But other than that, the keystrokes are near instant on just about every machine I have used.

If anything, it is the monitor technology that he is actually coming into contention with as far as typing speed goes. Comparing a CRT to an LCD, the LCD almost always has higher input lag. Many monitor sites specifically talk about these types of issues.

I still use a Commodore 64 and there's no latency whatsoever when typing unless I'm on a BBS at 300 baud.

Not too sure what's wrong with your Apple II's.
 
No one is going to point out every iOS device tested beat every Android device tested with scrolling latency? iOS “feels” better to a lot of people for a reason.
 
Seems to me it might have something to do with the complexity of the operating system. Yes, computers are much faster, but they also have more processes to share CPU time between, a GUI to run, etc. Old non-multitasking operating systems are just sitting there waiting for input, responding to interrupts, etc.
 
Seems to me it might have something to do with the complexity of the operating system. Yes, computers are much faster, but they also have more processes to share CPU time between, a GUI to run, etc. Old non-multitasking operating systems are just sitting there waiting for input, responding to interrupts, etc.

Valid point.

My retro Pentium III Tualatin gamer running Windows 2000 feels no slower in operation than my Windows 10 gamer running an i7 and a GTX 980 Ti.
 
smh - is he gonna measure the latency of old typewriters, too? I mean, the time between when your finger hits the key and when the hammer hits the ribbon/page is different for old manuals vs. IBM Selectrics.

For 99.9% of people, nobody gives a sh*t. They can't think/type fast enough to even notice.

Truly #FWP grognard BS. :p

BB
 
I agree typing on a mechanical keyboard feels about as close to an old PC as you can get, but the latency is still too long.
 
I thought that Apple Pencil was the stupidest thing they'd ever made... then I played with one. The latency is amazing on it. Spent 99 bones as quick as I could.



.... I've used it maybe 3 times.
 
smh - is he gonna measure the latency of old typewriters, too? I mean, the time between when your finger hits the key and when the hammer hits the ribbon/page is different for old manuals vs. IBM Selectrics.

Now I'm curious as to how quickly those old mechanical typewriters physically hit those pages...
 
A bit of a clickbait article. I know that personally I don't find latency in any way noticeable on any of the machines I own, new or old.

Interesting reading, but somewhat 'clickbaity'.
 
Now I'm curious as to how quickly those old mechanical typewriters physically hit those pages...

Depends on how you measure it. If you measure from the bottom of the keystroke, it will be exactly zero delay, and the hammer may even have hit the page before you're done pressing the key.
 
Depends on how you measure it. If you measure from the bottom of the keystroke, it will be exactly zero delay, and the hammer may even have hit the page before you're done pressing the key.

He should measure the old serial teletypes when connected to an IBM System 360!
 
On old systems the keyboard is wired to the CPU with its very own IRQ, not sitting on a bus, going through a bridge, dealing with competing data, all handled by a controller sharing an IRQ with who knows what.
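The difference can be put in rough numbers. Assuming a USB keyboard on an interrupt endpoint polled at a fixed rate (the rates below are common defaults, not measurements of any particular keyboard), the worst-case added wait is simply the polling period, whereas a dedicated IRQ fires essentially immediately:

```python
# Back-of-envelope comparison (assumed figures, not measurements): a
# directly wired keyboard raises an interrupt the moment a key closes,
# while a USB keyboard must wait for the host's next poll of its
# interrupt endpoint before the keypress even leaves the device.

def worst_case_poll_delay_ms(poll_rate_hz):
    """Worst-case wait until the next USB poll, in milliseconds."""
    return 1000 / poll_rate_hz

for rate in (125, 1000):    # typical default vs. "gaming" polling rates
    print(f"{rate} Hz polling adds up to "
          f"{worst_case_poll_delay_ms(rate):.0f} ms before the OS even sees the key")
```

So polling alone can cost several milliseconds before any OS or display latency is counted, which is part of why the bus-based path can never quite match a dedicated interrupt line.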
 
I have a 32" TV/CRT I turn on once per year, give or take. ... Would it make a good //gs monitor?

Can you even still buy new 3.5/5.25 floppies?

Wonder if I could get it on ethernet?
 
Now I'm curious as to how quickly those old mechanical typewriters physically hit those pages...


I've used one, it's not fast.

I was so used to typing in GeoCities chat that in "typing" class, on the electronic ones, I could fill the buffer and make it beep. I had to wait 30 seconds for it to catch up.
 
I still use a Commodore 64 and there's no latency whatsoever when typing unless I'm on a BBS at 300 baud.

Not too sure what's wrong with your Apple II's.

Think you mean no perceivable latency. If you were to measure it, it would come up somewhere in the 30-60 ms range from keystroke to fully drawn on screen.
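A figure in that ballpark falls out of simple frame arithmetic. This is a back-of-envelope sketch under assumed PAL timing (keyboard scanned once per 20 ms frame, character visible one to two refreshes later), not a measurement:

```python
# Rough arithmetic (assumptions, not measurements) showing how even a
# "zero lag" machine like a C64 lands in the tens of milliseconds: the
# keyboard is scanned once per video frame, and the glyph only reaches
# the phosphor on a subsequent CRT refresh.

frame_ms = 1000 / 50                 # PAL refresh: 20 ms per frame

best  = 0.5 * frame_ms + frame_ms    # keypress caught mid-frame, drawn next frame
worst = frame_ms + 2 * frame_ms      # just missed the scan, drawn a frame later

print(f"~{best:.0f} to ~{worst:.0f} ms")   # -> ~30 to ~60 ms
```

In other words, the floor is set by the scan and refresh cadence, not by the CPU being slow.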
 
Can you even still buy new 3.5/5.25 floppies?

You can get new old stock on eBay; that's about it. Most people replace floppy drives with Goteks these days, but I still like using my 1541 from time to time.

Think you mean no perceivable latency. If you were to measure it, it would come up somewhere in the 30-60 ms range from keystroke to fully drawn on screen.

Yes, sorry, that's what I meant.
 
It's not just the latency aspect, at least not for me (I never even considered that aspect, to be honest); it's about the comfort of the keyboard more than anything else. Now, before people misunderstand what I'm saying here, know this: I do not own a custom keyboard, and I don't give a rat's ass about all the modern craze for custom keyboards, keys, switches, overlays, designs, whatever. That's not me, and I simply couldn't care less about that huge market that appeared almost out of thin air for such things.

As luck would have it, right now I'm considering getting an old ThinkPad laptop with the proper 7-row keyboard on it; I just haven't decided which one yet. I'd love - I honestly would - to have two of the ThinkPad 25th Anniversary models (the T25) because it has enough hardware power and battery life (with the 6-cell battery), the screen is fine (I know it's not an IPS panel, I don't give a fuck, nor do I care about the touch functionality on the matte display either), it has enough ports for me, and the size and form factor are spot on.

But it's got that proper ThinkPad 7-row keyboard like those from years past, and that makes it attractive to me. Of course, the $1900 price on it has me and many other ThinkPad fans kind of pissed off. I mean, really, it's a $1200-1300 laptop and they are obviously gouging the price much higher than it should be, considering the T470 foundation it's based upon is available for a whole lot less money, though without that 7-row keyboard as an option, sadly.

Anyway, I'll probably end up with either a T410 for a proper 16:10 display panel or more likely the T420 (16:9, ugh) because I truly miss those keyboards. I've used Dell Latitudes for years now and I have no big issues with their keyboards overall but it's just not the same experience at all, as expected. Those ThinkPad keyboards, my god, it literally is so comfy to use them, there's just nothing else out there that compares, and even in spite of Lenovo's attempts to translate that feel to the chiclet bullshit they produce nowadays, those original 7-row keyboards are something I've missed for too long.

But yeah, sure, latency, whatever. :D
 
I personally don't see the fascination with mechanical keyboards and tend just to use whatever feels good.

I had an IBM mechanical keyboard years ago, ended up lobbing it in the bin when I upgraded from a 486 PS/2....
 
It's not just computers. A few months ago, I went ahead and tried playing Super Mario Bros. again. I didn't remember the controls ever being so unresponsive, and what I considered a fairly easy game when I was a kid now seemed like a clunky and difficult game. So, I also happened to have a working NES and CRT television in storage, and pulled them out, played the real deal, and beat it in only 3 attempts. (I died multiple times in level 8, but I'd still say it's not that bad for barely playing it). The real NES on a CRT just felt so much more responsive.
 
It's not just computers. A few months ago, I went ahead and tried playing Super Mario Bros. again. I didn't remember the controls ever being so unresponsive, and what I considered a fairly easy game when I was a kid now seemed like a clunky and difficult game. So, I also happened to have a working NES and CRT television in storage, and pulled them out, played the real deal, and beat it in only 3 attempts. (I died multiple times in level 8, but I'd still say it's not that bad for barely playing it). The real NES on a CRT just felt so much more responsive.

I've found the exact same thing playing my C64 on an LCD vs a 1084S CRT.
 
“Latency: why typing on old computers just feels better”

Message from your carpal tunnels: “We beg to differ.”
 
Old saying, "The more you complicate the plumbing, the easier it is to stop up the drain."

Basically, many of the gains achieved with each top-tier tech are compromised by the plethora of complicated software or OS-related items piled on top of them. At this point, especially with Windows, it is an art form to keep a streamlined and properly updated high-performance rig. My rigs, ugly as anything to look at, just do what they're supposed to do (except for the occasional OS update that sends me scrambling to a manufacturer for their corresponding driver). Typing, or any HIDs, rarely have any noticeable lag. When they do, it's a sign to me that something's changed and I need to fix it.
 
Aren't CRTs still superior when it comes to color depth as well? I remember viewing high-bit-depth images, along with some movies, and seeing obvious color banding on LCDs (this problem is easy to see in, say, sky shots). I still see a lot of color banding on today's LCDs, whereas a decent CRT was able to reproduce the colors perfectly without any banding. Now LCDs are finally getting, what, 10-bit color, whereas a 15-year-old CRT monitor could display 24-32-bit color? All of the other benefits of LCDs have obviously made CRTs a relic, but damn, the picture quality did look better.
Yes and no.

Today's monitors that are labeled as 10-bit are not just 10-bit. It's 10-bit per color channel (Red, Green, Blue), so essentially 30-bit. So an 8-bit LCD display is really 24-bit. They are labeled differently than what CRTs usually were advertised as.

What you have to watch out for is that many of the cheaper TN panels listed as 8-bit (24-bit) are really 6-bit with dithering. Those are the displays you probably noticed banding on, since they can only do a fraction of the color output of a true 8-bit panel (~262,000 colors vs. ~16.7 million). On 8-bit and higher displays you "shouldn't" notice banding. 10-bit panels can reproduce billions of colors. Some 10-bit panels are 8-bit plus dithering, or are true 10-bit panels only able to accept an 8-bit-per-channel signal and interpolate it up to 10 bits. IPS and VA panels are typically 8 bits or higher.

As for CRT monitors, I don't really know of any (at least consumer level) that were higher than 24-bit color output. The 32-bit you are thinking of is the color output from the graphics card, which at the time was really 24-bit color plus 8 bits for the alpha channel. There are 36-bit and 48-bit color outputs (12 and 16 bits per channel), but those were primarily limited to high-end professional graphics editing machines. When you include the alpha channel, you are essentially at 40/48/64 bits per pixel.

Another thing to consider is color gamut. CRTs typically were superior to LCDs when it came to color reproduction when comparing consumer level monitors straight out of the box. This has improved greatly over the years, especially with the more expensive, higher end LCDs. This also may have played a factor in what you were describing as "banding" (although bit depth is most likely the primary cause), as a poorly calibrated monitor may not display the colors properly.
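The per-channel vs. per-pixel labeling above is easy to check with arithmetic; a quick sketch:

```python
# Bits per channel vs. total colors, to make the labeling difference
# concrete: "8-bit" on an LCD spec sheet means 8 bits per channel,
# i.e. 24 bits per pixel.

def colors(bits_per_channel):
    """Total representable colors for a given per-channel bit depth."""
    return 2 ** (bits_per_channel * 3)    # three channels: R, G, B

for b in (6, 8, 10):
    print(f"{b}-bit/channel ({b * 3}-bit pixel): {colors(b):,} colors")
```

This reproduces the figures in the post: a 6-bit panel tops out at ~262,000 colors, 8-bit at ~16.7 million, and 10-bit at just over a billion.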
 
Seems to me it might have something to do with the complexity of the operating system. Yes, computers are much faster, but they also have more processes to share CPU time between, a GUI to run, etc. Old non-multitasking operating systems are just sitting there waiting for input, responding to interrupts, etc.

I assume you are referring to DOS. I was running a UNIX server in the '80s with 30+ people connected to it (using dumb terminals via RS-232), running all their applications (spreadsheets, databases, word processing, email...), and the server would never bog down. It only had 1 MB of RAM (used for the applications/processes) on one CPU and two banks of bank-switched 32K RAM on the other CPU (which handled all I/O). At any given time, there were approximately 100 to 120 processes running for all those users.

We had one guy who could type 175 WPM. He never had an issue with lag. He wore out and broke many keyboards, but he never had lag issues.

We used a proprietary context-switching shell I wrote, which only required a single keystroke to run any given application or open a file with the correct application. The shell also had a built-in message system, email notification, and a scheduling calendar that would post notifications overlaying whatever application you were running.

Computers, today, are being used as advertising and marketing nodes gathering data. We never suffered any of that.
 
In Windows you can turn off the stupid animations and keys render virtually instantaneous again. Just saiyan.
 
In Windows you can turn off the stupid animations and keys render virtually instantaneous again. Just saiyan.
Just sayin', just saying, or Super Saiyan?

 
I still use a Commodore 64 and there's no latency whatsoever when typing unless I'm on a BBS at 300 baud.

Not too sure what's wrong with your Apple II's.

The problem is that they weren't made by Commodore. :D
 
Yes and no.

Today's monitors that are labeled as 10-bit are not just 10-bit. It's 10-bit per color channel (Red, Green, Blue), so essentially 30-bit. So an 8-bit LCD display is really 24-bit. They are labeled differently than what CRTs usually were advertised as.

What you have to watch out for is that many of the cheaper TN panels listed as 8-bit (24-bit) are really 6-bit with dithering. Those are the displays you probably noticed banding on, since they can only do a fraction of the color output of a true 8-bit panel (~262,000 colors vs. ~16.7 million). On 8-bit and higher displays you "shouldn't" notice banding. 10-bit panels can reproduce billions of colors. Some 10-bit panels are 8-bit plus dithering, or are true 10-bit panels only able to accept an 8-bit-per-channel signal and interpolate it up to 10 bits. IPS and VA panels are typically 8 bits or higher.

As for CRT monitors, I don't really know of any (at least consumer level) that were higher than 24-bit color output. The 32-bit you are thinking of is the color output from the graphics card, which at the time was really 24-bit color plus 8 bits for the alpha channel. There are 36-bit and 48-bit color outputs (12 and 16 bits per channel), but those were primarily limited to high-end professional graphics editing machines. When you include the alpha channel, you are essentially at 40/48/64 bits per pixel.

Another thing to consider is color gamut. CRTs typically were superior to LCDs when it came to color reproduction when comparing consumer level monitors straight out of the box. This has improved greatly over the years, especially with the more expensive, higher end LCDs. This also may have played a factor in what you were describing as "banding" (although bit depth is most likely the primary cause), as a poorly calibrated monitor may not display the colors properly.

With an analog connection (read: VGA or other analog RGB), the monitor (CRT) actually isn't limiting anything in bits. It's the DAC on the card, and whatever the driver/software/OS is able to output. The CRT monitor (quality differences notwithstanding) isn't going to limit it. In fact, it will blur together multiple shades output from the display adapter and actually create smoother gradients as a byproduct (phosphors, beams, circuitry, etc.).

Edit: Actually re-reading the last part of your post, I think you were saying the same thing.
 
With an analog connection (read: VGA or other analog RGB), the monitor (CRT) actually isn't limiting anything in bits. It's the DAC on the card, and whatever the driver/software/OS is able to output. The CRT monitor (quality differences notwithstanding) isn't going to limit it. In fact, it will blur together multiple shades output from the display adapter and actually create smoother gradients as a byproduct (phosphors, beams, circuitry, etc.).
When going even more granular, that's true. The CRT itself (additional components and connections notwithstanding), from a technological standpoint, isn't the main limiting factor when it comes to color reproduction. But many of those other limitations came with the territory (and the time) of the available technology, especially at the consumer level. So, poor wording on my part.

Edit:
I see that you edited your post while I was posting. Lol
But, yes, we are pretty much on the same page.
 