Terminal Latency (beuke.org)
64 points by xrayarx on March 17, 2024 | hide | past | favorite | 48 comments


I find the eventual conclusion baffling:

  > Applying this custom tuning results in an average latency of 5.2 ms, which is 0.1 ms lower than xterm, and that’s with having a much more sane terminal without legacy cruft.
This "legacy cruft" is as-close-as-it-gets compatibility with EVERYTHING the terminal-based world has ever seen. It's an asset, not a liability, for those who need it, in case they need it - and that could very well be you one day.

I find it quite amazing that xterm manages to more or less outperform all the other implementations while (probably - I have not actually verified) being the most complete and correct one. Thomas Dickey is the man!


I think this is an important subject, but I'm always a bit wary of pure software measurements; I feel like the Typometer data needs to be cross-validated against physical click-to-photon measurements. In a similar vein, focusing on specific millisecond values is of limited use when display updates are quantized to specific frame rates (e.g. 60Hz), so the main question is what percent of updates have 1, 2, ... n frames of latency. Of course I recognize that doing such measurements is immensely more effort, requires hardware, etc., so I'm not berating the author for not doing that.


I also recently tried different terminals, because I was annoyed by the startup time of konsole. I eventually settled for "foot" [0], which I think is a bit underappreciated. It would be an interesting addition for the benchmark, although it's Wayland only.

[0] https://codeberg.org/dnkl/foot


Looks cool, but unfortunately not packaged in Fedora, where I'd need it.


It looks like it is packaged in the official Fedora repos: https://packages.fedoraproject.org/pkgs/foot/foot/


A couple weeks ago I started noticing my terminal was weirdly slow to start up. I never understood why people cared about terminal latency until I had to wait literally several seconds after opening my terminal before I could start typing. I thought it was because I had installed oh my zsh and it was doing something funky at startup.

Looking into it more I realized that I had unthinkingly pasted an echo '...' >> ~/.zshrc command into my .zshrc instead of running it on the command line. So every time I ran my terminal, it was not only running the same line hundreds of times, it was appending one more line.

I now appreciate a fast terminal startup much more after the experience.
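The failure mode is easy to reproduce on a scratch file (the temp file and the appended line below are illustrative stand-ins, not the commenter's actual rc contents):

```shell
# Simulate an rc file that mistakenly contains the append command itself.
rc=$(mktemp)
echo "echo 'export PATH=\$HOME/bin:\$PATH' >> $rc" > "$rc"

# Each "shell startup" (running the file) appends one more line, and
# every previously accumulated line gets executed again on the next run.
for i in 1 2 3; do sh "$rc"; done
wc -l < "$rc"    # 4 lines now: the echo plus three appended copies

# One way to clean up: keep only the first occurrence of each line.
awk '!seen[$0]++' "$rc" > "$rc.dedup" && mv "$rc.dedup" "$rc"
```

The `awk '!seen[$0]++'` idiom prints only the first occurrence of each line, which is a quick way to undo this kind of accumulation.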


The latency discussed in this article is input latency. As in, his terminal runs at 190fps but some other really bad ones run as slow as 25fps (that is to say, they add 1/25th of a second of input latency to typing).
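The conversion implied here is just 1000 ms divided by the refresh rate; a quick POSIX-shell check (integer arithmetic, so 190 fps rounds ~5.26 ms down to 5):

```shell
# Milliseconds of per-frame latency at a few update rates.
for fps in 190 60 25; do
  printf '%s fps -> %s ms per frame\n' "$fps" "$((1000 / fps))"
done
```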


The author links to a “blog post”, but hasn’t looked for much preceding work. For example, this article from 2018 in LWN was the first that opened my eyes to the wide differences in latency among terminal emulators:

https://lwn.net/Articles/751763/


But your blog link has the same issues the author mentioned - not including some modern apps like wezterm, and not measuring the full, end-to-end latency.


Ghostty

I’m really interested in the terminal app Mitchell Hashimoto (founder of HashiCorp) is developing, called Ghostty.

His latest blog post on it is also about latency, with benchmarks comparable to Alacritty’s.

https://mitchellh.com/writing/ghostty-devlog-006

For those interested in the tech, he’s developing it with Zig, uses SIMD extensively, and it’s cross platform. Can’t wait for him to launch it.


I hope he targets something better than a vt102 (going for 520 like Zutty would be good) and remembers to do extensive checking with VTTEST. Blog post from the Zutty creator about correctness: <https://tomscii.sig7.se/2020/12/A-totally-biased-comparison-...>


This is promising: https://mitchellh.com/writing/ghostty-devlog-005#xterm-compa...

(I have also been bitten by terminals calling themselves `xterm-256color` or some such thing while not being remotely xterm-compatible; GNOME being the worst offender, as usual. My daily driver is mlterm¹, which is pretty good in that respect² — its main downside for me is its idiosyncratic font configuration, which is tailored for multilingual flexibility, not the English-only programmer.)

¹ https://github.com/arakiken/mlterm

² https://invisible-island.net/xterm/xterm.faq.html#bug_mlterm...


Zutty is useful for only one thing: passing VTTEST. I've tried it and it's unusable as the terminal you want to rely on to do everyday work.


It seems to work perfectly fine for me. What kind of issues do you have with it?


I ~am~ was interested in trying out alacritty after reading this, but they don't seem to favour making it friendly to OS package managers. They don't seem friendly to feature requests either - I see resistance to both packaging and tabs. Its performance is quite good, though.


It's opinionated, which comes with upsides and downsides. I won't blame the maintainer for keeping things focused; feature creep (even for worthy features) can kill a FOSS project.

Another example is sixel support: there's a fork where it all works, but it's not considered sufficiently "proven" (in terms of code quality, as well as whether sixel is the best fit for the problem).

https://github.com/alacritty/alacritty/pull/4763#issuecommen...

It may be annoying but I get the reasoning, and there are other terminals.


https://repology.org/project/alacritty/versions seems to show it in most of the major distros?


As expected, fine-tuned st is the clear winner. However, it can be improved even further by removing double buffering altogether. Since the code is very hackable it's quite easy to do, and many LOCs can be thrown out.


I have a hard time believing that, after this much effort benchmarking terminals and eventually having to switch to a different one, it wouldn't have been easier - and a lot more constructive, not only for this one user but for society in general - to just contribute a fix to xterm for whatever issue it is having with Unicode (which FWIW I guess hasn't affected me yet, but if it ever does I'll just fix it).


Have recently made the switch to Termius. I really like it. However, I noticed there is local lag when typing. Anyone else see this?


What is the use case for these latencies? Multi-player text gaming?


Higher input latency typically results in more mistakes while typing. The lower you can get it, the better.


Perhaps, but the difference between 3 and 30 milliseconds?

Call me skeptical.

I can edit fine over a remote desktop connection to a machine across town where a VirtualBox is running Ubuntu, where I have Vim in a Gnome Terminal.

> The lower you can get it, the better.

Really? There is no point of diminishing returns? If you have 5 microseconds of latency in your terminal, 2 is better?

That's literally what you're saying.


Correct. If I had access to a reasonably priced 1000Hz monitor, I would use it, for exactly this reason. It's not just this one thing in isolation, it's the whole tool chain, and latency is cumulative. So, for a simplified example, if I have a bluetooth keyboard in the mix, and it adds 50ms, then with your numbers, I'd have 52ms total, which is indeed better than 55ms. It all adds up, and with enough links in the chain adding their own little bit, it can easily add up to be perceptible and meaningful.

I wish I could link you to byuu's deep dive on emulator latency so you could see where I'm coming from, but their site is now offline and has been excluded from the Wayback Machine. This is the old link, in case anyone knows of a mirror: byuu.org/articles/latency/

Edit: found a mirror: https://github.com/higan-emu/emulation-articles/tree/master/...


The above GH link doesn't work for me - too much JS coupling.

This however yields content: https://archive.is/IoShg


With my numbers, you'd have 50 ms total. The bluetooth delay of 50 ms has two significant figures in it at best, and so when we add the 0.002 ms (2 microseconds), that's just noise.


Apologies, I glossed over your mention of "microseconds", and was reading it as "milliseconds". In that case, you're right, I have no intention of splitting hairs amongst thousandths of milliseconds. However, that does make me unsure of the point you're trying to make, unless it's just to be absurdly pedantic.


OK, so 5 ms versus 2 ms keystroke delay isn't pedantic; 5 μs vs 2 μs is.


Yes? There's a 1000x difference between those speedups.


There is also a 1000X difference between a nanosecond keystroke delay and a microsecond keystroke delay. So that must be equally relevant since 1000 = 1000.


Makes sense. You can expect people to find microsecond speedups a lot more relevant compared to nanosecond speedups.

Are you really equating people's perception of tens of milliseconds of latency with their perception of nanoseconds? Is it surprising that people would notice a 50 ms slowdown more than a 50 nanosecond slowdown?


Yes, but to be pedantic, absurdly so.


> 1000Hz monitor

Wouldn't you have to first upgrade to faster cells in your retinas?

A glance through some research in this area doesn't indicate a benefit to going beyond 250 Hz.


If I use a terminal with 30msec latency to connect to a host with another 30msec of network latency, it's now 60msec - pretty noticeable by an average human being. That's why reducing latency, where you can, to the minimum possible value is important.


The use case is everyday use without noticing delays


Ok... I'll just say, 40 ms is fine.

Not if you are playing a game, or watching one of those text-art movies. But for normal terminal usage, as long as it's constant, it's just fine.


How does latency affect text-art movies? Is there audio going out of sync?

Movies don't respond to anything you're doing. A movie could be delayed by 2 seconds, and you wouldn't know, if the audio is synced, and you're not jumping around in it where you'd notice the buffering delays.


Ugh, I hate when audio in a movie is too far out of sync. More than ~16ms in either direction is really obvious, especially when lips are flapping.


> More than ~16ms in either direction is really obvious

No, it isn't.

Note that if someone is talking to you from just five meters away, their audio is delayed by around that much.

It starts to be obvious at around 100.

I don't fiddle with downloaded videos due to abundant access to streaming (plus no time), but back in the day, I usually adjusted bad sync in increments of 100 ms. Maybe the odd time I would go to 50.


Wow, someone on the Internet is advising me on my own subjective experience! Amazing!


Nobody can deny that you subjectively perceived an obvious lipsync problem.

But you put a millisecond figure on it, which would have involved some objective measurement method and instrumentation. That figure has got to be wrong, due to some mistake.

Or else you have superhuman perception in this area. If you're talking to someone from across the room, and perceive an obvious lag in lip sync, I withdraw all my remarks.


I don't typically mind some latency for typing commands and processing the output, but latency while editing in vim etc can be annoying.


I have literally no idea why people care so much about typing latency. If you type that fast, I suggest that you seriously consider finding a more intellectually challenging task.


Typing speed (throughput) and latency are (somewhat) orthogonal.

Some people appear to be sensitive to throughput, some to jitter, some to absolute latency. I'd even argue that "fast" typists may be less sensitive to absolute latency, as there's more of a streaming approach than with "slow" typists, where it's more event-based.

But it's also personal. Whatever my typing speed is, I want things to happen ~"right now".


I think latency on its own is not as much of an issue as fluctuations in it (jitter) when it comes to human perception. I, at least, am very susceptible to jitter, so I appreciate the post author's work documenting those stats, though jitter is not addressed directly.


It's not like the typing latency is blocking. You can keep typing and issuing commands. The screen will catch up. I feel like anything under 50ms (20fps) is probably fine; the other various feedbacks like tree-sitter and LSP and what-not will probably be bigger bottlenecks anyhow. (Ed: whoa, uh, was surprised how slow these editors are, but that seems to be for full screen / little-whitespace redraws?).

As a vim user I can be going full steam ahead without looking at feedback. People used to come by and ask, how do you read that screen, everything is so small, and in jest I'd say I don't need to, I already have perceived everything there. My internal model of where I am and where I want to go and what I want to change is only semi-gated on seeing things. I'm not particularly great at navigating between things, but it still feels like a lot of this moving-between-things and changing-things-around stuff is pushed down to a semi-subconscious layer. The brain thinks and somehow the fingers do, and it doesn't feel like I'm really paying attention as these things happen.

One of my favorite pieces of anticipation and matching, that I haven't done in a decade, is from Mass Effect 2, which had a pretty neat "hacking" mini-game that involved trying to recognize which of multiple windows of scrolling code was the "right" piece of code, from its shape. Something about that really spoke to me as one of the finer arts of coding, of pattern recognition, of being able to discern place & identify patterns & location quickly from lo-fi moving screens. Being able to navigate code & yourself by look and feel is awesome. https://masseffect.fandom.com/wiki/Bypass#Hacking

Terminals ought to be fast though. We should expect it. I respect that. Some of these terminals do seem unreasonably slow, and that should be improved. And maybe ghostty just rocks everyone & we all find we're way better after it, after terminals get way faster, but I also expect there's rapidly diminishing returns somewhere. But it'd be cool to set up on a 540 Hz monitor with a fast-as-sin emulator and find out if that's true or not.


Fast feedback loops aid accuracy.


If you need the visual check, that's probably a colossal amount of latency already. Even if the screen updates are instant, the act of reading & comprehending is - I believe - slow.

It does help to have fast response, agreed. I use Debian's aptitude, and man, even on a beastly desktop there is a lot of waiting, >1s. On my ultra-portables it's even worse (I really should go back to atomic system images via btrfs to avoid this). Latency sucks. But I feel like there's significantly outsized attention paid to whether a terminal is 10ms or 40ms. Once we start getting to 100ms, it starts to be a real issue. But I think generally most devs pretty quickly reach a point where, as they type and do work, they use feel & their mental model way more than the screen to achieve their goals. The feedback, when it's fast, stops being visual.



