A Little Less Latency Please
By Mike Masnick, Thu Oct 28 21:45:00 GMT 2004

While everyone's focused on the throughput of wireless data connections, it might be time to pay a little more attention to latency.

Every time people discuss wireless broadband solutions, the focus seems to be on the "speed" offered. However, by speed, most people mean the throughput or bandwidth of the connection. In the mobile world, it's often just as important to talk about latency, even though it doesn't get nearly enough attention. Latency, simply defined, is the "time a message takes to traverse a system." On a network, this includes the "overhead" needed to send data across a connection. For all the speed that 2.5G and 3G networks provide, people are just starting to realize that latency is a real hurdle. End users don't distinguish between the two: if downloading a web site or an application on their phone takes a very long time, they're not going to be happy, whether throughput or latency is causing the problem.

Traditional DSL connections tend to have latency between 50ms and 100ms -- which is fine for most web browsing and some applications (gamers usually want lower latency). VoIP, for instance, is usually acceptable if latency stays under 150ms. Most wireless networks have much more noticeable latency. Reports suggest that GPRS and EDGE systems tend to have latency around 800ms to 1100ms (over a second!), 1xRTT systems are in the 300 to 500ms range, and EV-DO brings the latency down a bit, to somewhere around 200 to 300ms. For users who are expecting zippy high speed data connections, that latency may make them wonder what happened. It's especially noticeable on small chunks of data. For example, a simple WAP page should load quickly, even at low bandwidth, but the latency becomes the true bottleneck. Many web sites will also load slowly, since they consist of multiple components, each of which incurs its own latency as it's downloaded separately.
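A little back-of-the-envelope arithmetic shows why latency dominates for small transfers. The sketch below uses rough ballpark figures (1.5 Mbps / 75ms for DSL, 40 kbps / 1000ms for GPRS -- assumptions for illustration, not measurements) and a deliberately simple model of one round-trip per fetch:

```python
def fetch_time(size_bytes, bandwidth_bps, latency_s):
    """Rough time to fetch one object: one round-trip of latency
    plus the time to transfer the payload at the link's bandwidth."""
    return latency_s + (size_bytes * 8) / bandwidth_bps

# A small 2 KB WAP page over two illustrative link profiles
# (bandwidth and latency numbers are assumed, not measured):
page = 2 * 1024  # bytes

dsl = fetch_time(page, 1_500_000, 0.075)  # ~1.5 Mbps, ~75 ms latency
gprs = fetch_time(page, 40_000, 1.0)      # ~40 kbps, ~1000 ms latency

print(f"DSL:  {dsl:.2f}s")   # transfer time is negligible
print(f"GPRS: {gprs:.2f}s")  # the fixed latency is most of the wait
```

On the GPRS profile, moving the 2 KB of data takes well under half a second; the rest of the wait is pure latency, which no amount of extra bandwidth would fix.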

Are there solutions to latency? Some applications can certainly be designed around the latency issue. For example, people doing email on a Blackberry don't notice latency, because the system downloads email in the background and stores it locally -- more or less "hiding" the latency issue from the end user. Of course, not all applications can be built that way.

The original eWeek article linked above points to a solution from one company that tries to optimize content to reduce the latency problem by taking a web site and making it into a single WAP element, meaning the latency only hits once, rather than multiple times. However, this only works on web content, and there's still only so much you can do to hide the core latency of whatever system you're using. In a famous paper on latency, Stuart Cheshire once wrote: "Once you've got yourself a device with bad latency there's absolutely nothing you can do about it (except throw out the device and get something else)."
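The bundling idea is easy to quantify with the same simple one-round-trip-per-fetch model. The sketch below compares fetching ten page components separately against fetching them as one combined element; the component count, sizes, and GPRS-like link figures are illustrative assumptions:

```python
def separate_load_time(n_objects, obj_bytes, bandwidth_bps, latency_s):
    """Each component fetched on its own pays its own round-trip latency."""
    return n_objects * (latency_s + obj_bytes * 8 / bandwidth_bps)

def bundled_load_time(n_objects, obj_bytes, bandwidth_bps, latency_s):
    """One combined element pays the round-trip latency only once."""
    return latency_s + n_objects * obj_bytes * 8 / bandwidth_bps

# 10 components of ~3 KB each over a GPRS-like link (assumed figures):
separate = separate_load_time(10, 3 * 1024, 40_000, 1.0)
bundled = bundled_load_time(10, 3 * 1024, 40_000, 1.0)
print(f"separate: {separate:.1f}s  bundled: {bundled:.1f}s")
```

The total bytes transferred are identical either way; the entire saving is the nine round-trips of latency that the bundled version no longer pays. That's also why the trick does nothing for the final, unavoidable round-trip -- the core latency Cheshire was writing about.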

Throwing stuff out and getting something new might just be what everyone is going to do eventually. Newer network technologies like FLASH-OFDM, TDD UMTS and HSDPA all claim lower, more reasonable latencies. While they also offer higher throughput, for many, it may actually be the lower latency that's really appealing. In the meantime, though, many people will just have to sit and wait, as the "latency" in getting these newer networks deployed may also be longer than everyone would like.