THE MOBILE NETWORK - Throughput and Latency

Although these acronyms and the constant evolution of cellular and wireless network technology can be baffling, the important thing to understand is that a variety of networks are used to provide data connections to your users' devices. As with the physical and diversity-related challenges of the devices themselves, you need to be cautious about the assumptions you make for these connections.

Speed or throughput of the network connection is an obvious constraint. At the end of 2010, according to Akamai, the average fixed-line broadband speed in the United States was 5Mbps, many times faster than even the theoretical peak speed of most mobile networks. This has a direct impact on the users' Web experience because it defines the minimum time that an uncached web page takes to download to a device. You're probably not surprised to read that many mobile devices also do not have comprehensive or long-lived caching capabilities, thanks to their memory constraints.

A user with a 3G UMTS connection in the United States might expect an average download speed of 250Kbps, and 750Kbps on HSDPA (although such speeds are drastically affected by movement and the density of other data users in the local area). Even this is six times slower than a typical wired desktop experience: a web page containing a 1MB video file might load in 2 or 3 seconds on a desktop, but it would take at least 15 seconds on a fast mobile network. That may be longer than an impatient user on the go is prepared to wait for the download. If you expect to deliver rich media to your mobile web users, you certainly need to look at limiting or adapting file sizes.
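
To put these figures in context, a rough transfer-time estimate is easy to compute. The sketch below (in TypeScript, with the throughput values treated as illustrative assumptions taken from the figures above) simply divides payload size by throughput; it ignores latency, protocol overhead, and packet loss, so real downloads will take longer.

    // Rough transfer-time estimate: payload size divided by throughput.
    // Ignores latency, TCP slow start, protocol overhead, and packet loss,
    // so real-world times will be longer than these figures.
    function transferTimeSeconds(payloadBytes: number, throughputKbps: number): number {
      const payloadKilobits = (payloadBytes * 8) / 1000;
      return payloadKilobits / throughputKbps;
    }

    const videoBytes = 1 * 1024 * 1024; // the 1MB video file from the example above

    console.log(`Fixed broadband (5Mbps): ~${transferTimeSeconds(videoBytes, 5000).toFixed(1)}s`);
    console.log(`HSDPA (750Kbps):         ~${transferTimeSeconds(videoBytes, 750).toFixed(1)}s`);
    console.log(`UMTS (250Kbps):          ~${transferTimeSeconds(videoBytes, 250).toFixed(1)}s`);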

In addition to pure speed, other factors significantly affect the impact of the network on the user experience. One is the setup time for the data connection itself. A desktop or laptop computer usually has a persistent connection to the Web, and the first page visited by a user starts to download immediately. Most mobile devices, on the other hand, connect on demand (in order to preserve power when the data connection is not in use), and this can add as much as 5 to 7 seconds to the time for the first page to be requested. Your users may already be impatient by the time your page even starts downloading.

A more persistent but often overlooked consideration is that of roundtrip latency. This is a measure of the time taken for data packets to travel from the device, through the network, to the destination service, and back again, excluding the time actually taken for the server to perform any processing. It is influenced entirely by the type and topology of the network, the route the packets take, and any intermediate proxies or gateways that process the data en route.
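
Roundtrip latency is easy to measure approximately from a script. The snippet below is a minimal sketch, assuming a Node 18+ environment where the global fetch API is available, and using https://example.com/ purely as a placeholder URL; what it measures also includes the server's response handling, so it only approximates the pure network roundtrip described above.

    // Minimal roundtrip timing sketch for Node 18+ (global fetch available).
    // The measured time also includes server-side handling of the request,
    // so it is an upper bound on the pure network roundtrip.
    async function measureRoundTripMs(url: string): Promise<number> {
      const start = Date.now();
      await fetch(url, { method: "HEAD" });
      return Date.now() - start;
    }

    // Placeholder endpoint for illustration only.
    measureRoundTripMs("https://example.com/")
      .then((ms) => console.log(`Roundtrip took ~${ms}ms`))
      .catch((err) => console.error("Request failed:", err));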

On a fixed-line ADSL connection, latency is so low that it is barely considered. Regardless of the throughput speed, a ping time of less than 80 milliseconds to most web servers can be assumed from within the United States, and at most a few hundred milliseconds to internationally hosted servers.

On a mobile network, however, latency is a real consideration. This is partly because packets sent from a mobile device to a public web server traverse a number of sub-networks and their boundaries. First, there is the cellular air interface to a nearby cell station (which has a particularly significant impact on latency), then a backhaul link to the network carrier's switching center. Then there is a sequence of gateways that connect the traffic, often through firewalls and proxies, to the Internet, and then finally the packet proceeds through web infrastructure to the server. The effects on latency can be significant.

AT&T quotes a latency overhead of between 100ms and 200ms for requests to servers immediately external to their current UMTS and HSDPA networks, and 600ms over their GPRS and EDGE networks. While this is impressive, given the complexity of the cellular network involved, you should still expect the latency of a typical browser-to-server-to-browser roundtrip to be an order of magnitude longer than for a broadband connection.

In some respects, latency is more important than the raw throughput of the network, and this is particularly true for web applications, where a page is made up of a large number of component
parts. The request for each graphic, each style sheet, and each external script is delayed by the latency of the network, and if those objects are small, this can easily dominate the total time taken for the page to fully download. Web developers can take real action to mitigate these effects, such as combining CSS files, using sprite techniques for interactive images, and tuning cache settings for external objects.
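
The interplay between latency and object count can be captured with a simple model. The sketch below (TypeScript, with every figure an illustrative assumption rather than a measurement) approximates total load time as one roundtrip per request plus transfer time, and shows why collapsing many small files into fewer requests pays off on a high-latency mobile connection.

    // Naive page-load model: each request costs one full roundtrip, plus the
    // time to transfer its bytes at the available throughput. Real browsers
    // parallelize and pipeline requests, so treat this purely as an
    // illustration of why latency dominates when objects are small.
    interface Connection {
      rttMs: number;          // roundtrip latency in milliseconds
      throughputKbps: number; // sustained throughput in kilobits per second
    }

    function estimatePageLoadMs(requestCount: number, totalBytes: number, conn: Connection): number {
      const latencyMs = requestCount * conn.rttMs;
      const transferMs = (totalBytes * 8) / conn.throughputKbps; // bits / (kbit/s) = ms
      return latencyMs + transferMs;
    }

    // Illustrative assumptions: 200ms RTT (in line with the figures above),
    // 750Kbps throughput, and a 300KB page delivered either as 30 small
    // objects or combined into 6 requests.
    const mobile: Connection = { rttMs: 200, throughputKbps: 750 };
    const pageBytes = 300 * 1024;

    console.log(`30 requests: ~${Math.round(estimatePageLoadMs(30, pageBytes, mobile))}ms`);
    console.log(` 6 requests: ~${Math.round(estimatePageLoadMs(6, pageBytes, mobile))}ms`);

In this simple model, cutting the request count from 30 to 6 saves far more time than doubling the throughput would, which is why request-reduction techniques such as file concatenation and CSS sprites remain so effective on mobile networks.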

Source of Information: Professional Mobile Web Development with WordPress, Joomla!, and Drupal (Wiley)
