You've Lost Us at Two Seconds: Tips on Improving Mobile App Load Times

June 4, 2013 Eric Yuen

Does your app or mobile website load in two seconds or less? If not, you should be concerned, because it needs to be faster. One commonly touted statistic is that 40% of consumers will abandon a page if it takes more than three seconds to load; that is a dramatic loss. Additionally, it’s estimated that if your site generates $100,000 per day, a one-second delay can cost you $2.5 million a year. For consumers today, there is simply no excuse for a slow experience.

So what can we do about it? For starters, we can expect network latency in the range of 500ms to 1s (e.g., Sprint’s nationwide network runs around 800ms). There are some things we can do to better manage that, but for the most part it’s out of our control. That doesn’t leave us with much time. Let’s look at the server and the client to find some definite wins.

The Server

Different platforms have different needs. Images displayed on a smartphone need to be smaller than ones displayed on a desktop, and sending an image that is the wrong size creates serious memory and performance issues on the device. On the server side, it is critical that images are scaled to the exact size the device needs; this reduces the workload on the device. It can’t be overstated how detrimental incorrectly sized images are.
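The sizing math itself is simple. A minimal sketch (the function name and sample dimensions are illustrative; an actual resizer would hand these dimensions to an image library):

```python
def fit_dimensions(src_w, src_h, max_w, max_h):
    """Compute the largest size that fits within max_w x max_h
    while preserving the source aspect ratio. Never upscales."""
    scale = min(max_w / src_w, max_h / src_h, 1.0)
    return round(src_w * scale), round(src_h * scale)

# A 3000x2000 photo requested by a 640x960 phone screen:
print(fit_dimensions(3000, 2000, 640, 960))  # (640, 427)
```

Doing this once on the server means every client downloads only the pixels it can actually display.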

When it comes to creating the web services and the question of XML (typically SOAP) vs. JSON, there is little debate in our eyes that JSON is the way to go. XML responses are typically two to three times larger than the equivalent JSON. A fully functioning desktop website might display every single field it has for a product, whereas an app might only have room to show a few. So make sure the server doesn’t return any more than the client needs. Having the flexibility to specify in your request which fields to include in the response is an effective way of reducing the size of the payload. ETags (entity tags, part of the HTTP specification) are also a very good way to avoid unneeded payloads; they are particularly effective for serving feeds or article content that update infrequently or sporadically.
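Field selection can be as simple as filtering the response object against a `fields` parameter from the query string. A minimal sketch (the product record and field names are made up for illustration):

```python
import json

PRODUCT = {
    "id": 42,
    "name": "Wireless Headphones",
    "price": 79.99,
    "description": "A long marketing blurb that the mobile list view never shows...",
    "specs": {"weight_g": 210, "bluetooth": "4.0"},
}

def render_product(product, fields=None):
    """Serialize a product, keeping only the requested fields.
    `fields` would typically come from a query string like ?fields=id,name,price."""
    if fields:
        product = {k: v for k, v in product.items() if k in fields}
    return json.dumps(product)

full = render_product(PRODUCT)
trimmed = render_product(PRODUCT, fields={"id", "name", "price"})
print(len(trimmed) < len(full))  # True: the mobile payload is much smaller
```

The same endpoint then serves desktop and mobile, with each client paying only for the fields it renders.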

These optimizations give every client control over exactly what it requests and gets back. The great thing about them is that the benefits are felt across all channels.
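The ETag flow mentioned above can be sketched in a few lines: the server hashes the response body, and a client that presents the same hash in `If-None-Match` gets a body-less 304 back. This is a simplified sketch of the HTTP conditional-request mechanism, not a full server:

```python
import hashlib

def make_etag(body):
    # Quote the hash, matching the ETag wire format, e.g. "d41d8cd9..."
    return '"' + hashlib.md5(body).hexdigest() + '"'

def handle_request(body, if_none_match=None):
    """Return (status, payload). A 304 carries no body, so an
    unchanged feed costs almost nothing on the wire."""
    etag = make_etag(body)
    if if_none_match == etag:
        return 304, b""
    return 200, body

article = b"<html>...full article content...</html>"
status, payload = handle_request(article)                     # first fetch
status2, payload2 = handle_request(article, make_etag(article))  # revalidation
print(status, status2)  # 200 304
```

For a feed that changes a few times a day, the vast majority of fetches become near-free revalidations.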

The Client

On the client side, especially a mobile client, we should never assume that the connection is reliable. An app that breaks or becomes non-functional on a spotty data connection is a terrible user experience. There are a few things that can be done to curb this and improve load times.

  1. Set priorities. Not all requests should be treated the same. Implement a proper request queue with priorities and the ability to cancel requests that are no longer needed. Requests that block the user from performing the next action should be at the top of the queue, followed by requests for supporting content like images. Lastly, reuse your connections and use idle cycles to make the next likely request early. If the network is unreliable, queue up a batch of requests to run once connectivity returns.
  2. Implement caching in a smart way. This works in conjunction with ETags. Apps can cache to memory or to the file system; in either case, it’s always faster than going over the network. A caveat when using the file system as a cache: limit the number of concurrent file accesses and delete stale data. In addition to caching responses already returned to the user, you can sometimes predict the next set of requests and have the results cached before the user asks for them. For example, consider a list of articles: users will most likely view the top ones in detail first, so having the full content already available means you can show it immediately. That is an amazing experience. But don’t go overboard; this is where good analytics and clear use cases really help.
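The prioritized, cancellable queue from point 1 can be sketched with a heap and lazy deletion. This is a minimal single-threaded sketch (the priority levels and URLs are illustrative); a real client would drain it from a network thread pool:

```python
import heapq
import itertools

# Lower number = runs sooner.
BLOCKING, IMAGE, PREFETCH = 0, 1, 2

class RequestQueue:
    """Priority queue of pending request URLs with cancellation.
    Cancelled entries stay in the heap and are skipped on pop."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO tiebreak within a priority
        self._cancelled = set()

    def enqueue(self, priority, url):
        heapq.heappush(self._heap, (priority, next(self._counter), url))

    def cancel(self, url):
        self._cancelled.add(url)

    def next_request(self):
        while self._heap:
            _, _, url = heapq.heappop(self._heap)
            if url not in self._cancelled:
                return url
        return None  # nothing left to run

q = RequestQueue()
q.enqueue(IMAGE, "/images/hero.jpg")
q.enqueue(BLOCKING, "/api/cart")            # the user is waiting on this one
q.enqueue(PREFETCH, "/api/articles?page=2")
q.cancel("/images/hero.jpg")                # user scrolled away

first = q.next_request()   # "/api/cart": the blocking call jumps the queue
second = q.next_request()  # the cancelled image is skipped, prefetch runs
```

Blocking work always runs first, and cancelled images never waste a connection.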

For a real deep dive into the problem, I strongly encourage everyone to watch the great “Breaking the 1000ms Time to Glass Mobile Barrier” presentation given by an engineer at Google. It’s an enlightening talk.
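The predictive caching from point 2 above can be sketched as an in-memory cache that prefetches the full content of the top list items. This is a simplified sketch: the fetch function stands in for a real network call, and the URLs are hypothetical:

```python
class ArticleCache:
    """In-memory cache that prefetches the top list items so that
    tapping an article can render instantly (simplified sketch)."""
    def __init__(self, fetch):
        self._fetch = fetch  # function: url -> content (the network call)
        self._store = {}

    def prefetch_top(self, article_urls, n=3):
        # Warm the cache for the articles the user is most likely to open.
        for url in article_urls[:n]:
            if url not in self._store:
                self._store[url] = self._fetch(url)

    def get(self, url):
        if url not in self._store:
            self._store[url] = self._fetch(url)  # cache miss: hit the network
        return self._store[url]

calls = []
def fake_fetch(url):
    calls.append(url)
    return "content of " + url

cache = ArticleCache(fake_fetch)
cache.prefetch_top(["/a/1", "/a/2", "/a/3", "/a/4"])
cache.get("/a/1")        # served from cache: no extra network call
print(len(calls))  # 3
```

The "don’t go overboard" advice maps directly to the `n` parameter: prefetch only what analytics says users actually open.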

Summary

Ultimately, this should be a reminder of how important load times are. Consumers are justifiably impatient, but it’s not just a matter of feeling fast enough: there are real benchmarks with real impact. In most cases, optimizing your images, having a flexible backend, limiting the requests per screen (ideally to one) and minimizing the size of the response will get you to that sub-two-second benchmark that makes a real difference. With mobile data connections getting even faster, though, we should all really be aiming for load times under one second. It’s doable. We’ve seen it. Let’s make it happen.


Connect with Eric on LinkedIn.
