TouchMarks II: Touchscreen Latencies in Flagship Tablets

In our last TouchMarks report, we looked at the touchscreen latencies of the flagship smartphones from different manufacturers. In this report, we’ll benchmark the touchscreen latencies of the leading tablets: iPads, Microsoft’s Surface and prominent Android tablets, including Amazon’s newest Kindle Fire HD.

Methodology

Before we get into the results, a little bit about our methodology (for a full recap, please check out our first post). In TouchMarks, we use our Touchscope to benchmark the App Response Time of touchscreen devices by measuring the time between when the user touches the screen and when the device updates the display. We position the light sensor and perform the touch at the center of the screen to account for the device’s 60 Hz refresh rate. The devices are in airplane mode at full brightness, with all background apps closed, to make the test as fair as possible.

For this report, we will specifically be measuring the Minimum App Response Time, using optimized OpenGL/DirectX-based apps with minimal logic that flash the screen white as quickly as possible in response to a touch. We’ve open-sourced the apps here for review. If you think there is a material discrepancy between our test apps on the different operating systems that gives one device an unfair advantage over the others, please let us know in the comments or submit a pull request.
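To make the comparison concrete, here’s a minimal sketch of what the core of such a test app might look like. This is illustrative only: the hook names (onTouchDown, drawFrame) and the omitted platform glue (EGL context setup, input plumbing) are assumptions, not the actual open-sourced code.

```cpp
#include <GLES2/gl2.h>
#include <atomic>

// Set by the input thread, read by the render thread.
static std::atomic<bool> g_touched{false};

// Hypothetical hook: called by the platform layer as soon as a
// touch-down event arrives. No gesture recognition, no extra logic.
void onTouchDown() {
  g_touched.store(true, std::memory_order_relaxed);
}

// Hypothetical hook: called once per vsync by the platform layer.
// A full-screen clear is the cheapest possible response, so the measured
// latency is dominated by the input/display pipeline, not app code.
void drawFrame() {
  if (g_touched.load(std::memory_order_relaxed)) {
    glClearColor(1.0f, 1.0f, 1.0f, 1.0f);  // flash white after a touch
  } else {
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);  // stay black until then
  }
  glClear(GL_COLOR_BUFFER_BIT);
  // The platform layer swaps buffers (e.g. eglSwapBuffers) after this.
}
```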

With that, let’s take a look at the results!

Results

For this report, we benchmarked the latest versions of popular tablets from various manufacturers, including Amazon’s brand new Kindle Fire HD. We also threw Nvidia’s new Tegra 4-based SHIELD in there even though it’s not a traditional tablet. Here’s a graph of the results:

[Graph: App Response Time results for flagship tablets]

In a result that’s perhaps now unsurprising, the iOS devices are more responsive than their competitors. Interestingly, the iPad mini – with its smaller screen and 1024×768 resolution – performs similarly to the larger fourth-generation iPad and its 2048×1536 resolution, suggesting that responsiveness is not reduced by a larger screen size or resolution.

Perhaps the more interesting results are in the non-iOS devices. Nvidia’s SHIELD performed much better (though, strangely, with much higher variance) than the other Android devices, suggesting that Nvidia made a special effort to optimize the device’s responsiveness. It’ll be interesting to see whether the improvement carries over to other, non-Nvidia Tegra 4-based devices.

Amazon’s Kindle Fire HD was a surprising leader in the Android pack, especially given that it’s $90+ cheaper than the other two Android tablets. This could be a result of Amazon’s willingness to sell relatively expensive hardware at a loss. Finally, Microsoft’s Surface RT also performed much better than we expected, especially given that Nokia’s Windows Phone-based Lumia didn’t fare significantly better than Android devices in our smartphone comparison. Given these results, I’m especially excited to run Amazon’s higher-end Kindle Fire HDX and Microsoft’s new Surface 2 through our Touchscope. I’ll update this post with the results as soon as I can get my hands on them.

Our previous speculations continue to apply – more responsive devices may process touches earlier in the stack, poll for touches more frequently, or have touchscreens optimized or calibrated to be more responsive. In our latest review of our test apps, we discovered an optimization suggesting that the GPU or GPU drivers in these devices might also add significant latency. We will explore the ramifications and possible explanations in a future TouchMarks; note that our current test apps do not include the optimization, in order to better represent the performance of normal OpenGL/DirectX apps.

Conclusion: if you primarily use your tablet for reading, watching videos or browsing the web, then shop around and pick the tablet – iOS, Android or Windows 8 – that best suits your needs. With their lower price points and high-PPI screens, you may find an Android tablet works perfectly for you. If, however, you’re into latency-sensitive applications like games or interactive music apps, then your best bet is probably an iPad.

TouchMarks I: Smartphone Touchscreen Latencies

Last week, Apple released the iPhone 5C and iPhone 5S. As soon as these devices hit the streets, they’ll invariably be benchmarked. Nowadays, we have benchmarks for mobile CPUs, GPUs and even repairability. However, when we talk about responsiveness, our reviews are much more qualitative and subjective.

As a technology platform focused on low-latency streaming of apps from the cloud to mobile devices, Agawi’s 15-person team has been thinking, breathing and dreaming response time (also known as latency) on mobile for the last three years. Since every few milliseconds of latency reduce the responsiveness of the app being streamed, we have focused relentlessly on identifying, measuring and eliminating latency to make streaming applications to mobile devices as responsive as possible.

Today, our latency experts are using their knowledge to introduce the first quantitative and objective benchmark of app response times: TouchMarks. By introducing TouchMarks to the market, we hope to bring more rigour to discussions around touchscreen response times, device lag, streaming latency and other topics related to how responsive an application feels on a mobile device. We’ll define some terms in the space so that when we talk about response times, we’re all talking about the same thing. We’ll also try to bring to light the many sources of latency that people and companies are often unaware of, so that they can take those sources into account and improve their products.

The hardware and software behind TouchMarks will be open sourced so that others can replicate our results, improve TouchMarks and use these tools to improve the responsiveness of their products and services.

[Photo: Agawi's Touchscope]

Agawi's Touchscope was built in-house to capture many samples of touchscreen response times very quickly. It measures App Response Time (ART) by capturing the time delta between the activation of the Force Sensitive Resistor on the glove and the Light Sensitive Resistor positioned over the device. The Touchscope is based on the Arduino platform and uses easily available electronic components, so it's easy for any electronics hobbyist to replicate. The specs and code will be released soon.
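Until those are released, here's a rough sketch of what the Touchscope's measurement loop might look like on an Arduino. The pin assignments and sensor thresholds below are assumptions for illustration; the real firmware may differ:

```cpp
// Hypothetical sketch approximating the Touchscope's measurement loop.
// Assumes the Force Sensitive Resistor is wired to A0 and the Light
// Sensitive Resistor to A1; thresholds would need calibration per device.
const int FSR_PIN = A0;
const int LDR_PIN = A1;
const int FSR_THRESHOLD = 512;  // assumed: reading above this = touch
const int LDR_THRESHOLD = 700;  // assumed: reading above this = white flash

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Wait for the gloved finger to press the screen.
  while (analogRead(FSR_PIN) < FSR_THRESHOLD) {}
  unsigned long touchTime = micros();

  // Wait for the display under the light sensor to flash white.
  while (analogRead(LDR_PIN) < LDR_THRESHOLD) {}
  unsigned long flashTime = micros();

  // Report one App Response Time sample in milliseconds.
  Serial.println((flashTime - touchTime) / 1000.0);

  delay(500);  // settle before taking the next sample
}
```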

With that, let’s get into our first release: TouchMarks I.

TouchMarks I

For TouchMarks I, we decided to measure and reveal the minimum response times of flagship smartphones from top manufacturers. Are Apple’s touchscreens more responsive than those on Android devices, as Apple lovers claim? Or does Samsung save its best displays for its own use?

Using a combination of high-frame-rate cameras capturing at 240 fps and custom Agawi hardware (pictured above), we can accurately measure the App Response Time (ART), which we define as the latency from the time the user feels that they’ve touched the device’s display to the time the user sees a response on the screen. For TouchMarks I, we wanted to measure the minimum response time an app developer could expect on various devices. We built simple, optimized apps that flash the full screen white* as quickly as possible in response to a touch. The apps contain minimal logic and use OpenGL/DirectX rendering to make the response as quick as possible. Since these are barebones native apps doing nothing more than filling the screen in response to a touch, this benchmark defines the Minimum App Response Time (MART) a user could experience in a mobile app on each device.

Here are the results:

[Graph: smartphone Minimum App Response Time results]

As you can see, the results are remarkable. At a MART of 72 ms (revised from the originally published 55 ms; see the update below), the iPhone 5 is roughly 1.5x as responsive as any Android or WP8 phone tested. All the Android devices’ MARTs fell in the same 110–120 ms range, with the WP8-based Lumia 928 falling into that bucket as well. (Incidentally, the ranges all span about 16 ms, which is expected given the 60 Hz refresh rate of these smartphones: 1/60 s ≈ 16.7 ms.)

There are several possible reasons for this. Since touchscreen hardware has significant latency of its own (check out this video from Microsoft Research for a visual demonstration), our best guess at Agawi is that Apple’s touchscreen hardware is better optimized or more sensitively calibrated for capturing and processing touch. Another possibility is that while the Android and WP8 test apps run on managed runtimes (Dalvik and the CLR, respectively), the iPhone app is written in closer-to-the-metal Objective-C, which may reduce some latency. In future TouchMarks, we’ll compare C/C++-based Android apps to Java-based apps to determine whether this is the case.

Regardless of the reasons, the conclusion is clear: the best-written apps on iPhones will simply feel more responsive than similar apps on the current generation of Android devices. (We speculate this might be a major reason why the iPhone keyboard generally feels better than the Android keyboard to many people.)

Most developers are unaware of the latency inherent in touchscreens, which often leads them to dramatically underestimate the end-to-end latency of their apps. For example, if an app calls out to the network to respond to a touch event, achieving an end-to-end latency of under 80 milliseconds is unrealistic (not even counting app processing time), even on an iPhone. Our suggestion is to add the device’s MART score to the app’s internally calculated response time to approximate the app’s end-to-end latency.
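In code, that suggestion is just addition; here’s a sketch with hypothetical numbers, not measured results:

```cpp
#include <cstdio>

// Approximate end-to-end latency by adding the device's MART to the
// app's internally measured response time (processing + network).
double estimateEndToEndMs(double internalResponseMs, double deviceMartMs) {
  return internalResponseMs + deviceMartMs;
}

int main() {
  // Hypothetical example: an app measures 40 ms of its own processing
  // and network time on a device with a 72 ms MART, so it should
  // budget for roughly 112 ms end-to-end.
  std::printf("~%.0f ms end-to-end\n", estimateEndToEndMs(40.0, 72.0));
  return 0;
}
```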

In this TouchMarks report, we explored the Minimum App Response Time – essentially the best an app could possibly do. In future TouchMarks, we’ll look at how device latencies have changed over time (including benchmarking the new 5C and 5S) and the actual response times achieved by a few apps on these devices. Stay tuned…

*Theoretically we might be able to do better by only filling a small portion of the screen. However, we felt filling the full screen is more representative of a typical app use case, which might involve panning and shaders (in games) or screen transitions (in apps). We might explore the fill rate’s effect on latency in future reports.

Special thanks to Tim English at Stanford for consulting and checking our work on TouchMarks.

UPDATE: In preparing for our TouchMarks II release, we discovered an optimization in our iOS test app that was not present in our Android or Windows Phone test apps. To keep the benchmark consistent across all devices, we have removed the optimization from our iOS test app and updated the iPhone results and graph in this post to reflect the change. We’ll be exploring the effect of the optimization in a later post. I apologize for the error.