
Mobile Application Performance Testing: An End-to-End Approach

With mobile application capabilities and usage increasing, the functions performed on a mobile device are becoming both more complex and more critical to businesses. The mobile mode of service delivery, being closest to end users and always available, requires that developers pay particular attention to mobile application performance testing. Users, in fact, expect more responsive performance from mobile applications than they do from desktop apps.

So it has become extremely important to determine the performance a mobile application's end user will experience before actually deploying the application to a mobile platform. To do so, mobile application developers need to performance test their applications against an end-to-end application delivery model that covers the varying scenarios on each component involved.

In this article, we propose such an end-to-end approach to mobile application performance testing. Our aim is to present a performance testing exercise that will help mobile application developers accurately gauge performance from the end-user’s point of view.

Mobile Application Testing by Type

It is important first to understand the types of mobile applications developed today. Each type calls for a tailored performance testing approach targeting the components and processes involved.

Testing Browser-Based Mobile Apps

Browser-based mobile applications may perform differently from the same application accessed via a desktop browser. The causes of these differences include the bandwidth limitations of the user's cellular data plan and differing device configurations. The advantage of a browser-based mobile application is its lower development cost, as the developer needs only to make the app compatible with most mobile browsers.

While testing for performance, it is important to replicate server load from mobile browsers as well, particularly when dedicated components are deployed to serve such requests. It is equally important to test Web page rendering on the target device, as device configuration (e.g., CPU, memory, or browser rendering engine) can affect end-user performance.
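
To make the server-side load realistic, each virtual user's requests can be tagged with the browser population it represents. The Python sketch below, with illustrative User-Agent strings (not an exhaustive or authoritative list), shows one way a load script might distinguish mobile-browser traffic from desktop traffic; commercial load testing tools expose the same idea through request headers.

```python
# Sketch: tagging load-test requests by workload type so server-side
# components that serve mobile browsers receive their share of the load.
# The User-Agent strings below are illustrative examples only.

MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X) "
             "AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148")
DESKTOP_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36")

def build_request_headers(workload: str) -> dict:
    """Return HTTP headers for a virtual user of the given workload type."""
    if workload == "mobile_browser":
        return {"User-Agent": MOBILE_UA, "Accept": "text/html"}
    if workload == "desktop_browser":
        return {"User-Agent": DESKTOP_UA, "Accept": "text/html"}
    raise ValueError(f"unknown workload type: {workload}")

# A load script would attach these headers to every virtual-user request,
# so any mobile-specific routing or content adaptation on the server is hit.
print("Mobile" in build_request_headers("mobile_browser")["User-Agent"])  # True
```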

Testing Native Mobile Apps

Native mobile applications are installed directly onto mobile devices, allowing users to access them quickly. The disadvantage of native mobile apps is development cost: because the applications are platform dependent, an app developed for one mobile OS won't work on another. The developer is left having to develop and test the application on every platform, such as iOS, Android, BlackBerry, and so on. This calls for thorough performance testing on each target platform as well as on a set of devices within each selected platform.

Testing Hybrid Mobile Apps

Hybrid mobile applications are installed on mobile devices just like native mobile apps but users can access them via either an installed app or the device browser. The application is developed using Web technologies and then wrapped in a platform-specific shell that allows the app to be installed just like a native app.

As with native applications, the disadvantage of hybrid mobile applications is development cost. To allow a user to access the application via an installed app, the developer has to develop the application for all the platforms mentioned in the previous section. Again, a performance test strategy has to target the load generated by users of such hybrid apps as well as gauge the on-device application performance.

Mobile Application Performance Drivers

An ideal mobile application performance test should target every performance driver. Figure 1 presents the key components governing a mobile application’s performance.

For a typical application delivered to a mobile device, the performance as perceived by the end user depends on two key factors:

  • Content delivery to device: This process involves the request delivery to the server, response generation at the server, and the response delivery back to the source device. Typical parameters that affect this are load on the server components, type of network involved, load on the network, and the external component behavior. These parameters can be handled and tested for by emulating the expected real world conditions on the server and network.
  • Content rendering at device: This process involves displaying the response from the server at the source device. Typical parameters that affect this are type of platform, type and configuration of device, and the single user behavior patterns. These parameters can be handled and tested for by in-parallel, on-device application access and monitoring.
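
The two factors above can be captured in a simple model that splits the user-perceived response time into its delivery and rendering parts. The sketch below is a minimal illustration, with hypothetical millisecond figures, of how such a breakdown helps identify the dominant contributor:

```python
# Minimal model of user-perceived response time: content delivery
# (server + network) plus content rendering on the device.
from dataclasses import dataclass

@dataclass
class ResponseBreakdown:
    server_ms: float    # response generation at the server
    network_ms: float   # request + response transit on the network
    render_ms: float    # content rendering on the device

    def total_ms(self) -> float:
        """End-to-end time as the end user perceives it."""
        return self.server_ms + self.network_ms + self.render_ms

    def dominant(self) -> str:
        """Name the component contributing the most time."""
        parts = {"server": self.server_ms,
                 "network": self.network_ms,
                 "render": self.render_ms}
        return max(parts, key=parts.get)

# Hypothetical measurements for one transaction:
b = ResponseBreakdown(server_ms=120.0, network_ms=300.0, render_ms=80.0)
print(b.total_ms(), b.dominant())  # 500.0 network
```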

Mobile Application Performance Testing Approach

Considering the need to target every driver of mobile application performance, Figure 2 presents, at a high level, the steps required.

Before gauging the application's performance on the end device, the performance test must first cover the other scenarios in a typical application delivery.

Recreating real-world server conditions

This involves identifying and recreating the conditions as expected on the production servers. There are multiple conditions that need to be recreated. Table 1 below describes them.

Condition: Workload type, in terms of where the requests are generated from (e.g., Web application from desktop and mobile; native iPhone and Android applications)
Recreated by: Create the scripts specific to each workload type
Available solutions: Any load testing solution (HP LoadRunner, Soasta CloudTest, etc.)

Condition: Load on server, in terms of number of users from each workload type (e.g., 200 from the Web application, 50 each from the iPhone and Android native apps)
Recreated by: Create a load testing scenario combining the load with the associated scripts for each workload type
Available solutions: Any load testing solution (HP LoadRunner, Soasta CloudTest, etc.)

Condition: Load on server, in terms of number of users from different geographic locations (e.g., 50% from the US, 50% from the rest of the world)
Recreated by: Generate load from load generators at the identified locations
Available solutions: Cloud-based load testing solution (Soasta CloudTest, Gomez Web Load Testing, Keynote Web Load Testing)

Table 1. Mobile App Performance Testing: Recreating Server Conditions
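
As a minimal sketch of the kind of scenario Table 1 describes, the following snippet drives a workload mix of 200 Web users and 50 users each for two native apps against stubbed scripts. A commercial load testing tool replaces the stubs with recorded scripts and real requests; the structure of the scenario is the point here.

```python
# Minimal sketch of a mixed-workload load scenario like the Table 1 example.
# The "scripts" are stand-in functions; a real tool issues real requests.
from concurrent.futures import ThreadPoolExecutor
import time

def web_script():        # stand-in for a recorded desktop/mobile Web script
    time.sleep(0.01)
    return "web_ok"

def native_script():     # stand-in for a recorded native-app API script
    time.sleep(0.005)
    return "native_ok"

SCENARIO = [  # (script, virtual users) -- counts follow the Table 1 example
    (web_script, 200),
    (native_script, 50),   # iPhone native app users
    (native_script, 50),   # Android native app users
]

def run_scenario(scenario):
    """Run every virtual user once and collect the results."""
    with ThreadPoolExecutor(max_workers=32) as pool:
        futures = [pool.submit(script)
                   for script, users in scenario
                   for _ in range(users)]
        return [f.result() for f in futures]

results = run_scenario(SCENARIO)
print(len(results))  # 300 virtual-user iterations
```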

Recreating real-world network conditions

This involves identifying and recreating the conditions on the network to be targeted, while gauging the application’s performance on the target device. Table 2 describes these multiple conditions.

Condition: Network type and quality (e.g., 3G/2G/WiFi under average/best/worst quality)
Recreated by: (1) Emulation; (2) Devices on real networks from mobile carriers or ISPs
Available solutions: (1) Infosys Windtunnel, Shunra; (2) Keynote DeviceAnywhere, Gomez Synthetic Monitoring

Condition: Network load (e.g., 50% of bandwidth utilized)
Recreated by: Emulation only
Available solutions: Any network emulation solution (Infosys Windtunnel, Shunra)

Condition: Network by geography (e.g., AT&T 3G in New York, Airtel 3G in Bangalore)
Recreated by: (1) Emulation; (2) Devices on real networks from mobile carriers or ISPs
Available solutions: (1) Infosys Windtunnel, Shunra; (2) Keynote DeviceAnywhere, Gomez Synthetic Monitoring

Table 2. Mobile App Performance Testing: Recreating Network Conditions
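
Before reaching for a full emulator, a back-of-the-envelope model of the network profiles above can sanity-check expected transfer times. The sketch below models only latency plus serialization delay, with illustrative (not measured) profile figures; real emulators shape actual traffic and capture effects this model ignores, such as packet loss and jitter.

```python
# Back-of-the-envelope transfer-time model for common network profiles.
# Latency/bandwidth figures are illustrative assumptions, not measurements.

PROFILES = {            # (round-trip latency in ms, downlink in kbit/s)
    "wifi": (20, 20000),
    "3g":   (150, 2000),
    "2g":   (400, 200),
}

def transfer_time_ms(payload_bytes: int, profile: str) -> float:
    """Latency plus serialization delay for one response of the given size."""
    latency_ms, kbps = PROFILES[profile]
    # bits divided by (kbit/s) yields milliseconds, since 1 kbit = 1000 bits
    serialization_ms = payload_bytes * 8 / kbps
    return latency_ms + serialization_ms

# A 100 KB page body under each profile:
for name in PROFILES:
    print(name, round(transfer_time_ms(100_000, name), 1), "ms")
```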

Recreating real-world device conditions

This involves identifying and recreating the conditions on the mobile device to be targeted, while gauging the application’s performance on the target device. Table 3 describes these multiple conditions.

Condition: Application type (e.g., thick or thin client)
Recreated by: Load the application under test on real devices
Available solutions: Keynote Mobile Web Perspective, Gomez Synthetic Monitoring

Condition: Platform type (e.g., iOS, Android)
Recreated by: Load the application under test on real devices
Available solutions: Keynote Mobile Web Perspective, Gomez Synthetic Monitoring

Condition: Device type (e.g., iPhone/iPad, Samsung Galaxy Nexus)
Recreated by: Procure the necessary devices and load the application under test on them
Available solutions: Keynote Mobile Web Perspective, Gomez Synthetic Monitoring

Condition: Parallel applications/OS components running (e.g., music or GPS component in use)
Recreated by: Put the concerned component in use and load the application on real devices
Available solutions: Keynote Mobile Web Perspective, Gomez Synthetic Monitoring

Table 3. Mobile App Performance Testing: Recreating Device Conditions
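
The device-side conditions multiply quickly, so it helps to generate the test matrix programmatically. The sketch below enumerates platform, device, and background-component combinations; the device list is illustrative, where a real matrix would come from the application's target-market analysis.

```python
# Sketch: expanding device-side conditions into a concrete test matrix.
# Platform/device entries are illustrative assumptions, not a recommendation.
from itertools import product

PLATFORMS = {"iOS": ["iPhone", "iPad"], "Android": ["Galaxy Nexus"]}
BACKGROUND = ["none", "music", "gps"]   # parallel component in use

def device_matrix():
    """Return every (platform, device, background component) combination."""
    cells = []
    for platform, devices in PLATFORMS.items():
        for device, bg in product(devices, BACKGROUND):
            cells.append((platform, device, bg))
    return cells

matrix = device_matrix()
print(len(matrix))  # 3 devices x 3 background conditions = 9 cells
```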

Gauge the performance at each component

After recreating the real-world conditions, the mobile app developer should measure the performance of the application for every delivery mode along with the other hardware and software components involved. It is important to measure the performance on every component as that will provide an end-to-end view of the application’s performance and not just an isolated one-device performance perspective. Table 4 lists various metrics that can be measured on different components.

Component: Server
Monitoring parameters (not exhaustive): CPU usage, load, process time, bytes total, user time, packets sent/received
Available solutions: Any server monitoring solution (HP SiteScope, Perfmon)

Component: Network
Monitoring parameters (not exhaustive): Packets and bytes sent, packets and bytes received, average delay, packet drops
Available solutions: Any network monitoring solution

Component: Device (if monitoring is allowed)
Monitoring parameters (not exhaustive): CPU and memory usage, method-level profiling, Web application component-level performance, response times
Available solutions: Platform-specific profiling solution (Instruments, DDMS); Keynote MITE, Gomez Cross-Browser Testing

Component: Transaction
Monitoring parameters (not exhaustive): Response times, throughput
Available solutions: Any load testing solution

Table 4. Performance Metrics to Monitor by Component
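
Once monitors are in place for each component, the raw samples need to be rolled up into comparable summaries. The sketch below, with invented sample values, computes the mean, an approximate 90th percentile, and the maximum per metric using only the Python standard library; a real monitoring tool reports many more metrics.

```python
# Sketch: summarizing per-component samples gathered during a load test.
# Sample values are invented for illustration.
from statistics import mean, quantiles

samples = {
    "server_cpu_pct":      [35, 42, 55, 61, 58, 49],
    "transaction_resp_ms": [180, 220, 950, 310, 270, 205, 240, 1100, 260, 230],
}

def summarize(values):
    """Roll raw samples into mean, approximate p90, and max."""
    p90 = quantiles(values, n=10)[-1]   # last cut point = 90th percentile
    return {"mean": round(mean(values), 1),
            "p90": round(p90, 1),
            "max": max(values)}

for metric, values in samples.items():
    print(metric, summarize(values))
```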

Following the approach described in this article ensures that the performance of the mobile application under test is measured and assessed accurately, provided the other factors affecting its production performance are emulated as faithfully as possible. The approach provides insight into on-device performance as the end user would perceive it.
