Mobile Application Performance Testing: An End-to-End Approach, Page 2



Mobile Application Performance Testing Approach

Given the need to target every driver of a mobile application's performance, Figure 2 presents, at a high level, the steps required.




Figure 2. Steps to Target Mobile App Performance Drivers



So before gauging the application's performance on the end device, it is important that performance testing also cover the other conditions present in a typical application delivery: the server, the network, and the device itself.

Recreating real-world server conditions

This involves identifying and recreating the conditions expected on the production servers. Table 1 describes the conditions that need to be recreated.

Condition: Workload type, in terms of where the requests originate (e.g., Web application on desktop and mobile; native iPhone and Android applications)
  Recreated by: Creating scripts specific to each workload type
  Available solutions: Any load testing solution (HP LoadRunner, Soasta CloudTest, etc.)

Condition: Load on the server, in terms of the number of users from each workload type (e.g., 200 from the Web application, 50 each from the iPhone and Android native apps)
  Recreated by: Creating a load testing scenario specific to the load, with the associated scripts for each workload type
  Available solutions: Any load testing solution (HP LoadRunner, Soasta CloudTest, etc.)

Condition: Load on the server, in terms of the number of users from different geographic locations (e.g., 50% from the US, 50% from the rest of the world)
  Recreated by: Generating load from load generators at the identified locations
  Available solutions: Cloud-based load testing solutions (Soasta CloudTest, Gomez Web Load Testing, Keynote Web Load Testing)

Table 1. Mobile App Performance Testing: Recreating Server Conditions
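In practice, a commercial tool such as HP LoadRunner or Soasta CloudTest would generate the workload-specific scripts and scenarios described in Table 1, but the idea can be sketched with nothing beyond the Python standard library. In the sketch below, the local echo server, the User-Agent strings, and the virtual-user counts are all hypothetical stand-ins for a real application server and a real workload mix:

```python
import threading
import urllib.request
from http.server import HTTPServer, BaseHTTPRequestHandler

# Minimal stand-in for the application server under test.
class EchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base_url = "http://127.0.0.1:%d/" % server.server_address[1]

# One User-Agent per workload type, so the server sees a mix of
# desktop web, mobile web, and native-app traffic (values illustrative).
WORKLOADS = {
    "desktop_web": ("Mozilla/5.0 (Windows NT 10.0)", 4),    # 4 virtual users
    "mobile_web":  ("Mozilla/5.0 (iPhone; CPU iPhone OS)", 2),
    "native_app":  ("MyApp/1.0 (Android)", 2),
}

results = {}   # workload name -> list of HTTP status codes
lock = threading.Lock()

def virtual_user(workload, user_agent, requests_per_user=3):
    for _ in range(requests_per_user):
        req = urllib.request.Request(base_url, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(req) as resp:
            with lock:
                results.setdefault(workload, []).append(resp.status)

threads = []
for workload, (ua, users) in WORKLOADS.items():
    for _ in range(users):
        t = threading.Thread(target=virtual_user, args=(workload, ua))
        threads.append(t)
        t.start()
for t in threads:
    t.join()
server.shutdown()

for workload, statuses in sorted(results.items()):
    print(workload, len(statuses), "requests, all OK:",
          all(s == 200 for s in statuses))
```

A real load testing tool adds what this sketch omits: think times, pacing, parameterized test data, and distributed load generators.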

Recreating real-world network conditions

This involves identifying and recreating the conditions on the target network while gauging the application's performance on the target device. Table 2 describes these conditions.

Condition: Network type and quality (e.g., 3G/2G/WiFi under average/best/worst conditions)
  Recreated by: (1) Emulation; (2) Devices on real networks from mobile carriers or ISPs
  Available solutions: (1) Infosys Windtunnel, Shunra; (2) Keynote DeviceAnywhere, Gomez Synthetic Monitoring

Condition: Network load (e.g., 50% of bandwidth utilized)
  Recreated by: Emulation only
  Available solutions: Any network emulation solution (Infosys Windtunnel, Shunra)

Condition: Network by geography (e.g., AT&T 3G in New York, Airtel 3G in Bangalore)
  Recreated by: (1) Emulation; (2) Devices on real networks from mobile carriers or ISPs
  Available solutions: (1) Infosys Windtunnel, Shunra; (2) Keynote DeviceAnywhere, Gomez Synthetic Monitoring

Table 2. Mobile App Performance Testing: Recreating Network Conditions
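A real network emulator such as the solutions in Table 2 shapes traffic at the packet level; the sketch below only models the first-order effect of a network profile, delaying each transfer by round-trip latency plus a bandwidth-based transfer time. The profile names and numbers are illustrative, not measured values:

```python
import time

# Illustrative network profiles: (one-way latency in seconds, bandwidth in bytes/s).
# Real emulators also model jitter, packet loss, and TCP behavior.
PROFILES = {
    "wifi_best":  (0.010, 3_000_000),
    "3g_average": (0.100,   150_000),
    "2g_worst":   (0.400,    15_000),
}

def emulated_transfer_time(payload_bytes, profile):
    """Estimated time to deliver a payload under the given network profile."""
    latency, bandwidth = PROFILES[profile]
    return 2 * latency + payload_bytes / bandwidth  # request + response latency

def fetch_with_emulation(payload_bytes, profile):
    """Stand-in for a real request: sleep for the emulated transfer time."""
    delay = emulated_transfer_time(payload_bytes, profile)
    time.sleep(delay)
    return delay

# The same 60 KB page load feels very different on each profile:
for name in PROFILES:
    print(f"{name}: {emulated_transfer_time(60_000, name):.2f}s")
```

Even this crude model makes the point of Table 2: results measured only on office WiFi say little about the 2G/3G conditions many users actually experience.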

Recreating real-world device conditions

This involves identifying and recreating the conditions on the target mobile device while gauging the application's performance on it. Table 3 describes these conditions.

Condition: Application type (e.g., thick or thin client)
  Recreated by: Loading the application under test on real devices
  Available solutions: Keynote Mobile Web Perspective, Gomez Synthetic Monitoring

Condition: Platform type (e.g., iOS, Android)
  Recreated by: Loading the application under test on real devices
  Available solutions: Keynote Mobile Web Perspective, Gomez Synthetic Monitoring

Condition: Device type (e.g., iPhone/iPad, Samsung Galaxy Nexus)
  Recreated by: Procuring the necessary devices and loading the application under test on them
  Available solutions: Keynote Mobile Web Perspective, Gomez Synthetic Monitoring

Condition: Parallel applications/OS components running (e.g., music or GPS component in use)
  Recreated by: Putting the concerned component in use and loading the application on real devices
  Available solutions: Keynote Mobile Web Perspective, Gomez Synthetic Monitoring

Table 3. Mobile App Performance Testing: Recreating Device Conditions
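The "parallel component running" condition from Table 3 can only be tested properly on real devices, but its effect is easy to demonstrate with a synthetic sketch: time an operation alone, then time it again while a CPU-hungry background task (standing in for, say, a music or GPS component) competes for resources. Both workloads below are made up for illustration:

```python
import threading
import time

def app_operation():
    # Stand-in for the operation under test: some CPU-bound work.
    return sum(i * i for i in range(200_000))

def timed(fn):
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

# 1. Baseline: the operation with nothing else running.
baseline = timed(app_operation)

# 2. Same operation while a busy background thread competes for the CPU.
stop = threading.Event()
def background_component():
    while not stop.is_set():
        sum(range(10_000))  # busy loop standing in for a parallel component

t = threading.Thread(target=background_component)
t.start()
with_parallel = timed(app_operation)
stop.set()
t.join()

print(f"alone: {baseline * 1000:.1f} ms, "
      f"with parallel component: {with_parallel * 1000:.1f} ms")
```

On a real device the contention would also involve memory, I/O, and radio usage, which is why Table 3 recommends real hardware rather than simulation for these conditions.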

Gauging the performance at each component

After recreating the real-world conditions, the mobile app developer should measure the application's performance for every delivery mode, along with the other hardware and software components involved. Measuring every component matters because it provides an end-to-end view of the application's performance rather than an isolated, single-device perspective. Table 4 lists the metrics that can be measured on each component.

Component: Server
  Monitoring parameters (not exhaustive): CPU usage; load; process time; bytes total; user time; packets sent/received
  Available solutions: Any server monitoring solution (HP SiteScope, Perfmon)

Component: Network
  Monitoring parameters (not exhaustive): Packets and bytes sent; packets and bytes received; average delay; packet drops
  Available solutions: Any network monitoring solution

Component: Device (if monitoring is allowed)
  Monitoring parameters (not exhaustive): CPU and memory usage; method-level profiling; Web application component-level performance; response times
  Available solutions: Platform-specific profiling solutions (Instruments, DDMS); Keynote MITE, Gomez Cross-Browser Testing

Component: Transaction
  Monitoring parameters (not exhaustive): Response times; throughput
  Available solutions: Any load testing solution

Table 4. Performance Metrics to Monitor by Component
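Once the response times for a transaction have been collected from a test run, the transaction-level metrics in Table 4 reduce to summary statistics. A minimal sketch with the Python standard library, using made-up sample data:

```python
import statistics

# Hypothetical response-time samples (seconds) collected during a load test;
# note the single slow outlier at 0.95 s.
response_times = [0.21, 0.25, 0.23, 0.31, 0.28, 0.95, 0.24, 0.27, 0.30, 0.26]
test_duration_s = 5.0  # wall-clock length of the measurement window

# Throughput: completed transactions per second over the window.
throughput = len(response_times) / test_duration_s

avg = statistics.mean(response_times)
# quantiles(n=100) yields the 1st..99th percentiles; index 89 is the 90th.
p90 = statistics.quantiles(response_times, n=100)[89]

print(f"throughput: {throughput:.1f} tps")
print(f"average:    {avg:.3f} s")
print(f"90th pct:   {p90:.3f} s")
```

The gap between the average and the 90th percentile is exactly why percentiles belong in a performance report: a few slow requests barely move the mean but dominate what the slowest users actually experience.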

Following the approach described in this article ensures that the performance of the mobile application under test is measured and assessed accurately, provided that the other factors affecting its performance in production are emulated as faithfully as possible. It also provides insight into the application's performance on the device as the end user would perceive it.



Amit Gawande works as a Technology Lead at Infosys Labs, the research wing of Infosys Limited. He has gained considerable experience in performance engineering methodologies and cloud computing during his five years in the field. His research interests include performance modeling and simulation techniques.