A/B test your mobile apps and websites for quick UX wins

Posted on April 18, 2024
22 min read


Every product designer or developer needs A/B testing in their toolkit, including those who design mobile experiences. It’s one of the most impactful methods for making iterative changes and optimizing user experiences. We’ve put together this handy resource on A/B testing mobile apps and other mobile experiences to help your team make data-informed decisions while creating a truly user-centric digital product.

What is A/B testing for mobile apps?

A/B testing for mobile apps is a method teams use to compare two app versions to determine which one performs better. The process involves creating two variations of the app, typically differing in one aspect like design, features, or user interface (UI) elements. The two versions then simultaneously launch, each to a different segment of the app’s user base or target audience.

Mobile app A/B testing process

The A/B testing process for mobile apps generally works like this:

  1. Form a hypothesis: The first step is hypothesizing what change might improve the app experience. This could be anything from changing the color of a button to reorganizing the entire layout.
  2. Create variations: Make two versions of the app. One version will remain unchanged as the control (version A), and the other will contain the proposed change (version B). The two should be identical except for the specific element you’re testing.
  3. Segment your audience: Divide the app’s user base into two groups and give each group access to one variation. You can make this division either randomly or based on specific criteria.
  4. Collect data: Define the metrics you want to track to measure the performance of each variant of the app. These metrics could include user engagement, retention rate, conversion rate, or any other relevant key performance indicators (KPIs).
  5. Analyze the data: Perform statistical analysis on the data from both versions of the app to determine whether there’s a significant difference in performance between the two (see the code sketch after this list).
  6. Make a decision: Based on your analysis, decide which version of the app performed better. If version B significantly outperforms version A, implement the change as the new standard version of the app.
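To make steps 3 and 5 concrete, here’s a minimal sketch in Python of one common approach: deterministically bucketing users into the control and variant, then comparing conversion rates with a two-proportion z-test. The experiment name, user IDs, and conversion counts are made-up placeholders, and in practice an A/B testing platform or analytics tool usually handles this work for you.

```python
import hashlib
import math


def assign_variant(user_id: str, experiment: str = "onboarding_cta") -> str:
    """Deterministically bucket a user into the control (A) or variant (B)."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "B" if int(digest, 16) % 2 else "A"


def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value


# Illustrative numbers only: 5,000 users per arm, 400 vs. 460 conversions.
z, p = two_proportion_z_test(conv_a=400, n_a=5000, conv_b=460, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally below 0.05) suggests the difference between the two versions is unlikely to be due to chance; the related reading below digs into what that threshold actually means.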

Related reading: Demystifying UX statistics: What is p and what does p < 0.05 mean?

Why A/B testing mobile apps is important

A/B testing allows app developers to make data-driven decisions about a mobile app’s user experience (UX) and UI design, including changes to design and functionality. By comparing different app versions, product teams can determine which design, feature, or interface changes lead to better outcomes. These decisions improve UX, engagement, and overall app performance.

Of course, it’s always a good idea to follow standard mobile app UX best practices during the design process as well.

Benefits of A/B testing mobile apps

Conducting A/B testing during your app design process offers your UX team several advantages, such as:

  • Data-driven insights
  • Optimized in-app engagements
  • Real-time observations
  • Audience segmentation insights
  • Feature impact observation
  • Better understanding of user behavior
  • Improved KPIs
  • Iterative improvements
  • Cost-efficiency

Other app-testing resources

Consider conducting a mobile app evaluation and a mobile app comparison using our comprehensive Testing template gallery. 

Check out our Testing mobile experiences guide to see how user feedback from mobile testing can guide the conception, design, development, and implementation of your team’s mobile experiences.

How do I test my mobile app for QA?

Testing a mobile app for quality assurance (QA) means making sure the app functions correctly, meets user expectations, and delivers a positive user experience. Here’s a general overview of the QA testing process for a mobile app:

1. Define testing objectives

Outline your goals for the QA testing process. Determine what aspects of the app need to be tested, including functionality, usability, performance, compatibility, and security.

2. Create test cases

Develop test cases that cover all of the app’s features and functionalities. These should include steps to reproduce specific scenarios, expected results, and acceptance criteria.
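To illustrate the level of detail a test case can carry, here’s a hypothetical example expressed as a small Python structure. The feature, steps, and criteria are invented for the sake of the sketch; many teams keep the same information in a test-management tool or spreadsheet instead.

```python
from dataclasses import dataclass, field


@dataclass
class TestCase:
    id: str
    feature: str
    steps: list[str]
    expected_result: str
    acceptance_criteria: list[str] = field(default_factory=list)


# Hypothetical login test case; adapt the fields to your own app and tooling.
login_case = TestCase(
    id="TC-042",
    feature="Login",
    steps=[
        "Launch the app on a clean install",
        "Tap 'Log in' on the welcome screen",
        "Enter a valid email and password",
        "Tap 'Submit'",
    ],
    expected_result="User lands on the home feed within 3 seconds",
    acceptance_criteria=[
        "No crash or error dialog appears",
        "The session token is stored securely",
    ],
)
```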

3. Conduct manual testing

Perform manual testing by following your test cases on various devices, operating systems, and network conditions. Test different aspects of the app, including navigation, UI elements, input validation, error handling, and data integrity.

4. Perform automated testing

Using testing frameworks and tools, implement automated testing to streamline repetitive tests and get consistent results. You can automate regression tests, UI tests, integration tests, and performance tests to validate your app’s functionality across different environments.
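As a small example of what an automated regression test can look like, here’s a pytest-style sketch that exercises a hypothetical email-validation helper. The validate_email function is an invented stand-in for whatever logic your app exposes to tests; UI-level automation would typically go through a framework such as Espresso, XCTest, or Appium instead.

```python
import re

import pytest


# Stand-in for the app's real validation logic; in practice you would import it.
def validate_email(value: str) -> bool:
    return bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", value))


@pytest.mark.parametrize(
    ("value", "expected"),
    [
        ("user@example.com", True),
        ("user@sub.example.co", True),
        ("no-at-sign.example.com", False),
        ("spaces @example.com", False),
        ("", False),
    ],
)
def test_validate_email(value, expected):
    # Rerunning this suite after every change catches regressions in the validation rules.
    assert validate_email(value) is expected
```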

5. Test functionality

Verify that all of your app’s features and functionalities work as intended by testing these elements:

  • User interactions
  • Input validation
  • Data processing
  • Navigation flows
  • Integrations with external services or APIs

6. Conduct usability testing

With real users, evaluate the app’s usability to gather feedback about the app’s UI, navigation structure, accessibility, and overall UX.

7. Test the app’s performance

Assess your app’s performance under various conditions, including different device configurations, network speeds, and user loads. Test for:

  • Responsiveness
  • Loading times
  • Resource utilization
  • Scalability
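As a rough illustration of the responsiveness and loading-time checks above, the Python sketch below fires concurrent requests at a placeholder API endpoint and reports latency percentiles. The URL, concurrency level, and request count are arbitrary examples; dedicated load-testing tools, profilers, and device farms will give you far more realistic numbers than a quick probe like this.

```python
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://api.example.com/v1/feed"  # placeholder endpoint
CONCURRENCY = 20
REQUEST_COUNT = 200


def timed_request(_: int) -> float:
    """Time a single request, including reading the full response body."""
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=10) as response:
        response.read()
    return time.perf_counter() - start


if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        latencies = sorted(pool.map(timed_request, range(REQUEST_COUNT)))
    print(f"median: {statistics.median(latencies) * 1000:.0f} ms")
    print(f"p95:    {latencies[int(0.95 * len(latencies))] * 1000:.0f} ms")
```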

8. Test for compatibility

Your app should function correctly on different devices, screen sizes, resolutions, and operating systems. Test for compatibility with popular devices, platforms, and browser versions to reach a wider audience.
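One lightweight way to keep that coverage explicit is to drive your test runs from a device/OS matrix. Everything in the sketch below (the devices, OS versions, resolutions, and the run_smoke_suite placeholder) is illustrative; base the real matrix on the devices your analytics show your users actually own, and plug it into your device farm or CI pipeline.

```python
# Illustrative compatibility matrix; base the real one on the devices and OS
# versions your analytics show your users actually run.
COMPATIBILITY_MATRIX = [
    ("Pixel 8", "Android 14", "1080x2400"),
    ("Galaxy A54", "Android 13", "1080x2340"),
    ("iPhone 15", "iOS 17", "1179x2556"),
    ("iPhone SE (3rd gen)", "iOS 16", "750x1334"),
]


def run_smoke_suite(device: str, os_version: str, resolution: str) -> bool:
    """Placeholder: in practice this would launch your UI tests on a device
    farm or emulator configured for the given device."""
    print(f"Running smoke suite on {device} ({os_version}, {resolution})")
    return True


if __name__ == "__main__":
    failures = [
        config for config in COMPATIBILITY_MATRIX if not run_smoke_suite(*config)
    ]
    print("All configurations passed" if not failures else f"Failed: {failures}")
```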

9. Perform security testing

Within the app, identify and address security risks and vulnerabilities by testing for common security threats. These include data breaches, unauthorized access, injection attacks, and encryption weaknesses. 
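Dedicated scanners and penetration testing do the heavy lifting here, but even a small script can catch low-hanging fruit such as hardcoded credentials. The patterns, file extensions, and source path in the Python sketch below are examples only, not an exhaustive security check.

```python
import re
from pathlib import Path

# Example patterns that often indicate hardcoded credentials; extend for your stack.
SECRET_PATTERNS = {
    "credential assignment": re.compile(
        r"(?i)(api[_-]?key|secret|token|password)\s*[=:]\s*['\"][A-Za-z0-9_\-]{12,}['\"]"
    ),
    "AWS access key ID": re.compile(r"AKIA[0-9A-Z]{16}"),
}
SOURCE_EXTENSIONS = {".kt", ".java", ".swift", ".json", ".xml", ".plist"}


def scan_for_secrets(root: str = "./app/src") -> list[str]:
    """Flag lines that look like hardcoded secrets in the source tree."""
    findings = []
    for path in Path(root).rglob("*"):
        if not path.is_file() or path.suffix not in SOURCE_EXTENSIONS:
            continue
        text = path.read_text(errors="ignore")
        for label, pattern in SECRET_PATTERNS.items():
            for match in pattern.finditer(text):
                line_no = text.count("\n", 0, match.start()) + 1
                findings.append(f"{path}:{line_no}: possible {label}")
    return findings


if __name__ == "__main__":
    for finding in scan_for_secrets():
        print(finding)
```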

10. Conduct regression testing

Regression testing can help you verify that recent code changes or updates haven’t introduced new defects or regressions. At this point, it’s prudent to rerun your test cases and validate your app’s behavior to maintain its stability and reliability.

11. Track and report bugs

During the testing process, document any issues, bugs, or defects you discover. Use bug-tracking tools to log detailed descriptions, screenshots, and steps to reproduce the issues. Prioritize issues and assign them for resolution based on their severity and impact.

12. Test iteratively

Based on feedback, test results, and evolving requirements, continuously iterate and refine your QA testing process. Incorporate user feedback, performance metrics, and market insights to improve the app’s quality and effectiveness over time.

How to A/B test an app store listing

Conducting A/B tests for app store listings involves experimenting with different elements of your app’s listing on platforms like the Apple App Store or the Google Play Store to determine which variations lead to higher downloads, conversions, and engagement. Some common elements you can test are:

  • App title: Test different variations of your app’s title to see which one attracts more attention and accurately conveys your app’s value proposition.
  • Icon: Experiment with various app icons to determine which one stands out more and entices users to click and learn more about your app.
  • Screenshots: Try different sets of screenshots to showcase multiple features, functionalities, and benefits of your app. Vary the order of your screenshots and their captions.
  • App description: Test different versions of your app’s descriptions to see which one resonates better with users and encourages more downloads. You can vary the tone, length, and your app’s key selling points.
  • Keywords and metadata: Experiment with various keywords, tags, and metadata to improve your app’s discoverability in search results. Test variations in keywords’ relevance, popularity, and competitiveness.
  • App preview video: If you have different versions of your app preview video, test them to see which one effectively demonstrates your app’s features and functionality and leads more users to download it.
  • Call-to-action (CTA) button: Test variations in the text, color, size, and placement of your app’s download or install button to optimize conversion rates.

Expert Q&A: A/B testing mobile apps

For this section, we held a Q&A session with Nancy Hua, the CEO of Apptimize. Apptimize is a California-based organization that enables mobile app design teams to conduct effective A/B tests, implement new features, and create personalized user experiences.

What are the most common UX problems you see on mobile apps?

One of the most common problems we see is mobile apps not onboarding users well. On mobile, users make decisions about your app very quickly. First impressions matter. According to research, 25% of apps are abandoned after first-time use. A lot of teams out there are focused on making a whole bunch of awesome features for their users, but if a huge chunk of your users are deleting your app after only a few minutes, all that effort goes to waste.

If you don’t nail onboarding, your developers may as well have been drinking beers instead of building those features that no one saw. It’s not a contest of who has the most features. The apps that succeed are the ones that convey their value proposition to the users from the get-go. Instead of focusing on building out more cool features, app teams should focus on how to showcase them properly.

The first step is to drive users toward the aha moment, the instant when users realize what value the app will provide in their lives. This can be done using onboarding tutorials, good clean UI design, and quickly driving users toward core functionality. Once your users reach the aha moment, they won’t delete your app. Then, you can start showing off all your other features.

What’s next? What should teams do to engage and retain users?

The aha moment is reached relatively early on in use, so the next key is to show the user what else they can do in an app, and get them deeper into your ecosystem.

Another common mistake is to hide additional features in an app drawer, the most common of which is the hamburger menu. The hamburger menu is problematic. It signals to users that features tucked away in the menu are not important. Since screen space is incredibly limited, only the most valuable features are immediately viewable. Even if your feature is amazing, the impression that users get is that it’s neglected, cast aside in the same drawer as user settings, share buttons, legal, and other non-essentials. Users don’t explore as much as we think they do, so showing features they’ll value front and center is vital.

Make it easy for users to find your best features.

One of our clients, a social fashion app, increased retention rates by 18% by simply moving a few key features from the hamburger menu into a tabbed menu at the bottom of the screen. We also recently talked to Slickdeals and they shared with us a similar problem: users were only using what was immediately displayed on the home screen. Their other features that are very popular on the web were neglected in the side drawer, so now they’re working on a redesign to better showcase their top features.

The point is that users generally don’t spontaneously discover all your features. If you have something your users will love, display it prominently to make sure that it’s being seen and utilized. But of course, you can’t overdo it either and bombard your users with ten thousand different features at once. This is where user testing and A/B testing come in to determine your top features and the ideal layout.

What are some of the testing methods app developers can use to improve their products?

One of the best things that app developers can do is to A/B test changes before deploying them to all of their users. Because the app marketplaces are so slow, iterating fast and understanding how your users are reacting to your feature changes can be incredibly difficult, and any mistake costs precious time and resources.

With A/B testing, teams can deploy new features to a small percentage of their users and analyze how those features affect user behavior. If a feature proves beneficial, they can then instantly deploy it to 100% of their users without having to resubmit.

App teams can also do user testing to get qualitative feedback from their users and hear them describe their experience in their own words. It’s good for pinpointing exactly what is confusing—or delightful—to the user.

Where does A/B testing fit into the process of testing and optimizing an app?

The traditional app cycle is incredibly long, drawn-out, and rigid, with dev cycles ranging anywhere from a few weeks to a few months per release.

A/B testing plays two key roles in testing and optimizing an app.

The first is to use instant A/B testing to shorten the QA process and bypass app store approvals. With it, you can make changes and roll them out to your users instantly, without waiting for an app store review.

The second place A/B testing plays a key role is getting better, more actionable data and analytics after launch. On a typical mobile release, teams usually roll out a bunch of different changes all at once, ranging from bug fixes, to feature additions, new UI elements, and so on. If your metrics are affected, it’s very difficult to figure out why.

If they go up, great! But you don’t know what specifically contributed to those increases. Was it the bug fixes? Did users love the new feature? Or was it due to the easier-to-use UI? This is where A/B testing comes in: rolling out changes as separate experiments lets you isolate which one actually moved the metric.

How can you use A/B testing and user testing together to get deeper insights into how users interact with your product?

User testing and A/B testing go hand in hand. While user testing provides the qualitative feedback, A/B testing takes care of the quantitative. User testing is great for macro opinions: users will give you early feedback on macro issues with your design, and it will help you understand why users behave in a certain way and what they think and feel about your app.

A/B testing, on the other hand, allows you to experiment with more detailed aspects of your app that users may not know they’re responding to. For example, users can tell you when a checkout flow is downright confusing, but they might not know whether cutting out a step in your checkout flow would help them buy more things. The two types of testing are two different approaches, so you can attack a problem from different angles.

Ultimately, both are essential tools that complement each other well. User testing results will give you ideas on what to A/B test. A/B testing results will give you ideas on what to ask your users.

What's the most common mistake developers make when they set out to optimize their apps?

The most common mistake is assuming that the changes they’re building are going to delight users. More broadly, mobile app creators assume that they are good at predicting what their users want and how their users will behave and feel. And hindsight bias allows us to feel like we knew the answer all along. But really, you need to ask your users and experiment in a data-rigorous way to really know how your users will act and react to your decisions.

Don't make assumptions about what will work. Talk to your users, and constantly experiment.

Just because people ask for a change doesn’t mean they really want it. A great example of this was when the digital magazine The Next Web was inundated with requests and pleas to build an Android app in addition to their popular iOS app. So they did. It turns out people weren’t downloading and using it.

"We had gotten enough requests for it and had gotten the impression there were thousands of anxious Android tablet owners holding their breath for an Android version of our magazine. Unfortunately, we’ve found out that although Android users are very vocal they aren’t very active when it comes to downloading and reading magazines."

Boris Veldhuijzen van Zanten, Co-Founder of The Next Web

Without hard data, it’s very difficult to judge what people say they want and what they really want. Figure out ways to validate your features before deploying them to your entire user base.

What are the most successful apps in the field doing?

The most successful apps in the field are staying on top of new technologies and trying out cutting-edge methods of development. They’re the ones that are always learning from their peers, keeping their ears to the ground, and aren’t afraid to make some mistakes.

The App Store almost discourages experimentation. Along with the arduous processes required to make any changes, every time you release a new version, you lose all your ratings. The best apps don’t let these types of things stop them. Instead, they learn to validate their hypotheses and incrementally compound their successes.

The best ones are also extremely user-focused. Rather than guessing and assuming they know their users, they ask them and test out many hypotheses. They constantly question what they can do to improve the user experience. 

Do you ever have clients who have trouble getting buy-in from their teams for testing? What do you say to them?

All. The. Time.

“We don’t have time to test right now because we’re working on this big release that’s coming out in three months.”

“We don’t have the resources to support a nice-to-have.”

At Apptimize, we hear this a lot. But the truth is that your release cycle shouldn’t be three months. While you’re working on your waterfall of three months, your competitors are staying agile, talking to customers, learning from customer behavior, and improving their product 10x with six smaller iterations in the time it took for you to do one big release. We’re not in the ’90s anymore. Mobile isn’t boxed enterprise software.

Staying agile and user-focused is critical to staying alive in a space where customers have a lot of choices, switching costs are low, and expectations are high.

If you aren't testing, you'll get left behind.

This is why top apps like Facebook and Netflix built their own user testing and A/B testing processes and platforms before anyone else in the space even thought about doing these things, and it’s why these companies have found so much continued success. They spent a lot of time and resources on building these capabilities at a time when building your own was the only way to do it. Now any app can buy the same functionality for a hundredth of the cost of building it in-house. Not doing so would be tantamount to getting left behind.

How can app developers create a culture of testing and optimization within their company?

We actually interviewed Lacy Rhodes, iOS Engineer at Etsy, a while ago on this very topic. Essentially, there is no one silver bullet. It’s about incrementally showing value and showing how positive learnings and results from testing compound into huge gains.

Small early wins definitely help to get buy-in from a larger team. Ultimately, testing is about proving value. Both user testing and A/B testing will help you prove the value of your ideas to the rest of the team and get everyone really excited about knowing what’s actually working and what’s not. It helps everyone waste less time, be more focused, and be heard.

For more tips on testing and optimizing your app, check out UserTesting's usability testing templates and checklists.

Expert Q&A: Mobile website user testing

Mobile user testing plays a critical role in creating winning experiences. Michael Mace compiled a series of answers to the top questions we hear time and time again when helping our clients run their UserTesting mobile studies. Michael, the VP of Market Strategy at UserTesting and a 35-year tech industry veteran, has held marketing and strategy roles at Apple and Palm, co-founded two startups, and consulted for multiple tech companies.

First, Michael answers our most commonly asked questions about mobile website user testing.

We have a successful computer-based website. How much should we worry about mobile?

It depends on your customer base. If you’re sure that none of them ever use mobile devices, you probably don’t have anything to worry about. In a far cry from the early 2000s, most U.S. adults today say they use the internet (95%), have a smartphone (90%) or subscribe to high-speed internet at home (80%), according to a Pew Research Center survey conducted May 19 to Sept. 5, 2023. Ask yourself, if you’re not competing on mobile, are you leaving yourself vulnerable to competitors who are?

Most companies should at a minimum test their websites on mobile to make sure they work properly and meet user expectations. And you should seriously consider either designing a mobile site from the bottom up or modifying your current one for smartphones and tablets. That involves rethinking not just how the site works, but what tasks users will want to do on mobile.

Which mobile platforms should we test our website on?

The leading mobile web platforms in the U.S. are iPhone, Android phones and tablets, and iPad. So you should definitely test on at least those three platforms. 

When should we test on mobile?

You should do user tests throughout the development process, so you can fix problems before they get too deeply embedded in your site. You can start testing as soon as you have anything to show to users, even if it’s just conceptual sketches.

It’s better to run frequent tests with a few testers each than to save up your tests and do them all at once at the end of development.

What sorts of tests should we do on mobile?

When people hear “user testing,” they tend to assume that means only usability testing. That is, of course, one of the things you should do with user testing: Have users go through the main functions of your site, make sure they’re intuitive, and identify questions or hesitations users might have. This is especially important if you have a purchasing process on your site.

But it’s also important to test for emotional engagement. In other words, how do people feel about your site? Can they quickly accomplish what they want to do? Do they feel rewarded by using it? Mobile users are notoriously impatient and easily distracted. Even if your site is easy to use, people may not stick around unless they feel engaged.

You should also plan different tests for smartphones versus tablets. Smartphones are used most often for quick access to info while people are on the go. For example, on shopping sites, people frequently use smartphones to do product and pricing research, even though they may not be as likely to make the final purchase during their visit. You should test to make sure it’s easy for people to find product information, pictures, and price information on your site.

In contrast, tablets are much more likely to be used for long browsing sessions. So you should make sure that the tablet shopping experience is rich, engaging, and easy to use.

I understand the importance of designing specifically for mobile, but I have trouble convincing the people I work with. How do I educate them about the specific needs of mobile?

If you’re struggling with a specific design issue, user tests can be a terrific way to end the argument quickly. You’ve heard the old saying that “a picture is worth a thousand words”? In our experience, a user video is worth a thousand hours of debate. If your team is having an argument about a feature, you can use UserTesting to get quick video of some real users reacting to the proposed solutions. We find that those videos can be far more persuasive than a roomful of opinions.

Should we focus on a mobile app or a mobile website?

No single answer is right for every organization.

We’re finding that many commerce companies choose to do both. The website is aimed at casual visitors, while the app is aimed at their most loyal customers (the people who are most likely to download an app). So, you use the website for prospecting and the app for deepening the relationship.

Conversely, many major brands are using mobile apps as marketing tools to help spread awareness and affinity. 

Related video: Here’s how Burberry increased app engagement 200%

The most important thing is to understand what your mobile strategy is. How does your presence on smartphones and tablets fit with all of the other ways you engage with customers, and what are you hoping to accomplish in mobile? We’re long past the days when you could create something on mobile and expect users to respond just because it’s trendy.

Expert FAQ: Mobile app testing

Here, Michael answers some frequently asked questions about mobile app user testing.

We already have some beta testers who use our app. Do we also need to do user testing?

Any testing is good, but as beta testers get to know your app, it gets more challenging to spot usability problems because they no longer have fresh eyes. Also, friends and family (the usual source of beta testers) are not necessarily a good proxy for typical users because they are too emotionally invested in your product or aren’t representative of your customer base.

UserTesting gives you feedback from typical users who don’t already know your app and have no emotional investment. User tests help you understand the needs and reactions of normal users, who will write your reviews in the app store.

Should we test before or after the app release?

We strongly recommend testing apps before release and during development. It’s much easier to fix a problem in the early stages of development than after the product is finished (not to mention after you’ve received a slew of bad reviews in the app store).

Testing an iOS app before release is difficult because of Apple’s security. How do you handle that?

We have special processes to simplify the testing of iOS apps. We manage UDIDs for you and won’t deplete your allocation. Just fill out the test form online, give us a link to your app, and we’ll take care of the rest. (Note: due to the extra logistics involved, iOS unreleased tests take an average of three business days to complete after we receive your .ipa file.)

When should we do user tests in the development process?

Whenever you have something you can show users, it’s a good idea to get feedback immediately. You can even test prototypes and preliminary wireframes (anything you can display in a browser or in a user test). Generally, the sooner you identify problems, the easier it is to fix them. We’ve seen tragic examples of companies that tested at the end of development and identified problems but released anyway because it was too late to make changes.

What sorts of tests should we do on mobile apps?

People who hear “user testing” assume it means only usability testing. Of course, one of the things you should do with user testing is have users go through the main functions of your app, make sure they’re intuitive, and identify questions or hesitations users might have.

Many apps include custom icons and controls that the developer created specifically for the app. Although artistic creativity is great, one of the most common causes of confusion in user tests is buttons and controls that users can’t easily understand. It’s essential to test these features.

However, there are also three other essential tasks for user testing. The first is to look for emotional engagement. In other words, how do people feel about your app? Can they quickly accomplish what they want to do? Do they feel rewarded by using it? Mobile users are notoriously impatient and easily distracted. If your app doesn’t engage people, they may move on to something else and never return.

The second task is to gain a deeper understanding of your customers' thinking. The better you understand them, the better you can make decisions on their behalf. User tests are like mini-focus groups, but you can organize them on a day’s notice and with far less expense and hassle.

The third use is settling internal arguments. Because mobile devices are highly personal, you may find that people on your team can be extremely passionate about a dispute over a feature or UI element. Rather than having a knock-down argument on the subject, letting the actual users give you a ruling is often easier and faster. User tests make it easy to bring in that voice of the customer.

Should we test on both tablets and smartphones?

If your app is designed to run on both tablets and smartphones, you should test on both of them. A screen layout that looks good on a smartphone can look uninviting on a tablet, and vice-versa. And user behaviors are subtly different on tablets and smartphones.

This post was updated April 15, 2024.

