3 A/B Tests App Developers Can Do Right Now

Try again. Fail again. Fail better.

The first few versions of your app won’t be perfect. Neither will the few after that. It takes dozens of tries to get to the point where you’re monetizing properly, engaging users well, and getting a decent number of downloads. How do you know your app is on the path to success? Let’s take a look.

What is mobile A/B testing?

It’s just what it sounds like -- showing users two versions of a specific app feature (A and B) and measuring which one performs better.

How should you prepare for mobile A/B testing?

Before you begin, it’s best you do some data research on your app. Check your analytics to see where your app needs improvement. Is it revenue? Engagement? Installs? Are users falling off at a certain point in the app flow? Are in-app purchases lower than you expected?

(Check out this post to learn how to use your raw data to understand where users are falling off and why.)

Once you’ve identified the problem, use visitor behavior tools to understand what’s causing it. You can try heat maps, surveys, and user recordings to dig deeper.

Then, just like you would in any science experiment, make a hypothesis. For example:

  • If the game tutorial was more detailed, then users would stay in the game longer.
  • If users receive four free rewards in the beginning rather than two, then they’d buy more IAPs.

Then, of course, comes the mobile A/B testing itself, which we’ll get to in a bit. Once you’ve completed the tests, you’ll typically see a clear winner. The last step is simply to update your app accordingly.

What are the best tools for mobile A/B testing?

There are tons of great tools out there to help developers with mobile A/B testing. SplitMetrics, Optimizely, Apptimize, and Swrve are just a few.

But, if you’re stretched for cash, there are alternative methods as well.

(Note that the following methods are only meant to help developers optimize app store pages, not features within the app itself.)

Google Play Store A/B testing

In 2015, Google opened their Google Play Developer Console to allow developers to A/B test their app store page on Google Play.

You can test up to three versions of your app store page, either globally or locally. Variables include app icon, feature graphic, screenshots, promo video, and your app description in up to 5 languages. Bear in mind that the data Google Play collects is limited.

You can read more about Google Play experiments here. Unfortunately, Apple doesn’t offer an equivalent for App Store pages yet.

Mock landing page testing

It’s not always worth publishing a new app version just to run a test. Luckily, there’s a shortcut -- using mobile web landing pages to measure how different variables perform.

First, create three or four mini landing pages, each with a different icon, price point, and so on. The constant is a CTA that encourages the user to download the app from the App Store or Google Play. Each landing page should be minimal, displaying an icon, a two-line description, the variable (of course), and a CTA.

Next, set up a small ($50) Twitter, Facebook, or AdWords campaign to drive traffic to the pages. Once each version has about 100 visitors, see which one received the most CTA clicks. That’s your winner.
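The comparison at the end of that step can be sketched in a few lines of Python. The visitor and click counts below are hypothetical placeholders for what your campaign would actually collect:

```python
# Compare CTA click-through rates across landing page variants.
# All counts here are made up for illustration.
variants = {
    "icon_a": {"visitors": 104, "clicks": 19},
    "icon_b": {"visitors": 98, "clicks": 31},
    "icon_c": {"visitors": 101, "clicks": 22},
}

def best_variant(results):
    """Return (name, click-through rate) of the variant with the highest CTR."""
    rates = {name: r["clicks"] / r["visitors"] for name, r in results.items()}
    winner = max(rates, key=rates.get)
    return winner, rates[winner]

name, ctr = best_variant(variants)
print(f"Winner: {name} with a {ctr:.1%} click-through rate")
```

Keep in mind that ~100 visitors per variant is a small sample, so if two variants finish within a few clicks of each other, treat the result as a tie and keep testing rather than declaring a winner.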

3 mobile A/B tests you can do right now

 

Test 1. App Icon

Your app’s icon is the first thing users see, making it arguably the most important part of your app store page. It should be appealing and catch users’ attention in just a few seconds.

In 2013 at WWDC, Apple UX Evangelist Mike Stern gave a presentation explicitly outlining 6 best practices for app icon design -- and if anyone knows about good app design, it’s Apple. (Apparently, app icons are among the top 3 reasons why apps get rejected from the App Store!)

Here are his 6 tips:

  • Focus on a unique shape (think Flipboard or Vimeo)
  • Stick to one or two colors
  • Avoid using photography
  • Avoid text
  • Use relevant graphics (think 1Password)
  • Be creative

Ultimately, your app icon needs to stand out from the crowd. For example, if you have a camera app, don’t use a camera graphic as the main icon -- it’s overplayed. Instagram, Hipstamatic, and about 100 others have already thought of it.

You can watch the video here or read the transcript here.

Test 2. App Price Point

The jump from $0.99 to $1.99 can make all the difference. But how do you know at which point you’re turning potential users away -- or, conversely, where you could be driving more revenue?

When running your mobile A/B tests, keep in mind that just because more people clicked to download the $0.99 version doesn’t necessarily mean it’s the right way to go.

For example, if 100 people download your app at $0.99, you receive $99. But if 60 people download it at $1.99, you get $119.40. Clearly, in some cases a higher price with fewer installs is the way to go.
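That arithmetic generalizes: what matters is price times installs, and it’s worth knowing how many installs the pricier version needs just to break even. A quick sketch using the figures from the example above (the install counts are what your test would actually measure):

```python
def revenue(price, installs):
    """Gross revenue for a paid app: price per download times downloads."""
    return price * installs

low = revenue(0.99, 100)   # the $0.99 version: $99.00
high = revenue(1.99, 60)   # the $1.99 version: $119.40

# Break-even: how many installs does the $1.99 version need
# to match what the $0.99 version earned?
break_even = revenue(0.99, 100) / 1.99
print(f"${low:.2f} vs ${high:.2f}; $1.99 breaks even at {break_even:.1f} installs")
```

In other words, the $1.99 version only needs about half the installs to come out ahead -- a useful threshold to have in mind before you look at the raw download counts.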

The price point you end up settling on can also be a good rule of thumb for determining IAP prices.

Remember that iOS users are more likely than Android users to purchase apps and make IAPs. So don’t be afraid to test price points across app stores.

Test 3. App Description

There’s no hard and fast rule for app descriptions, which makes them an especially tough nut to crack. A description can be long or short, salesy or dry, include reviews or leave them out. How do you know which direction to go in?

We suggest creating two different landing pages. Each landing page should look exactly like an app store page, meaning that the CTA should be on top, next to the icon and app title.

The first should be short and sweet, while the second should be a bit more in-depth. Either way, put your strongest sentence first. Remember that you only get about 255 characters to catch a user’s attention before they have to tap to expand the description. Both descriptions in this phase should be fairly similar -- one is just a condensed, tighter version of the other.

If you have a gaming app, try specifying its subcategory and opening with a unique story that sets the scene. Or, for any other kind of app, be sure to highlight a problem and explain how your app solves it.

Once you have a winner, run another round of tests. In one version, include positive reviews. In another, place a short tagline in CAPS at the top. In the last, list your top features.

Continue playing around with possible variables. After a week or two, you should end up with a perfectly crafted app description.

 

Of course, there are dozens more mobile A/B tests you can run to optimize your app -- these are just a few to get you started. But before you get ahead of yourself, remember to run only a couple of tests at a time. Otherwise, the data gets muddled and stops being useful. Happy testing!

 

Questions or comments? Engage with us on Twitter @ironSource.

Interested in contributing to the ironSource blog? Email us at communications-editor@ironsrc.com.
