Running A/B Tests with Google Play Store Listing Experiments

Learn how to set up and run A/B tests on your Google Play Store listing. Test icons, screenshots, descriptions, and feature graphics to optimize conversion rates.

6 steps · 2 min read
1. Navigate to Store Listing Experiments

In Google Play Console, select your app > Grow > Store listing experiments. This section lets you create experiments to test different versions of your store listing elements against each other.

2. Choose what to test

You can test the following elements:

  • App icon — Test up to 3 variant icons
  • Feature graphic — The 1024×500 banner at the top of your listing
  • Screenshots — Test different screenshot sets
  • Short description — The 80-character preview text
  • Full description — The 4,000-character listing text

Test one element at a time for clean, actionable results.

Tip: Start with screenshots or your app icon—these are the highest-impact visual elements and typically show the fastest, clearest results.

3. Create your experiment

Click Create experiment. Give it a name, select the element to test, and upload your variant assets. Google will show the original (control) to 50% of visitors and the variant to the other 50% by default. You can also set a custom traffic split.
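Google does not publish how it assigns store visitors to control or variant, but the behavior of a 50/50 (or custom) split is easy to reason about with a sketch: hash each visitor ID to a stable pseudo-random value, and route IDs below the configured share to the variant. The `assign_bucket` function and visitor IDs below are illustrative assumptions, not Google's actual mechanism.

```python
import hashlib

def assign_bucket(visitor_id: str, variant_share: float = 0.5) -> str:
    """Deterministically assign a visitor to 'control' or 'variant'.

    Hashing the ID maps every visitor to a stable value in [0, 1);
    IDs that fall below `variant_share` see the variant listing.
    (Illustrative sketch only -- not Google's real bucketing code.)
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    fraction = int(digest[:8], 16) / 0x100000000
    return "variant" if fraction < variant_share else "control"

# Over many visitors, the buckets converge on the configured split,
# and any single visitor always sees the same version of the listing.
counts = {"control": 0, "variant": 0}
for i in range(10_000):
    counts[assign_bucket(f"visitor-{i}")] += 1
```

The key property this models is stability: a returning visitor is always shown the same listing version, which keeps the experiment's measurements consistent.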

4. Configure the experiment settings

Set the target audience (all users or specific locales). Choose whether to test on first-time visitors only or all visitors. Configure the experiment to end automatically when statistical significance is reached, or set a manual end date. Google requires a minimum of 7 days of data.

5. Launch and monitor

Click Start experiment. Google will begin splitting traffic between control and variant. Monitor results in the experiment dashboard, which shows install rate, retained installers, and statistical confidence. Wait for at least 90% confidence before drawing conclusions.
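The "90% confidence" figure comes from comparing the two install rates statistically. A standard way to model it (Google's exact method is not public) is a one-sided two-proportion z-test: given installs and visitors for each arm, it returns the probability that the variant's install rate is genuinely higher rather than a fluke. The function name and example numbers below are illustrative.

```python
from statistics import NormalDist

def confidence_variant_beats_control(control_installs, control_visitors,
                                     variant_installs, variant_visitors):
    """One-sided confidence that the variant's install rate is higher,
    via a two-proportion z-test with a pooled standard error."""
    p1 = control_installs / control_visitors
    p2 = variant_installs / variant_visitors
    pooled = ((control_installs + variant_installs)
              / (control_visitors + variant_visitors))
    se = (pooled * (1 - pooled)
          * (1 / control_visitors + 1 / variant_visitors)) ** 0.5
    z = (p2 - p1) / se
    return NormalDist().cdf(z)  # e.g. 0.95 means 95% confident

# Hypothetical data: 2.6% vs 3.0% install rate on 10,000 visitors each.
conf = confidence_variant_beats_control(260, 10_000, 300, 10_000)
```

With these made-up numbers the confidence lands above the 90% bar, so the variant would be a defensible winner; with identical rates it returns exactly 0.5, i.e. a coin flip.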

6. Apply the winner

Once the experiment shows a statistically significant winner, you can Apply variant to make it your new default listing. If the control wins, keep your original. Document your findings—these insights compound over time and inform future experiments. AppDrift's platform helps you manage the resulting asset updates across your listings.

Common Errors & Solutions

Experiment not reaching statistical significance

Solution: Your app may not have enough traffic. Experiments need thousands of store visitors to reach significance. For low-traffic apps, run the experiment longer or test higher-impact elements (e.g., the icon rather than the full description), which tend to produce larger, easier-to-detect conversion differences.
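To gauge whether your traffic is sufficient before launching, you can estimate the visitors needed per arm with a standard two-proportion sample-size formula. The parameters below (one-sided alpha of 0.10 to match the ~90% confidence bar, 80% power) are conventional assumptions, not values documented by Google.

```python
from statistics import NormalDist

def visitors_needed_per_arm(baseline_rate, relative_lift,
                            alpha=0.10, power=0.80):
    """Rough visitors needed in each arm of a 50/50 test to detect a
    `relative_lift` over `baseline_rate` with a two-proportion z-test.
    Assumed defaults: one-sided alpha=0.10 (~90% confidence), 80% power.
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha)
    z_power = NormalDist().inv_cdf(power)
    variance_term = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_power) ** 2 * variance_term) / (p2 - p1) ** 2
    return int(n) + 1

# Hypothetical: detecting a 10% relative lift on a 25% install rate
# needs roughly 2,800 visitors in each arm.
n = visitors_needed_per_arm(0.25, 0.10)
```

The formula makes the trade-off concrete: halving the lift you want to detect roughly quadruples the traffic required, which is why low-traffic apps should test bolder changes.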

Cannot start experiment while another is running

Solution: Google Play allows only one active experiment per listing element at a time. End or apply the current experiment before starting a new one for the same element.


Automate this with AppDrift

Skip the manual work

AppDrift connects to your Google Play Console to automate metadata publishing, screenshot management, and localization across 40+ languages. Set up once, publish everywhere.

Get Started Free · Free to start · No credit card