A/B Testing: 15 Steps to the Perfect Split Test

Planning an A/B test? Bookmark this checklist for the best results before, during, and after your test.

Marketers often rely on intuition to predict what will make people click and convert when creating landing pages, email copy, and call-to-action buttons. But marketing decisions based on a “feeling” can hurt results; A/B testing replaces that guesswork with evidence.
Read on to learn how to make the best decisions from your A/B testing results.

How does A/B testing work?

A/B testing, or split testing, is a marketing experiment that splits your audience to test variations of a campaign and determine which performs better. In other words, you show half your audience version A of a marketing asset and the other half version B.
A/B testing is helpful because every audience behaves differently: a solution that works for one company may not work for another.

In fact, conversion rate optimization (CRO) experts often dislike the term “best practices,” because someone else’s best practice may not be best for you. Done correctly, though, this kind of testing is simple.
To avoid misjudging your audience, let’s review how A/B testing works.

How Do I Run an A/B Test?

An A/B test requires two versions of the same piece of content with a single variable changed between them.

A/B testing lets marketers compare those two versions head to head. Here are two kinds of A/B tests you might run to boost your website’s conversion rate.

Test User Experience

Suppose you want to test whether moving a CTA button from the sidebar to the top of the homepage will increase its click-through rate.
To A/B test this theory, you’d create a second web page with the new CTA placement.
The existing sidebar design is the “control” (version A); the new design is the “challenger” (version B). You’d then show each version to a predetermined percentage of site visitors.
Visitors should see both versions in equal proportions, as in the sketch below.
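A common way to split traffic evenly and consistently is to hash a stable visitor ID into a bucket, so a returning visitor always sees the same version. Here’s a minimal Python sketch; the `assign_variant` helper and the visitor IDs are illustrative, not any specific tool’s API:

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically assign a visitor to version A or B."""
    # Hash a stable ID (cookie or user ID) so the same visitor
    # always sees the same version on repeat visits.
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 100      # bucket in the range 0-99
    return "A" if bucket < 50 else "B"  # 50/50 split

# Example: bucket a few (hypothetical) visitors
for vid in ["visitor-101", "visitor-102", "visitor-103"]:
    print(vid, "->", assign_variant(vid))
```

Because the assignment depends only on the ID, you don’t need to store each visitor’s variant anywhere; the hash recomputes it identically on every visit.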

Design Test

You might also test whether changing the color of your CTA button boosts click-throughs.
To A/B test this theory, you’d design an identical CTA button in a different color that leads to the same landing page as the control.
If the A/B test shows that a green CTA button gets more clicks than the red one, you can switch to green across your marketing content.

Marketing A/B Testing

A/B testing can benefit a marketing team in many ways, because you can test almost anything and measure how it affects your bottom line.
Campaign elements worth testing include:

  • Subject lines.
  • CTAs.
  • Headers.
  • Titles.
  • Fonts and colors.
  • Product photos.
  • Blog images.
  • Body copy.
  • Navigation.
  • Opt-in forms.

That list is far from complete; the options abound. The most valuable tests for a business are low-cost and high-reward.
Take a $50,000-a-year content creator who writes five company blog posts a week, totaling 260 annually.
If the company’s blog generates ten leads per post, each batch of ten leads costs about $192 ($50,000 salary / 260 articles ≈ $192 per article). That’s a substantial cost per article.

Now suppose you ask this content creator to spend two days developing an A/B test on one article instead of writing new posts. You risk about $192, because you publish fewer articles.
But if that A/B test lifts conversions from 10 to 20 leads per post, you’ve spent $192 to potentially double the customers your blog brings in.
If the test fails, you’ve lost $192, but everything you learned will inform your next A/B test. If that second test succeeds, you’ve ultimately spent $384 to potentially double your company’s revenue.
No matter how many times an A/B test fails, its eventual success will usually outweigh the cost.
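Here’s the same calculation as a short Python sketch; the salary, posting cadence, and lead counts are just the hypothetical figures from the example above:

```python
salary = 50_000           # content creator's annual salary ($), assumed
posts_per_year = 5 * 52   # five posts a week = 260 articles a year

cost_per_article = salary / posts_per_year  # ~$192 per article
leads_before = 10   # leads per post before the test (assumed)
leads_after = 20    # leads per post if the winning variation ships (assumed)

print(f"Cost per article:     ${cost_per_article:,.2f}")
print(f"Cost per lead before: ${cost_per_article / leads_before:,.2f}")
print(f"Cost per lead after:  ${cost_per_article / leads_after:,.2f}")
```

The cost per lead falls from about $19 to about $10 if the test wins, which is the whole economic case for spending content time on testing.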

A/B Test Objectives

A/B testing can reveal a great deal about how your audience interacts with your marketing campaigns.
It also helps you tie that behavior to concrete marketing goals. Here are the business goals marketers typically pursue with A/B testing.

Traffic Growth

A/B testing can help you find the title wording that attracts the most visitors to your website.
Testing different blog or web page titles changes how many people click the hyperlinked title to visit your site, which in turn increases traffic.
And web traffic growth is good: more traffic generally means more sales.

Increased Conversion

A/B testing can increase conversion rates as well as traffic.
Testing a CTA’s placement, color, and anchor text can increase the number of people who click through to a landing page.
That, in turn, can boost the number of visitors who fill out forms, submit their contact information, and “convert” into leads.

Fewer Bounces

A/B testing can also help you figure out why you’re losing website traffic. Perhaps your content doesn’t resonate with your audience, or your color scheme clashes.
Testing blog post introductions, fonts, and featured images can help you retain visitors who would otherwise “bounce” quickly.

Product Images

You’re confident in your offering, but how do you know your product image actually conveys it?
A/B testing can determine which product image most appeals to your target audience, so you can choose the image that sells best.

Cart Abandonment

Roughly 70% of online shoppers abandon their carts, and this “shopping cart abandonment” hurts online stores.
Testing product photos, checkout page designs, and where shipping costs are displayed can lower that abandonment rate.
Now, let’s review an A/B test checklist.

How Do You Design an A/B Test?

Designing an A/B test can seem complicated at first, but trust us: it’s easy.
To design a successful test, identify which parts of your blog, website, or ad campaign can be compared and contrasted.
Follow these A/B testing best practices before testing your entire marketing campaign.

Test the right elements

List the factors that affect how your target audience interacts with your ads or website, and consider which parts of your site or campaign influence sales.
From those, choose elements that can be modified for testing.
For example, you could test which fonts or images attract your audience in a Facebook ad campaign, or test two versions of a page to see which keeps visitors on your site longer.

Pro tip: To choose what to test, list the elements that affect sales or lead conversion, then prioritize them.

Size the sample correctly

A/B test sample size can affect results, sometimes badly: small samples skew results.
Make sure your sample size is large enough to produce accurate results. The sketch below shows one standard way to estimate it.
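If you’d rather estimate the sample size yourself than rely on an online tool, a standard approach is a power calculation for two proportions. Here’s a sketch using statsmodels; the 10% baseline rate and the two-point lift are assumptions chosen for illustration:

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline = 0.10  # current conversion rate (assumed)
target = 0.12    # smallest lift worth detecting (assumed)

# Cohen's h effect size for the two proportions
effect = proportion_effectsize(target, baseline)

# Solve for the number of observations per variant
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect,
    alpha=0.05,  # 5% chance of a false positive
    power=0.80,  # 80% chance of detecting a real lift
)
print(f"Visitors needed per variant: {n_per_variant:.0f}")
```

For these numbers the answer comes out to several thousand visitors per variant; the smaller the lift you want to detect, the larger the sample you need.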

Verify your data

A good split test produces statistically significant, trustworthy results, meaning the outcome is unlikely to be random. But how do you know whether your results are statistically significant and reliable?
As with sample size, tools can help.
Convertize’s AB Test Significance Calculator lets users enter traffic data, each variation’s conversion rate, and a confidence level.
The higher the statistical significance, the less likely your data is random.
Use an A/B test significance calculator, or run the check yourself as sketched below, to make sure your data is statistically significant and reliable.
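The same check can be done in code with a two-proportion z-test. Here’s a sketch using statsmodels; the visitor and conversion counts are made up:

```python
from statsmodels.stats.proportion import proportions_ztest

conversions = [210, 255]   # conversions for versions A and B (made up)
visitors = [2_000, 2_000]  # visitors shown each version (made up)

# Two-sided z-test for a difference between the two conversion rates
z_stat, p_value = proportions_ztest(conversions, visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The difference is statistically significant at the 95% level.")
else:
    print("The difference could plausibly be random noise.")
```

The p-value is the probability of seeing a gap this large if the two versions actually performed the same; below 0.05, you can be reasonably confident the difference is real.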

Schedule tests carefully

When comparing variables, keep everything else constant, including when each test runs.
Holiday sales, for example, are essential in e-commerce.
If you run the control during peak sales season and the challenger in an “off week,” the control’s traffic and sales may look higher for reasons that have nothing to do with your change.
Choose a comparable timeframe for both versions, and run both campaigns for the same length of time, to keep the split test accurate.

Pro tip: Choose a timeframe in which both parts of the split test see similar traffic. One way to enforce this at analysis time is shown below.
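For instance, if each visit is logged with a timestamp, you can restrict the comparison to one identical window for both variants. A pandas sketch; the column names and dates are assumptions about your logging schema:

```python
import pandas as pd

# Assumed logging schema: one row per visit
visits = pd.DataFrame({
    "variant":   ["A", "B", "A", "B"],
    "timestamp": pd.to_datetime(["2016-03-01", "2016-03-02",
                                 "2016-03-08", "2016-03-09"]),
    "converted": [1, 0, 0, 1],
})

# Keep only visits from the identical one-week window for both variants
window = visits[visits["timestamp"].between("2016-03-01", "2016-03-07")]
print(window.groupby("variant")["converted"].mean())
```

Filtering both variants to the same window means a holiday spike or an off week distorts both sides equally instead of favoring one.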

Test one element at a time

Every variable on a website or in an ad campaign affects your audience’s behavior, so an A/B test should focus on a single element.
Testing multiple elements at once yields unreliable results, because you can’t tell which factor influenced consumer behavior.
A good A/B test changes one element of your ad campaign or website and keeps everything else the same.

Analyze the data

Marketers may think they know how their target audience interacts with their campaigns and websites, but A/B testing reveals how customers actually behave.
Examine the data after each test; your campaigns may not be as successful as you thought.

Pro tip: Reliable data may change your mind. Use it to plan new campaigns or adjust existing ones.

What Should I Do Before the A/B Test?

Let’s walk through the steps to take before your A/B test.

  1. Pick one variable to test.
  2. Set a goal.
  3. Create a “control” and a “challenger.”
  4. Split your sample groups randomly.
  5. Determine your sample size (if applicable).
  6. Decide how significant your results need to be.
  7. Run only one test per campaign.

During the A/B Test

  1. Set up your A/B test.
  2. Test both variations simultaneously.
  3. Give the test enough time to gather meaningful data.
  4. Get user feedback.
  5. Focus on your goal metric.
  6. Use an A/B testing calculator to determine the significance of the results.
  7. Act on the results.
  8. Plan your next A/B test.

Reading A/B Testing Results

As marketers, we’re used to automation, and using software to calculate A/B test results is genuinely helpful. But you still need to know how to interpret those calculations. Let’s discuss how.

Check your goal metric

Reading A/B test results starts with your goal metric, which is usually conversion rate.
After you enter your results into an A/B testing calculator, you’ll get two results, one for each version, along with whether the difference between them is statistically significant.

Compare conversion rates

Your results will likely show which variation performed better, but success is measured by statistical significance, not by the raw difference.
Say variation A converted at 16.04% and variation B at 16.02%, at a 95% confidence level. Variation A has the higher conversion rate, but the gap is far too small to be statistically significant.
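Running those numbers makes the point concrete: even with plenty of traffic, a 0.02-point gap is indistinguishable from noise. A quick check with statsmodels; the 10,000-visitor sample size is an assumption:

```python
from statsmodels.stats.proportion import proportions_ztest

# 16.04% vs. 16.02%, assuming 10,000 visitors saw each variation
conversions = [1604, 1602]
visitors = [10_000, 10_000]

z_stat, p_value = proportions_ztest(conversions, visitors)
print(f"p = {p_value:.3f}")  # well above 0.05, so not significant
```

With a p-value this close to 1, declaring variation A the “winner” would just be reading noise.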

Segment audiences for insights

To understand how each key group responded to your variations, break down your results by audience segment, regardless of significance. Common segmentation factors include the following (a quick way to compute the breakdown follows the list):

  • Visitor type: which version performed best for new versus returning visitors?
  • Device: how did mobile performance compare with desktop?
  • Traffic source: which version performed best for each channel?
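If each visitor’s variant, segment, and outcome are logged in a table, the breakdown is a one-line group-by. A pandas sketch; the column names are assumptions about your schema:

```python
import pandas as pd

# Assumed schema: one row per visitor with variant, segment, and outcome
results = pd.DataFrame({
    "variant":   ["A", "A", "B", "B", "A", "B"],
    "device":    ["mobile", "desktop", "mobile",
                  "desktop", "mobile", "mobile"],
    "converted": [1, 0, 1, 1, 0, 0],
})

# Conversion rate for each variant within each device segment
print(results.groupby(["device", "variant"])["converted"].mean())
```

A variation that loses overall can still win decisively in one segment, which is often the most actionable finding in the whole test.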
With those insights in hand, you’re ready to run A/B experiments for your own business.

Start A/B Testing Today!

A/B testing reveals which content and marketing your audience prefers. Use the checklist above to plan, run, and interpret your next test.
This post was updated in May 2016 for completeness.
