A/B Testing Your Email Marketing

We often hear, and we inherently know, that when it comes to email marketing, A/B testing is incredibly important. So much so that most mainstream email marketing platforms have A/B testing functionality built in, making it that much easier to do.

A/B testing, also known as split testing, is a method of comparing two versions of an email element against each other to determine which one performs better. It's an effective way to identify potential areas for improvement in your emails, especially if an element isn't performing as expected, and it allows you to test new designs, layouts, and content. While third-party research can tell you which colors tend to perform better than others, no two audiences are the same. The main point of A/B testing is to find out what works for your particular audience.

Despite its many benefits, A/B testing remains at the bottom of most priority lists, as emails get pushed out the door with little to no review process, resulting in chronically low open and click-through rates. If this rings true for you, the good news is that it's never too late to course-correct. Before we begin our experiment, we need to determine the factors necessary for effective A/B testing. As you embark on A/B testing, or rethink your existing approach to it, here are some of the most important things to stick to.

Define the goal

Why are you considering A/B testing? What do you want to achieve by doing this? More opens? More click-throughs? Make sure this feeds into your overarching digital and organizational goals.

Decide what to test

What element do you want to experiment with? Is there something that isn't getting the results you expected? The element you choose should reflect the goal you set in advance, e.g., if you're looking to improve open rates, try A/B testing the subject line or the friendly from (the sender name). It's best practice to only test one thing at a time, as this allows you to track exactly what caused the outcome of your experiment. Common elements to experiment with are:
  • Subject line
  • Call-to-action (CTA) text
  • Button color
  • Text, image or button size
  • Copy (length, voice, tone, etc.)
  • Images
What does success look like? To gauge success, you need to establish a strong hypothesis. For example: changing the CTA from “Learn More” to “Discover the difference you can make” will increase click-through rates, because the more detailed language resonates more deeply with the target audience.

Determine your sample size to ensure the most accurate results

Do you want to test your whole list or just a portion of it? Of course, the larger the sample size, the more accurate your results will be. However, how much of your list do you really need to use for the testing portion? To determine this, you'll need to consider statistical significance. A great way to manage this is by using a sample-size calculator. The main elements that make up statistical significance include:
  • Confidence level: the probability that your sample accurately reflects the attitudes of your population. 
  • Population size: the number of total subscribers on your email list.
  • Margin of error: the amount of possible variance in an email’s test results. The smaller the margin of error, the more exact your results will be at the given confidence level. For most purposes, you can leave the margin of error at 5%. 
PRO TIP: a 99% confidence level will require a larger sample size than a 95% confidence level. Likewise, as the margin of error decreases, a larger sample size will be required.
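
If you're curious what a sample-size calculator is doing under the hood, here is a minimal Python sketch of the standard normal-approximation formula with a finite-population correction. The function name, defaults, and example numbers are illustrative, not taken from any particular calculator:

```python
import math

def required_sample_size(population, confidence=0.95, margin_of_error=0.05, p=0.5):
    """Estimate the per-variation sample size for an A/B test.

    Uses the standard formula n0 = z^2 * p * (1 - p) / e^2,
    then applies a finite-population correction for smaller lists.
    """
    # z-scores for common confidence levels
    z_scores = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}
    z = z_scores[confidence]

    # Unadjusted sample size for a very large (effectively infinite) population
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)

    # Finite-population correction: smaller lists need proportionally fewer sends
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Example: a hypothetical 10,000-subscriber list
print(required_sample_size(10_000))                    # ~370 recipients per variation
print(required_sample_size(10_000, confidence=0.99))   # larger sample required
```

At a 95% confidence level and 5% margin of error, a 10,000-subscriber list needs roughly 370 recipients per variation; raising the confidence level or tightening the margin of error pushes that number up, exactly as the pro tip above describes.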

Decide on a test length and run your test

Ideally, you would give your emails a full 24 hours to ensure the most accurate results. However, if you're unable to wait that long, we recommend waiting no less than 3 hours, as anything shorter will make it especially difficult to determine significance.
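
Most email platforms draw the test sample and split it between the variations for you, but for illustration, here is a rough sketch of what that random, reproducible split might look like (the subscriber list, function name, and numbers are hypothetical):

```python
import random

def split_test_groups(subscribers, sample_size, seed=42):
    """Randomly draw a test sample and split it into two equal variation groups."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    sample = rng.sample(subscribers, sample_size)
    half = sample_size // 2
    return sample[:half], sample[half:half * 2]  # (variation A, variation B)

# Example: draw 2 x 370 recipients from a hypothetical 10,000-address list
subscribers = [f"user{i}@example.com" for i in range(10_000)]
group_a, group_b = split_test_groups(subscribers, 740)
print(len(group_a), len(group_b))  # 370 370
```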

Determine result significance

Once the test has ended, it's time to determine whether there is a winner, and which variation it is. A statistical-significance calculator will help you determine if there is a statistically significant winner based on the metric you defined in your hypothesis. For example, if you are testing a subject line, you are most likely testing the open rate. Insert the results from your two variations, along with the number of recipients in each, into the calculator and see if there's a winner!
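
If you'd like to check the math behind such a calculator yourself, a two-proportion z-test is one common way to compare two rates. This is a minimal sketch, not any specific calculator's method; the function name and example numbers are made up:

```python
import math

def ab_test_significance(opens_a, sends_a, opens_b, sends_b, confidence=0.95):
    """Two-proportion z-test: is variation B's rate significantly different from A's?"""
    rate_a = opens_a / sends_a
    rate_b = opens_b / sends_b

    # Pooled rate under the null hypothesis (no real difference between variations)
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (rate_b - rate_a) / se

    # Critical z-values for a two-tailed test at common confidence levels
    critical = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]
    return rate_a, rate_b, z, abs(z) > critical

# Example: variation A got 80 opens of 400 sends, variation B got 104 opens of 400 sends
rate_a, rate_b, z, significant = ab_test_significance(80, 400, 104, 400)
print(f"A: {rate_a:.1%}, B: {rate_b:.1%}, z = {z:.2f}, significant: {significant}")
```

In this made-up example, variation B's 26% open rate beats variation A's 20% with z ≈ 2.02, just clearing the 1.96 threshold for 95% confidence, so the difference would count as statistically significant.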

Take action

Once the experiment is complete and you've ensured the results are statistically significant, push out the winning email to the rest of your mailing list. Depending on the email platform you're using, the winning email may be sent automatically after the test period is complete, saving you from having to send it manually. Then think about how these results should impact your design and copy decisions going forward. E.g., if you've determined a significant increase in clicks by changing the CTA from “Learn More” to “Discover the difference you can make,” you may be ready to opt for more personal action statements in future emails. To be more confident, test this element again in a subsequent email to gather more supporting data (or, inversely, to show that it was just a one-off and requires further testing). If your results aren't statistically significant and it's not possible to say the change was definitely more effective, you'll want to take a different approach to get the desired result and re-test.

Embed A/B testing in your process

A/B testing is incredibly important when it comes to email marketing, as it allows you to effectively identify potential areas for improvement. A/B testing is an ongoing process: while you may have solved for one area, there will always be another to explore. For this reason, you'll want to embed A/B testing in your email workflow for the long term, so you can continuously test what works best for your audience.
