How to A/B Test Email Campaigns
Published: Friday 10 October 2014 | Last updated: Tuesday 04 September 2018
In this brave new age of social media, email marketing sometimes gets forgotten by smaller businesses. This is a shame, as email remains one of the most effective marketing activities, and that isn't changing any time soon. A study carried out last year found that customer acquisition via email has quadrupled in the past four years.
So email marketing is alive and thriving – but what if you’re finding it’s not working for you? There are several reasons that your campaigns may be failing to deliver conversions. It could be that your list simply isn’t yet big enough for you to see a return, it could be the time and day you send out your campaign, or it could be the way that the mail is crafted is off-putting.
A great way to test things out in order to troubleshoot your campaigns is with A/B testing, which is also known as split testing.
A Quick Overview of A/B Testing
Whilst it can seem slightly daunting to the uninitiated, A/B testing is simple. You take two groups of people and send each a differently crafted email. The changes can be minor or significant, but the idea is to see which group takes more action.
The differences can be in:
- CTAs (calls to action)
- Time of day
- How you address the recipient (first name or last, for example)
- Subject line
It's better not to make too many changes at once, so that the emails going out to group A and group B are not completely different in every way. If they are, it will be incredibly difficult to pin down exactly what made the difference and prompted the user to click through.
It's also advisable to run the test more than once; in fact, you should keep it running for at least a month to ensure that your results are accurate and not just a one-off.
Determine Your Goals
Before you get going with your test, it’s necessary to define what you want to achieve by running the test. Do you want more clicks from a CTA? Or perhaps you’re just looking for more opens? Whatever the case, decide on your goal first and then it’s much easier to see what changes you need to make to the actual email campaign itself.
You should also choose what element you want to test before going ahead as this will dictate the changes you make. If you want more opens, then it would make sense to change the subject line, for example, whilst if you would like more click throughs to your site, then perhaps you need to look at buttons within the mail and the text/language used.
How Many Test Subjects?
Ideally, you want to be testing your entire list, but depending on the email service you're using, this could prove expensive if it's a very large list and the service charges for testing per email address. If that's the case, test as many addresses as you can afford, and select the people to include at random so that your results remain accurate.
A/B testing works better the larger the 'sample' (the number of people in each test group). If your list is very small, your results are unlikely to be accurate, so in that case you should work on growing your list first.
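To give a feel for how big "big enough" is, the standard two-proportion formula can estimate how many recipients each group needs before a given lift becomes detectable. This Python sketch is illustrative only; the base rate and lift below are made-up numbers, and the z-values are the usual ones for 95% confidence and 80% power:

```python
from math import ceil

def sample_size_per_group(base_rate, lift, z_alpha=1.96, z_beta=0.84):
    """Rough per-group sample size for a two-proportion test
    (defaults: 95% confidence, 80% power)."""
    p1 = base_rate          # e.g. current open rate
    p2 = base_rate + lift   # the improvement you hope to detect
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Hypothetical example: detecting a rise in open rate from 20% to 25%
print(sample_size_per_group(0.20, 0.05))  # → 1090 recipients per group
```

Note how quickly the requirement grows as the lift you want to detect shrinks; this is why small lists rarely produce trustworthy results.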
What Else You Will Need
Assuming you've decided on your sample size and exactly what it is you're going to test, before you begin it's a good idea to go over previous data so that you can measure results when they come in. Most email campaign software includes analytics for opens, click throughs and conversions, as well as tools for carrying out A/B testing.
If there's no testing option in your software, you will have to do it manually, which simply involves splitting your list into two and sending one group the usual mail and the other the new one. It's not essential to keep the email template you were using (you can change it), but in the first instance it's useful to learn where you've been going wrong.
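If you're splitting the list by hand, the one thing that matters is that the split is random, not alphabetical or by sign-up date. A minimal Python sketch (the addresses below are placeholders):

```python
import random

def split_list(subscribers, seed=42):
    """Shuffle a copy of the list and split it into two equal-sized groups."""
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # seeded so the split is reproducible
    half = len(pool) // 2
    return pool[:half], pool[half:]

group_a, group_b = split_list(["ann@example.com", "bob@example.com",
                               "cat@example.com", "dan@example.com"])
# group_a gets the usual mail, group_b the new version
```

Shuffling before splitting avoids accidentally comparing, say, your oldest subscribers against your newest.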
Once you've run the campaign (remember: around a month, or at least four sends), you can take a look to see whether your new mail has returned better results than the old. This is very simple with A/B testing software, but less so if you've done it manually.
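If you have compared the two groups manually, a two-proportion z-test is one simple way to check whether the difference in conversion rates is real or just noise. This is a hedged sketch, not a substitute for proper analytics, and the counts below are invented:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)       # combined rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical: 110 conversions from 1,000 sends vs 150 from 1,000
z = two_proportion_z(110, 1000, 150, 1000)
print(abs(z) > 1.96)  # True: significant at roughly the 95% level
```

If the statistic stays below 1.96 in absolute value, the honest conclusion is "no clear winner yet"; keep the test running rather than declaring a result.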
However, Google Analytics now has a feature that allows you to monitor traffic coming from email campaigns, so you can measure clicks and conversions without testing software, though not opens. You could of course request a read receipt to measure how many opens the campaign is getting, but these tend to irritate people and are not very accurate, as many people have the preview pane enabled in their email client.
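The usual way to make email clicks visible in Google Analytics is to tag the links in each variant with UTM parameters, so version A and version B show up separately in your reports. A small sketch; the URL, campaign name and variant labels are placeholders:

```python
from urllib.parse import urlencode

def tag_link(url, campaign, variant):
    """Append Google Analytics UTM parameters to a link so clicks
    from each email variant can be told apart in reports."""
    params = urlencode({
        "utm_source": "newsletter",
        "utm_medium": "email",
        "utm_campaign": campaign,
        "utm_content": variant,  # distinguishes version A from version B
    })
    return f"{url}?{params}"

print(tag_link("https://example.com/offer", "october-sale", "a"))
```

Every link in the group-A mail would use `variant="a"` and every link in the group-B mail `variant="b"`; Analytics then reports clicks and conversions per variant under Campaigns.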
It's not strictly necessary to measure conversions (although it's useful), as once the recipient lands on your site you'll want to be tracking their behaviour there. Your site message should be consistent with the mail campaign, though, so ensure that any special offers listed in the mail also appear on the landing page. Once visitors are on the site, it's better to look at how they're interacting with the information and set-up there rather than the email itself. It's common practice to A/B test landing pages too, so if you find that the bounce rate is high once they've clicked through, it's worth testing your pages as well.
Quick Tips for A/B Testing Email
Many companies find A/B testing to be hugely rewarding, and it can increase conversion rates significantly with even the smallest of changes. In the HubSpot landing page test below, for example, it was just the addition of an image that increased conversions by 24%.
- Run the test on both groups at the same time; don't send at different times of day or the results will be skewed (unless, of course, time is what you're testing).
- Use the tools integrated into your email marketing software.
- Test often to really hone your campaigns and get the best results.
- Don't test too much at once; stick to one or two variables so that you can drill down into what works.
- Consider using a double CTA in the form of a P.S. at the end of the mail in a test.
- Don’t give the user too much choice – limit to one or two CTAs and some content links.
It's important that you don't try to second-guess results. Whilst you may think you know that visitors will prefer one design over another, one thing that's very common in A/B testing is the element of surprise. For example, in another HubSpot test carried out on the Performable website, it could easily have been assumed that a green button would attract more clicks than a red one.
We associate green with positives (it means go, after all), whilst red is associated with stop or delete.
The results? The red button design won out, beating green with a 21% increase in conversions. A good lesson to learn from this is to never assume that you know how your audience will react during a test, as they will often take you completely by surprise.
When it comes to your email campaigns, run a few tests to see which is the most effective and then perhaps consider running a test on your website landing page too. Use the email to help drive traffic to your landing page and then test how best to convert those clicks into customers.
[Image: Hello Bar is simple to customise and can increase newsletter sign-ups]
If you don't have a large enough list to warrant a test, consider why your site isn't performing well. Is your newsletter sign-up box too well hidden? If so, consider using a lightbox popup; these generally wait around 30 seconds before appearing and invite users to input their email address. You see a lot of these around lately, but bear in mind that some people are instantly turned off by popups. Another good tool I've come across lately is Hello Bar, which places a sign-up bar at the top of your site and connects to your campaign software.
A/B testing is a powerful marketing tool and can make a real difference to your conversion rates, so don't discount it. There are plenty of examples to be found online that prove the power of testing; in some cases, marketers see a jump of as much as 400% in conversions just by altering the smallest element on a web page or within an email.