In a world where people are bombarded with countless emails every day, it’s more important than ever to craft emails with purpose. According to Statista, 269 billion emails were sent in 2017, and that figure is expected to rise to a staggering 333 billion by 2022. These days it’s not enough to assume you know what type of email your audience will want to open, let alone read all the way through. You have to be certain.

Creating great emails requires hard work, research, and strategy. The best emails are crafted not only with goals in mind, but also with the target audience at the forefront. From subject line strategies to sound design principles, many components make up a successful email. But how can you be sure that one version of an email will be more successful than another? You’re not the first person to ask that question.

What if there was a way to know that one version of an email would generate more engagement, lead to more landing page views, or drive more sign-ups? Well . . . there is. Email A/B testing, or split testing, is a brilliant way to determine what resonates with your audience and what sparks their interest. With email A/B testing, your team can gather data-backed proof of the effectiveness of your email marketing.

(AWeber just released a new email A/B testing feature that allows you to test more than just your subject lines — like send times, copy, templates, buttons, and more! Try out AWeber for FREE for 30 days — and split test away!)
Getting Started with Email A/B Testing

Conducting an email A/B test is simple. Create two or three identical versions of the same email, but change one variable like the subject line, the lead image, or the CTA button. You can test variables as distinct or as nuanced as you see fit. For example: You might test the color of a CTA button versus testing the subject line.

Related: 6 Email A/B Tests You Can Set Up in 1 Minute

If you think that creating multiple versions of the same email with a tweak or two sounds tedious or time-consuming (and wonder how much insight you can gain from changing the text on a CTA button), consider this. AWeber customer and photo sharing community Light Stalking split their email subject lines to gauge the success of one versus the other. As a result, they were able to increase their web traffic from the winning subject line email by 83%.

How’d they do it? The founder of the community, Rob Wood, wanted to run an email A/B test on the subject line of the Light Stalking weekly challenge email, which asked subscribers to send in a photo of a silhouette. The test was simple: Wood created two identical versions of the same email, changing only the subject lines. The first email used a straightforward subject line, “The Weekly Challenge is Live!” The second was just one word that hinted at the nature of the challenge: “Silhouettes.”

The email with the shorter subject line (“Silhouettes”) was the winner, which Wood sent to the remaining 90% of his list. From there, the email yielded an above-average click-through rate, which drove more people to the Light Stalking website and increased overall engagement levels. Impressive, right? And simple. This is a perfect example of how email A/B testing helps you make data-backed decisions.

With that, let’s talk a bit more about the basics of email A/B testing and how it can help you optimize your next email campaign.

Related: Should You Capitalize Your Subject Lines? This Marketing Expert Found Out
Setting Goals for Email A/B Testing

Anyone can split test an email, but like anything in digital marketing, having a clear goal and purpose for testing is essential. Sure, you can run a quick email A/B test and obtain useful results, but having a more precise testing strategy will yield more powerful data. Email A/B testing is a great tool to use at any time, but it can be especially useful if you want to gain insight on a new campaign or email format. Before you begin your test, it’s essential to establish what you are testing and why. A few questions that can help guide your team at this stage include:
- Why are we testing this variable?
- What are we hoping to learn from this?
- What impact does this variable have on the performance of this email?
Copy Elements

Copy elements such as subject lines, headlines, body copy, and calls to action immediately come to mind when thinking about what variables to test. After all, copy elements are some of the first things people see when your email pops into their inbox (as well as after they open it), so it’s important to optimize them. For example, a personalized subject line that reads, “Ben, did you see this?” versus “Did you see this?” could be the difference between a subscriber opening and deleting the email.

But just how important are a few words? We wanted to get to the bottom of this, so we added an extra word to a call-to-action button in one of our promotional emails. Doing so increased our trial subscriptions by 12.8%. Talk about the power of words.
Design Elements

Design elements like colors, fonts, images, templates, and spacing are just as crucial to an email as the copy and links. Did you know that 53% of emails are opened on mobile devices? With this in mind, think about how your email visually appeals to subscribers and what they need to get the best reading experience.

Imagine two emails with the same copy and messaging, presented in very different ways: one puts a bit of written copy up top, while the other relies on a central hero image as a visual cue. This simple tweak in formatting could yield wildly different results. Email A/B test different templates, layouts, and formats to see which yields the best results for your email campaigns.

Related: How to Create Amazing Photos for Your Emails on Zero Budget
Additional Elements

Aside from the visual and copy elements within an email, you can A/B test a few other variables as well. Testing when you send an email could be just as important as what your email says. When measuring the success of an email as it relates to the time it’s sent, consider:
- Day of the week
- Time of day
- Relation to the time of year (e.g., holidays, industry events, seasons, etc.)
How big should your test sample size be?

It’s important to note that when conducting your email A/B test, you’re testing on only a small percentage of your subscriber list. You want your test segment to be large enough to reliably predict how the rest of your subscribers will react, while still leaving the bulk of your list to receive the winning version. The goal is to get accurate, significant results, so bigger samples (a minimum of 75 to 100 subscribers) typically work best. However, keep in mind that you should be using a sample that represents the whole list, not just a specific segment.

Related: Your Start-to-Finish Plan to Get 1,000 Subscribers

So what does a sample look like? There are many ways to approach this. You can estimate a sample size with a calculation that factors in your email list size, confidence level, and confidence interval (margin of error). Or, if you’re an AWeber customer, you can manually select the percentage of your list that will receive each version of the split test. Either way, make sure you select a viable percentage of your list to send your test emails to so you have enough data to analyze. Often this is in the 10% to 20% range.
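If you want to go beyond rules of thumb, the standard two-proportion sample-size formula gives a rough estimate. Here is a minimal Python sketch (the function name, hard-coded z-scores, and example rates are our own illustration, not an AWeber feature) that estimates how many subscribers each variant needs to reliably detect a given lift:

```python
import math

def sample_size_per_variant(baseline_rate, min_detectable_lift,
                            confidence=0.95, power=0.80):
    """Estimate subscribers needed per variant to detect a lift in a rate
    (e.g., open rate) using the standard two-proportion formula."""
    # z-scores for common confidence and power levels (two-sided test)
    z_alpha = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]
    z_beta = {0.80: 0.842, 0.90: 1.282}[power]
    p1 = baseline_rate
    p2 = baseline_rate + min_detectable_lift
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return math.ceil(n)

# Example: 20% baseline open rate, hoping to detect a 5-point lift
print(sample_size_per_variant(0.20, 0.05))
```

Notice how quickly the required sample grows as the lift you want to detect shrinks; this is why very small lists struggle to produce conclusive tests, and why the 75-to-100-subscriber minimum above is a floor, not a target.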
Best Practices for Email A/B Testing

Email A/B testing seems pretty straightforward, right? It is, but like any experiment, if you don’t solidify the details and ensure your test is valid, your results may turn out to be useless. Keep these things in mind when creating your split test:
- Use a large enough sample to get as close to statistical significance as possible
- Make sure your sample group is randomized
- Test early (like before a campaign launch, so you have time to interpret the results) and test often
- Identify each variable you want to study and test one at a time
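To make the “randomize your sample” best practice concrete, here is a minimal Python sketch (the function name and 10% default are illustrative, not an AWeber API) that shuffles a subscriber list and carves out equal test groups plus a holdout that will later receive the winning version:

```python
import random

def split_for_test(subscribers, test_fraction=0.10, variants=2, seed=None):
    """Randomly assign a fraction of the list to each test variant;
    the remainder is held back for the winning version."""
    rng = random.Random(seed)
    pool = list(subscribers)
    rng.shuffle(pool)  # randomizing avoids bias from signup order, etc.
    per_variant = int(len(pool) * test_fraction)
    groups = {f"variant_{chr(65 + i)}": pool[i * per_variant:(i + 1) * per_variant]
              for i in range(variants)}
    groups["holdout"] = pool[variants * per_variant:]
    return groups

# A 10% / 10% / 80% split of a 1,000-subscriber list
groups = split_for_test(range(1000), test_fraction=0.10, variants=2, seed=42)
print({name: len(members) for name, members in groups.items()})
```

Shuffling before slicing is what makes each group a representative cross-section of the whole list rather than, say, your oldest subscribers versus your newest.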
Email A/B Testing Setup

You have the basics of email A/B testing down, so let’s next discuss how to set one up properly.
Determine your goals

First things first: Identify the intentions behind the campaign you want to test. Your goals will act as your compass when figuring out the details of your email A/B test. Every component of your campaign should trace back to your end goals.
Establish test benchmarks

Once you have defined your goals, take a look at your current email data and examine how your previous email campaigns have fared. From there, use your findings as benchmark numbers. These numbers will be significant when it comes time to analyze your email A/B test data so you can gauge early success. These numbers should also help you decide on the variables you want to test moving forward.
Build the test

You have your goals and your benchmark data; now it’s time to build your test. Remember to test only one variable at a time. (Refer back to our best practices above if needed.)

Bonus: Did you know AWeber customers can automatically split test their email campaigns (and can test up to three emails at a time)? It’s true. Here’s how it works:

1. Log into your AWeber account.
2. Hover over Messages, then click Broadcasts.
3. Click Create.
4. Name your split test. Be as detailed as possible when naming tests so you can select the right one when it comes time to run them.
5. If you’d like, you can send your split test to a segment, a.k.a. a group of subscribers. Click the drop-down menu and select the segment.
6. Using the slider, divide your split segments into their two or three groups. (You can change the percentages to make sure you’re testing with only a small percentage of your list. So if you were sending to two groups, you could have 10% of your list get one variation and another 10% get the second variation. Then you can send the winning message to the remaining 80% of your list.) Once you are satisfied with your settings, click Save.
7. To select the message you want to test, click Select a Draft in the right-hand menu.
8. From there, select the message you want to use and click Select.
9. You will then see the selected message added to one of your split test segments. Click Schedule to schedule your split test message.
10. Schedule your message just as you would any other Broadcast message within your AWeber account. Once your Broadcast settings are set, click Send Message Now.

There you have it! Repeat these steps each time you want to send a split test message.
Email A/B Testing Inspiration and Examples

It can be tricky to identify which variable to test to improve a key metric. Here are a few examples that can help you figure out which variables to test.
To improve your open rate…

This one is easy! To improve your open rate, you need to test different subject lines. We recommend trying a few different types of subject lines like questions, capitalization, long vs. short, subject lines with emotional value, emojis, etc. You can also test different preheaders — the preview snippet of text that is next to your subject line or below it (on mobile) in your inbox.

In addition to testing subject lines, try sending the test emails at different times of day and see if that has an impact on the open rate. Your subscribers may be more inclined to open an email in the morning on their way to work or at night after dinner instead of during the middle of a workday.

The better your subject line, the more likely your subscribers will open the email and read through. Having a solid subject line is like getting your foot in the door.

Related: How Do I Avoid the Spam Filter?
To improve your click-through rate…

Keep subscribers interested in the email by providing eye-catching, engaging content throughout. If it’s your click-through rate you want to improve, make sure you create clickable content. Consider how interactive content, information gaps (missing pieces of info that spark a reader’s curiosity), or contests could boost your in-email engagement. There are also many variables you can test to optimize for click-through rate — a strong CTA, intriguing anchor text, personalization, spacing, or bold imagery. Just remember to test one at a time to ensure you know precisely why subscribers are clicking more (or less).
To improve your reply rate…

Many marketers tend to overthink this one, but it’s actually pretty simple. If you want your subscribers to reply to your emails, ask them to! It’s that easy.

Try testing a “From [your name] at [your business here]” approach, which can make an email feel like a personal note instead of an email blast. (For instance, “From Andy at AWeber” would be the sender name that appears.) Think about it: If subscribers think they are replying to an actual person, they are more inclined to do so.

You also might try testing long-form vs. short-form emails with a call-to-action that encourages subscribers to reply to the email with their thoughts, opinions, or questions. Leverage that P.S. line, too. That last line can be an opportunity to encourage conversations and replies from subscribers.
Tracking and Measuring Email A/B Testing Success

We’ve covered a lot of ground so far around email A/B testing. With so many elements to test, you might be thinking, “How can I verify that a campaign is successful or that a test yielded helpful data?” The answer: Think back to your goals. Your goals will tell you what metrics you should pay the most attention to and what you should work on improving. For example, if generating more leads from email campaigns is your goal, you’ll want to focus on metrics like open rate, click-through rate, and form fills. It’s also important to look at your metrics as a whole to see the big picture of how an email performed. Being able to track that data and refer back to it will also help you optimize future campaigns.

Another question that might be top-of-mind for you: How long should you let an email A/B test run before ending it and analyzing the results? According to Zapier, after about four to five days the effectiveness of an email dies out. They claim that if your email isn’t seeing any significant activity after five days, it likely won’t see any more. However, digital marketer Neil Patel recommends running your A/B test for at least two weeks with 100 subscribers to determine whether your results are statistically significant rather than due to chance. If you run your test for too short a period, you run the risk of not allowing enough subscribers to open the email. With that being said, why not test how long you run your test? If you see engagement with your emails die out after 48 hours, then you can cut the tests off around that point.

Once your test has ended and as you begin analyzing your data, keep detailed notes of your findings. Ask yourself:
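When you want to check whether a difference really is statistically significant rather than due to chance, a two-proportion z-test is the standard tool. Here is a hand-rolled Python illustration (the function name and the example numbers are hypothetical, not from any email platform):

```python
import math

def open_rate_significance(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: is the difference in open rates between
    variants A and B statistically significant?"""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example: variant B's subject line got 250/1,000 opens vs. A's 200/1,000
z, p = open_rate_significance(200, 1000, 250, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below 0.05 is the conventional threshold for calling a winner at a 95% confidence level; above that, the difference could plausibly be noise, and the test should run longer or on a larger sample.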
- What metrics improved?
- What elements of the email flat-out didn’t work?
- Were there any patterns that correlated with past tests?
Get Started with Email A/B Testing Today

Email A/B testing is essential to the success and optimization of any email campaign. It allows you to gain real insight that can help you make decisions about existing and future emails. Email marketing is always changing, and as subscribers’ attention spans seem to get shorter, it’s vital to know what will yield the most success.

Get started today with AWeber. Our email A/B testing tool allows you to do more than just split test subject lines — you can test almost anything (calls-to-action, colors, templates, preheaders, images, copy, and more!). Give AWeber a FREE spin for 30 days.

Want to learn even more about email A/B testing? Download our free guide here.