Has this ever happened to you?
You’ve done all the work. You’ve created a new webpage or email campaign, and you’re ready to release it into the wild – then, right at the last minute, you start second-guessing yourself.
Maybe that image you chose isn’t quite right. Should your subject line be punchier?
When you’re marketing a business you’re making tons of tiny decisions. How can you be sure the choices you’re making are the right ones?
That’s where A/B testing comes in.
What is A/B testing?
A/B testing, or split testing, is an experiment that involves creating two versions of your marketing material (option A and option B) and releasing them both to see which one performs better for a given conversion goal. A/B testing can be used to test a variety of marketing elements such as copy, design/visual elements, navigation, submission forms and calls to action (CTAs).
So, whether you’re an email marketer trying to boost open rates with the best subject line or an e-commerce business looking for the CTA that drives the most revenue, running an experiment to test your hypothesis is a great way to find the version that converts best.
A/B testing not only gives you a greater understanding of which strategies work best, but the analytics and data acquired from the experiment will ultimately provide valuable insights into your business that will benefit future marketing campaigns. The more efficiently and effectively you roll out campaigns, the faster you drive leads, conversions, and your bottom line.
How does A/B testing work?
An A/B test is performed by taking a marketing element (e.g., a webpage or email) and creating a second version of it with a minor modification. Think simple changes, such as a different CTA button or slightly varied copy.
Differences should be minimal between the two versions so you can isolate what exactly is affecting engagement from your audience. Once the two versions are created, half your traffic will see the original version (also known as the control) and the other half will see the other version (known as the variation). User behavior is then monitored, and engagement rate is measured via statistical testing software.
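Under the hood, the split can be as simple as a coin flip per visitor. Here’s a minimal Python sketch of how a tool might bucket visitors (the visitor IDs are made up for illustration — real testing tools handle this for you):

```python
import random

def assign_version(visitor_id, seed=42):
    """Deterministically assign a visitor to the control (A) or the
    variation (B), so repeat visits always see the same version."""
    rng = random.Random(f"{seed}-{visitor_id}")
    return "A" if rng.random() < 0.5 else "B"

# Each visitor is consistently bucketed into one of the two versions.
visitors = ["user-1", "user-2", "user-3", "user-4"]
buckets = {v: assign_version(v) for v in visitors}
```

Seeding on the visitor ID is what keeps the experience consistent: the same person never flips between versions mid-test.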
Sound complicated? It doesn’t have to be. In practice, tools like Constant Contact’s A/B testing make it simple for anyone to run an A/B test with step-by-step guidance.
How to run an A/B test
While the concept of A/B testing is quite simple, it is imperative to run an organized, statistically sound experiment in order to collect accurate, useful data. Follow these A/B testing best practices to ensure you get the most out of your tests.
1. Identify an opportunity
Review your marketing analytics to find the best opportunities for testing. A great place to start is email campaigns or areas of your website that have a low conversion rate, or a conversion rate that has started to dip significantly.
2. Set concrete goals
While the overall goal is to increase conversion rate, what specific actions in user behavior are defining that conversion? Is it signing up for an email list? Is it adding an item to their cart? Is it clicking to redeem a coupon? Keep your goal top-of-mind so you can track these metrics when reviewing your test.
3. Create different variations
Generate variations that change an element likely to impact user behavior (most testing tools can help with this). Be careful not to create two versions that are wildly different: if there are too many contrasting elements, it will be nearly impossible to isolate which one is driving the impact.
4. Create a hypothesis
Why do you think the modified version will perform better than the original version and what will its relative impact be?
5. Run the A/B test and analyze results
Your versions will be randomly presented to visitors, and user behavior is tracked and analyzed across the total number of visitors. Once the test is complete, assess whether there is a statistically significant difference in the results. If so, roll out the winning version to the rest of your audience.
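That “statistically significant” check is usually a two-proportion test under the hood. Here’s a minimal sketch using only the Python standard library (the conversion counts below are hypothetical numbers, not from any real campaign — testing tools run this math for you):

```python
from math import sqrt, erf

def two_proportion_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns the z-score and two-sided p-value
    for the difference in conversion rates between versions A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal distribution's tail.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: A converted 50 of 1,000 visitors, B converted 80.
z, p = two_proportion_test(50, 1000, 80, 1000)
significant = p < 0.05   # the conventional 5% threshold
```

If the p-value lands below 0.05, the difference is unlikely to be random noise, and you can call a winner with reasonable confidence.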
Where to start experimenting with A/B testing
Even if you understand what A/B testing is, you might not know how to use it for your business.
Unsurprisingly, A/B testing is easiest when you leverage technology to do the hard work for you. Let’s review three places you can run A/B tests and how you can get started easily.
1. Subject lines for higher open rates
Subject lines have a big impact on whether or not someone opens your emails, which is why I love to test a couple of versions to see which one catches more attention and generates higher open rates.
Luckily, Constant Contact’s split testing tool makes it super simple to test subject lines. Once you’ve created your email campaign, click the A/B Test button and add two options as your variables.
You can customize what percentage of your audience receives each version, and the variation that generates the most opens will be declared the winner. This will then be sent to the remainder of your subscribers.
I’ve used A/B testing to find the right subject line length, punctuation, and tone. Test out a couple of options in your next email and check back in your reports to see which subject line really won your subscribers over.
2. Sign-up forms that draw more email subscribers
The right sign-up form can dramatically increase the number of email signups your business receives.
A/B test different versions of your sign-up form to see which one results in the most opt-ins from website visitors. You can experiment with your type of sign-up form, your copy, images, and form fields.
For example, if you want to try asking new subscribers for their birthday on your sign-up form, A/B test with the birthday field included and left off to see if there’s any change to your sign-up rate.
3. Landing pages to spur more conversions
Maybe your business uses landing pages to collect webinar registrations or encourage your target audience to download an ebook. Performing an A/B test on your landing page will help you optimize your page so you’re driving more action from your visitors and increasing conversion rates.
Start by creating one version you feel is set up for success. Then, make a copy of your page and tweak one detail. Consider cutting down the amount of text, using a different image, or updating the wording used in your call to action.
You can then direct 50 percent of your website traffic to one version and 50 percent to the other. Monitor both pages to see which one results in the better conversion rate.
What to do after you run an A/B test
After you run a test, document the results so you can learn from your findings and apply them in the future. Keep in mind your results aren’t always definitive. If your subject line test shows that a funny subject line resulted in a higher open rate, that doesn’t mean you should go with a joke every time.
Great results come from trying new things and keeping your content feeling fresh. You may want to revisit some A/B tests regularly to see if results change over time. It’s possible that your audience likes one approach at first but then loses interest as it begins to feel stale. A/B testing will help you realize when it’s time to spice things up and try something new.
Frequently asked questions about A/B testing
What’s the difference between A/B testing and multivariate testing?
Multivariate testing means adding more variables to your test. In a multivariate test, you can test multiple elements like the number of fields on a sign-up form, as well as the background color to see which version results in the highest conversion rate.
Multivariate tests involve a more advanced strategy and work best for businesses with large amounts of traffic.
How large does my sample size have to be for my A/B test to be considered statistically significant?
For subject line testing, we recommend testing each line on a sample size of at least 1,000 contacts if possible. 1,000 users for each version is typically a good benchmark for any A/B test, but if you don’t have enough contacts or visitors to hit those numbers, it’s still worth running the test. Having some data to work with is always better than leaving things to guesswork.
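If you want a rough sense of whether your list is big enough, a common back-of-the-envelope rule (for a 5% significance level and 80% power) estimates the sample size needed per version. This is a general statistical rule of thumb, not Constant Contact’s method, and the baseline rate and lift below are hypothetical:

```python
def sample_size_per_version(baseline_rate, min_lift):
    """Rough sample size needed in EACH version to reliably detect a
    given absolute lift over the baseline conversion rate.
    Uses the common n ~= 16 * p * (1 - p) / delta**2 approximation
    (two-sided alpha = 0.05, power = 0.80)."""
    p = baseline_rate + min_lift / 2   # average rate across both versions
    return int(round(16 * p * (1 - p) / min_lift ** 2))

# Hypothetical: 5% baseline rate, want to detect a 2-point lift.
n = sample_size_per_version(0.05, 0.02)  # a couple thousand per version
```

The takeaway: the smaller the lift you want to detect, the more contacts you need — which is why small differences often need bigger lists to prove out.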
How many elements should I test at once in a single A/B test?
Keep every element exactly the same except for the element you’re testing. This is the best way to make your results clear and reliable. If you change multiple elements at once, then you can’t be sure which variable affected your results.
How long should I run my split test for?
Ideally, you run your test until you hit statistical significance. When testing a subject line through Constant Contact, you can decide to run the test for 6, 12, 24, or 48 hours. This may seem like a short period of time, but since checking email is part of most of our daily routines, I find that 24 hours is usually more than enough.
For something like a website test, you probably want to have the test run at least a week to accumulate enough data from your users.
What do I do if the results are the same across both versions?
Don’t worry, this happens to me plenty! Sometimes I test a subject line and both versions perform almost exactly the same. That’s not necessarily a bad thing; it just means you have to get even more creative to make a splash. It’s not always easy, but I recommend keeping at it and trying something creative the next time around.
If you’re looking for subject line inspiration, these subject line ideas should do the trick.
Put your marketing to the A/B test!
With A/B testing, you never have to wonder how a small change could impact your open rates, click-through rates, revenue, and other important metrics. You can see for yourself and have confidence that your marketing campaigns and materials are optimized for the best results!
Start thinking about what small tweaks you’d like to test out in your email subject lines, sign-up forms, and landing pages. A/B tests are well worth the effort to learn more about your target audience and what works best for them.