Conducting A/B tests of your marketing initiatives is a great way to learn how to drive more traffic to your website and generate more leads from the visits you’re getting. But what is A/B testing, anyway?
A/B testing, also known as split testing, is a method of comparing two versions of a marketing variable against each other to identify the one that produces a better response rate. In this context, the existing version is called the “control,” and the modified version hypothesized to perform better is called the “treatment.”
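Under the hood, the split itself is usually just a random but consistent assignment of each visitor to one variant. Here's a minimal sketch of that bucketing logic -- purely illustrative, not how any particular testing tool implements it:

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into 'control' or 'treatment'.

    Hashing the visitor ID (rather than flipping a coin each time) ensures
    a visitor sees the same variant on every visit.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 else "control"

print(assign_variant("visitor-42"))
```

Once visitors are split, each group's response rate is tracked separately so the two variants can be compared.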
You’d probably like to see what these elements look like in practice -- what are some real-life examples of A/B testing in action?
In this blog post, we'll cover A/B tests across three marketing channels: landing pages, email, and calls-to-action. All the examples discussed here were conducted by members of the HubSpot marketing team, and each test can teach you a lesson about A/B testing. More specifically, the three case studies advise you to:
- Optimize landing pages that don’t convert at a high rate.
- Start your optimization process with an offer test.
- Use the knowledge you’ve gained to improve existing processes.
1. A/B Testing: Landing Pages
This test was produced for the 2011 Landing Page Optimization Summit held by MECLABS, the parent company of MarketingSherpa & MarketingExperiments. HubSpot was part of a live landing page optimization test in which the audience was given a control page and asked to build a treatment page by incorporating changes that could positively affect conversion rates.
The entire control page (see above) was taken as a variable and was revamped. We changed the image and its placement, and we shortened the copy and the form. In an effort to drive a lot of traffic to this page, we did a heavy email marketing push, blog posts, and social media promotion. The test was conducted using HubSpot’s landing page tools.
Our A/B test results showed that the control converted at a rate of 47.91% and the treatment converted at a rate of 48.24% -- a difference of just 0.33 percentage points, which is a negligible increase.
Marketing Lesson: Optimize Landing Pages That Don’t Already Convert at a High Rate
The conclusion we arrived at is that the audience was highly motivated by the marketing offer itself, regardless of other page variables such as copy length, form length, and layout. The lesson here: when conducting A/B tests on landing pages, start with pages that have a low visitor-to-lead conversion rate. It’s pretty hard to beat a page that already converts 47% of its traffic!
2. A/B Testing: Calls-to-Action
The screenshot below is of a call-to-action A/B test that compared two marketing offer types. The image illustrates what HubSpot’s homepage looked like back in 2010! Originally, HubSpot’s homepage offered our community a seven-day free trial, but we were curious whether a longer trial period would entice more visitors to sign up. Would it have a significant effect? In this case, our control was the existing page offering the seven-day free trial, and the treatment offered a 30-day free trial.
Results from the test showed that the 30-day free trial enticed more visitors and had a significant effect on conversion rates. The 30-day free trial won at a 99.9% confidence level and created a 110% increase in HubSpot free trials: the control had a 0.326% visitor-to-free-trial conversion rate, while the treatment converted at 0.709%.
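For readers curious where a confidence figure like 99.9% comes from: results like these can be checked with a standard two-proportion z-test. The sketch below is illustrative only -- it is not the tool HubSpot used, and the visitor counts are hypothetical, chosen to roughly match the reported rates:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for comparing conversion rates.

    conv_*: number of conversions; n_*: number of visitors.
    Returns each rate and the z statistic.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical traffic: 50,000 visitors per variant. A |z| above ~3.29
# corresponds to roughly 99.9% confidence in a two-tailed test.
p_a, p_b, z = two_proportion_z(163, 50_000, 354, 50_000)
print(f"control {p_a:.3%}, treatment {p_b:.3%}, z = {z:.2f}")
```

The larger the traffic and the wider the gap between the rates, the higher the z statistic and the more confident you can be that the difference is real rather than noise.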
Marketing Lesson: Start Your Optimization Process With an Offer Test
The takeaway from this A/B test is that the type of offer can exert a tremendous influence over lead generation efforts. Therefore, if you want to optimize your calls-to-action (and, for that matter, emails and landing pages), comparing different offers is a great place to start. Such experiments will give you a better understanding of what prompts your visitors to convert into leads.
3. A/B Testing: Email Marketing
Subject lines are a critical element of email marketing. They have the power to grab recipients' attention and greatly impact click-through rates (CTRs). That's why we always want to make sure we're using the best possible email subject lines when emailing our subscribers, and we regularly conduct A/B tests to identify winning subject lines.
But besides the subject line, recipients also see a sender name in their inbox. Who is the email coming from? The sender name can make a big difference in open and click-through rates. So in 2011, we conducted a test comparing a generic “HubSpot” sender name to the personal name of a member of the marketing team.
Our control generated a 0.73% CTR, and the treatment generated a 0.96% CTR. With a confidence of 99.9%, we had a clear winner. Our conclusion after conducting this A/B test was that emails sent by a real person are more likely to be clicked on than emails sent from a company name. But how did CTR impact the number of leads we generated?
The treatment generated 292 more clicks than the control. Since HubSpot's average visitor-to-lead conversion rate on landing pages is 45%, those extra clicks translated into roughly 131 more leads. This is a great result in and of itself, but consider how valuable the lesson is when applied to HubSpot's broader email marketing program.
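The arithmetic behind that leads estimate is simple enough to sketch:

```python
# Estimating extra leads from the extra clicks the winning variant earned.
extra_clicks = 292            # clicks the treatment gained over the control
visitor_to_lead_rate = 0.45   # HubSpot's average landing page conversion rate

extra_leads = int(extra_clicks * visitor_to_lead_rate)
print(extra_leads)  # 131
```

The same back-of-the-envelope math is a quick way to translate any CTR lift into an expected lead count before deciding whether a winning variant is worth rolling out.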
Marketing Lesson: Use the Knowledge You’ve Gained to Improve Existing Processes
We took this insight and used it to revamp HubSpot's lead nurturing campaigns, adding a personal sender name and signature to each of those email messages. The lesson is simple: when your A/B tests lead to significant results, use the knowledge you’ve gained to improve existing processes.
While we've covered three A/B testing examples in this post, there is no shortage of variables you can test in your marketing campaigns. Check out this exhaustive list of split-testing options, and start testing to improve the impact of your programs.
This blog post is an excerpt from HubSpot's ebook "An Introduction to Using A/B Testing for Marketing Optimization." Download the entire 50-page ebook to learn more about split testing.