A/B Testing: The Good, the Bad, and the Ugly

Marjorie Munroe


If you're a marketing professional, generating quality leads is probably top-of-mind. And while the prospect of conversion optimization may excite you, it can be difficult to know where to start. Understanding how to run different experiments, like A/B tests, can help power your conversion strategy as well as help you gather valuable insight into the factors that influence your online audience's choices.


What is A/B Testing?

A/B testing, also known as split testing, is the act of running a simultaneous experiment between two versions of a marketing asset (varying a single element such as copy, an image, or layout) to identify which one produces a better conversion rate. This is not to be confused with multivariate testing, which enables you to test many variables simultaneously.

In the context of marketing, the element that is kept the same in the experiment is known as the "control" and the element that is hypothesized to give a better result is known as the "variable" or "treatment." 
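In practice, the control/treatment split means each visitor is randomly but consistently assigned to one version. Here is a minimal Python sketch of that idea; the `assign_variant` helper, test name, and user IDs are hypothetical illustrations, not any particular platform's implementation:

```python
import hashlib

def assign_variant(user_id: str, test_name: str) -> str:
    """Deterministically bucket a user into 'control' or 'treatment'.

    Hashing (rather than picking randomly on each visit) guarantees
    a returning visitor always sees the same version of the page.
    """
    digest = hashlib.md5(f"{test_name}:{user_id}".encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 else "control"

# A given user always lands in the same bucket:
print(assign_variant("user-42", "cta-color"))

# Across many users, the split comes out roughly 50/50:
buckets = [assign_variant(f"user-{i}", "cta-color") for i in range(10_000)]
print(buckets.count("treatment") / len(buckets))  # roughly 0.5
```

Deterministic hashing matters here: if a visitor saw the control on one visit and the treatment on the next, you couldn't attribute their conversion to either version.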

The Good

Running A/B tests in your marketing initiatives can be a great way to learn how to effectively drive traffic to your website and generate more leads from the visits you’re getting. Even small tweaks to a landing page, email, or call-to-action button can affect the number of leads your company attracts and converts. 

The potential benefits of A/B testing include:

  • Improved content 
  • Higher page engagement
  • Higher conversion rates 
  • Reduced bounce rates on pages

You are also not limited to testing top-of-the-funnel offers. According to Justin Champion, an Inbound Professor at HubSpot Academy, content is the fuel that brings visitors through the buyer's journey. As a result, your company is likely already generating content that corresponds with every stage of the inbound methodology.


Any element of your content can be tested, as long as you establish a clearly defined control and variable. This opens up countless optimization opportunities as you use insights gathered from user behavior to inform your design and content creation decisions moving forward.

The Bad

As you build assets like landing pages, calls-to-action, and email campaigns, you may wonder about the elements you can test and optimize to increase conversion rates. Will changing the color of your call-to-action button result in more clicks? Does adding an image help the submission rate of your landing page? While so much choice can be a good thing, it can also be frustrating if you don't know where to start. 

What you can do

Not all variables are created equal, and some may prove more worthy of your time than others. You may find that certain elements in your marketing assets don't have a large impact on the click-through rate or conversion rates at all. So what should you do? In short, start by focusing on elements that frame your conversion opportunity. 

On landing pages, this can mean testing areas such as:

  • Headlines
  • Content
  • Images/Video
  • Social Proof
  • Calls-to-action
  • Layout
  • Form design

Interested in trying it out? Check out this article to see how you would run those tests in HubSpot.

On calls-to-action, start by testing out areas such as:

  • Wording
  • Color 
  • Images
  • Shape
  • Placement
  • Size

And let's not forget A/B testing in email! Think about testing the following areas:

  • Subject line
  • Layout
  • Personalization (Example: “Mr. Smith” vs. “Joe”)
  • Body text
  • Headline
  • Closing text
  • Images

The Ugly

A/B testing enables you to ground your findings and use data to determine what is and isn't working. On the flip side, however, the element you're testing is often only one piece of a larger conversion process. Other factors, completely outside of your control, may influence whether or not a person converts. As a result, it can be difficult to determine with 100% accuracy whether the change you made influenced visitor behavior.

What you can do

Before you start running any tests, determine the sample size and time frame necessary to achieve statistical significance. Depending on what element you're experimenting on, how you'll do so may vary.

For example, emails have a finite audience and your company probably only sends them at certain times of day. This means you need to have a recipient list or sample size large enough to accurately draw conclusions from once the email is sent. For this reason, HubSpot recommends sending emails to 1000 contacts when running an A/B test. 
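To make "large enough" concrete, the minimum sample size per variant can be estimated with the standard two-proportion power calculation. Below is a sketch using only the Python standard library; the baseline rate and expected lift are illustrative assumptions, not HubSpot benchmarks:

```python
import math
from statistics import NormalDist

def sample_size_per_variant(baseline: float, expected: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Minimum recipients per variant to detect a lift from `baseline`
    to `expected` conversion rate with a two-sided z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # about 1.96
    z_beta = NormalDist().inv_cdf(power)           # about 0.84
    p_bar = (baseline + expected) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(baseline * (1 - baseline)
                                      + expected * (1 - expected))) ** 2
    return math.ceil(numerator / (expected - baseline) ** 2)

# Detecting a lift from a 10% to a 12% conversion rate takes
# roughly 3,800 contacts in each variant:
print(sample_size_per_variant(0.10, 0.12))
```

Note how quickly the requirement drops as the expected lift grows: detecting a jump from 10% to 15% needs only a few hundred contacts per variant, which is why small lists can still support tests of bold changes.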

Landing pages, on the other hand, can collect visits and submissions over time. According to Mike Griffin, an Inbound Consultant at HubSpot, this means you want to let your landing page A/B test run for about four weeks on average. Depending on the traffic your landing page is receiving, this allows for enough time to gather statistically significant submission data.

Using calculators such as Kissmetrics' Significance Test can also be a great way to determine whether your A/B test results are statistically significant enough to draw conclusions from.
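If you'd rather check significance yourself, the calculation behind such tools is typically a two-proportion z-test. Here is a minimal Python sketch; the visitor and conversion counts are made up for illustration:

```python
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(control_n: int, control_conv: int,
                    variant_n: int, variant_conv: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p1 = control_conv / control_n
    p2 = variant_conv / variant_n
    # Pool the rates under the null hypothesis of "no difference":
    pooled = (control_conv + variant_conv) / (control_n + variant_n)
    se = sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / variant_n))
    z = (p2 - p1) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Control: 100 of 1,000 visitors converted; variant: 130 of 1,000.
p = ab_test_p_value(1000, 100, 1000, 130)
print(f"p-value: {p:.4f}")  # below the common 0.05 threshold
```

A p-value under 0.05 is the conventional bar for calling a result significant, but remember the caveat above: even a significant result only tells you about the element you isolated, not the rest of the conversion process.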

A/B testing is often only one piece of the wider puzzle of optimizing your content and improving your conversion rates. When implemented correctly over time, however, it can help inform the user-driven design of your website.

Ready to get started? Check out this checklist (you'll want to bookmark it) that walks you through setting up your first A/B test today!
