How to Determine if Your A/B Test Results Are Significant

Brian Whalley


Sometimes, when you look at a test you've run on your website or in an email, it can be hard to tell whether the difference in performance actually means anything. But have no fear! By the end of this post, you'll have a much better grasp of how to read your marketing A/B test results and tell whether they're significant.

We won't dive into a full statistics lesson today; there are entire college courses devoted to teaching marketers those details. What marketers should know right now is how to judge whether a difference is meaningful, so you can quickly tell whether your test results were just "different," or, even better, significantly different.

Step 1: Create Your Campaign

When you're creating your campaign, think about which variable you want to test. For example, do you want to try different text for the call-to-action in your email, a different color, or an entirely different layout? In the test we're using as an example in this article, we're comparing a plain orange button for a HubSpot free trial offer against custom art for the same button, to see whether the custom art really makes a difference. Just about anything can affect how many people take the suggested action, so pick one item to vary in your message before launching your campaign.

Don't pick more than one variable at a time. Otherwise, it will be impossible to tell which variable actually caused the change in behavior. Stick to changing the color, the layout, or the text -- but not all of them at once. Also make sure your software provides a clear way to measure what's successful, such as integrated A/B testing analytics that can report back on the data.
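If your software handles the split for you, you don't need to think about this part. But for the curious, here's a minimal sketch of one common way a 50/50 split is made: hashing a visitor ID so each person consistently sees the same version. The function name and version labels are illustrative assumptions, not features of any particular product.

```python
import hashlib

# Illustrative sketch only -- assumes you're splitting traffic yourself rather
# than relying on your marketing software's built-in A/B testing feature.
def assign_version(visitor_id: str) -> str:
    """Deterministic 50/50 split: the same visitor always sees the same version."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    # Even hash value -> version A, odd hash value -> version B
    return "A (orange button)" if int(digest, 16) % 2 == 0 else "B (custom art)"

# Returns either version, but always the same one for this visitor
print(assign_version("visitor-12345"))
```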


Step 2: Check Your Data

Once you've wrapped up the campaign, or once enough time has passed to check your results (maybe 30 days after you launched the landing page, or a week after you sent the email), look at your data. Record how many people viewed each version as well as how many reacted. Using our example: each call-to-action was viewed 200 times, with the custom art button generating 20 clicks and the orange button generating 6 clicks. Save that information.
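If you want to sanity-check the numbers yourself before reaching for a calculator, the raw counts are all you need to compute each version's conversion rate. The snippet below is a minimal sketch using the example data above; the dictionary layout and labels are just for illustration, since your analytics tool will normally record this for you.

```python
# Hypothetical record of the example data above
results = {
    "custom art button": {"views": 200, "clicks": 20},
    "orange button":     {"views": 200, "clicks": 6},
}

for version, data in results.items():
    rate = data["clicks"] / data["views"]
    print(f"{version}: {data['clicks']}/{data['views']} views = {rate:.1%} conversion rate")
```

For our example this works out to 10.0% for the custom art button and 3.0% for the orange button -- clearly different, but Step 3 is what tells you whether that difference is significant.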


Step 3: Determine Statistical Significance of Results

There are many online calculators available to help you determine whether the difference in your test results was significant. As a rule of thumb: if the difference between the two versions is very small, it may simply be that the variable you tested doesn't influence the people who see it.

You can also use PRCOnline's Statistical Significance tool to determine whether your test results are statistically significant. Plug in your numbers to find out whether your test yielded significant results. The "sample size" should be the number of people who saw each version, and the "response" should be whatever you measured -- in our example, the number of clicks on the free trial offer. If the tool says "Significant," you know you're on the money. If not, don't despair; it just means your test didn't show a significant change in your conversion rate. Not every test will be a winner, and it's important to distinguish which changes matter and which don't.
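If you'd like to see what a calculator like this is doing under the hood, the sketch below runs a two-proportion z-test on the example numbers from this article, which is one common way such tools check significance. The function name and the 0.05 cutoff are our own illustrative choices, not something the tool itself dictates.

```python
import math

def two_proportion_z_test(clicks_a, views_a, clicks_b, views_b):
    """Two-tailed z-test for the difference between two conversion rates."""
    rate_a = clicks_a / views_a
    rate_b = clicks_b / views_b
    # Pooled rate under the null hypothesis that both versions convert equally
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (rate_a - rate_b) / se
    # Convert |z| to a two-tailed p-value using the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Numbers from the example in this article: 20/200 clicks vs. 6/200 clicks
z, p = two_proportion_z_test(20, 200, 6, 200)
print(f"z = {z:.2f}, p = {p:.4f}")
print("Significant" if p < 0.05 else "Not significant")
```

For our example data this comes out to roughly z = 2.84 and p = 0.005, which is well below the common 0.05 threshold -- the same "Significant" verdict the online tool reports.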


Step 4: Act Accordingly

From the example experiment discussed in this article, we were able to conclude that the customized art for our free trial button made a big difference in whether people chose to click on the call-to-action. The call-to-action was significantly less popular when we used an orange button. Therefore, we've implemented that customized button throughout our website.

One of the benefits of A/B testing is that it's so easy to do. You can regularly conduct A/B tests across a number of different marketing channels and meaningfully improve results on a case-by-case basis. You can also take the lessons you've learned from one test and apply them to future efforts. For example, if you've conducted A/B tests in your email marketing and repeatedly found that using numbers in email subject lines generates better click-through rates, you might want to use that tactic in more of your emails.

Do you conduct A/B testing in your inbound marketing? How could you use these kinds of analytics to test your website or marketing automation messages?

Photo Credit: David Goehring
