Just as it's important to know the best practices of landing page creation and A/B testing to see great results, it's crucial to know what to do to make sure your efforts aren't wasted. Consider these 7 common mistakes marketers make when performing A/B tests, and make sure none of them crop up in your next (or current) round of A/B testing.
The 7 Worst A/B Testing Mistakes Marketers Make
1.) Running an A/B test when you should run a multivariate test.
What's the difference between the two? In a few words, an A/B test evaluates the performance of two different versions of a web page; a multivariate test evaluates changes to multiple elements on a web page, and offers far more possible outcomes because of the combinations of elements that can result. Make sure you're running the right test for your needs. If you're not sure which test is right for you, read this explanation breaking down the difference between A/B tests and multivariate tests, and how to know which test to run.
2.) Not establishing the criteria for success.
Now that you know you're running the right test, do you know what your goal is? A successful A/B test will have a specific end it is trying to achieve. Hypothesize what the changes you're making will result in, and know which metrics will indicate success. Some admirable goals might be to lower bounce rate for new visitors by a certain percentage, or to increase click-through rate by a specific target. Whatever criteria for success you choose, remember that you can't achieve success without knowing what it looks like!
3.) Starting your test with crazy web pages.
The design, layout, and copy you choose for your first pages shouldn't just be a shot in the dark. Base your decisions on best practices so you're not wasting your time with designs that, based on hundreds of thousands of tests from other websites, probably won't work. In other words, they're called best practices for a reason. That being said, one reason A/B testing is so useful is that it tells you when you should flout best practices to achieve the best results -- but only when testing tells you it's prudent to do so. So in the meantime, follow best practices from people who've learned the hard way, and tweak according to the results of your A/B tests.
4.) Not performing a radical redesign.
We just told you not to start with crazy web pages, and now we're telling you to perform a radical redesign. What gives? The pages you're testing should follow best practices, sure, but they shouldn't still look exactly like one another. Move the form from the right side of the page to the left; dramatically change the size of your header; test the response to totally different language; experiment with different images. And do it all at the same time. If you don't perform radically different tests, you could hit your "local maximum" and start iterating on designs that aren't as effective as they could be to begin with.
5.) Performing tests on pages with too little traffic.
A/B testing is great for new websites because you don't need a ton of traffic to get meaningful results. But you still need enough traffic for statistical significance. Make sure you run tests on pages that are highly trafficked -- or, if you're running a test on new or buried pages, run the tests for longer than you would on your more popular pages to ensure you have enough data points for a meaningful evaluation. Before jumping to any conclusions, make sure you have enough data to make a relevant determination of success.
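To get a feel for why low-traffic pages are a problem, here's a minimal sketch of a standard two-proportion z-test in Python (the function name and the traffic numbers are hypothetical, chosen only for illustration -- most A/B testing tools run a check like this for you):

```python
from math import sqrt, erf

def ab_significance(conv_a, visits_a, conv_b, visits_b):
    """Two-proportion z-test: returns the two-sided p-value for the
    difference in conversion rate between variant A and variant B."""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# A 10% vs. 13% conversion rate with only 200 visits per variant:
print(ab_significance(20, 200, 26, 200))      # p ≈ 0.35 -- not significant
# The same rates with 20x the traffic:
print(ab_significance(400, 4000, 520, 4000))  # p well below 0.05
```

The same observed lift is noise at 200 visits per variant but a clear winner at 4,000 -- which is exactly why low-traffic pages need longer test runs before you call a result.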
6.) Not considering how your changes affect other metrics.
Have your design changes increased conversions but decreased time on site? Is that okay? It might be. If you've established the criteria for success (see mistake #2) and you analyze how all your important metrics are affected when you make design changes, you can be sure what you interpret as improvements aren't actually having an unintended, negative consequence.
7.) Giving up when you see no results.
Just because you didn't see results with one test doesn't mean you've already found the best possible page. Consider more iterations you can test -- different colors, layouts, copy, images, and proportions -- to see if there's still a better page design out there that you just didn't consider in the first round of A/B testing.
Have you made any mistakes while A/B testing that adversely affected your results? Share them with the rest of us so we can learn from them, too!