The Button Color A/B Test: Red Beats Green

Joshua Porter

Button color is one of the longest-standing debates in the world of conversion optimization. Everyone seems to have their favorite color. At different times in the last two years, I’ve heard green, pink, red, orange, and even light blue touted as THE ONE COLOR that works best. Obviously, this can’t be the case.

Fortunately, button color is extremely easy to test. Back in the day, we ran a button color test on the home page of Performable's website, and the results surprised us. Button color had a big effect on the overall conversion of the page.

The colors we chose to test were green and red. First, I created the normal home page with the green button color I had originally designed. Then, I cloned that page (created an exact copy) and changed the button color to red. I did not change anything else on the page. The content, message, and graphics were exactly the same on each page variation. The only difference was the hex value that determined the color of the button. The idea was that if button color affected conversion at all, we would see it in this test's results.
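As an aside: Performable's tooling handled the traffic split for us, but if you're curious how a 50/50 split works mechanically, here is a minimal Python sketch. The hex values, button copy, and hashing scheme below are hypothetical, not what Performable actually used.

```python
import hashlib

# Hypothetical hex values -- the post doesn't record the exact colors used.
VARIANTS = {"green": "#43a047", "red": "#e53935"}

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into one of the two variants.

    Hashing the visitor ID keeps assignment stable, so a returning
    visitor always sees the same button color on every page load.
    """
    bucket = int(hashlib.md5(visitor_id.encode()).hexdigest(), 16) % 2
    return "green" if bucket == 0 else "red"

def render_button(visitor_id: str) -> str:
    """Render identical button markup; only the hex value differs."""
    color = VARIANTS[assign_variant(visitor_id)]
    return f'<a class="cta" style="background:{color}">Get Started</a>'
```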

You can see the two pages we tested below:

[Image: the two home page variants, identical except for the green vs. red button]

Each of the colors we chose -- green and red -- has interesting connotations.

Green

Green connotes ideas like “natural” and “environment,” and given its wide use in traffic lights, suggests the idea of “Go” or forward movement. Green was also in Performable's color scheme (along with black and gray), so a green button fit a bit more nicely into the page design. Green is also heavily used at the moment, being the chosen hue of many web 2.0 websites. (Although I’m not sure how this happened.)

Red

The color red, on the other hand, is often thought to communicate excitement, passion, blood, and warning. It is also used as the color for stopping at traffic lights. Red is also known to be eye-catching. Red, in general, is not used as a button color nearly as often as green.

Hypothesis

So, which color would convert more people to click? Would it be green, which connotes “Go,” or red, which connotes “Stop”? Would those connotations actually affect whether or not people clicked?

My hunch was that even if one color performed better than the other, the difference would be small. I could imagine that one color might be more appealing or grab the user’s attention better than another, but that the overall conversion numbers would be overwhelmed by the overall message of the page. I assumed that the results of this test would show what we’ve seen in testing before -- that the major difference between well-converting and poorly converting pages was the message the page was communicating.

Button Test Results

We ran the test over a few days of traffic. In total, we had over 2,000 visits to the page, and for each visit, Performable recorded whether someone clicked on the button or not. (Performable's tools gathered all analytics and conversion data automatically, so we could watch along as the results rolled in.)

The result? The red button outperformed the green button by 21%.

21% more people clicked on the red button than on the green button. Everything else on the pages was the same, so it was the button color alone that made this difference. This was a much larger difference than I expected.

Consider this: a 21% increase in the conversion of this page is potentially a 21% increase to all downstream metrics. So by getting 21% more people to click at the top of this process, we added 21% at the bottom as well. This is why optimizing pages is so valuable. We did not have to increase traffic to the page to see improved results. Instead, we improved the efficiency of the page, and by improving conversion on existing traffic, we added considerable value.
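To make that arithmetic concrete, here is a tiny sketch with purely hypothetical funnel numbers (the post doesn't report any downstream rates):

```python
# Purely hypothetical funnel numbers, for illustration only.
visits = 2000
click_rate = 0.10        # baseline: 10% of visitors click the button
signup_rate = 0.25       # hypothetical: 25% of clickers sign up downstream

baseline_signups = visits * click_rate * signup_rate          # 50.0
lifted_signups = visits * (click_rate * 1.21) * signup_rate   # 60.5

# The funnel is multiplicative, so a 21% lift at the top carries
# straight through to the bottom: 60.5 / 50.0 = 1.21.
print(baseline_signups, lifted_signups)
```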

Additionally, it is interesting to note that this is the sort of result you can’t easily find in user testing. Because it takes so many trials to find a statistically significant result (often thousands), it would cost a fortune in time and money to run the test with that many people face-to-face. In general, A/B testing is great for quickly and easily testing individual variables on a page like this. That’s why it’s a good idea to use a balanced approach when testing, using the appropriate test type to garner the results you want.
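For the curious, checking significance on counts like these is straightforward. Below is a minimal two-proportion z-test using only Python's standard library. The click counts are hypothetical -- the post reports the 21% lift and the 2,000+ visits, not the raw numbers -- but they show why "thousands of trials" matters:

```python
from math import sqrt, erf

def two_proportion_ztest(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for the difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal approximation.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: ~1,000 visits per variant, 10% baseline, 21% lift.
z, p = two_proportion_ztest(clicks_a=100, n_a=1000, clicks_b=121, n_b=1000)
print(f"z = {z:.2f}, p = {p:.3f}")
# Prints roughly z = 1.50, p = 0.134 -- even a 21% lift is not yet
# significant at the 5% level with samples this small.
```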

Marketing Takeaway

As always, we cannot generalize these results to all situations. The most we can say is that they hold for the conditions in which they occurred: on this page design, on this site, with the audience that viewed it. It could be that Performable’s audience happened to like red (or dislike green), or that red happened to contrast nicely with the green in Performable's color scheme. There are many possible reasons why this particular test turned out the way it did.

Therefore, do not go out and blindly switch your green buttons to red without testing first. You should test colors on your page and with your audience to see what happens. You might find something interesting in your data that we don’t have in ours.

What kinds of A/B tests have you run on your own website? What were the results?
