“Statistics can be made to prove anything -- even the truth.” – Author Unknown
If you want to know what someone really thinks of something, sometimes the best way is simply to ask them. In marketing circles, this usually translates to surveying our customers or prospects to determine what’s going on in the real world.
We use marketing surveys for a number of reasons, from identifying overall customer satisfaction scores, to deciding how to position new products, to conducting new research to support thought leadership content, as is the case with HubSpot's new 2013 Inbound Marketing Survey, which we launched last Friday.
But crafting an effective marketing survey is a little bit more complicated than just deciding what you want to know. Your survey’s success is only as good as the data you collect, and there is both an art and a science behind how to ask the right questions. And as we were putting together our own survey, we realized other marketers might be struggling with some of these same issues. So whether you're currently struggling with survey design or you just want to learn some best practices to keep in mind for your next marketing survey, here are some critical steps I've learned from crafting effective surveys over the past three years.
1) Decide How You'll Use the Data
Nailing down a clear idea of all the ways you'll ultimately use the data you gather from your survey is a vital first step to developing your initial survey draft. This is because the structure of your questions should largely depend on what kinds of information you want to report back.
Consider Question Formatting
For example, if you're going to create charts or infographics, make sure you write questions that pull in specific numbers or ranges. After all, you'll want your charts to include numerical values like “1 – 5” and “6 – 10” rather than a vague “more than last year.” Conversely, if you'll ultimately want to include customer quotes in an upcoming webinar, for example, you'll want to make sure your survey includes open-ended questions where respondents can write in their specific opinions and observations.
If you are writing an annual report, consider asking many of the same questions you asked in your survey last year. While you may feel inspired to change your survey every year, tweaking the wording of your questions will often result in different interpretations and, thus, produce different answers. People really like to see comparative and directional data to see how an industry has changed over time, and the only way to accurately do that is to ask the exact same questions every year for at least one section of your survey.
Consider Respondents' Demographic Information
The way in which you plan to report your survey data should guide your initial demographic questions as well. For example, while I was crafting the State of Inbound Marketing Survey, we decided to create two final reports -- one for primarily U.S. businesses, as well as an international version. As a result, rather than asking our survey respondents for their geographic region, we added a drop-down menu asking the survey-takers to choose their specific country. Because of this approach, now I'll be able to develop charts and graphs that compare inbound marketing trends in France with those in Germany -- something I wouldn’t have been able to do if I'd only asked people to designate whether or not they were in Europe.
2) Draft an Initial Outline & Get Early Buy-In
Surveys are often launched to support a company’s internal change initiatives -- for example, to measure customers' opinions about a new product or pricing change. Don’t let the information you discover from your survey be a surprise to other departments. You're ultimately going to socialize this data, and you're likely using it to prove a point. To make this more palatable to your audience, give stakeholders a chance to weigh in and offer questions about what they'd like to learn from the survey.
While an efficient survey process certainly can’t fix major corporate culture gaps, you can set your survey up for success by making sure all your internal stakeholders -- especially those who will be affected by the results of the survey -- are involved in the initial question drafting and approval process. It’s much harder for the head of IT to argue that a survey question ignores crucial technology if he or she was the one who wrote the question.
Interview key stakeholders, listen to their thoughts, and make sure your draft questions include some of their actual language. According to Robert Cialdini’s book Influence, people are more inclined to say yes when they read content written in their own words.
One of the best strategies to get sign-off is to draft your initial questions, circulate them to key stakeholders with a date when you need them to sign off, and then fill in the underlying bullets. While this can occasionally mean you need to go through two rounds of approval (one for the question outline, one for the final draft), it also assures that you get approval on the concepts your survey will cover before you delve into the semantics of your draft questions.
3) Write Statistically Valid Questions
Once you have signoff on the direction of your survey, it’s time to draft your survey questions. This is where you'll get into the science and psychology of test design. Keep in mind that you need to be very careful about how you structure your survey questions. Writing statistically valid questions -- ones that don’t prompt an automatic response or skew your data -- is a lot more complicated than you might think.
People pay millions of dollars for experts who understand the psychology behind survey design. But even without an advanced degree in statistical modeling, there are some general rules that every marketer can adopt to write concise, valid questions.
Common Structural Mistakes in Survey Design to Avoid:
- Avoid Leading Questions: Lawyers spend years in law school learning how to formulate questions that include a right answer. For example: “Isn’t it true that inbound marketing budgets have increased in the past year?” To create objective questions, make sure you're crafting questions that don't lead to a specific response.
- Prevent Assumptive Closes: Avoid writing questions that assume an answer at the outset. For example: “How much will marketing budgets increase next year?” What if marketing budgets are actually decreasing? Your question won’t capture that data.
- Don’t Imply Answers: Questions also shouldn’t lead people toward a specific cause-and-effect response. Consider a question that reads, “If you were to boost your email marketing output next year, how much improvement would you expect from your efforts?” Very few people will answer 0% to that question, given the preceding clause.
- Never Coerce: Questions that explicitly coerce answers will also skew your data. While advertisers might get a lot of traction with a campaign that reads, “Will you buy your mother flowers to prove you love her on Valentine’s Day?” your survey won’t produce truly valid results that way.
- Pay Attention to Scale: Even without blatant directional signals, your question scales can imply a “right” answer. The first draft of my survey included a rating scale of: “completely agree, somewhat agree, or disagree.” My colleague correctly reminded me that I was missing a correspondingly moderate disagree option. The new draft, which will produce statistically valid results, now reads: “completely agree," "somewhat agree," "somewhat disagree," or "completely disagree.”
- Clarify Your Language: Just like website optimization techniques, you want to minimize the friction in your survey questions. To do that, keep the language in your questions as simple as possible. Confusing language will ultimately produce either garbled data, or survey abandonment. Check to make sure your questions generally make sense, and that you aren’t using industry buzzwords or other colloquialisms that people outside your company won’t understand.
As we just hinted, keep in mind that people will often stop taking a survey the moment they don’t understand a question. I recommend asking an outside party to read your survey, and ask them to flag any question that makes them think, “Huh?” Chances are, if one person is confused by an awkward question, others will be too -- and it’s safer to change it.
4) Ruthlessly Edit Your Survey
Try to keep your surveys as short as possible. You can expect a steep drop in responses if your survey takes longer than 5–10 minutes. As a rule of thumb, that’s roughly 40 questions.
Furthermore, test your surveys with laypeople -- I like to use family members and people from different departments -- to make sure the average person only takes 10 minutes to answer the whole thing, and doesn’t get stuck on any questions. Using people outside your team will also often call attention to questions that aren't clear, helping you eliminate a lot of industry jargon. If you really need to ask more than 40 questions, just make sure you offer a strong incentive, such as entry into a drawing for a compelling prize.
For example, because I was tasked with writing two major research reports from the 2013 State of Inbound Survey data, and we only do one industry survey each year, I ultimately narrowed down my question pool to 46 total questions. To compensate people for their time, we are offering a chance to win a free iPad or $500, plus a free 60-page ebook on inbound marketing statistics.
You can also do a little back-end magic with the logic in your survey, using question branching features depending on your survey tool (I recommend using QuestionPro, SurveyGizmo, or SurveyMonkey) to expand your total pool of questions. This way, you could have an overall question pool of 60 or 80 questions, but based on your individual survey-takers' responses, a single person will only ever see 40 total questions.
Unfortunately, adding complicated branching schemas to your surveys can also build in a lot of potential for technical difficulties that mess with your data. I once sent a survey to 600,000 people, and somehow built in logic that always skipped question 10 -- which meant we never collected data on that question. So if you do plan on incorporating branching questions, just be sure you're double- and triple-checking your question logic to avoid a situation where you want to throw your computer out the window.
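If your team has any scripting ability, you can sanity-check branching logic before launch. The sketch below is a minimal, hypothetical example -- the branch-map format is an assumption, not any survey tool's actual export -- showing how to walk every possible answer path and flag questions no respondent could ever see (like my skipped question 10):

```python
# Hypothetical sketch: verify that survey branching logic can reach every
# question, catching skip rules that silently hide one. The branch-map
# format here is an assumption, not any survey platform's real API.

# question_id -> {answer: next_question_id}; None means "end of survey"
branches = {
    1: {"yes": 2, "no": 3},
    2: {"any": 4},
    3: {"any": 4},   # the kind of bug to catch: pointing this to 5 hides Q4
    4: {"any": None},
}

def reachable_questions(branches, start=1):
    """Follow every possible answer path and collect the questions seen."""
    seen = set()
    stack = [start]
    while stack:
        q = stack.pop()
        if q is None or q in seen:
            continue
        seen.add(q)
        stack.extend(branches.get(q, {}).values())
    return seen

unreached = set(branches) - reachable_questions(branches)
print(sorted(unreached))  # an empty list means no question is orphaned
```

Run against a map of your real skip logic, any question IDs left in `unreached` are ones your data will silently be missing.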
5) Put Your Survey Questions in the Right Order
In addition to writing valid questions, you should also pay close attention to the order in which you present them. Here are three keys to making sure your survey arrangement delivers the data you want:
- Put Demographic Questions First (And Make Them Mandatory): People historically drop off toward the end of a survey, so ask your must-have demographic questions early. For example, if you were planning to present data on the difference between B2B and B2C strategies for inbound marketing, you'd need to make sure you can parse all your survey responses by their core business model -- so make that one of the first five questions.
- Transition From General to More Specific Questions: Just like any conversion path, a survey should follow a logical thought pattern. You don’t want to confuse your readers by asking them hard questions up-front before they can even wrap their brain around the topic. For the 2013 Inbound Marketing Survey, we started asking readers about their overall marketing priorities, and then we dove into more specific, tactical considerations.
- Randomize Question Choices: For multiple choice questions, check whether your survey tool enables you to randomize the answers. People have a tendency to pick from the first two options in a question set, so varying the order in which answers appear will ensure more representative responses.
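Most survey tools handle randomization with a checkbox, but if you're building a survey form yourself, the idea is simple enough to sketch. This is a hypothetical, tool-agnostic example -- the function name and the convention of pinning "Other" last are my assumptions -- of shuffling answer order per respondent while keeping catch-all options at the bottom:

```python
# Hypothetical sketch: shuffle answer order per respondent to offset the
# tendency to pick the first options shown, while pinning catch-all
# choices like "Other" to the bottom where respondents expect them.

import random

def randomized_choices(choices, pinned_last=("Other",), seed=None):
    """Return a shuffled copy of `choices`, keeping pinned items at the end."""
    rng = random.Random(seed)  # a per-respondent seed keeps order stable on reload
    movable = [c for c in choices if c not in pinned_last]
    fixed = [c for c in choices if c in pinned_last]
    rng.shuffle(movable)
    return movable + fixed

choices = ["Blogging", "SEO", "Social media", "Email", "Other"]
print(randomized_choices(choices, seed=42))  # "Other" always appears last
```

Seeding with a per-respondent ID means each person sees one consistent order across page reloads, while the order still varies across your whole sample.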
6) Incorporate a Separate Copy Edit Round
Once you've drafted valid survey questions, it’s also important to make sure they are clear, concise, and spell-checked. This step should go without saying, but on big projects with multiple rounds of reviews, it’s actually surprising how easy it is to forget a final copy edit -- or assume someone else has already done it. Before you publish your survey, put on your editor’s hat. Check your survey for grammar and copy editing the same way you would a blog article, or any other piece of content that goes out the door.
Here is a brief list of questions I asked my team to use when reviewing the final draft of my survey:
- Can we make the questions clearer or more concise anywhere?
- Do all the verbs agree?
- Are the questions parallel? Beyond the verbs, are all the questions structured alike, the same way you'd make sure all the headers in a blog post are parallel?
- Is the punctuation correct and consistent between questions? (E.g., include all periods or none.)
- Do the questions all seem to be written by one person? With so many people weighing in on a survey, check to make sure your survey questions have a consistent tone and style.
- Does the content in the survey align with our written style guide?
As you can see, it’s easy to get overwhelmed by the number of pitfalls that can trip you up during the creation of a simple survey. As Mark Twain famously said, “There are three kinds of lies: lies, damned lies, and statistics.” Do your best to avoid that by setting out to create a marketing survey an honorable statistician would approve of. And don’t lose heart. Most of these recommendations fall under the common sense umbrella.
I'd love to know what you think of my own survey's questions.
With your help, we're looking to understand marketing trends in 2013 for the 5th annual State of Inbound Marketing Report. By taking part in the most important report of the year, you'll be entered into a drawing to win a free iPad or $500, and receive a complimentary report. (Because iPads are fun, just like inbound marketing!) Take the 5–10 minute survey here.
Then leave a comment on this blog post and let me know what you thought! We'll also contact you when it's published in early April so you can be the first to read and learn from the findings. Thanks for your help!
Image Credit: albertogp123