On a recent cross-country flight, I made a pit stop in the restroom. On the way out, I was asked to rate my experience by pressing a button — one of four faces ranging from a happy smiley face to a red angry face. That quick, painless button press was a customer satisfaction survey.
Knowing what your customers think and feel makes all the difference when improving your user experience. So, what is a customer satisfaction survey, and what are the best practices for creating one?
I took a deep dive, so you don’t have to. Below, I’ll share some best practices from my personal and professional experience. I’ll also get some other experts to weigh in. Let’s get started.
A customer satisfaction survey measures how happy customers are with your product, service, or overall experience. You might ask respondents whether your products or services are useful, your website or app is easy to use, your staff is friendly, or your restroom is clean. Here’s an example of a customer feedback survey from HubSpot:
Going about our lives, we take part in customer surveys almost every day — sometimes multiple times a day — often unknowingly.
For example, if I hop in an Uber to head to my local coffee shop, I’ll get asked to rate my driver after they drop me off. After I’ve ordered my cup of joe, the coffee shop might ask me to rate my drink or the staff member who served it to me.
If companies are asking us to take customer satisfaction surveys so often, they must have some value, right? You bet.
In my career as a marketer and co-founder of Omniscient Digital, I’ve used various types of surveys — from basic one-question surveys to longer questionnaires — to get to the root of customer experiences. Here’s why they matter.
Customer satisfaction surveys are about more than confirming how much your customers love your brand. Let me explain where their real value lies.
From my experience as a marketer, customer feedback is a goldmine of information on more than just satisfaction levels. When customers tell you what they like and don’t like about products or services, you’ll find out what you should definitely keep and what you might want to change.
For a real-world perspective, I talked to Amy Maret, HubSpot’s principal researcher for trends and thought leadership and a former market research manager. According to Maret, satisfaction surveys allow businesses to track their most important relationship — their relationship with customers.
“We see over and over that having an exceptional customer experience is absolutely critical to a business’s success, so being able to see in real-time how your customers are feeling, diagnose potential issues, and act quickly as soon as satisfaction starts trending down is a huge advantage to any organization,” Maret says.
I agree with Maret on the importance of customer relationships, and I think that you can strengthen your relationship by simply listening to customers.
If I don’t know why customers churn, I can’t do much to keep them — or win them back if they’ve already left. Essentially, I can’t know a customer’s thoughts if they don’t tell me.
Take my experience at a new coffee shop. I started going regularly until a barista told me that they had stopped serving oat milk. They offered alternatives, but I wasn’t interested and switched shops.
What’s the lesson? I’m not one to complain over minor things, but had I been given a customer satisfaction survey, I would have mentioned my milk issue.
Maret agrees that businesses should ask not only whether customers are satisfied, but also why. In fact, she notes that the best surveys go beyond just measuring KPIs and actually identify which factors impact customer satisfaction.
“As soon as the business sees unsatisfied customers in their survey, they can go deeper into what exactly is causing that dissatisfaction and address those problems directly — which we know from our research leads directly to happier customers that are less likely to churn,” Maret says.
With a well-designed survey given at the right time, you can identify issues and resolve them, helping you increase customer loyalty.
If customers are genuinely happy with your product or service (and not just using it because they can’t find an alternative), they can become an extension of your marketing team — and they do it all for free!
When I moved to a new city, I asked around the office if anyone knew a good place for a haircut. Suddenly, everyone became an advocate for their hairdresser. I got information on everything I needed to know, including prices, personalities, and free extras, like top-tier snacks and drinks.
These recommendations were both honest and enthusiastic, and I found them more convincing than any paid marketing campaign.
If you can identify your brand advocates through customer surveys, you can show your appreciation for them and even incentivize their word-of-mouth marketing.
When it comes down to making important business decisions, you need to get the input of all the important people in the room, and that includes your customers.
From a basic customer survey, I learned that something as simple as a new user interface for a website can annoy previously satisfied customers. As you might imagine, I held urgent discussions with my team to see how we could continue to satisfy loyal customers.
To show you how you can retain customers and have them shout your company’s benefits from the rooftops, I’ll share some customer satisfaction survey examples and templates.
Customer satisfaction surveys come in a few common forms, usually built on a popular response scale methodology like the Net Promoter Score (NPS), the Customer Satisfaction Score (CSAT), or the Customer Effort Score (CES).
Each of these types of customer satisfaction surveys measures something slightly different, so it’s important to consider the specifics if you hope to use the data wisely.
The NPS is a popular customer satisfaction survey methodology, especially for those in the technology space.
It’s rare to see a survey that doesn’t use this famous question: “How likely is it that you would recommend this company to a friend or colleague?”
Ask your customers this question with HubSpot's customer feedback software.
Okay, you got me — this first example doesn’t technically ask customers about their level of satisfaction. But, I’ve found that customer satisfaction and willingness to recommend are often directly linked.
If I don’t like a new product/service, I won’t recommend it to friends. Sounds straightforward? Not exactly. When it comes to subjective experience and preferences, things can get a little tricky.
For example, if I watch a new movie that doesn’t appeal to me because it’s too quirky, I might still recommend it to a friend who loves that kind of thing.
So, how does this affect businesses? Well, if you build a great product, you might have a good NPS, even if not every potential customer can benefit from it.
Let’s look at how the NPS is measured:
On a rating scale of 0–10, detractors score you 0–6, passives score you 7 or 8, and promoters score you 9 or 10. Your NPS is the percentage of promoters minus the percentage of detractors.
For example, if 70% of my respondents are promoters and 20% are detractors, my NPS is +50. If you think +50 sounds disappointing, you might be surprised. According to market research firm B2B International, the average NPS for business-to-business (B2B) firms is +25 to +33.
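If you’d rather let code do the arithmetic, here’s a minimal sketch of that calculation in Python — the function name and sample ratings are just illustrative, not part of any particular survey tool.

```python
# A minimal sketch of the NPS calculation described above (hypothetical ratings).
def nps(ratings):
    """Return the Net Promoter Score for a list of 0-10 ratings."""
    promoters = sum(1 for r in ratings if r >= 9)   # 9 or 10
    detractors = sum(1 for r in ratings if r <= 6)  # 0 through 6
    return round(100 * (promoters - detractors) / len(ratings))

# 7 promoters, 1 passive, and 2 detractors out of 10 respondents -> NPS of +50
print(nps([10, 9, 9, 10, 9, 10, 9, 8, 3, 5]))  # 50
```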
For me, CSAT is the easiest customer satisfaction survey type to use and respond to.
Usually, it takes the form of a question phrased like, “How satisfied were you with your experience today?” and a corresponding survey scale, which is generally from 1 to 5.
Here’s what these numbers mean: 1 is very dissatisfied, 2 is dissatisfied, 3 is neutral, 4 is satisfied, and 5 is very satisfied.
To get your CSAT score, expressed as a percentage, you’ll need this equation:
CSAT = (number of positive responses (4 or 5) ÷ number of people surveyed) × 100
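As a quick illustration, here’s the same formula as a small Python function — the sample ratings are hypothetical.

```python
# A minimal sketch of the CSAT formula above (hypothetical 1-5 ratings).
def csat(ratings):
    """Return the CSAT score, as a percentage, for a list of 1-5 ratings."""
    positive = sum(1 for r in ratings if r >= 4)  # 4s and 5s count as positive
    return round(100 * positive / len(ratings))

# 8 of 10 respondents answered 4 or 5 -> CSAT of 80%
print(csat([5, 4, 4, 5, 3, 4, 5, 2, 4, 5]))  # 80
```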
Opinions differ on this, but I think a good CSAT score is 75% to 85%. Higher is better, but don’t be too worried if you don’t hit a perfect 100% (it’s trickier than you might think).
As someone who has answered countless CSAT surveys and given plenty to customers, I can say that the results are a pretty useful indicator of how a customer feels at that moment, but you need to take the results with a pinch of salt.
For example, if I’m shopping for clothes online and encounter serious issues at the checkout page (no, I don’t want to become a member), I get frustrated. When my payment finally goes through, and I’m asked about my satisfaction level, I might say that it is 1 or 2.
However, when the clothes are delivered quickly and look and fit great, my overall satisfaction level could rise considerably. So, you need to be specific when asking this question (e.g., “How satisfied are you with the checkout experience?”) and get your timing right. (You can easily ask your customers about their satisfaction level with HubSpot’s customer feedback tool.)
In some cases, a simple satisfaction survey won’t give you the answers you’re looking for. Instead, you might want to ask about ease of experience.
Enter the CES.
The CES is a useful metric for measuring a customer’s service experience. Generally, your survey answer options should range from “very difficult” to “very easy.”
I’ve used these CES options to ask customers if they could easily navigate a website to get what they needed and if using an app was challenging.
A few answers of “very difficult” aren’t usually cause for concern, but a high percentage of them means you should consider redesigning your product or service.
Ask your customers this question with HubSpot's customer feedback tool.
It’s impossible to overstate the importance of survey design. It’s the foundation your survey is built on, and it can affect everything from response rates to accuracy. If the design is wrong, the data won’t help you answer your questions about your customers.
With that in mind, I’ll take you through the different types of questions that underpin your survey design.
The first type of survey question asks for a simple binary distinction. The answer options are dichotomous, such as yes/no or thumbs up/thumbs down.
I think you’ll agree that the benefit of this survey type lies in its simplicity. When I use this question type, I want a straightforward answer from customers, nothing more.
In some cases, as you lengthen the survey scale, you end up with data you can’t use. As Jared Spool, co-founder of Center Centre, said in a talk, “Anytime you’re enlarging the scale to see higher-resolution data, it’s probably a flag that the data means nothing.”
If I’m finished browsing an ecommerce website and about to exit the page, I often get asked a binary question about my experience. Because it’s easy to answer, I don’t mind doing it, even if I’m in a hurry.
Bear in mind that binary questions have some shortcomings. For example, if a restaurant asks me the binary, “Did you like your meal?” and I answer, “No,” they won’t know if it’s because I found my food too salty, cold, or both.
When to use: I recommend using this question type when you only need a yes/no answer, and there’s no real ambiguity about what the answer means for your business. If you think your target audience could reasonably answer, “Well, kind of,” then choose a different question type.
Multiple-choice questions have three or more mutually exclusive answers. These tend to be used to collect categorical variables, like names and labels.
When I’m creating surveys that I’m going to conduct data analysis on later, multiple-choice questions are my go-to. That’s because I can easily segment the data based on different categorical variables to get the valuable insights I need.
Let’s say I have an accounting software company. I can ask respondents what their job title or business industry is when soliciting feedback. Then, when I’m analyzing the data, I can compare the satisfaction scores by job title or industry. I might find that CEOs don’t like the software’s expense, while their finance teams love its functionality.
When to use: I use this question type when I want to glean deep insights from survey responses, and you should, too. However, if you just want basic information from customers and don’t have the time or resources for data analysis, skip this one.
Many popular satisfaction surveys are based on scale questions. For example, the CSAT asks, “How satisfied are you with your experience?” and you rate the experience on a scale of 1 to 5 (a Likert scale).
You can design scale questions so that they’re answered using numbers (1-5), words (strongly agree), emojis (😀), and more.
Like other question types, they’re not without faults. For example, if I ask customers to rate my product and they give it a low score, I’m left guessing why exactly they gave me that score.
When to use: Use Likert scale questions and other scale types when you want a more specific response than a binary answer. They’re generally suitable if you don’t care too much about the reason behind a response.
Similar to scale questions, semantic differential scales are anchored by a pair of opposing statements, such as “disagree” and “agree,” but respondents can choose from a wide number of points between them.
That means customers don't have to pick just one or the other — they can choose a point between the two poles that reflects their experience accurately.
Here’s what a semantic differential question looks like in practice:
Say a website owner is testing out a new homepage design. So, they ask visitors a semantic differential question, “Do you like the new website design?” They let respondents answer using two extreme options of “I don’t like it” and “I like it” or multiple unlabelled options in between.
This can give the website owner a good handle on respondents’ actual perceptions of the new design, particularly if there are follow-up questions about specific design elements.
When to use: Use this type of question if you want respondents to be able to interpret the answer continuum as they see fit. This can help you find out their strength of satisfaction or dissatisfaction without introducing bias from labels like “neutral” or “somewhat satisfied.”
None of the surveys I’ve discussed above tell you the why of an experience — you only get the what. To find out the why, you need qualitative data, which you can get from this type of customer satisfaction survey question.
Maret notes that qualitative questions allow customers to tell you the real why behind their satisfaction, without you having to make assumptions about what matters to them.
“When you want to dig deeper into motivations and underlying factors, it is helpful to hear from customers in their own words. But be careful — too many open-ended questions in one survey can cause respondent fatigue — potentially frustrating your customers and damaging your data quality,” Maret says.
I’ll add to her warning with one of my own: Make sure you’re asking clear questions without letting your bias or expectations slip through.
For example, if you write, “Tell us, in your own words, why our products are so great,” you leave little room for collecting customer feedback that’s critical of your products, and that’s a missed opportunity.
In many of my surveys, I’ll add some open-ended questions to take an in-depth look at customer expectations, customer needs, and more. Often, the insights I get from open-ended questions are the most valuable, but it can take a lot of time and resources to mine these insights.
When to use: Use open-ended questions when you have the time to really dig deep into the answers.
You’ll often see this type of question at the end of a survey with multiple question types — I have found that it helps respondents share thoughts that aren’t covered in other questions.
To bring you closer to creating the most suitable customer satisfaction survey for your business, I’ve collected some tried-and-tested best practices for you to follow.
If you want quantitative data, I don’t recommend open-ended questions. But there’s nothing stopping you from using a mixed survey to cover all your bases. If you go this route, I recommend a logical, consistent structure to keep respondents on track.
For example, I find that open-ended questions work well at the end of a group of closed questions or as the final section of a survey.
Remember I said to be careful of your question phrasing? I meant it. Make sure you’re asking the right questions, which should be clear and free from bias.
Avoid wording your questions in a way that suggests a ‘correct’ answer or subtly encourages one response over another. This helps you collect honest and accurate feedback that truly reflects your customers’ feelings and experiences.
Also, keep your survey simple and accessible.
“The simple happy, neutral, and sad face responses are simple enough that your customers will be inclined to answer your survey, and their meaning is universal across cultures and languages,” says Derek Pankaew, the founder of Listening.com.
Pankaew notes that you’ll want qualitative data from users, too. However, Pankaew suggests that you give them the option to elaborate if they wish rather than forcing them to answer required, open-ended questions.
“When you try to force longer answers and deeper participation, you’ll always lower the quality of the overall responses, create survey fatigue, and lose many to drop off before they complete the questions,” Pankaew says.
Customer satisfaction surveys should be concise to respect the respondent’s time and increase completion rates. Aim to create a survey that takes no more than 5-10 minutes to complete.
In it, focus on specific areas you want feedback on, avoiding broad questions that can lead to vague responses. Each question you ask should serve a clear purpose in meeting your overall survey objectives.
If you measure customer sentiment at the right time, you’ll get actionable results — and a chance to make things right with customers before it’s too late. The correct stage will largely depend on your business type, but in general, you should send out surveys shortly after a purchase.
If you’ve lost a customer, an exit survey can tell you why. But if you spot signs of dissatisfaction and survey customers before they’ve left, you can make a real difference.
For example, I once got negative feedback from a customer just before they were about to switch to a competitor. When I told them I understood their concerns and would work with them to make things right, they appreciated feeling listened to and retained our services.
By getting regular customer feedback from different touchpoints (e.g., in-app, email surveys, and social media), you can better understand how customers feel about your product/service in the long term.
For example, for complex apps, I recommend checking in with customers early (within the first week post-purchase) in case they are frustrated by technical challenges. You can offer tutorials to help and check in regularly afterward (e.g., monthly or quarterly).
If you ask too many questions, your customers might start to answer them on autopilot. This makes surveys a waste of time for them and a waste of time and money for you.
When it comes to survey questions, I always recommend quality over quantity, especially when the results will inform company decisions.
Sticking to around 10-15 questions is a good rule of thumb, but also consider how long the survey will take. For example, I answer binary questions really quickly, but I spend some time mulling over my answers to open-ended questions.
By asking similar questions but in different ways, you can get unexpected but valuable answers from customers. This can tease out customer pain points that they hadn’t considered initially.
For example, “Is the app fast?” and “Is the app slow at times?” can yield different answers.
If you’re using rating scales (like 1-5 or 1-10), keep the scale consistent throughout the survey. This prevents confusion and maintains uniformity in how respondents evaluate different questions. Decide whether a higher number should always represent a better outcome or vice versa, and stick to it across all applicable questions.
Choose distribution methods that are most likely to reach your target audience effectively. For example, email might be best if your customers are primarily online; if they engage with your brand via social media, consider using those platforms. Tailoring your survey delivery method to your audience’s online habits can improve response rates.
Guaranteeing that you will keep your survey respondents anonymous can encourage them to give you honest and critical feedback. Make sure to communicate clearly that individual responses will not be traced back to respondents and that you’ll report the data in aggregate form. This builds trust and increases their likelihood of participation.
Don’t just send your survey out without any input from others. I usually get colleagues to read over my questions to check if they hit the mark. Then, I send them to a small group of customers to ensure they’re interpreted correctly before I release them to the masses.
If your survey isn’t anonymous, then check in with customers and reply to their grievances. Address all their points, and try not to get too defensive if you think their responses are inaccurate.
The popularity of large language models (LLMs) and generative AI tools like ChatGPT and Perplexity.ai has changed how marketers create content, from blog posts and social media captions to web copy and email newsletters.
Some marketers and customer support reps also use AI tools to build customer satisfaction surveys and analyze the answers they receive from respondents. According to our 2024 State of Service report, 92% of customer experience teams using AI say it saves them time on the job.
This is because AI-driven tools can automate data collection and analysis, tailor questions based on previous responses, and even predict future customer behaviors based on gathered data.
I spoke to a couple of marketers who have used AI to build and deliver customer satisfaction surveys, and here are some benefits they pointed out:
In Zendesk’s Trend Report, 90% of customers say they’ll spend more money with companies that personalize the customer service they offer them. In fact, 68% said they expect all experiences to be personalized. By sending personalized customer satisfaction surveys, you can learn about the specific issues your customers are facing and tackle them accordingly.
AI algorithms analyze historical interaction data to understand each customer’s unique preferences and behaviors. This information allows AI to customize surveys for individual respondents, presenting relevant questions based on their purchasing history, previous feedback, and predicted future needs.
Eli Itzhaki, the CEO and co-founder of Keyzoo, said his team wasn’t happy with the standard survey they sent out via email after a service call because the response rate wasn’t great, and the few answers they got were bland. So, they transitioned to an AI-driven survey platform, which transformed their survey.
“The AI can tailor the survey questions based on the specific service call. So, if it was a late-night lockout, the survey might ask about the technician's bedside manner. This personalization gets people to engage more, and they feel like their experience is truly being heard,” Itzhaki says.
Through natural language processing (NLP), AI excels at performing sentiment analysis on open-ended questions. This technology analyzes text responses to detect nuances in mood, opinion, and emotion, categorizing them into sentiments like positive, negative, and neutral.
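To make that concrete, here’s a minimal sketch of sentiment analysis on open-ended survey comments using the open-source Hugging Face transformers library — the sample responses are hypothetical, and commercial survey platforms like the one Itzhaki describes below use their own models.

```python
# A minimal sketch: classify open-ended survey comments by sentiment
# with a general-purpose model (the sample responses are made up).
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # loads a default English sentiment model

responses = [
    "The technician arrived quickly and was incredibly friendly.",
    "I waited over an hour and nobody told me what was going on.",
]

for text, result in zip(responses, classifier(responses)):
    # Each result looks like {"label": "POSITIVE", "score": 0.99}
    print(f"{result['label']:>8}  {result['score']:.2f}  {text}")
```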
This allows businesses to sift through vast amounts of unstructured feedback quickly and accurately, identifying prevailing sentiments and emerging themes. Eli Itzhaki told me about how this feature has helped his team at Keyzoo.
“[The] AI-driven survey platform goes beyond just a star rating and reads the open-ended comments to understand the emotions behind the words. Were they relieved? Frustrated? Annoyed by the wait? This helps us pinpoint exactly where we're excelling and where we need to improve,” Itzhaki says.
For example, Itzhaki noticed a recurring theme in comments about customers feeling confused after a technician departed. They were uncertain about how to maintain their new locks or what steps to take if they required another key.
“With the AI‘s guidance, we’ve introduced a post-service email containing helpful tips and resources. It's a slight adjustment, yet it demonstrates our commitment and helps avert similar issues in the future,” Itzhaki says.
AI automates the labor-intensive parts of the survey process, such as distribution, data collection, sorting, and preliminary analysis. This speeds up the process, reduces human error, and frees up human resources to focus on more strategic tasks, such as interpreting AI-generated insights and developing actionable strategies based on those analyses.
Alexandra Dubakova, the CMO at Free Tour, uses AI to give the company’s customer satisfaction surveys a boost.
"AI helps us send the survey to the client using the right channel. Customers are different, and we often use three survey methods depending on the profile that best fits the customer,” Dubakova says.
Dubakova notes that AI also makes it easy for the team to analyze a client’s behavior.
“It goes through their purchases and history and uses triggers and rules to send the right survey to them at the right time. It is time-saving, accurate, and efficient,” Dubakova says.
Despite AI’s susceptibility to bias, you can use AI algorithms to identify survey questions or answer options with potentially correlated biases and correct them before launching the survey. The algorithm looks for patterns in question wording or structure that might nudge respondents toward a particular answer.
“On one survey, AI identified a question worded in a way that encouraged respondents to give far too positive feedback,” says Jason Brooks, the co-founder of UK Linkology. “Once we spotted the issue, we reworded it to be more neutral and representative of genuine feelings.”
AI can improve the objectivity of your surveys, and that’s where its real value lies: capturing how your customers actually feel.
If you need an AI-first customer service platform with which you can build, deliver, and analyze AI-powered customer satisfaction surveys, check out HubSpot’s Customer Service Platform.
After your customer satisfaction survey is done and dusted, you can’t just put your feet up. Here’s what you need to do:
To help bring you new insights into implementing satisfaction survey results, I reached out to Simon Bacher, CEO and co-founder of Ling, a language learning app, to find out his team’s approach.
He says the team at Ling analyzes what aspects of gamification users love, such as the app’s banana points system or the badges earned through completing challenges.
“Furthermore, surveys reveal user preferences for different learning styles, enabling us to tailor the gamification experience to better suit their needs. Once we've implemented changes, we track how users engage with the updated features through app analytics and surveys,” Bacher says.
Customer satisfaction surveys are generally a great idea, provided you put some thought into designing them, distribute them at the right time and frequency, and — most importantly — do something about what you’ve found out from them.
Customers appreciate the simple act of sharing their thoughts, but in my experience, they won’t be truly satisfied unless a brand’s products or services meet their needs.
I hope these tips on survey design, use, and implementation help you in your efforts to improve your customers’ experience. Many of them have worked for me, but I’m always on the lookout for new ways to improve customer satisfaction.
Net Promoter, Net Promoter System, Net Promoter Score, NPS and the NPS-related emoticons are registered trademarks of Bain & Company, Inc., Fred Reichheld and Satmetrix Systems, Inc.
Editor’s Note: This post was originally published in February 2020 and has been updated for comprehensiveness.