Recently the team at ChurnZero (a really nifty customer success software with a pretty terrific HubSpot integration) invited me to give a talk about scaling service as you grow your company. To help guide the talk, I told the story of how HubSpot grew support and services and the challenges we faced along the way (I ran support and services for seven years at HubSpot). Check out a recording of the presentation here.

There were a number of really great questions during the broadcast but also a few we didn’t get to. We figured HubSpot customers and Service Hub users like you may have a lot of the same questions, so we decided to answer them here on the User Blog.

If you have your own questions for me, reach out on Twitter.

As a leader, how do you know when [a services need] requires a one-to-one human touch, a one-to-many human touch, or a none-to-many automated touch? When should it be a rep's task versus a machine's?

In an ideal customer world, every customer could have every touchpoint any way they wanted — one-to-one, one-to-many, or none-to-many (automated). And in a purely budget-driven world, you’d use the lowest touch necessary every time.

Back in reality, things are harder. In particular, most companies find that one-to-one human work is the most flexible but the most expensive over the long term. Fully automated touchpoints are inflexible but cost-effective over the long term.

My recommendation here is to first map out your touchpoints. Then, assess (grade yourself, even) which touchpoints you’re doing a great job at. For those you’ve really cracked the code on: awesome, write the code down and automate it! For those you’re still figuring out, keep applying that human touch so you can iterate rapidly and get it right. For those where some blend is optimal, or where you’re constrained on automation resources or budget, look for at least some scale and efficiency within the one-to-one motion.

In this model, eventually every touchpoint ends up automated. But each touchpoint has its own “lifecycle” while the customer experience team figures it out. We’ve seen that work quite well at HubSpot, at least, and we’ve felt the pain of ignoring this rule at both ends: automating too early and forcing iteration through resource-constrained engineering teams is as bad for the company as failing to automate and continuing to have humans turn the crank.

How do you map the customer journey to key success activities like implementation, ongoing support, and QBRs? Your team's day-to-day?

Most companies with more than a few customers have a rough sense of the first few key moments in the customer journey. This question actually mentions what those often are, especially for SaaS businesses (kickoff, implementation, and so on). The first few you often get for “free,” in the sense that you can just trust your gut and reason out what they ought to be.

From there, things get more complicated and there's a severe risk of analysis paralysis. On the quest to understand leading indicators of churn, you’ll go down many rabbit holes, find many false truths, and fail a lot. For instance, you probably know that customers who log in to your software retain more than those who don’t!

“Great!” says the over-eager customer service (CS) leader, “So, let’s call everyone who doesn’t log in and get them to log in!”

Of course, that’s a little misguided: The root cause of this segment’s eventual cancellation isn’t the lack of logins...it’s something else in the health of the account that led to both the drop in logins and the cancellation.

The two paths that are generally useful to pursue here are what I’ll call Health Checks and Alerts. Health Checks are a mechanism in your customer experience for measuring, over time, the health of a customer. That health can be as simple as a score where different factors are worth more or fewer points and correlate with retention, or it can come from more sophisticated tools like machine learning. The key is that it’s a meaningfully accurate heuristic for flagging customers who are falling off the path and could use intervention to improve.
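
To make the simple-score version concrete, here’s a minimal sketch in Python. The factors, point values, and threshold are hypothetical; in practice you’d choose signals that actually correlate with retention for your own customer base.

```python
# A minimal health-score sketch. The factors, point values, and threshold are
# hypothetical -- in practice you'd pick signals that actually correlate with
# retention for your own customer base.

HEALTH_FACTORS = {
    "weekly_active_users": 30,   # breadth of product usage
    "core_feature_adopted": 25,  # has the customer reached first value?
    "support_sentiment_ok": 15,  # no unresolved escalations
    "exec_sponsor_engaged": 20,  # relationship strength
    "invoice_current": 10,       # commercial hygiene
}

AT_RISK_THRESHOLD = 60  # scores below this flag the account for CSM review


def health_score(account: dict) -> int:
    """Sum the points for every factor the account currently satisfies."""
    return sum(points for factor, points in HEALTH_FACTORS.items() if account.get(factor))


def needs_intervention(account: dict) -> bool:
    return health_score(account) < AT_RISK_THRESHOLD


# Example: usage is fine, but the customer stalled before first value and has
# no engaged executive sponsor, so the account gets flagged for outreach.
example = {
    "weekly_active_users": True,
    "core_feature_adopted": False,
    "support_sentiment_ok": True,
    "exec_sponsor_engaged": False,
    "invoice_current": True,
}
print(health_score(example), needs_intervention(example))  # 55 True
```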

What intervention, specifically? Well, that’s often the job of the CSM, because the data itself may not reveal the true root cause of why health is flagging, or the right fix for it. Health Checks should happen on a regular cadence, like monthly; they provide a backbone cadence for the Customer Success Manager's (CSM) role.

The second generally useful path is the converse of Health Checks: Alerts. Alerts are events that can happen at any point in time, and should alert the vendor that “something is up."

For instance, if a power user is deleted from an account, that’s an alert. If you see on LinkedIn that a key point of contact has left the company, that’s an alert. If a customer hits the pricing page or terms of service for a product they don’t yet have, that’s a (potential sale) alert. These events can happen at any time, and they can be good (like the potential sale) or bad (like a key point of contact leaving).
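
If it helps to picture the mechanics, here’s a minimal sketch of alert rules as code. The event names and rules are hypothetical examples, not a real product integration.

```python
# A minimal alert sketch: each rule inspects an incoming event and, when it
# matches, raises an alert for the account team. The event names and rules are
# hypothetical examples, not a real product integration.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Alert:
    account_id: str
    severity: str  # "risk" or "opportunity"
    message: str


def check_event(event: dict) -> Optional[Alert]:
    account = event["account_id"]
    if event["type"] == "power_user_deleted":
        return Alert(account, "risk", f"Power user {event['user']} was removed.")
    if event["type"] == "key_contact_left_company":
        return Alert(account, "risk", "A key point of contact appears to have left.")
    if event["type"] == "viewed_pricing_for_unowned_product":
        return Alert(account, "opportunity", f"Looked at pricing for {event['product']}.")
    return None  # most events don't warrant waking up a human


# Example: a potential-sale alert surfaced from product analytics.
print(check_event({
    "type": "viewed_pricing_for_unowned_product",
    "account_id": "acct-123",
    "product": "Service Hub",
}))
```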

Alerts and Health Checks together form an operational construct that you can use to augment the “easy” success activities like implementation, QBRs, etc. When combined thoughtfully by a CSM, they can also make a customer feel like you’re not only guiding them on the path to success, but also catching them when they fall off. At the least, they’re two mechanisms that can help you move beyond the “easy” activities!

How do you take a very manual onboarding/implementation process and automate it without frustrating customers who want human help?

I had a few thoughts on automation in general above, but this is a slightly different concern. Take the construct from above: Once you know the “right” way to do a given motion, you have an option of automating it.

Let’s use kicking off a new account as an example. Once you know the “right” way of kicking off, you could just turn that into an email, a few videos, and call it a day, right? Well, maybe not, if you think that might run afoul of the level of touch and experience the new account expects.

In that case, I’m a fan of blending the human and machine motions. Here’s what that means: you’d still build the automation you think makes for a good kickoff (those emails, videos, etc.), and you’d still deliver it the same way. But you’d augment that automation with a human action. So when that welcome email goes out, you’d also create a task for a human to call within an hour or two, reference the email, and offer to set up a call.
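
As a rough illustration of that “machine sends, human follows up” pattern, here’s a small sketch. The helper functions and the one-hour follow-up window are hypothetical stand-ins for whatever your CRM or automation tooling actually provides.

```python
# A sketch of blending machine and human motions at kickoff. The helper
# functions and the one-hour follow-up window are hypothetical stand-ins for
# whatever your CRM or automation tooling actually provides.

from datetime import datetime, timedelta


def send_welcome_email(account_id: str) -> None:
    print(f"[automation] welcome email and kickoff videos sent to {account_id}")


def create_task(owner: str, account_id: str, due: datetime, note: str) -> None:
    print(f"[task for {owner}] {account_id}, due {due:%Y-%m-%d %H:%M}: {note}")


def kick_off_account(account_id: str, csm: str) -> None:
    # Machine-led: the automated kickoff content goes out immediately.
    send_welcome_email(account_id)
    # Human-augmented: the CSM calls shortly after, references the email,
    # and offers to set up a call if the customer wants one.
    create_task(
        owner=csm,
        account_id=account_id,
        due=datetime.now() + timedelta(hours=1),
        note="Call the new customer, reference the welcome email, offer a call.",
    )


kick_off_account("acct-123", csm="alex")
```

The specifics matter less than the shape: the human task is triggered by the same automation, so the two motions stay in sync.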

To the customer, how does this look and feel? The beauty is that the customer can read it however they want to: as human-led and machine-augmented (for the “I expect a human to do this” customer) or as machine-led and human-augmented (for the “I don’t want to talk to a person unless I need to” customer).

This gives the CS team some cool advantages and efficiencies, too. In particular, it reduces the number of one-to-one touches, because many customers won’t need the touch, or won’t need as much of it. I’m a fan of this approach for easing your way into a more automation-centric lifecycle.

How should young companies think about starting up a CSM function?

The moment you have a customer, you’re in the business of customer service. Full stop. You may not be good at customer service, or you may not have people with those titles, but make no mistake about it, you’re signed up for it!

The point of a CSM function is usually to maximize the long-term success of a customer and the value of that customer for the business. It’s less often to handle the day-to-day reactive questions, which I’d label as “support” more than “customer success.”

With that definition and goal in mind, when is the right time to start funding a CSM function? I’d offer that it’s at the point when long-term outcomes matter.

If you’re a very, very early company just trying to find product-market fit and get onboarding right, a CSM function might actually not be the right way to spend limited resources.

But what if you have a big installed base that’s aging and your older customers comprise a meaningful percentage of today’s revenue, future distribution/upsell, and risk? Well, you’re gonna need a good CSM function — fast.

A note here: Gaining clarity at the executive level on the purpose and timing of CSM is a critical moment in the life of a company. The CSM function is the result of a mindset that long-term outcomes matter.

Should you group onboarding work with ongoing success work? What's the case for or against specializing onboarding work versus ongoing work?

In an ideal world, you wouldn’t group onboarding work with ongoing success work. Reality may dictate otherwise, but here’s why I take this approach:

Employees — who are humans, by the way — will prioritize work. That’s what humans do. We look for order and categorization. Even if all of a given employee’s work is “the same” (e.g., it’s all “onboarding new customers”), you can still bet they’ll find a way to prioritize it (e.g., “I’ll work with the excited ones first”).

Let’s look at these two buckets in question.

  • Onboarding work: new customers, high excitement, needy, and the work is largely reactive to new customer energy.
  • Ongoing work: older customers, low excitement, not needy, and the work is largely proactive, where the energy is created by the CSM, not by the customer.

With that in mind, what does mixing ongoing work with onboarding work do? How will CSMs prioritize that work?

The answer that I’ve found in practice is that they will prioritize the onboarding work because those customers are new, excited, and asking for help. That work will often expand to fill all of their time, too, since those new customers essentially require infinite work.

Meanwhile, the ongoing work is less exciting and requires more energy creation. In the event that some of this work is “required” of the CSM, they’ll do it...but it won’t expand to fill their time in the same way that onboarding work will because the customers aren’t as needy. Instead, the CSM has to create the “need.” That takes more work and requires focus.

The net on this is that if you really care about ongoing work and long-term CSMing, you’ll separate those functions or very, very carefully handle time allocation on a per-CSM basis. Of course, when you’re small, this is hard. But even understanding the tensions that ongoing and onboarding work have in a CSM’s bucket can be useful to managing the small team.

What do high-functioning relationships between product teams and success teams look like?

I’ll preface this response with a statement born of experience: This takes work. The relationship evolves over time, like any good one, and both parties mature and self-actualize continually. So, my answer to this is also probably a reflection of how “far” I’ve seen the relationship go at the companies I’ve gone deep with, more than the “right” way to do this.

In my view, a high-functioning relationship between support and product involves:

  1. Each department having an honest view of itself.
  2. Each department having an honest view of the other.
  3. Both departments having a shared set of customer-driven values.

Let’s start with the first, Customer Support knowing itself. What is Customer Support good at?

Support has the broadest surface area with customers, and it often goes the deepest when it comes to troubleshooting. Conversely, Support is often weaker at going deep on customer use cases, since it moves from one broken thing to the next, and it has relatively low leverage in what it can do to make the company better; put differently, Support only controls the support interaction, not the product itself that the support experience revolves around.

Second, Product knowing itself: What is Product good at?

Product has the ability to go extremely deep in research and understanding of customer use cases. It also controls the product itself, so it has massive leverage in its ability to create change. On the downside, Product often knows the “real” use cases less well than Support, since it has narrower customer interaction. Further, it lives in a non-reality of its own making: it thinks out into the future about what the product could be, versus what it is.

So what do you do in a world where Product has all the power to change direction but Support has the compass pointing north? You align on a third point of reference that you can both march toward, starting at different places.

I find the best third point of reference is the customer. When the conversation moves away from “We keep getting calls about this” and “We want the product to serve this use case,” and toward “Our customer wants to do this,” the conversation improves dramatically. Focusing on the customer embraces the fact that the two departments have different toolsets and empowers both parties to pursue customer value from different angles.

Changing the toolsets won’t work — Product has its strengths, Support has its strengths. The key is to enable both parties to do their best work, with their own tools and skills, in service of a common goal.

How, then, can Support and Product agree on a set of customer outcomes they want to enable?

The journey to get here uses tools like persona development (“Who is our target customer?”), journey mapping (“How hard is it for our customer to find success with us?”), value proposition refinement (“What do we really do for our customers, again?”), and other notoriously high-level strategic buzzwords.

The thing is, you must have these conversations at the executive level in your company, and you must live by the outcomes in order to create an environment where different departments can focus on the fight to make customers successful, as opposed to the fight with each other. As a bonus, if you get this right, your sales, marketing, and other teams will have clearer direction and align their vectors with the customer better, too.

This is absolutely a tricky topic. I hope the discussion of “honestly understanding each department” and “aligning vectors around the customer” is a useful one; each company finds its own path to cross-departmental alignment and customer outcomes. I’ve seen this approach be useful a few times.

What's the purpose of a CRM for CSMs?

A CRM is a must-have for CSMs. CRMs hold all the metadata, contact information, and historical behavioral data about an account. That data has to live somewhere — how else would a CSM know a customer’s location (metadata), phone number (contact information), or purchase habits (behavioral data)? Sure, it’s possible to spread these across multiple systems, but that’s a pain for a CSM’s day-to-day.

Further, a CRM offers a singular place for multiple job functions to align around the customer. If your salespeople are using a CRM but your CSMs aren’t, your CSMs are blind to the origin of that customer’s experience: the pre-sales process. CRMs are key for understanding end-to-end experience.

A CRM is a must-have, and there’s no excuse not to have one. If you don’t have a CRM yet, HubSpot offers a free one to get going and get organized today.

What's the relationship between churn and NPS? What are NPS best practices when it comes to collecting, viewing, and actioning the data?

In the data I’ve seen, churn and NPS are correlated at scale. So, on a segment basis with statistical significance, NPS and churn do track together. However, on a small sample or even an individual basis, the correlation is typically very weak. This is the source of a lot of the industry’s problems in understanding NPS.

I’m a proponent of NPS. Let me explain why. NPS is a very, very simple process that can lead to great fundamental business change if deployed and discussed correctly. At the end of the day, NPS is a two-question survey: a 0–10 rating (“How likely are you to recommend us to a friend or colleague?”) and an open response. Then you can use a formula to yield a score.
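
For reference, the arithmetic behind the score is simple: the percentage of promoters (9s and 10s) minus the percentage of detractors (0 through 6). A quick sketch, with made-up responses:

```python
# Standard NPS math: respondents scoring 9-10 are promoters, 0-6 are
# detractors, 7-8 are passives. NPS = % promoters - % detractors.

def nps(scores: list[int]) -> float:
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)


# Example: 5 promoters, 3 passives, 2 detractors out of 10 responses.
print(nps([10, 10, 9, 9, 9, 8, 8, 7, 5, 3]))  # 30.0
```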

That score is where many, many problems with NPS come from. Folks, especially executives, begin to obsess about the score. Meanwhile, there’s a treasure trove of feedback in the open response.

The best way, in my view, to use NPS is to track the score but not obsess over it. Obsess over the feedback, and be fearless in your segmentation. Find the segments you care about, then aggregate, itemize, and understand the feedback and what needs to change to improve the customer experience.

Then use all that executive interest to create pressure on the teams that own the components these customer segments are telling you to change. Hold a quarterly meeting and track the progress and how those trends change. The numbers might not move, but the issues will — that’s the sign of progress. Stay on that treadmill, and you’ll eventually see results, even if they take a long, long time.

This operationalized approach to NPS, and non-obsession with the score, seems to work. I learned it from Jill Ward of Intuit, and we practice it at HubSpot. It’s not un-frustrating, I’ll be honest, but it’s indeed the best way I’ve seen to inject the direct voice of your customer, at scale, into your company, and gain sponsorship from the executive team. It won’t work for everyone, but give it a think — it may work for you!

When it comes to onboarding, where are the lines between training, consulting, and implementing?

Great question here about a murky topic. I’d define these terms as follows:

  • Training: teaching a customer how to use a tool to accomplish a specific business task.
  • Consulting: advising business-level changes to a customer. This could be tool-agnostic.
  • Implementing: configuring a tool so that it’s set up to eventually accomplish a certain set of business tasks.

In practice, I think most SaaS companies blend these three in what they offer. As they grow, they perhaps make some effort to break off (and even charge differently for) consulting, in particular. They also often handle “training” in a more one-to-many or automated manner than implementation or consulting. With even more sophistication and growth, organizations will even start to stratify and specify different skill sets or profiles for the employees delivering each of these nouns.

At the end of the day, I think this is an important topic and distinction so that we as CS leaders can reflect on what we are actually offering to our customers when we use any of these terms.

If you have your own questions for me, reach out on Twitter.

Want to connect with others on HubSpot tips, tricks, and updates? Head over to the HubSpot Community to join a conversation or start one of your own.

Net Promoter, Net Promoter System, Net Promoter Score, NPS and the NPS-related emoticons are registered trademarks of Bain & Company, Inc., Fred Reichheld and Satmetrix Systems, Inc.
