Vocalabs Newsletter: Quality Times


In This Issue

The Structure of a Customer Service Survey

Customer service surveys generally have the same goal: to collect customer opinions about a particular experience, and how that experience might influence other opinions and behavior. Since the survey is structured around a particular phone call, purchase, or other well-defined event (and usually happens immediately after the event), we want to organize the questions around the structure of the event.

1. Introduction

Every interview begins with a short introduction, where the interviewer introduces himself or herself and confirms that we've reached the right customer. Normally it isn't necessary to ask for the customer by name: the name can be hard to get from the client's system, and asking for it can increase the "creep-out" factor for some customers.

Once we've confirmed we're talking to the right person, we ask if he or she would be willing to take a survey about that experience. It's important to set some expectation of the time involved (short surveys are better) and, if the customer has questions, to answer them to the best of the interviewer's ability.

2. High Level Outcomes

The first section of the survey usually deals with high-level outcomes: the customer's opinions of the company overall and of the customer service experience as a whole.

These are important to put first because we want as unbiased a view as possible of these overarching customer impressions. Later in the survey, when we start asking about specific events during the customer experience, we will potentially remind the customer of things that might color that overall impression.

3. Follow the Experience

Most customer experiences follow a reasonably predictable sequence of events. A customer service call typically begins with an automated system, then (depending on what the customer is trying to do) gets transferred to a live person, possibly transferred to another person or a supervisor, then ends. A retail store visit begins when the customer enters the store, continues as the customer finds and selects the products (possibly with the assistance of a salesperson), and ends when the customer exits the store after paying for her purchase.

Survey questions about specific things that happened during the customer’s experience should follow the sequence of events. This helps the customer better remember what happened and how the customer felt at the time. You are essentially asking the customer to tell the story of the experience, albeit in a very directed and structured way.

4. Wrap Up With Other Details

The end of the survey is where you ask about things related to the customer’s experience which don’t fit neatly into the sequence of events. This is the place to ask why the customer called, whether the customer researched the purchase online before visiting the store, or whether the overall experience was as efficient as the customer wanted.

If there are demographic questions, they should go at the end of this section. I recommend putting demographics as close to the end of the survey as possible, since some customers find these questions intrusive, and you may see a lot of hangups at this point in the survey.

5. Parting Shots

I recommend ending every survey with two specific things.

First, ask if the customer has any other comments or suggestions. This gives an opportunity for any “parting shots,” or feedback which doesn’t fit neatly anywhere else in the survey. This is often where we get the most insightful comments, because customers often want to suggest (or complain about) things which nobody thought of when crafting the survey script.

Second, if the customer had any complaints, the interviewer can choose to offer the customer a follow-up call from the company. Our experience is that customers often decline this follow-up, but always appreciate the offer. This gives the company a chance to fix the problem, possibly save a customer, and discover where their customer experiences are going sour.
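The five-part ordering above can be sketched as a simple data structure. This is only an illustration: the section names mirror the headings in this article, but the sample questions and the `run_survey` helper are placeholders I've invented, not an actual Vocalabs survey script.

```python
# Hypothetical sketch of the survey structure described above.
# Sections are listed in the order they should be asked; the
# questions are illustrative placeholders, not real script items.

SURVEY_SCRIPT = [
    ("Introduction", [
        "Confirm we have reached the right customer",
        "Ask permission and state the expected survey length",
    ]),
    ("High-Level Outcomes", [
        "Overall opinion of the company",
        "Overall opinion of this customer service experience",
    ]),
    ("Follow the Experience", [
        "Questions about the automated system",
        "Questions about the live agent",
        "Questions about any transfer or supervisor",
    ]),
    ("Other Details", [
        "Reason for the call",
        "Demographics (kept as late as possible)",
    ]),
    ("Parting Shots", [
        "Any other comments or suggestions?",
        "Offer a follow-up call if there were complaints",
    ]),
]

def run_survey(script, answer_fn):
    """Walk the script in order, collecting an answer per question.

    answer_fn(section, question) stands in for whatever mechanism
    actually records the customer's response.
    """
    responses = {}
    for section, questions in script:
        for question in questions:
            responses[(section, question)] = answer_fn(section, question)
    return responses
```

The ordering is the point: high-level outcomes come before event-specific questions so later questions can't color the overall impression, and demographics and open-ended comments sit at the very end.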

NCSS Results for Mobile Phones (1Q 2011)

Download Executive Summary

We have released the latest results from our National Customer Service Survey tracking the major wireless carriers: AT&T, Verizon, Sprint, and T-Mobile. This latest report covers the six quarters through the first quarter of 2011, and is based on over 5,000 customer interviews conducted immediately following a customer service call.

We found that call resolution continues to be the biggest obstacle to customer satisfaction in this industry, but being made to go through repetitive or irrelevant steps is a very close second. Streamlining the customer experience (eliminating the unnecessary parts of a customer service call) may be the lowest-hanging fruit for improving business outcomes like customer loyalty and satisfaction.

Sprint's customer service levels in Q1 remained well ahead of where the company was a year ago, though its top-level business metrics (overall company satisfaction, loyalty, and willingness to recommend) slipped slightly. The changes were right at the threshold of statistical significance, and since the reported service levels (call satisfaction, resolution, etc.) held steady from the December quarter, we can only speculate that the decline in business metrics is related to something outside the customer service operation.
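Whether a quarter-over-quarter change like this clears the "threshold of statistical significance" is typically checked with something like a two-proportion z-test. The sketch below is only an illustration of that kind of test; the satisfaction percentages and sample sizes are made up and are not the actual NCSS figures.

```python
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-test: returns (z statistic, two-sided p-value).

    p1, p2 are observed proportions; n1, n2 are sample sizes.
    Uses the pooled-proportion standard error and the normal CDF.
    """
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Normal CDF via the error function; two-sided p-value.
    cdf = 0.5 * (1 + erf(abs(z) / sqrt(2)))
    return z, 2 * (1 - cdf)

# Made-up example: satisfaction slips from 80% to 76%,
# with 600 interviews in each quarter.
z, p_value = two_proportion_z(0.80, 600, 0.76, 600)
```

With numbers like these the p-value lands near, but not under, the conventional 0.05 cutoff, which is the kind of borderline result the paragraph above describes.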

Newsletter Archives