The Customer Service Survey
Structure of a Customer Service Survey
Customer service surveys generally have the same goal: to collect customer opinions about a particular experience, and how that experience might influence other opinions and behavior. Since the survey is structured around a particular phone call, purchase, or other well-defined event (and usually happens immediately after the event), we want to organize the questions around the structure of the event.
1. The Introduction
Every interview begins with a short introduction, where the interviewer introduces herself or himself and confirms that we’ve reached the right customer. Normally it isn’t necessary to ask for the customer by name: it can be hard to get the customer’s name from the client’s system, and asking can increase the “creep-out” factor for some customers.
After the customer confirms that we’re talking to the right person, we ask whether he or she would be willing to take a survey about that experience. It’s important to set an expectation for the time involved (shorter surveys are better) and, if the customer has questions, to answer them as best we can.
2. High Level Outcomes
The first section of the survey usually deals with high-level outcomes: the customer’s opinions of the company overall and of the customer service experience as a whole.
These are important to put first because we want as unbiased a view as possible of these overarching customer impressions. Later in the survey, when we start asking about specific events during the customer experience, we will potentially remind the customer of things that might color that overall impression.
3. Follow the Experience
Most customer experiences follow a reasonably predictable sequence of events. A customer service call typically begins with an automated system, then (depending on what the customer is trying to do) gets transferred to a live person, possibly transferred to another person or a supervisor, then ends. A retail store visit begins when the customer enters the store, continues as the customer finds and selects the products (possibly with the assistance of a salesperson), and ends when the customer exits the store after paying for her purchase.
Survey questions about specific things that happened during the customer’s experience should follow the sequence of events. This helps the customer better remember what happened and how the customer felt at the time. You are essentially asking the customer to tell the story of the experience, albeit in a very directed and structured way.
4. Wrap Up With Other Details
The end of the survey is where you ask about things related to the customer’s experience which don’t fit neatly into the sequence of events. This is the place to ask why the customer called, whether the customer researched the purchase online before visiting the store, or whether the overall experience was as efficient as the customer wanted.
If there are demographic questions, they should go at the end of this section. I recommend putting demographics as close to the end of the survey as possible: some customers find these questions intrusive, and you may see a lot of hang-ups at this point in the survey.
5. Parting Shots
I recommend ending every survey with two specific things.
First, ask if the customer has any other comments or suggestions. This gives an opportunity for any “parting shots,” or feedback which doesn’t fit neatly anywhere else in the survey. This is often where we get the most insightful comments, because customers often want to suggest (or complain about) things which nobody thought of when crafting the survey script.
Second, if the customer had any complaints, the interviewer can choose to offer the customer a follow-up call from the company. Our experience is that customers often decline this follow-up, but always appreciate the offer. This gives the company a chance to fix the problem, possibly save a customer, and discover where their customer experiences are going sour.