The Customer Service Survey

How Not to Do a Phone Interview

by Peter Leppik on Mon, 2013-06-10 13:42

I'm a big believer in using phone interviews to get real-time in-depth feedback on customer service. There's simply no better way to have an actual conversation with a customer, get him or her to open up, and find out how the experience was.

But there's a right way and a wrong way.

Over the weekend I was asked to do a survey about a phone call I made to a nurse line two weeks ago. My wife had a bicycle accident, and I needed help deciding whether she needed to go back to the doctor for a possibly infected wound. She's doing much better now, thanks.

The interview was practically a case study in how not to do a customer service survey:

  • The interview took place almost two weeks after the original call. Pretty Good Practice: The faster you can reach the customer the better--calling back within a few minutes is best, though in this particular situation waiting several hours may be more appropriate. People's memories start to degrade very quickly, and after two weeks I remember very few details of my call other than the fact that it took place. That means that my answers are not going to be very detailed or specific, and I won't be able to give meaningful answers to a lot of questions.
  • At no time did the interviewer tell me how long the survey would take. Pretty Good Practice: Always inform customers how long the survey will take before asking if they'll participate. Seriously. They're doing you a favor, the least you can do is have a little respect.
  • The survey was 14 minutes long (according to the timer on my phone). Pretty Good Practice: Keep phone interviews under five minutes if at all possible. Fourteen minutes is way too long, and likely indicates the people designing the survey didn't have the discipline to cut questions which aren't useful or necessary. On the other hand, this explains why the interviewer didn't tell me how long it would take. Again, have some respect for customers and their time.
  • Many of the questions were slight variations of each other. Pretty Good Practice: Make sure different questions are really asking different things. This is where forcing yourself to write a five-minute interview script is a useful discipline. It quickly becomes obvious that you don't need to have a question about whether the nurse was "caring" and then another question about whether she was "helpful." Pick the two or three things which are most important and leave the rest off.
  • Some of the questions were confusing or didn't apply to the situation. The interviewer was not allowed to do anything but repeat the question. Pretty Good Practice: Adjust questions to make sure they make sense, and give the interviewer some latitude to help the customer understand the question. We all write bad survey questions from time to time, I get that, and you don't want interviewers going off the rails and re-interpreting the survey. But don't sacrifice accuracy on the altar of consistency, either. One of the great advantages of an interview over an automated survey is the interviewer can help a confused customer understand the intent of the question. Here's my slightly embellished recollection of how one of the questions on my interview went:

    Interviewer: How satisfied are you with the wait?
    Me: What do you mean by the wait? The wait to speak to a person?
    Interviewer: I'm only allowed to read the question, you can interpret it however you want.

    Seriously? Would it have killed them to let the interviewer tell me that the question was about how long it took for someone to answer my call? Knowing what I do about call centers and the likely purpose of the survey, I could infer that this was what the question was about. But someone else might have interpreted it to be about the wait at a clinic or something completely different, and that data would be useless. And fix the question, while you're at it.

  • There was hardly any free response on the survey. Pretty Good Practice: On a phone interview, have plenty of follow-up questions and opportunities for customers to give more detail. Another great advantage of the interview format is you can do this. Especially on such a long survey, not taking advantage of the customer's willingness to go beyond the scripted questions is a terrible waste of everyone's time.

  • The interviewer sounded bored and monotone. Pretty Good Practice: Phone interviewers should sound engaged and try to establish a rapport with customers. I seriously did start to wonder if the interviewer was a recording, so every now and then I would toss in a question of my own to make sure I was still talking to a person. A shorter, better-designed script would have helped a lot. It also helps to hire interviewers who are naturally engaging. Who knows--this interviewer may actually sparkle most of the time, she's just been beaten into submission by the awful dullness of it all.

  • The survey used "Which of these statements best fits..." type questions on a phone interview. Pretty Good Practice: Don't use these on phone interviews. Really. Just don't do it. Any time I see a question like this on an interview script, I know it was written by someone who never had to actually do an interview. These questions are suitable for written surveys, where the participant can read the choices, quickly re-read the most relevant ones, and doesn't have to remember all the options. In a phone interview, it takes a long time to read the choices, and the participant can only remember so much at once. If the customer has forgotten one of the choices, the interviewer will need to re-read the options, which takes even longer. Find some other way to get the data you're looking for.

  • The survey asked demographic questions. Pretty Good Practice: Get the demographic data somewhere else, or do without. Some researchers love their demographic questions, but experience has shown that they rarely lead to insights on a customer service survey. Surveys for other purposes are a different matter, but if you're trying to learn about a specific customer experience, the demographics of the customer are usually less important than other factors like who the customer spoke to, the reason for the call, etc. There are many reasons not to ask for demographics: it takes time (in what should be a short survey), customers often find the questions intrusive, and there's a decent chance you can get the same information elsewhere.
