The Customer Service Survey

Channel Bias in Surveys

by Peter Leppik on Wed, 2013-09-04 17:00

One of the key decisions in designing a survey is which channel to use: e-mail, IVR, interviews, pencil-and-paper, or something else. Often there are cost and practical reasons for choosing one channel over another. It's also important to keep in mind that the choice of channel will bias the survey results in two important ways.

The first is that customers' responses will change somewhat depending on whether there's a human interviewer, and also whether the interviewer is perceived as a neutral third party. People naturally want to please the interviewer, and will tend to shade their responses towards what they think the interviewer wants to hear.

The other, which in my experience is much more important, is that the channel has a very strong effect on whether customers take the survey at all. Response rates tend to be highest with a live interview; more automated (and more annoying) channels generally see a substantial drop in response.

When the overall response rate is lower, participants bias towards customers who have stronger opinions, and towards customers who are more engaged with the brand.
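This self-selection effect can be sketched with a toy simulation. All the numbers below (channel response rates, the strength of the "strong opinion" effect) are hypothetical illustrations, not data from any real survey; the point is only that when the base response rate drops, the responses you do get skew toward the extremes:

```python
import random

random.seed(0)

# Hypothetical population: each customer rates the brand 1-5.
customers = [random.randint(1, 5) for _ in range(100_000)]

def survey(population, base_rate, extreme_boost):
    """Simulate a survey channel. Customers with extreme opinions
    (1 or 5) are assumed to respond more readily than others."""
    responses = []
    for score in population:
        p = base_rate * (extreme_boost if score in (1, 5) else 1.0)
        if random.random() < min(p, 1.0):
            responses.append(score)
    return responses

def extreme_share(responses):
    """Fraction of responses that are a 1 or a 5."""
    return sum(1 for s in responses if s in (1, 5)) / len(responses)

# High-response channel (e.g. live interview): little self-selection.
interview = survey(customers, base_rate=0.6, extreme_boost=1.1)
# Low-response channel (e.g. e-mail blast): strong self-selection.
email = survey(customers, base_rate=0.05, extreme_boost=3.0)

print(f"interview: {len(interview)/len(customers):.0%} response, "
      f"{extreme_share(interview):.0%} extreme")
print(f"e-mail:    {len(email)/len(customers):.0%} response, "
      f"{extreme_share(email):.0%} extreme")
```

In this toy model roughly 40% of the population holds an extreme opinion, and the interview sample stays close to that, while the low-response e-mail sample is dominated by extreme scores. The exact numbers are an artifact of the assumed parameters; the direction of the skew is the point.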

Sometimes the channel itself will introduce a strong bias in who takes the survey. For example, many companies struggle to maintain accurate e-mail lists of their customers. If one segment of your customer base is less likely to provide an e-mail address, you will bias against those customers. One of my clients has a particular customer segment where they almost never manage to get a valid e-mail, and that segment is also most likely to churn--this makes the client's e-mail surveys a very poor measure of their overall performance.

Finally, the more automated the survey process, the more brittle it tends to be. Where a human interviewer will let you know if there's a problem in the script and can work around a technical glitch, you don't get that level of resilience with an e-mail or IVR survey. If the survey is badly written, customers may abandon it or give nonsense answers; and if something breaks, customers will simply see errors. I've seen companies with broken surveys, oblivious to the fact that they are getting worthless responses. Automated surveys require constant monitoring; you can't just set-and-forget them.
