The Customer Service Survey

Facts vs. Opinions

by Peter Leppik on Wed, 2010-09-22 17:27

Just like I learned in second grade, there are (broadly speaking) two types of survey questions: questions about facts, and questions about opinions. Most scripts our clients use contain both. The customers' opinions usually matter the most to the company, but the factual questions are essential to understanding why the customer feels the way he does.

Facts are easy to deal with on a survey: you just ask the question (clearly and understandably) and the customer answers (or sometimes not). Factual answers tend to be the same no matter when in the survey you ask them, and they won't change much if you ask the question in a slightly different way.

Opinions are trickier, since they depend on the customer's emotional state and can be influenced by other questions on the survey. Customers also tend to answer all opinion questions on a survey similarly: if a customer was very satisfied with the overall experience, she is also likely to rate everything else about the experience very highly (the converse is also true).

It's possible to ask about the same thing in the form of either a factual or an opinion question--and sometimes it makes sense to do both. For example, on the NCSS surveys of phone service, we ask customers both how long they had to wait to speak to someone, and whether they felt the wait was reasonable. Having both yields a goldmine of information about customers' threshold of pain for having to sit on hold.
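One way to mine that pairing is to bucket respondents by their reported wait time and see what fraction in each bucket felt the wait was reasonable. The sketch below is a hypothetical illustration, not the NCSS methodology; the data and the five-minute bucket boundary are invented:

```python
# Hypothetical sketch: pair each respondent's reported wait (a factual
# answer) with whether they felt the wait was reasonable (an opinion
# answer), then bucket by wait time. All data here is invented.

from collections import defaultdict

# (reported wait in minutes, felt the wait was reasonable?)
responses = [
    (1, True), (2, True), (3, True), (4, True), (5, False),
    (5, True), (7, False), (8, False), (10, False), (12, False),
]

buckets = defaultdict(lambda: [0, 0])  # wait bucket -> [reasonable, total]
for wait, reasonable in responses:
    bucket = "0-4 min" if wait < 5 else "5+ min"
    buckets[bucket][1] += 1
    if reasonable:
        buckets[bucket][0] += 1

for bucket, (ok, total) in sorted(buckets.items()):
    print(f"{bucket}: {ok}/{total} felt the wait was reasonable")
```

With real survey data, the wait time where the "reasonable" fraction drops off is a direct estimate of the threshold of pain.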

For analysis purposes, the facts of a customer experience help explain the underlying drivers of customers' opinions: are customers less satisfied with support calls than with sales calls? Are customers more annoyed at being asked more questions by an automated support system, or at being transferred when the call was misdirected?

With opinion questions, sometimes the most interesting gems are not in how the customers' different opinions correlate (since they usually correlate very strongly), but how they differ. Do customers consistently give a lower satisfaction score to one portion of the experience than another? Is the customer happy with the particular experience you're asking about, but unhappy with the product or company as a whole?
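A simple way to surface those differences is to look at the per-respondent gap between each aspect score and the overall satisfaction score, rather than the raw correlations. The sketch below is a made-up illustration; the aspect names and 1-5 scores are assumptions, not real survey data:

```python
# Hypothetical sketch: since opinion ratings tend to rise and fall together,
# compute each aspect's average gap versus overall satisfaction. An aspect
# with a consistently negative gap is the weak spot in the experience.
# All scores (1-5 scale) and aspect names are invented.

surveys = [
    {"overall": 5, "hold_time": 4, "agent_courtesy": 5, "issue_resolved": 5},
    {"overall": 4, "hold_time": 2, "agent_courtesy": 4, "issue_resolved": 4},
    {"overall": 3, "hold_time": 1, "agent_courtesy": 3, "issue_resolved": 3},
]

aspects = ["hold_time", "agent_courtesy", "issue_resolved"]
for aspect in aspects:
    gaps = [s[aspect] - s["overall"] for s in surveys]
    avg_gap = sum(gaps) / len(gaps)
    print(f"{aspect}: average gap vs overall = {avg_gap:+.2f}")
```

In this toy data, hold time scores run well below overall satisfaction even when the overall ratings themselves vary, which is exactly the kind of consistent shortfall the raw correlations would hide.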

I've found that many customer surveys skimp on the factual questions, and this is a serious mistake. It's easy to ask about the specific events of a customer's experience, and having that data makes it much easier to understand why the opinions are what they are and how to improve.
