I often hear about companies which implement very sophisticated customer satisfaction measurement through surveys (and other techniques), but then ignore the data. "We do all this work," complain people in charge of the survey, "but when we send our report to management, nothing ever happens!"
Not only is this a waste of money--might as well not survey at all--but it is extremely frustrating for everyone involved in the project.
One reason survey data gets ignored is a psychological phenomenon called confirmation bias: people's tendency to believe information that confirms what they already believe, but to disbelieve (or simply ignore) information that contradicts it.
Think, for example, about how you react when you listen to Rush Limbaugh on the radio. If you're politically liberal, you probably tell yourself all the ways he's wrong before changing the station in disgust. On the other hand, if you already agree with what Limbaugh is saying, you're much more likely to keep listening, and maybe even cheer him on.
This willingness to accept what you already believe, but not what you disagree with, is the essence of the confirmation bias. Confirmation bias isn't always a bad thing: we live in an uncertain, noisy world, and this is one of the ways our minds filter out misleading and irrelevant information.
Inside a company, most people (and executives in particular) believe that they're doing a good job, working hard, and performing at least as well as the competition in the things that matter. When a survey comes back that purports to show the company is failing somehow, the initial instinct is to disbelieve or ignore the data. After all, the survey can't possibly be right, because it doesn't reflect what management knows is the truth.
It doesn't help that a lot of surveys are easy to ignore. It's easy to think of ways that a given survey might be biased, ask the wrong questions, be improperly administered, or reflect only a vocal minority of customers--and many of those problems do exist, though I've found that they are more likely to bias the results in favor of the company than against it.
So if you have a customer survey which shows that improvement is needed, how can you get past the tendency to discount the data? How do you get past the "we're doing OK" belief and move forward?
There's no magic bullet, but there are some things which help:
- Be credible. The data should be as bulletproof as possible, which means taking care to make sure the survey is well structured, properly administered, and addresses management's specific concerns. If possible, the senior decision makers should take an active role in defining the goals of the survey. And even though an outside consultant might not actually be smarter than in-house staff, a credible third party is harder to ignore.
- Be positive. Everyone likes to think that they're doing better than average, but nearly everyone will acknowledge that there's room to improve. It may be best to skip benchmarking against competitors (at least initially), and not worry about the absolute level of performance. Instead, focus on identifying strengths, weaknesses, and specific ways to improve.
- Be repetitive. As frustrating as it is, confirmation bias will break down with enough contrary data. One negative survey is a fluke, two is a coincidence, but three, four, five, or more becomes hard to ignore.