I see a lot of questions about survey response rate--usually from people who are seeing a response rate of only a couple percent, and they're wondering (1) why, (2) if it's affecting the data, and (3) what they can do to improve.
So here are the answers:
1) It could be any of a number of things
2) It depends on the answer to (1)
3) Fix (1)
Why Do People Complete Surveys?
Rather than asking why people don't complete surveys, let's first ask why they do.
Usually, someone will complete a survey if three things are true:
a) Someone asks him,
b) It's not too much of a nuisance, and
c) He thinks it matters.
Note the complete absence of any direct reward to the person completing the survey. We've found that incentives are usually a poor way to boost response because most people are more strongly motivated by social factors, and the people who take the survey strictly because of the incentive aren't as careful about providing good feedback. (Not that incentives should always be avoided, but any incentive should be more along the lines of a tangible thank-you than direct payment for the survey.)
Here's a quick list of things to look at if your survey response rate is too low:
a) Did you ask the customer to take the survey? It's not enough to merely ask the hypothetical question, "Would you like to take a survey after the call?" You have to actually present the survey to the customer and ask, "Will you take this survey right now?" If you're relying on the customer to remember to take the survey, it's not going to work. Your survey is simply not a high enough priority in the life of an average consumer.
So if you're not actively reaching out to the customer with the survey--phoning, e-mailing, etc.--you're not really asking.
What's more, you have to actually get that message to the customer. The survey has to be delivered in a medium the customer is willing to use (which usually means a phone call if you're surveying about phone service, an e-mail if you're surveying about online service, etc.), and in a way that won't get it filtered out as just another marketing message.
b) Is the survey too much of a nuisance? Always respect the customer's time, and remember that the customer is doing you a favor. If the favor becomes burdensome, you'll lose out.
Customers won't want to take surveys which are too long, too intrusive, or too difficult. We've found that a good upper length for surveys is about five minutes for a phone interview, a handful of questions for an IVR survey, about a page for a written survey, and around 15 questions for a web form. Longer surveys will be refused more often. Avoid intrusive questions like age, income, race, etc., and don't make the questions too hard to understand or answer. You're looking for opinions, not deep thought.
c) Does the survey matter? Even if the survey is delivered properly and not too much of a nuisance, customers will often refuse if they think that the company doesn't really take the survey seriously.
On the flip side, one of the most powerful ways to boost response is to effectively "sell" the survey to the customer by communicating that the survey is important and the result will be carefully reviewed and acted upon by the company.
The best way to deliver this message is by a direct personal appeal from a live human, and the more authority the better. Very few people will refuse if the CEO personally comes on the phone and asks them to take a five-minute survey (on the other hand, very few CEOs are willing to pull that duty).
If the CEO is unavailable, the next best approach is to use a live interviewer who describes herself as someone working on behalf of the company to help improve customer service (for example, she can describe herself as a "quality specialist" or emphasize how the company is using the data to improve service).
The least effective option is an automated request and survey, since the very fact that the survey is automated implies that the company isn't willing to spend a lot of money on it. Even there, however, we've found that an effective recording, emphasizing that this call was specially selected and the company cares deeply about the results, can improve the response rate by a factor of two or three.
The Impact of Poor Response
All else being equal, a poor survey response leads to lower-quality data and a more expensive survey.
When the response rate is low, survey-takers are more likely to be people with strong opinions, and as a result the survey scores aren't truly representative of the customer base. There can be other biases, too, such as technical barriers to taking the survey, or even customer service representatives actively working to keep unhappy customers out of the survey.
The survey also winds up being more expensive, since you have to present the survey to more customers to get the desired number of completed surveys.
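The cost effect is simple arithmetic: the number of customers you must invite is the target number of completed surveys divided by the response rate. A minimal Python sketch (the target of 400 completed surveys and the example response rates are illustrative assumptions, not figures from the original):

```python
import math

def invitations_needed(target_completes: int, response_rate: float) -> int:
    """Invitations to send so that, on average, you expect
    `target_completes` finished surveys at the given response rate."""
    if not 0 < response_rate <= 1:
        raise ValueError("response_rate must be in (0, 1]")
    return math.ceil(target_completes / response_rate)

# Hypothetical target: 400 completed surveys.
print(invitations_needed(400, 0.02))  # 2% response rate -> 20000 invitations
print(invitations_needed(400, 0.50))  # 50% response rate -> 800 invitations
```

At a 2% response rate you contact 25 times as many customers as at 50%, so per-completed-survey cost scales directly with the inverse of the response rate.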
Survey response can vary from well under 1% (which I've heard about in some poorly managed automated surveys) to over 50%. At VocaLabs, we regularly achieve over 50% response in some of our live interview surveys by combining an efficient process with skilled interviewers and a well-designed script.