In This Issue
- Improving Survey Response Rates
- Superior Problem Resolution Helps Put Verizon, T-Mobile on Top in Vocalabs' Study of Mobile Phone Customer Service
- Upcoming Events
A lot of people ask what the response rate for a customer service survey "should be." Response rates depend on a number of factors; we've seen rates ranging from under 1% to over 60% for surveys about a customer service call. Some of the important factors are:
- Is the survey a live interview or IVR? Live interviews get a much higher response rate, sometimes as much as 10x the response of a similar IVR survey, and you can ask many more questions in a live interview before refusal and drop-out rates start to climb.
- How long is the survey? Live interviews longer than five minutes start to see increased refusal rates; IVR surveys longer than 3-4 questions start to see people hanging up before finishing.
- When is the survey administered? In a call center environment, it's best to administer a survey within a few minutes after a call. Surveys administered immediately after the call can yield response rates that are double the response of surveys attempted the next day.
- What kind of relationship do customers have with the company? Companies with loyal customers and ongoing relationships see a higher response rate.
- How is the survey administered? Surveys which customers perceive as being taken seriously by the company get a higher response rate than surveys which come across as pro-forma or half-baked.
And while we're on this topic, here are a couple of other points to keep in mind:
It's often not a good idea to have the agent offer the survey, since agents have a direct stake in the outcome (even if they're not being directly measured, they still want to look good). We've seen a number of surveys where agents deliberately offered the survey only after their best calls in order to inflate their own scores.
For similar reasons, it's often not a good idea to administer the survey at the end of the call; instead, the survey should be an immediate call-back process. Otherwise, it's too easy for agents to get customers to hang up before the survey, and even absent any direct manipulation, angry customers often hang up before the end of the call. Those people are excluded from any end-of-call survey.
So there really is no one simple answer, and there can be a 100x difference between the best and worst response rates. You need to start with the business goals for the survey and work from there to design a process that will most cost-effectively meet those goals. Other considerations are often more important than response rate: Are you getting a useful sample? Are you asking the right questions? Do participants understand the questions the way you expect? How are you analyzing the data? And are you getting the data in a form and manner that makes it useful?
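One practical consequence of that 100x spread is survey cost: the number of invitations (and therefore the outreach budget) you need scales inversely with the response rate. As a rough planning sketch — using purely hypothetical target numbers and assumed response rates chosen to span the under-1%-to-over-60% range mentioned above, not figures from any actual study — the arithmetic looks like this:

```python
import math

def invitations_needed(target_completes, response_rate):
    """Round up the number of invitations required to reach a
    target number of completed surveys at a given response rate."""
    return math.ceil(target_completes / response_rate)

# Hypothetical planning scenario: 500 completed surveys wanted.
target = 500

# Illustrative (assumed) response rates for three survey designs.
for label, rate in [("IVR survey, next-day attempt", 0.01),
                    ("IVR survey, immediate", 0.05),
                    ("live interview, immediate call-back", 0.60)]:
    print(f"{label}: {invitations_needed(target, rate):,} invitations")
```

The same sample size can cost two orders of magnitude more or less to collect depending on the survey design, which is why the design should follow from the business goals rather than from a target response rate alone.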
We recently released our latest research on the quality of phone-based customer service, The National Customer Service Survey – Mobile Phone Customer Service. In 1,142 telephone interviews conducted immediately following a customer service call, 66% of Verizon customers said they were “Very Satisfied” with the experience, with T-Mobile garnering 65% “Very Satisfied.” Sixty-three percent of AT&T and 59% of Sprint customers gave their service experiences the top rating. Results are for surveys conducted between July 2009 and February 2010.
Verizon and T-Mobile’s strong performance can be attributed to their superior call resolution rates. Sixty-six percent of Verizon and 67% of T-Mobile customers reported that their problems were resolved during the call, compared to 64% of AT&T and 63% of Sprint customers. Our research has consistently shown that problem resolution is one of the most important factors in driving overall satisfaction and other business goals such as loyalty and willingness to recommend.
Verizon and T-Mobile’s superior showing is likely due to their ability to serve customers’ needs on the call, which outweighed weaker aspects of their customer service. This independent research is underwritten and conducted by Vocalabs on an ongoing basis to benchmark industry trends in phone-based customer service.
To download a free copy of the Executive Summary, click here.
We'll be presenting at two industry events this summer. Let us know if you’ll be at either show. We’d love to schedule time to meet.
ICMI Annual Call Center Expo (ACCE), June 14-17, 2010, New Orleans, LA
Cellular South/Vocalabs Case Study Presentation: Using Real-Time Customer Feedback to Improve Customer Service and Save Money (Session 402), June 16, 2010, 11:00am - 12:15pm
Austin Fisher, Manager Customer Contact Center, Cellular South
Bonnie Meadows, Customer Satisfaction Analyst, Cellular South
Jeff Richardson, Director, Customer Advocacy, Cellular South
Peter Leppik, President and CEO, Vocalabs
SpeechTEK 2010, August 2-4, 2010, New York, NY
Exhibit Hall - Visit us at Booth #507
Conference Presentation: User Feedback and Speech Applications, Wednesday, August 4, 2010, 11:45am – 12:30pm
Peter Leppik, President and CEO, Vocalabs
Emily Selene de Rotstein, VP Sales and Marketing, Vocalabs
Sooner or later, every speech application will generate user feedback — whether it’s from usability testing, customer surveys, or complaints from your mother. How can you use this feedback to improve? How do you deal with unexpected complaints? What steps should you take to manage the feedback process? Learn how to systematically build customer feedback into your speech project, or deal with the feedback you’re getting from the application you’ve already deployed.
SpeechTEK Labs - Peter Leppik will be on hand to help judge this year’s SpeechTEK labs’ Analytics Lab. Test-drive a variety of analytic tools. Learn how they can help you better understand what your users are doing and what they need. Evaluate various analytic tools in action, and judge for yourself if their results provide valuable insight. Determine the amount of ramp-up time necessary to use each tool and how easy each tool is to use. See if these tools provide you with the kind of information you need to improve your contact-center services and speech-enabled systems.