In This Issue:
By Rick Rappe
OK, we admit it. Since launching the VocaLabs blog, The Customer Service Survey, we've been remiss in getting out a regular issue of Quality Times. Readers regularly compliment the newsletter for providing useful information rather than just being a corporate advertisement, and we hope you find this issue no different.
A blog and a newsletter can and should serve different purposes. The blog is a more personal, free-form medium where opinions can range far from any central theme, and that often includes venting on a wide variety of topics.
The newsletter, on the other hand, should have a consistent theme: in our case, topics related to providing quality customer care. And we believe product promotion is perfectly acceptable in a newsletter so long as it also delivers good information and industry insight. We hope you agree that Quality Times does just that.
By Rick Rappe
Recently several vendors have been promoting the idea of asking callers to stay on the line at the end of the call in order to take a short survey. This is often sold as an inexpensive way to gather customer satisfaction data, but we've been very critical of this technique.
Callers who hang up in frustration because they can't get through an IVR never stay on the line to take a survey at the end of the call, so the results can be wildly off track. We have also found repeated examples of contact center agents manipulating results by controlling which callers are transferred to the survey.
To better quantify how biased an end-of-call survey can be, we identified one of our clients that had run both an end-of-call survey and a follow-up survey asking exactly the same questions. Calling customers back provided a more reliable sample of all callers, and we found that the end-of-call survey had inflated the share of callers who were "Very Satisfied" by about 40 percentage points.
Given the number of factors that can influence sample bias, the size of the error will differ from survey to survey. But it is clear that end-of-call surveys can easily carry biases large enough to make the results very suspect.
By Rick Rappe
The latest results of our ongoing SectorPulse study of customer care performance are in. Among the major financial institutions we looked at, Washington Mutual once again stands well above the other banks in the quality of service its customers report, with scores substantially higher than second-place Wells Fargo. In fact, of the five companies we studied, only Washington Mutual has ever posted a customer satisfaction score above the median in our historical database. Bank of America trails Wells Fargo slightly, and Citibank brings up the rear with scores well below the median satisfaction score of all the companies in the VocaLabs database.
Among the mobile phone companies we track, Verizon Wireless and T-Mobile are in a statistical tie for best caller satisfaction. SprintPCS has posted dramatic gains over the past couple of years, though it still trails the two leaders.
We have tracked customer care performance at the major wireless carriers for almost three years now, and the improvements in service quality have been statistically significant, with Verizon and T-Mobile both earning "A" grades in the quarter ended March 31. Cingular, perhaps as a result of absorbing low-scoring AT&T Wireless, has not posted significant gains over the past couple of quarters. Sprint, which once rivaled only the former AT&T Wireless for low satisfaction, has made substantial gains, and now earns better caller satisfaction scores than both Cingular's legacy customers and the merged Cingular as a whole.
One interesting point is the dramatic difference in frustration rates among the companies. "Caller Frustration" is our measure of how difficult customers find it to reach a live agent; it is influenced by factors including intentional barriers to reaching an agent (for example, forcing callers into a self-service system), time spent on hold, and poor design of automated systems. We have found that the Caller Frustration score correlates very strongly with overall satisfaction and single-call completion.
Given that the factors influencing the Caller Frustration rate are entirely under the company's control, the wide variation in Frustration scores means that different companies are taking very different approaches to providing customer service. Some of those approaches are working; others are not.
By Peter Leppik
VocaLabs is sponsoring the Frost & Sullivan Executive MindXchange in Orlando next week. We'll have a display table, including a sample Express Feedback report, and I'll be presenting on trends in the industry. If you're attending, be sure to stop by and see us.