I'm not a big fan of automated surveys, but one thing they usually get right is collecting immediate, real-time feedback. Because one thing computers are better at than people is speed. Raw, blinding, trillions-of-operations-per-second speed.
Except apparently at AT&T, where I was recently asked to take a three-question survey that took almost three hours to complete. Oh, and there were actually four questions.
The survey was following up on a call I had made to AT&T's customer service. Even though I had spent 15-20 minutes talking to a person at AT&T, the company decided to robo-text me for feedback the following day. Apparently it was too much trouble to survey me the same day, or to have an actual person talk to me, so it's clear from the start that this survey isn't really a big priority for AT&T.
But the real head-scratcher is that after I responded to the first question of the three-question survey it took 42 minutes for AT&T to reply with the second question. I don't know what AT&T was doing in those 42 minutes, but I'm pretty sure it wasn't because they wanted to take the time to carefully compose a thoughtful reply.
Then it took 55 minutes to send me the third question (of three). And 59 minutes after I replied to that question, they sent me a fourth question, which somehow didn't count as a question because... why, exactly?
If you're keeping track, it took two hours and 50 minutes to ask and answer all four questions on a three-question survey (including the time it took me to reply).
I'm assuming that AT&T did not intend for this survey to take so long; something was probably broken in their back end. Nevertheless, the survey sent a very strong message that AT&T did not actually care about my feedback and didn't respect me as a customer:
- By spreading the survey out over three hours, AT&T managed to interrupt me four times rather than just once. Remember that I am doing AT&T a favor by taking time out of my day to answer their questions. The least AT&T can do is respect that favor and minimize the disruption.
- The fact that the survey process was clearly degraded in some fashion suggests that either nobody at AT&T noticed, or nobody cared enough to fix it. That sends a message that the company doesn't really put a priority on this feedback.
- I also gave low scores on the first question, and in the last question told AT&T that I planned to take my business elsewhere. That should be ringing alarm bells: I'm a longtime, highly profitable customer for AT&T. Yet the company has not attempted to contact me in any way to save my business, and as a result it will lose that business within the next few days. My only conclusion is that AT&T doesn't really care about me as a customer.
- And as for the surprise fourth question, I really can't explain that one. Given that it was the open-ended comments question, pretending it doesn't exist suggests that my suggestions and comments somehow don't count. Or maybe they thought they had to lie about the length of the survey to get me to take it at all. Or maybe AT&T honestly doesn't think my comments are as important as the numerical scores I provide. Whichever it is, it says that AT&T either doesn't care what I think, doesn't respect my intelligence, or doesn't consider improving itself a priority.
One of the key principles we adhere to at Vocalabs is that customer surveys should respect the customer and leave a positive impression. This survey doesn't.