One of the big trends in call centers is software that allows the caller to opt-in to a survey after the call is through. Most often, you are prompted, before the CSR answers, to choose whether or not to take the survey. Sometimes the CSR asks you to take the survey when the call is over. In some cases, you can also leave voice-mail type comments for the CSR once you’ve answered a few questions. As specialists in Customer Satisfaction research and call centers, our group is often asked our opinion about these surveys. Here’s a quick take on our current thoughts:
- You can get decent response rates and overall satisfaction numbers on a regular basis
- Recorded messages directed to CSRs provide them with a real "voice-of-the-customer" that can be motivating and add a new dimension to performance management
- The customer’s experience is still very "fresh"
- There is some question of customer bias when the customer goes into the service experience knowing that he or she will be rating it afterward.
- Likewise, if CSRs are asking customers to take the survey, there is an added risk of bias. CSRs will be naturally motivated to ask for surveys from customers they feel have been pleasant and easy to work with, but won’t be too quick to offer the survey to "difficult" customers.
- It is possible for customers to use the survey as a leverage tool (e.g. "If you want me to give you a good rating you better give me what I want!") during the call.
- Most of these surveys are short – two or three questions at most. While this can give you good overall satisfaction tracking, it’s no substitute for a more detailed satisfaction analysis that gives you comprehensive, actionable data. Is the cost in software, upgrades, and internal man hours worth the general data you will receive?
- The number of questions, the quality of those questions, and the survey methodology have everything to do with the quality and actionability of the data you get out of it. We’ve heard some after-call surveys that will provide little or no value for the company simply because those who designed the survey clearly didn’t know how to ask the right questions. Software tools generally require that users program the actual survey themselves, specify the data they want to pull out of it, and figure out what it means. Make sure you understand how to use the tool before you invest a lot of resources in it, or it could be another one of those whiz-bang "solutions" that gathers dust on your server.
The loudest voices we hear talking up these surveys are the software vendors selling them. I’d be interested to know if your company has used them and what your experience has been – either positive or negative. Post a comment and share your experience!