Do After Call Surveys Work?

One of the big trends in call centers is software that lets the caller opt in to a survey after the call is through. Most often you are prompted, before the CSR answers, to choose whether or not to take the survey. Sometimes the CSR asks you to take the survey when the call is over. In some cases you can also leave voicemail-style comments for the CSR once you’ve answered a few questions. As specialists in customer satisfaction research and call centers, our group is often asked for our opinion about these surveys. Here’s a quick take on our current thoughts:

Potential Upside:

  • You can get decent response rates and track overall satisfaction numbers on a regular basis
  • Recorded messages directed to CSRs provide a real "voice of the customer" that can be motivating and add a new dimension to performance management
  • The customer’s experience is still very "fresh"

Potential Downside:

  • There is some question of customer bias when the customer goes into the service experience knowing that he or she will be rating it afterward.
  • Likewise, if CSRs are asking customers to take the survey, there is an added risk of bias. CSRs will naturally be motivated to ask for surveys from customers they feel have been pleasant and easy to work with, but won’t be as quick to offer the survey to "difficult" customers.
  • It is possible for customers to use the survey as a leverage tool (e.g. "If you want me to give you a good rating you better give me what I want!") during the call.
  • Most of these surveys are short – two or three questions at most. While this can give you good overall satisfaction tracking, it’s no substitute for a more detailed satisfaction analysis that gives you comprehensive, actionable data. Is the cost in software, upgrades, and internal man hours worth the general data you will receive?
  • The number of questions, quality of questions and survey methodology have everything to do with the quality and actionability of the data you get out of it. We’ve heard some after-call surveys that will provide little or no value for the company simply because those who designed the survey clearly didn’t know how to ask the right questions. Software tools generally require that the user program the actual survey themselves, program the data they want out of it, and figure out what it means. Make sure you understand how to use the tool before you invest a lot of resources in it or it could be another one of those whiz-bang "solutions" that gathers dust on your server.

The loudest voices we hear talking up these surveys are the software vendors selling them. I’d be interested to know if your company has used them and what your experience has been – either positive or negative. Post a comment and share your experience!

Flickr photo courtesy of gazer138

  9 comments for “Do After Call Surveys Work?”

  1. June 14, 2006 at 5:25 pm

    Hi, I haven’t had the opportunity to implement these kinds of surveys, so I’m not speaking from direct experience. But I have to agree with your list of potential biases. Because of all the reasons you cited, it seems that surveys of this nature really have limited value.

  2. RS
    June 15, 2006 at 6:31 am

    I have worked in a couple of environments that do customer surveys. The most effective one I’ve been in is where the customer is emailed a survey after the call. Customers are randomly selected to receive the survey. Our CSRs were required to tell the customer (at the end of the call) that they may or may not receive a survey, and they would ask the customer to comment. Our participation rate was very high (60%) and we received valuable information.

  3. June 15, 2006 at 6:48 am

    Thank you both for the comments. I’m glad to hear about your thoughts and experiences.
    RS, we’ve found the same to be true – though our experience with response rates has been much better when customers are phoned 24-48 hours after the call. We still do the occasional mail survey, but it seems to be slowly dying away. We have seen response rates as high as 85 percent, depending on the client.
    Thanks for stopping by!

  4. Ann
    June 19, 2006 at 10:37 am

    Each time I’ve encountered these surveys, the recorded message before I reach the CSR says I’ve been randomly selected and that if I’d like to participate I need to ask the rep to connect me at the end of the call. I’m in the survey research industry, and I’ve never experienced one of these surveys because by the time I wrap up my issues, I’ve forgotten to make the request! It seems a far better user experience would be for the recorded message to ask me to push a button if I’d like to take the survey, and for that selection to be visible to the CSR. Then as the call is wrapping up they can remind me “I see you’ve chosen to take the survey, let me connect you through” so I don’t hang up prematurely. Until then, the surveys have a sampling bias toward people who aren’t juggling 27 tasks.
    Alternatively, one could simply have the reps ask every caller to take the survey, removing the incentive for a CSR to selectively provide better service. In this case I’d have a quick push-button satisfaction rating for the majority of the respondents, with the software randomly sampling a portion of the respondents for verbatim feedback, keeping admin costs to a dull roar.
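    The random-subsampling idea above is simple to sketch. This is a hypothetical illustration only (the 20% sample rate, the fixed seed, and the caller IDs are all made up), showing how software might flag a random fraction of respondents for verbatim feedback while everyone else gets just the push-button ratings:

    ```python
    import random

    def select_for_verbatim(respondents, sample_rate=0.2, seed=42):
        """Randomly flag a fraction of survey respondents for open-ended
        (verbatim) feedback; the rest get only quick push-button ratings."""
        rng = random.Random(seed)
        return [r for r in respondents if rng.random() < sample_rate]

    # Hypothetical caller list for the illustration.
    callers = [f"caller-{i}" for i in range(1000)]
    verbatim_group = select_for_verbatim(callers)
    print(f"{len(verbatim_group)} of {len(callers)} callers asked for verbatim feedback")
    ```

    With a 20% rate, roughly a fifth of callers would be routed to the open-ended questions, keeping the volume of recordings (and the admin cost of reviewing them) manageable.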

  5. June 19, 2006 at 4:35 pm

    Thanks for the great feedback, Ann. I actually had one of these surveys that had the push-button option like you mentioned, but then I forgot about the survey and hung up before it clicked in!
    Excellent suggestions. Thanks for stopping by and adding to the conversation.

  7. Mallory
    October 19, 2008 at 8:26 pm

    In my organization the customer is asked to participate in a 5-question survey. I administer the application that collects the results for reporting purposes, and I provide management with the results. The challenge is creating good questions that obtain accurate results. In addition, our response rate is in the 50% range. I would like to know what types of questions would add value in obtaining customer feedback so we can better serve our customers’ needs.

  8. October 21, 2008 at 7:40 am

    That’s a great question, Mallory. In fact, it’s the $64,000 question. Specific answer(s) to your question are elusive to me because there are so many unknown variables that are specific to your business: the types of calls you’re taking, what intelligence you already have from other surveys or research, etc. If you’re interested in more specific help, please contact me so we can discuss it further.
    In general, after-call surveys are best used for top-of-mind satisfaction questions (e.g., “On a scale of 1 to 5, how satisfied were you with the service experience you just received?”) and perhaps a focus on one or two key drivers of satisfaction (e.g., “Was the issue resolved to your satisfaction?”). Make sure that the questions you ask will provide you with useful information. Comparing customers’ reported satisfaction with the resolution they received to the call center’s reported rates of “one-call resolution” may yield some interesting information. Being able to dig into which call types are driving low satisfaction with resolution will help direct you toward improvement opportunities.
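    A quick sketch of that kind of call-type breakdown (the call types, resolution flags, and scores below are invented for illustration):

    ```python
    from collections import defaultdict

    # Hypothetical after-call records: (call_type, customer_reported_resolved, score 1-5).
    records = [
        ("billing", True, 5), ("billing", True, 4), ("billing", False, 2),
        ("tech",    False, 1), ("tech",   False, 2), ("tech",   True, 4),
        ("orders",  True, 5), ("orders",  True, 4), ("orders",  True, 5),
    ]

    # Aggregate count, score total, and resolved count per call type.
    stats = defaultdict(lambda: {"n": 0, "score_sum": 0, "resolved": 0})
    for call_type, resolved, score in records:
        s = stats[call_type]
        s["n"] += 1
        s["score_sum"] += score
        s["resolved"] += resolved

    # Print call types from lowest to highest average satisfaction.
    for call_type, s in sorted(stats.items(), key=lambda kv: kv[1]["score_sum"] / kv[1]["n"]):
        print(f"{call_type}: avg score {s['score_sum']/s['n']:.1f}, "
              f"customer-reported resolution {s['resolved']/s['n']:.0%}")
    ```

    Sorting call types by average score surfaces the improvement opportunities first; here the made-up "tech" calls would show both the lowest satisfaction and the lowest customer-reported resolution.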
    Of course, even with a 50% response rate, after-call surveys are ripe for response bias. In other words, you may be getting a great response, but your data may be skewed toward people who like to take surveys or who had a particularly negative (or, to a lesser degree, positive) experience. How customers are approached to take the survey can make a huge difference.
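    To see why a healthy response rate doesn’t rule out bias, here is a small simulation. Everything in it is assumed for illustration: the "true" score distribution and the response propensities (unhappy callers answering at 70%, happy callers at 45%) are invented, but the overall response rate still lands near 50%:

    ```python
    import random

    rng = random.Random(0)

    # Simulated "true" 1-5 ratings for 10,000 callers (weights are hypothetical).
    true_scores = [rng.choices([1, 2, 3, 4, 5], weights=[5, 10, 20, 35, 30])[0]
                   for _ in range(10_000)]

    # Assumed response propensities: dissatisfied callers respond more often.
    def responds(score):
        p = {1: 0.70, 2: 0.70, 3: 0.50, 4: 0.45, 5: 0.45}[score]
        return rng.random() < p

    # The survey only ever sees the callers who chose to respond.
    observed = [s for s in true_scores if responds(s)]

    true_mean = sum(true_scores) / len(true_scores)
    obs_mean = sum(observed) / len(observed)
    print(f"true mean {true_mean:.2f}, surveyed mean {obs_mean:.2f}")
    ```

    Even though roughly half of callers respond, the surveyed average comes out below the true average, because the unhappy callers are overrepresented among respondents.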

  9. Susanbailey62
    January 18, 2012 at 6:29 pm

    When rating on numerical scales where 5 is the highest satisfaction, do you think customers can become confused and rate low, thinking 1 is the highest rating?
