Category: QA Software

QA Today: Pondering Some Foundational Thoughts

This is the first part of a series of posts regarding the state of Quality Assessment (QA) in the call center or contact center.

I’ve been on a sabbatical of sorts for a few months. My apologies to those who’ve missed my posts and have emailed me to see if I’m okay. We all need a break from time to time, and after almost four years I gave myself a little break from posting. While on sabbatical, I’ve been watching the trends in the call center industry and, in particular, what others have been saying about Quality Assessment (QA). I’m finding a sudden anti-QA sentiment in the industry. One client mentioned that the call center conference she recently attended had no sessions or workshops about QA. Another client sent me an article that bemoaned the failure of QA and called for QA to be “modernized.” At the same time, I’m hearing about companies who are shutting down their QA operations and turning to after-call surveys and customer satisfaction metrics to measure agent performance.

I’ve been in this industry for almost twenty years, and I’d like to take a few posts to offer my two cents’ worth in the discussion, though more and more I’m feeling like a voice crying in the wilderness. First, I’d like to make a couple of general observations as a foundation for what I’m going to share in subsequent posts.

  • QA is a relatively new discipline. It has only been in the past 15-20 years that technology has allowed corporations to easily record interactions between their customers and their agents. In even more recent years, the profusion of VoIP technology in the small to mid-sized telephony markets has spread that ability into almost every corner of the marketplace. Suddenly, companies have this really cool ability to record calls and no idea what to do with it. Imagine handing an Apple iPhone to Albert Einstein. Even the most intelligent of men is going to struggle to quickly and effectively use the device when he has no experience or frame of reference for how it might help him. “It can’t be that hard,” I can hear the V.P. of Customer Service say. “Figure out what we want them to say and see if they say it.” The result was a mess. Now, I hear people saying that QA is a huge failure. This concerns me. I’m afraid a lot of companies are going to throw the QA baby out with the bathwater of trending industry tweets rather than investing in how to make QA work effectively for them.
  • We want technology to save us. We are all in love with technology. We look to technology to help us do more with less, save us time, and make our lives easier. We like things automated. We have the ability to monitor calls and assess agents because technology made it possible. Now I’m hearing cries from those who’d like technology to assess the calls for us, provide feedback for us, and save us from the discomfort of having to actually deal with front-line agents. This concerns me as well. If there’s one thing I’ve learned in my career it’s this: wherever there is a buck to be made in the contact center industry you’ll find software and hardware vendors with huge sales budgets, slick sales teams, and meager back-end fulfillment. They will promise you utopia, take you for a huge capital investment, then string you along because you’ve got so much skin in the game. Sometimes the answer isn’t more, better, or newer technology. Sometimes the answer is figuring out how to do the right thing with what you’ve got.
  • The industry is often given to fads and comparisons. Don’t get me wrong. There’s a lot of great stuff out there. We all have things to learn. Nevertheless, I’m fascinated when I watch the latest buzzword, bestseller, and business fad rocket through the industry like gossip through a junior high school. Suddenly, we’re all concerned about our Net Promoter Scores, and I’ll grant you that there’s value in tracking how willing your customers are to recommend your business to others. Still, when your NPS heads south it’s going to take some work to figure out what’s changed in your service delivery system. If you want to drive your NPS up, you have some work ahead of you to figure out what your customers expect and then get your team delivering at or above expectation. And, speaking of junior high, I also wonder how much of the perceived QA struggle comes from spending too much time comparing ourselves to everyone else rather than doing the best thing for ourselves and our customers. I’ve known companies who ended up with mediocre QA scorecards because they insisted on fashioning their standards after the “best practices” of 20 other mediocre scorecards from companies that had little in common with theirs.

Know that when I point a finger here, I see three fingers pointing back at me. We’re all human, and I can see examples in my own past when I’ve been as guilty as the next QA analyst. Nevertheless, I’m concerned that the next fad will be for companies to do away with QA. I know that there is plenty of gold to mine in an effective QA process for those companies willing to develop the discipline to do it well.

Creative Commons photo courtesy of Flickr and striatic

We Can Record Calls. Now What Do We Do?

One of the common frustrations I’ve witnessed in the past 16 years of working with clients and their Quality Assessment (QA) programs happens shortly after a company invests in new phone technology or call recording and QA software. The technology is installed and the switch is flipped. They have this great new software tool with bells and whistles and… no idea how to get started.

Here are three things you’ll need to do:

  • Decide how you are going to report the data and information you generate, and how you want to utilize them. Do you simply want data to show the VP that you’re doing something to measure and improve service? Are you going to use the results for individual performance management and incentives at the agent level? Do you simply want to provide call coaching to your front line agents? Do you want to leverage what you’re learning to impact training, marketing and sales? What you want out of your QA program is the first thing you need to determine because it affects how methodical and critical you need to be in subsequent steps of setting up your program.
  • Decide on and list the specific behaviors you want to listen for. What is most important to your customers when they call (a small post-call survey could help you here)? What specific behaviors are important to representing your brand? What are the important touch-points at different stages of a common customer interaction? It’s easy to get caught up in the myriad of exceptional situations, but when setting up your scale/scorecard/checklist/form you need to focus on your most routine type(s) of call(s). (A minimal sketch of such a scorecard follows this list.)
  • Decide who is going to monitor and analyze the calls. Many companies use the front-line supervisor to monitor the calls. Others go with a dedicated QA analyst. Some companies hire a third party (that’s one of the services our group provides). There are pros and cons to each approach, and many companies settle on a hybrid approach. It’s important to think through which approach will work best for you and your team.
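
To make the second bullet concrete, here is a minimal sketch of a behavior-based scorecard expressed as a simple data structure with a weighted score. The behaviors, weights, and example call are hypothetical placeholders; yours should come from what your customers and your brand actually require.

```python
# A minimal, hypothetical QA scorecard: each behavior you listen for
# gets a weight reflecting its importance to your customers and brand.
SCORECARD = {
    "used_proper_greeting": 10,
    "verified_caller_identity": 20,
    "listened_without_interrupting": 15,
    "resolved_primary_issue": 40,
    "offered_further_assistance": 15,
}

def score_call(observed: dict) -> float:
    """Return a 0-100 score from a dict of behavior -> True/False."""
    earned = sum(weight for behavior, weight in SCORECARD.items()
                 if observed.get(behavior, False))
    return 100.0 * earned / sum(SCORECARD.values())

# The analyst marks what they heard on one routine call.
call = {
    "used_proper_greeting": True,
    "verified_caller_identity": True,
    "listened_without_interrupting": True,
    "resolved_primary_issue": True,
    "offered_further_assistance": False,
}
print(score_call(call))  # 85.0
```

Even a structure this simple forces the useful conversation: which behaviors matter most to your callers, and how heavily should each one count?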

These are only a few broad bullet points to help focus your thinking, but they provide a rough outline of critical first steps. All the best to you as you set out on your QA journey.

If you need help, don’t hesitate to contact me. It’s what we do.

Be Discerning with Post Call IVR Surveys


"After this call, you have the option of taking a brief survey about your experience."

More and more, consumers are being given opportunities to take surveys. They are on the receipts of almost every big-box retailer and restaurant chain. Technology has made it increasingly easy for companies to survey customers through their IVR system. In fact, when talking to contact center managers about doing a focused customer satisfaction survey, I will often hear "we're going to use the IVR to do that."

Be careful. While IVR surveys are a great way to gather certain types of data, you need to be discerning:

  • IVR surveys tend to have built-in response bias. People who "opt in" to IVR surveys generally fall into three categories: customers who had a really good experience and want to tell you about it, customers who had a really bad experience and want to tell you about it, or customers who like taking surveys. You may be missing a large, important, and silent segment of your customer population. (The small simulation after this list shows how much that can skew your numbers.)
  • Depending on the system you use, CSRs can skew the responses. If the survey depends on the CSR to ask, offer, or transfer the customer, you'll likely get bias based on which customers the CSR decided they wanted feedback from.
  • Owning a nice spatula doesn't make you a chef. As with all surveys, the questions you ask make all the difference in the quality of the data you get back. Many companies will sell you the technology, but determining the questions to ask so you get the data you want and need requires a different expertise.
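
To illustrate the first point, here is a small, hypothetical simulation. The satisfaction distribution and the opt-in rates are invented for illustration; the only assumption that matters is that customers at the extremes are more likely to respond than the quiet middle.

```python
import random

random.seed(42)

# Hypothetical "true" population: satisfaction on a 1-5 scale,
# with most customers clustered in the unremarkable middle.
population = random.choices([1, 2, 3, 4, 5],
                            weights=[5, 10, 50, 25, 10], k=10_000)

# Assumed opt-in rates: the delighted and the furious respond most.
opt_in_rate = {1: 0.40, 2: 0.10, 3: 0.05, 4: 0.10, 5: 0.35}
respondents = [s for s in population if random.random() < opt_in_rate[s]]

print(f"true mean: {sum(population) / len(population):.2f}, "
      f"survey mean: {sum(respondents) / len(respondents):.2f}")
print(f"share of middle-of-the-road '3' ratings: "
      f"{population.count(3) / len(population):.0%} of customers, "
      f"{respondents.count(3) / len(respondents):.0%} of respondents")
```

In a run like this, the silent middle makes up half the customer base but only about a fifth of the survey responses, which is exactly the blind spot described above.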

Please don't get me wrong. IVR surveys are a great way to gather data, but they may not give you the complete picture of your entire customer population. You may also find that you're not getting everything you want from the technology.

IBM Using Speech Analytics to Protect Recorded Info

One of the most critical questions companies must ask is how to protect customer information that ends up on call recordings. Until now, most security measures have centered on who has access to recorded phone calls and screen capture videos.

A recent post over at Call Center Script made me aware that IBM recently announced it is developing speech analytics software that will recognize when customers are providing critical, personal information and block it in the recording, as well as block that information from being recorded in the screen capture.
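
IBM's system presumably works on the live audio and screen capture themselves, and the announcement gives no implementation details. As a simplified analogue of the idea, here is a sketch of a redaction pass over a call transcript: flag strings that look like card or Social Security numbers and mask them before the transcript is stored. The patterns are rough illustrations, nothing like production-grade detection.

```python
import re

# Rough, illustrative patterns -- real detection of sensitive data
# (spoken digits, varied formats, context) is far harder than this.
PATTERNS = [
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # card-number-like digit runs
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),   # SSN-like pattern
]

def redact(transcript: str) -> str:
    """Mask anything that looks like sensitive digits before storage."""
    for pattern in PATTERNS:
        transcript = pattern.sub("[REDACTED]", transcript)
    return transcript

line = "Sure, my card number is 4111 1111 1111 1111, expiring May."
print(redact(line))
# Sure, my card number is [REDACTED], expiring May.
```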

This appears to be good news, but there are two critical questions:

  1. Does it really work without the glitches being more of a headache than it’s worth?
  2. Will the cost prohibit all but the largest corporations with the deepest pockets from using it?

Creative Commons photo courtesy of Flickr and simeon barkas.

Who Owns Your QA Data?

We’ve recently been dealing with an interesting issue across multiple clients that might be relevant for Call Center and QA Managers to consider.

Most software vendors who provide programs that capture voice/screen data also provide a process to score phone calls and then report the results. These scoring tools are generally just a database that stores your input, then uses an underlying, built-in formula to calculate a score and tabulate the results. They generally do a decent job with basic calculations, but they are not designed for more robust statistical analyses.

I happened to be in a meeting the other week in which a software vendor was talking to my client about his company’s voice/data capture package. He made the comment, "When you score a call, the data you enter is yours. You can have access to it any time you want." In this case, the software came with a data export feature that spit all the scores into an Excel spreadsheet with a click of a button.

That same week we were part of a discussion with a different client who uses a different software vendor. The client had asked us to analyze calls for our Service Quality Assessment using their call capturing software, which we did with the agreement that we could get the raw data back out of the program to run more complex statistical analyses for our report (which the software could not do). In this case, a conflict arose when the software vendor seemed to treat the client’s data as a hostage they could charge a tidy sum to release.

It reminded me of the classic Looney Tunes cartoon in which Daffy Duck sells Porky Pig a home technology unit complete with a RED button that jacks his house into the sky in case of a tidal wave. When Porky presses the button, Daffy comes along in a helicopter and says, "For a few more dollars I can install the BLUE button to get you down!"

Perhaps you will never need to get data out of your software and the point is moot. More often than not, however, we find that call scoring software does not analyze your QA data the way you want to see it. It’s a simple task if you can just get the raw data into a spreadsheet, where you can run your own calculations using Excel, SPSS, or any number of statistics programs.
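
As an example of what "running your own calculations" might look like, here is a minimal sketch assuming a vendor export that produces one flat row per scored call. The file name and column names (agent, call_date, score) are assumptions for illustration, not any particular vendor's format.

```python
import pandas as pd

# Assumed layout of a hypothetical vendor export: one row per scored
# call, with columns agent, call_date, and score.
scores = pd.read_csv("qa_export.csv", parse_dates=["call_date"])

# Per-agent summary the built-in reports may not give you: sample size,
# mean, and spread, so you know which averages you can actually trust.
summary = (scores.groupby("agent")["score"]
                 .agg(calls="count", mean_score="mean", std_dev="std")
                 .round(1)
                 .sort_values("mean_score"))
print(summary)

# Month-over-month trend for the whole center.
trend = (scores.assign(month=scores["call_date"].dt.to_period("M"))
               .groupby("month")["score"].mean().round(1))
print(trend)
```

From there it's a short step into SPSS or any other statistics package when you need significance tests rather than simple averages.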

I remember asking one of our clients to run a simple report from their QA software tool. She laughed and said that this would require her to run the software’s "standard" reports, which would spit out 200 pages of useless information she would have to weed through to find the data she needed, then manually re-enter it into another program and run the calculations on her own. Arrrrrrrrgghhhh.

If you’re considering buying QA software, it might be worth asking the question "Once I score the calls in your software, can I easily get my raw data back out without you charging me for it?" Beware of the standard reply: "Our software can run ANY report you could possibly want!" Perhaps that’s true, though I have yet to witness it. It’s better to be safe and make sure you can easily export your data whenever you want it.

Technology Does Not a QA Program Make

Quality Assessment in today’s call centers and businesses is an interesting phenomenon because it’s a discipline that emerged out of available technology. In the past decade, technology has made it relatively simple for businesses of all shapes and sizes to capture and monitor the phone calls and computer screens of their Customer Service Representatives. As it was sometimes pitched, the technology promised efficiency and productivity for the businesses that utilized it.

Then companies installed the technology and began recording phone calls and computer screens. More than one client called to tell us that our services would no longer be required because they now had the technology to do it themselves. In each case we graciously and sincerely thanked them for their business and told them to call if we could help them in any way.

Typically, the call came twelve to eighteen months later.

In certain cases our clients spent a year or more doing it themselves, only to reach a point where they were so embroiled in internal fighting over methodology that the technology was not being used to effectively train and coach the CSRs. In other cases our clients pieced together QA programs but were unhappy with the results. The time and resources expended had not produced the productivity and quality improvements they expected. While they were doing it themselves, they knew that they weren’t doing it well. Other companies ended up getting distracted by the demands of the queue and the tyranny of the urgent. They have the technology and still swear that they will use it someday, when they have more time.

Today, we’ve assisted several clients in utilizing their QA technology productively. In a few cases, clients have found that it is cheaper to hire us to do call analysis and call coaching for them. In other cases, we’re still available to help when they get around to actually using their QA technology.

The lesson is this: having the technology does not mean you have the knowledge or expertise to utilize it effectively. It’s okay to ask for help.

Creative Commons photo courtesy of Flickr and Rutty.

Will Post-Call IVR Surveys Replace QA?

Dr. Jon Anton was recently asked what he thought call centers would look like in five years. Among his responses was this opinion:

2. I see a major move to use the caller in the call quality monitoring process through post-call IVR surveys with the results linked immediately back to the agent via an agent-specific dashboard indicating caller feedback in real-time, including a verbatim response from the caller indicating (by the caller’s voice) what the agent could have done better in handling the call. I see this new technology, replacing some, if not all, of the currently expensive and ineffective call monitoring software using internal quality teams.

While I agree that we will see an increase in the use of post-call survey technology, I think it will ultimately backfire if companies try to replace their QA efforts with it. Here’s why…

  1. Post-call IVR surveys have built-in bias issues. Instead of getting a random sample of customers, you tend to hear from customers who had either a very positive or a very negative experience. Don’t get me wrong, we’ve seen post-call IVR surveys provide some value – but the value of the information you receive is limited.
  2. Post-call surveys may tell you what customers thought of the experience, but they don’t provide any context for what did or did not happen within the phone call to produce that result. Without quantifying what CSRs are and are not doing and saying, how can they make improvements? A CSR may consistently get negative comments from customers – and that might motivate the CSR to want to improve – but it doesn’t necessarily provide a road map for what they need to do differently.
  3. While I agree that call monitoring software is expensive and ineffective, my experience tells me that the problem lies with the user – not the technology. The answer is to apply better methodology to make QA more effective and less expensive. Post-call IVR surveys have the same issue. It’s expensive technology that requires expertise within the call center to effectively program the questions, determine the methodology, and analyze the resulting data. Call centers that abandon call monitoring software for post-call IVR surveys will find themselves in the same – if not worse – situation: "We have this cool technology, but we don’t have any idea how to use it."

Dr. Jon may be right. Let’s be real – companies tend to throw money and resources at the latest, greatest technology to jump on whichever bandwagon the industry is riding at the moment. I just hope this time we look before we leap.

Quality is More than Window Dressing

I’m writing this morning from the lobby of a business hotel in the Twin Cities. It’s a nice place. It’s been remodeled recently and it looks great. It’s well appointed with trendy-cool furniture, mosaic tile by the front desk, and interesting, fashionable artwork hung all over. I’m tellin’ ya – staying here is like being in a virtual Target ad.

But I couldn’t get the water in my shower any hotter than the cool side of lukewarm, and the "premium" coffee here in the lobby is still just crappy dark-brown water served in minuscule, wimpy Styrofoam cups about the size Starbucks uses to hand out samples. When they remodeled the lobby and the rooms, they should have paid attention to the things that really matter to their guests.

It reminds me of the many companies I’ve witnessed who spend an amount equal to the gross national revenue of a third-world country for state-of-the-art call capturing software and call scoring tools with all the bells and whistles money can buy. They can brag to their peers and customers about their commitment to "service" (spending money = commitment... doesn’t it?). Nevertheless, they still don’t have any idea how to use it, nor do they have a foundational commitment to quality to use it well. It’s just trendy artwork in a hotel room with lukewarm water and crappy coffee.

You know, it doesn’t matter how well you dress things up for the customer; your true commitment to quality and service will ultimately be revealed for what it is.

Related Posts:
Owning a Spatula Doesn’t Make Me a Chef
Buyer Beware! QA Software Considerations
Another "Gut" Check

Flickr photo courtesy of esotericsean has been removed by request

Do After Call Surveys Work?

One of the big trends in call centers is software that allows the caller to opt in to a survey after the call is through. Most often, you are prompted, before the CSR answers, to choose whether or not to take the survey. Sometimes the CSR asks you to take the survey when the call is over. In some cases, you can also leave voice-mail-type comments for the CSR once you’ve answered a few questions. As specialists in Customer Satisfaction research and call centers, our group is often asked our opinion about these surveys. Here’s a quick take on our current thoughts:

Potential Upside:

  • You can get decent response rates and overall satisfaction numbers on a regular basis
  • Recorded messages directed to CSRs provide them with a real "voice of the customer" that can be potentially motivating and add a new dimension to performance management
  • The customer’s experience is still very "fresh"

Questions:

  • There is some question of customer bias when the customer goes into the service experience knowing that he or she will be rating it afterward.
  • Likewise, if CSRs are asking customers to take the survey there is an added risk of bias. CSRs will be naturally motivated to ask for surveys from customers they feel have been pleasant and easy to work with, but won’t be so quick to offer the survey to "difficult" customers.
  • It is possible for customers to use the survey as a leverage tool (e.g. "If you want me to give you a good rating you better give me what I want!") during the call.
  • Most of these surveys are short – two or three questions at most. While this can give you good overall satisfaction tracking, it’s no substitute for a more detailed satisfaction analysis that gives you comprehensive, actionable data. Is the cost in software, upgrades, and internal man-hours worth the general data you will receive?
  • The number of questions, the quality of the questions, and the survey methodology have everything to do with the quality and actionability of the data you get out of it. We’ve heard some after-call surveys that will provide little or no value for the company simply because those who designed the survey clearly didn’t know how to ask the right questions. Software tools generally require that the user program the actual survey themselves, define the data they want out of it, and figure out what it means. Make sure you understand how to use the tool before you invest a lot of resources in it, or it could be another one of those whiz-bang "solutions" that gathers dust on your server.

The loudest voices we hear talking up these surveys are the software vendors selling them. I’d be interested to know if your company has used them and what your experience has been – either positive or negative. Post a comment and share your experience!

Flickr photo courtesy of gazer138