Category: Customer Research

Great Resolutions for 2010

Looking ahead to 2010? What are you going to accomplish? What are your goals? How are you going to make this year a banner year?

Here are a few suggestions to make 2010 a year of continuously improving quality:

  • Survey your customers. The most fundamental mistake companies make in assessing quality is ignoring what their customers think, want, and expect. The second most fundamental mistake is making educated guesses about those expectations based on questionable data. Make an investment in a truly random, focused survey of customers right after they've called your company. Good data will allow you to make tactical, actionable decisions that impact satisfaction, loyalty, and the bottom line.
  • Clean up your QA process. Many companies have a QA process that was cobbled together on the fly just to get it done. It's got problems. The scale has issues. The CSRs have legitimate complaints. The QA team and supervisors fight like an old married couple. Make this year the year that you get the program streamlined, cleaned up, and effectively working for you.
  • Get calibrated. You know you've got "QA Nazis" on your team who are using the program to bludgeon CSRs into submission. You've also got "QA Hippies" who are letting CSRs get away with murder in an altruistic effort to boost their self-esteem. You know you need to get everyone on the same page and get them working together, but it's been too easy to just ignore it. It's a new year. It's a new decade. What a great time to start getting everyone working together and analyzing calls consistently.
  • Increase QA's reach. QA data can be utilized for so much more than agent evaluation. What are you hearing from customers about the latest promotion? How can you correlate spikes in talk time to call types? What issues drive the most unresolved calls? Many call centers struggle with getting upper management to support investments in QA. What if QA provided more than just a CSR scorecard? What if QA provided tactical data that helps operations, marketing, IT, and sales make better decisions? It's a great year to start thinking outside the box with the things you're measuring!
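For teams that want to put a number on the calibration problem above, a simple starting point is inter-rater agreement: have several analysts score the same calls and measure how often they agree. Here's a minimal Python sketch with made-up pass/fail marks (hypothetical data, not any real QA program):

```python
from itertools import combinations

# Hypothetical pass/fail marks from three analysts scoring the same five calls.
scores = {
    "analyst_a": [1, 1, 0, 1, 0],  # the stricter analyst fails more calls
    "analyst_b": [1, 1, 1, 1, 1],  # the lenient analyst passes everything
    "analyst_c": [1, 1, 0, 1, 1],
}

def agreement(marks1, marks2):
    """Fraction of calls on which two analysts gave the same mark."""
    matches = sum(m1 == m2 for m1, m2 in zip(marks1, marks2))
    return matches / len(marks1)

# Compare every pair of analysts; low agreement signals a calibration problem.
for (name1, m1), (name2, m2) in combinations(scores.items(), 2):
    print(f"{name1} vs {name2}: {agreement(m1, m2):.0%} agreement")
```

Low pairwise agreement flags which analysts need to sit in on calibration sessions together; more formal statistics such as Cohen's kappa go a step further and correct for chance agreement.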

A very Happy New Year to all of our readers and subscribers. Here's to a prosperous, high-quality year in 2010. If there's any way our group can help your team achieve your goals, please let me know!

Creative Commons photo courtesy of Flickr and optical_illusion

Iowa Telecom’s Healthy Mix of Service & Sales

I had a great time Wednesday at the Association of Contact Center Professionals (ACCP) get-together at Iowa Telecom's (ITC) Newton, Iowa contact centers. ITC was a fantastic host and gave those of us in attendance a terrific overview of their contact center operations along with a tour of their facilities.

Our group has worked with ITC for many years providing Customer Satisfaction surveys. We also helped them get their QA process off the ground. I have to admit that I felt some pride as I toured the facilities and saw what a great job their team was doing.

One of the things I've witnessed ITC doing well is merging cross-sells and up-sells into their customer service environment. It is an example of how the "cost center" mentality normally associated with Customer Service can evolve into a revenue generator for the company. A few highlights:

  • The CSAT research our team provides ITC clearly established that a good percentage of ITC customers are sometimes or always willing to hear up-sell or cross-sell offers if their primary issue has been resolved. This knowledge provided a firm foundation on which to build their up-sell process.
  • The management team has done a good job of establishing realistic guidelines for when CSRs should, or should not, present offers – and which offers to present. The up-sells are natural value adds to the customer's communication needs (not an irrelevant add-on like time-shares in Jamaica).
  • The QA team focuses on a healthy mix of customer service skills and sales skills. Sales opportunities are aggressively tracked right along with soft skills and resolution.
  • Utilizing a combination of outbound surveys and post-call IVR surveys, ITC is keeping their finger on the pulse of the customer. If there is a shift in the winds of customer satisfaction, they should know and be able to respond.

We all know that great customer service builds customer loyalty. Leveraging that into immediate sales opportunities is a delicate balance. Iowa Telecom is walking the tightrope well.

Thanks to Tim Lockhart and the Iowa Telecom team for being such generous hosts. Thanks to the local ACCP board for putting the event together. Thanks to Avtex for sponsoring the event.

Creative Commons photo courtesy of Flickr and jalexartis 

The Crystal Ball Approach to QA

At last week's ACCE Conference, I had a number of conversations with call center professionals who reported that their Quality Assessment (QA) programs were making a major shift towards the subjective. My collective understanding from all these conversations is that companies are getting tired and frustrated trying to behaviorally define their expectations in a workable form and are weary of haggling in calibration over every jot and tittle. So they threw away the form and asked supervisors, managers, and QA analysts to simply listen to the call and answer a few questions like:

  • "Did you exceed the customer's expectations?"
  • "Did you represent the brand?"
  • "Did you resolve the call?"
  • "Did the customer have a good experience?"

On the surface it appears simple and less cumbersome. On the back end, I'm afraid it is rife with obstacles. Here are a few concerns:

  • Outcome is analyst dependent. Depending on where the analyst sits on the continuum between "QA Nazi" (e.g. "QA is my opportunity to identify and weed out every single flaw you have and ensure you submit to my personally unattainable high standards in the interest of an altruistic goal of exceptional service through call center domination.") and "QA Hippie" (e.g. "QA is my opportunity to build self-esteem, spread a positive attitude, and avoid any conflict by giving you glowing remarks on every call and politely ignoring those pesky 'rough edges' which I'm sure you'll change all by yourself without me having to say anything and risk you going postal."), the simple, subjective approach to QA will create radically different feedback to CSRs across the call center.
  • Crystal ball required. Determining an individual customer's thoughts, ideas, and expectations on a single call requires a magic crystal ball or ESP. Unless you have a customer IVR survey tied to the specific call you're coaching (which some centers do), you don't know for certain what the customer thought (and even if you do have an IVR survey attached, an individual customer's feedback about the agent can be highly correlated with their overall experience with the product or company – which could lead to unreliable feedback). The bottom line is that, in most cases, the QA analyst is just making an educated guess filtered through their own bias. Whenever you start guessing about what the customer thought, your analysis has lost any reliable objectivity. And any performance management decision based on an analyst's subjective perception of the customer experience can create all sorts of HR problems.
  • You still go back to behavior. If a CSR gets poor marks on a call, he or she still wants to know "what do I specifically need to do in order to do a better job?" The analyst must still create a behavioral definition of what a "good job" is to them (i.e. "You need to be more polite by using 'please' and 'thank you.' You need to use the customer's name more often to make it more personable." etc.). However, now that behavioral feedback is analyst dependent. You have multiple analysts giving their personal prescriptions for what they consider a 'good job.' You haven't escaped the behavioral list. You just let each analyst create and control their own individual list. Now you have multiple people in the center applying their own personal definition of quality rather than driving the center to a unified understanding of quality and what is expected of CSRs.
  • You have poor data on which to make decisions. CSRs on the Inbound Order team are getting better QA marks this month, but why? What made the difference? Which behaviors did they modify to make the improvement, and what percentage of the time are they displaying those behaviors? How do you know the supervisor isn't just in a better mood now that his divorce is final? If the Training team wants to know which service skills need remedial training and focus, they can see how many times a CSR did not represent the brand, but what specifically the CSRs are doing or not doing is left to the analyst's best recall, is highly dependent on each analyst's definition of what represents the brand, and likely requires someone to go through every form and try to pull some kind of meaningful data from it. You may have simplified the front end of the process, but you have very little actionable data or information on the back end to benefit your team.

This isn't to say that there isn't a silver lining to simple, anecdotal feedback. There is a place for listening to a call and providing an initial reaction based on what was heard. The approach does provide feedback. It does give the CSR a general idea of where they stand and provides an opportunity for communication about service and quality. The subjective approach is, however, a poor substitute for a systematic, data-driven, objective analysis of what CSRs are and are not saying across a random set of calls.

Creative Commons photo courtesy of Flickr and Kraetzsche

Choosing to Focus on the Positive

It's all in how you see it.

This past weekend was spent conducting customer surveys for a retail client of ours. My wife and I stood by the checkout lines and asked purchasers and non-purchasers to take a customer satisfaction survey about their experience. This isn't normally a part of my job (not part of the 85% I love or the 15% I dread), but we found ourselves needing some extra help completing the project.

Though it made for three long days, it was a great experience. The vast majority of customers we spoke with made the task pleasant and enjoyable. I would say that 95 percent of the customers were friendly, kind and gave us valuable feedback. Then there were those five percent of customers who made the task more difficult. A few customers treated us like we had a communicable disease and were downright rude when we asked for their opinion. A few others took the survey but extended their bad attitude towards us as they did so.

It was a good reminder of working in the contact center. The vast majority of customers are pleasant, kind and respectful. It is always that small minority that makes the job more difficult. For some reason, our human nature tends to focus on the few difficult customers rather than the majority of pleasant experiences. It's like one of those optical illusion drawings where you can see different things depending on how you look at it.

If you're working the phones today, I'd encourage you to keep a note pad or sticky-note on your desk. Each time you have a pleasant, friendly customer – jot down a hash mark or write the customer's name on your sheet. When one of those negative customers comes along, look down at your sheet and refresh your memory of all the pleasant customers you've talked with.

Sometimes it takes a conscious effort to see the glass half full.

Creative Commons photo courtesy of Flickr and bertiemabootoo

Be Discerning with Post Call IVR Surveys


"After this call, you have the option of taking a brief survey about your experience."

More and more, consumers are being given opportunities to take surveys. They are on the receipt of almost every big box retailer and restaurant chain. Technology has made it increasingly easy for companies to take a survey of customers through their IVR system. In fact, when talking to contact center managers about doing a focused customer satisfaction survey, I will often hear "we're going to use the IVR to do that."

Be careful. While IVR surveys are a great way to gather certain types of data, you need to be discerning:

  • IVR surveys tend to have built-in response bias. People who "opt in" to IVR surveys generally fall into three categories: customers who had a really good experience and want to tell you about it, customers who had a really bad experience and want to tell you about it, or customers who like to take surveys. You may be missing a large, important, and silent segment of your customer population.
  • Depending on the system you use, CSRs can skew the response. If the survey is dependent on the CSR to ask, offer or transfer the customer, you'll likely get bias based on whether the CSR determined they wanted feedback from that particular customer.
  • Owning a nice spatula doesn't make you a chef. As with all surveys, the questions you ask can make all the difference on the quality of data you get out of it. Many companies will sell you the technology, but determining the questions to ask so you get the data you want and need requires a different expertise.
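The opt-in bias in the first bullet is easy to demonstrate with a small simulation. This sketch uses made-up satisfaction scores and opt-in rates purely for illustration:

```python
import random
import statistics

random.seed(42)

# Hypothetical population: satisfaction scores 1-5, most customers in the middle.
population = random.choices([1, 2, 3, 4, 5], weights=[5, 10, 40, 30, 15], k=10_000)

def opted_in(score):
    """Model opt-in bias: very unhappy (1) and delighted (5) customers
    are far more likely to stay on the line for the survey."""
    opt_in_rate = {1: 0.40, 2: 0.10, 3: 0.05, 4: 0.10, 5: 0.40}
    return random.random() < opt_in_rate[score]

respondents = [s for s in population if opted_in(s)]

print(f"True mean satisfaction: {statistics.mean(population):.2f}")
print(f"Opt-in survey mean:     {statistics.mean(respondents):.2f}")
print(f"Response rate:          {len(respondents) / len(population):.0%}")
```

Because the extremes opt in far more often than the silent middle, the survey mean drifts away from the true mean no matter how many responses you collect – more responses don't fix a biased sample.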

Please don't get me wrong. IVR surveys are a great way to gather data, but they may not give you the complete picture of your entire customer population. You may also find that you're not getting everything you want from the technology.

The Company’s Customer Sat Survey May Not Serve the Contact Center


Our group believes in designing and implementing customer-centered programs for measuring and improving service quality. When talking to contact center managers and executives about improvements to their quality process, we usually start by asking what they know about their customers. We are often told: "Our company already does an annual survey." The problem is that the broad customer sat surveys done at the corporate level (which are usually the ones referenced) are only useful for getting a, well, broad picture of the customer's satisfaction. They won't give you much detailed information about customers' expectations of, and satisfaction with, their contact center experience.

A short, focused survey of customers who have recently called the contact center can provide you with a plethora of actionable information. The data can feed into a quality scorecard, training initiatives, system enhancements, as well as prioritizing and justifying capital expenditures. For call center managers or a V.P. of Customer Service, a small investment in a focused contact center survey can yield tremendous return when it allows you to make tactical decisions you know will positively impact your customers' experience.

Your Customers Don’t Care About Industry Standards

In the past couple of weeks I’ve been reminded of a principle that few companies are willing to face:

Your customers may not care about the industry standard.

Take a client of ours who, based on industry standards, looked at the metrics off their phone switch and completely freaked out at their high rate of abandoned calls. The entire operation immediately began investing time, energy, financial resources and human resources to get that abandon rate down so that it was in line with industry standards.

A short time later, our group was asked to do a small, random, focused after-call survey of customers who had recently contacted the call center. The purpose of the survey was to determine key drivers of satisfaction so that they could focus their resources specifically in areas that mattered to their customers.

What the client discovered was that long queue times and not being able to get to an agent quickly was relatively low on their customers’ priority list. Their customers didn’t care that much if they couldn’t get through to a CSR quickly. What they really cared about was getting their issue resolved in a courteous, friendly, personable manner once they did get through.

Armed with this information, the client could stop wasting time, energy and resources chasing an industry standard metric about which their customers didn’t care. Those resources could then be reallocated to focus on improving resolution and soft skills that customers did care about greatly.

The result? Follow-up surveys revealed significantly increased customer satisfaction, despite abandon rates that continue to be well above the industry average.

Stop trying to keep up with the industry. Start figuring out what your customers expect. If you need help, send me an e-mail or give me a call (515.321.9788) and let’s talk about it. It’s what we do.

Creative Commons photo courtesy of Flickr and saz.

Competing on Price is a Sugar High

I recently read a great post by John Goodman over at The Retail Customer Experience in which he lays out Five Myths of Customer Service. It's a good, quick read and I particularly enjoyed his Myth #2: Price is the name of the game to expand share and profitability.

In over 15 years of measuring customer satisfaction and service inside client contact centers, I have learned that the easiest way to compete is with price – but it's not the most profitable way. Slashing prices is a sugar high. You get a quick infusion of business from those customers who scurry from supplier to supplier based on price. But, the same customers who came your way to get your low price will scurry right out your door when the competitor lowers their price. The crash comes just as quickly and may leave you lower than when you started.

What your competitor will have the greatest difficulty matching is a great customer service experience. Investing in the creation and sustenance of a service culture within your company builds loyalty in your customer base. Customers keep coming back, even if your prices are a little higher than the other guy's.

If you want to build long-term customer loyalty, learn to serve your customers well. Find out their expectations. Then build a service delivery system that will meet and exceed those expectations.

Creative Commons photo courtesy of Flickr and ktylerconk

Extra-Mile Service in a Cost Cutting Business Climate

Heidi Miller has been a kindred spirit in the blogosphere since I first started blogging over three years ago. She has a great post over at the Spoken Communications Blog asking how we're supposed to determine where to cut when customers want us to go the extra mile (I've always said that there are no traffic jams along the extra mile).

One of the reasons I've come to respect Heidi so much is that she gets it. She puts her own spin on the mantra I've been repeating for years: if you really want to make strategic decisions about what your customers want, start by asking them – and carefully listening.

FYI: When you're ready to listen to your customers, please feel free to contact me. Getting actionable data from listening to customers is our specialty!

Creative Commons photo courtesy of Flickr and Stitch