Category: Business Trends

The Social Media Buzz; Time for Decaf?

I was part of a great ACCP event last week sponsored by Avtex and hosted by Pella Corporation at their headquarters. Spindustry and their clients from Omaha Steaks gave a wonderful presentation on monitoring and responding to customers through social media. Then, this morning, the Wall Street Journal dedicated an entire section to the subject of Social Media and IT.

In case you’ve had your head buried in the sand for the past year or two, the buzz in the call center world is currently “social media.” The very mention of the term seems to get call center personnel wound up like they’ve just swigged a triple-shot-chocolate-sugar-bomb-espressiato with a Red Bull chaser. Everyone wants to talk about it. The big call center conferences have been scrambling for the past two years to fill their keynotes and workshops full of social media gurus, how-tos, and software vendors. All the buzz has prompted great conversation with clients and colleagues.

For years, I’ve been advocating that every client listen to what customers are saying on the internet and through social media outlets. There is a huge leap, however, between keeping your ears open and diving into a full-scale social media task force within your customer service team, complete with the latest, greatest social media monitoring software. One of the questions that came up in the ACCP meeting last week was whether our group was doing Customer Satisfaction research for customers who use social media to contact a client company. The reality is that, for most of our clients, the number of customers using social media as a means of communication is still very small. So small, in fact, that they must be regarded as outliers and not representative of the general population of customers.

That does not mean that social media will not grow in importance and influence. It definitely is growing in importance and influence (but how far will it grow? How influential will it become?). Nor does it mean that social media is not a critical piece of the marketing and customer service picture for some companies. I simply want to make the point that the time, energy and resources an individual company invests in social media efforts should be commensurate with how many of its customers are actively engaged in the medium. Our group is helping some clients determine that very fact. Investing a little money in a survey to find out how important social media is to their customer population as a whole helps them wisely steward their resources when making an investment in their overall social media strategy. I am beginning to fear that clients will throw a lot of money and resources at engaging a small number of customers in the social media arena while a much larger segment of customers is still encountering significant service issues through traditional channels (as boring and old school as those traditional channels may be).

In the meantime, I’m sure the social media buzz will continue unabated. In the call center industry there always seems to be a buzz wherever there is software, hardware and/or workshops to sell. Please do not misunderstand me. I’m not against social media in any way. I’m a blogger, tweeter, texter and Facebook junkie. I think social media is great, and I have led the charge in getting clients to “listen” to what customers are saying via social media. Social media is here to stay and will continue to evolve. I am, however, dedicated to helping my clients make wise, measured decisions when it comes to their customers and their resources. So, when it comes to the social media buzz, make mine decaf, please. Remember, there was a lot of buzz about Betamax, too.

Creative Commons photo courtesy of Flickr and thetrial

Self-Serve Success Means Changing Metrics

Just read a great post over at Customer Experience Crossroads reminding us all that the boom in customer self-serve options means that a greater percentage of the calls which do get through to live agents are the more complex ones.

This is a crucial thing to remember as call center managers, supervisors and QA analysts monitor and set metrics such as Average Call Time (ACT) or Average Handle Time (AHT). If you see your ACT and AHT numbers creeping up, do a little homework before you bring Thor’s Hammer ringing down on your beleaguered Customer Service Representatives (CSRs).

  • Check your self-serve channels to monitor usage and, if possible, how customers are using them. We have a client whose self-serve IVR usage has gone through the roof, with customers accessing basic account information that used to mean calls of 60-90 seconds in length. Offloading these “short” calls means the average call getting through to a CSR is going to be longer.
  • Use your QA process to track customer and call type, or do a quick statistical survey of calls to get good data. By tracking the reason customers are calling, you can link it to average call time by call type and find out which calls drive the highest ACT/AHT (see the sketch below this list). Those become the targets for improvement. (Warning: call data driven by CSR self-reporting is usually worthless. CSRs are notorious for not coding calls or choosing the wrong codes. Don’t waste their time or yours.)
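
For those who want to try it, here is a minimal sketch of that call-type analysis in Python, assuming you can export call records with a tracked call type and a handle time in seconds; the column names and figures are hypothetical:

```python
# Minimal sketch: average handle time by call type from an export of
# QA-tracked call records. Column names and figures are hypothetical.
import pandas as pd

calls = pd.DataFrame({
    "call_type":   ["account balance", "billing dispute", "order status",
                    "billing dispute", "tech support", "order status"],
    "handle_secs": [75, 540, 120, 610, 480, 95],
})

# Mean handle time per call type, longest first: the top rows are the
# call types driving your ACT/AHT and the first targets for improvement.
aht = (calls.groupby("call_type")["handle_secs"]
            .agg(calls="count", avg_handle_secs="mean")
            .sort_values("avg_handle_secs", ascending=False))
print(aht)
```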

QA Today: The Human Element

A growing number of companies are scrapping internal call monitoring programs and Quality Assessment initiatives. One noticeable trend is the shift toward after-call satisfaction surveys to replace traditional call monitoring. In most cases, the customer is asked to rate their satisfaction with the agent and/or the resolution of the issue. In some cases, customers can leave comments for the Customer Service Representative (CSR). I’ve heard of some companies who use the satisfaction ratings from these post-call surveys as their only service quality metric.

From a management perspective, this tactic has all sorts of budgetary, productivity and managerial upside. In effect, you automate the process. Let the IVR handle the survey, let your software spit out a report that gets emailed to the CSR and supervisor. If customers are happy, then the company is happy. You only deal with CSRs who get consistently poor ratings.

Sounds like a dream. So, what’s the problem?

  • Bias. Post-call IVR surveys are rife with all sorts of response bias. You’re not getting an objective, random sample of customer feedback. You’re typically getting feedback from customers who are really happy, really unhappy, or who like entering the survey sweepstakes.
  • You get what you measure. If a CSR knows that they simply have to get good ratings then they will make sure they get good ratings. This might include giving away the company store to ensure blissful customers or badgering customers to give them good ratings (e.g. “I might lose my job if I get bad ratings.”). You might never know this, however, because you’re not listening to the calls.
  • No actionability. One of the most critical pieces you miss when relying on customer sat as your lone QA metric is actionability. So, customers aren’t satisfied with a particular agent. Typically, there’s no objective data to help that CSR know what he/she is doing that dissatisfies customers. You might pick up a few ideas from anecdotal messages customers leave, but it’s certainly not an objective measurement. You could coach your CSR to focus on a particular behavior based on one or two irate customers who leave a post-call tirade, but completely miss some critical service skills that the CSR needs to address to consistently improve the customer experience.

In an era in which technology is touted as a cure for every business problem, it’s easy to want to flip a switch and have QA reports automatically generated and sent out. However, great Customer Service is still largely a human enterprise conducted by human beings with a wide range of education levels, skills, experience and personalities. The best way to address human behaviors is with human assessment and human interaction. It may be messy at times, but it can be done efficiently, done successfully, and done well.

QA Today: Pondering Some Foundational Thoughts

This is the first part of a series of posts regarding the state of Quality Assessment (QA) in the Call Center or Contact Centre.

I’ve been on a sabbatical of sorts for a few months. My apologies to those who’ve missed my posts and have emailed me to see if I’m okay. We all need a break from time to time, and after almost four years I gave myself a little break from posting. While on sabbatical, I’ve been watching the trends in the call center industry and, in particular, what others have been saying about Quality Assessment (QA). I’m finding a sudden anti-QA sentiment in the industry. One client mentioned that the call center conference she recently attended had no sessions or workshops about QA. I then had an article sent to me by a client that bemoaned the failure of QA and called for QA to be “modernized.” At the same time, I’m hearing about companies who are shutting down their QA operations and turning to after-call surveys and customer satisfaction metrics to measure agent performance.

I’ve been in this industry for almost twenty years. And I’d like to take a few posts to offer my two cents worth in the discussion, though more and more I’m feeling like a voice crying in the wilderness. First, I’d like to make a couple of general observations as a foundation for what I’m going to share in subsequent posts.

  • QA is a relatively new discipline. It has only been in the past 15-20 years that technology has allowed corporations to easily record interactions between their customers and their agents. In even more recent years, the profusion of VoIP technology in the small to mid-sized telephony markets has proliferated that ability into almost every corner of the marketplace. Suddenly, companies have this really cool ability to record calls and have no idea what to do with it. Imagine handing an Apple iPhone to Albert Einstein. Even the most intelligent man is going to struggle to quickly and effectively use the device when he has no experience or frame of reference for how it might help him. “It can’t be that hard,” I can hear the V.P. of Customer Service say. “Figure out what we want them to say and see if they say it.” The result was a mess. Now, I hear people saying that QA is a huge failure. This concerns me. I’m afraid a lot of companies are going to throw the QA baby out with the bathwater of trending industry tweets rather than investing in how to make QA work effectively for them.
  • We want technology to save us. We are all in love with technology. We look to technology to help us do more with less, save us time, and make our lives easier. We like things automated. We have the ability to monitor calls and assess agents because technology made it possible. Now I’m hearing cries from those who’d like technology to assess the calls for us, provide feedback for us and save us from the discomforts of having to actually deal with front-line agents. This concerns me as well. If there’s one thing I’ve learned in my career it’s this: Wherever there is a buck to be made in the contact center industry you’ll find software and hardware vendors with huge sales budgets, slick sales teams, and meager back end fulfillment. They will promise you utopia, take you for a huge capital investment, then string you along because you’ve got so much skin in the game. Sometimes, the answer isn’t more, better or new technology. Sometimes the answer is figuring out how to do the right thing with what you’ve got.
  • The industry is often given to fads and comparisons. Don’t get me wrong. There’s a lot of great stuff out there. We all have things to learn. Nevertheless, I’m fascinated when I watch the latest buzz word, bestseller and business fad rocket through the industry like gossip through a junior high school. Suddenly, we’re all concerned about our Net Promoter Scores, and I’ll grant you that there’s value to tracking how willing your customers are to recommend your business to others. Still, when your NPS heads south it’s going to take some work to figure out what’s changed in your service delivery system. If you want to drive your NPS up you have some work ahead of you to figure out what your customers expect and then get your team delivering at or above expectation. And, speaking of junior high, I also wonder how much of the felt QA struggle is because we spend too much time comparing ourselves to everyone else rather than doing the best thing for ourselves and our customers. I’ve known companies who ended up with mediocre QA scorecards because they insisted on fashioning their standards after the “best practices” of 20 other mediocre scorecards from companies who had little in common with theirs.

Know that when I point a finger here, I see three fingers pointing back at me. We’re all human, and I can see examples in my own past when I’ve been as guilty as the next QA analyst. Nevertheless, I’m concerned that the next fad will be for companies to do away with QA. I know that there is plenty of gold to mine in an effective QA process for those companies willing to develop the discipline to do it well.

Creative Commons photo courtesy of Flickr and striatic

FREE ACCP Meeting for Central Iowa Companies & Contact Centers

For readers and subscribers in the central Iowa area, please consider attending the ACCP event on April 21, sponsored by Avtex. The half-day event is FREE, and the group will be touring the Principal Financial Group contact center in Des Moines.

AVTEX IS SPONSORING THE ACCP MEETING - April 21, 2010


ACCP SPRING MEETING

We hope you can join us! It's a great chance to network with your industry peers, discuss and share ideas!


MEETING DATE: Wednesday, April 21, 2010

REGISTRATION: 8:00AM – 8:30AM CST (REGISTER NOW!)
MEETING TIME: 8:30AM – NOON
LOCATION: Principal Financial Group
6200 Park Avenue
Des Moines, IA 50321

Don’t miss this rare opportunity to tour Principal Financial Group's Contact Center.

Founded in 1879, the Principal Financial Group (The Principal) is a leader in offering businesses, individuals and institutional clients a wide range of financial products and services, including retirement and investment services, life and health insurance and banking through its diverse family of financial services companies.

As a 401(k) leader and a member of the FORTUNE 500, the Principal Financial Group has $280.4 billion in assets under management and serves some 18.6 million customers worldwide from offices in 12 countries throughout Asia, Australia, Europe, Latin America and the United States. The Principal has 14,900 employees nationwide.



MEETING AGENDA:

Registration, Network and Breakfast

Welcome and Introductions

Principal Financial Overview & Presentation:
    – Kim Post, Manager Client Contact Center
    – Rachel Torres, Manager Client Contact Center
    – Q & A Session

Tour Principal Financial
    – Chris Lynch, Manager Client Contact Center

Small Group Networking/Breakout Discussions
    – Metrics – how they are used and how they are calculated
    – Rewards on metrics
    – Economy – what changes were made due to the economy
    – System technologies
    – VoIP
    – Workforce management/staffing/flexible work time

Wrap-Up / Closing

* * Seating is limited * *
CLICK HERE to register today!

EVENT SPONSORS:


www.AVTEX.com


About ACCP:
The Association of Contact Center Professionals (ACCP) is a non-profit networking group of contact center professionals. ACCP consists of Contact Center Executives, Managers and Supervisors from various companies across central Iowa who meet to network and share their experiences on various topics relevant to today's contact center industry.

ACCP meetings are FREE to attend!



TO REGISTER NOW, CLICK HERE!


Please forward this email to other colleagues on your team who may also have an interest in attending this meeting. If you need any assistance with your registration, please email us at avtexmarketing@avtex.com.

www.AVTEX.com 800.323.3639

Why Monitor Your Company’s Phone Calls?

While it seems that everyone is monitoring phone calls these days, and it is certainly the norm in the call center industry, the reality is that many small to mid-sized companies have not entered the world of call monitoring. Some companies are unaware that the technology exists and is easily accessible even to companies with just a few people serving customers on the phone. For others, the idea of call monitoring and Quality Assessment (QA) seems daunting. The thought of recording and assessing phone calls brings to mind several uncomfortable questions, and many executives and managers are overwhelmed by the prospect of figuring out what to do with the recordings and how to make them work for the business. Others prefer to remain blissfully unaware.

Nevertheless, for a business of any size, there is value in call monitoring. When it’s done well, the recording and assessment of customer interactions provides:

  • Valuable Knowledge. Monitoring and analyzing calls between your business and your customers is far more than playing Big Brother and grading the performance of your agents. Within those recorded conversations is a plethora of valuable information. From monitoring calls you find out why your customers are calling, what problems your customers are commonly experiencing with products and services, what customers are saying about your business, and who your customers are. You discover clear opportunities to improve efficiency, productivity, and your customers’ experience.
  • Accountability. Call monitoring also provides you and your employees with accountability, ensuring that your brand is being consistently communicated and your people are performing to their potential. Monitoring calls and performance allows you to reward those you know are contributing to your success and address those who are impeding it. Without call monitoring, you’re blind to the hundreds or thousands of “moments of truth” that impact your customers’ satisfaction and future purchase intent on a daily basis.
  • Tactical Improvement. When our group performs employee satisfaction surveys for our clients, we find employees consistently desiring more communication and feedback from their superiors. The vast majority of employees want to know how they are doing and how they can improve. Call monitoring provides a company with the means to make that communication and feedback happen. A successful Quality Assessment (QA) process gives employees specific, behavioral goals for improvement, tracks their progress, and gives managers the data they need for productive performance management discussions.

There has never been a greater opportunity for businesses of every shape and size to benefit from the available technology to record, monitor and analyze conversations between your company and your customers. Companies who take advantage of the resulting data and information will find themselves a step ahead of others who continue to trust their gut.

Who QA’s the QA Team?

It’s a classic dilemma. The Quality Assessment (QA) team, whether it’s a supervisor or a separate QA analyst, evaluates calls and coaches Customer Service Reps (CSRs). But how do you know that they are doing a good job with their evaluations and their coaching? Who QA’s the QA team?

The question is a good one, and here are a couple of options to consider:

  • QA data analysis. At the very least, you should be compiling the data from each supervisor or QA analyst. With a little up-front time spent setting up some tracking in a spreadsheet program, you can, over time, quantify how your QA analysts score (see the sketch after this list). How do the individual analysts compare to the average of the whole? Who typically scores high? Who is the strictest? Which elements does this supervisor score more strictly than the rest of the group? The simple tracking of data can tell you a lot about your team and give you the tools you need to help manage them.
  • CSR survey. I hear a lot of people throw this out as an option. While a periodic survey of CSRs to get their take on each QA coach or supervisor can provide insight, you want to be careful how you set this up. If the CSR is going to evaluate the coach after every coaching session, then it puts the coach in an awkward position. You may be creating a scenario in which the coach is more concerned with how the CSR will evaluate him/her than providing an objective analysis. If you’re going to poll your CSR ranks, do so only on a periodic basis. Don’t let them or the coaches know when you’re going to do it. Consider carefully the questions you ask and make sure they will give you useful feedback data.
  • Third-party Assessment. Our team regularly provides a periodic, objective assessment of a call center’s service quality. By having an independent assessment, you can reality test and validate that your own internal process is on-target. You can also get specific, tactical ideas for improving your own internal scorecard.
  • QA Audit. Another way to periodically get a report card on the QA team is through an audit. My team regularly provides this service for clients, as well. Internal audits can be done, though you want to be careful of any internal bias. In an audit, you have a third party evaluate a valid sample of calls that have already been assessed by the supervisor or coach. The auditor becomes the benchmark, and you see where there are deviations in the way analysts evaluate the call. In one recent audit, we found that one particular member of the QA team was more consistent than any other member of the QA and supervisory staff. Nevertheless, there was one element of the scorecard that this QA analyst never scored down (while the element was missed on an average of 20% of phone calls). Just discovering this one “blind spot” helped an already great analyst improve his accuracy and objectivity.
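
To make the first bullet concrete, here is a minimal sketch of that spreadsheet-style tracking in Python; the analyst names, scorecard elements, and scores are all hypothetical:

```python
# Minimal sketch of QA data analysis: compare each analyst's average score
# per scorecard element against the team average. All data is hypothetical.
import pandas as pd

scores = pd.DataFrame({
    "analyst": ["Ann", "Ann", "Bob", "Bob", "Cam", "Cam"],
    "element": ["greeting", "resolution"] * 3,
    "score":   [4.8, 3.9, 4.1, 3.2, 4.9, 4.8],
})

# Average score per analyst per element, and the team average per element.
per_analyst = scores.pivot_table(index="element", columns="analyst",
                                 values="score", aggfunc="mean")
team_mean = scores.groupby("element")["score"].mean()

# Deviation from the team mean: consistently negative columns flag your
# strictest analysts, consistently positive ones your most lenient.
print(per_analyst.sub(team_mean, axis=0).round(2))
```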

Any valid attempt you make to track and evaluate the quality of your call analysis helps the entire process. Establishing a method for validating the consistency of your QA team will bring credibility to the process, help silence internal critics and establish a model of continuous improvement.

If you think our team may be of service in helping you with an objective assessment or audit, please drop me an e-mail. I’d love to discuss it with you.

The Call Center as Social Media Outpost

Customers talk about you on Twitter. At ICMI's ACCE 09 conference last month, the buzz was around expanding the call center to become a social media outpost. It is rapidly becoming clear that interacting with customers no longer happens only through phone calls; it must also happen through emerging communication channels like Facebook and Twitter.

I recently had an article come across my desk from Keith Fiveson of ITESA. He agrees:

Agents can outreach and act as a “social media outpost” casting their net to capture conversations, hear, and deal with hearts, minds, problems and people that impact your business products or services. Problems are inherent, in any business and it is essential that you are diligent in addressing and resolving them. Using a contact center as a “Social Media Outpost” is a good strategy to address concerns, bad press or consumer affairs issues that can plague the best brand management strategy.

Here's the entire article: Download The New Frontier Your Call Center as a Social Media Outpost

Are you preparing your call center for the new frontier of customer communication?

The Crystal Ball Approach to QA

Gaze into my crystal ball. At last week's ACCE Conference, I had a number of conversations with call center professionals who reported that their Quality Assessment (QA) programs were making a major shift toward the subjective. My collective understanding from all these conversations is that companies are getting tired and frustrated trying to behaviorally define their expectations in a workable form and are weary of haggling in calibration over every jot and tittle. So, they threw away the form and asked supervisors, managers and QA analysts to simply listen to the call and answer a few questions like:

  • "Did you exceed the customer's expectations?"
  • "Did you represent the brand?"
  • "Did you resolve the call?"
  • "Did the customer have a good experience?"

On the surface it appears simple and less cumbersome. On the back end, I'm afraid it is rife with obstacles. Here are a few concerns:

  • Outcome is analyst dependent. Depending on where the analyst sits on the continuum between "QA Nazi" (e.g. "QA is my opportunity to identify and weed out every single flaw you have and ensure you submit to my personally unattainable high standards in the interest of an altruistic goal of exceptional service through call center domination.") and "QA Hippie" (e.g. "QA is my opportunity to build self-esteem, spread a positive attitude, and avoid any conflict by giving you glowing remarks on every call and politely ignoring those pesky 'rough edges' which I'm sure you'll change all by yourself without me having to say anything and risk you going postal."), the simple, subjective approach to QA will create radically different feedback to CSRs across the call center.
  • Crystal ball required. Trying to determine an individual customer's thoughts, ideas, and expectations of a single call requires a magic crystal ball or ESP. Unless you have a specific customer IVR survey tied to the specific call you're coaching (which some centers do), you don't know for certain what the customer thought (and even if you do have an IVR survey attached, an individual customer's feedback on the agent can be highly correlated to their overall experience with the product or company – which could lead to unreliable feedback). The bottom line is that, in most cases, the QA analyst is just making an educated guess filtered through their own bias. Whenever you start guessing about what the customer thought, your analysis has lost any reliable objectivity. Any performance management decision based on an analyst's subjective perception of the customer experience can create all sorts of HR problems.
  • You still go back to behavior. If a CSR gets poor marks on a call, he or she still wants to know "what do I specifically need to do in order to do a better job?" The analyst must still create a behavioral definition of what a "good job" is to them (i.e. "You need to be more polite by using 'please' and 'thank you.' You need to use the customer's name more often to make it more personable." etc.). However, now that behavioral feedback is analyst dependent. You have multiple analysts giving their personal prescriptions for what they consider a 'good job.' You haven't escaped the behavioral list. You just let each analyst create and control their own individual list. Now you have multiple people in the center applying their own personal definition of quality rather than driving the center to a unified understanding of quality and what is expected of CSRs.
  • You have poor data on which to make decisions. CSRs on the Inbound Order team are getting better QA marks this month, but why? What made the difference? Which behaviors did they modify to make the improvement, and what percentage of the time are they displaying those behaviors? How do you know the supervisor isn't just in a better mood now that his divorce is final? If the Training team wants to know which service skills need remedial training and focus, they can see how many times a CSR did not represent the brand, but what specifically the CSRs are doing or not doing is left to the analyst's best recall, is highly dependent on each analyst's definition of what represents the brand, and likely requires someone to go through every form and try to pull some kind of meaningful data from it. You may have simplified the front end of the process, but you have very little actionable data or information on the back end to benefit your team (the sketch below shows the kind of behavioral data you give up).
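
By contrast, when the scorecard records specific yes/no behaviors per monitored call, the "what specifically" question answers itself. A toy sketch, with entirely hypothetical behaviors and data:

```python
# Toy sketch of why behavioral scorecards yield actionable data: when
# yes/no behavioral elements are recorded per monitored call, the
# percentage of calls displaying each behavior falls straight out of
# the data. Behaviors, names, and values are hypothetical.
import pandas as pd

monitored = pd.DataFrame({
    "csr":           ["Lee", "Lee", "Lee", "Pat", "Pat", "Pat"],
    "used_name":     [1, 0, 1, 0, 0, 1],   # 1 = behavior observed on call
    "offered_help":  [1, 1, 1, 1, 0, 1],
    "verified_info": [0, 0, 1, 1, 1, 1],
})

# Percent of monitored calls on which each CSR displayed each behavior.
print(monitored.groupby("csr").mean().mul(100).round(0))
```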

This isn't to say that there isn't a silver lining to simple, anecdotal feedback. There is a place for listening to a call and providing an initial reaction based on what was heard. The approach does provide feedback. It does give the CSR a general idea of where they stand and provides an opportunity for communication about service and quality. The subjective approach is, however, a poor substitute for a systematic, data-driven, objective analysis of what CSRs are and are not saying across a random set of calls.

Creative Commons photo courtesy of Flickr and Kraetzsche

Be Discerning with Post Call IVR Surveys

"After this call, you have the option of taking a brief survey about your experience."

More and more, consumers are being given opportunities to take surveys. They are on the receipt of almost every big box retailer and restaurant chain. Technology has made it increasingly easy for companies to take a survey of customers through their IVR system. In fact, when talking to contact center managers about doing a focused customer satisfaction survey, I will often hear "we're going to use the IVR to do that."

Be careful. While IVR surveys are a great way to gather certain types of data, you need to be discerning:

  • IVR surveys tend to have built-in response bias. People who "opt in" to IVR surveys generally fall into three categories: customers who had a really good experience and want to tell you about it, customers who had a really bad experience and want to tell you about it, or customers who like to take surveys (a toy simulation after this list illustrates the skew). You may be missing a large, important, and silent segment of your customer population.
  • Depending on the system you use, CSRs can skew the response. If the survey is dependent on the CSR to ask, offer or transfer the customer, you'll likely get bias based on whether the CSR determined they wanted feedback from that particular customer.
  • Owning a nice spatula doesn't make you a chef. As with all surveys, the questions you ask can make all the difference in the quality of data you get out. Many companies will sell you the technology, but determining the questions to ask so you get the data you want and need requires a different expertise.
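
To see how strongly opt-in behavior can skew what you hear, here is a toy simulation of the three-category pattern described in the first bullet; the opt-in rates are invented for illustration:

```python
# Toy simulation of opt-in response bias in a post-call IVR survey.
# Assumes very happy and very unhappy customers opt in far more often
# than everyone in between. All rates are invented.
import random

random.seed(1)
population = [random.randint(1, 5) for _ in range(10_000)]  # true sat scores

OPT_IN = {1: 0.30, 2: 0.04, 3: 0.03, 4: 0.04, 5: 0.30}  # hypothetical rates
respondents = [s for s in population if random.random() < OPT_IN[s]]

def share_extreme(xs):
    """Share of ratings that are a 1 or a 5."""
    return sum(s in (1, 5) for s in xs) / len(xs)

print(f"1s and 5s among all customers:      {share_extreme(population):.0%}")
print(f"1s and 5s among survey respondents: {share_extreme(respondents):.0%}")
```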

Please don't get me wrong. IVR surveys are a great way to gather data, but they may not give you the complete picture of your entire customer population. You may also find that you're not getting everything you want from the technology.