QAQnA is Getting a New Identity & New Look

My regular readers and subscribers have likely noticed a marked downturn in posts in recent months, but it's not because I've been idle! For several months I've been working on an exciting project to move QAQnA to a new domain with a fresh new look.

One of the consistent pieces of feedback I've received over the four years QAQnA has been in existence is that the name of the blog is cryptic and difficult to remember. So, with the upcoming launch, QAQnA will become Service Quality Central (SQC), your hub for information on customer service, quality assessment, and contact center-related issues.

Same great content - easier name to remember!


Don't Let Your Communication Skills Sink Your Ship

It's Friday and we could all use a little chuckle. Thanks to Matthew and Conversations with Life for sending this my way. It's a great reminder for all of us in the contact center industry. When you serve others over the phone you don't have body language, facial expressions, or other non-verbal cues to help you communicate. On the phone, what you say and how you say it is critical. Having capable language and communication skills can mean the difference between swimming or sinking in the mind of your customer (pun intended).

Should CSRs Perform Their Own QA Assessment?

Our good friend at Call Centre Helper recently responded to this series of posts on who should do the Quality Assessment (QA) in the contact center, and suggested we've missed two alternatives: CSR self-assessment and technology-based speech analytics. I think both of these options deserve consideration.

Let's start with a post about CSR self-assessment. Many call centers allow or require their Customer Service Representatives (CSRs) to listen to and assess their own calls. It can be a great training tool:

  • Individuals can listen without the pressure of feeling someone else's judgment. In call coaching situations, some CSRs are so nervous about having someone listening to their calls or judging their performance that they tend to miss the point of the process. By listening alone to their calls, a CSR can sometimes focus in on what took place in the call without these interpersonal distractions.
  • We tend to be our own worst critics. Individuals will regularly hear things that others don't. It is quite common in coaching sessions for CSRs to point out things they could have improved that didn't even occur to me. By having CSRs critique themselves, they may listen more critically than even an objective analyst, and that can be a huge motivator for some CSRs.
  • Having the CSR go through and assess the call using the QA scorecard engages them with the process and forces them to consider the behavioral standards. Many QA programs create contention simply because CSRs do not understand the criteria with which their conversations are analyzed, and don't understand how the process works. When a CSR sits down with the scorecard and analyzes their own calls, it forces them to think through how they performed on each behavioral element.

You'll notice I wrote that self-assessment is a great training tool. I don't believe that self-assessment is a great way to approach your QA program if you want to get a reliable, objective assessment of what took place on the phone. Self-assessment has its drawbacks:

  • Having people grade themselves is inherently biased. If you want a reliable and statistically valid measurement of what's happening on the phone in your call center, you need someone other than the person who took the call to analyze the call.
  • Depending on their personality and attitude, CSRs tend to be either overly critical ("It was AWFUL. I sound TERRIBLE!") or not critical enough ("That was PERFECT. I heard nothing wrong with that call."). Sometimes CSRs get highly self-critical about a minute issue that makes little difference to the customer experience while missing larger behavioral elements that would impact the customer. Even with self-assessment, CSRs often need help interpreting what they are hearing.
  • Because individuals are so focused on their voice and their own performance, they tend to be blind to the larger policy or procedural issues that can be mined from QA calls by a more objective analyst who is trained to look at the bigger picture.

Self-assessment has its place as part of the quality process, but our experience tells us that its strength lies in the training end of the program. If your QA program requires meaningful and objective data, then a more objective analyst is required.

A Hybrid Approach to QA

Mixing strengths. Many companies have discovered that having just the supervisors analyzing calls and providing coaching does not have the desired effect. Likewise, those who put all their QA eggs in the basket of an internal QA analyst/team or a 3rd party provider find themselves wanting greater impact.

That's why many will create a hybrid program to leverage the strengths of each approach. Here are three common hybrid approaches our team sees and recommends in the right circumstances:

  • Internal/External. In this approach, the company uses internal analysts and coaches on a day-to-day basis, but utilizes an external third party to provide periodic, comprehensive assessment. Our team will often measure a significantly greater number of behaviors on a periodic basis as a reality check, but then help the client's internal team make sure they are focused on the proper, albeit smaller, list of crucial behaviors. This hybrid can save time, avoid wasted resources and provide a continual source of objective feedback that can help you make strategic improvements.
  • Supervisor/QA. The internal hybrid approach also divides duties to maximize efforts. The QA team, who is tasked with focusing on service quality, provides an on-going, detailed analysis of calls. The supervisors, who have limited time and resources to do call/data analysis, continue to monitor/coach calls, but listen for a short list of inconsistent behaviors unearthed by the Quality Team's detailed assessment.
  • Analyst/Coach. Some teams get mired in the belief that they must coach the CSR on every call analyzed. Others analyze a ton of calls and do nothing with them. Find the right point of tension between analysis and coaching. As long as you have a statistically valid and manageable scale, you can have an individual or team who analyzes a bunch of calls to provide you with good data, while another person who is trained in coaching takes the data and a few call examples to provide the feedback.
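For a rough sense of what "statistically valid" means in practice, the standard sample-size formula for estimating a proportion can ballpark how many calls you need to analyze per period. This is a generic statistical sketch, not a prescription from any particular QA methodology; the confidence level and margin of error shown are assumptions you would tune to your own program.

```python
import math

def qa_sample_size(confidence_z=1.96, margin=0.05, p=0.5):
    """Minimum number of calls to sample so a behavior's compliance
    rate is estimated within +/-margin at the given confidence level.
    p=0.5 is the most conservative (largest-sample) assumption."""
    return math.ceil((confidence_z ** 2 * p * (1 - p)) / margin ** 2)

# 95% confidence (z = 1.96), +/-5% margin of error
print(qa_sample_size())            # 385 calls
# Loosening the margin to +/-10% shrinks the workload considerably
print(qa_sample_size(margin=0.10)) # 97 calls
```

Note how quickly the required sample grows as the margin tightens; that trade-off is often what pushes teams toward splitting the analysis and coaching roles.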

The key to any hybrid approach is to define each role clearly, so that the strengths of one part of the program cover the weaknesses of the other.

Creative Commons photo courtesy of Flickr and curious gregor

Does a 3rd Party QA Provider Make Sense?

Third Party QA Solution. Just as there are pros and cons to utilizing front line supervisors or a dedicated internal Quality Assessment (QA) analyst/team, there are also pros and cons to considering an independent third party. It may not make sense for some companies, while for others it is a great fit.

In the interest of full disclosure, I must tell you that our group provides third-party QA solutions. For some clients, we are their entire QA program from call capture & analysis to the one-on-one call coaching. For some we have established the QA program and then trained an internal team to take it over as we help them develop the necessary skills and discipline. For others, we simply provide an objective outside assessment in an effort to audit and improve their existing internal program. It is, however, a foundational principle of our group that we will not do a project if we do not believe we can bring measurable value to the client. So, I recognize that it works for some and not for others.

Having said that, here are a few reasons it makes sense to consider a third party:

  • Established experience & expertise means that you aren't wasting precious time, energy and resources trying to develop a program from scratch. In addition, a third party brings experience from other client call centers, which helps you avoid common pitfalls and introduces innovations your internal team may not know about without sending them to expensive conferences. As one client told us, "I know our products. I know how to sell them. I know how to service them. But you guys know our customers (we do regular customer sat surveys for them) and you know how to measure quality. I'd rather pay my people to take care of customers and pay you to make sure we're doing it well." In addition, I commonly find that our team can analyze calls better, faster, and provide a greater depth of actionable data than internal teams. It comes from 20 years' experience analyzing tens of thousands of phone calls.
  • Objectivity. A third-party analyst doesn't have the potential bias you find internally, especially with supervisors whose call analysis can be colored by personality or other performance issues. Even internal QA analysts can find it tempting to listen with an internal perspective. A third party QA provider can generally provide you with more objective, customer-centric feedback than you'll get internally.
  • Accountability. One client put it to me bluntly: "The reason I have your team do our QA is because it gives me the luxury of picking up the phone and firing you at any given moment. If I hire a team internally it will cost me far more in the long run. I'll expend far more resources in FTEs as they try to figure out how to do it well. They probably won't do it as well in the end. And then I'm stuck with them. If I'm disappointed with your team, I just pick up the phone and tell you I'm done with you." The fact is, if our team doesn't perform well, we won't be around long. I know a lot of V.P.s of customer service who feel stuck with a broken, inefficient quality process that they wish they could scrap.

That doesn't mean a third party is always the best option. There are challenges inherent to a third party approach:

  • Limited product/procedural knowledge. A third party QA provider will rarely, if ever, have intimate knowledge of your internal products, services, systems and procedures. While a third party can give you a clear picture of the customer's perspective (customers lack this internal knowledge, too), the third party analyst will not always be able to catch the correctness of the CSR's answer or know all of the possible options the CSR had available to resolve the caller's issue the way a front line supervisor will. A good third party provider will learn as much as possible in order to give the best possible feedback, but will almost never be as accurate as an internal analyst in measuring the correctness of answers given.
  • Control. A third party provider of QA is a hired vendor, and does not give managers or senior managers the degree of oversight and control that comes with an internal team. My experience is that some companies and executives require a high degree of control over the entire quality process. A good third party provider will be responsive, communicative and flexible, but will never offer the access or control a manager will have with direct reports.
  • Cost. Getting a worthwhile third party QA provider will cost money that most call center managers do not have easily accessible in their budgets. I know that our team has consistently provided clients with better data while analyzing fewer calls and costing less money, time and energy than would have been expended with an internal team. Nevertheless, the number one reason most call centers do not consider a third party solution is financial. If you want, or are required, to analyze a tremendous number of phone calls, the cost of a third party solution easily becomes out of reach for most companies.

In conclusion, I've found that our services as a third party QA provider have consistently worked well in a handful of common situations:

  • Small to Mid-Size Contact Centers. Companies with 1-50 seats will often struggle to justify the expense of a dedicated internal QA program. A 3rd party QA provider can often deliver effective, cost-efficient solutions that fit the budget.
  • QA Start Up. If you're setting up a program from scratch, you can often save yourself a lot of time and headaches by having a 3rd party assist in establishing and implementing the program and training your internal team of analysts and coaches.
  • Multiple Division/Site Situations. Larger corporations often find themselves with multiple call centers in different locations who each have a quality program. It's common to find that each location measures quality differently. We have been able to help companies in this scenario by providing an objective assessment across multiple centers on a common scorecard so that the executive team can get an "apples-to-apples" perspective of how each contact center is doing in service quality. We have also assisted with the development and implementation of a "common scorecard" for multiple contact center situations.
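To make the "apples-to-apples" idea concrete, here is a minimal sketch of how common-scorecard results might be rolled up across sites so an executive team can compare them on the same behaviors. The site names and scores below are purely hypothetical, illustrative numbers only.

```python
# Percent compliance per site on a shared set of scorecard behaviors.
# Because every site is measured on the same behaviors, the averages
# are directly comparable across locations.
common_scorecard = {
    "greeting":   {"Des Moines": 92, "Omaha": 88, "Tampa": 95},
    "resolution": {"Des Moines": 78, "Omaha": 85, "Tampa": 71},
    "courtesy":   {"Des Moines": 90, "Omaha": 91, "Tampa": 89},
}

sites = ["Des Moines", "Omaha", "Tampa"]
for site in sites:
    # Simple unweighted average across all behaviors for this site
    avg = sum(b[site] for b in common_scorecard.values()) / len(common_scorecard)
    print(f"{site}: {avg:.1f}")
```

In a real engagement the behaviors would typically be weighted rather than averaged equally, but even this flat roll-up exposes cross-site differences that siloed, per-site scorecards hide.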

Sometimes contact centers choose not to exclusively use a front-line supervisor, QA team, or third party, but rather find a hybrid approach that works for them. We'll explore some of those options in our next post of this series.

Creative Commons photo courtesy of Flickr and barron

Should You Have a Dedicated QA Analyst or QA Team?

A dedicated listener. In the first post of our series we explored the pros and cons of having front line supervisors be the Quality Assessment (QA) analysts and call coaches. Rather than burdening an already loaded supervisory staff with the task of QA, some companies choose to utilize a dedicated QA individual or team. As with the supervisors, there are pros and cons to this choice.

Having a dedicated QA analyst or team has advantages.

  • A dedicated QA function generally ensures that the call analysis will receive greater time and attention. A good QA analyst will not only listen for the quality of the CSR's performance but also mine the calls for more information and detail. That detail can sometimes surface policy or procedural issues which can increase productivity and reduce costs.
  • As a result of the increased focus, the resulting data will tend to be more reliable. For companies who utilize QA data for performance management, this reliability can be crucial in ensuring that your process will meet necessary HR standards.
  • Because QA analysts do not have a direct supervisory role with the CSR, the possibility of bias due to personality issues or performance issues outside of the call is greatly diminished.

Having a dedicated QA analyst role is not always a slam-dunk, either.

  • Because the QA analyst or team is typically not on the phones, they are less knowledgeable about the day-to-day issues facing CSRs on the call floor. While this lends itself to objectivity, it can make call coaching more difficult when the CSR questions the QA coach's knowledge or experience.
  • A QA analyst can easily create contention in the call center. Supervisors and QA analysts can find themselves at odds when the supervisor feels the need to "defend" their CSRs and raise their team's quality scores. Rather than working with the QA team to improve CSR performance, supervisors regularly see the QA team as overly critical grinches who are making them (and their team) look bad.
  • For call centers strapped to stay adequately staffed, there simply may not be resources available for a dedicated QA function.

Is there a compromise? Some companies opt for a hybrid approach and others choose to hire a 3rd party. We explore both options in the continuation of this series.

Creative Commons photo courtesy of Flickr and personalspokesman


Are Supervisors the Best People to Do QA?

Who should monitor your team's phone calls and do your Quality Assessment (QA)? Today we begin a multiple post series on who should analyze your team's phone calls.

We begin with the front line supervisor who seems like the natural choice, and for good reason:

  • They are the closest managerial person to the floor.
  • They usually know the Customer Service Representative (CSR) better than anyone else.
  • They usually have direct responsibility for the CSR's performance management.
  • They can closely monitor progress and keep their eye on the CSR day-to-day.

However, in over 15 years of working with call center QA programs, I've found that there are inherent problems with supervisors being the primary call analysts and coaches:

  • Quality becomes a back-burner issue. I've always held that front-line supervisors have the toughest job in the call center and they are usually the most stressed out level of management. I have the greatest respect for them. They have the competing priorities of helping their agents, training, mentoring, managing, taking escalated customer calls, answering e-mails, scheduling, facilitating team meetings, motivating, counseling, and we haven't even gotten to all of the things call center and upper management ask of them by way of reports, special projects, performance management, and committee meetings. And this is before call volumes spike, systems crash, and a viral epidemic spreads through the floor. This is why, in most cases I've encountered, the front line supervisor struggles as a QA analyst and coach. With all of the pressing issues demanding their attention at any given moment, QA responsibilities are quickly and easily pushed to the back burner.
  • Evaluations are rushed. For all the reasons I just stated, even when supervisors do get to their QA duties, they simply can't afford the time and attention required to objectively analyze phone calls with the required precision. Quality Assessment is done with little, well, quality. QA duties get put off until just before the report is due, and then a bunch of calls are hastily evaluated just to meet the requirement. This is not a criticism of the supervisor! This is simply the reality of most call center organizational systems.
  • Objectivity is easily skewed. People are people. When you work with someone every day, and you have issues with someone every day, it's easy to lose your objectivity. Through the years I've had some pretty tense discussions with supervisors who are upset when a CSR's quality scores are good (you read that right). When a supervisor has issues with a CSR's attitude, attendance, or personality, it's easy for their frustration to bleed over into their analysis of the CSR's behavior on the phone. The reverse is also true. When a CSR happens to be a model employee and has the favor of the supervisor, the supervisor is apt to overlook and excuse negative behaviors that the CSR consistently demonstrates with customers on the phone. In either case, you've got problems which undermine the objectivity and validity of your entire quality program.
  • Call Coaching becomes HR Coaching. When supervisors coach calls, it is easy for the call coaching session to get sidetracked into all sorts of other productivity or HR related issues. Instead of the session being centered on how the CSR can provide better service to the customer, it ends up being about how the CSR can be a better employee for the supervisor. 

While many call centers utilize supervisors to analyze calls and provide quality coaching, the issues I've just related usually have some degree of impact on the effectiveness of the quality program. Call centers must actively work to minimize these problems or take their QA program another direction.

Next post: The Pros & Cons of having a dedicated QA team.

FREE ACCP Meeting for Central Iowa Companies & Contact Centers

For readers and subscribers in the central Iowa area, please consider attending the ACCP event on April 21 sponsored by Avtex. The half-day event is FREE and the group will be touring the Principal Financial Group contact center in Des Moines.


We hope you can join us! It's a great chance to network
with your industry peers, discuss and share ideas!

MEETING DATE: Wednesday, April 21, 2010

LOCATION: Principal Financial Group
6200 Park Avenue
Des Moines, IA 50321

Don’t miss this rare opportunity to tour
Principal Financial Group's Contact Center

Founded in 1879, the Principal Financial Group (The Principal) is a leader in offering businesses, individuals and institutional clients a wide range of financial products and services, including retirement and investment services, life and health insurance and banking through its diverse family of financial services companies.

As a 401(k) leader and a member of the FORTUNE 500, the Principal Financial Group has $280.4 billion in assets under management and serves some 18.6 million customers worldwide from offices in 12 countries throughout Asia, Australia, Europe, Latin America and the United States. The Principal employs 14,900 employees nationwide.



Registration, Network and Breakfast

Welcome and Introductions

Principal Financial Overview & Presentation:
    - Kim Post, Manager Client Contact Center
    - Rachel Torres, Manager Client Contact Center
    - Q & A Session

Tour Principal Financial
    - Chris Lynch, Manager Client Contact Center

Small Group Networking/Breakout Discussions
   Breakout Sessions
- Metrics – how are they used/how are they calculated
- Rewards on Metrics
- Economy – what changes were made due to the economy
- System Technologies
- VoIP
- Workforce Management/Staffing/Flexible Work time

Wrap-Up / Closing

* * Seating is limited * *
CLICK HERE to register today!


About ACCP:
The Association of Contact Center Professionals (ACCP) is a non-profit networking group of contact center professionals. ACCP consists of Contact Center Executives, Managers and Supervisors from various companies across central Iowa who meet to network and share their experiences on various topics relevant to today's contact center industry.

ACCP meetings are
FREE to attend!


Please forward this email to other colleagues on your team who may also have an interest in attending this meeting. If you need any assistance with your registration, please call us at 800.323.3639.

Best of QAQnA: Who Are You Satisfying with Your QA Scale?

I'm always struck by the mixture of motivations underlying many call center QA scorecards. Companies love to give lip service to delivering excellent customer service and improving customer satisfaction. Their QA scale, however, may reflect the designs of a cost-sensitive management team (driven by lowering costs with no regard to impact on the customer) or a sales-driven management team (driven by increasing sales with no regard to impact on the customer).

This mixture of messages frustrates front-line agents who see the hypocrisy: "You say you believe in customer service but all I'm told in QA is to keep it short or push the cross-sell." It also frustrates the QA coach who must try to justify or explain the obvious, mixed message. Of course, it is possible to have a balanced methodology in which you satisfy customers and look for ways to be efficient and opportunistic. The key is to make sure the customer is not left out of the QA equation.

If you really care about keeping your customers coming back, you should start your entire QA program with a valid, objective customer satisfaction survey. The results can give you the data you need to impact Customer satisfaction and retention.

Find out what is really driving your customer's satisfaction and loyalty. Then use that information in building and weighting your QA score card. In fact, some of our surveys have measured the customer's willingness to hear up-sells and cross-sells in a customer service interaction. The results are often surprisingly positive, and the data can be a powerful tool in building buy-in among the front-lines for your sales drive.
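As a simple sketch of what driver-weighted scoring could look like in practice: the behaviors, weights, and scores below are hypothetical placeholders, standing in for whatever your own survey data identifies as the real drivers of satisfaction and loyalty.

```python
# Hypothetical weights derived from a customer-sat survey: each weight
# reflects how strongly that behavior drives overall satisfaction.
# Weights are normalized to sum to 1.0.
driver_weights = {
    "resolution":       0.40,
    "courtesy":         0.25,
    "knowledge":        0.20,
    "cross_sell_offer": 0.15,
}

def weighted_call_score(behavior_scores):
    """Combine per-behavior scores (0-100) into one weighted QA score,
    so behaviors that matter most to customers count the most."""
    return sum(driver_weights[b] * s for b, s in behavior_scores.items())

# One analyzed call: strong on courtesy, weak on the cross-sell offer
call = {"resolution": 90, "courtesy": 100,
        "knowledge": 80, "cross_sell_offer": 50}
print(round(weighted_call_score(call), 2))  # 84.5
```

The point of the weighting is visible in the example: a weak cross-sell costs this call far less than a weak resolution would, because the survey data (hypothetically) says resolution matters most to customers.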

Oh, and by the way, it's possible that your company already does customer sat research and you've never seen it. Just the other day we provided a call center manager with a copy of a survey our group had done for his boss a few years ago. He was never aware that it had been done and had not been given access to the information, even though it was critical for driving tactical decisions in his call center. I wish that this was an isolated incident, but my gut tells me it happens more often than not. It may be worth it to ask around. Of course, trying to decipher the data in many customer sat surveys we've seen can be a mind-numbing task - but that rant will have to wait for another post!


Improvement Priorities: Don't Major on the Minors

First, get across the pool. I grew up as a competitive swimmer. When I first started as a child, I literally could not swim across the width of the pool. I began by learning how to swim. As I progressed to racing, it was amazing how a few fundamental changes could result in several seconds improvement in my times. Years went by. I got better. By the time I was in high school there were no longer any quick and easy improvements. I was trying to shave tenths of a second off my time and looking for tiny improvements I could make in every aspect of the race. I even shaved my head for the conference finals so that my hair (which was then much longer and thicker) would not create unnecessary drag through the water.

I think about this quite often as I work daily in Call Center Quality Assessment. When our group begins doing a third-party assessment for clients, I can almost guarantee that the client performs poorly in some of the nit-picky details of the call like hold etiquette and transferring callers. Transfer and Hold behaviors are usually the lowest bars on the bar chart.

It's a common reaction for clients to overreact to the results in these areas. At first glance, it appears that these behaviors are the most critical behaviors on which to improve (because they are being performed so poorly). The truth is that these are relatively minor issues in the larger picture of the customer's experience. It would be like me, as an eight-year-old novice swimmer, shaving my head to improve my time when the most important issue was that I could barely swim across the pool. There were far more important and fundamental improvements I needed to make before focusing on those little details made any sense.

For most contact centers, holds and transfers occur on a small fraction of phone calls and have relatively small impact on customer satisfaction. If you've got issues in basic courtesies and resolution-related behaviors (which occur on every call), you're better off investing your resources in improving performance in those behaviors. When you get to the point that you're doing the major things well, then you should turn your focus to the "minor details" that make the difference between "very good" and "excellent."

Creative Commons photo courtesy of Flickr and evoo73
