Statistics and Decision-Making in a Professional Setting
Types of Information Collected
The professional context here is performance assessment and monitoring at Transcomm SA, an outsourced customer service provider. More familiarly known as a call center, Transcomm is only about a decade old, but it has leveraged a special opportunity in its service horizontal: a fairly small staff complement is enough to compete for local business within a single state or, as in Europe, to serve any client that needs an external contact center covering customers nationwide. Because domestic presence is vital, expansion takes place by setting up operations in new markets. Today, the company has over 75 locations throughout North and South America, Europe, the Philippines, North Africa, and the Middle East.
Cost-saving is the primary rationale for a business outsourcing its customer service function to independent third parties. The principal cost components are manpower; computing and communication (C&C) equipment and its associated software; and company overhead. The cost of C&C hardware is not a differentiator, since it is the same for both the client and the outsourcer. Nor does the software itself contribute materially to outsourcing savings. In the first place, some clients who develop specialized CS software are perfectly willing to have the outsourcer install it on the latter's premises. Secondly, there has been a sea change in software development, away from the pricey industry leaders and toward freelancers who develop substitutes that they then release to the "open source" community free of charge.
Ultimately, therefore, cost savings center on staff costs and enterprise overhead. Staff costs are about the same for both sides, since most agents are young (i.e., paid just above the white-collar minimum wage). The naturally glaring difference between Big Business overhead and a small call center's overhead is largely competed away in the bidding for the client's business. At the end of the day, controlling staff costs is the primary way for contact centers to generate a profit.
Call center supervisors and operations managers are so habituated to formal weekly performance reports (and the occasional crisis-induced day-to-day inspection) that they spend all their time on the immense detail of tactical performance metrics. Starting with time and output reports for a given city location, massive real-time computerization has given them the ability to "drill down" to the account teams handling a single client, to the three-shift units that make up each account, and on down to the individual agent whose metrics fall three standard deviations below either the standard or the team average.
Herewith are just a few of the numerous customer service performance statistics. They are all gathered in the name of productivity: maximizing output for given wages.
Total call volume, by individual agent, shift team, and account team. This varies by season (airline and hotel CS teams naturally get more calls during the peak travel season), by working day versus holiday, and even by hour of the day and night.
Average handling time (AHT) = average "productive" talk time (ATT) + average after-call work time (ACW), where each average is total time divided by the number of calls handled for the hour or day. ACW covers all the necessary "paperwork" and recordkeeping after each call.
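The AHT arithmetic above can be sketched in a few lines of code; the figures used are hypothetical illustrations, not Transcomm data.

```python
def average_handling_time(total_talk_seconds, total_acw_seconds, calls_handled):
    """AHT = ATT + ACW, where each average is total time over calls handled."""
    att = total_talk_seconds / calls_handled  # average "productive" talk time
    acw = total_acw_seconds / calls_handled   # average after-call work time
    return att + acw

# An agent logging 9,000 s of talk and 1,500 s of after-call work over
# 50 calls averages 180 s + 30 s = 210 s per call.
print(average_handling_time(9000, 1500, 50))  # 210.0
```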
Service level (SL) = the percentage (agreed with the client) of all incoming calls answered by a live agent within a set number of seconds or minutes, net of calls abandoned before the time cut-off is reached. This is a "strategic" statistic, since SL dictates how many telephone trunk lines should be installed and how many agents should report at which times of day, after accounting for "normal" absenteeism (but not tardiness, which is ruthlessly minimized) and for covering the workload while others take meal and coffee breaks.
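The service-level definition above reduces to a simple ratio; the cut-off and call counts below are hypothetical, chosen only to illustrate the calculation.

```python
def service_level(answered_within_cutoff, calls_offered, abandoned_before_cutoff):
    """Percent of calls answered live within the agreed cut-off,
    net of calls abandoned before the cut-off is reached."""
    countable = calls_offered - abandoned_before_cutoff
    return 100 * answered_within_cutoff / countable

# 950 calls answered within the cut-off, out of 1,050 offered, 50 of which
# were abandoned before the cut-off: SL = 950 / 1,000 = 95%.
print(service_level(950, 1050, 50))  # 95.0
```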
Types of Information That Are Not Collected But Should Be
Blocking rate (BR) = the percentage of calls "offered" (that is, of all calls attempting to reach the center at that moment) that are not allowed into the system; in practice, the share of callers receiving busy signals even after repeated tries. Since this is the converse of service level, the customer-oriented client who pays for enough agents and facilities to attain 99% SL in effect demands no more than a 1% blocking rate from his outsourcing center.
Every call center's automatic call distribution (ACD) system can track and report BR, but call centers are averse to compiling and disclosing this statistic to clients. The reason is that SL may indeed reach 99% if 9,900 calls were taken care of among 10,000 callers who waited, say, 90 seconds or even longer while hearing a ringing tone. But if, in reality, a total of 12,000 had dialed their credit card company's toll-free cardholder-assistance "hotline" and 2,000 of them had gotten a busy signal, then the real BR was not 1% but 16.7%. One in six disgruntled cardholders reached no one at all and was therefore frustrated (Bergevin, 2005).
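The gap between the two statistics comes down to which denominator is used. Using the same hypothetical figures as the example above (12,000 dial attempts, 2,000 busy signals):

```python
def blocking_rate(busy_signals, dial_attempts):
    """Percent of dial attempts that never enter the system at all."""
    return 100 * busy_signals / dial_attempts

# 2,000 busy signals out of 12,000 dial attempts: roughly one caller in six
# blocked, even while the reported service level on answered calls hits 99%.
print(round(blocking_rate(2000, 12000), 1))  # 16.7
```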
Customer satisfaction rate (CSAT). To prevent bias, an interactive voice response (IVR) program should come on the line after the agent has hung up but before the caller can do so. The IVR can be as straightforward as a script that says, "Press 9 on your handset to indicate that you were totally satisfied, 0 if you were completely disappointed, or any number in between that accurately conveys your satisfaction with the way your concerns were addressed." This should normally reflect satisfaction with the agent's handling of the complaint, service request, or record update. But the aforementioned 16.7% who could not get through earlier will certainly temper their ratings, no matter how smoothly the customer service call went when they finally got through.
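One way to aggregate the IVR keypresses described above into a single CSAT figure is sketched below. The 0-to-9 scale follows the script quoted; the aggregation method (mean score as a percentage of the maximum) is an assumption, since the text does not specify one.

```python
def csat_percent(ivr_scores):
    """Mean IVR score (0 = completely disappointed, 9 = totally satisfied),
    expressed as a percentage of the maximum possible score."""
    return 100 * sum(ivr_scores) / (9 * len(ivr_scores))

# Two callers pressing 9 and one pressing 0 average 18/27, about 66.7%.
print(round(csat_percent([9, 9, 0]), 1))  # 66.7
```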
Advantages of Improving Decision Making
At the end of the day, CSAT has to be the most vital statistic, since the client operates a customer relations group (in-house or outsourced) principally to keep customers satisfied so they do not switch to other brands. This means that, if the call center likewise adopts customer satisfaction as its core goal, then CSAT should take precedence over cost-control measures when it comes to motivating, caring for, and nurturing the agents who make customer service happen.
Bergevin, R. (2005). Call centers for dummies. Mississauga, ONT: John Wiley & Sons.
SearchCRM. (2009). Schools: Call center metrics. Web.