
Analytics Versus Intuition

By Keri Pearlson, Jan 21, 2011

Recently, when Assurant Solutions compared decisions made by highly trained experts with decisions made by ‘silicon-based intelligence,’ that is, computerized judgments based on analytics, they were able to double and in some cases triple relevant performance metrics. That’s the story told in a recent Sloan Management Review article, “Matchmaking with Math: How Analytics Beats Intuition to Win Customers,” in which Cameron Hurst is interviewed by Michael S. Hopkins and Leslie Brokaw (SMR’s December 15, 2010 issue).

Assurant Solutions is in the financial services industry, and like many of their peers, they run a robust, state-of-the-art call center. A key function of the call center is to retain customers who call in to cancel their payment protection insurance. The best industry-standard retention rates of 15%-16% still mean that 5 out of 6 customers are not retained, and Assurant Solutions sought a way to improve on that. Since they had a lot of data, they decided to try something completely different: they brought in mathematicians, instead of call-center experts, to find relationships in the data. What they found were relationships that in some cases defied conventional logic, and that would never have been found without the volume of data in their database.

Conventional wisdom dictated that improvement in the operational experience, i.e., reduction in wait time, was directly related to customer satisfaction, which was reflected in retention rates. So state-of-the-art centers strove to continually cut down wait time. But analysis of the data suggested that there might be other ways to increase retention, such as matching customer service reps (CSRs) with customers based on rapport and affinity. When the call-routing algorithm was changed to reflect this correlation, retention rates increased.
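To make the matching idea concrete, here is a minimal sketch of affinity-based routing, assuming each CSR has historical retention rates broken out by customer segment. The article doesn’t describe Assurant’s actual model, so the agent IDs, segment labels, and rates below are purely hypothetical.

```python
# Illustrative sketch only: Assurant's real routing model is not described in
# the article, so these agents, segments, and retention rates are made up.
from dataclasses import dataclass

@dataclass
class Agent:
    agent_id: str
    retention_by_segment: dict  # hypothetical retention rate per customer segment

def route_call(customer_segment, available_agents):
    """Send the call to the free CSR with the best historical retention for
    this customer's segment, rather than simply the next agent in the queue."""
    return max(available_agents,
               key=lambda a: a.retention_by_segment.get(customer_segment, 0.0))

# Example: two free agents, and a caller from the hypothetical segment "S17"
agents = [
    Agent("csr_01", {"S17": 0.22, "S04": 0.41}),
    Agent("csr_02", {"S17": 0.48, "S04": 0.19}),
]
print(route_call("S17", agents).agent_id)  # -> csr_02
```

The point of the sketch is only that routing becomes a prediction problem: given what is known about the caller, which available rep is most likely to save the account?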

The Assurant story doesn’t end here, however. The data also suggested ways to micro-segment the customers instead of relying on broad macro-groupings. The micro-segments made it possible to identify the high degree of variability in the customer base.
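The article doesn’t say how Assurant built its micro-segments, but one common way to do this kind of segmentation is to cluster customers on a handful of numeric features. The sketch below uses scikit-learn’s k-means as a stand-in; the feature names and values are invented for illustration.

```python
# A minimal segmentation sketch, assuming customer records have already been
# reduced to numeric features. The features and values here are invented.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical feature matrix: one row per customer
# columns: [months_of_tenure, monthly_premium, prior_claims]
X = np.array([
    [3, 12.50, 0],
    [48, 9.75, 2],
    [14, 22.00, 1],
    [60, 18.25, 0],
    [5, 11.00, 0],
    [36, 25.50, 3],
])

# Many small segments rather than a few broad macro-groupings
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)  # segment id assigned to each customer
```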

Similar analysis identified corresponding variability among the CSRs, which made a matching algorithm more accurate and ultimately more successful. The analysis of thousands of interactions also indicated that the 80/20 rule (80% of calls need to be answered in 20 seconds or less) was outdated.

Customers were not that uniform in their response to wait time. They found that most customers were fine with wait times double the convention or more, and Assurant managers were able to test various other scenarios, such as 60/60. At some point there is a negative effect from waiting too long, but that point comes much later than 20 seconds, and the extra wait time gives Assurant the opportunity to make a better match between customer and CSR. And while every call center knows that some customers are worth more revenue than others, making them higher-priority customers to save, Assurant can identify these customers in a more granular way, make additional decisions about how to manage them, and ultimately save them as customers.
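For readers unfamiliar with the notation, the service-level arithmetic behind rules like 80/20 and 60/60 is straightforward; the sketch below computes it for a set of made-up wait times (the numbers are not Assurant data).

```python
# Sketch of the service-level calculation behind "80/20"-style rules.
# The wait times are made-up examples, not Assurant data.

def service_level(wait_times_sec, threshold_sec):
    """Fraction of calls answered within threshold_sec seconds."""
    answered_in_time = sum(1 for w in wait_times_sec if w <= threshold_sec)
    return answered_in_time / len(wait_times_sec)

waits = [5, 12, 18, 25, 40, 55, 58, 70, 15, 22]  # hypothetical seconds on hold

print(f"{service_level(waits, 20):.0%} of calls answered within 20s (80/20 target: 80%)")
print(f"{service_level(waits, 60):.0%} of calls answered within 60s (60/60 target: 60%)")
```

Relaxing the threshold, as Assurant did, is what buys the routing engine time to wait for a well-matched CSR to free up.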

This article highlighted a number of counter-intuitive conclusions that dramatically changed operations at Assurant’s call center. In the end, where conventional wisdom says that state-of-the-art saves mean retaining $0.15-$0.16 on the dollar, Assurant is able to boast retention well into the high $0.40s, and on some days it has been as high as $0.58. It’s not a surprise that good data and good analysis can produce good results, but in the Assurant story we see an example of counter-intuitive results that blow away the performance metrics of even the best call-center “carbon-based intelligence,” the subject-matter expert.

About the author


Dr. Keri E. Pearlson is an expert in the area of managing and using information. She has worked with CIOs and executives from some of the largest corporations in the world. She has expertise in helping executives create strategies to become Web 2.0-enabled enterprises, designing and delivering executive leadership programs, and managing multi-client programs on issues of interest to senior information systems executives. Keri specializes in helping IT executives prepare to participate in strategy formulation processes with their executive peers. Current issues include Web 2.0/3.0 strategy, eco-information systems, finding additional value from current investments, project vulnerability analysis, and succession preparation. She’s a skilled relationship manager and an accomplished meeting facilitator. She’s the Founding Partner and President of KP Partners, a CIO advisory services firm.

Keri has held various positions in academia and industry. As Vice President-Leadership Development for nGenera (formerly the Concours Group), she designed and delivered executive-level workshops for CIOs and their direct reports, and she led research programs on issues of importance to CIOs. She was a research and program director at the Research Board, a small, private think tank for CIOs, from 2001 to 2003. From 1992 to 2000, she was a member of the information systems faculty at the Graduate School of Business at the University of Texas at Austin, where she taught management information systems courses to MBAs and executives. Keri was also a research affiliate with CSC-Research Services, where she conducted a study of the design and execution of mobile organizations. From 1986 to 1992, she did research for faculty at the Harvard Business School and for CSC-Index’s Prism Group. Earlier in her career, she worked at AT&T and Hughes Aircraft Company.

Keri holds a Doctorate in Business Administration (DBA) in Management Information Systems from the Harvard Business School, as well as a master’s degree in Industrial Engineering Management and a bachelor’s degree in Applied Mathematics from Stanford University.
