Analytics Journal: Making Analytics Projects Succeed
By Daniel Magestro, Oct 04, 2016
Last week the International Institute for Analytics hosted its semi-annual Analytics Symposium in Boston, bringing together 100+ analytics leaders from companies across industries for a day of best-practice sharing on many facets of analytics programs. Agenda topics ranged from artificial intelligence and workplace automation, to defining analytics roles and forming effective recruiting partnerships with universities.
What makes IIA’s Symposium events unique, and in some ways quite extraordinary, is the level of best-practice sharing made possible by both the intimate, classroom-style nature of the discussion and the broad cross-industry representation. With bird’s-eye views of Boston Harbor and the North End as the backdrop, attendees (a mix of clients and IIA experts) and presenters dug in, listened, shared, networked, and left with a few more pieces of the business analytics puzzle in hand.
As IIA’s Research Director, I probably enjoy our symposium events the most. I have the opportunity to design compelling, broad-reaching agendas that are both directly influenced by the companies we support and innovative in how we try to support those companies. It gives me a strong sense of purpose and the freedom to try new things to support our clients.
In Boston last week, this sense of purpose and design freedom were reflected in a session I titled “The Analytics Project Lifecycle.” From my prior experience in analytics leadership roles at large enterprises, and from countless conversations with other companies, I’ve come to view the multifaceted journey of an analytics project, starting from the initial business need and weaving all the way to the project’s final benefits, as encapsulating nearly every aspect of a successful analytics program.
In fact, one of the most common taglines used to support deeper investment in business analytics is along the lines of “70% of analytics projects fail.” Various authors, research firms, and technology vendors have shared their favorite reasons for this, usually high-level in nature: data quality concerns, lack of engaged stakeholders, and communication gaps between technical and business teams. I have a few other reasons myself. (For one, aligning analytics strategy with business strategy is a frequent challenge for central analytics teams.)
But to me, most reasons for high rates of analytics project failures are actually symptoms of the lack of a comprehensive end-to-end approach to analytics projects. In other words, analytics projects fail because analytics projects are uniquely multifaceted and complex. There aren’t just a few reasons: Analytics projects have dozens of twists and turns, large and small, where they can fall off track.
My goal with the Analytics Project Lifecycle session in Boston was threefold: to raise awareness of the truly multifaceted dependencies and challenges of analytics projects via a new step-by-step framework that we’re developing for our clients, to encourage attendees to evaluate every phase of their analytics project processes, and to arm attendees with perspectives and advice on specific pain points. The session was capped by a panel discussion with analytics leaders from five companies across five industries (Loblaw, Merck, Hallmark, VMware, and McGraw-Hill Education).
The Analytics Project Lifecycle framework has many layers of detail (which is part of the point!), but at a high level there are three phases to the framework:
Planning. The process by which analytics projects come to exist, from the initial business need to the many decisions on how best to carry out the work.
Execution. The traditional analytics work, from data gathering to the decisions informed or driven by the analysis. This phase can be highly iterative, and like the full Analytics Project Lifecycle it is laced with many ways to fail.
Implementation. The follow-through on taking action on the decisions, identifying the benefits, and informing future analytics projects and investments.
By all accounts, the Analytics Project Lifecycle session was a success. The framework that we shared resonated with folks; in fact, to make the session as actionable as possible for attendees, we even included worksheets in the event folders, which proved quite popular. And the panel discussion was just what I had hoped: a cross-industry exchange of wide-ranging best practices that reflected the long list of considerations that analytics departments and teams need to succeed. As I said at the event, our goal is for IIA clients to have a far lower project failure rate than other companies (0% is our hope!), and I believe our framework is a key tool to get there.
Of course, as with all good frameworks, success with the Analytics Project Lifecycle means making it bend and work for specific companies and situations, and that’s where IIA can help. It’s why we exist, and it’s why I joined IIA one year ago: to support analytics leaders and programs in their unique journeys, rather than prescribing a single, one-size-fits-all recommendation. And with this framework now in hand, I’m looking forward to finding new and deeper ways to assist companies on these journeys.
If you’re an IIA client, send me a note and we’ll send you a copy of the worksheet.
About the author
Daniel Magestro, Ph.D., is the Vice President, Research Director at the International Institute for Analytics. An accomplished analytics executive with 10+ years of experience in healthcare, banking, and insurance, Dan manages IIA’s robust research agenda, leads IIA’s international network of faculty experts, and advises IIA’s global community of analytics practitioners. Dan’s primary interests include effective analytics leadership and strategy, high-performing analytics teams, and the customer journey. Prior to joining IIA, Dan managed multiple analytics teams at Cardinal Health in Columbus, Ohio, where he built an analytics center of excellence to serve the company’s Pharmaceutical Division. He also held analytics roles at JP Morgan Chase, Nationwide Insurance, and Investor Analytics. Since 2010, he has been an Adjunct Professor at Ohio State University’s Fisher College of Business, where he teaches courses on data analysis and advises the college on analytics initiatives. Dan came to business from science; he holds a Ph.D. in nuclear physics.