Data Preparation: Is the Dream of Reversing the 80/20 Rule Dead?
By Bill Franks, Oct 13, 2016
I recently had someone ask me, “For years we’ve talked about changing analytics from 80% data prep and 20% analytics to 20% data prep and 80% analytics, yet we still seem stuck with 80% data prep. Why is that?” It is a very good question about a very real issue that frustrates many people.
I believe there is actually a good answer, and that the perceived lack of progress is not as bad as it first appears. To explain, we need to distinguish between new data sources and business problems on one hand, and those we have addressed before on the other.
Breaking New Ground
Whenever a new data source is first acquired and analyzed, there is a lot of initial work required to understand, cleanse, and assess the data. Without that initial work it isn’t possible to perform effective analysis. Much of this work, such as determining how to identify and handle inaccurate sensor readings or incorrectly recorded prices, is a one-time effort, but it can be substantial.
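As an illustration of that one-time cleansing work, here is a minimal sketch for a hypothetical numeric sensor feed (the function name, thresholds, and rules are my own assumptions, not from the article): readings are first checked against a plausible physical range, then screened for statistical outliers using the median absolute deviation.

```python
# Hypothetical one-time cleansing rule for a new sensor feed:
# flag readings outside a physically plausible range, then flag
# statistical outliers via the median absolute deviation (MAD).

def clean_readings(readings, low, high, mad_threshold=3.5):
    """Return (kept, flagged) lists of readings."""
    in_range = [r for r in readings if low <= r <= high]
    flagged = [r for r in readings if not (low <= r <= high)]
    if not in_range:
        return [], flagged

    def median(values):
        s, n = sorted(values), len(values)
        return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

    med = median(in_range)
    mad = median([abs(r - med) for r in in_range])
    kept = []
    for r in in_range:
        # 0.6745 scales the MAD to be roughly comparable to a std. dev.
        score = 0.6745 * abs(r - med) / mad if mad else 0.0
        (kept if score <= mad_threshold else flagged).append(r)
    return kept, flagged
```

Deciding on the range limits and threshold is exactly the kind of up-front grunt work described above; once settled, the rule can run automatically on every future load of that source.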
From the earliest days of my career, some of the most challenging efforts have involved new data. For the first couple of analytics on a new data source, the ratio of data prep and other grunt work to analytics is certainly much closer to 80% prep/20% analysis than to 20%/80%. However, as more analytics are completed with that data source over time, things become much more streamlined and efficient.
Revisiting A Well-Worn Path
Once a data source has been utilized for a range of analytics and is well understood, developing a new analytic process with it starts to drift toward the 20/80 ratio. By making use of things like Enterprise Analytic Datasets, it is possible to jump almost directly into a new analysis, as long as that analysis can use the same types of metrics that past analyses used.
In fact, many large organizations have greatly standardized and streamlined the use of traditional data sources for analytics. For example, transactional data is utilized to analyze customer behavior in a wide range of industries. Many organizations have a large number of standardized customer metrics available that can feed analytics both new and old. I know of companies with tens of thousands of metrics for each customer based on transactional history. Spinning up a new analytic process with these metrics is not that difficult and can often be more of a 20% prep/80% analysis proposition than an 80/20 proposition.
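To make the idea of standardized customer metrics concrete, here is an illustrative sketch (the function and field names are hypothetical assumptions, not from the article) of rolling raw transactions up into a reusable per-customer metrics table, the kind of asset an Enterprise Analytic Dataset provides, covering classic recency, frequency, and monetary measures.

```python
# Illustrative sketch: turn raw transactions into standardized
# per-customer metrics that can feed many analytics unchanged.
from collections import defaultdict
from datetime import date

def customer_metrics(transactions, as_of):
    """transactions: iterable of (customer_id, txn_date, amount) tuples."""
    by_cust = defaultdict(list)
    for cust, txn_date, amount in transactions:
        by_cust[cust].append((txn_date, amount))
    metrics = {}
    for cust, txns in by_cust.items():
        amounts = [a for _, a in txns]
        metrics[cust] = {
            "txn_count": len(txns),                  # frequency
            "total_spend": round(sum(amounts), 2),   # monetary
            "avg_spend": round(sum(amounts) / len(amounts), 2),
            "days_since_last": (as_of - max(d for d, _ in txns)).days,  # recency
        }
    return metrics
```

Once such metrics are computed on a schedule, a new analysis can start from the finished table rather than from raw transactions, which is precisely why new work on a familiar source tilts toward the 20% prep/80% analysis ratio.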
Even if you accept all of the points above, doesn’t it still seem like your analytics organization is spending a ton of time on data preparation today? Well, your instinct is probably on target, but not for the reasons you might initially think.
The Big Challenge of Big Data
The rise of big data has led to a proliferation of data sources over the past few years. Simultaneously, analytics has become a major focus, and there is demand for analytics to address an ever-widening range of business problems. Combining these two trends leaves us with a large amount of new ground to break, which drives us back to the need for an abundance of work to understand, cleanse, and assess data. We therefore end up spending much of our time on data preparation and still see an 80/20 ratio.
However, it is important to look backwards and recognize the progress that has been made. The data that required a lot of work a few years ago likely does NOT require a lot of work today. The ratio of data prep to analysis may well be nearing the 20/80 target ratio in those cases. We tend to lose sight of this progress when we are inundated with the data issues of today. Even though we have made a lot of progress with our old data and analytics, we’re simply facing a huge amount of new data and problems to work on.
Keep The Right Perspective
It can certainly be frustrating to feel like your organization is forever stuck doing more data preparation than analysis. However, it is critical to recognize that the data and problems for which you’re doing that prep are constantly changing. It is simply impossible to analyze new data for a new problem without going through a bunch of grunt work and data prep at the outset. There is nothing wrong with this.
In fact, if your organization is breaking enough new ground with analytics to feel stuck in a data preparation mode, then you should be happy because it means you are likely making progress. The key is to ensure that once you’ve solved today’s problems and understand today’s data sources that you drive to a higher level of automation and standardization for those data sources and processes. By making analytics easier for the data and problems you already understand, you free up time to prepare the data for your next analytics adventure.
Originally published by the International Institute for Analytics
About the author
Bill Franks is Chief Analytics Officer for Teradata, where he provides insight on trends in the analytics and big data space and helps clients understand how Teradata and its analytic partners can support their efforts. His focus is to translate complex analytics into terms that business users can understand and to work with organizations to implement their analytics effectively. His work has spanned many industries for companies ranging from Fortune 100 firms to small non-profits. Franks also helps determine Teradata’s strategies in the areas of analytics and big data.
Franks is the author of the book Taming The Big Data Tidal Wave (John Wiley & Sons, Inc., April, 2012). In the book, he applies his two decades of experience working with clients on large-scale analytics initiatives to outline what it takes to succeed in today’s world of big data and analytics. The book made Tom Peters’ list of 2014 “Must Read” books and also the Top 10 Most Influential Translated Technology Books list from CSDN in China.
Franks’ second book The Analytics Revolution (John Wiley & Sons, Inc., September, 2014) lays out how to move beyond using analytics to find important insights in data (both big and small) and into operationalizing those insights at scale to truly impact a business.
He is a faculty member of the International Institute for Analytics, founded by leading analytics expert Tom Davenport, and an active speaker who has presented at dozens of events in recent years. His blog, Analytics Matters, addresses the transformation required to make analytics a core component of business decisions.
Franks earned a Bachelor’s degree in Applied Statistics from Virginia Tech and a Master’s degree in Applied Statistics from North Carolina State University. More information is available at www.bill-franks.com.