
Analytics Maturity and the Balancing Act Between Data Consumers and Producers

As IIA’s data and analytics community put a bow on 2023 and put their planning hats on for 2024, I reflected on the key principles essential for establishing and maintaining a high-performing D&A organization. These include recognizing each firm's unique information economy, ensuring collaborative balance between data supply and demand, and the necessity of consistent progress measurement with objective methodologies. I also explored the challenges faced by organizations in analytics initiatives, often due to a lack of understanding of their specific information economy and cultural readiness for analytical transformation. IIA’s five-step process aims to effectively characterize an organization's information economy, involving understanding both the demand-side needs (data consumers) and the supply-side capabilities (data producers), and finding solutions that satisfy both. IIA’s view is that it’s critical to optimize for data democratization, shifting your enterprise from a supply-controlled to a demand-driven approach, and we stress the importance of changing organizational mental models for the successful implementation of advanced analytics and AI programs. In short, embracing change and addressing the “soft parts” of analytics are crucial for achieving success in organizational data and analytics endeavors.

In this article, let’s talk about the balancing act between data consumers and producers. IIA’s thinking on this dynamic is driven by two fundamental principles: 1) Demand will get its needs met, one way or another, and 2) Data and analytics organizations (or suppliers) cannot deliver business value without collaboration with their internal data consumers. As fundamental principles go, this might sound simple enough, but in our years of experience working with large, complex enterprises navigating the transition to advanced analytics and AI, putting this level of collaboration into action is very difficult. In part, the difficulty arises from a point made above: high-performing D&A organizations orient their operating model around demand, not supply. Sadly, too many organizations still don’t come to terms with the fact that data consumers will go around data producers to get what they need, when they need it, and no investment in data and analytics capabilities will magically produce an analytically driven enterprise without intentional collaboration with those who consume this data.


In my webinar with Chris Donavan, executive director of enterprise analytics at Medical Mutual of Ohio, we discussed the IIA analytics maturity assessments he’s had his organizations go through at different stages of maturity. For Chris, our analytics maturity assessments have been incredibly useful in various ways. Currently, at Medical Mutual, the focus is on driving analytic maturity, which is a huge task. It involves advancing across multiple areas: data, people, processes, technology, and most importantly, it's about cultural change and change management.

What Chris finds particularly valuable about the assessment and its tailored recommendations is their ability to pinpoint where people think they have opportunities. The aspect of the survey that weighs “importance versus effectiveness” has been instrumental in identifying where his own team sees gaps. This insight allows him to direct his efforts toward areas that will not only show tangible benefits but also deliver a high return early on, as perceived by his team. To paraphrase Chris, the assessment process essentially acts like an accelerator. The team starts to see real value in areas that are important to them, which, in his (and IIA’s) view, is a critical step in driving forward his analytics capabilities.

The “importance versus effectiveness” measurement really gets at the heart of the value of assessment: understanding, planning, and prioritizing data and analytics initiatives through a demand-intensive lens, as opposed to, say, adding another layer of technology to solve X problem.
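To make the “importance versus effectiveness” idea concrete, here is a minimal sketch of how such a gap measure can surface priorities. The capability names, the 1–5 scale, and the scores are hypothetical illustrations, not IIA’s actual assessment instrument:

```python
# Hypothetical survey results: each capability rated 1-5 for how important
# the team considers it and how effectively the organization delivers it today.
survey = {
    "data quality":        {"importance": 5, "effectiveness": 2},
    "self-service BI":     {"importance": 4, "effectiveness": 4},
    "predictive modeling": {"importance": 3, "effectiveness": 1},
    "report automation":   {"importance": 2, "effectiveness": 3},
}

def priority_gaps(survey):
    """Rank capabilities by the gap between importance and effectiveness.

    A large positive gap marks an area the team cares about but feels
    underserved in -- a candidate for early, high-return investment.
    """
    gaps = {name: s["importance"] - s["effectiveness"] for name, s in survey.items()}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

for name, gap in priority_gaps(survey):
    print(f"{name}: gap={gap:+d}")
```

In this toy data, “data quality” tops the list: highly important to the team but delivered poorly, exactly the kind of early, visible win the assessment is meant to surface.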

And who can blame a data consumer for finding their own way if the organization can’t meet their needs in the moment? After all, they’re trying to get their jobs done, so they’re going to figure out how to get what they think they need. But there's a self-limiting aspect to consider, which is where the assessment becomes valuable. Often, people don't fully grasp the potential of what they could achieve. They might not be aware of other information, different approaches, or additional insights that could lead to better decision-making. This creates a sort of double bind. Not only is there a lack of an efficient or consistent delivery mechanism, but there's also this self-imposed ceiling within the organization. If you can't envision the potential of new methodologies, you're inadvertently capping your own progress.

IIA observes many companies that equate competing on analytics and data with just building a world-class supply organization. They think having top-notch technology and skills will automatically lead to high maturity and performance. However, what we consistently see is that even world-class supply cannot deliver business value without the balance and collaboration of demand. Thankfully, modern data and analytics leaders like Chris are having this revelation that being technically proficient isn't everything.

More to the point, it's often easier to build the supply side than to develop the ability to utilize new and different data and insights to make informed decisions. This is something Chris and other data and analytics leaders we’ve worked with have learned from experience. You can get excited about investing in top-notch analytics, but if they're not used for decision-making, it's just waste. The true value lies in whether the decision-makers on the other end will actually use these analytics or even know how to. If they don't, or if the analytics are not integrated into data consumer workflows and processes, the data and analytics team has failed to deliver any real value to the enterprise.

Enterprise Data and Analytics Supply and Demand Balancing Act

[Image: Supply and demand balancing act]

So, let’s dive deeper into the balancing act of analytics maturity. On the supply side, maturity runs up the vertical axis from bottom to top: from a complete absence of data analytics, to basic data, then to information products and analytical applications, and finally to the current state of the art, supplying embedded and automated analytics. On the demand side, moving from left to right on the horizontal axis, we go from no interest in using data for decision-making to a rudimentary demand for data, where the thought is, “I'll just take the data and make my own decisions.” This evolves to a demand for more sophisticated information products—reports and dashboards—and further to seeking predictive and prescriptive support, and finally, decision automation and embedding.

Demand Side Maturity: From “Dataless” Decisions to Decision Automation


Through the analytics maturity assessments and free text comments we collect, we can listen to demand signals and diagnose where your demand-side partners stand today. This can range from feeling they don't need data to make decisions because of their extensive experience, to recognizing a need for data to enhance decision-making. This progresses to an appreciation for KPIs, metrics, and dashboards, and advances to wanting clear directives on where to focus, particularly in sales contexts. The ultimate stage is when they prefer not to be involved in the decision-making process at all, opting instead for automated decision-making.

And as your enterprise grows from one stage to the next, the challenges differ significantly. Moving from just needing data to needing information products, the hurdle is often trust. If they can't get consistent, understandable data, they won't trust it unless they process it themselves. Getting people to trust that you can take on the data work for them is a substantial organizational lift. Then, advancing from information products to predictive and prescriptive demands involves education and upskilling, since understanding and acting on predictive modeling is a more significant ask.

Finally, the stage where you want people to arrive is recognizing that they don't have to be involved in every decision but do need to manage the risks associated with automated decision-making. This represents another significant cultural shift. Automation doesn't mean abdicating decision-making; rather, it redefines the role of leaders and decision enablers at that level of maturity.

Supply Side Maturity: From Siloed, Stovepiped Projects to Repeatability


On the supply side, we tend to see a split between the lower and upper halves. Initially, there's a pattern of isolated, project-based work—what we call a “stovepipe” approach, where activities are repeated in silos without integration. Eventually, analytics teams transition to a more product-driven, reusable methodology. This shift involves a managed platform orientation that breaks away from the repetitive stovepipe method, fostering repeatability and efficiency.

This relates to the maturity progression because, at first, stakeholders may only allow analytics teams to assist in constructing their datasets, but there's still a lack of complete trust. Initially, you might be relegated to doing isolated, stovepipe work for them. However, as trust is built, you can ascend on the maturity axis. It's crucial to advance on both the trust axis and the capability axis simultaneously, fostering a situation where as capabilities grow, trust grows in tandem.

Optimal Collaboration Between Supply and Demand


Now, let's consider the two axes together in a typical two-by-two matrix. In the lower left, there's low capability supply paired with simple and basic demand, which is a common scenario. More often, as I've noted before, there's a significant investment in supply with high capability, but the demand doesn't match this sophistication—let's call this simple demand. Then, to complete the matrix, you find users with very high expectations and complex demands facing a supply side that can't keep up. The ideal state, of course, is a balance: high capability supply meeting complex demand.

Reflect on your organization's balance. The goal, as Chris mentioned, is to ensure that as you develop along these axes, you maintain equilibrium. When this doesn't happen, imbalances occur. For example, in the upper left, you might have an oversupply—lots of activity that fails to make an impact, leading to inefficiency and wasted investment. Conversely, on the right, you face undersupply or demand overflow, where demand becomes so frustrated that they turn to outsourcing to meet their needs. This can lead to the supply side spending too much time managing external contractors instead of focusing on their core work. The key is to avoid these extremes and strive for a harmonious advancement in both supply and demand.

We often field questions about what needs to be done. On the supply side, we talk about skills and capabilities, with a particular emphasis on maturing capabilities. As for the demand side, fixing cultural dysfunctions and upgrading consumption skills—rather than design skills—is crucial. Moving these two in coordination is what we commonly see work. For many data and analytics leaders reading this, our assessment instrument reveals analytics technology and techniques assessment (ATTA) scores that are higher than analytics maturity assessment (AMA) scores, which measure the demand side. This discrepancy is common.

As a data and analytics organization, you're not always in perfect balance; you're perpetually climbing the curve. But when you assess your position in simple terms—simple demand and simple supply—you might recognize a deep-seated cultural resistance to the transformative power of data and analytics (low-capability supply / simple demand). Now, imagine the upper left quadrant (high-capability supply / simple demand): perhaps you have 300 data scientists with the fanciest tools imaginable, but there's no demand for their work, and basic needs are unmet. In the lower right quadrant (low-capability supply / advanced demand), there's siloing, with sophisticated analysts in departments like marketing or HR operating in a DIY environment or managing costly external integrators. Now, moving to the upper right quadrant (high-capability supply / advanced demand), data producers and consumers are more in balance (i.e. achieving optimal enterprise analytics maturity), but you still must keep track of uneven penetration.
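The four quadrants described above can be summarized in a small sketch. The 1–5 scoring scale, the threshold splitting “simple” from “advanced,” and the quadrant labels are illustrative assumptions, not IIA’s actual scoring model:

```python
def quadrant(supply_score, demand_score, threshold=3.0):
    """Place an organization in the supply/demand two-by-two matrix.

    Scores are assumed to be on a 1-5 maturity scale; the threshold
    separating "simple" from "advanced" is an illustrative choice.
    """
    high_supply = supply_score >= threshold
    high_demand = demand_score >= threshold
    if high_supply and high_demand:
        return "balanced maturity: watch for uneven penetration"
    if high_supply:
        return "oversupply: world-class capability with simple demand"
    if high_demand:
        return "undersupply: frustrated consumers turn to DIY or outsourcing"
    return "low supply, simple demand: cultural resistance to analytics"

# e.g., 300 well-tooled data scientists but little appetite for their work
print(quadrant(4.5, 2.0))
```

The point of such a diagnostic is not the labels themselves but the conversation they force: which axis is lagging, and where the next investment should go.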

From here comes the question of budget allocation—should you feed high-performing areas or lift the lower performers? Frankly, being stuck in the middle can be the worst scenario, where siloed expertise exists in some corners of the organization, far removed from the centralized enterprise program that remains underutilized.

One strategy to combat this is to connect the advanced consumers in the lower right with the underutilized capability sitting in the upper left, and deliver what they need. To steal a phrase from Chris, invest in your “coalition of the willing.” This way, you elevate both demand and supply toward the top right for the organization. Again, don’t downplay the trust factor. The demand side may not want advanced analytics because they lack trust in what they haven't had a hand in creating. They've never been given a reason to trust data from a corporate or enterprise team, and even if they're given data, they might redo it because they simply don't believe in its accuracy.

As Chris highlighted in our discussion, it took him a while to realize why all this great data wasn't being utilized. It boiled down to a candid colleague of his admitting, “I don’t trust your data. I’ve never been given a reason to believe.” This was an aha moment for Chris as a leader, and this lack of trust is a significant barrier to overcome.

Indeed, one aspect that seems particularly salient for the chief data and analytics officer role is that success is not a solo endeavor. Regardless of how skilled or well-intentioned you are, it ultimately comes down to the cultural demand-decision axis. If people aren't using what you're providing, you won't be able to drive change. There's a lot of enthusiasm and ongoing interest in becoming a data-driven organization that leverages advanced analytics. However, in our experience with clients, it is common to find that leaders two or three years into their role discover the anticipated radical change hasn't materialized. Sometimes it might be due to the individual, but often it's because the necessary cultural shifts are incredibly challenging. Chief data and analytics officers don’t get the buy-in, and other leaders don't see it as part of their role.

In the end, the balancing act is all about managing change. The process is challenging, but the rewards are significant and tangible. Often, the real need isn't about moving up the left vertical axis; it's about navigating change along the horizontal axis.