The need to upskill the data and analytics capabilities of both your analytics community and the broader business and operational communities is widely recognized. It is far harder, however, to determine exactly who should learn what and how best to teach them. I recently had the pleasure of moderating a webinar featuring three executives who shared valuable perspectives and ideas for improving your organization’s data and analytics competence and literacy. In this blog, I’ll summarize ten takeaways from the session. The panel was made up of:
- Danielle Chabot, AVP, Technology, Data, & Analytics Learning at Travelers
- Michelle Fetherston, Data Literacy and Culture Program Lead at Northwestern Mutual
- Merav Yuravlivker, CEO and Co-founder at Data Society
Takeaway #1: Upskilling May Be a Misleading Term
We started the session by discussing how to define and differentiate the terms upskilling and data literacy. Danielle argued that we should simply say “skill” or “skilling” instead of “upskill” or “upskilling”. There is so much to know these days, and what is most critical keeps changing. What an employee needs isn’t always more of the same skills or a higher level of an existing skill. Rather, employees must continually gain skills in a range of new areas as well. It isn’t about bonus “up” skills, but about necessary base skills.
Takeaway #2: Make Content Relevant and Easy to Access
As we began discussing how to set a vision for upskilling programs, Michelle talked about the need to make the available courses and resources both relevant and easy to find. For example, her team didn’t just create various modules and deploy them. They also put together specific tracks that tied different assets together, so that someone looking to build skills in a particular area had a one-stop shop. Even better, her team developed a survey that helped people find the tracks most relevant to them, so they didn’t have to dig through the catalog and try to figure it out for themselves. Making it easy to get started drove up engagement and completion rates.
Takeaway #3: It’s All About Time Allocation
As we started talking about common hurdles, Merav said that perhaps the most common barrier she has seen is time. Managers are often reluctant to support employees spending multiple days in training, even when it is important. While everyone agrees that learning new skills matters, it rarely reaches the top of anyone’s priority list. Worse, even when training is scheduled, employees are often asked to cancel to deal with whatever the issue of the day is. Another important angle is the need for employees to truly dedicate time to their courses rather than checking email throughout. It is important to be present and to pay attention; one suggestion was to treat training like vacation time in terms of how seriously it is protected on the calendar. Finally, the time commitment doesn’t end when a session does. It is important to allocate time to apply the learnings on the job afterward. Without carving out time for this follow-through, new skills won’t stick.
Takeaway #4: Have Program Champions on The Ground
One of the foundational factors Danielle felt must be in place for a successful program is strong program champions within the parts of the organization being targeted by the upskilling effort. These can, of course, include self-motivated champions who love the content and proactively promote it to their coworkers. However, she stressed that it can also involve intentionally and formally assigning champions to take the lead within each part of the organization. This ensures that the programs are promoted and gain the visibility they need.
Takeaway #5: Who’s Sitting at The Table?
For any literacy and upskilling program to succeed, the panelists agreed, the Learning and Development team needs a seat at the table as the plans for the program are laid out. You can’t bring the L&D team in at the end to handle implementation and expect fantastic results. You must bring them in early and let them help guide the planning and strategy from the start.
Takeaway #6: Semantics Matter!
Whether fair or not, the term “training” carries a lot of (mostly negative!) baggage with it. We’ve all had bad experiences in training at some point in our careers. The idea of calling it “ongoing learning” was suggested because “learning” doesn’t carry much baggage and any data literacy and upskilling initiatives certainly need to be ongoing. As organizations try different approaches and innovate in their delivery, changing semantics can help people grasp that something different is available. Of course, you must make sure you really are doing something different and not just rebranding your same old training!
Takeaway #7: Formally Measure Progress
Each of the panelists discussed the importance of measuring the outcomes of your upskilling programs once they are implemented. This can be done via participant surveys and also by direct measurement of job performance — for example, tracking what data is being used within a business unit and who is using it. Those distributions should change if the content from an upskilling program has been absorbed and applied. It is also possible to measure how efficiently the organization makes decisions and creates the analytics needed for day-to-day decisions.
Takeaway #8: Dedicate A Full-Time Employee to The Program
If developing an upskilling program is but one item on many people’s lists, progress can be slow or non-existent. Assigning someone full-time to the effort both signals commitment to the organization and ensures that someone is focused enough to make things happen. Recognizing that an FTE is warranted, getting the position budgeted, and then filling it will make everyone take the program more seriously.
Takeaway #9: Experts Are Both Partners and Customers
One topic that came up was Michelle’s regret that the analytics and data science experts weren’t brought in more as teammates in developing her company’s programs, as opposed to being treated as just another set of customers for them. Your experts know best what analytics are happening across the organization, and they are aware of where their business partners struggle the most. While the experts still need help too, it is a different type of help than the non-experts need. For example, data science professionals likely need help with storytelling and presentation skills more than with technical skills, which they already have in abundance and know how to acquire on their own. The experts can also help deliver the program and secure buy-in from their business partners.
Takeaway #10: Is There a Better Term Than “Data Literacy”?
We finished with a quick discussion about whether “data literacy” is really the best term for much of what companies are pursuing today. One option offered was “data fluency”. The panel wasn’t sure that was any better, and it also implies some minimal level of skill is required to be fluent. “Data culture” was also discussed, but the consensus was that that term applies much more broadly than data literacy alone. At the same time, the argument was made that nobody is really “data illiterate” today; it is all a matter of degree of literacy. In the end, we decided to leave it at “data literacy” for the day and wrapped up a very insightful discussion.
If you’re interested in hearing the entire conversation, you can find the replay here.
Originally published by the International Institute for Analytics