As the former vice president of learning and development for McDonald’s Corp. and leader of a high-performing learning team, I learned a valuable lesson, mainly in retrospect: Don’t overengineer your learning analytics strategy.
I say that because for nearly a decade, my team and I continually challenged ourselves to get better and better and better to achieve results that made us “the best of the best.” Now, I can’t help but think there might have been a better approach.
We knew we needed a robust measurement and analytics strategy not because anyone was asking but because we wanted to show the value of our enterprisewide learning investment. We used traditional Kirkpatrick and Phillips measurement models as our guide. Then we built a chain of evidence to demonstrate impact:
- Level 1 Reaction – check. We consistently used Level 1 surveys and smile sheets, which even predicted on-the-job application.
- Level 2 Learning – check. We purchased a learning management system and retooled our curriculum, building in pre- and post-knowledge assessments.
- Level 3 Behavior – check. We developed manager feedback channels so we could measure learning application.
- Level 4 Results – check. We put measures in place to evaluate the business results from our capstone class. We set up our dashboard, scorecard and monthly, quarterly and annual reports. All of this took substantial resources to execute and maintain, so that we could actually document impact and measurable results — in a few years.
Without question, the feedback we received helped us improve our curriculum and “sell” revised content to our owner operators and leaders. But in hindsight, we could have taken a more efficient and effective route to demonstrate the effect of our learning investments.
In my current role as a coach and adviser to learning leaders, I see the need for a different approach — one that creates stronger buy-in and ensures the analytics put in place are the right ones to produce the desired outcomes.
I was recently discussing this topic with friend and colleague Stacey Boyle, a successful human capital analytics business owner who has been in the learning analytics space for almost 20 years. I first met Stacey 10 years ago when I was looking for guidance on how to better understand analytics. Over lunch, we discussed the areas where we both see learning leaders continuing to struggle, and we agreed that measurement and analytics is a biggie.
Stacey said her customers are still trying to build a chain of evidence with traditional models, and they’re not embracing more modern and effective analytic approaches. It was clear to us both that learning analytics needs to be easier to understand, absorb and put into action.
We also agreed that to be an effective learning leader today, it's important to be able to assess the value of learning investments. Yet there is so much information published on learning analytics — much of it by wonderful experts — that it can be overwhelming or overly analytical. I personally felt that way as a new learning leader, as do many of my clients.
At its core, when developing an analytics plan, it’s critical for a learning leader to first answer the question “why” — why do you want to measure? This is in contrast to the “how” that too many learning leaders typically start with.
The purpose of measurement is — or should be — to solve business problems and drive strategic decision-making; it should never be done solely as an exercise to collect a laundry list of measures needed for a training evaluation model such as Kirkpatrick’s or Phillips’ models. The “why” needs to start with the stakeholders’ needs and expectations: What is the purpose of the existing or new learning investment? What do you expect to accomplish with the learning intervention? What does success look like?
Whether you’re a new learning leader wanting to establish a learning analytics strategy, or an experienced one looking to refine your current strategy, you must establish why you want to measure and which problems you want and need to solve. If you jump straight to the details of the how and get lost there, without clearly mapping and understanding the why as a starting point, the final outcomes are unlikely to meet organizational needs.
Diana Thomas is an executive coach and the CEO and founder of Winning Results. Comment below, or email firstname.lastname@example.org.