The perennial question asked of CLOs and others involved in learning and training by CEOs, CFOs and other C-suite executives is, “What return do I get from my investment in learning?” Using the representative sample provided by the more than 200 learning professionals who are members of Executive Networks, we set out to see how well we could answer the question.
The research question was a simple one:
“Does your company have measure(s) of return on investment in learning that are accepted by your senior leadership as valid, rigorous and supportive of further investment in learning? If so, please tell us briefly what they are, and we will contact you to learn more about how you measure ROI on learning.”
We received responses from more than 50 companies that said they do not have measures of ROI in learning that meet the above criteria and 11 companies that believe their ROI measures for learning do meet them. Responses from five companies are printed in detail below and include what we learned in our follow-up inquiries. (The detailed responses are anonymous to protect the identities of the companies that provided this information.) The remainder of the participants will be included in a future Business Intelligence feature in Chief Learning Officer magazine.
1. Large Merchandise Company
1. No one in the C-suite buys or respects the way that training groups have defined or calculated ROI.
2. The term ROI should be retired in favor of more realistic terms like business impact, strategic contribution or relationships to business outcomes.
3. Learning teams should be looking at ways to demonstrate their value so that the ROI question stops being asked. It would also be interesting to find out whether other soft functions (marketing, staffing, employment practices, asset management, IT infrastructure, customer care, mergers and acquisitions) are asked to prove their worth in terms of ROI.
4. There are better metrics to focus on — Boudreau and Ramstad, Laurie Bassi and others discuss frameworks like ROIC (beating cost of capital), return on time, value creation, predictive modeling — all of which are more constructive than what has been used thus far.
We do not discuss, and have never discussed, training ROI with senior leadership. Their only request is that we get the most out of our investment, and we prove that by ensuring a clear and indisputable link to critical tactics and strategies, mastery of content by learners, voice-of-customer metrics on knowledge and service, correlation and time-series studies of learning impact, predictive logistic regression analysis on training execution, and business impact.
2. Insurance Company
We did an ROI study on training and development a couple of years ago, shortly after attending an EDA program with Jack Phillips. I think it was more to prove that we could do it than a demand from our senior leadership. We used a sample of insurance claims adjusters. After several months of working with the data, we were able to do the analysis. We found that our adjusters who went through development were able to settle claims for 13 percent less than similar claims handled by adjusters who did not go through the same development. That has the potential to translate into millions of dollars of savings. An interesting part of our study is that we compared our ROI analysis using “hard criteria” with an estimation model for ROI developed by Dr. Lyle Spencer. We found that the estimation approach was actually more conservative and underestimated the ROI when compared to the “hard criteria.”
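The arithmetic behind a study like this is straightforward, and readers who want to try it can sketch it in a few lines. The calculation below is a hypothetical illustration only; every figure except the 13 percent difference is invented, and the function name and inputs are our own, not the company's.

```python
# Hypothetical sketch of a "hard criteria" ROI calculation for a
# claims-settlement study. All inputs except the 13 percent reduction
# are illustrative, not the company's actual data.

def training_roi(avg_claim_control, pct_reduction, claims_per_year,
                 trained_adjusters, program_cost):
    """Return (annual_savings, roi) for a training program.

    avg_claim_control: mean settlement by untrained adjusters ($)
    pct_reduction:     fractional reduction for trained adjusters (e.g. 0.13)
    claims_per_year:   claims settled per trained adjuster per year
    trained_adjusters: number of adjusters who went through development
    program_cost:      fully loaded cost of the program ($)
    """
    savings_per_claim = avg_claim_control * pct_reduction
    annual_savings = savings_per_claim * claims_per_year * trained_adjusters
    roi = (annual_savings - program_cost) / program_cost
    return annual_savings, roi

savings, roi = training_roi(
    avg_claim_control=8_000,   # illustrative average settlement
    pct_reduction=0.13,        # the 13 percent difference reported above
    claims_per_year=300,
    trained_adjusters=50,
    program_cost=500_000,
)
print(f"Annual savings: ${savings:,.0f}, ROI: {roi:.0%}")
# → Annual savings: $15,600,000, ROI: 3020%
```

Even with conservative inputs, a modest per-claim difference multiplied across thousands of claims is what turns into the "millions of dollars" the respondent describes.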
The issue I have with your question is that we do not calculate ROI on a regular basis. It is not asked for by our executives, and because most of our databases are transactional, it is difficult and time-consuming for us to do this type of analysis.
We do some other types of metrics with our programs. For example, we have good pre- and post-tests for our manager school that can demonstrate skill acquisition. We can also show how the school improves management proficiency. We are redesigning a number of our leadership development programs to provide more information about employee performance to their managers so that this information can be incorporated into succession planning. This is viewed as very positive by our managers and senior leaders. This is also the type of metrics they want us to spend our time on developing and delivering.
3. Information Technology Company
When I ran the last Global Leadership Development Program in my previous company, we included action learning projects designed to address business-critical issues. We set out an anticipated ROI to make the program cost-neutral, i.e., to cover 100 percent of the program cost. For example, if the program cost $400k to run, then we were looking for the projects to deliver revenue-generating activities in excess of $400k.
A company takeover happened before we had the opportunity to put the project recommendations into practice, so we never completed the evaluation. But I like the idea and would potentially use it again.
I know this isn’t exactly what you are looking for — this doesn’t address the issue of ROI on learning, but I thought it might just be interesting to you.
4. Insurance Company
Since we have accountability for the development of the leadership pipeline, we try to build in a variety of “measures” depending on the particular platform built for the pipeline segment.
For example, in our “aspiring leader program,” we use the “Leader of the Future” 360 survey at the beginning, have participants pick their area of development (one to two behaviors), and then do a “mini-resurvey” six to nine months out. The group composite data has always shown evidence of growth.
Additionally, one of the components of our development program (a 12-month blended learning experience for new managers) has a simple end-of-program assessment in which we go out to the managers of the new managers and ask, “Has your new manager increased their effectiveness as a result of their participation in this program?” We aim for 80 percent of responses at “agree” or above.
5. Consumer Electronics Company
The closest we came was our metrics with Ft. Hill using Friday 5s. We had a 10-week execution period with individual learning goals for our leadership curriculum and seven sessions of a high-potential leadership forum. Most leaders were very impressed, even though the data was self-reported. A few weren’t convinced that the numbers were “hard” enough, but they still liked seeing any application and results. Here’s what we used for the forum:
Indicate ONE Goal Category:
• Cross-Functional Collaboration
• Execution of Strategy
• Great Place to Work Issues
• Business, Financial Acumen
• Navigating Change
• Other Leadership Behaviors
Indicate ONE Business Leadership Metric:
• Growth — impact on new markets, products, services
• Profitability — impact on SGA, gross margins
• Invested Capital — impact on inventory, A/R, PPE
• Leadership Performance — any improvement that has increased effectiveness
Each participant worked on learning goals applying to their real work and then self-reported stages of completion, the estimated annual impact of their accomplishments for each metric, and what percent of the impact was directly related to application of what they learned at the forum.
We even asked them to estimate and quantify the impact of improvement in their own leadership effectiveness. Each session had a virtual wrap-up with one of our business heads and each team reported out their results.
We compiled the results of all sessions by project and metric to report back results of $149 million directly attributed to learning at the forum:
• Growth Goals — $74 million attributed/$354 million total
• Invested Capital — $437,000/$3.8 million
• Leadership Performance — $32 million/$179 million
• Profitability — $41 million/$60 million
• TOTAL — $149 million/$598 million
Still not perfect, but much better than just evaluations, and most participants really “got” and thanked us for the focus on application.
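The roll-up itself is simple: each participant reports an estimated annual impact and the percentage attributable to the forum, and the program team sums both columns by metric. A minimal sketch of that aggregation follows; the sample entries and field layout are invented for illustration, not the company's actual reports.

```python
# Minimal sketch of aggregating self-reported Friday 5s results by
# metric. Each report carries a total estimated annual impact and the
# percent the participant attributed to the forum. Sample data is
# invented for illustration.

from collections import defaultdict

reports = [
    # (metric, total_impact_$, pct_attributed_to_forum)
    ("Growth", 2_000_000, 0.25),
    ("Profitability", 500_000, 0.60),
    ("Growth", 1_200_000, 0.10),
    ("Invested Capital", 300_000, 0.40),
]

attributed = defaultdict(float)  # impact credited to the forum, by metric
totals = defaultdict(float)      # total reported impact, by metric

for metric, impact, pct in reports:
    totals[metric] += impact
    attributed[metric] += impact * pct

for metric in sorted(totals):
    print(f"{metric}: ${attributed[metric]:,.0f} attributed"
          f" / ${totals[metric]:,.0f} total")
```

This is the same attributed/total split the bullet list above reports, just computed from individual goal records instead of hand-tallied spreadsheets.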
There were a few other elements that supported these metrics even happening:
• Participants worked in peer learning teams with meetings during the whole process.
• After each progress update on Friday 5s, we sent follow-up emails with bar charts and comments on teams’ progress and copied senior leaders.
• After the 10-week execution phase, each team presented their accomplishments at a virtual wrap-up session attended by a business head.
In Part II, we’ll go over the rest of the respondents, which include an airline, a consumer electronics provider and a finance firm.