How can anyone argue against the foundational idea of measuring the impact of organizational learning and development to determine its value and improve subsequent decision-making? It’s for the good of the organization, after all.
What often gets lost in discussions of analytics, a field that spans measurement, evaluation, assessment, needs analysis, return on investment (ROI), behavioral impact, retention and skill gain, are the answers to these questions: What is the purpose of the measurement effort? Why evaluate a training program? Why measure the ROI of an organizational initiative? Why measure the increase in skills? Why measure the retention of learning? In fact, why measure anything?
One measures, of course, to inform decision-making. The purpose of any measurement is to provide meaningful, objective and accurate information that facilitates those decisions. The context and type of decision drive the measurement effort. This is why it is imperative that the CLO speak a language that is not only understandable, but also meaningful, to the CEO.
To do this, CLOs must keep in mind the decisions the CEO needs to make. One way to look at this is to use the concept of the "business case": the organizational reasoning that supports applying the organization's resources to a measurement effort.
The Importance of the Business Case
Measurement specialists have proffered training analytics as a package of breakthrough business-measurement solutions. They tout the merits of training evaluation, ROI calculations and needs assessments. But how useful are the resulting data to your business? It depends on the business case.
The organizational plan for applying measurement results to decision-making is the key to using training analytics. To work, the plan must tie into business strategy. It also must actively involve, and have support from, the organization’s stakeholders—including the CEO. The business case for the measurement effort should be clearly articulated and understood by all parties. Unless such a business case exists, how do you know what decision your measurement effort is going to impact? More importantly, the business case approach forms a foundation for successful communications with the CEO.
Here are three examples that show how a solid and clear business case of interest to the CEO can be used in the application of training analytics:
John Deere, a major North American agricultural manufacturer, experienced skyrocketing warranty costs. A CEO-sponsored investigation revealed that the cost of parts replacement for warranty-covered repairs soared significantly beyond projections. The apparent cause was inadequate problem diagnosis by service technicians. If the first part replaced didn't fix the problem, another part would be replaced, and so on, until the repair was complete.
To address this issue for the CEO, the CLO and his team deployed a diagnostics certification course for service technicians. However, course penetration into the target population (service technicians) was poor because dealership management wasn’t convinced of the value of paying for and sending service technicians to the four-day course.
The CLO needed objective information to provide to dealership management to inform their decision-making. He commissioned an ROI training evaluation study, which found that the first-year ROI on the time and cost of sending a service technician to the course, measured in reduced repair cycle time and reduced parts replaced, was 134 percent in the United States and 155 percent in Canada. Only the repair-cycle-time portion of the ROI (38 percent in the United States and 41 percent in Canada) was shared with dealership management, whose interest lay in quicker repairs and the increased customer satisfaction they bring.
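ROI figures like those above are conventionally computed as net program benefits divided by program costs, expressed as a percentage. A minimal sketch of that calculation follows; the dollar amounts are hypothetical illustrations, not figures from the Deere study:

```python
def training_roi(benefits: float, costs: float) -> float:
    """Standard training ROI: net benefits as a percentage of costs."""
    return (benefits - costs) / costs * 100

# Hypothetical per-technician figures (not from the Deere study):
course_cost = 2_000          # tuition, travel and wages for the four-day course
first_year_benefit = 4_680   # value of faster repairs and fewer parts replaced

print(f"First-year ROI: {training_roi(first_year_benefit, course_cost):.0f}%")
# prints "First-year ROI: 134%" -- inputs chosen to mirror the U.S. figure
```

Note that a 100 percent ROI means the program returned double its cost in the first year; this framing is what lets a CLO compare a training investment against other uses of the same dollars.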
The end result of this measurement effort was a significant increase in the percentage of service technicians completing the certification course. The business case for this measurement effort was to provide analytic information to convince dealers of the value of sending service technicians to training.
In another example, the CLO of Teledyne Brown Engineering, a major government contractor, was convinced that implementing e-learning would save costs, reduce time away from the job to attend training and increase the amount of training available to company employees who were dispersed geographically. However, the CEO and his executive team weren’t convinced that “potential costs saved” warranted an investment in an e-learning system.
The CLO proposed a pilot e-learning program that would be evaluated in terms of ROI and employee satisfaction. The study findings showed that there were significant bottom-line benefits to the program. The average net return on investment per person was 121 percent. The study also showed that the unique characteristics of the Web-based training contributed substantially to employee satisfaction with, and use of, the e-learning courses.
When these numbers were presented to the CEO, he approved company-wide implementation of the e-learning system. The business case for this measurement effort was to provide analytic information to convince the CEO of the utility of implementing an e-learning system.
In another example, APAC Customer Services, the largest call-center operation in the United States, had more business than it could handle. In the mid-1990s, APAC’s CEO and his executive team wanted to reduce the time it took to train new employees from three days to two days. After all, the company would rather not pay an employee to attend training if it had the option of paying them for production work, where an external client foots the bill. Of course, this only works if the new employee can successfully perform the job.
To address this last point, the CLO asked for a week to determine how to cut training time from three days to two. He tested five separate two-day new-employee orientation (NEO) training designs and measured performance on the first and fifth days following training. The "best" design produced a first-day performance average of 60 percent of incumbent performance. (The historical first-day average following three days of training was 70 percent of incumbent performance.) By the fifth day, new employees performed at incumbent levels regardless of whether they had attended two or three days of training. Officials chose to implement the new two-day NEO program company-wide.
From the CEO's perspective, this was a "no-brainer" decision: cut training time by one-third at the cost of a 10-percentage-point drop in first-day performance, with no change in the time it takes new employees to reach competency-level performance.
In each of these examples, there was a clear and convincing business case for using organizational resources to implement analytic measurement protocols. And in each example, it is evident that good communication patterns existed between the CLO and the CEO—at least on the specific issues of interest.
So, how does the CLO stay current with the CEO-level issues that analytic information could positively impact?
The “balanced scorecard” approach can help. In a 1992 Harvard Business Review article, business measurement and strategy experts Robert Kaplan and David Norton stated that “what you measure is what you get. Senior executives understand that their organization’s measurement system strongly affects the behavior of managers and employees.” Kaplan and Norton also said that top leadership rarely thinks of measurement as an essential part of their strategy, and that the balanced scorecard approach is a way of linking measurements to strategy, thus focusing employee behavior on what matters most.
What's important here is that the balanced scorecard approach is a top-down-driven process. The metrics that end up in a balanced scorecard are those that are useful to senior-level executives. Smart CLOs will participate in the development of their organizations' scorecard and then align their actions to positively impact relevant scorecard metrics. The result is a recurring communication opportunity with the CEO and the executive team, conducted in language that is understandable and metrics that are relevant to other executives.
Many years ago, an accountant acquaintance commented that obtaining the numbers was the easiest part of his job. The harder task was convincing stakeholders of the accuracy, reliability and validity of those numbers. The numbers themselves would not be used for management decision-making until the stakeholders “believed in” the numbers, he said.
The same is true when applying analytics to learning and development. Good use of analytics assumes that the numbers produced are accurate, reliable and valid. However, it is not enough for the CLO to be confident of the accuracy, reliability and validity of the analytics. The more important issue is whether the CEO and the executive team believe in the numbers.
This issue first surfaced when I conducted a major ROI study for an expensive engineering training course. As an evaluation expert with experience in research design and analysis, I had high personal confidence in the accuracy, reliability and validity of the numbers.
However, the resulting 299 percent ROI figure was met with disinterest, if not outright disbelief. The CFO stated that in his experience, 15 percent to 20 percent ROI was a good result for a capital investment (implying that 299 percent was so far out of the norm, it was unbelievable).
For the next ROI study, two process changes were made, and both of them involved bringing other people from the organization into the ROI-calculation process. First, the C-level team was provided with some background education about the measurement of learning and development. Second, the finance department was co-opted into producing the numbers and calculations. The result: When the training evaluation study revealed a first-year net ROI of 350 percent, the CFO himself informed the CEO of the organizational value of the training.
The lesson learned: Numbers alone will not necessarily motivate action. Make sure you are dealing with numbers in which the decision-makers believe.
What to Remember
Effective communication between the CLO and CEO contributes greatly to an organization's success. This is especially true when training analytics is involved. A clearly stated business case for a measurement effort ensures that the measurement results directly address decisions that matter to the CEO and the executive team. A balanced scorecard approach keeps the CLO's metrics aligned with what those executives already measure, while involving stakeholders in producing the numbers ensures the results are perceived as accurate, reliable and valid.
Remember, numbers alone are not enough. The CEO/executive team must believe in the numbers before they are willing to act on them.
Larry N. Long, Ph.D., is national sales director for ACT's Workforce Development Division. Formerly the head of ACT's Training Evaluation Services department, he has more than 25 years of experience leading training and development for major U.S. corporations and government organizations. He can be reached at firstname.lastname@example.org.