The scenario is familiar: Renewed emphasis on accountability has triggered the completion of several impact studies for major learning and development programs. Reaction, learning, application, impact, ROI and intangible data are collected. One significant program has attracted the interest of the senior management team. The study is now complete, and there is a need to communicate the data to several audiences, including the senior management group.
For key stakeholders, the best way to present this first impact study is through a face-to-face meeting. So many issues must be explored and so much is riding on the outcome that a face-to-face meeting is about the only way to cover a full agenda. The good news is that the executives will usually come—some because of support for the program, others out of curiosity, skepticism and perhaps even cynicism. These executives have not seen this level of analysis connected to learning or the specific ROI data for a particular learning solution. Consequently, they will invest an hour to understand the results.
In this critical meeting, issues must be explored, impressions made and conclusions drawn. This may be the only opportunity to explain the methodology and assumptions utilized to measure the impact of major programs. This provides an opportunity to explain the conservative approach and the credibility of the overall process. If the executives do not buy into the process, they will not buy into the data.
This is an excellent time to gain support for measurement and evaluation, and convince the executives that this type of activity is critical for setting priorities, improving and managing the function properly. This is also an opportunity to get buy-in for the data in the report and support for the conclusions drawn from it. Ultimately, these meetings enable the learning staff to build more respect for human resources and the value it can and does add to the organization.
Finally, this meeting provides an opportunity to secure commitment for action to make changes based on the recommendations from the study. Almost every study will have issues and problems, and will need adjustments. This meeting can build the support necessary for needed changes.
Now for the bad news: This meeting is not going to be easy. It is perhaps one of the most difficult challenges for the learning staff. The audience will be made up of important executives, and a new process or methodology will be revealed. More importantly, a specific program will be judged. To be successful, great content and skills are needed. For the most part, the executives may resist the meeting: They are busy, and learning is not always their top priority. However, on a positive note, they are curious about what is being presented.
The Ground Rules
Several basic ground rules are essential for success:
- Do not distribute the impact study until the end of the meeting: An advance copy provides an opportunity for executives to flip through key issues. They almost always go to the ROI tab and, regardless of the numbers, have concerns, questions and issues. If the number is high, they will not believe it and will ask pointed and penetrating questions. If the ROI is negative, they will drill the presenter with questions about what went wrong. Either way, you lose control of the meeting before you get started.
- Be precise and to the point: This is not the time for small talk, unrelated stories or unnecessary anecdotes. Be organized and focused throughout the presentation, spending precious little time on unimportant issues and more time on items critical to the success of the meeting.
- Avoid jargon: Executives get turned off easily by jargon—particularly jargon in the human resources development (HRD) area. Keep to the business language at hand. Executives may be unimpressed with terms such as “value proposition,” “return on people” and other words that may be unknown to the group.
- Spend less time on lower levels of evaluation data: Executives care little about reaction data or anecdotal accounts. Although they recognize the value of qualitative data, they are more interested in quantitative data focused on impact and ROI.
- Present the data with a strategy in mind: Plan the presentation around a specific strategy. This leads to the suggested format.
The presentation should cover 12 discrete data sets, presented effectively and efficiently. Some will require no more than a minute or two, while others will require more detail:
- Describe the program, and explain why it is being evaluated: Describe the scope, magnitude, costs and overall importance of the program. Explain why the program should deliver value, and consequently, why it is a candidate for this type of analysis.
- Present the methodology process: Executives should quickly see that a step-by-step, logical process was used to collect, process and report the data. They need to know that this systematic process is used each time. Explain the standards or assumptions—they form the basis for credibility of the data. The assumptions used in the analysis should be presented quickly and with an illustration provided for at least some of them.
- Present reaction and learning data: Only one slide is needed to show the reaction to the program, using data items such as importance, relevance and usefulness. Only a very brief presentation of learning data is needed, showing the extent to which employees have acquired or enhanced skills.
- Present the application data: One or two slides should be sufficient to show how the employees are utilizing the program as planned. It shows the progress made with implementation.
- List the barriers and enablers to success: Attention quickly turns to the barriers: the issues that kept this program from being more successful. The message: if they are removed or minimized, additional results can be obtained. The enablers, the opposite of the barriers, contributed to the documented success. The message: if more attention is placed on the enablers, there will be more success.
- Address the business impact: This is part of the presentation that executives want to see: the business measures that have been influenced by the program, including how much they changed and over what time period. Describe the technique used to isolate the effects of the program on the data. Executives quickly see that this program is only one of several factors driving this success. Show the conversion to monetary value.
- Show the costs: At this point, the fully loaded costs are presented. Only one slide is necessary. It should be emphasized that both direct and indirect costs are included. Ideally, the finance and accounting person in the group should be asked to comment on the cost categories. Previous involvement of the accounting function is critical. At the very minimum, the accounting and finance staff should buy off on the cost guidelines to show what is included in the report.
- Present the ROI: This is what your audience has been waiting for! The ROI calculation is shown along with the data included in the formula. Executives are reminded that this calculation is essentially the same as the calculation for other investments—earnings divided by investment. If the numbers are extremely large, all of the previous data points and the conservative approach will help bring credibility to this particular value.
- Show the intangibles: All of the intangible benefits are presented. These measures have been excluded from the conversion to monetary value. They are listed only if they are linked to the program in some credible way. Intangibles are important. If the ROI is extremely positive, the intangibles provide a sense of value added. Also, the intangibles may help to overcome the impact of a negative ROI.
- Review the credibility of the data: List eight to ten key points, mostly drawn from the assumptions, and check off the reasons the study is credible. This point-by-point checklist helps validate the process with the executives.
- Summarize the conclusions: A quick summary of the data is provided, including a one-page overview handed out immediately following the presentation. Executives are reminded that this one-page format is what they will receive for other programs in the future. It is important for them to understand the data sets and what they mean.
- Present the recommendations: Every study leads to changes or improvements. Specific suggestions should be offered, ranging from a few minor adjustments to elimination of the program.
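The arithmetic behind the impact, cost and ROI steps above can be sketched with a short worked example. All figures here are hypothetical, invented purely for illustration; only the structure — isolate the program's effect, convert it to money, fully load the costs, then compute ROI as earnings divided by investment — follows the conservative approach the article describes.

```python
# Hypothetical worked example of the impact-to-ROI arithmetic.
# Every number below is invented for illustration.

# Step 1: business impact, isolated to the program. A conservative
# adjustment multiplies the attributed share by the confidence in
# that estimate, so the claimed benefit shrinks rather than grows.
annual_improvement = 240_000.0   # monetary value of the measured change
isolation_estimate = 0.40        # share attributed to the program
confidence = 0.80                # confidence in that attribution
program_benefits = annual_improvement * isolation_estimate * confidence

# Step 2: fully loaded costs, direct and indirect.
costs = {
    "design_and_development": 15_000.0,
    "delivery_and_materials": 20_000.0,
    "participant_time": 22_000.0,  # salaries and benefits while in training
    "evaluation": 3_000.0,
}
total_costs = sum(costs.values())

# Step 3: ROI, computed like any other investment:
# net benefits divided by investment, expressed as a percentage.
roi_percent = (program_benefits - total_costs) / total_costs * 100

print(f"Benefits: ${program_benefits:,.0f}")  # Benefits: $76,800
print(f"Costs:    ${total_costs:,.0f}")       # Costs:    $60,000
print(f"ROI:      {roi_percent:.0f}%")        # ROI:      28%
```

The confidence adjustment in step 1 is what makes the value defensible in the meeting: each layer of uncertainty reduces the claimed benefit, so the final ROI errs on the low side.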
After the presentation, the complete report is distributed. The detailed report includes all of the appendices, raw data, data collection instruments and other items necessary to understand exactly what was accomplished and how it was conducted. Some executives will examine it in more detail or hand it to one of their assistants to review—particularly if they are somewhat skeptical or concerned about the data. The report reminds executives of the resources required to produce the study. It is helpful for these executives to understand resource requirements so they will utilize this level of analysis very selectively in the future.
Call to Action
Three important conclusions must be drawn from the meeting. The first is that executives need to approve the recommendations from the study and implement the changes needed in the future.
Second, the use of impact and ROI analysis in the future should be discussed. Executive input is needed to determine which programs are appropriate for this type of analysis. Executives often have concerns about certain programs and will provide input on which programs need evaluation at this level.
Last, additional support for learning and development should be secured. Ask for support for these programs. Explain what the executives can do to make the programs successful in the future. This discussion may lead to some interesting changes in perceptions and stimulate more involvement in learning.
Jack J. Phillips, Ph.D., is chairman of the ROI Institute. He developed and pioneered the ROI process and has written more than 15 books on the subject. Jack can be reached at email@example.com.