Leadership development firm The Soderquist Center reversed declining enrollment in one of its signature programs by quantifying its benefits, and learned more about itself in the process.
Amid the economic downturn and shrinking corporate budgets, CLOs are likely to witness a decline in resources dedicated to their learning solutions. C-suite demands for greater accountability and documented ROI make this scenario especially unsettling. Justifying your organization’s existence is never fun.
If you find yourself in this situation, you should ask yourself the following questions: First, how can you, as a learning organization, provide greater value to your customers (buyers) and demonstrate it convincingly? Second, how can you convey to the executive level that your programs are mission-critical?
Wendy Soderquist-Togami, director of emerging leaders at The Soderquist Center, a values-based leadership development firm located in Arkansas, faced this situation and answered these questions to great effect. The Soderquist Center serves a diverse group of organizations, from local nonprofits to Fortune 500 companies, and recently experienced declining enrollment in one of its signature programs, Milestone, a retreat-style, experiential leadership development program designed to equip high-potential leaders with the tools necessary to succeed.
Before making any operational changes, Togami wanted a more effective way to communicate Milestone’s value to customers, so she chose to assess the learning function using hard, objective data.
“We believed [Milestone] was our flagship program and had put a great deal of effort toward that end, but found we were not able to communicate the ROI effectively,” Togami said.
With this goal in mind, she launched a comprehensive measurement and evaluation program for Milestone and other programs that comprise the firm’s core portfolio of products. This was an innovative and perhaps risky move. Like other small to midsize learning firms, The Soderquist Center lacked the necessary internal resources to launch an outcomes-based measurement initiative. Implementing such a program, however, was crucial to achieving operational excellence and increasing market share.
The measurement plan has exceeded expectations, delivering actionable data to both stakeholders and customers. While it is difficult to isolate the initiative’s unique impact on the Center’s bottom line, measurement has had the unambiguous effect of driving up enrollment in Milestone. Even this year, with the recession in full swing, all of the program’s sessions are fully booked.
Togami said this is “due in part to the operational and communication changes we have been able to make because of this tool [the measurement program].”
The Academic Connection
Nearly half of learning executives polled in a recent Chief Learning Officer Business Intelligence Board survey said their organizations do not have a relationship with a university. Of those that do, 62 percent are not satisfied with the corporate-academic relationship.
The Soderquist Center was ahead of the curve in this regard, having tapped academic assets early on and built a lasting partnership. Togami engaged the services of a professor and consultant from John Brown University to gain access to subject matter expertise in systems design, measurement and analytics. (The Soderquist Center is housed on the university’s campus.)
According to Togami, this was an important decision because it “provided expertise not available to us in-house and gave us yet another opportunity to partner with the university, [which has] been overwhelmingly beneficial for us.”
The consultant was tasked with designing the conceptual framework of the program; creating and aligning metrics, analytics and process; and making interpretations. The consultant continues to provide recommendations to decision makers and, in collaboration with internal stakeholders, oversees the retooling of metrics and process.
Throughout the consulting relationship, The Soderquist Center has maintained management oversight of the learning function, holding the consultant accountable for timely and actionable inputs and analytics. Internal stakeholders establish benchmarks and create criteria of success. The consultant complements this as an equal partner when decisions about design, implementation and choice of analytics occur. One of the lessons learned is that for such a relationship to work, the academic consultant must be capable of operating in a corporate environment and possess some industry-specific knowledge.
Making a Commitment
Measurement involves the use of financial and human resources. Because of budget implications, it is crucial to secure high-level buy-in before implementing a measurement process. Without it, measurement programs risk being under-resourced, making success difficult.
Andy Wilson, CEO of The Soderquist Center, was initially skeptical of the organization’s measurement program. However, after he understood the need for measurement and saw results — especially in helping him communicate ROI to customers — he became an enthusiastic supporter of the initiative.
A research-based argument and compelling evidence were the prime movers here. Having Wilson and others on the senior leadership team, including founder Don Soderquist, on board was critical to the success of the initiative. During the design phase, senior leaders asked probing questions, especially related to measurement ROI. When it came to implementation, they created momentum and helped ensure compliance from the director level down.
Many CLOs find it difficult to convince their CEOs to champion measurement initiatives. There is encouraging evidence, however, that the market for metrics is open. According to a recent ROI Institute poll, only 8 percent of CEOs surveyed reported measuring learning and development in their organizations; yet, 96 percent said they should.
To be persuasive, you must demonstrate how measurement creates a positive bottom-line impact and how it aligns with the organization’s strategic priorities. This is where industry publications come in handy. There are countless case studies and research articles attesting to the efficacy of measurement programs. Use them to your advantage.
During the start-up phase of Milestone’s measurement program, designers solicited input from the directors of each organizational unit — including marketing, business development, customer experience and customized solutions — to help define the goals and metrics for each learning solution. This input was critical for achieving measurement functionality, as well as director-level buy-in.
At The Soderquist Center, oversight of the measurement function is located at the director level. This has been valuable for maintaining institutional memory, ensuring alignment between organizational units and creating sustainability in the measurement function.
Measure What Matters Most
When designing a measurement and evaluation system, the timeless adage “garbage in, garbage out” is especially apt. Measurement efforts fail not only because of poor execution but also because of poor planning. Investing time, effort and financial resources in the design process increases the chances the process will deliver actionable information.
The Milestone program was evaluated on multiple dimensions, including design, delivery, satisfaction and learning. Unfortunately, measuring learning transfer is notoriously difficult, and for many learning solutions, translating findings into bottom-line results is nearly impossible. At The Soderquist Center, measurement designers devised an intuitive yet powerful way to estimate program outcomes and learning impact.
Many of the desired outcomes of Milestone, such as ethical leadership, defy easy conversion to numerical metrics, making estimation of business impact difficult, if not impossible. To get around this problem, for each Milestone session, evaluators assessed the extent to which desired learning outcomes aligned with observed behavioral changes in clients. They presented these findings to key stakeholders, using success metrics and anecdotes to make their case. The design and implementation process is straightforward.
First, executive and director-level stakeholders determined the program’s objectives, ranging from overarching goals to specific learning outcomes. Second, metrics were aligned with goals and submitted to an internal review process. An inclusive instrument was then created, piloted and sent to the field. Third, to gauge learning impact, after 90 days, each Milestone participant was resurveyed using metrics that assess behavioral change due to program participation. For each learning outcome, participants were asked how their behavior or outlook had changed after attending Milestone. The analytics include a mixture of quantitative and qualitative (e.g., client stories) measures, providing a comprehensive and textured picture of program impact.
Despite the pitfalls of using self-reported data, this approach is a cost-effective, unobtrusive way to assess the behavioral impact of learning, especially for an outsourced learning firm that cannot enforce participant compliance.
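To make the mechanics of such a follow-up survey concrete, here is a minimal sketch of how 90-day responses might be aggregated per learning outcome. The outcome names, the 1-to-5 rating scale and all figures are hypothetical illustrations, not The Soderquist Center’s actual instrument or data.

```python
from statistics import mean

# Hypothetical 90-day follow-up responses: each participant rates the degree
# of behavioral change on each learning outcome (1 = none, 5 = major change).
# Outcome names and values are invented for illustration only.
responses = [
    {"ethical_leadership": 4, "team_communication": 5, "decision_making": 3},
    {"ethical_leadership": 5, "team_communication": 4, "decision_making": 4},
    {"ethical_leadership": 3, "team_communication": 4, "decision_making": 5},
]

def impact_by_outcome(responses):
    """Average the self-reported change rating for each learning outcome."""
    outcomes = responses[0].keys()
    return {o: round(mean(r[o] for r in responses), 2) for o in outcomes}

print(impact_by_outcome(responses))
```

A summary like this would then be paired with the qualitative side of the analytics, such as client stories, to give stakeholders both the numbers and the narrative behind them.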
Use the Data
The Soderquist Center is committed to gaining a return on every part of the learning investment. For the Milestone program, this occurs in three areas:
Strategic: The results were shared with the advisory board, the arm of The Soderquist Center involved in strategic direction decisions. Through a combination of empirical data and compelling stories, the board engaged the data and discussed strategic implications. These were opportunities to celebrate successes and identify future needs and directions and have informed efforts to integrate Milestone into other areas of customer relations, connecting the program to the larger enterprise.
Operational: The data enabled the delivery teams to identify trends and high points in customer experiences and have been used to retool a percentage of the delivery.
Marketing: Quotes and testimonials were used in marketing initiatives, including placement on the Soderquist Center Web site. The Center also uses empirical data in assessment conversations with new and potential customers and stakeholders. Business development personnel at The Soderquist Center find that having access to solid data is necessary to sell the Center’s services convincingly and demonstrate ROI.
Keep It Simple
Simplicity breeds sustainability. Do not compromise execution by building an overly complex measurement process that cannot be easily implemented. At The Soderquist Center, measurement and evaluation are integrated into each learning solution, with data collection and analytics largely automated. Facilitators follow a measurement algorithm to ensure the process occurs in a timely, systematic fashion; an LMS is planned for the future.
While oversight remains at the director level, implementation is delegated down the line, becoming the responsibility of each program director. At the end of each Milestone session, client data is forwarded to those responsible for analysis. This has helped build internal capacity for maintaining measurement efficacy while fostering a culture of measurement throughout the organization.
Building upon the success of measuring Milestone and Soderquist’s other signature program, the Ethical Leadership Summit, the Center is in the process of extending measurement to the enterprise level. For CLOs wishing to pursue measurement, the lesson is simple: Start small. Measure and evaluate a core group of learning solutions, build internal capacity and leverage your experiences to make the case for further funding. Once the decision is made to expand the measurement universe, draw upon what you have learned and deliver superior results.
For the innovative CLO, measurement should be viewed as more than simply an effort to prop up sagging sales. Rather, it is an important investment in the future of your organization. If properly conceived and executed, measuring learning solutions can return long-term strategic dividends.