Four Steps to Measure Impact and ROI in Executive Education

by Jason Pileggi and Jade Vaillancourt
June 1, 2021
“Perhaps what you measure is what you get. More likely, what you measure is all you’ll get. What you don’t (or can’t) measure is lost.” – H. Thomas Johnson

Increasingly, professional education programs are being required, like many other business investments, to show clear and measurable results. In their 2019 study, UNICON reported that 56% of L&D executives think ROI for executive education programs is more critical today than it was in previous years.

2019 Report - ROI on Executive Education, Revisiting the Past and Looking to the Future

Corporate L&D leaders are asking for quantifiable evidence that the professional education programs they are investing in bring forth organizational improvements to justify the cost of these programs and the employees’ time away from their desks.

Like all ROI calculations, it takes considerable thought and effort to determine what should be measured, how, and by whom. Strong collaboration between program sponsors and the professional learning organization is essential to match learning objectives with the organization's strategic goals and to define ways to capture reliable data points that demonstrate a program's success.
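As a reference point, the training ROI formula commonly used in L&D compares a program's estimated monetary benefits against its fully loaded costs. Below is a minimal sketch of that calculation; the function name, parameter names, and dollar figures are illustrative assumptions, and the hard part in practice is estimating the inputs, not the arithmetic.

```python
def program_roi(benefits_usd: float, costs_usd: float) -> float:
    """Standard training ROI as a percentage: (benefits - costs) / costs * 100.

    Assumes monetary benefits (e.g., productivity gains attributed to the
    program) and fully loaded costs (fees, materials, participants' time
    away from work) have already been estimated in the same currency.
    """
    return (benefits_usd - costs_usd) / costs_usd * 100

# Example: $240,000 in estimated benefits against $150,000 in costs
print(program_roi(240_000, 150_000))  # -> 60.0
```

A program that merely breaks even (benefits equal to costs) yields 0% ROI under this definition, which is why credible benefit estimates, agreed on with stakeholders up front, matter so much.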

To help develop a measurable and impactful learning journey for professionals, we recommend thinking through these four steps.

1. Defining the program's learning goals

The first step in developing a program is for the company's program sponsor to define the program’s vision, mission, and goals. Then the professional learning organization can collaborate to translate these program goals into learning objectives and specific skills that participants should acquire during the program, for example:

  • Inclusiveness can include learning how to mitigate biases that arise from heuristics and/or emotions to create an environment where all feel safe to be their authentic selves.
  • Digital transformation covers developing an understanding of the technologies driving the most substantial waves of transformations in the workplace.
  • Communication / Cognitive persuasion would include understanding the target audience, building arguments, and mastering nonverbal communication.

To define learning objectives, look to answer these questions:

  • What are the goals of the program?
  • What are the skill gaps the program should focus on reducing?
  • Is there an existing competency model to build the learning objectives from?

2. Getting buy-in from the organization's key stakeholders

Measuring impact requires capturing data from many sources throughout the organization. Without a commitment from important stakeholders, the information-gathering efforts can fall short, affecting the relevance of the data.

To ensure you have the right people supporting the program and access to the information needed to measure impact, consider these questions:

  • Who is the highest-ranking person involved in developing the program? Have they voiced the importance of the program’s success?
  • Have those responsible for providing information or filling out surveys confirmed their commitment?

3. Choosing which levels of impact to target and identifying which assessment tools to use

A recognized model for analyzing learning effectiveness is Kirkpatrick’s four levels of evaluation. This model breaks down learning impact into four levels: reaction, learning, behavior, and results.

Adapted from Kirkpatrick’s four levels of evaluation model

Ideally, impact should be measured across all levels of Kirkpatrick’s evaluation model. The levels help define what information should be captured and identify which assessment tools should be used.

To choose the levels of impact to focus on, look to answer these questions:

  • What does program success look like?
  • Are you looking to create and measure the impact that goes beyond individual progress?
  • Which metrics are critical in quantifying success?

Once you know what impact looks like, you can determine how it will be measured across the chosen evaluation levels. Tools to assess progress on the selected skills can then be developed. These tools can include self-assessment and 360 surveys, in-session polls, evaluated individual or group projects, and coaching feedback, among other options.

Questions to determine which tools to develop can be the following:

  • Can additional resources be allotted to capturing data from other sources than the participants, like coworkers, supervisors, etc.?
  • How much of the participants’ time can be devoted to providing impact data?

Below are some examples of tools to measure impact at each level:

Level 1 - Engagement. You may consider a participant’s activity in class as a form of engagement. For example, you can capture a participant's “Talk Time” (the number of minutes the participant speaks in plenary and breakout discussions, which is easier to measure in digital environments) as one data point for understanding engagement. Other tools to capture engagement could include program feedback surveys about the overall quality of the learning experience.
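To make the "Talk Time" data point concrete, the sketch below tallies per-participant speaking minutes from a hypothetical session log. The log format, participant names, and minute values are all illustrative assumptions; a real digital classroom platform would export its own schema.

```python
from collections import defaultdict

# Hypothetical log of (participant, minutes-spoken) entries exported from a
# digital classroom platform; names and values are illustrative only.
speaking_log = [
    ("ana", 4.5), ("ben", 2.0), ("ana", 3.0), ("ben", 1.5), ("cam", 6.0),
]

def talk_time_by_participant(log):
    """Sum total 'Talk Time' minutes per participant across all discussions."""
    totals = defaultdict(float)
    for participant, minutes in log:
        totals[participant] += minutes
    return dict(totals)

print(talk_time_by_participant(speaking_log))
# -> {'ana': 7.5, 'ben': 3.5, 'cam': 6.0}
```

A single metric like this is only one engagement signal; it is most useful alongside feedback surveys and attendance data rather than on its own.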

Level 2 - Learning. This level is about measuring participants’ retention of the theory covered in the program. Participants’ absorption of the learning content can be evaluated during sessions through reflection polls and exercises, and during the learning journey through coaching and group projects.

Level 3 - Behavior. At this level, we are trying to identify if participants are applying the skills they have learned to their work. You may want to include quick in-session polls to ask specifically how participants will apply the learning on the job. If you can, gather feedback from coworkers and managers on participants’ application of the learning in “real life”. Participant self-assessments and 360 assessments completed before and after the program can be used to track changes in behaviors at work.

Level 4 - Results. This can be the most difficult level to assess and may require measurement extending far beyond the learning journey itself. At this level, we are looking to evaluate the impact the program has had beyond individuals’ progress. Depending on how “results” are defined, this can mean looking into the program’s impact on business units, the entire organization, and even society. To capture this type of impact, KPIs need to be defined early on, then monitored and updated as needed over long periods of time.

4. Crafting an impactful learning program and committing to a long-term journey

An impactful learning journey needs to go beyond just providing new ideas. For learning to result in long-term desired behavioral changes, it needs to be grounded in the science of learning. Sessions should be designed to allow participants to actively engage with the defined learning objectives. Research has demonstrated the importance of collaborative learning, spaced practice, and frequent feedback in creating engaging and impactful learning journeys (Kosslyn, 2017).

Decades of cognitive and behavioral research show that participants retain what they learn interactively much better than material they receive passively. Participants should be engaged continuously in some form of active participation, which can include polls, breakout discussions, voting, simulation exercises, and other collaborative learning techniques. Rather than focusing on traditional lectures and information dissemination, learning journeys should ensure a collaborative environment that facilitates peer learning and provides real-time formative feedback.

Incorporating deliberate, spaced practice fosters both depth of understanding and the ability to apply learned concepts broadly across multiple contexts. This structured progression, or scaffolding, is grounded in scientific evidence and helps learning journeys build mastery over time. It also allows reinforcement of learning and opportunities to apply knowledge in different professional contexts, giving learners the time they need to digest new skills and apply them in their specific environments.

Additionally, involving organizational leaders and domain experts in the learning journey can lead to greater engagement and more effective application of new skills on the job.

Some questions to consider for this topic include:

  • Does the learning journey fully incorporate active learning?
  • Is the content paced and cross-contextual to ensure learners can apply concepts they learned to actual workplace challenges?
  • Which existing initiatives can be integrated into this program?

With the right partner and commitment, showing impact and calculating the ROI of an executive education program is possible. Following the four steps outlined above and working through a collaborative process will help ensure the program has been developed with the organization’s needs and goals firmly in mind.

References

Kosslyn, S. M. (2017). The science of learning. In S. M. Kosslyn & B. Nelson (Eds.), Building the Intentional University: Minerva and the future of higher education. Cambridge, MA: MIT Press.

Jason Pileggi is the Professional Learning Academic Program Lead at Minerva Project. Jade Vaillancourt is the Program Success Lead of Professional Learning at Minerva Project.
