Monday, August 20, 2012
In my ROI of Coaching consulting practice, I hear this a lot, and of course we can make it easier! In fact, we have worked with several organizations and individuals who were not yet ready for ROI but still wanted to measure the success of their coaching work or program... and you can do the same!
Reaction data can be valuable
Not long ago, I conducted an evaluation of a coaching and leadership development program for a global organization and discovered just how valuable reaction data can be. Judging by the positive word-of-mouth buzz about the program before data collection, everyone assumed it was a great success. Participants talked enthusiastically about the program to their peers. Managers of the participants talked about the improvements they saw in their direct reports. The learning and development manager and the coaches talked about the learning and insights they saw in participants. And when the reaction data was finally revealed, most of it confirmed what we had heard: 100 percent of participants agreed that the program was effective, a good use of time, relevant to their jobs, and a value-add for the time and funds invested, and they said they would recommend it to others. Still, by the conclusion of the program, most participants reported that it had not been important to their success in their jobs.
Where do we go from here?
At first, this would appear to be disappointing news, especially since this item is likely the most important reaction/satisfaction measure. But the organization viewed evaluation as a way to improve the program over the long term. With that in mind, we reviewed all of the data with the participants and captured their thoughts about why the program had not contributed to their success in their jobs. We discovered that they believed the program was great, but that its content was not linked tightly enough to the things they most needed to be successful in their jobs. Based on what we learned, we recommended a change in approach for the following year to create greater alignment between the program's content and the participants' needs.
What did we learn?
This story illustrates that reaction data, though easy to capture, is still valuable, and that systematically captured data provides more useful insight than word-of-mouth buzz. Most importantly, results that are not positive can be used to improve the program.
Want to learn more? Click Here for an Audio Recording about capturing reaction/satisfaction data and learning measures as featured in the article "How to Start Reporting ROI" by Lisa Ann Edwards in the May 2012 issue of Coaching World (pp. 5-7).