Written by

Jessica Farquhar Campbell
December 13, 2018

Keeping Yourself Honest with the Right L&D Metrics


Anyone who wants to tap learning and development to close gaps needs a strategy to get things done: a plan that outlines a future state using measurable outcomes and proves that L&D is the right tool. Learners have strategies, whether they know it or not, and leaders of business teams large and small have strategies. Now it's time to ensure that chief learning officers and learning teams design learning strategies that move business metrics and help you define success.

To determine whether you’ve got the right numbers, try this checklist on for size.

First question to ask yourself: Is there a partnership between the learning and development team and a sponsor from the business who is (a) responsible for driving top goals and (b) invested in the value of learning?

If not, go back to square one. Ask yourself why not; then fix it. Learning cannot impact the business without that partner. And learning will not be successful without impacting the business. A sponsor is the person who cares about the numbers; they will need to be armed with hard proof that learning is working so they can continue to sponsor the strategy. That accountability alone will do a lot to keep you honest.

I have heard L&D professionals claim they know what's best for the learner or the business to justify working in isolation. Sure, you might be an expert. But you cannot drive toward outcomes that only you care about. Not for long.

That’s why the Kirkpatrick Training Evaluation Model has four levels—to cover what the learner wants, what their leader wants, what the L&D department wants, and what the business wants. To keep yourself honest, your L&D strategy should address all four levels.

Often overlooked, the second question is, do you know what the learner wants?

The most basic evaluation, the Reaction Level in the Kirkpatrick model, is at its core a customer satisfaction survey, so your scores will be more favorable if the training targets something the participant has identified as important. So ask up front, while you are still crafting your strategy: How do you hope to develop this year? What types of training would you like to engage with? A business sponsor comes in handy here because he or she may already have data about what people in the organization want, such as engagement or other survey responses (another way to track success if they are collected regularly). If so, great; carry that into question three. If not, and a survey or interview series is necessary, attaching a sponsor's name to the ask will increase the response rate.

The findings of this effort will determine at least one measure of success. Say that leaders in your company overwhelmingly want to learn how to think more innovatively. Then that becomes a metric you aim to hit. You can measure it through an innovation index taken before and after training or through self-assessment.

Accounting for the voice of the customer is not the be-all and end-all, unless your learning group operates more like an innovation or process-improvement team and calculates only a Net Promoter Score.

The third question is where it gets a little philosophical: What is learning?

The business and the learning experts need to be aligned on what the participants in the training or development programs will gain. Think of this step as identifying the noun, the main content in your strategy. Whether on a small scale or enterprise-wide, you need to identify what’s important for people to know. If the top priority at your company over the next 12 months is adding a new line of business, you might want to start to educate your people on aspects of this new reality that will affect them.

How do you measure success? Quiz them before and quiz them after on the desired knowledge. Report out the change to show progress. Aggregate the data if you are doing multiple rounds. This is your typical classroom stuff, teaching to the standards at its finest. So be sure to use best practices! No true-false questions, please. The difference from the baseline will be telling. And since you have your sponsor it will also be business relevant.
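The pre-and-post comparison above can be sketched in a few lines. The quiz data and field names here are hypothetical, a minimal illustration of aggregating knowledge gain across rounds rather than any prescribed tool; in practice the scores would come from your LMS export.

```python
# Minimal sketch: aggregate pre/post quiz scores across training rounds.
# All data below is hypothetical example data.

def knowledge_gain(rounds):
    """Return average pre score, average post score, and the gain between them."""
    pre = [r["pre"] for r in rounds]
    post = [r["post"] for r in rounds]
    avg_pre = sum(pre) / len(pre)
    avg_post = sum(post) / len(post)
    return avg_pre, avg_post, avg_post - avg_pre

rounds = [
    {"round": 1, "pre": 58.0, "post": 81.0},
    {"round": 2, "pre": 61.0, "post": 79.0},
    {"round": 3, "pre": 55.0, "post": 84.0},
]

avg_pre, avg_post, gain = knowledge_gain(rounds)
print(f"Average pre: {avg_pre:.1f}, post: {avg_post:.1f}, gain: {gain:.1f} points")
```

Reporting the difference from the baseline, rather than the raw post-training score, is what makes the number meaningful to your sponsor.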

Question four is all about the verb.

What will your participants do differently after going through your training? And how do you quantify it? A trusting, ongoing relationship with your business partner will ensure you have access to folks on the job in order to observe their behavior when it makes sense to step in and measure (probably not the day after training; negotiating this timeframe is an important part of planning). This can be as simple as a tick sheet—how many times did this behavior occur before training and after?
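The tick sheet described above reduces to counting a target behavior per observation window before and after training. This is a hypothetical sketch, with made-up behavior names and counts, just to show the shape of the comparison.

```python
# Minimal sketch: compare tick-sheet counts of target behaviors observed
# before and after training. Behavior names and counts are hypothetical.

def behavior_change(before, after):
    """Per-behavior change in observed frequency, after minus before."""
    return {b: after[b] - before[b] for b in before}

before = {"asks open questions": 3, "gives actionable feedback": 1}
after = {"asks open questions": 9, "gives actionable feedback": 6}

for behavior, delta in behavior_change(before, after).items():
    direction = "up" if delta >= 0 else "down"
    print(f"{behavior}: {direction} {abs(delta)} occurrences per observation window")
```

Keeping the before and after observation windows the same length is what makes the two counts comparable.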

Most important, if trainees are unable to change their behavior, you can be there to consult on why that might be. Are there environmental factors, leadership issues, or something to improve in future iterations of the training?

Finally, the ultimate success measure is, what does the business need? Question five is dependent upon that business sponsor and understanding what is important to her.

Even if financial performance is top of mind for your senior leaders, there are ways learning can impact those special, seemingly out-of-reach priorities. Balanced Scorecards are a useful framework for connecting foundational pieces like talent and technology with higher-order objectives such as financial growth targets. They usually connect through goals that flow from Financial through Customer, then Process, down to People-focused goals. If your company doesn't have a Balanced Scorecard, you can still apply this framework, which means you might end up hooking your strategic goals onto a Process metric (such as waste reduction) since that's the closest quadrant to learning and therefore the easiest to impact. Then, as long as that Process goal affects a Customer goal, which supports a Financial target, you have indirectly influenced the bottom (or, in this case, top) line.

Whether you use a balanced scorecard approach or not, truly understanding the success metrics your business leaders and customers are obsessed with is a necessity, and having that conversation can be very grounding for learning groups who often feel like they live on the periphery.

Question six: Are you confused yet, maybe even a little scared, by what you are finding? Good. If not, you haven't been digging deep enough. (Go back a question or two, or three.)

There will certainly be tension among the desires of the groups you've consulted. The participants, their leaders, and senior leaders don't always agree on what's important. Throw in the opinion of the learning designer, and things might feel a little messy. That's great! And you don't have to pick just one to please.

I think about a “Women in Leadership” program that I recently conducted: some thought the purpose should be to advance more women in the company, some thought it should teach high-potentials how to be better leaders, and others thought it should highlight women's unique strengths as leaders. When I surveyed the women invited to participate, I learned that many of them didn't have ambitions to enter the C-suite.

So, was it my job to pick different participants? To teach them ambition? Or give them projects that would hone their leadership abilities and build on their strengths? Yes, yes, and yes. Reconciling these ideas into one cohesive strategy took time, testing, and creativity. Ultimately, we invited men to participate and turned the program into a leadership session about gender (not just women). Because we realized that the lack of women at the top of the company was a problem if we wanted to better reflect our customer and have richer conversations in the C-suite. And that women alone weren't going to effect that change.

In this case, the reaction score was a little less important than advancement numbers since the session was basically a two-day-long uncomfortable conversation (although women tended to demonstrate a lot more satisfaction right after these sessions than they did after a session in the previous model where they often expressed feeling like they were being told to change in order to succeed).

These workshops were also very action-oriented, so by keeping the participants in groups for several months after the in-person part, we were able to check on whether their behavior changed. And they held each other accountable. Case studies and storytelling helped demonstrate the program’s success.

And, by the way, my business sponsor helped me prioritize success metrics and make difficult choices about content and participants because her opinion carried a lot of weight.

Question seven: Are you having fun yet?

No strategic plan is complete without an element of play or exploration. Sure, the data can create a really nice plan for you if you've gathered it well and matched up the right points. It will tell you what to put in your strategy, point to a logical timeline, and ensure that “everyone is happy.” But there is an imperative to leave room for creativity, for the truly human element that the learner will only experience if the designer felt it.

To paraphrase Robert Frost, no fun for the designer, no fun for the learner.
