20 Things I’ve Learned from 20 Years in Learning Measurement


by Bonnie Beresford

There’s something about a new decade that naturally makes us look back and reflect. My own reflection began when a chief learning officer recently asked me whether his training was having an impact on the business, a question that took me back to the start of my measurement journey 20 years ago. Today, with the benefit of 20/20 hindsight and dozens of exceptional clients who invited me to help them advance their measurement practices, I offer my reflections on 20 things I’ve learned over 20 great years.

No. 1: Be curious.

Curiosity about your business, how it’s doing, and how to improve it naturally leads to the need for measurement. Let’s face it — the reason we measure is to answer questions (Did I run faster today? Do students like my course? Did my program improve customer satisfaction?). At a recent CLO Exchange, a learning leader said his success with measurement came back to “starting with the business questions you’re trying to answer.”

No. 2: Many organizations want to measure but don’t know where to begin.

It’s tough to jump right into measurement without having a vision for where you want to go. So, start by setting goals for measurement. Next, evaluate your measurement readiness by examining your current state in terms of data, skills and tools. Finally, envision your future state, build a plan and set priorities. Best advice: Start small and get some quick wins.

No. 3: Measure to prove; measure to improve.

This has become my personal mantra and, I hope, my legacy. Whether you’re measuring basic Level 1 results or running a big business impact study, it’s great to prove that students liked your training or that key metrics went up. But being open to learning from what didn’t work is even more powerful.

For example, a recent business impact study of an experiential sales training program showed that sales went up for attendees. Bravo! The team dug deeper and learned that all the gains were coming from average performers while high performers saw no improvement at all. This triggered revisions to the invitation strategy for future events, catering more to average performers than the stars. While the initial measurement “proved” the benefit, the real gain came from insights that improved the program to realize even greater impact.
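The “dig deeper” step in that story amounts to comparing pre- and post-training results by segment rather than reporting one blended average. A minimal sketch of that analysis, using invented sales figures and segment labels purely for illustration:

```python
from collections import defaultdict

# Hypothetical attendee data: pre- and post-training sales by performer
# segment. All numbers and labels here are made up for illustration.
attendees = [
    {"segment": "average", "pre": 100, "post": 112},
    {"segment": "average", "pre": 95,  "post": 108},
    {"segment": "high",    "pre": 150, "post": 151},
    {"segment": "high",    "pre": 160, "post": 159},
]

# Group the sales gains by segment instead of pooling everyone together.
gains = defaultdict(list)
for a in attendees:
    gains[a["segment"]].append(a["post"] - a["pre"])

for segment, deltas in gains.items():
    avg_gain = sum(deltas) / len(deltas)
    print(f"{segment}: average gain {avg_gain:+.1f}")
```

With data like this, the blended average looks positive, but the segment view shows the gain is concentrated among average performers, which is exactly the kind of insight that drives a program revision.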

No. 4: Remove the fear.

If people think measurement will be used punitively, you’re sunk. Collaboratively craft a statement of intent for measurement (see example below) that articulates how you will use measurement and the value you expect it will bring.

Sample Statement of Intent

The ABC Academy will use learning and performance analytics to
continuously improve the quality and relevance of the solutions it
provides to its learning audiences.

The ABC Academy will build a culture that incorporates measurement
thinking and evidence-based practices into all it does.

Team members will be much more open to the idea of measurement if they know it will be used to learn and improve — not to punish poor results.

No. 5: Show that you live by your statement of intent.

The first time you get unsatisfactory measurement results, use it as a moment to demonstrate your commitment to use measurement positively, not punitively. One leader showed this when a project manager dug into a poor Level 1 result to discover that an unintended audience was required to take her web-based training. They were the ones giving the low scores. The PM quickly removed the requirement for that audience. Rather than punish the mistaken audience assignment, her learning leader publicly praised her for being learner-centric, paying attention to the data, uncovering the root cause and fixing it.

No. 6: Get your house in order and run learning like a business.

This means mastering the basics — the operational and efficiency metrics. This is the first gate for measurement. It tells you the health of your business. You need to understand your LMS data (audience size, throughput, credentialing rates, number and age of courses, Level 1 scores, etc.). Build an operations dashboard and trend how you’re doing. Mastering this data sets you up to pursue more rigorous measurement.

No. 7: Share your operations dashboard with your L&D team.

Your team needs to know what’s important and how things are trending. They should participate in operational wins and collaborate on improvements when trend lines head south. We all know “what you measure is what you get.”

No. 8: Make your Level 1 results more actionable.

You’ve seen them — the Level 1 reports listing all your courses and scores. Ho hum. What are you going to do with this data? How do you know where to dig deeper?

I’m often asked what a good Level 1 score is. My response is always the same: What’s a good score for you? Budgets, platforms and such vary between organizations, and all influence the quality of training. A 4.3 on my Level 1s may not be the same as a 4.3 on yours. So, benchmark yourself! What’s “good”? Check out where you are today — it’s a starting point. Start by simply saying the top one-third of your courses are great. What’s the cut score to make it into the top third? That’s your target! You could do the same for the bottom third to identify your troubled courses.
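Finding that top-third cut score is a simple ranking exercise. A minimal sketch, using made-up course scores (the course names and values are illustrations, not benchmarks):

```python
# Find the Level 1 cut scores that mark your top third and bottom third.
# These course scores are invented for illustration only.
course_scores = {
    "Onboarding 101": 4.6, "Sales Skills": 4.4, "Compliance": 3.9,
    "Leadership Basics": 4.2, "Excel Intro": 4.5, "Safety Refresher": 3.7,
    "Coaching Skills": 4.3, "Product Knowledge": 4.1, "Time Management": 3.8,
}

ranked = sorted(course_scores.values(), reverse=True)
third = len(ranked) // 3

cut_score = ranked[third - 1]    # lowest score still in the top third
bottom_cut = ranked[-third]      # highest score in the bottom third
```

Here any course scoring 4.4 or above makes the top third (your “great” target), and anything at 3.9 or below lands in the bottom third, flagging your troubled courses.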

If it feels like you’re splitting hairs between a 4.33 and a 4.34 on the common 5-point scale, get a better distribution by using a promoter score where 5s are promoters and 1s through 3s are detractors (check out Bain’s Net Promoter Score to see the math). Now you’ve got a 100-point scale. Try setting great at 70 and good at 50, and see how your courses rate. Then take action.
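The promoter-score math is just percentage of promoters minus percentage of detractors, as in Bain’s NPS. A small sketch, assuming (per the text) that 5s are promoters, 1s through 3s are detractors, and 4s are neutral; the ratings are invented for illustration:

```python
# Promoter-style rescoring of 5-point Level 1 ratings:
# % promoters (5s) minus % detractors (1-3), with 4s counted as neutral.
# The result ranges from -100 to 100.
def promoter_score(ratings):
    promoters = sum(1 for r in ratings if r == 5)
    detractors = sum(1 for r in ratings if r <= 3)
    return 100.0 * (promoters - detractors) / len(ratings)

ratings = [5, 5, 4, 5, 3, 4, 5, 2, 5, 4]   # ten invented learner responses
score = promoter_score(ratings)            # 5 promoters, 2 detractors
```

A course that averages a bland 4.2 on the 5-point scale can land anywhere on this wider scale depending on how polarized its ratings are, which makes differences between courses much easier to see.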

No. 9: Don’t collect data if you aren’t going to use it.

Let’s face it, after a course has been around six months or so, are we really still analyzing Level 1 data for that course? Numerous learning leaders have confessed that they don’t pay much attention after the first few months. So why do we keep surveying our learners and collecting data we are never going to use? We are wasting their time. For example, one learning leader did some back-of-the-napkin math and determined that in the past 12 months, his organization had wasted 11.57 man-years of student time taking surveys no one was analyzing. They immediately adopted a policy to turn off surveys once they’ve gathered sufficient responses.

No. 10: Engage business partners in your business impact measurement projects.

Why? They’re the ones pulling employees off the job to participate in your training, so they really do care if training is improving performance. And they’ll be impressed that you care too. Plus, to measure business impact, you need business data. Guess who knows what business metrics matter and where to get that data? Your business partners. Working together to prove and improve the impact of training makes your partnership even stronger.

No. 11: You don’t need perfect data to measure something; “good enough” often is, well, good enough.

Even if your data situation isn’t perfect, it doesn’t mean you can’t measure. There will always be some errors, noise and gaps in your data. Unless you have reason to believe it’s all garbage, measurement can still give you an indication of where you stand and how your metrics are changing over time. (And if people can see how their data is getting used, they might be encouraged to clean it up!)

No. 12: Understand your organization’s business.

Nothing can derail a results presentation faster than showing your ignorance of the business. Imagine the lost credibility an analyst experienced when she presented flawed key findings from a sales training program study. She excitedly shared her discovery that “E” franchisees consistently outsold “A” franchisees. She was promptly told that those categories are explicitly assigned based on prior year sales volume and her findings weren’t “findings” at all. The rest of her presentation was largely dismissed.

No. 13: Tailor your results report to the audience.

Different people are moved by different types of evidence. Some will latch on to the numbers, but others need more of a story to be swayed. Have the details in your back pocket, but don’t overwhelm your key message with those details just because you find them interesting. Chances are most audiences only want to see “the answer.” It’s best to know in advance so you can hit the mark.

No. 14: Good data visualization conveys messages.

People interpret pictures faster and better than they do words or tables of numbers. You want your audience to get the point so their mental effort is spent on discussing implications rather than trying to interpret a complicated pie chart. Great presentations (and not just those showing measurement results) are visually stimulating and include things like crisp graphics, a poignant photo with a single data point or a bar graph with little icons of cars or people (versus, well, just bars).

No. 15: Become a good storyteller.

It doesn’t matter how great, insightful or interesting your results are if they’re not communicated well. Figure out the storyline. What business question are you answering? What message do you want your audience to remember? Where appropriate, gather qualitative evidence as well — the voices of the learners, program developers and instructors can help put a human face on the metrics and findings.

No. 16: Have a point of view on what your results mean.

After all, you’ve analyzed it, stayed awake nights thinking about it and pondered the implications of what you’ve learned. You now have informed insights. Don’t stop with simply reporting results. You’ll gain a lot more respect and credibility when you also come forward with what it means to the business. Be bold and share your insights and recommendations.

No. 17: Business impact does not necessarily mean ROI.

Most business stakeholders don’t know how to interpret your ROI of 137 percent. They do know how to interpret “sales went up 3.5 percent for trained salespeople.” Save yourself the ROI math and focus on the metrics that matter to the business, like sales volume, turnover rates and customer satisfaction scores.

No. 18: A good data analyst is worth their weight in gold.

Learning measurement involves human capital data. By its nature, it can be messy. Add your training and HR data to the business data coming from various systems across the company and you’ve got a pile of files that hold secrets. Unless you have a data analyst who can clean it, integrate it and begin interpreting it, you are nowhere. A good analyst — one who is great with data and understands human capital — can elevate the way a learning team thinks about their work and its impact on the business.

No. 19: Outside help can be very beneficial.

There are a few critical times when it’s especially useful to have outside help on your measurement journey: coaching through the development of your strategy, upskilling your team and supplementing your team to execute rigorous business impact studies. You will learn a lot from those who have done this before. Don’t hesitate to seek help. It will get you there faster.

No. 20: Measurement is not a project; it’s a mindset.

Once you’ve started getting insights from measurement, there’s no going back. The level and depth of your questions become more sophisticated. Your insights become, well, more insightful. Your recommendations become more evidence based. Indeed, measurement becomes a mindset.
