Elearning ROI: Can we lead the way?

Want to show that your elearning adds value? Tom Kuhlmann recently suggested these steps:

  • Meet your customer’s expectations
  • Align your projects to real measurable objectives
  • Control production costs

Measurable goals and controlled costs are vital. Of course, we also want to meet our customer’s expectations–but are those expectations always best for the business?

Sometimes we get the best results by challenging our customers’ expectations–in a helpful way, of course.

“Put this PowerPoint online and make it look slick. We need it in a week.”

Many of our customers want us to put lipstick on a pig, fast. Let’s say we’ve got one of those customers. They have a PowerPoint presentation about safety on the shop floor. Here’s a typical slide:

When loading a widget onto the widget rotator, it is important to remember the following:

  • The rotator can’t support mega-widgets.
  • Lift using your legs, not your back.
  • Securely strap the widget to the rotator. Do not use bungee cords.
  • Micro-widgets can be attached using bungee cords.
  • If power fails while a widget is on the rotator, the rotator will lock. Unlock it by pushing the widget release lever forward, then back.

The customer believes that putting the PowerPoint online will reduce injuries on the shop floor by 5%. We obediently apply some lipstick to their PowerPoint and put it online in a week.

We might think we’ve met the criteria for a return on our investment. The customer got the course they wanted, we had a measurable goal, and we kept production costs low. But actually we’ll fail, because a semi-random string of information viewed once isn’t going to change people’s behavior on the shop floor.

We’ve spent our company’s money and our time on something that can never pay us back, because we didn’t challenge the customer’s expectations.

What’s the alternative?

To prove our value, we need to take a more active role in business performance. We need to:

  • Identify a measurable business goal–a change in performance, not a score on a test
  • Design a solution that will actually move us toward that goal–probably not an information dump
  • Measure the effect of the solution–did real-world performance change?

This means we need to challenge the customers who expect us to slap an elearning band-aid on their problem. We need them to see us as people who improve performance, not just people who put information online.

How can we change customers’ perceptions?

We can try to change perceptions by educating our current and potential customers, ideally in an ongoing way (blog? series of emails? mini-presentations?). Examples:

  • Describe how a similar company creatively solved a performance problem
  • Send them links to elearning that actually improved performance
  • Encourage them to periodically ask staff what they need to make their work more effective (future job aids; process improvement ideas)
  • Periodically ask them to forecast their elearning needs (to avoid a state of constant emergency)

When a customer shows up, we can interrogate them:

  • What’s the problem you’re trying to solve?
  • How will you know you’ve solved it?
  • What have you tried so far?
  • Why do you think elearning will help?
  • Why is this long-standing problem suddenly an emergency that needs a solution in one week? (Obviously not the best wording. The point is to learn about the pressures that might be limiting the customer’s perspective.)

“But we shouldn’t put a price on learning!”

Some people argue that education is an investment in the future that can’t or shouldn’t be measured. Some corporate programs, such as leadership initiatives, probably fit in that category.

But is “How to Rotate Widgets” really a lofty investment in the future of humankind? Most of our elearning should be expected at the very least to pay for itself, and we should be able to prove that has happened. Otherwise, we’re wasting energy and money that could have been used to make real improvements.

What do you think? How can we get customers to care more about the business effectiveness of their elearning? How can we get them to see us as performance consultants? Or is it enough to just provide a “return on expectations” and forget about measurable change?


  1. Tom Gilbert made a careful distinction between behavior (what you do) and accomplishment (what gets done). One way to look at it is that behaviors are verbs (rotate widgets, interview applicants, process loans) while accomplishments are nouns (rotates widgets, completed interviews, processed loans).

    How does that help? One way to talk to your client is to start with the accomplishments. This is a lot like “how will you know when you’ve solved the problem?” — the percentage of correctly rotated widgets goes up, or the length of time goes down, or the level of rotation-related injury drops. (That last example probably deals more with the behavior — am I following good safety practice when I’m rotating? — but it emerges from the accomplishment and the standards by which the accomplishment is judged.)

    In turn, this opens the possibility of true problem-solving based on accomplishments. “So, you want to reduce the loan error rate by 25%, and you’d like to shorten the time-to-approve by 20%. If we can develop a form that reduces error, that means we don’t need to train our way around the error.”

  2. sigh… that should have been “…rotated widgets…” as an example of an accomplishment.

  3. Cathy, I LOVE this line:

    “We need to challenge the customers who expect us to slap an elearning band-aid on their problem. We need them to see us as people who improve performance, not just people who put information online.”

    I think the real problem is the lack of effective branding within T&D departments/agencies. If we, as learning professionals, would do a better job of branding what we do as “improving performance and changing behaviors,” I feel confident it would cut down on some of the false expectations that our clients have about what we will do with their content.

  4. Thanks for the comments. I agree that one of the big challenges we face is the tendency to view training as a quick answer to every problem, when other solutions haven’t really been considered. I like the points Dave makes about this.

    As Dave points out, if we were to imitate performance improvement expert Tom Gilbert, we’d ask our customers these questions:

    – Do people have the information they need? (For example, do they know what standards they’re supposed to be meeting?)

    – Do they have the instruments they need?

    – Are there incentive systems that support the performance you need?

    These questions identify improvements that could remove the need for training. If we ask these questions before agreeing to develop a “course,” we’ll reshape our brand–we’ll help customers see us as people who improve performance. We’ll also have a much deeper understanding of our company’s challenges, so when we do create training, it will more likely be effective.

  5. Raj Sekhar says:

    Cathy, well done, and many thanks for your informative and innovative writing.

  6. The problem I have in changing our clients’ perceptions comes down to us not always having ‘evidence’ that performance has improved. We create elearning for clients and receive initial feedback from them, but very little evaluation seems to happen, and if it does, we rarely hear about it. What recommendations would you make, apart from trying to sell our services to perform the evaluation?

  7. Smithy, I’m not sure how to convince clients to evaluate their projects. Surveys I’ve seen suggest that only a minority of companies measure the effectiveness of elearning beyond checking a smile sheet. I think this is due in part to the common habit of creating elearning for broad, vague reasons, such as “Improve customer service.” If the client starts out with a fuzzy goal, they won’t be able to measure anything.

    So I wonder if we can help them by identifying measurable outcomes early in the design. First, we might need to steer the client toward a measurable result for the elearning–something like “Increase customer retention by 8%” instead of “Improve customer service.”

    Once they agree to a measurable goal for the project, we can identify what the learners need to do (not know) in order to reach that goal. These actions will usually be measurable as well (like “Respond to all customer emails within 24 hours”).

    During our design discussions, we could ask the client how they plan to measure the improvement caused by the elearning. If they give a vague response, we could suggest concrete things they can do, based on the goal and actions we’ve identified in our design.

    It’s a challenging situation.