How goals affect training design

This is the second part of a story that started here. We're going to compare the training that results from two versions of a goal:

  1. Salespeople will know all the product features
  2. Mega and mongo widget sales will increase 5% by Q4 as salespeople identify the best widget for each customer

First, we'll rewind the story and give Bob a happier beginning.

1. "Salespeople will know all the product features."

I'm out of the office and someone gives Bob all the money without question. Bob decides to spend it on elearning, because he believes it's the most efficient way to make sure that "salespeople know all the product features."

First, he creates a series of slides that present the features of each widget. He spends hours creating a clickable widget that reveals some feature information. Every five or six screens, he's careful to add a knowledge check.

While a professional narrator records the script that Bob wrote, Bob creates an impressive clickable timeline that shows when each widget was introduced and how its features have evolved since then.

When the narrator sends the audio files, Bob creates an attractive lady avatar to act as a friendly guide through the course. When the avatar speaks, her lips really move.

Then Bob buys a cool game template. He loads it up with questions like "How much does a mongo widget weigh?" and assigns points for the answers.

Finally, he creates an assessment that asks more questions about features.

Bob still has budget left, so he hires a video crew to make a video of the chief of widget sales. The chief talks about how great our products are and how important it is to know everything about them.

Bob puts it all together, starting with the video and then going into the feature presentation, timeline, knowledge checks, and game. He ends with the assessment.

He publishes the course on the LMS and assigns it to all salespeople. He notes with satisfaction that 90% of them pass the assessment on the first try, and the rest pass on the second.

Bob has met his goal. His assessment "proves" that salespeople know all the widget features.

He moves on to his next project, happily unaware that sales haven't improved and my $40k has vaporized.

What happens if we start with the other goal?

2. "Mega and mongo widget sales will increase 5% ..."

Bob goes through the wringer in my office and emerges with only some of the money and a measurable goal:

  • Mega and mongo widget sales will increase 5% by Q4 as salespeople identify the best widget for each customer

He calls his contact in sales, Luisa. She's the person who originally told him about the need for training.

"I need to understand more about what the salespeople are doing wrong," he says. "Can we meet?"

In the meeting, Bob shows Luisa the new goal.

"That makes me nervous," Luisa says. "We can't guarantee that our training will increase sales."

"I think it's best to consider it a goal, not a guarantee," Bob says. "And I think if we target the right behaviors, we could meet it."

Bob and Luisa spend a couple of hours listing what the salespeople should be doing, what they're doing instead, and why they're doing it. They decide that the following problems have the most influence on salespeople's behavior, and they brainstorm solutions.

Problem: Salespeople compete with each other by comparing number of products sold, not income from sales. Since it's easy to sell something that's cheap, they try to quickly sell lots of micro-widgets.

"We could fix that easily," Luisa says. "We'll change what we show on our weekly reports. We'll show the profit that each salesperson brings in, not just the number of products sold. The profit numbers will increase as they sell more mega and mongo widgets."

Problem: Salespeople are paid partly by commission, and the commission is the same for all widgets.

"I'll talk to that Scrooge at the C-level," Luisa says. "Maybe we could make the commission scale with the price of the widget."

Problem: Our main competitor, Weeber Widgets, aggressively markets their own micro-widget, which is much better-looking than ours but offers lower performance. Our salespeople want to sell more of our micro-widgets just to prove that the ugly underdog can win.

"I should start focusing more on market share in my meetings with the sales staff," Luisa says. "We should be competing for all customers, not focusing just on one product. We'll beat Weeber by stealing their market share."

Problem: Salespeople will sometimes tell customers about the features of other widgets, but they don't ask enough questions to learn which features the customer actually needs.

"They need to practice this somehow," Luisa says. "They've been focused on one widget for so long that they've forgotten how to ask good questions."

It's all reduced to one training need

The discussion results in several non-training changes and one opportunity for training: "They need to practice asking good questions."

Bob mulls this over and comes up with a suggestion.

"We don't have a very big budget," he says. "So we can't fly all the salespeople to a face-to-face workshop, and I don't think one workshop would change them, anyway. But we could make online activities that help them practice asking good questions, and then challenge them with roleplays at the quarterly regional meetings. That way, they'll get some initial practice plus regular reinforcement, online and in person."

Luisa agrees, and Bob asks her to list the customer needs that are met by each widget.

Using the list, Bob creates profiles of fictional customers, each with a different set of needs, and Luisa makes sure they're accurate and realistic. Then, using scenario software like Twine or BranchTrack, Bob creates two pilot scenarios in which salespeople have to choose the best questions to ask and, based on the answers, choose the best widget.
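
Tools like Twine and BranchTrack handle this kind of branching visually, but underneath, a branching scenario is just a graph of decision points. As a rough sketch only (the customer, the dialogue, and every name below are invented for illustration, not taken from either tool), one of Bob's scenarios might be structured like this in Python:

    # A rough sketch of a branching scenario as a graph of passages.
    # The customer, dialogue, and node names are all invented.
    scenario = {
        "start": {
            "text": "Dana runs a growing bakery and asks which widget to buy.",
            "choices": [
                ("How many batches do you run per day?", "ask_volume"),
                ("Our mongo widget is our best seller!", "premature_pitch"),
            ],
        },
        "ask_volume": {
            "text": "Dana: 'About 40 a day, and we're expanding.'",
            "choices": [
                ("Recommend the mega widget", "good_match"),
                ("Recommend the micro widget", "mismatch"),
            ],
        },
        "premature_pitch": {
            "text": "Dana leaves. You pitched before learning her needs.",
            "choices": [],  # an ending
        },
        "good_match": {
            "text": "Dana buys a mega widget and reorders next quarter.",
            "choices": [],  # a good ending
        },
        "mismatch": {
            "text": "The micro widget can't keep up, and Dana complains online.",
            "choices": [],  # a bad ending
        },
    }

    def play(scenario, node="start"):
        """Walk the scenario in the console and return the ending reached."""
        while True:
            passage = scenario[node]
            print(passage["text"])
            if not passage["choices"]:
                return node  # no choices left: this passage is an ending
            for i, (label, _) in enumerate(passage["choices"], 1):
                print(f"  {i}. {label}")
            node = passage["choices"][int(input("> ")) - 1][1]

The passages with no choices are the endings, which is where feedback about the outcome of the conversation lives.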

With Luisa's help, Bob creates a two-page PDF that lists examples of the types of questions that are useful for uncovering customers' needs. He adds it as an optional link to the scenarios so people can look at it for help if necessary.

Bob takes photos of some colleagues who look like typical customers and adds them to the scenarios, which are otherwise just text. Bob doesn't have the budget for audio or video, so he worries that people will complain that the production isn't slick enough.

Luisa reviews his scenarios to make sure the conversations are realistic, and she tests them on salespeople.

"They really got into it," she tells Bob. "If we can keep score, I think they'd try even harder, because they're so competitive. And they didn't notice that it was mostly text. They were too involved in the story to care."

Bob assigns scores to the scenario paths, so the player who identifies the right widget after asking the fewest questions gets the highest score.
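
There are many ways to implement that. As a sketch only, continuing the invented Python structure above, the score could reward reaching a good ending after fewer questions:

    def score_path(path, good_endings, max_score=100, penalty=10):
        """Score one play-through: a good ending reached after fewer
        questions earns more points. The numbers are purely illustrative."""
        if path[-1] not in good_endings:
            return 0  # wrong widget, or the conversation went nowhere
        questions = len(path) - 2  # nodes between the start and the ending
        return max(max_score - penalty * questions, 10)

    # One well-chosen question, right widget: 90 points.
    print(score_path(["start", "ask_volume", "good_match"], {"good_match"}))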

Thinking of the longer-term goal — a stronger brand and increased market share — Bob also adds scenes from the future to the scenario endings. These extra scenes show whether the customer came back for more widgets, recommended the company to friends, or complained about a product mismatch.

Meanwhile, Luisa is changing how she talks about market share and how salespeople compare their performance. She also gets approval for a small increase in commission for mega and mongo widget sales.

Once Bob has created several scenarios representing the most common widget customers and a few unusual ones, he embeds the scenarios on one intranet page. Luisa sends the page to regional managers first, so they experience for themselves what their teams are about to do, and then they encourage their teams to play.

Soon salespeople are visiting the page and trying scenarios in their spare time. Some disagree with how certain scenarios play out and say so on the company discussion forum. The debate draws other people into trying the scenarios, and Luisa uses the forum to ask what the scenarios should do instead, which makes people think more deeply about how to match features to needs.

The "training" continues each quarter at the one-day regional meetings. There, every salesperson is given a fictional customer profile that describes what the customer plans to do with their widget, their past experience with widgets, how much they're willing to spend, and other characteristics. Several times during the day, people are randomly paired off and have to identify their colleague's best widget by asking good questions.

As the fourth quarter starts, Luisa calls Bob.

"You can tell that Scrooge that we met our goal," she says. "Mega and mongo sales are up 5%. And as far as we can tell, it looks like we've won some market share from Weeber, too."

What's the difference?

"Salespeople will know all the product features"

This typical goal inspired a forgettable information dump. The course probably refreshed salespeople's knowledge, at least in the short term, but it had no effect on sales because it didn't solve the real problem.

"Mega and mongo widget sales will increase 5% by Q4 as salespeople identify the best widget for each customer"

This goal requires people to do something on the job. It inspired Bob to uncover the real problems behind the request for training. Why aren't salespeople identifying the best widget for each customer? How can we encourage them to do it?

The solution was to make several non-training changes in addition to addressing one highly focused training need: "They need to practice asking good questions."

The training part of the solution helped people change what they did on the job. It gave them online and face-to-face practice, spaced over time.

This story was simplified to keep the focus on my main point: how the goal affects design. For example, in a more complete version, Bob would talk to other people in addition to Luisa. He could meet with a few salespeople who already do a good job of selling mega and mongo widgets to find out exactly what they're doing differently.

Is this really our job?

It's common to think that training designers should, well, design training. If the client says they need a course, our job is to create it.

But imagine that you've gone to the doctor. "I need some antibiotics," you say. "Please write a prescription for 10 days' worth."

"Sure," the doctor says, scribbling on his prescription pad. "Here you go."

Is that a good doctor? Of course not. A good doctor would ask about your symptoms and try to solve the underlying problem.

That's what training designers should do, too. We shouldn't create a course just because someone wants one. We should ask about the symptoms and try to solve the underlying problem, because that's what justifies our paychecks.

The process starts with helping the client set a clear performance goal, like we've seen in this story. Then we identify what the future learners need to do on the job to reach that goal and why they aren't doing it.

This analysis helps us identify what part, if any, training will play in the solution. It also focuses our design on changing behaviors, not just transferring knowledge, because business performance improves when people do things differently.

Want to comment on this? Please join the discussion in part one of the story.

For a quick look at how to phrase a useful goal, see this post.

You might also be interested in this discussion on Will Thalheimer's blog about whether Kirkpatrick-style goals are useful.

This is an excerpt from my forthcoming and currently title-free book on action mapping.

