Are you expected to "train" everyone on new software? In my previous post, I recommended that you first try everything but training. Make the software easier to use (yes, it's often possible!). Create job aids and help screens.
Did that work only partially? Are you convinced that people need formal training? This post is for you.
What you've done so far
As described in the previous post, you've already:
- Set a measurable goal that justifies the existence of the project.
- Listed the specific, observable job tasks that people use the software to complete.
- Identified why each task might be difficult and looked for ways to make it easier.
- Asked for improvements to the software to make it easier to use.
- Created easy-to-find job aids, help screens, and cheat sheets.
- Tested those changes to see if they were enough on their own.
Now you're convinced that people need formal training as well.
Expand your definition of "training"
Your organization might define training as "Everyone goes to a room and is shown how to use the software" or "Everyone takes an online course that walks them through it." In this view, training is a one-time event that's delivered the same way to everyone, regardless of their pre-existing knowledge.
Let's consider two marketing employees who are expected to learn MegaMailer, which sends promotional emails to subsets of customers.
- Kate: In her previous job, Kate used a program called Mail-a-lot to send emails to a database of customers. MegaMailer takes a similar approach.
- Ben: Ben has also sent out marketing emails, but he did it by copying and pasting the recipients' addresses into the TO: field of the email. He's never used a database of customers.
Conventional MegaMailer training would force both of them to sit through a presentation explaining what databases, records, and fields are. But Kate already knows all that. What's a different approach?
"Let Kate skip the stuff about databases," some people would say. "She can start with the presentation about MegaMailer's interface." But what if we go a step further?
We can avoid unnecessary presentations and provide spaced practice if we try this:
- Create self-contained activities that help people learn by doing.
- Make these activities available on demand, on the job. Don't lock them inside a course.
1. Create self-contained activities, not presentations
Consider plunging people into realistic simulations or scenarios in which they complete a task similar to the task on the job.
You could give them a faithful recreation of the software, some simple screenshots to click on, or the actual software, but using fake data (a "sandbox" where people can play safely).
An example activity for Ben and Kate could be: "We're going to send a mailing about the MegaChomper BigBoy toy to all big dog owners. First, you'll create a list segment of all customers who own dogs that weigh more than 15 kilos."
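Under the hood, a sandbox task like this is just a filter over fake customer records. Here's a minimal sketch in Python of what the activity's data and the correct segment logic might look like; MegaMailer is imaginary, and the record fields (`owns_dog`, `dog_weight_kg`) are invented for illustration.

```python
# Hypothetical sandbox data for the practice activity.
# MegaMailer is imaginary; these records and field names are made up.
customers = [
    {"name": "Ana",   "owns_dog": True,  "dog_weight_kg": 32},
    {"name": "Raj",   "owns_dog": True,  "dog_weight_kg": 6},
    {"name": "Mei",   "owns_dog": False, "dog_weight_kg": None},
    {"name": "Tomas", "owns_dog": True,  "dog_weight_kg": 18},
]

def big_dog_segment(customers, min_weight_kg=15):
    """The list segment: customers whose dogs weigh more than min_weight_kg."""
    return [
        c for c in customers
        if c["owns_dog"] and c["dog_weight_kg"] > min_weight_kg
    ]

segment = big_dog_segment(customers)
print([c["name"] for c in segment])  # → ['Ana', 'Tomas']
```

Because the data is fake, Ben and Kate can experiment freely; a wrong filter produces a wrong list, not a wave of real emails.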
Ben and Kate see this activity first, not a presentation about the software. They immediately begin using the software for the same kinds of tasks they complete on the job, but with optional help.
2. Link to basic knowledge instead of forcing everyone to see a presentation
In the "create a list segment" instructions for Ben and Kate's activity, the words "list segment" could be linked. Kate already knows what that means, so she doesn't click the link. Ben isn't sure, so he clicks the link to learn the basics about lists and segments.
3. Provide how-to information as optional help instead of walking everyone through it
People who are already familiar with the type of software will want to plunge in and try it. Others will want a lot of guidance. Make them both happy by providing optional guidance.
For example, when Kate sees that she needs to create a segment of big dog owners, she confidently jumps into the software because she's done it before with another program and suspects it won't be very different. Ben has a lot less experience, so he clicks "Show me how to do it" and sees a short video of the steps involved.
The amount of help could be tailored more finely. For example, Kate might like just a hint showing the first menu item to use. However, Ben might want a lot more help. In addition to the how-to video, he might like a second, more in-depth presentation that explains what a database is, how fields like "DogWeight" were created, how the information about dog weight got into that field, and so forth.
For an example of different levels of how-to help, see this activity for complex medical software, designed by Allen Interactions.
4. Start easy and build skills gradually
Choose simple tasks for the first few practice activities. In our imaginary example, creating a list segment is the first step to creating a mailing, and it's also one of the easier steps. Maybe we'll have Ben and Kate practice creating a few more segments before they move on to the more complicated step of using an HTML template to create the content of the email. This is a type of scaffolding.
For an example of in-activity scaffolding, see if you can learn Zeko. The story reinforces vocabulary you've learned so far while adding new terms.
5. Provide realistic feedback, if possible
Strong scenarios and simulations don't stop you and say "Incorrect!" They just show you what happens as a result of your decision, and you conclude from that how well you did.
This can be tricky with software simulations, especially if the tasks are complex, with lots of ramifications. So this might be too much for your project, but for our imaginary marketing scenario, feedback might look like the following.
- Kate is supposed to send the mailing about the MegaChomper BigBoy toy only to customers whose dogs weigh more than 15 kilos. When she creates the list segment, she incorrectly tells the software to send the email to all customers except those whose dogs weigh more than 15 kilos.
- Instead of saying "Incorrect!" we show the natural consequence of her mistake: Owners of tiny dogs complain about annoying emails that advertise toys that their dogs can't even pick up.
When should we show the feedback? That depends.
If Kate is just practicing list segmentation, we could show it immediately. She creates the segment, and we flash-forward to show the future result.
If she's further on in the activities and is practicing the entire mailing process, we can withhold the feedback until the end. This is especially useful if our process includes a check step. Maybe the process looks like this:
- Create the list segment.
- Choose the correct HTML template.
- Enter the content of the email in the template.
- Double-check the list segment to make sure it's correct.
- Schedule the email for sending.
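Kate's mistake is essentially an inverted filter, and step 4 works because spot-checking the segment against the intended rule exposes that inversion before the send. A hypothetical sketch (the segment logic and records are invented for illustration):

```python
# Hypothetical sandbox records; the field name is made up for illustration.
customers = [
    {"name": "Ana", "dog_weight_kg": 32},
    {"name": "Raj", "dog_weight_kg": 6},
]

def kates_segment(customers):
    # Kate's bug: 'not' inverts the intended "more than 15 kg" rule,
    # selecting everyone EXCEPT the big-dog owners.
    return [c for c in customers if not c["dog_weight_kg"] > 15]

def double_check(segment, min_weight_kg=15):
    """Step 4: verify every member of the segment matches the intended rule."""
    return all(c["dog_weight_kg"] > min_weight_kg for c in segment)

segment = kates_segment(customers)
print(double_check(segment))  # the check fails, flagging the error before the send
```

In the activity, a failed check is Kate's cue to fix the segment; a skipped or passed check leads straight to the natural consequence shown as feedback.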
This gives Kate a chance to recognize and fix her earlier error, and it also has her practice the entire process.
If she doesn't catch her error, the end result will be complaints from annoyed owners of small dogs, plus an optional explanation of what she did wrong. If she does catch her error, she can fix the segment before sending the email, and she sees the happy consequence of lots of MegaChomper sales.
Make the activities available on demand, on the job
In our example, we've created several standalone practice activities. Each one is self-contained because it links to supporting information. It's not an activity trapped in the middle of a presentation.
As a result, people can try the activities as they need them. Maybe all the activities are linked on an intranet page. We can (and should) show a recommended path through the activities. But people can still directly access an individual activity.
This is especially useful for reinforcement. Let's say that Ben carefully worked through all the activities about list segments and using the HTML template. He was then put on a project that involved creating lots of HTML emails while someone else created the list segments.
Two months later, Ben needs to create a list segment but has forgotten how. He goes to the bank of activities and chooses some list-segment activities to practice again. Once he's confident, he creates the segments he needs for his current project.
I'm not just making this up
This activity-driven approach might make intuitive sense, but intuition can't always be trusted. Luckily, there's also research that supports the plunge-them-into-it technique.
Again, you'll want to provide structure, such as a recommended path through the activities, and carefully increase the difficulty with scaffolding. You want people to feel competent, not frustrated.