
Varied Practice

Make It Stick starts its chapter on massed practice with an interesting research anecdote about a group of eight-year-olds who practiced throwing beanbags into a bucket. Half the group practiced with the bucket three feet away, and half practiced with buckets two feet and four feet away.

They did this for 12 weeks and were then tested on throwing a beanbag into a bucket three feet away. The subjects who had practiced with the two- and four-foot buckets did much better than the ones who had practiced the actual three-foot task. Interesting!

And counterintuitive. I would have anticipated the opposite, on the strength of the principle that the closer the practice is to the real-world task (or the exam), the better the preparation.

If I wanted to learn a song in a piano book, clearly I’d be better off practicing that song than the songs before and after it in the book. But, thinking more broadly, maybe I would learn the song better if I practiced it in varied keys. And in the long run, perhaps learning a variety of styles of music would help me become stronger in my preferred genre.

In terms of cognitive skills, it’s hard to know how far to take this, and the authors acknowledge that more research is needed. Still, it seems reasonable to me that if you were an aspiring CPA learning how to audit cash at financial institutions (banks), you’d benefit from practice auditing cash at other types of businesses. The risk is that you waste time learning and applying principles and facts that don’t apply to any of your actual clients; the upside, maybe, is that the varied practice makes you a stronger auditor of financial institutions.

Actually, the bigger risk I see is that if you specialize in auditing financial institutions, taking time to practice auditing other kinds of businesses may force you to deal with concepts that are foreign to you, raising cognitive load higher than it needs to be. Cognitive load is not an issue when you are throwing beanbags into buckets, but it is with complex cognitive problem solving.

Thus, a model for using varied practice to learn cognitive skills would have to include guidance on which cognitive variations are useful and which are harmful.


The Testing Effect

I’ve written previously about the Posttest Paradox. Make It Stick, in contrast, speaks of a “testing effect”: the idea that retrieving information from memory (say, for an exam) increases your ability to retrieve that knowledge later.

I don’t like the term “testing effect,” because it implies formal exams. I’ve spent the last few years cautioning my firm that exams have hidden costs and may not be the best way to achieve their objectives, so calling this the “testing effect” undermines that message; in reality, the effect seems to have more to do with practice and application than with exams per se.

Their alternative name, the “retrieval-practice effect,” is a little better, but not exactly memorable.

There’s also a lot of emphasis in the book on retrieval. While ready access to key knowledge matters for problem solving, in the real world people can also look things up. I’d have liked to see more focus on conceptual understanding and the ability to generalize to related problems.

Reflection Is a Form of Practice

The authors of Make It Stick highlight the principle that reflection is itself a form of practice: thinking through a problem is useful rehearsal.

The recently revised learning standards for CPAs put out by the National Association of State Boards of Accountancy (NASBA) stipulate for the first time that classroom learning must be active. For now, the rules for minimum interactivity are, indeed, minimal. To grant formal credit, classes have to include one interaction per hour, and that interaction can be nearly anything, including asking participants to reflect silently for a few seconds on a given question.

Lecturing for 45 minutes and then asking participants to reflect silently on a question posed by the lecturer is not necessarily effective design. But I applaud NASBA both for requiring interactivity and for keeping the requirement open. The more prescriptive the targets, the more pro forma the execution. Keeping it open will certainly result in many developers trying to do the minimum, but it may also help developers take ownership of active learning and try to understand NASBA’s intent.

Quiz Before Teaching

One of the core principles in Make It Stick is that “trying to solve a problem before being taught the solution leads to better learning, even when errors are made in the attempt” (p. 4).

I get it; finding out you don’t know something as well as you thought can make for a powerful learning moment.

On the other hand, I’ve tended to advise course developers to avoid asking learners right/wrong multiple-choice questions before actually teaching the content, arguing that doing so unfairly sets learners up for failure. (Intentional and actionable diagnostic pre-testing is an exception.) Philosophically and temperamentally, I much prefer trying to set learners up for as much success as possible. But I do understand that there are proponents of failure-based learning, and I’m open to the possibility that there are elements here I should add to my playbook.

Controls

I direct learning for a CPA firm. I’m not a CPA, but I feel like I learn a lot from them.

One concept that auditors talk a lot about is controls. Controls are processes, tools, and checkpoints that businesses have in place to guard against error and fraud. For instance, if a large transaction requires the signature of the CFO, that’s a control. Password-protecting critical financial systems is a control.

In short, controls are a concept auditors understand well: they know that businesses with poor controls in place are going to be a lot harder to audit.

I’ve used controls as a way to explain the importance of measuring mastery of learning objectives. When an auditor (indeed, most any professional) is asked to design a course for less experienced colleagues, their default is typically to treat it more like a presentation than a course, including little interactivity and no means for instructors to assess how well learners grasp the material before moving on to the next topic.

One could argue in good faith that it is the learner’s responsibility to learn, and that if a professional is struggling, it is on them to recognize that reality and take steps to address it. In practice, though, that stance puts the firm at risk.

So when I talk about introducing checkpoints, polling questions, and case studies, I sometimes frame them in terms of controls. Without those elements built into the course, we have no way of knowing whether the course was effective (and, more formatively, instructors have no way of knowing whether what they are doing is working or whether they need to try something else).

Auditors know what separates a strong control from a weak one, so this becomes a powerful way to make the case for investing in classroom activities that provide evidence of learning.

Designing Classroom Instruction with a CBT Background

I learned the craft of instructional design building computer-based training. Elearning has limitations not present in the classroom, and vice versa; you learn to design toward the strengths of your medium. I sometimes wonder how designing exclusively for one medium early in one’s career affects one’s ability to design for other media (a crystallized design sense, if you will).

I recently observed a course I didn’t design, and the last section of the class was devoted to student presentations. I was unsure about this; it took up a significant portion of the class, and I always worry whether learners get much out of watching their classmates present. For those reasons, I’ve never really incorporated student presentations into my ID toolkit.

I think it worked, though. The learners were interns at the firm, and creating presentations in teams helped them get to know each other, building potentially valuable connections that fit with the larger goals of the program. It allowed them to go a little deeper into a topic while positively affecting the classroom dynamic.

Anyway, I’ve certainly designed experiences for classrooms that capitalized on the strengths of the medium and would not be easy to replicate in elearning, but I sometimes wonder what blind spots I (or any designer) have when designing for media outside the core of my experience.

Compression: Live Training, Elearning, and Time on Task

I recently learned a term for something I’ve thought about in the past but didn’t know was a thing: compression. Compression refers to the fact that learners take longer to complete a live class than an equivalent elearning. In other words, if a four-hour class is offered both as a live classroom session and as self-paced elearning, participants will complete the self-paced version faster with the same level of achievement.

Apparently (and I didn’t know this either) the rule of thumb for compression is 50%: a four-hour live class can reasonably be turned into a two-hour self-study. The persuasive utility of this, of course, is huge. What leader wouldn’t want their people spending only half as much time in training?

There are some enormous caveats. One is that time spent practicing cannot be compressed, so the more the training focuses on practicing skills, the less compression is possible. It also appears that the elearning has to be text-based to allow significant compression, because people read faster than they talk. The elearning we produce at my firm is audio-based, so one wouldn’t expect very much compression.
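To make the arithmetic concrete, here is a back-of-the-envelope sketch in Python (my own illustration, not a formula from the book or the compression literature) that assumes the 50% rule applies only to the non-practice portion of a class:

    def estimated_elearning_hours(live_hours, practice_hours, compression=0.5):
        """Rough estimate of the self-paced length of a live class.

        Assumption (mine, for illustration): practice time cannot be
        compressed, and the 50% rule of thumb applies only to the
        remaining lecture/presentation time.
        """
        lecture_hours = live_hours - practice_hours
        return practice_hours + lecture_hours * compression

    # A four-hour class with one hour of hands-on practice:
    # 1 + (3 * 0.5) = 2.5 hours of self-paced elearning.
    print(estimated_elearning_hours(4, 1))  # 2.5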

The audio caveat raises the question: should our elearning be audio-based? We use audio for a number of reasons. The most important is that, from a cognition perspective, using the audio channel for narration and the visual channel for complementary information (charts, organizational bullet points, tables, etc.) leads to better learning.

The question then becomes: is the increase in the quality of the learning worth the extra time? Experienced professionals can take in new information very efficiently. If training is focused on them, the increase in learning may be immaterial compared to a time savings of 50%.

It would be interesting to offer the same elearning two ways (audio or text), randomize which one people get, and measure time on task, satisfaction, and exam performance across the two conditions.
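For what it’s worth, a minimal sketch of how that comparison might be analyzed, assuming (purely hypothetically) the results were exported to a file with columns condition, minutes_on_task, satisfaction, and exam_score:

    import pandas as pd
    from scipy import stats

    # Hypothetical export, one row per participant; file and column names are assumptions.
    df = pd.read_csv("elearning_pilot.csv")

    audio = df[df["condition"] == "audio"]
    text = df[df["condition"] == "text"]

    # Welch's t-test on each outcome measure across the two conditions.
    for measure in ["minutes_on_task", "satisfaction", "exam_score"]:
        t, p = stats.ttest_ind(audio[measure], text[measure], equal_var=False)
        print(f"{measure}: audio mean {audio[measure].mean():.1f}, "
              f"text mean {text[measure].mean():.1f}, p = {p:.3f}")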