Category Archives: Attention

Cell Phone Lure

Our smartphones distract us even when they are turned off, and that distraction decreases our ability to concentrate.

This is certainly something that instructors in both school and professional training classrooms have to deal with. The mere presence of your cell phone makes you stupider by distracting you with the lure of potential social connection.

It’s tough because, at least in many professional settings, you can’t tell participants to leave their phones behind. It would be better anyway, perhaps, to help people understand the effect and develop metacognitive strategies to focus. But how?


Wellness: Return on Investment

I was challenged recently about whether the food and breaks at our internal learning conferences are conducive to learning from an energy management standpoint.

That challenge seems reasonable: however well or poorly a course is designed, the physical state and alertness of learners should affect learning. So, I’m reading about it.

One incidental thing that was interesting to read: apparently wellness programs at work return, on average, $5.50 for every dollar spent. That’s impressive, and it appears to come from a pretty good meta-analysis*. The building my firm is in just installed a free gym downstairs, and I get a lot of use out of it; I hope I earn my firm back that kind of return on whatever they are paying.

*Frustratingly, the meta-analysis is behind a huge paywall. I could only find a brief, so it is hard to know what all the assumptions and limitations are.

Optimal Length of Instructional Video

Someone recently pointed out this research to me. Prof Guo believes, based on data collected from thousands of educational videos served on EdX, that six minutes is the optimal length for an educational video. People are willing to watch for six minutes, but for every minute beyond that, average viewing time drops, and not just as a percentage of the video: the total number of minutes watched goes down. If learners see that a video is 20 minutes long, they’ll quit well before the six-minute mark (perhaps with good intentions of coming back later, but they don’t).

This is interesting for a lot of reasons, but one of them is that the governing body over learning for CPA firms recently gave firms the OK to award formal credit for nano-learning. But to qualify for credit, a nano-learning has to be at least 10 minutes long. EdX’s data* suggests that’s almost 70% too long!

*Of course, EdX’s data is not specific to adults learning in a professional setting. And naturally there are any number of factors in play that could make a three-minute video seem interminable and a fifteen-minute or even an hour-long video fly by. But it’s still interesting stuff.
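For what it’s worth, the “almost 70%” figure is just the gap between the 10-minute minimum and the six-minute optimum, expressed as a percentage of the optimum. A quick sketch of the arithmetic (the six-minute figure is Guo’s; everything else is just math):

```python
# Quick check of the "almost 70% too long" arithmetic.
optimal_minutes = 6   # Guo's optimal video length, per the EdX data
nano_minimum = 10     # minimum length to qualify for nano-learning credit

excess = (nano_minimum - optimal_minutes) / optimal_minutes
print(f"{excess:.0%} longer than the six-minute optimum")  # prints "67% longer ..."
```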

Supertaskers

Many of the myths debunked in Urban Myths about Learning and Education aren’t big news (I would hope that by 2017 the vast majority of teachers are aware, for instance, that multitasking is bad for learning), but I was delighted to read the assertion that there is evidence suggesting that 2.5% of the population are in fact supertaskers. Supertaskers are wired to move back and forth between tasks very efficiently, minimizing the multitasking penalty.

Fascinating! I’ll need to read more. The biggest questions, if it looks like there is something to this: can supertasking be taught? And what are the costs of supertasking?

The Future: Brain Monitoring

Michael Allen brings up the possibility (p. 146) that brain monitoring during instruction may not be that far away. Now there’s an interesting thought: what if we could monitor the brain directly during instruction to gauge true engagement levels?

This may sound invasive or creepy, but what about as a personal learning tool? Metacognition is not easy; we don’t always realize when we aren’t learning very efficiently. What if there were a machine that could measure our current level of learning? It could signal us that it’s time to take a break, shut off distractions, or try a different learning approach.

From there, it’s not a long leap to elearning that responds to real-time monitoring of learning efficiency, making instructional choices for us or gently suggesting it’s time to take a break.

It’s not hard to imagine a classroom where learners would want instructors to have (probably aggregated, anonymized) access to real-time data about engagement if it could lead to a better classroom experience. Maybe! It’s interesting to think about (acknowledging that it is also interesting to think about the myriad privacy concerns, slippery-slope possibilities, dangers of blurring the line between thought and algorithms, etc.).

Objectives as Catalysts

Classically structured instructional objectives are invaluable for instructional designers but dull for learners, a point Michael Allen makes in Designing Successful e-Learning: Forget What You Know About Instructional Design and Do Something Interesting.

He suggests that instructional objectives should “incite curiosity, energize the senses, and build excitement” (p. 118). Agreed!

One pedantic point of interest: Allen asserts that research shows even the traditional presentation of learning objectives has positive instructional outcomes (from his perspective, it’s good to have them, but much better to do something interesting with them). My memory on this point, though, is that the preponderance of evidence shows that the simple presentation of learning objectives has no correlation with achievement. I need to take a look and see if there is a consensus in the research.

Distracted Learning Index

For fun at one of our internal conferences last week, I started measuring how many learners in each course were visibly displaying non-course information on a device; in other words, how many participants were multitasking. I initially called this measure the partial-tasking index, but it seems silly to invent another word for multitasking, even if that word is misleading: it implies success at doing more than one thing at a time, which is very difficult unless the multitasker has achieved automaticity in one of the tasks. One of my colleagues pointed out a parallel to distracted driving and suggested there should be a distracted learning index.

I took two readings per class of the percentage of multitaskers, then averaged the two scores. The best score I saw for a class was 3%. The worst was 29%.
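For anyone who wants to try this in their own classroom, here is a minimal sketch of the calculation. The headcounts below are made up for illustration; they aren’t my conference data.

```python
# Minimal sketch of the distracted learning index: take two headcounts of
# visibly multitasking participants per class, convert each to a fraction of
# the class, and average the readings. Illustrative numbers only.

def distracted_learning_index(readings, participants):
    """Average percentage of visibly multitasking participants."""
    return sum(count / participants for count in readings) / len(readings) * 100

# Example: a 40-person class with 2 multitaskers at the first reading
# and 4 at the second.
print(f"{distracted_learning_index([2, 4], participants=40):.1f}%")  # 7.5%
```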

I should acknowledge here that the correlation between learning and a good distracted learning index score is probably pretty low. Just because learners are not interacting with a device doesn’t mean they are learning. On the other hand, a poor index score probably does indicate a problem, particularly if the score gets worse during the class. It’s just a data point, but an easy one to gather and an interesting one to compare across courses.

One of my fears about taking this measurement at all is that it could be misinterpreted as a call to eliminate connected technology from the classroom (no distractions, so learners are forced to pay attention). It is true that courses at this conference with electronic participant materials had, on average, more multitasking. However, some courses with electronic materials scored well on the index, so screens don’t guarantee distraction; and banning laptops wouldn’t guarantee engagement anyway, since everyone at the conference has a little computer in their pocket that they can pull out whenever they are bored. Besides, technology-enhanced materials have too much upside for me to advocate a return to paper. It also wasn’t a fair comparison: the courses with no electronic participant guides were more often the ones with professional, keynote-level speakers.

Another interesting point in the data is that large courses (>100 participants) had average scores similar to small courses (around 30 participants). This is counterintuitive, as I’d expect the larger classes to offer a kind of anonymity that might encourage multitasking. Again, though, this might be an apples-to-oranges comparison, as the larger classes tended to feature professional speakers.
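If I keep gathering these scores, subgroup comparisons like that one are easy to automate. A sketch with invented index scores (not the real conference data):

```python
# Compare average distracted learning index scores for large vs. small
# courses. All figures below are invented for illustration.
from statistics import mean

courses = [
    {"size": 120, "index": 14.0},
    {"size": 150, "index": 11.5},
    {"size": 30,  "index": 12.0},
    {"size": 35,  "index": 13.5},
]

large = [c["index"] for c in courses if c["size"] > 100]
small = [c["index"] for c in courses if c["size"] <= 100]
print(f"large (>100 participants): {mean(large):.2f}%")  # 12.75%
print(f"small (~30 participants):  {mean(small):.2f}%")  # 12.75%
```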