Apparently the relatively affluent middle class in the United States and other industrial nations can look forward to wearable computers in the form of an Apple watch or Google glasses. Presumably these devices will be able to use Wi-Fi and/or cellphone networks to connect to the Internet, which means that it will require even less effort for us to look up any piece of information we want. We won’t even have to stick our hands in our pockets or purses to pull out that cellphone — the information superhighway will be on our wrists or in our eyes all of the time.
For educators, this means rethinking our approach to learning. In the past we insisted that children memorize things, and even now, when we encounter someone who can instantly recall an enormous variety of facts both significant and trivial, we tend to view that person as very smart, even though we’ve known for quite some time that memory is not the same as intelligence. Anyone who watched Rain Man knows the difference.
In an environment where all human knowledge is instantly accessible, what’s the point in forcing students to memorize facts? Wait, I know the answer to this one: so they can regurgitate those facts on easily-scored multiple-choice assessments. But set aside our national obsession with testing for a moment and ask yourself: if the purpose of public education is to prepare students for the future, then why should we bar the tools of the future from our classrooms? Instead of focusing on rote memorization of facts that can be easily looked up within seconds, why not focus on teaching students how to analyze, interpret, and evaluate those facts?
The previous paragraph serves as an example of my point. When writing that last sentence, I couldn’t remember the specific name of the particular classification of learning objectives that prioritized higher-order thinking skills such as analysis and interpretation. I went to Google and typed in “analyze interpret evaluate create remember,” and one of the top five results was an explanation of Bloom’s Taxonomy. This entire post is the result of analysis, synthesis, and evaluation: I read the New York Times story about the Apple watch, which got me thinking about technology and education, which in turn made me rethink educators’ backwards approach to personal technology in the classroom. Of course writing this post would have taken slightly less time if I could have just instantly remembered the name of that taxonomy, but not everybody has that type of (to borrow computer terminology) instantly searchable random access memory.
Pocket computers are already a reality, and yet it seems as if most public schools in the United States ban their usage, if not their possession. This seems exactly the opposite of what we ought to be doing. The common arguments against cellphones in classrooms are that students will use them to cheat, to send each other inappropriate messages, and to play games when they should be working. But students can already do all of those things without cellphones. Students cheat, pass notes, and play games with paper and pencil. Why haven’t we banned those? Students read fiction for fun when they should be reading their math textbooks. Why haven’t we banned novels in public schools?
Obviously paper, pencils, and books have a proper educational purpose, and we teachers know that if we catch students using them inappropriately, we can provide proper consequences. We don’t ban those educational tools outright on the very real chance that someone might misuse them. But that is precisely our approach to personal computing technology in the classroom. We could be using SMS tools like PollEverywhere or ClassPager; we could challenge students to read about current events on their phones’ web browsers; we could have students collaborate with Skitch, examine Molecules, or prewrite with Popplet. Instead, we forbid the technology, mainly out of fear and ignorance.
Another potential objection to personal technology is the dependence it promotes: those things don’t work without batteries, the Internet, and so forth. But that objection is somewhat disingenuous. The vast majority of us don’t teach our children how to preserve meat, trim wicks, or build fires from scratch anymore, and no one says “But if the power goes out, what then?” Likewise, it’s no good to respond to the ubiquity of Internet access with “What if the power goes out?” or “What if the Internet is down?” because the Internet will be (or already is!) a commonplace feature of modern life, just like electricity, and outages will be (or currently are) brief and rare.
I look forward to teaching in a classroom where every student has the Internet at their fingertips (not to mention all of their textbooks in digital form) because I would much rather spend my time teaching students at the upper end of Bloom’s Taxonomy. Instead of simply memorizing the formula for calculating velocity, the date of Gutenberg’s famous invention, or the correct conjugation of dormir, let’s make sure our students can use this information in meaningful ways that will enhance their daily lives in the future. And part of that daily life will be wearable computers.