I’ve seen two dominant media narratives revolving around the Apple Watch. The first is a technophilic, utopian vision of how wearable connectivity will make life better for people. In this vision, we’ll use the watch to diagnose diseases, follow more accurate exercise regimens, and stay connected with friends and family while being more mobile. The second vision is a world lacking privacy, full of tech addiction, where nobody is able to pay attention to the world around them.
It’s never as simple as these two binary options. The truth is that the ability to develop and wield technology is deeply human and at the same time deeply dehumanizing. It’s a paradox we’ve wrestled with since the dawn of civilization. The Apple Watch is no exception.
However, even after thousands of years of technological development, humans still fail to grasp the implications of the technology we create. We often overstate certain elements while missing others entirely. In other words, we suck at making techno-prophecies.
Overstating Our Case
In a recent blog post, one of my students does a great job describing how technology hasn’t really changed what it means to be human. He points out that we have always had arguments, played games and engaged in meaningless amusement in our free time. Digital devices haven’t really changed any of these tendencies.
It has me thinking about the day we first got the Internet in our school. I was a seventh grader at the time. I remember hearing about how we would have instant access to information. The world’s collective mind would grow exponentially. We would use it to cure diseases and solve global problems. We would live in a Global Village where we would hash out our differences on the Information Superhighway.
Some of that was true. We certainly use technology to access information. However, we have also decided we’d rather crush candy and take selfies and photograph our food and watch videos of cats, because, let’s be honest, life can be exhausting and humans are kind of lazy, and the ability to add to the collective mind didn’t exactly lead to a propensity to learn 24/7. Sometimes you want to solve problems, but other times, you just want to watch “Charlie Bit Me.”
In other words, to a large extent, we stayed the same as humans. The desires and drives and actions that make us human are still there. That’s why I’m skeptical any time someone says, “This will change how we learn.”
On the other hand, we often fail to grasp how earth-shattering a particular technology may turn out to be. I’m not sure people realized how isolated we would become when we got air conditioning. I doubt that people saw the printing press and said, “I bet that’ll lengthen our grammatical structure, standardize spelling, strengthen the concept of the nation-state, revolutionize the concept of individual rights, and lead to a splintering of the Christian religion into tiny denominations.” Similarly, I doubt that people anticipated how the advent of birth control would forever change the grip that these denominations had on people’s lives.
It has me thinking about current digital technologies. True, we’re openly grappling with the loss of privacy (a concept that, in its current form, is a byproduct of industrialization). However, I don’t think we’ve truly made sense of the gamification of relationships, the constant barrage of metrics, the dominance (and even embrace) of branding, or the implications of a world defined more by “relevance” (based upon an algorithm of interest) than by geography.
So, my thought on wearable technology is this: I don’t know what it will mean and chances are few others do, either.