The Unprepared Generation (Part I)

You will be ever hearing but never understanding;
you will be ever seeing but never perceiving.
For this people’s heart has become calloused;
they hardly hear with their ears, and they have closed their eyes.
Otherwise they might see with their eyes, hear with their ears,
understand with their hearts and turn, and I would heal them.

—Matthew 13:14-15

They will be hearers of many things
and will have learned nothing.
They will appear to be omniscient
and will generally know nothing.
—Plato, Phaedrus

When I finished writing Chapter 9 of this project nearly a year ago, I had no idea that the concerns I raised in it might prove prophetic.  But a recent conversation with my mother got me revisiting the impact of digital technology on teaching and learning, and I began to see that some of the themes from that chapter might help explain some of the social phenomena we are observing right now in this country in our collective response to the pandemic.

The concern my mother raised was why so many people in our society—particularly those in the 18-35 age bracket—seem so incapable of a united response to this pandemic.  She even mused rhetorically, “Why is the highest positivity rate nationwide among those in their 20s and 30s? Why can’t they seem to take the necessary steps to protect themselves and others? Why was the Greatest Generation able to rise to the deprivations and extreme challenges of World War II for nearly four years, and today we can’t even seem to get everyone to do something as simple as wear a mask in public?”

My initial response in my head was:  A. FDR, one of the most astute, intelligent, and caring presidents our nation has ever known was in charge at that time; while Donald Trump, a publicly diagnosed malignant narcissist, has been in charge this time.  B. Back then, the age bracket in question was either actively fighting the war or working in the factories to make the fighting possible; while today, many between 18 and 35 are either underemployed or unemployed during this crisis, with all the extra isolation that can bring with it.  And C. The nature of the threat was different; you could see what you were struggling against and see directly how your actions affected that struggle.

But as we rapidly approach a death toll from the pandemic in this country that will exceed, in less than 12 months, the number of soldiers killed during the 4 years of WWII, I have started to think more deeply about my mother’s concern and to wonder what has changed in the last 75 years. There are obviously numerous demographic, economic, and sociological answers, but as an educator, I began to wonder:  have we somehow fundamentally rewired our brains in ways that make a collective civility more difficult?

The short answer, I think, is a probable “yes.” But the long answer is more complicated and requires some unpacking, an unpacking that begins with a better understanding of the brain and its plasticity.  I have already written extensively in Chapter 3 about the malleable features of the right hemisphere versus the conserving features of the left hemisphere, but that is only one understanding of the brain’s plasticity.  There is a larger understanding in neuroscience that recognizes that “plasticity” can also refer to the ever-evolving structural changes—and the consequent functional changes—that occur throughout an individual’s life.  The science shows that no matter a person’s age, the structure and function of that person’s brain is always at least a little different from one day to the next, simply from all the brain’s interactions with the environment and the resulting changes in neural pathways and synaptic connections.  Just as no two brains are the same, a single brain is also never the same, as every datum it processes subtly alters it.

Granted, most of these permanent changes are minute, which is why we maintain a sense of self, a sense of identity.  But over long periods of time—as well as critical developmental growth periods—the impact of all this change can add up, and therefore, people who live in fundamentally different environments will perceive and interact with the world around them in fundamentally different ways.  It is why there is some biological truth to the notion of a “generation gap.”  Growing up in historically different eras produces brains that will quite literally think somewhat differently because of how they have had to interact with different experiences.  What’s more, the manner in which the brain encounters these experiences will itself alter the brain.  As Michael Harris nicely summarizes it, “what you use to interact with the world changes the way you see the world.  Every lens is a tainted lens” (p. 35).

Which got me asking myself:  what is the fundamental lens now at work on our brains today, and how is that lens tainted? The answer to the first question is clearly the nearly ubiquitous smartphone and its attendant technologies.  But the answer to the second question, the one about the taint, is not so clear, and it caused me to go on a bit of a historical quest and return to the beginning:  Steve Jobs’ launch in 2007 of the original iPhone.  Fortunately, thanks to digital technology, I was able to watch the entire thing on YouTube, and by so doing, I was able to discern four deliberate design decisions that I want to suggest are the smartphone’s taint.

The first design decision was about the voicemail feature.  Anyone with experience with an old-fashioned telephone answering machine knows that you cannot cherry-pick which messages you pay attention to; even if simply to delete them, each recording—in the order received—requires your attention.  So Jobs and his team designed the new voicemail feature on the iPhone to give the user control over how to interact with voicemails, and in fact Cingular’s Stan Sigman, who presented this feature at the launch, bragged about how the iPhone’s voicemail “lets you look at the voice message you want to hear and when you want to hear it.”

The second design decision, to quote Jobs, was to have “the internet in your pocket.”  The new iPhone had its own HTML browser with full access to e-mail and all the other online features and abilities associated with a tablet or desktop computer.  This feature was intended as an expansion of the iPod’s abilities, and with it, Jobs bragged that everyone could now have access to any music they wanted anywhere, anytime, anyhow.  Interestingly, the other implications of this feature were not given much attention during the launch.

The third design decision was to include a digital camera with video capacity.  The pitch for this feature was that you would no longer have to carry multiple devices, never have to accidentally miss capturing that special moment because you didn’t have your camera with you, and that you could more easily share photos with family and friends.  Indeed, the photo sharing ability plays a central role in the launch presentation.

The fourth design decision was to make the texting feature operate on a continuous feed.  There would be no equivalent of voicemail that would automatically route an incoming text to a storage area for later examination.  The arrival of any incoming text would be brought to the user’s immediate attention, regardless of what else the user was doing on the iPhone at the time.  Here, too, you can watch Jobs himself bragging that you will never miss a time-sensitive piece of information ever again.

In and of themselves, these four deliberate design decisions appear pretty innocuous.  But Jobs opens the launch of the iPhone in 2007 with an intentional reminder to the audience of how the iPod fundamentally altered the entire music industry (some would say laid waste to it), and it is part of the law of unintended consequences that any new technology will bring the bad with the good.  For instance, a voicemail I can cherry-pick means I have the power to dismiss and ignore you without a second thought; the internet in my pocket means I can upload any crazy thought that pops into my mind at any given moment for permanent immortalization and consumption; the ability to film anything at any moment means I can be filmed doing everything everywhere; and a continuous feed of texts means I am always on-call, forever distracted by the rest of the world’s demands.

This last issue is worth a brief neuroscience digression because there is not the same degree of willful decision making involved as with the other three instances.  As I have often noted, our brains did not evolve in today’s environment.  On the open savannah, a slight change in shadow and light in our visual field or a new sound in our ears could mean the presence of a hungry predator.  Therefore, our brains evolved a feature psychologists call the “orienting response,” which means that even the slightest change in our sensory input causes the brain to immediately switch its entire attention (“orient”) toward the new input.  Thus, every time a text alert arrives on a smartphone, the brain stops everything it is focusing on to attend to that alert.  It cannot prevent itself from doing so, and no amount of training can overcome this limitation.  “So now, just as the once useful craving for sugar and fat has turned against us in an environment of plenty, our once useful orienting responses may be doing as much damage as they do good” (Harris, p. 121).

Yet, while the tainted lens of texting isn’t something we can control, the simple truth is that we are not collectively making the necessary decisions to overcome the tainted lenses of the other three deliberate iPhone design decisions that became the standards for the industry.  As to why not, I would like to suggest that for 13 years now, our brains have lived in a world where the smartphone has become our dominant method for interacting with our environment.  Indeed, for many—and especially younger so-called digital natives—the smartphone has become their environment, and as a consequence, a device that enables dismissive, ego-driven, instantly gratified behavior has become the dominant force for altering the structure and function of brains in a society in which rabid individualism is the norm also at work on those brains.  Is it any wonder that by 2013, only 6 years after the introduction of the iPhone, a study at San Diego State University found decreased levels of empathy and increased levels of narcissism in young people in this country?

Furthermore, the neuroscience has shown that “the Internet fundamentally works on our plastic minds to make them more capable of ‘shallow’ thinking and less capable of ‘deep’ thinking” (Harris, p. 38).  We are simply not as reflective and thoughtful when we are digitally engaged—even our reading patterns change as we scroll, jumping from word to word down the screen rather than steadily from left to right.  Combine this fact with a device that makes the internet available to the brain 24/7, along with an orienting response over which we have no control, and we are left with brains structured to spend their days attempting the myth of multitasking, rapidly refocusing our attention from one action to the next, doing a subpar job on all of them (and no, the irony of my use of the internet to share this state of affairs is not lost on me).

To summarize, then, we have a world of brains alive today where the internet has had two generations, and the smartphone one generation, to impact their structure and function in the ways I have described.  In addition, we live in a country where the inherently self-centered character of individualism has also shaped our brains.  Enter stage left a deadly, highly contagious virus, and suddenly, I think we may have an answer to my mother’s question.  A significant chunk of our population has been raised with no need for a sense of the collective or of one’s individual impact on it, able to live in their own digital bubble.  An even larger portion has been shaped by a device that invites ignorance and poor-quality decision making.  Toss in some privilege and sense of entitlement, and how, then, could so many people with brains sculpted by such an environment not struggle to do what is right for the greater good?

Especially when it entails potential isolation and self-deprivation? But more on that next time.

References

Centers for Disease Control and Prevention Morbidity and Mortality Weekly Report. https://www.cdc.gov/mmwr/.

Glenn, H. & Inskeep, S. (Nov. 18, 2020) A Nurse’s Pleas: ‘I Wish That I Could Get People To See COVID Through My Eyes.’ NPR Morning Edition.  https://www.npr.org/2020/11/18/936096303/nurses-are-under-pressure-as-hospitals-strain-to-meet-pandemic-demands

Harris, M. (2015) The End of Absence: Reclaiming What We’ve Lost in a World of Constant Connection.  New York: Current.

Jobs, S. (2007) iPhone 1 – Steve Jobs MacWorld Keynote in 2007 – Full Presentation. https://www.youtube.com/watch?v=VQKMoT-6XSg.

Johns Hopkins University of Medicine Coronavirus Resource Center. https://coronavirus.jhu.edu/.

Palmer, P. (2018) On the Brink of Everything: Grace, Gravity, & Getting Old.  Oakland, CA: Berrett-Koehler Publishers, Inc.

Simon, S. (Dec. 12, 2020) Why San Mateo County Is Not Under A Stay-At-Home Order.  NPR’s Weekend Edition. https://www.npr.org/2020/12/12/945788722/why-san-mateo-county-is-not-under-a-stay-at-home-order.

Unfit (2020) #Unfit: The Psychology of Donald Trump. https://unfitfilm.com/.

One thought on “The Unprepared Generation (Part I)”

  1. Once again Stemteacheremeritus has carefully and mindfully identified an issue and the attending response. But more importantly, he has again stimulated my own thought process and how I have been personally impacted. For example, I was once proud of the fact that I was a “texting holdout”. As those who know me well knew, my preferred method of engagement was a face-to-face encounter. Alas, and to make a long story short, I am now, and have been, a texter. If this were considered a communicable disease, some researchers would be looking for a cure. I for one plan to return to the face-to-face world once it is safe to do so.
    Debby Miran