Muddling Through…

But Mary treasured up all these things
and pondered them in her heart.

—Luke 2:19

Pon·der: / pändər / verb / to think deeply (about); deliberate

Anyone who has known me even briefly notices that I am a voracious thinker. And if you’ve been following this project of mine, you know just how voracious.  I think about everything. If it enters my neural networks, it will get sifted for unseen (and sometimes unusual) patterns and examined for the what, where, when, how, and why of the intellectual landscape—thinking, ever thinking.  In a word, I ponder.  I ponder a lot!

What’s more, like my pal Mike Montaigne, I find no topic unworthy of attention.  That man pondered everything from farts and fear to death and the human condition, and then he wrote about it.  He essayed.  In fact, he invented the form, and every high schooler since who has had to master the 5-paragraph essay has—at one time or another—cursed his name, even if they didn’t know it.  Even those of us who developed a passion for authorship and the written word have cursed him because we didn’t always have control over what we had been assigned to essay about!

Well, I’ve been at it again, pondering, and what I have been pondering about are two things that at first glance may appear to have nothing to do with one another:  Artificial Intelligence (AI) and the Thwaites Glacier.  But if you bear with me, I will essay and reveal the hidden connection, including the unusual ethical conundrum it has evoked for me as an educator. 

The topic of AI, of course, is nothing new to my pondering.  Regular readers know that I have discussed everything from its potential impact on education to its danger to the very concept of truth.  In fact, my very first post for the LoC project was about catechism and AI as a tool for learning.  But what’s got me pondering AI yet again is a recent article in the September 2023 issue of Scientific American about how AI models are demonstrating knowledge of things no one has told them. They are effectively learning on their own, and the very people who invented them are—and I quote—“baffled as to why.”  In fact, “a growing number of tests suggest these AI systems develop internal models of the real world, much as our own brain does” and “that GPT and other AI systems perform tasks they were not trained to do, giving them ‘emergent abilities’ ” (such as the ability to execute code they have written for themselves, independently of the humans who created them!).  Indeed, “researchers are finding that these systems seem to achieve genuine understanding of what they have learned.”

So as theologian Nadia Bolz-Weber put it (after discovering her own books were being used to train an AI), “have we created WALL-E or HAL? Likely both. Why? Because we are both.”  The bottom line is that what it means to be human is complex, and therefore anything we create that resembles being human will also be complex.  That is why MIT physicist Max Tegmark co-authored an open letter (which I encourage everyone to read) calling for a six-month moratorium on AI training so that we might decide how best to regulate this exploding field before we risk Skynet (of The Terminator movie franchise fame) becoming a reality and generating actual explosions.  However, Tegmark is extremely pessimistic, sharing with The Guardian that he believes the economic competition is too intense for tech executives to pause AI development to consider its potential risks.  As the title of the piece suggests, he thinks we are in a “race to the bottom” and that this djinn is not going back into its bottle. 

And that’s what brings me to the Thwaites Glacier: because a similar “race to the bottom” is happening there, courtesy of climate change.

For those unfamiliar with the Thwaites Glacier, it is located in Antarctica (see map), and like many of the large ice bodies on this planet, it is actively melting faster than originally anticipated due to rising atmospheric temperatures. 

However, what makes this particular large chunk of ice more significant—and has earned it the nickname the “Doomsday Glacier”—is that its melting is accelerating and threatens a potential full collapse in the very near future.  If that happens, the most immediate consequence is a 65 cm rise in sea levels (a little over two feet) taking place within a matter of years, not decades—goodbye, Miami Beach and nearly every island nation in the world.  Even more terrifying is that the Thwaites Glacier presently holds back the West Antarctic Ice Sheet from entering the sea. With Thwaites gone, an ice mass roughly the size of India would melt into the water over the next century, resulting in a sea-level rise of 3.3 meters (nearly 11 feet)—goodbye, Florida!

By now, how AI and the Thwaites Glacier managed to merge in my pondering may be becoming clearer.  Both situations represent ways in which humans seem to be actively working to crash and burn as a species, and our present mishandling of both poses a real existential threat to our well-being. But what finally linked them in my mind was reading the review of Elizabeth Rush’s The Quickening.  My mother first brought the book to my attention (full disclosure: I have not seen a page yet), sharing that it tells the story of the most recent research expedition to Thwaites (the source of the sea-level data I’ve been citing—don’t read the full report if you want to sleep in the near future). When I read the review, what interested me most was learning that much of the book’s narrative involves Rush’s exploration of her desire for, and uncertainty about, potentially becoming a parent.  Specifically (as is true of many couples these days), she spends time pondering whether it is ethical to bring a child into a world so threatened by the current “race to the bottom” documented by the expedition—and which, I would argue, is happening in both our social and ecological worlds.

Which brings me to what triggered my unexpected conundrum: given my own professional relationship with children, it occurred to me to wonder whether what I do for a living is ethical anymore.  Is it ethical to prepare children for a future that simply may not be? I want to say the answer is “yes,” and having already had at least four former students (that I know of) die, I am aware that all my work and that of every other educator can be cut short by tragedy.  But what if all that we are looking at is nothing but tragedy?

I’m too much the historian not to know that societies and civilizations come and go—with dark ages of varying size and scale in between—and I’m too much the biologist not to know that life has managed to persevere through every mass extinction that has ever happened on this planet (and that the one we are currently causing is nowhere near the scale of the Permian event that wiped out some 90% of all species at the time).  Hence, I know that some manner of complex life—and maybe even society—will make it through the evolutionary bottleneck in which we find ourselves.

But that still leaves me wondering whether what I do for a living is ethical.  Or more precisely, it leaves me wondering whether how I am preparing children to live in their futures is ethical.  Should I be preparing them to correct their elders’ errors, or should I be preparing them to be a remnant people in an apocalyptic landscape? Should I be preparing them to fix a broken world, or should I be preparing them simply to survive in the brokenness? It’s like the cartoon I used in my recent post:  should I be teaching them to program space robots or how to sharpen a stick with a stone? Which teaching will enable them not simply to survive but survive with some degree of positive meaning in their lives?

I wish I had a clear and obvious answer, but I don’t.  I suspect it is probably a “both/and” rather than an “either/or” situation. But for now, all I can do is keep muddling through and, like Luke’s Mary—who was dealing with all manner of craziness in that barn that night!—keep pondering these things in my heart.  Because it is only out of the heart, out of the love I have for my students, that I know I can find the resolution to my conundrum that is best for them.  As Luther said at the conclusion of his defense at the Diet of Worms, “Hier stehe ich; ich kann nicht anders.” Here I stand; I can do no other.

References

Bolz-Weber, N. (Oct. 22, 2023) My Robot, My Self: On AI, Religion, and “What It is to be Human.”  The Corners.  https://thecorners.substack.com/p/my-robot-my-self.

Kolbert, E. (2014) The Sixth Extinction: An Unnatural History.  New York: Picador.

Milmo, D. (Sept. 21, 2023) AI-focused Tech Firms Locked in ‘Race to the Bottom’ Warns MIT Professor.  The Guardian.  https://www.theguardian.com/technology/2023/sep/21/ai-focused-tech-firms-locked-race-bottom-warns-mit-professor-max-tegmark.

Musser, G. (Sept. 2023) An AI Mystery.  Scientific American, pp. 58–61.

Rush, E. (2023) The Quickening: Creation and Community at the Ends of the Earth.  Minneapolis: Milkweed.

The Search for Executive Function

Who looks outside, dreams.
Who looks inside, awakens.

—C. G. Jung

For those of you just joining this particular conversation, I have recently been exploring the general nature of adolescence and the evolution of the teenage brain, with the express aim of investigating how adult stakeholders might help the children in their lives successfully navigate this tumultuous and sometimes dangerous maturation period.  We have finally reached the point where we are ready to discuss that “how,” and it turns out that the key to managing a Pliocene brain in the modern world is a set of processes the brain performs known collectively as “executive function.”  These processes include response inhibition, cognitive flexibility, attending (closely tied to working, or short-term, memory), and emotional regulation. They occur almost exclusively in the prefrontal cortex and its neural pathways to the amygdala and hippocampus—the part of the brain, you may recall, that is basically in combat during the teenage years with the then-more-mature limbic system; indeed, elements of this combat actually mature the prefrontal cortex and its eventual regulation of the limbic centers.

But in the in-between, how can teachers, parents, and other caregivers aid the development and maturation of executive function in their adolescent charges? Interestingly enough, there is a piece of “low-hanging fruit” for starters that is so ridiculously easy any of us could do it tomorrow:  increase a teenager’s daily amount of exercise.  Simply requiring adolescents to get off the couch and move improves their prefrontal cortices’ wiring and function, and a huge study in the United Kingdom has shown that for every 15 additional minutes of daily exercise, student academic performance—a reliable measure of executive function—improved by the equivalent of a full quarter of a grade.  In fact, regular daily exercise can increase the size of the hippocampus (the seat of working memory) by as much as 2%, and thus “the difference between a B and an A depended on little more than teens closing books in a class and opening lungs in a gym” (p. 137).[i]

Furthermore, when daily training in mindfulness is added to all this exercise, all kinds of functional connectivity in the brain start to change, all promoting executive function.  “We know, for example, that the amygdalae, those almond-shaped structures that supervise experiences like fear, start to lose weight (they actually shrink) [which] in turn weakens many of their functional connections to other regions of the brain” (p. 192).  In addition, the prefrontal cortex gets thicker (i.e., grows more synaptic connections) while simultaneously uncoupling from the part of the brain responsible for the subjective elements of feeling pain.  Thus:

Taken together, a remarkably detailed neurological picture is emerging about why mindfulness can be so powerful.  It’s changing the way the brain looks at fear and pain while at the same time strengthening regions associated with controlling them.  The neural substrates that mediate executive function are being rewired, all because you’ve decided to concentrate on the lovely contours of your earlobe (p. 193).

Therefore, the bottom line for promoting adolescent executive function seems to be to buy teens a good pair of sneakers and a meditation app. But of course, the situation cannot be dealt with quite that blithely, and I write that oversimplification partly in jest and partly because another way we actually do—or do not—help the teenagers in our lives mature their executive functioning is much more challenging:  we must examine what kind of relationships we have with them. 

And we must do so because, as a social species, our very survival has depended on having positive relationships with other humans.  Thus, it should come as no surprise that the formation of executive function in the brain—as well as its overall strength!—directly depends on the quality of our caregiver interactions with children.  Indeed, courtesy of the pandemic, we have already seen the dark side of this fact in the negative impact of the numerous school closures on student learning; in fact, “modern cognitive neuroscience reveals that subtracting this critical interpersonal ingredient is done at the students’ peril” (p. 84).  Furthermore, “since children’s survival is dependent for years on adult caregivers, it’s in the child’s interest to constantly monitor how the adults are doing” (p. 84), which is why divorce can have such a negative impact on the development of executive function.

What is needed, then, is for caregivers to form the kind of healthy and functionally effective relationships with their teenagers that famed developmental psychologist Diana Baumrind calls “authoritative parenting” and which educator Zaretta Hammond calls “warm demanders.”  And note that the term is “authoritative,” not “authoritarian.”  Because what distinguishes the former from the latter—a parenting style that sadly does exist—is that authoritative parents balance a demandingness where behavioral rules are maintained (even at the risk of angering their children) with a consistent engagement in verbal consultation that explains the “why” of any particular rule, along with an openness to negotiating more autonomy as a child matures. 

Basically, authoritative parents:

Preserve the best elements of parental responsiveness, remaining accessible to their children always, and listening with warmth and acceptance (different than approval).  They seem to realize that, regardless of teen reactions, what they do as parents absolutely matters.  And what matters most is that the kids know they feel loved—and safe.  They are regularly willing to risk their relationship with their teens in the service of a higher behavioral goal.  It’s the only Darwinian thing to do, after all (p. 97).

Similarly, Hammond’s “warm demanders” are teachers with clearly articulated high expectations—both behavioral and academic—who maintain what I have described and explored elsewhere as appropriately intimate rapport and who “possess one of the hardest perspectives for adults to achieve with teens:  a growing respect for their autonomy” (p. 110).  Warm demanders ask much of their students, achieving just the right balance of stretch and support (see Chapter 3), and the decades of research data about such classrooms are clear: impulse control and attentional states (two critical features of executive function) improve, and student IQs actually go up.  However, the research is equally clear that a teacher’s negative view of their students can flip that same IQ “off” like a light switch, and so whose classroom a teenager inhabits has enormous implications for that child’s capacity for executive functioning and the corresponding academic success.

Which brings me to the intersection between parents and teachers.  Part of being a warm demander as an educator is maintaining a safe learning environment.  Yet safety starts at home, and therefore the “who” who arrives in my classroom depends entirely on the parent delivering their child to me.  Wounded children are all too common in our society, and the consequent damage to their executive functioning has an enormous impact on their learning.  The reason is that without social-emotional well-being, the adolescent brain struggles to learn empathy, and empathy, it turns out:

[Forces] students to think critically about “other perspectives.”  This “otherness” translates not just to friendships outside one’s own experience, but to concepts outside one’s own experience.  It promotes cognitive flexibility, which in turn leads to more elaborately conceived—and often quite unique—problem-solving abilities.  Reasoning, it seems, involves taking other perspectives (pp. 161-162).

Thus, without a capacity for empathy, teenagers risk failing to develop and mature one of executive functioning’s most critical processes: the ability to be cognitively flexible.  Yet empathy is something the research of Sherry Turkle and others suggests our younger generations are losing, and therefore the final way we could all be helping our adolescents grow their executive functioning to thrive as adults is to develop social-emotional learning programs in our schools.  Such programs are one way the educator side of the equation could address this empathy issue; school stakeholders providing the necessary funding for them is the other.  Together, we can help every teen become the best adult version of themselves.

We just have to remember that we must be the best adult version of ourselves in order to do it.

References

Baumrind, D. (1995) Child Maltreatment and Optimal Caregiving in Social Contexts.  Oxford: Routledge.

Baumrind, D., et al. (2008) Parenting for Character: Five Experts, Five Practices.  Arlington, VA: ASCD Books.

Hammond, Z. (2014) Culturally Responsive Teaching and the Brain: Promoting Authentic Engagement and Rigor Among Culturally and Linguistically Diverse Students.  Thousand Oaks, CA: Corwin Press.

Medina, J. (2018) Attack of the Teenage Brain. Arlington, VA: ASCD Books.

Rosenthal, R. & Jacobson, L. (2003) Pygmalion in the Classroom: Teacher Expectation and Pupils’ Intellectual Development.  Bethel, CT: Crown House.

Turkle, S. (2017) Alone Together: Why We Expect More from Technology and Less from Each Other, 3rd Edition.  New York:  Basic Books.


[i] All quoted material for this essay is from Medina’s Attack of the Teenage Brain.

Those Crazy Kids

The brain is not interested in learning.
Period.
What the brain is interested in is surviving.

—John Medina

Anyone familiar with my concept of “authentic engagement” knows that I believe one of the pillars of effective teaching (and parenting) is a thorough working knowledge of the neuroscience of the brain, and that absolutely foundational to this knowledge is the reality that the structures and functions of the human brain did NOT evolve in the environment we currently inhabit as a species.  Indeed, it is to the peril of educator and parent alike to forget that our brains evolved under very different circumstances—the Pliocene of East Africa—and that we, correspondingly, have a brain totally fine-tuned for survival in that world, not this one.

Moreover, nowhere is this peril greater than when working with the teenage brain. As I mentioned in my previous essay, adolescents today are fundamentally no different than adolescents at the start of my career (or millennia ago for that matter), and that is why one of my critical tasks as their teacher has been to help my students navigate the massive upheaval going on inside their heads and why I suggested that there are ways to address this need successfully in our always changing world.  However, I want to argue that first we must understand what’s actually going on in the adolescent brain and why it evolved to generate the kinds of behaviors we typically associate with teenagers. Only then, I think, can we figure out how to help a teenage Pliocene brain traverse the modern challenges of our digital age.

So let’s get started.

To begin with, we require a short primer on brain development in general.  First, from conception to roughly age five, you grow as many neurons in as many regions of your brain as your environment demands (which is why Olympic-level athletes nearly always start at this age—the brain grows lots of extra motor-cortex neurons in response to the demand—and why Head Start programs are so important to future academic success).  Then, between roughly age 5 and the onset of puberty, all those new neurons reach out and grow billions of extra synaptic connections, again as your environment demands (which is why learning to read by 3rd grade is so critical).

Next comes puberty and adolescence, when some time (usually!) between the ages of 10 and 13, your hypothalamus starts cranking out a hormone called kisspeptin that, to borrow a phrase, “makes all hell done break loose!” It initiates the massive upheaval of changes that transform a child into a reproductively mature adult, and nowhere does this upheaval have greater impact than in the brain.  Massive increases in myelination speed up neural communication rates 3,000-fold while all those earlier extra synaptic links are carefully and meticulously pruned—all, as always, dependent on what the environment is telling the brain is important (and which is the neurological justification for a liberal arts education). Indeed, at no other time in our lives does the brain undergo such enormous physical change, continuing well into our early 20s.

Adding fuel to all this fire is the fact that:

Although these changes happen in all teenage brains, there’s no wholesale neurological goose-stepping.  Teens go through this at their own individual paces, as diverse as political points of view.  To make matters worse, even specific regions inside a single brain go through [the] maturation process on differing schedules. (Medina, p. 63, original emphasis)

The limbic system, for example, reaches its adult form around age 15, with a 7% larger amygdala and all its pesky primary emotions (fear, want, anger, and lust); meanwhile, the prefrontal cortex, with its complex judgment and decision-making centers, doesn’t reach its adult form until as late as age 25.  Hence, “teen-brain reward centers are more active than children’s or adults’, making them experience rewards (and other feelings) more intensely than at any other time.  When they yell out, ‘I’ve never been so stoked in my life!’ they may be literally telling the truth” (Medina, p. 69).

What, then, are the implications for our understanding of teenage behavior? First, on the mundane level, it explains their incessant appetites for food and why they are always sleepy when they don’t get the 9 or more hours of snoozing they actually need each night.  The mature brain is already one of the great energy drains on the body’s resources, accounting for only 2% of our total mass but consuming at least 20% of our caloric intake, and here we have a brain under construction, with demands that make building the Burj Khalifa tower in Dubai look like an assembly of matchsticks.

Second, on the less mundane level, the disconnect in maturation rates between the limbic system and the prefrontal cortex accounts for the general mood swings and frequent risk-taking teenagers experience.  A less-than-fully-regulated amygdala and other limbic structures have the potential to generate behaviors resembling those of the “lizard brain” to which this region is often analogized, and while it is actually the struggle to control this part of the brain that directly drives part of the prefrontal cortex’s maturation, some limbic decisions have resulted in their owner paying a heavy cost…sometimes, regrettably, even death.

Third, and finally—and most definitely NOT mundane—the price of all this colossal change taking place in the teenage brain is that adolescence is the peak period for the onset of mental health disorders, with 50% of diagnosable illness occurring by age 14 and 75% by age 24.  The simple reality is that with so many “moving parts,” there are so many more potential ways for things to break, just as there would be with an actual mechanical machine, and that leaves the teenage brain at greatest risk of malfunctioning before it even has the chance to transform itself into a healthy, fully adult brain.

Which brings me back to the evolution question:  why on earth would the human brain evolve to go through what looks, from today’s vantage, like such a potentially dangerous developmental stage? The answer is two-fold:  gene shuffling and energy.  Early humans lived in small hunter-gatherer groups where the threat of incest and all its accompanying genetic dangers—especially from a species perspective—was a very real problem.  What better way to drive you to separate from your family right about the time you are biologically ready to reproduce, and to make the dangerous trek to another tribe, than to make you engage in obnoxious and annoying behaviors toward your elders and to equip you with a brain “cognitively anesthetized” toward taking risks? Furthermore, in a world of small hunter-gatherer groups, energy would be very hard to come by—especially the excess energy needed to reproduce and rear young—and so what better way to conserve energy than speeding up communication in, and pruning huge sections of, the most energy-intensive organ in the body in as short a time span as possible? Even if that meant that some individuals would not survive that cognitive rearrangement mentally intact.

Hence, what adolescence evolved to do was ensure both the genetic diversity the human species needed to adapt to an ever-changing environment and the energy conservation in adulthood that individual members of the species needed to reproduce in a world where daily hunger was the norm.  As John Medina puts it, “all of a sudden, you have a seriously powerful label for puberty: species savior” (p. 78); we literally wouldn’t be here today without the teenage brain.

Of course, today, we do not live in the Pliocene.  There is more potential gene shuffling in a single American high school than existed on the entire Serengeti plain, and we actually have an obesity crisis in this country because of how much excess energy is available to us.  Hence, the challenge for us as educators (and parents) today is how to manage a Pliocene brain in an industrialized digital world.  Knowing how that Pliocene brain works, though, can give us clues.

More on that next time.

References

Gazzaley, A. & Rosen, L. D.  (2016) The Distracted Mind: Ancient Brains in a High-Tech World.  Cambridge:  The MIT Press.

McKie, R. (1996) African Exodus: The Origins of Modern Humanity.  New York:  Henry Holt and Company.

Medina, J. (2018) Attack of the Teenage Brain.  Arlington, VA: ASCD Books.

Ever Changing, Always the Same

When I was a boy of 14, my father was so ignorant
I could hardly stand to have the old man around.
But when I got to be 21, I was astonished
at how much the old man had learned in seven years.

—Mark Twain

Orion is once more in the early morning sky (though a little harder to see with all the light-pollution).  A few of the leaves have started to change color on my morning commute (despite a hundred-degree heat wave in September).  And two weeks ago, I started my 35th year in the classroom (surrounded by colleagues, the majority of whom I could have taught).

In all that time, I have witnessed amazing and significant change.  For example, when I started that fall of 1989, class handouts had to be produced on a hand-cranked mimeograph machine that left the paper slightly damp from the evaporating fluid as the ink dried; today, I send a PDF file via wi-fi to the many-times-over descendant of the Xerox machine, which prints, staples, and 3-hole-punches two-sided handouts that I pick up at my convenience.  Back then, I wrote those handouts on a computer the size of a toaster oven (with a separately attached monitor) that required me to remember to save my work regularly on a 3.5-inch floppy disk that could store up to 1.44 MB (i.e., millions of bytes) of data; today, I compose on a computer actually named after its size (a tablet) that immediately autosaves any work I do to “the cloud” (computer servers that could be on the other side of the planet), and I currently have 4 terabytes (i.e., trillions of bytes) of storage but could pay for even more.

In addition, my students in 1989 had to do any research I assigned them in a library with limited hours and a finite number of books and periodicals; today, they carry pretty much all human knowledge ever generated in the palm of their hand with 24/7 access.  Back then, when I gave them a group project to complete, they had to organize a gathering time and space to work on it, and they had to physically assemble the final materials for submission; now, they use group-chats to communicate anytime they wish and share on-line documents which they can work on simultaneously from anywhere in the world.

So, change.  Dramatic change.  And lots of it.  In fact, I could go on ad nauseam with the changes I have seen, and I obviously haven’t even touched on the social, cultural, and environmental differences of the past 34 years that impact teaching and learning from outside the classroom.

Yet, I have passed out paper handouts every single one of those years, regardless of how they got manufactured.  And my students still research topics about the natural world, and they still complete group projects for me where they have to cooperate successfully to create their final product.  And that 3.5-inch floppy disk? A visual image of it is the “save file” icon on the screen of every software program on every computer on this planet.

I share all this because what I have been noticing most this particular fall is not the changes in my students over the years but rather their continuity.  Watching the 14- and 15-year-olds in my 9th grade biology class, I see the same struggles with executive function and materials management that I did in 1989.  I see the same awkward anxiousness when answering a question, the same tentative hand in the air when asked to respond to a query, the same timidity when assigned groups for an activity.  Likewise with my seniors, I see how the four years have matured them, how the anxiety of learning has been tamed, the executive functioning refined, and I see a different tentativeness as they prepare for college admissions and the next chapter of their lives. I see 18-year-olds on the cusp of early adulthood.

But most of all, what I have realized I have been seeing at the start of this new school year is the adolescent brain doing what eons of evolution have hard-wired it to do:  feel deep emotions; prune unused synapses; take risks; myelinate the prefrontal cortex; lay down new neural pathways; prepare for reproduction…in a word: be adolescent.  And being adolescent hasn’t changed in my 35 years of working with them.  It won’t in any future years I remain in this career—evolutionary change in something as complex as the brain doesn’t work on that brief a time scale.  Hence, the children entering my classes today are fundamentally no different than the ones I taught over three decades ago; they are adolescents being adolescents.

However, the students in my classes this fall do live in a radically different world than the ones I taught in 1989, and because any good geneticist knows that we are all products of nature AND nurture, the changes of the past 34 years do matter.  They have a direct impact on how adolescent brains experience their adolescence (as has been documented by the significant changes in teenage mental health over the course of my career). 

The blunt truth is that the world my students live in today is radically different and radically changing—both all the time and at a pace undreamed of in 1989—and I have to account for that as the person responsible for creating the classroom environment they experience. But I also need to remember that I’m fundamentally working with the same adolescent brain I was working with 34 years ago and that “if adolescents seem imperfect, it’s only because most of the obstacles their brains were wired to traverse no longer exist, and we haven’t sojourned long enough in organized society for adolescent brains to get the memo” (Medina, p. 4).  Therefore, I’m still responsible as their teacher for helping them navigate the deep emotions, risk taking, synapse pruning, and myelination their brains are experiencing.

How I and others might do so in our ever changing world is what I will address next.

References

Medina, J. (2018) Attack of the Teenage Brain. Alexandria, VA: ASCD.

U.S. Surgeon General Advisory (2023) Social Media and Youth Mental Health.  The Department of Health and Human Services. https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-advisory.pdf.

How Far Can We Sink? Part 2

Liberal science has relegated violent creed wars to the history books.
—Jonathan Rauch

We may have just handed a four-year-old a loaded weapon.
That’s what I think we actually did.

—Chris Wetherell, Inventor of Twitter’s “Retweet” Function

What do you do when a truth takes six times as long as its corresponding falsehood to reach the intended audience and the falsehood is 70% more likely to be reshared?

Those were the findings of a 2018 study by researchers at the MIT Media Lab, and they are at the heart of why—as I shared in Part 1 of this essay—Jonathan Rauch contends that the Constitution of Knowledge, with its reality-based community and epistemic principles of fallibilism and empiricism, is under attack.  The rise of social media and its consequent ability to spread misinformation, calculated disinformation, and “shock & awe” levels of psychological assault nearly instantaneously has enabled those who are antagonistic to the notion of objective truth-seeking—for whatever reasons—to undermine the work of those who engage in it, and even worse, to directly attack those individuals and institutions involved in this work. 

For example, Rauch points out, fear of being publicly bullied, stalked, and ostracized online for simply stating an opinion or offering a critique—no matter how well-intended—has reached such epic proportions on today’s college and university campuses that 70% of professors report feeling at least some concern (and 40% extremely so) “that having an open class conversation on [controversial] topics could result in their being reported to the authorities, receiving bad course evaluations, suffering damage to their reputations and careers, and being shunned by their colleagues” (p. 221).  Meanwhile, two-thirds of students claim that “their campus climate precluded students from expressing their true opinions because their classmates might find them offensive” (p. 222), and thus, Rauch argues, one of the pillars of the reality-based community risks finding itself silenced and censored—cancelled—by the ability of social media to make life a living hell for anyone who dares to submit a proposition for critique and potential falsification that some group or another has preordained as harmful or oppressive.

Yet, if the history of the Constitution of Knowledge teaches us anything, Rauch continues, it is that a diversity of ideas—of propositions to test for fallibility—is essential to the pursuit and expansion of objective empirical knowledge, and therefore, a diversity of voices is equally so.  Without them, our own individual biases keep us from even entertaining certain hypotheses, and so potential truths about the world never become knowledge because the questions leading to that knowledge never get asked.  As Rauch puts it:

The problem arises when other groups and other ideas are absent or silent or fail to organize.  To say it the [John Stuart] Millian Way: the problem arises where lack of contestation hardens even true opinions into dogmas.  To say it the [James] Madisonian way: the problem arises where the republic of science becomes too small, intellectually speaking—when its sphere contracts instead of expanding (p. 230, original emphasis).

And if the reality-based community doesn’t have enough problems already with social media impairing one of its pillars from within, this same digital technology has also made attacks on it from outside the community significantly easier.  And while I have written before about the origins of what Rauch calls “troll epistemology” and why internet trolls and trolling have become so successful at challenging our notions of truth, Rauch brings some fresh ideas to the table that I want to share. 

He first points out that “we forgot that information technology is very different from knowledge technology.  Information can simply be emitted [without any of the checks and balances required for true knowledge]” (p. 125, original emphasis).  Consequently, any and every thought, opinion, fiction, or other “brain fart”—however false, wrong, or delusional—can make its way immediately into our lives via our computers and phones.  The result, Rauch states, is that “what troll epistemology can do is degrade the information environment around the reality-based community” (p. 164, original emphasis)—which is highly problematic for the Constitution of Knowledge and its reality-based community because while they do not need everyone to be fellow truth-seekers, they do need most people to be truth-friendly and “to behave in ways which support rather than undermine the Constitution’s ability to do its job” (p. 115; original emphasis). 

Thus, what makes trolls so dangerous to the work of those pursuing real knowledge about the real world is that they can overwhelm people’s attention with fake news, conspiracy theories, and other dreck and drivel to the point where they are totally diverted from anything resembling objective empirical reality, utterly exhausted from the bombardment of information.  Essentially, trolling blockades the attention of those whom the Constitution of Knowledge needs to be truth-friendly, and as Rauch puts it:

if you cannot be sure at any given time whether you are being manipulated or scammed, then the natural way to protect yourself is to assume you are always being scammed, or to hunker down with online friends in your own private version of reality, or to take a demagogic politician’s word for it (p. 169).

Being truth-friendly then becomes either an afterthought or something one is even antagonistic toward, and the reality-based community finds itself fighting (and in the case of the pandemic, literally) for its life.  It is probably one of the reasons Rauch calls trolls “epistemic sociopaths.”

So what are those who value truth to do in a world of anonymous epistemic sociopaths and cancellers? How do we in the reality-based community defend a basic understanding of reality that is as objective as possible within our epistemic limits? Rauch is clear in his response.  First, “when we encounter an unwelcome and even repugnant new idea, the right question to ask is ‘What can I learn from this?’ rather than ‘How can I get rid of this?’ ” (p. 198).  And second:

Every time I hear a minority-rights advocate say that she should not have to debate haters who question her very right to exist, I say: on the contrary, that is exactly who you need to debate.  The hearts, minds, and votes we need to win are those of people who do not already agree with us—a point which might seem obvious but is surprisingly easy to overlook.  Recent research supports what activists like me learned firsthand in the gay-marriage struggle:  deploring and denouncing people rarely changed their minds, but respectfully listening and talking to them often did (p. 257).

Thus, Rauch’s response is essentially that if we do not treat even the most deplorable ideas as propositions for potential falsification—if we do not abide by the fundamental principles of fallibilism and empiricism at all times—then we have failed to engage in the very work the reality-based community employs to generate the objective communal knowledge we believe should guide both us and our attackers in the first place.

Or as he summarizes it:

In exchange for knowledge, freedom, and peace, [the Constitution of Knowledge] asks us to mistrust our senses and our tribes, question our sacred beliefs, and relinquish the comforts of certitude.  It insists that we embrace our fallibility, subject ourselves to criticism, tolerate the reprehensible, and outsource reality to a global network of strangers.  [Its daily, never ending, defense can be exhausting, upsetting, and deeply stressful.] But we cannot afford to be snowflakes.  Epistemic liberalism, like political liberalism, is a fighting faith (p. 263).

Moreover, when it comes to this fight, Rauch is clear that “I am not an alarmist.  To the contrary, I write this book in the spirit of hope and guarded optimism” (p. 18) and that “the reality-based community has withstood much worse.  It beat back the inquisitors who imprisoned Galileo, the dictators whose gulags spanned continents, and the racists and homophobes who sought to silence voices of freedom” (p. 264).  It is clear that he believes the defense of truth is winnable.

However, here is where I grow concerned.  As I write this, a former President of the United States is facing the near certitude of a third criminal indictment, and he is all but bragging about it on his social media platform to stir up his supporters for his run again for the Presidency.  As I write this, an extremist anti-abortion group is arguing that a mother who has an abortion should be sentenced to death, its leader publicly arguing “an eye for an eye.”  And as I write this, the Alabama legislature has responded to the Supreme Court striking down its former political map by drawing an even more partisan one to dilute the Black vote, simply because the Court did not explicitly say in its ruling that there must now be two predominantly Black districts due to changes in population percentages.  Simply put, we live in a world where the truth-friendly are rapidly becoming an endangered species, and the anti-intellectuals of our society already greatly outnumber the members of the reality-based community.

What’s more, when Rauch wrote his book, ChatGPT did not exist; it wasn’t even on the foreseeable horizon. Yet, as I wrote in The End of Truth?, anyone wanting to disinform or even attack can now produce such realistic material that only the most sophisticated digital analysis tools can determine whether it is a so-called “deep fake” or not (and, in fact, I learned shortly after that post that a colleague of mine had already had it happen to him).  Worse yet, if Caryn Marjorie’s use of AI—she of the CarynAI “girlfriend” experience—is any example of how people will employ this technology, then we are in deep trouble.  News services report that “one fan, at the bot’s encouragement, built a shrine-like photo wall of her,” to which her response was, “this is why we have limited CarynAI to only accepting 500 new users per day.”

It is estimated that she will make $5-10 million this year. Simply for consuming oxygen and exhaling carbon dioxide.

AI, therefore, has already demonstrated its potential to provide the trolls and cancellers with the social media equivalent of nuclear weapons against the reality-based community and its Constitution of Knowledge, and even its relatively benign users are not exactly demonstrating restrained wisdom.  None of this gives me much optimism about Rauch’s defense of the truth.  Even though I am in complete agreement with him about the value of such a defense—indeed, I will never stop mounting it myself—I am left wondering if we are not headed more toward the equivalent of the situation in the Koreas, with the epistemic equivalent of a DMZ, rather than the Allies of WWII.

A potential creed war with at best a fragile truce.

References

Associated Press (July 21, 2023) Alabama Lawmakers Refuse to Create a 2nd Majority-Black Congressional District.  NPR. https://www.npr.org/2023/07/21/1189494854/alabama-redistricting-map-black-districts.

Contreras, B. (June 27, 2023) Social Media Star’s AI Clone Charges a Dollar Per Minute for Chats.  The Los Angeles Times. https://www.latimes.com/entertainment-arts/business/story/2023-06-27/influencers-ai-chat-caryn-marjorie.

Johnson, C. & Inskeep, S. (July 24, 2023) Trump Could Face Federal Indictment Soon over Effort to Overturn 2020 Election. NPR Morning Edition. https://www.npr.org/2023/07/24/1189719400/trump-could-face-federal-indictment-soon-over-effort-to-overturn-2020-election-d.

Rauch, J. (2021) The Constitution of Knowledge: A Defense of Truth. Washington, D.C.: Brookings Institution Press.

Sarat, A. & Aftergut, D. (March 17, 2023) “Pro-Lifers” Choose Death.  Verdict. https://verdict.justia.com/2023/03/17/pro-lifers-choose-death.

How Far Can We Sink? Part 1

In any human society, a foundational problem is
how to resolve conflicts of belief.

—Jonathan Rauch

It will appear that individualism and falsity are one and the same.
—Charles Sanders Peirce

This past Christmas, my mother gave me a T-shirt with a rather distinctive message on it, and when I wore it to school on one of our “dress down” days this past spring, it evoked the following response from one of my students: “Oh, Mr. Brock! That is SO you!!”

The message?

That’s what I do:  I read & I know things.

Well, I’ve been reading again, and from what I’ve been learning recently, I have started to wonder just how far into irrationality our society can sink and still survive.  In just this past month, a meteorologist has had to resign his broadcaster position due to death threats for educating about climate change; the governor of Texas has signed a law eliminating mandatory water breaks for people who work outdoors (during the most catastrophic heat wave in state history!); and a young woman in Arizona has created an AI “clone” of herself to sell as an online “girlfriend” for anyone interested.

It is enough, frankly, to leave this member of what Jonathan Rauch calls the “reality-based community” shaking my head.  What’s more, it leaves me shaking my head in a mixture of consternation and despondency because at the same time I have been reading these headlines, I have simultaneously been reading Rauch’s latest book from the Brookings Institution about the need for us to be defending the rational pursuit of objective truth, wondering if that’s even possible anymore.

Part of the challenge for me, personally, is that I know too much brain science not to know that while reasoning is one of the brain’s capacities, it is not the chief one in charge. The limbic system with its emotions and intuitions is.  Or as Rauch so eloquently phrases it, “evolution selects not for the ability to reason in a way which leads to truth, necessarily, but for the ability to reason in a way which persuades” (p. 23; original emphasis).  The bottom line is that as a social species, our brains care more about looking like we know what we are talking about than determining the veracity of what we are talking about—just recall what happened to Socrates for insisting on the truth!

Yet when the rational and the irrational do butt heads, how do we resolve the conflict successfully in favor of the rational? How do we get to at least some semblance of agreement about the objective truth of a given situation? That’s essentially the focus of the first half of Rauch’s book, and he argues that for most of human existence, we simply have not.  When conflicts of belief have arisen, humans have traditionally either created isolated enclaves and ejected the heterodox, employed authoritarianism to maintain order, or resorted to violence—something Rauch calls “creed wars”—to destroy those who harbor “wrong” ideas. 

But with the rise of liberal science, grounded in skepticism and fallibilism, knowledge about the world, he argues, acquired a communal nature that transcended individual beliefs.  When all propositions had the potential to be wrong and to be disproved, no one person could claim certainty about the truth, and it was these checks and balances of the scholarly community, Rauch claims, that enabled humans to start resolving conflicting beliefs about the world in favor of a rational understanding of reality—even though (as any good philosopher of science will tell you) truly 100% objective knowledge is never possible.  Thus, humanity could now grow ever closer to truth through this “evolutionary epistemology” (as Karl Popper called it) without having to resort to physical force:  you could kill the hypothesis without killing the hypothesizer.

Rauch calls those who pursue knowledge in this manner the “reality-based community,” and he calls the epistemic structures that support it the “Constitution of Knowledge.”  He uses this notion of a constitution as a very deliberate analogy to how the U.S. Constitution provides for the governance of our society with its separation of powers, arguing that the principles of fallibilism and empiricism prevent any one individual or institution from dictating what is or is not true.  Likewise, he suggests, just as the U.S. Constitution guides a network of people to generate laws through persuasion and compromise in a process that no one person can control, “the reality-based network [of communities] behaves like an ecosystem, producing a body of validated propositions whose composition humans can influence but not control” (p. 87).

Rauch also uses this analogy because he wants to assert strongly that just as declaring something unconstitutional in our society means it is illegal, period, so too, “anyone can believe anything, but liberal science—open-ended, depersonalized checking by an error-seeking social network—is the only legitimate validator of knowledge, at least in the reality-based community” (p. 87).  Only propositions about the world which have been repeatedly put to the test of falsification using empirical means and continue to pass such testing can claim any status as objective truth.  Everything else is subjective mysticism, falsehoods, or outright lies.

Which Rauch acknowledges:

goes down very badly with lots of people and communities who feel ignored or oppressed by the Constitution of Knowledge:  creationists, Christian Scientists, homeopaths, astrologists, flat-earthers, anti-vaxxers, birthers, 9/11 truthers, postmodern professors, political partisans, QAnon followers, and adherents of any number of other belief systems and religions (p. 87). 

But Rauch also grasps that what can go down badly even with members of the reality-based community is that just like the U.S. Constitution, the Constitution of Knowledge bestows not just rights, but responsibilities as well, and:

as between the two, the rights are the easier to grasp and defend…of all epistemic orders, only liberal science is premised on free inquiry and intellectual pluralism.  Yet the responsibilities are heavy and tempting to shirk.  We can believe as we like privately, but in making public policy, we must privilege the reality-based community’s judgments about facts…we can question and criticize to our heart’s content, but we must also endure being questioned and criticized ourselves…accepting that we might always be wrong…[Furthermore,] if we wish to be part of the reality-based community, we behave as if truth exists and evidence matters [and we recognize that] failing to persuade means losing the argument, period.  You cannot prevail by majority vote, by being divinely inspired, by being historically oppressed, by having justice on your side, by silencing the other side, or any other way.   Those are the republican virtues of the republic of science—and they require just as much discipline and commitment as do the republican virtues of politics (pp. 112-113).

That is why Rauch recognizes that as with any constitution, “if the people or their factions seek to win by lying, breaking the law, fostering extremism and demagoguery, or wiping out the other side, then no constitution will endure, however strong it might look on paper” (p. 112).  Today, he asserts, the Constitution of Knowledge is under just such attack from both without and within, and he posits that social media, cancel culture, and what he calls “troll epistemology” are collectively chipping away at the reality-based community.  His is a call to defend against these three forces (as the book’s subtitle implies), and that is where I’m no longer sure we have the necessary social ecosystem to succeed—especially given the preponderance of the types of news items with which I started this essay.

Something I will explore in my next post.

References

Chow, D. (July 7, 2023) Backlash Brews Against Texas Law that Eliminates Mandatory Water Breaks.  NBC News. https://www.nbcnews.com/science/science-news/backlash-brews-texas-law-eliminates-mandatory-water-breaks-rcna92961.

Contreras, B. (June 27, 2023) Social Media Star’s AI Clone Charges a Dollar Per Minute for Chats.  The Los Angeles Times. https://www.latimes.com/entertainment-arts/business/story/2023-06-27/influencers-ai-chat-caryn-marjorie.

Rauch, J. (2021) The Constitution of Knowledge: A Defense of Truth. Washington, D.C.: Brookings Institution Press.

Treisman, R. (June 27, 2023) A Meteorologist Got Threats for His Climate Coverage. His New Job is About Solutions.  NPR Morning Edition.  https://www.npr.org/2023/06/27/1184461263/iowa-meteorologist-harassment-climate-change-quits#:~:text=Iowa%20meteorologist%20quits%20after%20death%20threat%20over%20climate%20coverage%20Chris,tackle%20climate%20change%20head%2Don.

Revisiting “Authentic Engagement”

I expanded my educational horizons this summer and am presently teaching a population of children who are most definitely not my normal bailiwick.  A colleague and friend of mine from our Middle School division convinced me to teach a STEM course to a group of rising 6th, 7th, and 8th graders in a regional program that pairs Baltimore private schools with Baltimore City public schools to prepare students for high school.  Known as the Middle Grades Partnership (or MGP), it has been serving as a bridge program since 2005 to prevent summer learning loss and help students find greater academic success in what are critical developmental years in early adolescence.  Thus, for the past three weeks (and two more to come), I have been teaching eleven-, twelve-, and thirteen-year-olds how to build bridges out of paper, study yeast metabolism, and write a computer program to make an animated dinosaur dance.

And it has been quite the education for me! Any inklings I ever harbored about being a middle school teacher have been banished, and my respect for those who do it full time for a living has reached the level of reverence.  People who work effectively with eleven-year-olds should be worshiped and paid seven-figure salaries.

The experience has gotten me thinking, though, about the concept at the core of this whole project of mine, “authentic engagement.”  For those not familiar with my introduction, the fundamental premise I posit is that we should understand education through an ecological rather than mechanistic lens, recognizing that classrooms are effectively ecosystems and that teachers inhabit that ecosystem as its keystone species—the one which determines the well-being of the entire thing.  Authentically engaged teachers, then, are those who generate healthy ecosystems for learning in their classrooms, and achieving authentic engagement involves three critical things: embracing the role of co-learner in all educational situations; generating appropriately intimate rapport with students; and employing a full understanding of the tension between the brain’s plasticity and its hard-wiring.

I have already explored each of these properties in detail in Part I of this project. But these past few weeks have made me realize that I have explored them almost exclusively through my lens as a high school teacher, and while my experience at MGP has made me even more convinced that we need an ecological paradigm for education and that co-learner, rapport, and neuroscience are foundational to effective teaching, I have come to realize that the three properties of authentic engagement will look different depending on the age-group with whom you are working.

Take embracing the role of co-learner.  In high school, that can mean sharing the latest research you’ve been reading or collaborating on choreography in a dance class.  But as I have recently discovered, to an eleven-year-old, it can mean being willing to play the cryptography game you’ve just taught them along with them.  As for generating appropriately intimate rapport, I am awestruck by one of my colleagues in the program who employs what I can only describe as a gentle goofiness when teaching them math that would simply never be within my capacity. And don’t even get me started on a full understanding of the brain…I thought I knew my stuff.  But watching the behavior of rising 6th and 7th graders has me scurrying for the research journals!

The long and the short of it, then, is that while I remain even more confirmed in my belief that authentic engagement in the classroom is essential for genuine teaching and learning, what the qualities of its three properties look like varies more than my original analysis of them might suggest.  I think I knew this intellectually—after all, elementary teachers legitimately employ well-timed hugs that would get me arrested—but now, I have a deeper existential appreciation for authentic engagement’s variability.

And an even deeper awareness of why I teach the age group that I do.  I have no regrets that I chose to stretch myself to grow as an educator this summer and am even considering staying with MGP for another year to build on what I’ve learned.  But my five weeks as a middle school teacher has most decidedly taught me what I am most definitely not, and I can’t wait to see my seniors in the Fall.

Looks Like the Jury is Finally in….

Because adolescence is a vulnerable period of brain development,
social media exposure during this period warrants additional scrutiny.

—the U.S. Surgeon General

We’re rearing a nation of passive, sedentary, and constantly distracted people.
—Shane Trotter

Anyone who knows me or has read my work knows my thoughts on digital technology—and especially social media! Unlike my students, I do not watch three TikTok videos simultaneously during every free moment I have, and unlike many in the generations in between, I do not have an Instagram or Snapchat account or a Facebook page that notifies me with constant insistent alerts.  In fact, I only possess a smartphone reluctantly: during the renovation of my kitchen, I discovered that every business model in America assumed everyone was carrying their own personal computer around with them at all times, and that it was no longer possible to function in our society without such a phone, at least not without a ridiculous degree of aggravation and inconvenience.  But I have no apps on it that didn’t come preloaded, and I still use actual physical maps and atlases in my travels.

I will confess to using LinkedIn occasionally to track down someone from my past with whom I wish to get in touch (as I did just the other day with a former student to see if she will be attending her hall of fame induction this fall), and I have created and managed numerous websites over the years (including this one) as a means of disseminating various types of information. However, I delete every attempt by WordPress to get me to “re-tweet” or “share” even this website, leaving it to others to decide for themselves if they want to engage with its content.  I will and do employ technology, but like maintaining a healthy diet, I do so thoughtfully, with deliberation, and only as actually needed.

What’s more, I did not arrive at my relationship with technology willy-nilly.  Like fellow educator Shane Trotter, I remember well the initial rollout of my school’s one-to-one laptop initiative back in the early 2000s and how it:

was driven by a vague, yet unassailable belief that technology was “the future” and, thus, that any measure that brought more technology into classroom instruction was inherently good.  According to the prevailing wisdom, new technologies were sure to unleash a cascade of new teaching superpowers.  And teachers needed to utilize these superpowers as soon as possible because “students were different now.”  They would no longer learn well from the old teaching methods like lecturing, reading, and writing.  The only way to reach the 21st-century student was to integrate a steady stream of new tools into every lesson.

And, like Trotter, I remember well the message from the administration (one of whom, in full disclosure, is a follower of this project) that we were to find ways to employ these new tools proactively in our teaching, to rethink our lessons with student access to their laptops front and center in our planning.  We received significant professional development to aid us in this process, and most of my colleagues gamely did their best (or as Trotter puts it, “that year, teachers tried everything”).

However, I was not among them.  Having been coding since high school and with two websites already designed, created, and managed by then, I was well versed in the technology now available to both my students and myself on a daily basis.  Including what it meant to have full wi-fi internet access!  Therefore, I took a more skeptical and cautionary approach—doing so precisely because I understood that while the worst I’d had to confront prior to the laptops were students who “would have had nothing better to distract them than doodling on notebook paper or passing notes between each other” (something I already knew how to combat with highly engrossing learning), I would now be competing for their attention “with all the world’s entertainment and the attention-hacking efforts of Silicon Valley’s most brilliant minds” (Trotter). 

That is, I would be competing for said attention unless I firmly delimited the usage of their devices in the teaching process.  Hence, while excited to have the nearly infinite “library” of the internet available when researching about a topic, I knew that “screens down” needed to be the norm rather than the exception when working in the classroom.

Sadly—even frighteningly—the intervening decades have demonstrated the value of my caution as the digital realm has enmeshed itself deeper and deeper into all our schools at a greater and greater cost to both individuals and society.  Again, I like Trotter’s words:

In its desire to embrace technology, our school district failed to recognize the social devolution that was taking hold of society.  The iPad Initiative came right as smartphones became virtually ubiquitous among American teens and adults…and at this crucial juncture, we decided to begin allowing students to use smartphones throughout the school day.  These students would not know how to set boundaries for how they used their phones.  They’d have no understanding of the psychological vulnerabilities that tech companies exploited—no training in how to use their phone without it using them.  Most of all, they’d have no environment where they could be free from the incessant psychic drain that had come to define their world.  Oblivious to any responsibility to help students or their families adapt better, our schools helped facilitate the community’s descent into becoming screen-addicted, constantly distracted people whose cognitive skills and attention spans were being chipped away rather than cultivated.[1]

What’s more, in the latest news from our tech dysfunction, people are apparently losing the ability to remain engaged in anything challenging or discomforting, even when doing so is critical to their well-being. Researchers Tracy Maylett and Tim Vandehey have demonstrated that “we’re [now] accustomed to programs that do what we call ‘reality switching’ in a very short time instantaneously and unconsciously, [and so] rather than stick things out, it’s much easier to impulsively say, ‘I don’t like this in the moment, and I’m done.’ ”  We then simply swipe on our devices to the next potential distraction, and that, they argue, is causing our brains to become “masters of swiping” rather than masters of achievement.

And that’s because achieving goals of any kind requires practice and patience, as well as what David Rock and Emma Sarro describe as “creating space for the quiet to reach the surface.” And since significant numbers of us are no longer developing the skills of practice, patience, and quiet as we bounce from one interruption to the next, we find ourselves in a society swiping its way through our days in almost nihilistic narcissism.

Yet as Trotter writes so eloquently:

I fear we’ve been engulfed in this world for so long now that the obvious solutions will sound extreme. You cannot allow an infinite device like the smartphone in the classroom and expect students to cultivate academic capacities that require an attention span. You cannot allow smartphone use throughout the hallways and cafeterias and expect students to develop social skills. And you cannot expect students, parents, or even teachers to navigate the incessant temptations of modern technology—to be anything but distracted pawns—without clearly defined limits and an educational campaign to teach them the boundaries required for healthy use. Every school district in America has some Band-Aid initiative to shrink learning gaps and respond to the surge in mental health disorders, yet few, if any, will address this most obvious saboteur.

And when an institution as conservative as the U.S. Surgeon General’s Office is issuing a warning about the impact of the digital age on the health of our children, then you know the jury is in on what needs to be done to rein in our technological lives.  The question is:  will we have the will to carry out the necessary sentencing?

[1]And have done so at a cost to learning I have explored elsewhere and extensively documented in many of my previous postings.

References

Maylett, T. & Vandehey, T. (2023) Swipe: The Science Behind Why We Don’t Finish What We Start.  Herndon, VA: Amplify Publishing.

Rock, D. & Sarro, E. (April 16, 2023) Forget Information. Focus on This Instead.  The Baltimore Sun. https://baltimoresun.newspapers.com/search/?query=%22forget%20information%22.

Trotter, S. (April 22, 2022) Hidden in Plain Sight: Putting Tech Before Teaching. Quillette. https://quillette.com/2022/04/22/hidden-in-plain-sight-why-we-should-stop-putting-tech-before-teaching/.

U.S. Surgeon General Advisory (2023) Social Media and Youth Mental Health.  The Department of Health and Human Services. https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-advisory.pdf.

Sometimes, It’s Hard Not to “Curse”

Early on in the pandemic, I shared some observations at the start of the 2020-2021 school year that opened with one word: “Pray.”  It was a depressing and dispiriting time as my Zoom zombies and I tried to hold class, and Notes from the Trenches remains probably the darkest thing I have ever allowed myself to publish.

Yet for nearly a month now, I have had to fight the urge to open all of my writing with that exact same word.

Because the news pouring out of education just keeps getting grimmer. 

It started with a report from Atlanta, where the elementary teachers there have documented that—due to the interruptions of the pandemic—74% of their third graders entered the year reading (if at all) at only a first-grade level. This leaves those teachers only this one year to try and get their students caught up for the higher reading expectations of fourth grade—where the demands for independent learning start to be significantly greater—and the stakes couldn’t be higher. Children who aren’t reading fluently by the end of third grade are four times more likely to fail to finish high school, and those who don’t earn their diploma are more likely to end up in prison (with a 70% greater chance if you are a person of color).  It is a race against time, and it is a race in schools across the country, where, unlike the Atlanta district, few are adding an extra 30 minutes to the school day and not all of them are using research-supported phonics-based curricula to tackle the problem (Go Atlanta!).

Yet, even such intensive efforts may be for naught given what researchers at Harvard and Stanford reported discovering in the data from nearly 8,000 school districts earlier this month.  When examining previous situations similar to the interruptions of the pandemic—e.g. a single district hit heavily by an outbreak of the flu during a given year—the researchers found not only the anticipated declines in reading levels; they found that the declines stayed put.  As Stanford’s Sean Reardon put it in an NPR interview:  “what was, I think, striking and surprising and a little sobering was that when there’s a big decline in one year, those cohorts don’t seem to catch up for those three or four years that we can follow them into the future.” Therefore, he warns, “parents and public officials shouldn’t just assume that schools can make up for all that lost ground because history shows in those test scores, without a concerted effort, much of it will just stay lost.”

And one of the potential impacts of it staying lost was revealed when the National Assessment of Educational Progress (NAEP)—sometimes referred to as “the nation’s report card”—released its latest results for 8th graders on the history and civics tests.  The scores were the lowest since the test’s inception, with civics dropping for the first time ever, and though it makes sense given the plummet in NAEP reading scores that came out last fall—since learning both history and civics obviously depends on reading comprehension—that doesn’t make it any less fraught with alarming implications for our future as a society.  As Brown University professor of political science, public policy, and education Jonathan Collins remarked when asked how worried he was about these results:

Well, deeply concerned – and not just because I think the understanding of the world is important, but I think there’s an additional step there, which is the understanding of the world in order to address the major problems that the world faces. So when we think about whether it’s climate change, whether it’s growing economic or racial inequality – these big, major, huge social problems that seem to be growing more and more by the day – if students aren’t getting a handle on how our political system works, then this impacts their ability to be a part of a structure that’s supposed to bring people together to solve these big problems.

Our civic institutions are already dysfunctional enough as it is right now without pouring this additional fuel of ignorance onto the flames.  Hence, I find it hard to learn of the NAEP results and not feel just a little disheartened.

Which brings me to the “salt in the wounds” provided by all this recent news.  As I shared in a previous post about the state of education today, the statistics on the teacher shortage in the U.S. show that we are risking a crisis, and apparently, governors around the country have finally awakened to this fact and are starting to call for raises in districts nationwide.  However, for many in education, it is a situation of “too little, too late” since what is sometimes referred to as the “pay penalty” for choosing to teach when compared to other college-educated professions has only grown over the past decade, reaching a record 23.5% in 2021.

Basically, what that means is that teachers now earn 76.5 cents for every dollar earned by other college-educated professionals, and while no one goes into teaching for the money, others in the “caring professions” (nursing, social work, etc.) don’t begin to have that big a gap.  My sister, the social worker, makes 18K more than I do and has a similar amount of leave time.  Hence, as if the current situation in education wasn’t wounding enough, now this new news. Like I said, “salt.”

I know; I know. Anyone who has read pretty much any of my other writing knows that I can’t leave it there.  I am simply constitutionally incapable of not responding to all of the pandemic’s impact on children’s learning with caring and support. Tomorrow morning, I’ll be right back in the classroom, authentically engaged, trying as always to “light candles.”

But oh, sometimes, it is damn hard not just to “curse the darkness.”

Coda

Hope.  And my kind of hope.  Hope as verb. 

While I was finishing the final edits before posting this essay, I took a break to read the paper, and there it was, a headline reading: “Kids’ reading scores soar across the South amid reforms.” 

It turns out that the perennial punching bag of educational failure and ineptitude, the state of Mississippi, has gone from having the second worst reading test scores among 4th graders in the country to only the 21st worst scores.  It, along with Louisiana and Alabama, has invested in training thousands of teachers in the science of reading (i.e. phonics!), and they are requiring every K-3 teacher, elementary principal, and assistant principal to take 55 hours of training in how to teach children to read. The consequences have been so dramatic that some in education are referring to it as “the Mississippi Miracle,” and one can argue that that is not hyperbole given Mississippi was one of only three states to see modest gains in their reading test scores during the pandemic, rather than the learning loss seen nearly everywhere else.

It is amazing what happens when we actually invest in our children. And pay attention to the science!

References

Carrillo, S. (May 3, 2023) National Student Assessment Has Educators and Legislators Worried.  NPR All Things Considered. https://www.npr.org/2023/05/03/1173776611/national-student-assessment-has-educators-and-legislators-worried.

Levy, M. (May 10, 2023) Raises for Teachers Make Governors’ Agendas.  The Baltimore Sun. https://digitaledition.baltimoresun.com/html5/desktop/production/default.aspx?&edid=6629bd6c-1045-4d60-ab0d-dfc40b85b777.

Lurye, S. (May 20, 2023) Kids’ Reading Scores Soar Across the South Amid Reforms.  The Baltimore Sun.  https://digitaledition.baltimoresun.com/html5/desktop/production/default.aspx?&edid=c6a27048-5d52-413e-93c2-ffb4ecf32ba8.

Martinez, A. & Turner, C. (May 12, 2023) How Much Learning Did Students Miss During the Pandemic? Researchers Have an Answer.  NPR Morning Edition. https://www.npr.org/2023/05/12/1175711862/how-much-learning-did-students-miss-during-the-pandemic-researchers-have-an-answ.

Pfeiffer, S.; Jarenwattananon, P.; & Lim, M. (May 3, 2023) 8th-Graders’ History and Civics Scores Drop on a National Test.  NPR All Things Considered. https://www.npr.org/2023/05/03/1173776447/8th-graders-history-and-civics-scores-drop-on-a-national-test.

Toness, B. V. (April 23, 2023) “Too Much to Learn.”  The Baltimore Sun. https://digitaledition.baltimoresun.com/html5/desktop/production/default.aspx?&edid=27971d2d-1a90-49e9-8cb5-c12bfdc0fc25.

Trotter, S. (April 22, 2022) Hidden in Plain Sight: Putting Tech Before Teaching. Quillette. https://quillette.com/2022/04/22/hidden-in-plain-sight-why-we-should-stop-putting-tech-before-teaching/.

A Letter to the Class of 2023

Struggles are the fertilizer for spiritual growth.
—Joyce Rupp

The work is often hidden and unglamorous, but it’s holy.
—Andy Stanton-Henry

Dear Members of the Class of 2023,

It has been quite the pilgrimage you have taken these past four years.  In spring of 2020, you were isolating at home, probably anxious about whether life would ever be normal again, and in the days and weeks since, you have dealt with Zoom and hybrid classes; you have worn masks and socially distanced; you have tested weekly for COVID; and you have even lost some of life’s normal rites of passage—all of it with the ever-present reality that school could close at a moment’s notice, disrupting life yet again.  Even now, as many routines, traditions, and habits have returned, the shadow of the pandemic lingers as each positive test forces a 5-day quarantine at home.  You are the class of the pandemic, and it has not been an easy four years.

Nor have the events of the larger world been any less challenging.  Since those first shutdowns you endured, you have witnessed an armed insurrection in our nation’s capital, the overturn of constitutional protections, Ukraine’s invasion by a despot, and far too many mass shootings of school children and police killings of people of color.  China is actually busy building new coal-fired power plants—even in the face of its own destructive climate events—and the economic turmoil coming out of COVID has generated inflation and other financial hardships not seen since the late 1970s.  It has, indeed, been one of the more tumultuous times to live through as you have matured into young adults.

Yet in that same time period, you have also observed the nearly miraculous creation of a successful vaccine in less than a year that saved millions of lives; you have seen the successful criminal prosecution of law enforcement for the murder of George Floyd; and you have witnessed the first nuclear fusion reaction to generate more energy than it consumed, laying the foundation for a truly carbon-free energy future.  A multi-national effort launched and installed a telescope a million miles from Earth for peering at the origins of the universe—an engineering marvel so complex that it had more than 300 potential points of failure threatening its final activation—and four countries (Colombia, Ecuador, Costa Rica, and Panama) created the Transboundary Marine Biosphere Reserve of the Tropical Eastern Pacific, the largest conservation area on the planet.

Hence, along with the bad has come the good; along with the good has come the bad, and in between, there have been thousands of sunrises and sunsets, moments of great joy and times of deep sadness.  There has been laughter and tears, days that were interminable and those you wished would never end.  There has been, as the wise author of Ecclesiastes once wrote, “a time for every purpose.”

And it is about that sense of balance and its agency that I want to write to all of you today.  We, your elders, have not always done a good job of achieving equilibrium with the world—or even within our own lives—and we now share with you (and will eventually leave with you) significant environmental and social problems that threaten to unbalance things even further.  Not least of which is the very technology I am using to write this letter.  As I have demonstrated to some of you firsthand, the device you are reading this on is simultaneously inhibiting your brain’s ability to function at full capacity—not ideal for tackling the kinds of problems we are leaving you—and if one adds in the fact that AI is now threatening our very capacity to discern what is true from what is false, it can be quite challenging to maintain any optimism about rebalancing both our lives and the larger world.

However, you will neither be the first…nor will you be the last!…to inherit the failures of those who came before you—a truth the Jewish scriptures sometimes refer to as “the sins of the fathers”—and in your own generation’s struggles with these “sins,” you will discover for yourselves what is perhaps the fundamental truth about what it means to be fully human:  that there is not and never can be enough time; so choose wisely the use of the time you have.

Granted, embracing this finitude is never easy, and it has frankly been my generation’s failure to do so collectively that has gotten our species into the current “hot mess” we are confronting.  But once you recognize and own that you cannot do it all—that each time you choose one course of action, you are by definition ruling out its alternatives…that you can never be everything you potentially could be—then you approach all of life’s choices with a better sense of balancing that of the Light and that of the Dark within you.

Because there will be both.  No one walks through life without times of alienation, and no one walks through life without times of grace.  Indeed, what my many students have taught me over the decades is that the dance of estrangement and redemption, of sin and salvation, of harm and healing…this dance is the moral and spiritual equivalent of breathing.  And just as biological breathing provides the body’s agency for action, so too does the soul’s “breathing” give agency for purpose—the power to effect change.

Which brings me to a question I have now asked my graduating seniors at the end of every year for 34 years: what will you do with yours? What will you do with your power? Will you consume simply to consume the way so much of our society continues to do? Will you invest your resources without thought to their potential footprint on the world? Will you support legislating to maintain the systemic structures of white, hetero-normative privilege? Or will you actively care for the Other? Will you deliberately fight for social and economic justice? Will you thoughtfully choose your impact on the environment?

Or, to paraphrase my favorite proverb, will you merely curse the darkness or will you light candles against it?  

Will you willingly tackle the “sins” you’ll inherit?

I will never get to see your answer.  But I have joined the caring adults in your lives who have done our best to give you the tools to walk a path that is both meaningful for you and a positive one for those around you, and it is with faith, hope, and love that we part ways, launching you into the next stage of your journey.  It won’t be an easy one; nothing that truly enriches your life ever is.  However, if the pandemic has given you nothing else, it has made you resilient beyond your years, and thus, I have the fullest confidence in your capacity to successfully confront life’s future demands.

In the meantime, to quote Dr. Seuss—a requirement of every graduation event—“Oh, the Places You’ll Go!” Congratulations and best of luck!

References

Burkeman, O. (2021) Four Thousand Weeks: Time Management for Mortals. New York: Farrar, Straus and Giroux.

Gazzaley, A. & Rosen, L. (2016) The Distracted Mind: Ancient Brains in a High-Tech World. Cambridge:  The MIT Press.

Maylett, T. & Vandehey, T. (2023) Swipe: The Science Behind Why We Don’t Finish What We Start.  Herndon, VA: Amplify Publishing.

Rupp, J. (2005) Walk in a Relaxed Manner: Life Lessons from the Camino.  New York: Orbis Books.

Stanton-Henry, A. (April 1, 2023) Ten Miles Around.  Friends Journal. https://www.friendsjournal.org/ten-miles-around/.