Rage, rage against the dying of the light… Do not go gentle into that good night. —Dylan Thomas
It started with an assignment. My students were learning to use the standard APA-style citation method employed in the sciences, and one of my students, a faithful and almost fanatical rule-follower, kept calling me over to ask how to cite his next item of research. After multiple attempts at re-explaining the process, I finally just asked this student to show me his screen. This is what I saw:
Now, my student hadn’t done anything atypical of today’s learner. He had typed his query directly off my instruction sheet into Google and awaited the response. It is, of course, not a good research habit (and one I keep trying to fight), but when I saw what it had produced, I was unnerved; I had not realized how much AI had invaded internet search engines. Here I had spent all this time teaching my students how to vet websites for academic and scientific reliability—an essential critical thinking skill, especially in today’s flood of misinformation and disinformation—and yet, here, confronting me on my student’s screen was an AI summary of only potentially relevant sources with no distinct authors or web addresses for my student to cite. No wonder he was confused!
So I showed my student how he could click on the little link symbol you can see there in the image, right after the word “change,” to bring up the list of websites the AI had used for its summary, and I demonstrated how to find the source he needed among those sites so that he could formally cite it in his project. But if not for my own critical thinking skills enabling me to know what the AI was doing, both my student and I would have been left in the dark, making unsubstantiated claims, reporting the thoughts of others as our own without any attribution to the original thinkers. The literal definition of plagiarism.
To say that I, as an educator, was appalled and alarmed by this development is like stating that hydrogen bombs make a noise when they go off (understatement intended!). However, I shortly thereafter read an editorial piece on Bloomberg that reminded me that my collegiate-level colleagues have it even worse right now. At the preK-12 level, good schools are still doing a lot with pencil and paper in their classrooms, including formal assessments that require actual knowledge and the ability to think through a problem unaided by technology. But presently in academia—at institutions whose very raison d’être is the production and refinement of critical thinking!—“outsourcing one’s homework to AI has become routine” and “assignments that once demanded days of diligent research can be accomplished in minutes…no need to trudge through Dickens or Demosthenes; all the relevant material can be instantly summarized after a single chatbot prompt.”
Even more incredible (confirming a rumor I’d heard) is the fact that apparently more and more professors are starting to employ AI themselves to evaluate student work, leading to the mind-boggling and ultimately untenable reality of “computers grading papers written by computers, students and professors idly observing, and parents paying tens of thousands of dollars a year for the privilege.” The Editorial Board of Bloomberg News is indeed spot on when they declare that “at a time when academia is under assault from many angles, this looks like a crisis in the making.”
The coffin’s nail for me, though…the camel’s straw, the road’s end, the coup de grace…pick your cliché for finality and mine from this past month was the screenshot below:
I had read this remarkable article in Scientific American on the genetic fluidity of sex and gender in sparrows, and I wanted to share it with my fellow biology teachers for use in our inheritance unit next year (as well as in some separate electives we each teach). So I scanned the article as a PDF to make it more permanently accessible for all of us, and that’s when I saw the message from Adobe up there in the left-hand corner: “This appears to be a long document. Save time by reading a summary.”
I spluttered; I fumed; I cursed:
“Of course it’s a long document, you [expletive deleted] piece of software! That’s the whole point! To provide the reader with rich, nuanced knowledge and understanding of one of the most complex ideas in all of biology!!! If I had wanted my colleagues and me to have a [further expletive deleted] ‘summary,’ I first would have written it myself before giving it to them, and then I still would have provided them the formal citation!”
In case you cannot tell, gentle reader, I was pissed. Pissed at the seeming systemic and systematic attack on the human capacity to think (let alone actually valuing that capacity). Pissed that there is clearly a market for this disparagement of thinking, and pissed that so few in our world seem to be upset by this dying of the light. I have known that scientific reasoning has been under assault for some time now, but the death of basic thinking itself?!
I know, I know. One more thing to add to the agenda for my often Sisyphean-feeling profession. But I’m not just pissed. I am also deeply concerned, and something neuroscientist Hanna Poikonen wrote earlier this year is a good way to end this brief ragging on my part:
Each time we off-load a problem to a calculator or ask ChatGPT to summarize an essay, we are losing an opportunity to improve our own skills and practice deep concentration for ourselves…when I consider how frenetically people switch between tasks and how eagerly we outsource creativity and problem-solving to AI in our high-speed society, I personally am left with a question: What happens to our human ability to solve complex problems in the future if we teach ourselves not to use deep concentration? After all, we may need that mode of thought more than ever to tackle increasingly convoluted technological, environmental, and political challenges.
“May need” indeed. My money’s on “will,” not “may.”
References
Maney, D. (March 2025). The Bird That Broke the Binary. Scientific American, pp. 48–55.
Poikonen, H. (February 2025). How Expertise Improves Concentration. Scientific American, pp. 81–82.
Change your thoughts and you change your world. —Norman Vincent Peale
Smile, breathe, go slowly. —Thich Nhat Hanh
In my most recent essay, Flailing to Thrive, I left off suggesting that there might be one more thing we could be doing as a society to address the struggles that males in our culture have been documented as dealing with lately. I can now share that the reason for my pause is that this “one other thing” doesn’t just involve the sorts of focused interventions I discussed in that essay. Instead, what I think we could be addressing as a society to benefit our boys and young men as they grow up would benefit our girls and young women as well. Specifically, I think we need to change how we socialize all our children as they mature.
For example, despite all humans being equally capable of experiencing the full range of possible emotions, we regularly teach our children otherwise “through the gendered use of language.” From an early age, our children learn “that certain emotions are more acceptable for girls than for boys and that women talk more about their feelings,” and studies have shown that significant numbers of mothers are “more likely to use emotional language when speaking with four-year-old daughters than with sons that age.” (Agarwal, p. 75). Consequently, a number of adult males in our society struggle with the healthy expression and processing of certain emotions, and this, in fact, is one of the reasons why men have the higher rates of suicide discussed last time and why dedicated intervention programs targeted just for men have needed development.
However, the “genderfication” of emotions is only a tiny subset of the role the affective domain has played in our socialization process. For millennia in Western culture, there has been a bifurcation between the so-called “rational” and the so-called “emotional,” and ever since Heraclitus stepped into his river and Zeno found his paradox, the latter has been severely denigrated (along with the gender that has historically been most associated with it). Oh, there have been intellectual moments of rebellion—the Epicureans, the Medieval mystics, the German and English Romantics of the 19th century—but for over 2,500 years in our society, reason has been affirmed as the supreme ruler of the cognitive domain and men declared its primary purveyors.
Or at least this was the case until recent neuroscience—with its fMRI scans—came along and dismantled this paradigm entirely. For instance, we’ve known for almost two decades now that the brain does not engage in any kind of bifurcation of the “rational” versus the “emotional.” Something as strictly analytical as the equation 2+2=4 has an emotive component to it, and even the darkest grief has its ratiocinative side. As I like to phrase it for my students, “every thought has a feeling; every feeling has a thought.”
Today, though, we are actually able to observe the neural networks involved in all this brain processing, and what that is revealing is revealing for this discussion. To understand how, let us take a brief detour and familiarize ourselves with three of the most important of these networks. One (and the one you are employing the most right this very moment) is the Executive Control Network, or ECN. This network enables each of us to pay attention to a specific task at hand (e.g., reading this essay), to identify and employ the necessary rules (e.g., the syntax and grammar of reading), and to manage the behaviors needed for successful completion of the task (e.g., control of eye movements and body posture).
The ECN then alternates with the Default Mode Network, or DMN, which is the part of your brain most active when you are simply staring off into space. The DMN is what you employ when you are reflecting without any deliberate intent, and it is responsible for the creative problem-solving process (the so-called “Aha!” or “Eureka!” moment). Indeed, as the person writing this essay, I regularly drift off and wait for my DMN to generate my next sentence or paragraph.
Which brings me to the Salience Network or SN. This portion of our brain literally keeps us alive (heart pumping, lungs breathing, etc.) and generates the necessary emotional states—both simple and complex—required for survival as a member of a social species. Yet the SN is also fully integrated into both the ECN and DMN, serving as the active switching mechanism between the two. What that means is that what we frequently think of as the “real” work of the brain—generating ideas, solving problems, learning, etc.—actually involves the very system of the brain that keeps us alive…including our emotional states. Hence, as neuroscientist Mary Helen Immordino-Yang puts it, “emotions, rather than interfering with clear-headed thinking, drive clear-headed thinking—thinking that is rational, responsive to circumstances and morally aware” (p. 51; original emphasis).
What that means for how we socialize our children is profound. Whenever we “genderfy” emotions and/or perpetuate the “rational vs. emotional” bifurcation myth, we interfere with how robustly the brain connects its SN circuits to both the ECN and DMN, and the link between this interference and an increased vulnerability to mental illness—especially in teens—is becoming well documented. Individuals who get “stuck” in their ECN due to weak SN connections are more prone to the various anxiety disorders, while individuals who get “stuck” in the DMN are more likely to experience clinical depression. Either way, how we socialize our children around their emotional experiences directly impacts their brain development and how effectively their brains function; so being a bit more deliberate about it as caregivers and avoiding all manner of emotional “genderfication” would benefit everyone involved.
Especially in today’s digital wasteland of a cognitive environment. There, according to MIT theoretical physicist Alan Lightman, we have trashed the ecology of our inner lives as badly as we have the ecology of the natural world, and we have done so for quite some time now. He, like Oliver Burkeman, attributes this to how we have blended our frenzied obsession with managing time with the ever-present technologies we allow to hold our attention 24/7, and he insists that unlike the actual planet—where we have begun to acknowledge our harm and are even starting some interventions to repair things—the damage to our inner lives remains hidden from view, unrecognized and unaddressed.
Now, in full disclosure, I have not read Lightman’s In Praise of Wasting Time, where he presents his arguments and offers suggestions for remediating the problem. I am relying instead on remarks he made in his interview with Rick Steves. But this notion that we have polluted our inner lives as badly as we have polluted our outer ones resonated so deeply with my work with today’s adolescents that I felt compelled to share it. Particularly because that is what the process of socialization does: it informs the construction of the inner life we each employ to generate our public life. Thus, if we are dumping social media’s toxic waste there and poisoning the atmosphere with “genderfication” and AI-generated contaminants, we risk socializing our children—both our boys and our girls—to build inner lives that are fundamentally dysfunctional.
Moreover, for over a dozen years now, we have seen what that does to people’s public lives in our society. Just this past month, I had the misfortune of witnessing a man and a woman on a public street, in a relatively posh part of town, screaming invectives at each other over a harmless traffic error, a situation that rapidly escalated to language shouted aloud that I would be ashamed to say in the privacy of my own head. What’s more, I felt actual shame when—rather than risk intervening to help de-escalate what was happening—I sped up my pace to walk away from the scene as rapidly as possible, because in the back of my head was the thought: “What if one of these idiots pulls out a gun?” Such is the world our collectively polluted inner lives have produced.
So what are we to do about all this? If you’re a parent or guardian, get your child off screens. More importantly, get yourself off your screens. Stare off into space and clean up some of the litter in your own inner life. Think about your word choices when it comes to emotions, and model what healthy emoting and emotional processing look like. Be your best self as much as possible (and be generous with yourself when you inevitably are not). If you are an educational institution, ban smartphones of any kind from your classrooms if not your entire campus, and deliberately teach emotional intelligence in your curriculum. More and more schools have started to realize they need to do both, but we are still far short of a critical mass. Finally, if you are a fellow educator—committed to authentic engagement with your students—remember that hope is a verb: if we do not work determinedly to keep illuminating the darkness, then (to paraphrase Dylan Thomas) the not-so-good night wins.
Coda
I have written variants of the preceding paragraph so often now that I feel like one of those old scratched LPs where the needle keeps going over the same groove again and again—i.e. the proverbial broken record. However, I also know that if I remain silent, if I do not repeat myself however many times it takes, then I am not actively hoping the way I fundamentally believe we are all called to do. Which leads me to close this essay with a Haitian proverb that recently crossed my path: “Beyond mountains, there are mountains.” Or as Miley Cyrus once sang, “it’s the climb.”
References
Agarwal, P. (February 2025). Emotions Are Not Gendered. Scientific American, pp. 74–75.
Immordino-Yang, M.H. (February 2025). Growing the Adolescent Mind. Scientific American, pp. 48–55.
If you are here unfaithfully with us, you are causing terrible damage. — Jalal Al-Din Rumi
This topic is a challenging one for me. Those closest to me know that I am not the biggest fan of my half of the species and that I can tally on one hand the number of fellow males I would count among my close friends. In fact, I usually simply tolerate most of the other males in my life. I loathe the banal culture of the “locker room,” and I am so actively antagonistic to the patriarchy that I like to repeat my mother’s joke that she raised two children and one feminist, and that it wasn’t her daughter. Bottom line: I much prefer the company of women—to the degree that in classic couples situations, where the men and women usually pair off with their respective genders, you will find me in the kitchen with the women. There is a reason I spent the majority of my teaching career at a single-sex all-girls school.
However, today I find myself once again in a fully co-ed environment where I have a professional duty to authentically engage all my students in order to nurture them into their best authentic selves, and so I read Claire Cain Miller’s article in the New York Times with a profound sense of downheartedness. I already knew that suicide rates have always been generally higher for men than for women and that those rates have increased for all young people in the past decade—much of it directly attributable to the impact of social media ([expletive deleted] Snapchat!). But to learn that the suicide rate in the population of males I work with has effectively doubled from 11 per 100,000 to 21 per 100,000 since 1968 was disturbing, to say the least. That’s over 4,600 teenage boys and young men dead by their own hand in 2023 alone—a rate that only goes up as they age.
Why? What could be causing an increasing number of males—in a fundamentally patriarchal society!—to fail to thrive? Part of the answer seems to be economic. As the types of positions traditionally identified with masculinity—so-called “blue collar” jobs—have been increasingly eliminated by robots and other forms of automation, the remaining employment opportunities, and those with steady job growth, rely more and more on the so-called “soft skills” traditionally associated in our culture with women. Which, in a patriarchy, can be viewed as problematic. As Tracy Dawson, a 53-year-old unemployed welder from St. Clair, Missouri, made abundantly clear in a 2017 interview: “I ain’t gonna be a nurse; I don’t have the tolerance for people. I don’t want it to sound bad, but I’ve always seen a woman in the position of a nurse or some kind of health care worker. I see it as more of a woman’s touch.”
Of course, attitudes such as these have been around for a long time (pop culture was recognizing this fact as early as the late 1970s, and Bruce Springsteen made a career out of examining them). However, Robb Willer, professor of sociology at Stanford, is blunt when he states that, today, “the contemporary American economy is not rewarding a lot of the characteristics associated with men and masculinity, and the sense is those trends will continue.” So where does that leave the Tracy Dawsons of this world? It leaves them under- or unemployed in an increasingly shrinking part of the workforce (see chart below)—with all the consequent potential to undermine an individual’s sense of self and well-being.
Yet underlying this employment issue, and any consequent changes in how men in America perceive themselves today, is an even deeper root cause, one that directly impacts me as an educator. Since learning is the gateway to everything about a person’s life, any change in educational status will affect a person’s entire existence, and the reality is that today, starting as early as kindergarten, boys are arriving in our schools less prepared than girls in both academic readiness and behavior. The likely reason is the increased focus on college readiness that has taken over schooling in the past two decades, forcing educational institutions of all kinds to emphasize academics at earlier and earlier ages. That is something boys, who usually mature later than girls, are less equipped to handle, and as a result, boys are not getting the same academic head start that girls now are. Furthermore, this gender gap in academic performance persists as both sexes move up through the grade levels, resulting in women being more likely to graduate, earn higher G.P.A.s, and go on to college. Indeed, women now outnumber men at the college and university level, with 66% of female high school graduates enrolling compared to 57% of their male peers.
Again, where does this leave the young Dawsons of this world? Well, since the link between graduating from college and broader career prospects and higher earnings is well documented, it leaves a lot of them increasingly left behind economically, frequently still living with their parents, and ever more susceptible to the reckless ravings of an autocrat. As Jonathan Rauch articulates in his Constitution of Knowledge, these are the men who hear the perfectly authentic and valid challenge to their male privilege, look at their employment prospects and long-term financial outlook, and reply, “Privilege?! What privilege?!” It is precisely because the implied social contract of the American patriarchy told them that simply being male guaranteed them a degree of status in our society that the perceived failure to deliver on that “promise” has resulted in men who will storm our capitol, vote for a self-declared “dictator for one day,” and sometimes literally kill themselves out of despondency.
So what are we, as a society, to do? The feminist in me may be tremendously excited by the data showing how far the status of women in our country has improved since my childhood (I still can’t believe my own mother once could not have her own credit card!). What’s more, the educator in me knows how far there still is to go for women to achieve true equity with men in this country (especially in the face of the patriarchy’s current pushback under the Trump administration). However, just because I personally am not a cheerleader for men does not mean I believe that they somehow do not deserve lives of meaning and purpose. ALL humans deserve that. Indeed, the foundational flaw of both the patriarchy and systemic racism is their refusal to believe this very thing!
However, the automation of the workplace continues unabated, and with AI, this is going to start being true of some of the so-called “white collar” jobs as well. Thus, it will not just be the unemployed welders and longshoremen dealing with the ennui in their lives; it will also be the unemployed estate lawyers and radiologists confronting their lack of purpose. Which brings me full circle, after my brief (but important) digression, to my original question: what do we do about this?
There are at least two things in education we could do right away. The first is to consider restructuring the configuration of our early elementary classrooms when it comes to males. Just as there is data showing that single-sex classroom environments benefit middle-school aged girls in the math and science disciplines (and there are co-ed schools both public and private that segregate their populations accordingly for these classes during those years), there is data suggesting that a single-sex environment may benefit K-3 boys in terms of behavioral discipline problems, enabling them to focus better on their learning at this critical age.
Which leads to the second thing schools could be doing to address why some boys and young men are falling behind: teach and employ restorative justice practices in our schools instead of the more traditional punitive approach. The data is clear: boys are far more likely to receive punishments (and frequently harsher ones) for poor decision making than girls—especially among children of color—and the data is equally clear that by using restorative justice techniques, teachers and administrators alike can help students better manage their emotions and behaviors and find constructive resolutions in situations of conflict. Schools that employ these practices have shown improved academic performance, and they are safer communities for their inhabitants—again, particularly for children of color.
One additional thing I think we could be doing to address the segment of boys and young men in our population who are struggling to thrive is to reconsider which intelligences we choose to value. Historically, we have tended to undervalue the kind of critical thinking and problem solving associated with certain jobs such as waiting tables or wiring a house—or welding. But in the recent hyper-focus on “college readiness,” practical, less traditionally academic intelligences have received progressively fewer formal supports. The vocational tech programs of my youth—we had an entire high school in my district devoted to them—have been steadily dismantled and their government funding withheld or withdrawn, to the point where we now have a critical shortage of such labor in this country. Resurrecting the vocational tech schools of the past, as educator Mike Rose points out, would go a long way toward addressing a whole host of issues confronting our society—one of which, I would like to suggest, could be providing the young Dawsons of our society with both a sustainable income (no one’s automating plumbing for the foreseeable future) AND a sense of meaning and purpose.
As for the one other thing I think might be helpful when addressing this essay’s topic, I will save that for next time.
Treat people as if they were what they ought to be and you help them become what they are capable of being. —Goethe
I have never been more afraid for America’s future in my life. —Thomas Friedman
In the original TV series Dragnet, the character Sgt. Joe Friday is alleged to have said, “Just the facts, ma’am.” But like Bill Clinton’s association with “it’s the economy, stupid,” it is a total fabrication. The famed comedian Stan Freberg said something similar in his parody of the show, and what would now be called a meme was born, with “just the facts, ma’am” forever associated—incorrectly—with Joe Friday. However, just as the meme connected with former President Clinton served as a useful lens for an earlier essay about education in this country, “just the facts” is an ideal one with which to start this posting; so here are just a few of the most relevant ones:
40% of fourth graders today read below the basic level on the National Assessment of Educational Progress (NAEP), meaning that they “cannot grasp the sequence of events in a story.” It is the worst performance for this grade level in 20 years.
33% of eighth graders today also read below the basic level on the NAEP, meaning that they “can’t grasp the main idea of an essay or identify the different sides of a debate.” It is the worst performance for this grade level in the five decades since the inception of the exam.
In terms of reading engagement outside of school, 34% of fourth graders now report that they read 30 minutes or less each day, and though a mere 34% of eighth graders reported reading for fun in 1984, that number had dropped to 14% by 2023.
As for the United States’ adult population, 30% of them can only read at the level of a 10-year-old, and both numeracy and literacy levels as measured by the Program for the International Assessment of Adult Competencies have dropped consistently among those ages 16-65 (see graphic).
Now, since literacy of any kind is the foundation for the ability to reason and the basis for all the background knowledge needed to make good decisions in a complex world, these facts are extremely problematic—and that is a very generous understatement. As New York Times columnist David Brooks puts it—quoting retired generals Jim Mattis and Bing West—“if you haven’t read hundreds of books, you are functionally illiterate, and you will be incompetent, because your personal experiences alone aren’t broad enough to sustain you.” Reading—and lots of it—is the keystone of our capacity for critical reasoning, and just as the absence of a keystone species in an ecosystem will lead to its collapse, the absence of reading in a country’s population is a recipe for the breakdown of our entire social order.
And before I am accused of hyperbole: I have been witnessing the potential for this breakdown in my own classes for over a decade now. Like Anya Galli Robertson, who teaches sociology at the University of Dayton, I too have continued to “give similar lectures, assign the same books and give the same tests that [I] always have,” and like Professor Robertson, I have seen firsthand how “years ago, students could handle it; now they are floundering.” Moreover, while the mental coddling I’ve written about before is definitely playing a role in this situation, the even bigger cause of this general decline in my students’ collective IQ, CQ, and EQ is their poor reading habits, habits due in no small degree to the amount of screen time spent on their phones.
Also (to quote Brooks again):
Not just any screen time. Actively initiating a search for information on the web may not weaken your reasoning skills. But passively scrolling TikTok or X weakens everything from your ability to process verbal information to your working memory to your ability to focus. You might as well take a sledgehammer to your skull.
Or more accurately, a broom. To see why, a little brain science from my own classroom is in order. Each year around this time, I have my senior anatomy class perform a series of experiments. I give them a standard short-term memory (STM) test in the absence of their cellphones; we do a few other learning activities; then they take the exact same test a second time while grasping their phones in their hands, after playing with the devices for two minutes. The data is scored and loaded into spreadsheets, and then we wait until the next class, where we repeat the exact same sequence of events with a different but equivalent STM test—only this time, no phones are present at all. Again, the data is scored, and I “innocently” ask how many of them scored better the second time—to which every hand in the room rises, and I use this fact to introduce the concept of working memory.
Put simply, working memory is like a temporary storage shelf that your hippocampus uses to hold items from your immediate STM that you might eventually want to add to long-term memory (LTM). It’s a parking lot for thoughts and experiences awaiting evaluation as to whether they are important enough to commit to your LTM, and it’s why you can recall what you had for dinner last night—something that is no longer in your current STM awareness—but cannot say what you had for dinner a month ago (unless you have one of those extremely rare autobiographical memories). Basically, your working memory still has last night’s dinner on its shelf waiting for processing, while nearly every previous meal you’ve ever eaten has been swept from the shelf as not significant enough for LTM (again, those special meals you do remember got the required importance tag).
Having taught all this to my students, what I do next is bring up the graph below, and this is when their eyes all widen and why I do not, like David Brooks, have to say “so the main cause is probably screen time” (my emphasis). The blue line represents the impact on STM of asking it to store and recall increasingly longer sequences of random letters. It is the averaged student data from the very first STM test, and it is exactly the trend neuroscience would expect. The yellow line represents what neuroscience says should have happened after my students took the exact same test a second time that first day (and which did happen with the second STM test). The red line, though, is what happened when my students were holding their phones, after playing with them, while taking the exact same test a second time: the mere physical presence of the devices had wiped their working memories clean. Groundhog Day for the brain, every day, 365 days a year.
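For readers curious about the scoring step behind those averaged lines, here is a minimal sketch in Python of how per-condition recall curves like these can be computed. The condition names, sequence lengths, and pass/fail values below are purely illustrative placeholders, not the author’s actual classroom data.

```python
# Hypothetical scoring sketch for a classroom STM experiment.
# Each student attempts to recall letter sequences of increasing length;
# a trial is marked 1 if the sequence is recalled exactly, else 0.
# All values here are made up for illustration.

# scores[condition][sequence_length] = pass/fail results across students
scores = {
    "no_phone":   {4: [1, 1, 1], 6: [1, 1, 0], 8: [1, 0, 0]},
    "phone_held": {4: [1, 0, 1], 6: [0, 0, 1], 8: [0, 0, 0]},
}

def recall_curve(condition):
    """Average pass rate at each sequence length for one condition."""
    return {length: sum(results) / len(results)
            for length, results in scores[condition].items()}

baseline = recall_curve("no_phone")      # e.g., the "blue line" trend
with_phone = recall_curve("phone_held")  # e.g., the "red line" trend
```

Plotting each dictionary (sequence length on the x-axis, average pass rate on the y-axis) would produce curves analogous to the blue and red lines described above, with the phone-present condition falling below the baseline at every length.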
Anyone not unnerved at least a little by this data about our devices is probably not reading this essay in the first place, but for anyone who remains unconvinced, I will let David Brooks make the case:
My biggest worry is that behavioral change is leading to cultural change. As we spend time on our screens, we’re abandoning a value that used to be pretty central to our culture — the idea that you should work hard to improve your capacity for wisdom and judgment all the days of your life. That education, including lifelong out-of-school learning, is really valuable.
However, as I reminded my seniors this year, let’s be generous and assume anyone reading this essay gets that our society’s changing habits about reading and learning may be endangering our very future. Then the logical question to ask next is: how is our society handling this potential crisis? Again, “just the facts” can be useful:
The Baltimore City Public Schools have had to close their reading-remediation tutoring program for 1,100 students because of the withdrawal of $418 million in promised pandemic recovery funds (as a district, they will not be alone).
The former CEO of World Wrestling Entertainment—the “apotheosis” of demanding intellectual engagement!—has been confirmed as the next United States Secretary of Education, with the explicit charge to dismantle and destroy the entire department (the executive order was signed a month ago).
Harvard University has lost more than $2 billion in federal research funds for having the temerity to basically say that critical thinking matters (with additional threats to their tax-exempt status on the line).
And, finally, as a country, we have ceded to China the global leadership in research output in the fields of chemistry, physics, and earth & environmental science (with biology and the health sciences soon to follow due to the recent defunding of the NIH and the firing of many of their scientists).
That last fact may be the most telling one, and it is why I was sorely tempted to title this essay “The Stupidifying of America.” Our collective education system in this country no longer produces enough “home grown” PhD scientists and engineers, as well as other levels of expertise, to meet our most basic economic needs, and the “cruel farce” that is the Trump administration is simply going to make things worse. As Thomas Friedman points out:
Do you know what our democratic allies do with rogue states? Let’s connect some dots. First, they don’t buy Treasury bills as much as they used to. So America has to offer them higher rates of interest to do so — which will ripple through our entire economy, from car payments to home mortgages to the cost of servicing our national debt at the expense of everything else…[Thus] bond yields keep spiking and the dollar keeps weakening — classic signs of a loss of confidence that does not have to be large to have a large impact on our whole economy…[Furthermore,] you shrink all those things — our ability to attract the world’s most energetic and entrepreneurial immigrants, which allowed us to be the world’s center for innovation; our power to draw in a disproportionate share of the world’s savings, which allowed us to live beyond our means for decades; and our reputation for upholding the rule of law — and over time you end up with an America that will be less prosperous, less respected and increasingly isolated.
Like Friedman, I am truly frightened for our country, but like Goethe, I know what I need to do in my small corner of influence to combat the rising tide of ignorance, anti-intellectualism, and antipathy. As the sign at one of the Hands Off protests suggests, I’ll keep teaching critical thinking to my students—in the hope that future elections might turn out for the better.
Lukianoff, G., & Haidt, J. (2018). The Coddling of the American Mind: How Good Intentions and Bad Ideas Are Setting Up a Generation for Failure. New York: Penguin Books.
I was raised, if your heart’s beating, you play. —Gary Woodland
Dear Members of the Class of 2025,
Several years ago, when I first started my project to help improve education in this country, I wrote a letter to my graduating seniors in the midst of the worst of the pandemic’s lockdowns and posted it for them to read from the isolation of their homes. I spoke a lot about the generative power of truth and the corrosive power of lies, challenging them to build a better world than the one they were inheriting and reminding them that “hope” is a verb. Today, variations on those themes have featured prominently in every letter I have written to each graduating class since, and this one will not be an exception.
Part of the reason for that, of course, is because moments of closure in our lives, milestones that mark the end of one journey and the start of another…they just naturally lend themselves to recalling the needs and demands, the ideals that inform every journey. It’s why all commencement addresses fundamentally sound the same: use your potential wisely; pack appropriately for the trip; stop and reflect from time to time; remember to love and be loved; and…here are three life lessons to aid you on your way!
Put simply, these moments of closure remind those of us older than you of similar times in our own lives, and because we care, we just want to provision you with some final wisdom for the road—to prepare you for the occasions of darkness we know inevitably await you. I know. Pretty heavy stuff for such a celebratory occasion. But like my letter to the class of 2020, I find myself writing once more during a time of tremendous turmoil, with a petulant child trying to tear it all down because he never learned how to work and play well with others, and thus, I find myself needing to be a little more overt about those “three life lessons.”
One of which (and it is extremely appropriate to our current situation) is: avoid “magical thinking.” This is the term anthropologists use to refer to ritualistic behaviors done with the intent of somehow modifying something over which one has no actual control (e.g., if we sacrifice this goat, the rumbling volcano will not erupt). But, in today’s broader parlance, it can also refer to thoughts or deeds that simply ignore this lack of control. They can be as harmless as the superstitions behind game-day rituals before a sporting event or as devastating as the delusion that tariffs will cause corporations to abandon their overseas investments and rebuild in the United States. However, the consequence of any magical thinking is always the same: engaging in actions that cannot have any actual bearing on reality.
Not that the actions themselves do not have consequences. The goat is dead; the “lucky” jersey must be washed; markets tumble. But the intent behind the actions remains no less disconnected from their ultimate impact, and it is this intent that can be truly hazardous.
Which leads me to perhaps the most dangerous magical thinking of all (and “life lesson” dos!): the notion of “the Perfect Life.” This is the misbelief that “if I just go to the right school…if I just marry the right person…if I just find the right career…if, if, if…if I just do the right things, then my life will be exactly the way I want it to be.” It is the fantasy that you can achieve a life completely free of frustration, boredom, discomfort, and disappointment, and quite cynically, it is a fantasy that quite a few people make a LOT of money off of—especially today’s social media influencers who try to convince you that if you just follow their lead, buy their product, do as they do, etc., then all will suddenly become bliss. Indeed, an entire medical field exists because of the magical thinking behind the notion of a Perfect Life, earning its practitioners $11.8 billion in 2022 alone—and that’s not including the cosmetic industry itself. All of them, people and companies alike, act with the express intent of making you feel inadequate about yourself so that they can sell you something.
However, there is no such thing as the Perfect Life; there never has been, and there never will be. Moreover, while all of us will engage in the occasional wishful thinking to cope emotionally with life’s finitude—the “if only I can get through this week, then everything will be okay” moments—it is when this wishful thinking turns into magical thinking that we run into trouble. When “if only I…” becomes the primary motivating force in your life, you condemn yourself to a Sisyphean existence of dismay and defeat. And that’s because there will always be a next “if only I…”—some obstacle to your “final” success, some obstruction to your “ultimate” happiness—and in the meantime, you have wasted who knows how much of your finite time on this planet feeling disappointed, disillusioned, and dyspeptic.
Therefore, do not wait until you are a middle-aged, career-obsessed individual with ulcers to learn not to engage in the magical thinking of the Perfect Life. And along the way, try to avoid Perfect Life’s cousins: “You Can Have It All” and “You Can Be Anything You Want to Be.” No. You cannot. Period. I want each of you to know (as I have written before) that you will have numerous opportunities to do a wide variety of things in this world and that, as an educator, I hope I have helped you begin to decide which of those choices you might finally find yourself investing in one day. But you are a finite organism on a finite planet with a finite lifespan (read The Price of “Pie” if you want to see just how finite), and thus, you will have no choice but to make lasting decisions about how to spend your finitude (remembering that failure to choose is itself a choice). You cannot have it all; you cannot do it all. And you cannot even do everything you wish for; hence, I encourage you to make decisions along your journey that are as thoughtful and informed as they can be (knowing that you will never have all the data) and then invest yourself as best you can, remembering that life is not a “to do” list.
Which brings me to that mandated third “life lesson” required of all commencement moments everywhere: you always have a choice; you just have to be willing to pay the cost. Want to become a neurosurgeon? Then you will give up nearly two decades of training time that won’t be available for family and friends. Want to have children? Then you accept the dozens of years’ worth of financial and emotional burdens required to raise them to adulthood (and often beyond). Want a life partner? Then you need to embrace all the daily compromises that making it happen will demand. The simple reality is that free will does not mean freedom from consequences; it simply means that part of any decision is determining whether it is worth the price or not. It can be as simple as choosing to do A rather than B on a weekend afternoon, knowing that B will not get done. Or, it can be as dramatic as quitting a job in protest, knowing that financial insecurity just became your new reality. Regardless, as renowned psychotherapist Sheldon B. Kopp once put it, “you are free to do whatever you like. You need only face the consequences.”
That last line, though, has to be one of the scariest ideas ever because if we join it together with the imperfect nature of our finite lives, we can find ourselves frightened that we are somehow not making the “right kind” of choices—the kinds of choices that are somehow worthy of their consequences. Then we risk trapping ourselves in a vicious cycle of indecision where we put absolute value on each choice as if the very worth of our lives was on the line every time. We risk becoming immobilized in the quest for so-called “best” decisions, and then life really does become “what happens while you’re busy making other plans”—in this case, about your future “best” possible choices.
Of course, this notion of ideal choices is simply another variant of the Perfect Life form of magical thinking, and yet what makes it particularly challenging to avoid is the reality that every choice does actually have a consequence. However, there are consequences and there are CONSEQUENCES, and unless you wish to waste a great deal of that finite life of yours “making plans” instead of living at least a modestly meaningful existence, then learning how to tell the difference is crucial. Because one of the great fallacies (and failures) of our culture is the fact that so many of us seem to believe that we must somehow justify the simple fact that we are alive. Too often, the message we hear is that we have “failed our potential” if we have not fundamentally transformed the world in one fashion or another. Well, reality check: you did not choose to be born; you simply are. In addition, the gut-punch truth is that everyone’s final destination is the same; so the time that you are here is a gift you didn’t—and in fact couldn’t—earn and one that has no claim on you whatsoever.
Which doesn’t mean, as the golfer Gary Woodland suggests, that you don’t play. Yes, from a certain perspective, your entire existence consists only of the consumption of oxygen, the production of carbon dioxide, and the transformation of various organic compounds; you are essentially nothing more than a chemical machine that runs, on average, for 80 years before breaking down and getting recycled into yet another chemical machine. However, from the more nuanced perspective acknowledging both human cognition and agency, you also have the power to have a significant impact on the qualitative experience of all that chemical machinery—both your own and what surrounds you—and you have that power for the better or for the worse. You can, to paraphrase Milton, “make of life a heaven or a hell,” and therefore how you use your finite time does matter; it just doesn’t need to have cosmic importance.
Not that you cannot (nor should not) aspire to have a lasting impact. The New York Times columnist, David Brooks, is correct when he writes that “every society on earth has a leadership class of one sort or another [who need] sensible views about authority so that they don’t childishly rule imperiously from above—[individuals who] embrace the obligations that fall on them as leaders, to serve the country and not their own kind.” Moreover, he is equally correct that if we want a society where everybody flourishes, we are going to need such leadership on steroids to establish better future institutions of governance (assuming we manage to survive the current imbecilic sociopath residing in the White House). Because only when we have leaders who listen to all their fellow citizens, anticipate everyone’s needs, and guide the social change to meet them will we finally find ourselves living in a truly just and equitable society. Maybe some of you are up to the challenge.
I know, that’s a big ask. Right up there with fixing climate change and all the other damage that my fellow elders and I are leaving you to try to repair. What’s more, anyone who has ever constructed anything—a Lego model, a theater set, a curriculum, even a meal—knows firsthand how much harder it is to build than to tear down. But that’s why—again!—it is SO important not to engage in magical thinking. When there is so much that needs fixing (and some of the repairs are truly global!), you can easily find yourself at times feeling cognitively overwhelmed and fatigued to the point of paralysis. This is especially true in today’s 24/7 digital world—for which modern psychology even has a term. It’s called “compassion fatigue,” and it can make taking any sort of action seem pointless.
However, as author and journalist Oliver Burkeman points out, the solution to compassion fatigue is both ridiculously simple and yet incredibly challenging (for fear of the judgment of others): embrace your finitude and pick your battles; choose which change you will seek to be and let the rest go, trusting that others will choose different battles than yours. Indeed, one could argue that “in [our] age of attention scarcity, the greatest act of good citizenship may be learning to withdraw your attention from everything except the battles you’ve chosen to fight” (p. 36) and then giving those battles what you can.
And before you think giving what you can cannot possibly be enough to have an actual impact, I will share a small piece of my own journey. Most reading this will know that I commute to work by walking and have done so now for nearly 30 years. Well, there is an exercise I have one of my senior classes do where they calculate the amount of carbon dioxide released into the air from burning fossil fuels, and on a whim, a little over a year ago, I did the calculations with them on how much CO2 my decision to walk rather than drive has kept out of the atmosphere. Turns out the answer is a little over 30 metric tons, which is the equivalent of 9 football fields’ worth of forest. Or, to make that a visual many people reading this can understand, it is the equivalent of growing or preserving a forest occupying the entire campus of Friends School of Baltimore. Thus, never doubt your individual power to effect positive change. Even the smallest of decisions, enacted consistently, can have profound impacts.
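For the curious, the back-of-the-envelope method behind such a calculation can be sketched in a few lines. Every input below (commute length, commuting days, and the per-mile emissions factor, which is roughly the EPA’s published average for a passenger car) is an illustrative assumption rather than my actual data, so the result will not match the 30-ton figure above:

```python
# Back-of-the-envelope CO2 savings from walking instead of driving.
# All inputs are illustrative assumptions, not the essay's actual data.
MILES_PER_DAY = 5.0       # assumed round-trip commute distance
DAYS_PER_YEAR = 200       # assumed commuting days per year
YEARS = 30                # roughly the span mentioned in the essay
KG_CO2_PER_MILE = 0.404   # ~EPA average tailpipe emissions per mile

miles_not_driven = MILES_PER_DAY * DAYS_PER_YEAR * YEARS
tonnes_avoided = miles_not_driven * KG_CO2_PER_MILE / 1000  # kg -> metric tons
print(f"{tonnes_avoided:.1f} metric tons of CO2 avoided")
```

Swap in your own distances and schedule to personalize the estimate; the point is that the arithmetic is simple enough for any senior class to do.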
But that brings me to a point I try to make each year, and that is to be graceful with one another. Each of you will make mistakes along your journey, mistakes that will impact others, including people you love. You will bruise and be bruised because sin is real. Yet, you have the power for compassion—to forgive yourself as well as others—and with it, you therefore have the power to restore wholeness in a broken world—the employment of which is the ultimate form of hope.
So let me leave you here with a small bit of wisdom I have passed on before, an idea in Zen Buddhism known as “Mu.” “Mu” is the understanding that sometimes when we find ourselves with an intractable problem, that perhaps we are not asking the right question(s). Thus, a Zen master will regularly tell a struggling disciple, “Mu”—you need a different perspective. Therefore, I share this concept of “Mu” with you because as you make your finite choices about your finite life, deciding which consequences to pay and which limited battles to fight, you will regularly find yourself very humanly second-guessing yourself. And in those moments, my permanent advice to you will always be, “Mu.”
Equal Rights for Others Does Not Mean Fewer Rights for You. It’s Not Pie. —Popular Bumper Sticker
The entropy of any closed system increases over time with each energy transformation within that system. —The Second Law of Thermodynamics
What I’m about to say is not likely to be news to anyone who isn’t actively living under a rock: simply staying alive has become more expensive. The Waffle House franchise now has a surcharge on its egg dishes (understandable given that the price of eggs has risen 15.2% in just the past four weeks and a whopping 53% since this same time last year). A middle-aged couple in Baltimore must share a row house with five other people merely to meet the $1,500-a-month rent—that is, until they recently received notification of the non-renewal of their lease; they are now facing homelessness. The state of Maryland has a $3 billion budget gap it must close by the end of this legislative session, and with more than 50% of households in this country already “cost burdened” (meaning that they must spend more than 30% of their income on housing), Elon Musk and Donald Trump have decided to create additional economic insecurity for tens of thousands of federal employees simply to “save” what is less than 1% of the overall federal budget.
Hmm. That rock is starting to look awfully inviting.
Which is why, as I surveyed all the news during the first month of the second Trump presidency, I realized that it might be time to revisit some themes I first explored in what was only my second posting, back at the start of the pandemic. Titled “Maybe It’s Pie After All…,” it examined some scientific realities about the natural world that are worth bringing to folks’ collective attention again because, while this information might not immediately help in the current situation, it can provide what Diana Butler Bass calls “a framework for understanding that helps make sense of where we’ve been” (something she does a marvelous job of for the current situation from a historical perspective). Therefore, let’s turn to what I sometimes refer to in my environmental science units as “the law of homeostasis.”
In the original essay, I introduced readers to the field of population dynamics and the reality that no environment has limitless resources, that even the earth is a finite system, and that therefore there are always only finite ways to distribute those resources as well. The example I gave was how:
in a room of 3 people and 9 balls, the distribution might range from a single person having all 9 while the others have none to each person getting 3. But the number of ways to divide the balls up between them is finite, and the same is true for the resources in any given ecosystem.
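The finiteness claim is easy to verify computationally. A short Python sketch (my illustration, not part of the original essay) enumerates every way to split 9 identical balls among 3 people and checks the count against the closed-form “stars and bars” formula:

```python
from itertools import product
from math import comb

# Count every way to split 9 identical balls among 3 people by brute force,
# then check against the "stars and bars" formula C(n + k - 1, k - 1).
n_balls, n_people = 9, 3
brute_force = sum(
    1
    for split in product(range(n_balls + 1), repeat=n_people)
    if sum(split) == n_balls
)
closed_form = comb(n_balls + n_people - 1, n_people - 1)
print(brute_force, closed_form)  # 55 55: a strictly finite set of distributions
```

Fifty-five ways, no more: however the room’s “resources” get shuffled, the possibilities form a fixed, finite set.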
I then explained that the consequence of this for a population of organisms is that the size of that population must always fluctuate around a set maximum value: while some members of the total population might overuse resources to reproduce, their overuse of those same resources deprives other members of the ability to do likewise, resulting in those members’ deaths. Hence, while some members of a population are always adding to it, others are always subtracting from it because there is only a maximum population size a given ecosystem can support.
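That idea of a population forced to level off at a ceiling is exactly what the standard logistic-growth model captures. Here is a minimal sketch; the growth rate and carrying capacity are arbitrary illustrative values, not data from any real ecosystem:

```python
# Minimal logistic-growth model: growth slows as the population nears the
# ecosystem's carrying capacity K (all parameter values are illustrative).
def logistic_step(n, r=0.5, k=1000.0):
    """Advance the population one generation toward carrying capacity k."""
    return n + r * n * (1 - n / k)

population = 10.0
for _ in range(50):
    population = logistic_step(population)
print(round(population))  # levels off at the carrying capacity, 1000
```

With a larger growth rate (try r = 2.2), the same equation overshoots and then oscillates around K, which is the fluctuation around a set maximum described above.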
What I did not talk about at that time, though, is that this same concept of a set maximum applies to the resources themselves in any given ecosystem as well. The second law of thermodynamics ensures that in a closed system, any order or level of energy in that system can never increase beyond a set value, which means—to use my earlier example—in a room of 3 people and 9 balls, there can never be more than 9 balls. Moreover, with time, those 9 balls are guaranteed to end up randomly distributed among the 3 people, since that is the maximum level of order the room can maintain without an input of outside energy.
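Why does randomness single out the roughly even split? Counting microstates makes it concrete: if we instead treat the 9 balls as distinguishable, each of the 3^9 = 19,683 possible assignments is equally likely, and far more of them produce an even share than a lopsided one. The sketch below (my illustration, not from the original essay) does the tally:

```python
from collections import Counter
from itertools import product

# Assign each of 9 distinguishable balls to one of 3 people, and tally how
# many of the 3**9 equally likely assignments yield each share pattern.
pattern_counts = Counter(
    tuple(sorted(Counter(assignment).get(person, 0) for person in range(3)))
    for assignment in product(range(3), repeat=9)
)
# The even split vastly outnumbers the "one person takes all" outcome.
print(pattern_counts[(3, 3, 3)], pattern_counts[(0, 0, 9)])  # 1680 3
```

In other words, the even distribution is 560 times more probable than one person holding everything, which is why, absent outside energy, the room drifts toward it.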
But what if that room could somehow get that input so that one person could again snatch up all the balls (i.e. add order)?
Ah! That’s where biology’s “law of homeostasis” comes in. An accepted working definition of “life” in science is any system capable of transforming energy to resist entropy. Or in other words, any closed system capable of taking in energy from the outside to seek to maintain its order. It’s why we as animals eat and why plants photosynthesize (the sun being the ultimate source of energy outside our collective biological systems). However, it is also why all life ages: we are resisting entropy, never stopping it, and that is why all life at whatever level of complexity one wants to describe it—from cells to biomes—is constantly fluctuating around a set point of maximum energy and order.
A reality that is as unchangeable, absolute, and tyrannical as physics’ law of gravity and chemistry’s law of the periodicity of matter: the law of homeostasis.
What, though, does any of this have to do with the price of eggs? Or housing? Or state budgets? The short answer is that it debunks the very foundations of the economic capitalism on which those things currently depend; the long answer is that that claim will take some unpacking.
Let’s start, then, with one of capitalism’s central premises: the continual growth of production. Capitalist economies are built on the concept of always growing one’s production of goods and services. We even measure a country’s worth by its Gross Domestic Product and how much that GDP increases from one year to the next. Yet, in a finite closed system such as the planet Earth, perpetual growth is no more possible than a perpetual motion machine—and for the same basic reason, that pesky second law of thermodynamics! It is why, ever since capitalism became the dominant economic system on our planet, we have had regular economic recessions and depressions, crashing things back to the fluctuation point of available resources at that particular moment in time.
However, a strong counterargument has always been made that while these periodic crashes do occur, the economic periods following them show an increase in production that has steadily grown the world’s collective wealth and quality of life over the past two centuries—the foundation of the worn cliché that a rising tide lifts all boats. I say “worn” because, as discussed in my earlier essay, the mathematicians who study capitalist, free-market economies have discovered the exact opposite, and now we are in a better position to understand why.
Since our planet—while genuinely finite—is SO enormous, capitalism as practiced around the world is able to create the illusion of perpetual growth in small subsets of our species by denuding whole sections of the planet where those same small subsets do not live. As marvelously presented in The Story of Stuff (which if you have never watched, you should!), our productive wealth in the industrial world completely depends on turning huge swaths of our planet into ecological dead zones and toxic deserts. And because those wastelands are almost never directly in front of our attention, this disconnect effectively makes it seem like there is no homeostatic fluctuation point when in reality, we must deficit spend the world’s resources to achieve this self-deception.
Which is why now, when we have deficit spent for so long, some of the proverbial chickens are starting to come home to roost—or, in the case of actual chickens, not roosting at all; hence, today’s price of eggs! It is why people can’t afford housing (the supply is too small to meet the need) and why state governments are having to make cuts in programs (finite resources can only meet finite budgetary responsibilities). Even the shell game that Trump and Musk are now playing with their massive layoffs in the federal workforce (before realizing that they might need people to track the avian flu outbreak; curse those egg prices!) is being done to try to convince the general public that the federal government is now somehow saving all this money—that all these “savings” from furloughed federal employees will somehow counteract the deficit spending from the earlier Trump tax cuts that he now wants Congress to make permanent.
The simple truth is that finite resources mean finite choices, and all the dismissal of truth in the world cannot make this or any other of reality’s inconvenient truths go away. Furthermore, while a more equitable distribution of this finitude could currently enable 100% of the humans presently on this planet to live lives that meet more than just Maslow’s foundational needs, that still doesn’t make it any less finite. 6% of the world’s population simply cannot consume 38% of the world’s resources indefinitely, nor can that human population continue its current rate of growth for the same reason. Like it or not, it is “pie.”
Of course, as just suggested, that does not mean that the “pie” can’t be more equitably distributed or that decisions about how we allocate our finite resources can’t be more just. That’s what makes the budget shortfall here in Maryland, for example, so unnerving: our so-called progressive Governor wants to balance the books in ways that will negatively impact people with disabilities, short-change our 988 mental health services, and defund portions of our state universities—along with underfunding the massive public education reforms known as the Blueprint for Maryland’s Future that only just got underway this current school year. Worse, the proposed decreases in funding for this Blueprint for next year impact and harm our most socio-economically vulnerable populations of children more than any other group, meaning that those who were about to finally get their fair share of the “pie” are suddenly facing having it taken back.
Again, it’s about choices, and it is about finite choices. Perhaps most important of all, though, it’s about the values that inform those finite choices. As I quoted Oliver Burkeman in an earlier essay, “every decision to use a portion of time on anything represents the sacrifice of all the other ways in which you could have spent time, but didn’t—and to willingly make that sacrifice is to take a stand, without reservation, on what matters most to you” (p. 33). Simply put, each of us must decide how we will resist the entropy, knowing full well that the finality of that entropy is itself inevitable.
But even more significantly, each of us must make this choice knowing that how we choose to resist directly impacts how every other living thing resists as well, and right now, I would argue that too many of us are not making very good choices—which (as I remarked last time), if the morality of the situation doesn’t convince, then perhaps pragmatism will: the ghosts of Louis XVI, Marie Antoinette, and Czar Nicholas can all too readily inform what really happens when the “have nots” get desperate enough. Both the French and Russian Revolutions started out as riots over the cost of bread…eggs, anyone?
References
Boghosian, B. (2019). The Inescapable Casino. Scientific American, November 2019, pp. 70–77.
Every era casts illness in its own image. —Siddhartha Mukherjee, The Emperor of All Maladies
During his 1992 presidential campaign, then-candidate Bill Clinton is alleged to have claimed, “it’s the economy, stupid,” when addressing the perceived economic failures of the Bush, Sr. administration. He did not, in fact, actually say it (it was a campaign talking point of his advisor, James Carville), but that has not stopped the phrase from entering our cultural lexicon and becoming a meme used ever since by pundits and politicians alike to explain the voting patterns of the American people. It has even been suggested as the primary reason Trump won re-election: because of how so-called “average” or “ordinary” citizens were feeling about their pocketbooks.
The reason, though, that this phrase has lately re-entered my working memory is the recent release of the results of the 2024 NAEP assessment, popularly known as “The Nation’s Report Card.” For those not familiar with the NAEP, it is the one standardized test administered to a representative sample of the nation’s 4th and 8th graders since 1969 to benchmark how successfully we are teaching our children to read and to do math. It is our one and only truly longitudinal look at how well America’s schools have succeeded at educating our children, and the 2024 report is pretty grim. While math scores have shown some recovery from the pandemic loss, they are still lower than before the pandemic (part of a long-term decline puzzling many educators), and children’s reading scores simply continued the steady decline they have been in since 2013.
Hmm. 2013. Know what got released in late fall of 2011 and gained rapid popularity during 2012? Snapchat. Then came Vine in 2013, followed by TikTok in 2017. In addition, during this time, the average age at which a child receives their first smartphone dropped steadily to 11.6 years old, with children as young as 4 now having one.
Notice a pattern here? Like the pattern in these graphs for both the math and reading scores before and after 2013?
Or notice a pattern in the change in rates of teenage depression in the past decade (especially among 13 year-old girls)?
Now, I am too much the scientist not to understand that correlation does not automatically mean causation. Spurious associations are so common and readily found that there are entire websites devoted to them (one of my favorites links the amount of GMO corn grown in Minnesota to the frequency of global piracy in a given year). However, I still remember intimately the shocked dismay I felt in the fall of 2013 when the average score on an assignment I had given to my most advanced students for more than a decade abruptly dropped from the steady “C” it had been in years prior to the nearly universal “F” it was that September. I, of course, made the necessary adjustments and interventions and have continued to do so with all my students ever since. But the number and depth of those adaptations have steadily increased every single year to date, and I am not anticipating this demand letting up any time soon.
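A spurious association of this kind is trivial to manufacture: any two quantities that merely share a trend over the same years will correlate strongly. The sketch below uses entirely synthetic data (it is not my test-score data, nor the corn-and-piracy figures) to show a Pearson coefficient near 1 between two causally unrelated series:

```python
# Two synthetic series that share only a downward trend across 12 "years."
n = 12
series_a = [100.0 - 2.0 * i for i in range(n)]                        # steady decline
series_b = [80.0 - 1.5 * i + (1 if i % 2 else -1) for i in range(n)]  # noisy decline

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    m = len(xs)
    mean_x, mean_y = sum(xs) / m, sum(ys) / m
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Strongly correlated despite having no causal connection whatsoever.
print(round(pearson(series_a, series_b), 2))
```

Detrend either series and the correlation largely vanishes, which is one standard way analysts screen for exactly this trap.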
Again, hmm. “If it looks like a duck, walks like a duck, and quacks like a duck….” “Where there’s smoke, there’s….” “It’s the economy….” Clichés (and their modern equivalent, the meme) exist for a reason, and those about the link between correlation and causation do so in part to remind us that sometimes we do not have the luxury of untangling the full extent of the causality in a given situation. We need to act like it is a duck; like it is fire; like it is the economy. Or in this case, like it is Snapchat and its kin, because the alternative risks the kind of long-term harm we are seeing in those graphs above. Better to remove social media’s influence from our children’s lives on the likelihood that it could be disruptive to their proper mental and physical development than to wait to fully confirm (as the mounting research of Sherry Turkle, Jonathan Haidt, and others is doing) that it is.
Because if we want to witness a microcosm of a world in which daily social media use has risen to an average of 95 minutes per person and more than 54% of people get their primary news from it, we need look no further than the past two weeks. As the Trump administration has deliberately sown chaos through a metaphorical firehose of executive actions, the consequent eruption of misinformation, disinformation, and conspiracy theories on social media among immigrants, federal employees, and foreign aid workers has all but paralyzed whole segments of our society and even our economy. We are in a societal freefall at present, and the only “parachute” is going to be calm, persistent, rational, and critical thought that separates what is truly happening from the fiction and lies so that people can persevere in their resistance to tyranny.
And remember: there is nothing more useful to a budding autocrat than an illiterate and innumerate population. Hence, we had better take the necessary actions to improve our nation’s math and reading scores, and do it soon, because the alternative has already arrived.
Coda
And speaking of that arrival, I got to experience an element of it firsthand while preparing this latest essay. As my regular readers are aware, I work very hard to provide supporting references for any statistical or factual claim I make in my writing and to cite properly all thoughts I cannot claim as uniquely my own. However, a major source of some of that information is the federal government’s CDC and other scientific databases—all of which, as you can see from the screenshot below, are now under attack from the new administration (note the fine print at the top about executive orders).
Moving forward, I will continue to do my best to provide full references for anything I write. But since I often link to previous postings where the original sources of some citations have effectively disappeared, I ask my readers’ trust when visiting any of my earlier work: if I claimed it or quoted it, I promise the now-gone website did affirm it.
One of the advantages of this time of year for me as an educator is the lull in workload as the academic calendar turns from first semester to second. Exams are done; final grades are calculated; coursework is caught up; and for a brief window of time, there is nothing needing any kind of assessing (i.e., I’m done grading for a while!). It means I can catch up on news and research in the world of education that are not immediately critical to my specific everyday needs and reflect on what insights this information might hold for the larger mission of teaching and learning.
Two such items caught my attention this time around, involving math and AI. The first, a report issued this past September, chronicles the severity of the academic decline in the math skills of our youngest learners. The latest research indicates that the children who were in pre-K or kindergarten during the most severe restrictions of the pandemic not only lost a critical learning window when it comes to math; they are, in fact, not catching up to pre-pandemic levels the way their older elementary-age peers are. Worse, many of them are actually falling further behind, and what makes this fact so problematic is that there is a limited window during brain development for mastering such skills effectively. Hence, the long-term impact of a failure to do so can have devastating economic consequences—for both the individual and our society—and that means that the “math gap” a portion of an entire generation is facing is not inconsequential.
Moreover, that may be even more true for those of us entering the later stages of our lives. As Jenny Anderson and Rebecca Winthrop report, “in a survey by Gallup and the Walton Family Foundation of more than 4,000 members of Gen Z, 49 percent of respondents said they did not feel prepared for the future. [In addition,] employers complain that young hires lack initiative, communication skills, problem-solving abilities and resilience.” Hence, we already have individuals entering the workforce who self-identify as ill-prepared; just imagine what today’s second graders are going to be like as the long-term caretakers of late Boomers (such as myself) and every single Gen Xer and early Millennial! It is difficult not to shudder.
Nor is AI going to be the answer to this “math gap” problem. The other report to catch my attention came from researchers at the University of Pennsylvania who studied the impact of using AI as an aid to learning math. Nearly 1,000 high school students were divided into three groups: a third had full access to ChatGPT while completing practice problems; a third had limited access to a tutorial version that would give hints but not divulge any answers; and a third did their work the traditional way. The results were very clear: while the first group solved 48% more practice problems correctly and the second group solved an incredible 127% more, the first group earned 17% lower grades than the control on the final test, and the tutorial group scored the same as the control. In other words, good old-fashioned “grind it out” for the win.
Of course, when analyzing the data more deeply, the researchers found that part of what they were observing was flaws in the bot itself. Its computations were sometimes wrong (8% of the time), and “its step-by-step approach for how to solve a problem was wrong 42% of the time.” However:
the researchers believe the [biggest] problem is that students are using the chatbot as a “crutch.” When they analyzed the questions that students typed into ChatGPT, students often simply asked for the answer. Students were not building the skills that come from solving the problems themselves.
Again, score one for basic grit: something we are going to need to help our current 2nd graders learn if they are to bridge their “math gap” successfully.
What is more, this general capacity for doggedness is something we are all going to need to reacquire if we are going to meet the massive challenges facing our world today. While I was ruminating on these math and AI stories from September, the more recent world was also impinging on my awareness, and as often happens in those circumstances, a kind of Gestalt emerged with an insight I had glimpsed before but never fully fleshed out. I was listening to Brittany Luse interview former Missouri congresswoman Cori Bush on the NPR show “It’s Been a Minute.” Ms. Luse kept trying to get Ms. Bush to address how a progressive political agenda could survive in the face of the recent election, to which Ms. Bush kept responding that change takes time, an answer Ms. Luse just did not seem to want to hear. As I listened to this repetitious back and forth, the proverbial “light bulb” went on: change does take time, but that is an answer nearly no one in today’s world can psychologically hear anymore.
It was like a syllogistic moment out of one of those scenes in The Queen’s Gambit where the main character manipulates the chess pieces in her mind while staring at the ceiling. Premise 1: Digital technologies have all but destroyed any capacity for delayed gratification in enormous swathes of the human population; the creation of AI has simply been the pinnacle of these efforts, offering instant essays, instant math solutions…instant chimeras of any manner of complex thought. Yet (premise 2) ALL real, authentic, lasting change is NEVER instantaneous, and so (conclusion) we find ourselves today living in a society in desperate need of change with almost no capacity for the patience to achieve it. Instead, when the needed change doesn’t happen right away, too many of us now either give up and disengage in fatalistic disgust or succumb to the Siren’s song of fallible simplistic-ness (if I may coin a word).
And the outcome of the recent election is a classic example. A six-year study of the prices of 96 items at a Walmart in Georgia revealed that the overall price increase from 2023 to 2024 was a mere 0.7%, yet that same study points out that the increase from 2019 to 2024 was 25%. Grant, I will suggest, that five years to an adult memory is probably the equivalent of the 15 minutes in the famed Stanford marshmallow experiment, and you had most of society declare in early November that they wanted their “one marshmallow” now! and be damned the “two marshmallows” they could have received from patience with a (documented!) growing economy. Hence, the guy touting instant access to the single “marshmallow” won because it was the simplistic solution to a perceived “immediate” need.
Moreover, the fact that the solution offered was a materialistic one was key to the public’s response. In a society obsessed with stuff, entire populations of this country were prepared to ignore all the other bellicose threats Trump promoted, no matter how potentially detrimental to their own immediate lives and communities. Such is the power of this collective obsession that today we are willingly standing by as a nation while corporate leaders such as Mark Zuckerberg openly prepare to sacrifice truth itself, abandoning fact-checking to maintain their profits against any legal onslaughts from the incoming administration (and God help you if you have the temerity to try to call out this greed in a nationwide publication!).
So where does all this leave me as an educator? First, I’ve got children who can’t do basic math. Second, I’ve got AI that can’t solve the problem and actually threatens to exacerbate it. Third, I’ve got a society too incapable of delayed gratification to deal with either of the first two problems (let alone enormous ones such as climate change and environmental degradation), and I’ve got a simpleton being sworn into the Oval Office to lead it. Kind of a grim outlook for a grim winter.
However, one other story came to my attention during this downtime that reminded me there is a solution to these problems (or any other): patient, steady, determined resolve. Granted, the story itself is really rather trivial, namely that my alma mater officially rebranded itself as “WashU.” But you need the insider view of the story that underlies this story to know why it uplifted my spirits, so please bear with me as I fill in some of the behind-the-scenes.
It starts in 1982 in the public relations department of a university recognized regionally for its excellence, which has recently hired a new director who has made a small name for himself raising the profiles of some other midwestern schools. Washington University in St. Louis wants to stop being known as “the Harvard of the Midwest” and start being mentioned in the same breath as Harvard instead. It wants to be “Washington.” The only problem is that at least 20 other institutions in this country have “Washington” in their names, and all the locals and students know this school by its folksy title, “WashU.”
Enter the new director, Fred Volkmann, who has as one of his employees a sophomore work-study student hired to run the mimeograph machine and mail out press releases to the local and regional media outlets. Fred recognizes that there is authentic marketing power in the folksy “WashU,” and he has a plan, one whose broad strokes he generously shares with his young work-study student (helping with the grunt work of the first rebranding campaign), who impudently wonders aloud why the school can’t just switch to “WashU” immediately. Said student is given a quick but firm lesson in the intricacies of PR, and he goes back to mailing press releases.
By now, of course, any reader has filled in the blanks, and as an alum (and that former employee), I have watched Fred’s plan unfold from afar for over 40 years. I have watched my alma mater achieve the national recognition it aspired to all those years ago. I have watched its brand change from “Washington University in St. Louis” to “Washington University” to “Washington”—all stages in Fred’s original plan. But when he retired about eight years ago, there was as yet no “WashU,” and I wondered if “Washington” might be the end of things. That is until this past fall, when I learned through my alumni magazine that Fred’s grand dream from the early 1980s had finally come to fruition and that, henceforth, the official branding of my alma mater would be and is “WashU.”
Like I said, a rather trivial story—especially in a world where Palestinians are enduring threatened genocide and Los Angeles, California is basically burning to the ground. Yet I think it is also a story full of potency and import because it is the story of the fundamental power of patient, steady, determined resolve to change the world. Like Fred, all any of us can do is plant seeds and quietly tend them, keeping faith that the crop will eventually bear fruit, and like Fred, there is a lot of anonymity to that task (most people reading this will never have heard of him). Therefore, when I think back to my earlier question in this essay, “where does all this leave me as an educator?” it leaves me as it always does (and always will!): planting the seeds of knowledge, critical thinking, and wisdom in my students, doing so one day, one lesson, one moment at a time—something no AI or material “stuff” is ever going to be able to do.
It is not an easy task. Nor is the patient, steady, determined resolve needed to accomplish it a comforting reality. But it is the task at hand, and as I have oft quoted Luther, those of us committed to this profession “kann nicht anders.” Our world and its future are literally depending on it.
Barshay, J. (2024, September 2). Kids who use ChatGPT as a study assistant do worse on tests: Researchers compare math progress of almost 1,000 high school students. The Hechinger Report. https://hechingerreport.org/kids-chatgpt-worse-on-tests/
By oneself is wrong done, By oneself is one defiled. By oneself wrong is not done, By oneself, surely, is one cleansed. One cannot purify another; Purity and impurity are in oneself. —The Dhammapada
On July 15, 1979, then President Jimmy Carter gave a televised address to the nation that history would come to call the “Crisis of Confidence” speech. In it, President Carter laid out the case that our society was suffering from a malaise of self-indulgence where “too many of us now worship consumption” and “human identity is no longer defined by what one does, but by what one owns.” He argued that as a country, we had adopted the mistaken understanding of freedom as “the right to grasp for ourselves some advantage over others,” and he astutely observed that “that path would be one of constant conflict between narrow interests ending in chaos and immobility. It is a certain route to failure.”
Well, here we are, nearly 50 years later, and as Ron Lieber of the New York Times recently pointed out, we are well on our way to that failure. He is worth quoting extensively here:
Consider how our children feel after we’re mostly done raising and educating them. The Cooperative Institutional Research Program at the University of California, Los Angeles, surveys first-year college students every year. The percentage who named being “very well off financially” as an important goal doubled from 1967 to 2019. Those who wanted to develop a “meaningful philosophy of life” decreased by nearly half…
Research by Tim Kasser and Jean Twenge showed that materialism among 12th graders increased over time, peaking in the late 1980s and early 1990s with Generation X, and then stayed at those historically high levels among millennials. “There was a trend underway at the time Carter was making this speech, and it basically just amplifies in the next 10 years rather than being suppressed,” said Mr. Kasser, an emeritus professor of psychology at Knox College, [who] watched these developments with a sense of foreboding, because his research has shown that higher levels of materialism are associated with societal instability…
And finally:
We will be tested again. Next time it may be a climate-related catastrophe, driven in part by the very patterns of consumption that Mr. Carter warned against in his speech. He called for turning down the thermostat in the winter and for 20 percent of the nation’s energy to come from solar power by 2000 — all these years later, we’ve done neither.
Which turns out to be truer than even I would have thought, as I recently learned from a story in the local press of a couple paying nearly $900 for their heating bill this past month. This for a row house in the urban heat island that is the city, even in winter. This for a place less exposed than my own three outside walls (I live in a duplex) and with fewer square feet. $900. What temperature, I thought, do you keep your house at??? For perspective, my largest heating bill ever was a little over $200.
However, putting my (self-righteous?) indignation aside, as we prepare to eulogize and bury President Carter this month, what strikes me most about his words all those years ago and the world we’ve created since is that the “village” has clearly been falling down on the job of “raising its children.” I may agree with the words attributed to the Buddha at the start of this essay that each of us is solely accountable for our individual moral character. Yet as I read these same words again, they remind me, too, that our moral nature is also a social construct. There truly is no such thing as a “oneself” in utter and absolute isolation; it takes indeed a “village” to make a self. What is more, it takes that same “village” to hold that same self individually morally accountable, and the paradox of this great truth is what our culture stumbles so badly over.
Take my discipline, for example. Everyone is rightly concerned about the declines in language and math skills seen since (and attributable to) the pandemic. But the interventions have focused almost exclusively on tutoring and other individualist efforts when the larger cause—absenteeism—has received proportionally little attention. “Chronic absenteeism [however] is not just bad for kids; it is bad for society. Learning is first and foremost a social endeavor, and kids learn to be part of a cohesive community by going to one every day” (Anderson & Winthrop). In other words, unless one is an integral part of the “village,” neither “village” nor “child” can thrive.
Which is the power of ex-President Carter’s example to us following his loss to Ronald Reagan in 1980. He chose to remain part of the “village” to the day of his death, holding both himself and others accountable for their choices, their actions, and the impact of these on the larger world. With his hands, heart, and mind, he built literal villages as well as metaphorical ones, and those in turn helped raise tens of thousands out of poverty and into more participatory lives in their communities. He fundamentally embraced the paradox of the moral relationship between “village” and “child,” and the lives he touched both directly and indirectly remain the better for it. His was very much a life worthy of modeling.
Would that the same could be said of all the political leaders in our lives.
There was a beginning to it. There are lots of societies that don’t have it. It takes very special conditions to support it. Those social conditions are now getting harder to find. Of course, it could end. —Thomas Kuhn
They who are aware do not die; They who are unaware are as dead. —The Dhammapada
If you are a member of Jonathan Rauch’s “reality-based community,” this past month has been a rough one. First, the re-election of the premier anti-intellectual in this country to the office of President of the United States (most depressing of all, by an actual majority of voters this time around). Second, said President-elect’s announcements of his nominees for his Cabinet—including an anti-vaxxer for the Department of Health and Human Services! Third, the CEO of ExxonMobil all but imploring our President-elect to keep the U.S. in the Paris climate accords—this from a company that currently depends for nearly 100% of its profits on climate change’s very cause. And fourth, but so subtle that I suspect it flew beneath every radar except NPR’s, the threat a second Trump presidency poses to the H-1B visa program.
“The H-1B what?” a reader might ask. Why on earth should a threat to H-1B visas generate despondency in the reality-based community? Simple answer: because the loss of this specific visa program would actually endanger the reality-based community in this country. H-1B visas are how universities, corporations, and engineering firms hire the highly skilled workers (think PhDs) they need to fill the research positions that keep them economically viable and competitive. “Foreign-born workers account for about half of the doctoral-level scientists and engineers working in the U.S.,” reports NPR, and the reason for this fact is simple: there are not enough American-born individuals entering the educational pipeline for these kinds of degrees and scientific fields.
Which means our society’s anti-intellectual streak risks undermining not only our health and physical well-being; it risks damaging the very source of our economic power and standing in the world. If a pissed-off electorate that voted for Donald Trump thinks the price of eggs and rent are too high now, I can only imagine their reaction when major companies close because they no longer have the intellectual capital to compete in the world’s marketplace. As Raymundo Báez-Mendoza of the Leibniz Institute for Primate Research in Göttingen, Germany points out, “a lot of countries in Europe benefited from Brexit, in the sense of capturing really amazing scientists that were working in Britain [because in the world of science] top talent is very mobile.”
Of course, it should not come as that much of a surprise that our country cannot adequately supply its own need for highly skilled workers. Not when we idolize celebrity over the painstaking work of solving an equation. Not when we would rather doomscroll on our phones than read a book that might challenge an assumption. And perhaps most telling of all, not when the brain science clearly shows that the first five years of development are the most critical for wiring a brain that can produce such a worker, and yet we pay those responsible for teaching this age SO poorly that 12.3% of them live below the poverty line here in Maryland, a state with the second-highest household income in the country, and one where more than a third of households with an early childhood teacher in them must use at least one (and frequently more!) of the social safety net programs such as Medicaid and SNAP. It is cliché that you get what you pay for, and we as a society simply do not pay to produce the kinds of brains needed to produce highly skilled workers.
Therefore, here I sit, a trained biologist, thinking: the CDC is reporting that only around a third of all adults in the U.S. have taken this year’s flu shot and less than 18% have received the latest COVID booster; the childhood disease measles—one of the most deadly, and declared eliminated here in the U.S. more than two decades ago—has already had 16 outbreaks so far this year; and human life expectancy—at least in this country—has actually declined for the first time in centuries. All because, as Dr. Gregory Poland of the Atria Academy of Science and Medicine puts it, “as a society right now, we’re in a phase of rejecting expertise, of mistrust of any expert, whether it’s science, meteorology, medicine, government – whatever it is.”
And that causes me to contemplate what I once thought impossible: that Thomas Kuhn may have been right when he suggested that science as a method of studying and understanding the world could actually disappear—perhaps forever.
Science, of course, is the “it” in my epigraph from Kuhn at the start of this essay, and the famed historian of science is reported to have said these words in an interview with Scientific American towards the end of his life, in the winter of 1991. As the author of one of the single most influential books of the 20th century, he comes across in the interview as weary of what he perceives as all the misunderstandings people have had about his ideas, and when pushed, he basically states flatly that science as an intellectual endeavor is just as much a social construct as any other such endeavor and, therefore, like any social construct, it can die.
Now I have recognized for some time that any shared sense of truth in this country was—at best—on life support. The firehose of dis- and misinformation that modern digital technologies have made possible has all but ensured truth’s demise. But the idea that the one remaining arbiter of truth could be in trouble, that the one arrow left in our collective epistemological quiver could disappear…naively, that thought had never occurred to me before encountering Kuhn’s words amidst the events of the past month. Suddenly, I had gained a small, existential insight into the voices of the many African American women interviewed following the election: “Damn! Please don’t tell me I have to keep fighting yet again a battle that I should not keep losing.”
But for those of us in the sciences, fight we must. We must become the resistance to every effort of the incoming administration to dismantle the scientific infrastructure in this country. Furthermore, we must do so anywhere and everywhere we can. In labs and research centers. In classrooms and homes. In legislatures and city halls. In movies and museums. Even in the kitchen!1 Put bluntly, all of us in the “reality-based community” must join like-minded individuals such as Hank Green of the Vlogbrothers and SciShow and do everything in our power “to make the truth go viral.”2 It won’t be an easy fight, and I openly confess that I, too, am growing weary of the constant need to battle ignorance and stupidity. But I could never look the generations of children who have come through my classroom in the eye if I didn’t say I tried. How they will judge me, only time will tell.
Coda
During my morning run today, I was reminded yet again of how spectacularly beautiful this fall has been here in the State of Maryland. Seldom have I seen such rich colors last so long, and there is even this one oak on my walk to school where the rays of the rising sun hit it in such a way that I can only shake my head in awe at the metaphor for God chosen by the authors of Exodus—burning bush indeed!
However, this same beauty has made me recall the opening lines from James Stokesbury’s history of World War I which I reread this past August:
The summer of 1914 was the fairest in living memory. Grass had never been greener, nor skies bluer. Europe lay rich and ripening under the warming sun, and from the Ural Mountains to the wave-beaten west coast of Ireland the cows fattened, the newborn animals played in rich fields, and lovers strolled in the country lanes….So beautiful was that summer that those who survived it invested it with a golden haze; it assumed a retrospective poignancy, as if before it, all had been beautiful, and after it, nothing ever was again. It became the summer that the world ended, and it was somehow fitting that it should therefore be the most glorious summer ever (p. 11).
For a whole lot of people—many of whom don’t yet realize it, just as many didn’t in 1914—the world as they knew it ended on Nov. 5. Even science itself in this country may have ended, and what keeps me up at night about the looming battle is that while I am not yet truly elderly, I am also clearly no longer young, leaving me with a fraught and fretful question:
Who’s going to take up the mantle when I’m gone?
1For more on science in the kitchen, check out J. Kenji López-Alt’s The Food Lab: Better Home Cooking Through Science (New York: W.W. Norton & Company, 2015).
2To learn more about Hank Green and his on-line efforts to debunk falsehoods of all kinds, listen to the Nov. 22 episode of NPR’s On the Media.