“For now we see through a glass darkly,” said Paul (I Corinthians 13:12), “but then face to face.” By “glass,” the King James Version means “mirror.” But what if the “glass” is a computer screen instead? Last week, I taught an essay called “Look Up from Your Screen,” by Nicholas Tampio. The occasion of the article was a proposal by Bill Gates and former Secretary of Education Betsy DeVos to install computers in schools so that students may engage in “personal learning,” which means learning by themselves. Class time would be exchanged for computer time. The term “personal learning” is an interesting subterfuge, since it refers to what is actually impersonal learning, interaction with a screen rather than with human beings. Why would anyone propose this? Proponents tout educational advantages, especially that students would be able to learn at their own pace. But the real reason is money. Beware of proposals coming from those who stand to profit by them, whether by selling millions of computers to school systems or, in the case of a right-wing Secretary of Education, saving money through “efficiency,” cutting down on education’s biggest expense, teachers’ salaries. And maybe other things as well: it would provide a further excuse to cut things like field trips and arts programs—students can visit a farm or an art gallery online instead, if such things are even necessary and not distractions from the real task of teaching students “useful” information and job skills. This would appeal to voters who resent their taxpayer dollars going to such frills.
Advocates of the proposal might have thought they could count on a level of support from students themselves. After all, students like computers and don’t like going to class, right? But the article dates from 2018, before the pandemic. The pandemic deeply changed students’ attitudes toward screen learning. In the earlier years of my teaching career, a letter or editorial periodically appeared in the student newspaper complaining about teachers’ attendance requirements. The argument was, “If I can read the textbook and pass the exams on my own, why should I have to go to class? Class time is mostly wasted.” That kind of complaint has largely disappeared—to the extent, anyway, that it has not been replaced by students whose reason not to attend class is “social anxiety.” Most students, though, agree with their teachers: class on Zoom sucks. The experience of having to conduct a class session with two dozen talking postage stamps on a laptop screen, some of whom are visibly asleep, some of whom, in defiance of the rules, have not turned on their cameras, is deeply alienating. My strong impression is that students now value what are significantly called “contact hours,” and I strongly agree with them.
The model of education implied by “personal learning” is practice in quarrying information out of some database, whether that database is a textbook or a search engine or special educational programming. Such learning is a valuable and necessary foundation of all higher education, and the argument is not either-or. Computers can be helpful for the kind of education that consists of factual acquisition—for expository writing, or, in common lingo, reports. Students often come into freshman composition class thinking that college writing means reports. A number of students like reports because they are easy. They demand no thinking, only a passive collection of cut-and-pasted information. No wonder it is hard for them to see what is wrong with AI-generated reports. Why bother with the drudgery of collecting and pasting if a machine can do it for you? When they find that they are not to write reports but to engage in “critical thinking” about some topic, and one which rarely has a simple “right answer,” they may be bewildered.
I tell them that writing informational reports is a noble task. If you have a knack for it, you may do the rest of us a great service by writing intelligent popularizations of difficult subjects for lay readers. I glean what little I know, what little I have the capacity to understand, from popularizations of the sciences, mathematics, economics, and other subjects for which I have less than no talent. I am grateful to Stephen Jay Gould and Loren Eiseley on evolutionary biology, Martin Gardner on mathematics, Carl Sagan on astronomy, John McWhorter on linguistics, Paul Krugman on economics, Simon Winchester and Heather Cox Richardson on history, Alberto Manguel on books and the history of reading, and many more. All of them help me to be a little less ignorant—and all of them are stimulating, a pleasure to read. Most of these people have also published formal academic work in their specialization, but they are able to switch languages and speak to all of us through popular nonfiction in various forms, not just books but newspaper columns and Substack newsletters. It saddens me that, outside of Northrop Frye’s classic little book The Educated Imagination, there is relatively little in my own area, because literary studies got caught up in the culture wars and has often regarded the task of writing for an intelligent general public with great suspicion. Not only do I value such writing, but I am heartened by the fact that there is an audience for it.
To these I might add a different type of popularization. For years, I have been teaching myself to play guitar through instructional DVDs, and now online programs. Those of us of a certain age can tell you that back in the ’60s, when suddenly an entire generation wanted to learn to play guitar, you largely did it by ear—personal learning in a bad sense. There was tablature, but, to me, playing from tablature is like trying to read a novel by translating it from computer code. Learning by ear did develop your ear, but it was inefficient and laborious. Video instruction is a very good way to learn—personal learning at its best, perhaps. Nowadays there is a lot of bitty-and-piecey but useful visual advice on YouTube. So personal learning is not a bad thing. It becomes bad only when it is made the total model of the educational process, at which point it turns reductive and ultimately becomes a vehicle for the suppression of thought.
The knowledge of popularizations is what could be called consensus knowledge, that which is generally accepted, mostly not contested. But as students mature intellectually they come to learn that the disciplinary consensus, even in the hard sciences, is to some degree a polite fiction. Behind the scenes, there is relentless argument and a conflict of interpretations that can become outright nasty. No sense urging scholars to stick to the facts when what counts as a fact is the first bone of contention. Through what is called “critical thinking,” we introduce students to the idea that all knowledge is interpretation. The type of essay that students learn to write in freshman writing classes is one that incorporates information from sources, not for its own sake but as a means to the end of thinking about some question, problem, issue, or conflict of views. They have to learn the technique of close reading in order to grasp the argument or conclusion of each source, evaluate it for bias or logical weaknesses, and come up with their own conclusion. While the long-term goal is that, yes, eventually students will acquire the skill of doing such active reading, thinking, judging, and forming of conclusions on their own, in the beginning students cannot just be set up in front of a computer and told to perform such higher-level thinking.
The student sitting and learning alone on a computer is a model of the little individualist. But after a point that kind of individualism is an illusion. We are a social species, and we learn through social interaction. We learn through conversation, even when part of that conversation may be reading what others have written about a subject. When you do the reading for your doctoral research, you are told you have to acquaint yourself with the “critical conversation” surrounding your subject. The nature of higher learning is dialogical. We need to be face to face, literally or at least figuratively.
While the conversation is not necessarily adversarial, it is always exploratory, always challenging, always doing what science fiction writer Theodore Sturgeon called asking the next question—the one after the obvious question—reflexively imagining what the opposite of the seemingly obvious truth would be and asking whether that opposite has been ignored or suppressed. The natural vehicle for this kind of learning is the small discussion group. “Socratic method” meant Socrates and a small group of engaged students choosing a subject and delving into it, turning it inside out and upside down, trying to arrive at a deeper level of thinking than the kneejerk conventional. Paradoxically, while it tries to arrive at a kind of insight that is not just subjective opinion, it is not an impersonal inquiry at all. The students’ individual personalities, including all their quirks and neuroses but also all their gifts and intuitions, not to mention their humor, are involved in the inquiry because we think with our whole selves. Making ourselves over into some kind of impersonal reasoning machine is not the ideal of liberal education. And the teacher’s personality, quirks and all, is also part of the process. Students know this: they seek out the professors with whom they have an affinity and from whom they therefore learn more, and they may take every course those professors offer. Each of us has a group that my former wife used to call the fan club, but there is something serious in that joke-phrase. This is not a matter of egos and cultism—it can become that, all too easily, but when it does, it immediately becomes toxic. The real dedication is to the process of exploration itself and to the expanded vision that is the goal of that process, the Grail we all seek. Unlike the Grail quest of the old stories, which was one more zero-sum game with only one winner, this quest allows us all to achieve the Grail on our own terms, in our own vision, even if it is our personal version of a universal one.
Socrates was an oral teacher. We know his thought only because his student Plato recorded it, but Plato’s writings are dialogues, records of conversation face to face. It was a strange feature of the literary, philosophical, and cultural theories of the so-called theory wars of the 1970s through the 1990s that they were to a marked degree hostile to orality. Deconstruction in particular championed writing, which it saw as subverting what it called a coercive “metaphysics of presence” identified with speech. One of the things that “presence” means is face to face. Writing marks an absence: it exists in place of a speaker. It is therefore detached, literally and figuratively, from any speaker, and is to some degree “other,” not identical to its author. This makes writing more analytical and impersonal, and therefore congenial to disciplines like philosophy that developed away from the Socratic group towards the labyrinthine abstractions of their modern forms. From my own point of view, what is valid in this skeptical and ironic stance is its opposition to a certain type of orality that is indeed dangerous because it works to collectivize its audience. Socrates’ rivals were the Sophists, who made good money training the young men of Athens in the art of oratory that was invaluable to a career in politics. As Socrates complained, the Sophists did not care about truth: they cared about what moved the crowd. They taught the art of oral manipulation, and the type of manipulative speechmaking they advocated is a direct ancestor of today’s mass media. In my lifetime, the primary electronic vehicle of mass manipulation has moved from radio, used by Hitler on one side and Churchill on the other of the worst collective uprising in history; to television, which Orwell portrayed as an instrument of dictatorship in 1984; and now to social media. This electronic collectivizing is still oral in its basis—Walter Ong called it “secondary orality”—even though most of it is at a distance and not literally face to face, and it does indeed represent an enormous danger, because we don’t know how to control it. It is supplemented by a more traditional type of in-person rabble-rousing, represented by Trump’s MAGA rallies, in which he whips people into a lynch-mob frenzy, an adrenalin and dopamine high that gives his followers a jolt equal to any fentanyl or heroin, and to which they become addicted. It is the pure power of negative orality, and you do not have to be a post-structuralist to oppose it.
But note that such frenzies are the opposite of Socrates’ small-group face-to-face inquiries. The mass audience is faceless—the whole idea is to dissolve one’s face, one’s individuality, into something bigger and more powerful. Its “presence” is anything but metaphysical, though what “metaphysics of presence” usually signifies is the type of ideological writing that rationalizes collectivism. But that is the point. So far as I can see, the whole speech versus writing controversy is simply confused. There is a bad, collectivizing orality—but also a bad use of writing as an instrument of that collectivism. There is also a good, individualizing orality, and a use of writing that recreates and extends it.
What is it about being face to face that is so important? The classroom, or other form of the Socratic small group, exemplifies the fact that we have evolved to function this way, interacting directly in a manner that is not just social but, strange to say, also physical. Tampio cites Maurice Merleau-Ponty, whose philosophy was founded on the premise that the mind-body split is simply false: “In other words, human thinking emerges out of lived experience, and what we can do with our bodies profoundly shapes what philosophers think or scientists discover. ‘The entire universe of science is constructed upon the lived world,’ he wrote” (565). Tampio spells out the implication of this: “Humans are thinking animals whose thinking is always infused with our animality” (565).
Recorded dialogue captures the movement of ideas, but usually removes it from its context, in which so much more is going on that is a necessary part of the conversation. When I am trying to facilitate a discussion, the front of my mind is focused upon the ideas going back and forth among the speakers. But the back of my mind is monitoring half a dozen other things that are physical and emotional. I am reading the mood of the group, based on body language—how many people are slumped in their chairs, especially those trying to hide in the back; how many are on their phones; how many are yawning or restlessly shifting in their seats. But I am also noticing what is going on with individuals. When someone has an idea that has excited them, I see it in their face, and sometimes call on them, saying it’s obvious they are on to something. When someone suddenly walks out of class, was it a panic attack, bad news on their phone, illness, or something emotionally triggered by the discussion? I interpret these things based on what I know of the person’s personality and life. I myself am also reacting through body language and other physical responses. I move within the classroom, partly to hear people, partly by instinct, partly driven by restless energy. I laugh; I focus intently on what someone is trying to articulate, deciding whether to try to help or to keep silent and let them say what they have to say in their own time. Most of all, I keep track of what each student says—or of a student’s silence—recording it mentally as part of my picture of who they are, what their vision is, what their needs are. All of this changes, minute by minute. And I also keep track of the time because I have to draw it all to some kind of end that is not just abrupt or awkward.
All of this stuff is not mere distraction: the movement of ideas emerges out of it. It is a product of individual personalities with individual issues, and those are in turn grounded in bodily responses and unconscious motivations that influence our thinking at every moment. Some people prefer nice, clean abstractions, uprooted from all this mess of personalities and bodies. Abstractions are their escape from all that confusing and encumbering mess. But the kind of detachment that produces abstract thought is bought at a price, a price that could be described in the words of a character in one of Blake’s Prophecies: “Trying to become more than human, we have become less.” Ideas are expressions of personalities, and personalities are emanations of the body and the unconscious. The beginning of wisdom is to recognize and accept this. The end of wisdom might be to celebrate it. The educational enterprise does not have to be entirely limited to dialogue, merely grounded in it. In some classes, due to the nature of the material, lecture rather than discussion is the appropriate format. That does not, or should not, eliminate the back-and-forth with students, merely transfer the student response to written interpretive essays once the lecture has made the material accessible by providing necessary context, often historical.
But the ideal of the lecture is still oral, still in person. As a way to deliver information, a lecture is inefficient, but that is not its only purpose. A good lecture is a dramatic performance, and there is an interaction between the lecturer and the audience that is just as important as that between a stage actor or musical performer and their listeners. I have read many testimonies of folk musicians about how emotionally damaging the pandemic lockdown was for them. They depend on the energy that circulates between performer and audience. That is what makes life worthwhile, so much so that many of them are willing to live an itinerant life, traveling long distances to the next live audience, for the next experience of the magic that can happen face to face. Lecturers may have their own version of this in the form of book promotion tours, which can be at once exhausting and energizing. The chance to interact in person with one’s readers after writing a book or a blog alone and sending it out like a message in a bottle can be satisfying. Of course, lectures can be recorded, either digitally or in print, for those who cannot be present for the live performance, the latter being an example of the way in which writing can valuably extend an enterprise that begins face to face. Northrop Frye was a brilliant lecturer, and most of his many shorter books are lecture series turned into print. The Educated Imagination was six radio talks on the CBC. These books retain much of the warm, conversational voice of a good lecture, as well as things regarded as irrelevant to formal articles, such as pacing and a sense of dramatic structure. These last are musical elements, and it is often remarked how Frye’s training as a pianist informs his writing.
Formal academic style typically rejects this kind of accessibility. Both traditional and radical critics may see it as pandering, evidence of a lack of intellectual rigor, and, worse, of a kind of dishonesty that simplifies difficult and ironic argument into something easy and comforting for a lazy and complacent bourgeois audience. The result is a formal prose style that lacks all oral animation, that attempts so far as possible to be “writerly,” as it is called, which means, instead of a voice, a labyrinth of “signifiers” that lead to other signifiers, signifying nothing except to those who are initiated into the academic code. Both traditional and radical academic discourse aspire to the impersonal. Traditional academic style rejects the personal as bias and attempts to be the voice of rational empiricism, just the facts, ma’am. Radical academic style rejects the very idea of the individual as a fiction of the privileged. The “individual” and the “human” are instead “demystified” into a play of signifiers. What both scientific and radical formalism are reluctant to admit is that the ideal of the impersonal has a Jungian shadow. The impersonal style all too easily becomes a language of authority used by regimes that wish to deny personal responsibility for their actions, and to lie with impressive-sounding authority. The classic essay about this is Orwell’s “Politics and the English Language,” a critique of the pseudo-scientific euphemisms used by politicians as a kind of smokescreen, in which bombs that kill babies and shred flesh become “anti-personnel devices.” There is no voice of a real human being taking responsibility for what is said, deliberately giving rise to the impression that, well, these things happen and there’s nothing to be done about them. Thus, writing is not superior to speech. Both are pattern-creating products of the imagination, forms of power, good or evil depending on how they are used.
Socratic dialogue is a Classical ideal. Christianity, like most religions, may seem at first to present an alternative, in the form of an absolute truth delivered to a passive audience. Jesus’ most famous public teaching is the Sermon on the Mount. Except that in the Gospel of Luke it is delivered on a “plain.” It is Matthew who put Jesus on a height above his listeners, and it was not just because the acoustics were better. Matthew is obsessed throughout with showing that Jesus is the fulfillment of Old Testament prophecies, and Biblical scholars generally agree that his placement of Jesus on a mount is a piece of interpretation, designed to parallel Moses’ receiving of the Law on top of Mt. Sinai. This is just a small example of how attempts to isolate the “historical Jesus” and “what really happened” from all the later mythologizing are futile. Any “literal” reading of the Gospels is a construct. This is true even of the central event of the Resurrection: Mark, the earliest Gospel, breaks off in mid-sentence and leaves us with an empty tomb, but no risen Christ. So, Jesus, like Socrates, taught orally, face to face, but whatever he said is hypothetical. What we have are interpretations, some of them ideologically motivated, some mere accidents of the game of telephone that is historical transmission. Monty Python satirize this brilliantly in Life of Brian, suggesting that maybe it was a matter of the acoustics after all: at the Sermon on the Mount, the listeners in the back row, when Jesus gets to “Blessed are the peacemakers,” hear “Blessed are the cheesemakers.”
Christianity began as a religion of small groups of kindred spirits, two or three gathered in Christ’s name. But it opted to go big time, becoming an institution with world-conquering aspirations, and it did so, first by allying itself with the Roman Empire, and, second, by establishing top-down theological authority and declaring all attempts at “dialogue” or difference to be heretical. Jesus did his share of playing to rock-stadium-sized crowds, but when he taught his disciples, it was in small groups that resembled those of Socrates. And the teaching was more interactive than we have often been led to believe. Among the group of kindred spirits, Jesus taught less through precepts than through parables (in the Synoptic Gospels) and metaphors (in John). Parables and metaphors do not give the “right answer.” They demand that the listener think and wrestle with what seems riddling and paradoxical. Jesus is very much not demanding that the disciples obediently parrot certain messages and commands. He is, like a good teacher, demanding that they think, and think hard. The disciples, like Socrates’ students, routinely blow it, and both Jesus and Socrates then have to take over the conversation and guide it, which is not necessarily the same as squelching it. He did the same thing with his hecklers, the scribes and Pharisees, turning their challenges against them and outdebating them.
It is this kind of active, engaged response that Milton saw as essential to Christianity. His great defense of it is his prose essay Areopagitica, which is far more than a mere argument against censorship. Someone, he says, who passively accepts doctrines without questioning and actively understanding them is what he calls a “heretic in the truth.” At the center of every one of Milton’s major poems is a face-to-face debate, an agon or contest of greater import than any fight with a dragon by a traditional warrior. There will always be tempting voices, smooth talkers trying to con us into pledging our allegiance to the wrong cause, the cause whose essence is always, whatever the disguise, the will to power. Just being nice, obedient subjects is not sufficient. God respects us enough to demand that we be able to think critically and debate articulately for the good, to resist being conned by the attractive illusions offered by life’s various tempters. Eve learned this the hard way. In Paradise Lost, she is face to face with the serpent, and the serpent wins. She has insisted on leaving behind the conversational network that might have helped her see through the serpent’s lies and verbal manipulations, and stands, and falls, tragically alone. Milton is capable of showing a woman strong enough to resist being “seduced”: the protagonist of his masque Comus is only 15 years old (or at least that was the age of the girl who played her in its production), but she verbally trounces the evil Comus. Milton’s greatest debate poem, though, is Paradise Regained, based on Christ’s temptation in the wilderness by Satan. Once again the debate is face to face, a verbal agon or contest, and Jesus wins it as a man, not a god.
This is what is demanded of us: the Christian battle is inward and mental, not outward and physical. Winning consists of clarification, of the revealing of truth by its separation from error. Milton was influenced by the Classical tradition of Socratic debate, which had become part of the humanistic system of education in his time. Some of his school exercises are debates on conventional subjects, such as whether day is superior to night. His matched pair of pastoral poems, L’Allegro and Il Penseroso, implicitly force the reader to take part in the debate between Mirth and Melancholy. Each personified figure makes the case for their mood, and then it is left up to the reader—there is no “right answer.” However, Milton saw the demand for an active, questioning faith as specifically Protestant. Catholicism thinks it already has the truth, and sees the task of the Church as preserving it unchanged. If doctrine needs to be interpreted, it must be by properly sanctioned authorities. But Milton preaches tolerance of dissent and the inevitability of disagreement in Areopagitica because we do not have the truth, but are only seeking it, hopefully approximating it more and more but never in full possession of understanding. Northrop Frye’s two books on the Bible are based on a Protestant attitude that the Word has the last word, so to speak. It is infinite, and our understandings are finite, and it may always break through our attempts to cage and control it with our interpretations.
Frye uses the traditional term kerygma, usually translated “proclamation,” to denote this power of the Word to shatter and then reveal, to decreate but then recreate our understanding, and it is for that reason that both The Great Code and Words with Power end with a discussion of the Book of Job, in which Job, through debating his three friends and Elihu, is really debating God himself. God’s answer is so kerygmatic that we don’t really know what to do with it, but his speech must mean more than “Shut up and obey, puny human,” even if that is what it sounds like. If nothing else, God’s tremendous poetry makes us hesitate to dismiss him as an Old Testament Galactus, the tyrant Blake called Nobodaddy, nobody’s daddy. Job himself seems to have broken through to a vision beyond that of the authoritarian bully. He does not explain, perhaps because, like Dante in the Paradiso, he has been momentarily “transhumanized,” and what he has known cannot be explained in ordinary subject-object language. Yet we have one clue: “Though worms destroy this body, yet in my flesh shall I see God” (Job 19:26). Right now, God is a talking whirlwind. But there is another state in which Job and God meet face to face, and, in their meeting, God ceases to be an alienated other and becomes instead what Frye called a “spiritual other.” Blake represents this in his illuminations to Job by giving Job and God the same face.
The face-to-face encounter in a classroom generates what teachers call “chemistry”—except for the times when it doesn’t. My colleagues and I all know the experience of teaching a class with multiple, identical sections, like freshman comp, the same material taught the same way by the same teacher, and having one class effervescent with engaged energy and another one a leaden ordeal to get through. The difference lies in the students. This is true of all human groups.
What I am calling the face-to-face encounter is the heart of liberal education, and it has its center in the humanities. However, universities are presently being dismantled by forces that would like to raze them to the ground and rebuild them as centers of “practical education.” Those forces consist of big-money capitalism in league with right-wing political and religious factions—the usual suspects. Thus far, they have been so appallingly successful that it is uncertain whether liberal education will survive another generation. The cover story is “declining enrollment”—sorry, the number of students is diminishing and there’s nothing to be done. Higher education needs to be ruthlessly downsized, too bad. This may sound plausible, but it is an example of what Milton called “necessity, the tyrant’s plea.” Declining demographics are only one cause of the crisis in higher education. The deeper cause is the withdrawal of support, monetary and otherwise, that has been going on now for decades. The fact that most students now graduate with crippling debt is not some fact of nature—like climate change, it has been produced. Universities are in debt partly because they now have to provide enormous amounts of financial aid to enable students to attend at all. Enrollment decline could be largely counterbalanced, or at least the present catastrophe avoided, by a return to the levels of public and governmental support that used to be considered normal. In Europe, students have traditionally had the right to a free higher education, not out of airy-fairy idealism but because democracy depends on people who are accurately informed and can think critically. But there are many forces in the United States that very much do not want people to be able to think.
And so the radical surgery of downsizing. What is cut is community. In my academic building, there are now no secretaries at all. Department chairs must do everything for themselves. The lack of a secretary does not just mean that the workload falls on someone else. Our wonderful secretary, Lee Ann Jindra, who wisely got out several years ago before she would have been fired, was not only good at her job but beloved of students and faculty alike. Her office was the symbolic hearth of the English Department. People would meet up randomly in her office and end up having conversations—students, staff, and faculty, all face to face. This kind of thing is remembered by students for years, in fact decades, as a part of the warm feelings people have about their college years. Yet no one talks any longer of the “college experience” as a reason students would pay for four years of schooling. Extracurricular activities have also been axed. There is no student newspaper this semester, and there has been no yearbook for a long time. Events in which faculty, staff, and students bond, such as awards ceremonies, have been cut to a minimum: when they occur, they are austerely low-budget. Faculty used to end up talking in the hallways, but that doesn’t happen now. People go in, teach their courses, and leave, as much from a lack of social atmosphere as because there is no time, though faculty course loads have in fact been increased. The conservative strategy of “starve the beast” is being applied to higher education, and there is no resistance to the onslaught.
But liberal education is not entirely social and extraverted. Interchange in the classroom finds its necessary counterbalancing opposite when the student has to go back to the dorm and write a paper. Writing is a solitary act, and it turns the writer inward. I urge students not to write the superficial kind of essay that says the obvious. They have a choice as to what to write on, and I encourage them to pick something that engages them, something they feel the need to explore, to wrestle with, to figure out. Their conclusion will be based on “close reading” of the text—I am not asking for an opinion paper. But they should, ideally, arrive at something that is their insight, that belongs to them, perhaps something only they could have said because, again, our thinking is based on our individuality, which includes our feelings, our past history, even our bodily condition (more so than ever in an age in which students suffer from all sorts of mental health issues). If students are liberated from the expectation of following instructions and giving the teacher what the teacher is presumed to want, the result can be what we call a “breakthrough paper,” written on a level that leaps beyond what the student was previously capable of. Students may remember their breakthrough moments all their lives, and they should. It is the moment when they found their true identity, when they came into their own, when they discovered their authentic voice.
If this goes down, we go down. What Christianity calls the “natural man,” expanded to include all genders, does not want liberal education. The natural self hungers for authoritarianism and tribalism. This election campaign has revealed starkly how powerful that urge is, sucking in not just the deplorables but also people who once had decent impulses yet were too weak to resist temptation and have given in to collectivism, to the mob. In the end, we battle ourselves. Whether individually, when reading, writing, and thinking alone, or in the public arena where we must choose—“Reason is but choosing,” Milton says—we are always face to face. Whether in the solitude of loneliness or the solitude of the crowd, we are always face to face with ourselves. We must look into that dark mirror to see ourselves reflected there, and talk face to face with our own otherness, with the self we do not want to know, but must. This is what education really means. The “liberal” in liberal education means liberating, and what it liberates us from, ideally, is ourselves.
Reference
Tampio, Nicholas. “Look Up from Your Screen.” The Norton Reader, 16th ed., edited by Melissa A. Goldthwaite, Joseph Bizup, and Anne E. Fernald, Norton, 2024.