I have become aware that I do not remember some things that people tell me. Yes, that happens to everyone, but with me it seems more frequent. I have on a few occasions forgotten serious, personal information that people I am close to have told me. Sometimes I remember bits and pieces of it. Occasionally, it is simply gone, and I have no memory even of being told. No, despite my age, I do not think it is dementia, because, having seen that it is a pattern and not random, I realize that I have always been this way. That is in fact one way of stating it: this is not a decline but a sudden insight into what I have always been. And it seems built in: a feature, not a bug. It resembles dementia, however, in the way it isolates. We all see how people with dementia are marooned on a kind of island, to some degree cut off from the full experience of life. My forgetfulness produces the same result but has a different origin, or so I think. Hearing loss, which in my case comes simply from old age, does not cause this obliviousness, but it compounds it. Hearing aids have been transformative for me, and yet they do not restore hearing the way corrective lenses or cataract surgery can restore vision to near perfection. Some degree of hearing loss is permanent. Those of us who suffer from it tend to describe it in the same way: “I can hear you, but I can’t understand what you’re saying.” It isn’t a problem of volume but of clarity. At least some of the words are lost, not sharp enough to make out. You wish for an adjustment that could bring them into focus the way you can focus binoculars, but there isn’t one. But what I am talking about is different, although the resulting loss of some degree of full experience is similar. I am forced to recognize that I walk around missing part of the life around me, like a goldfish that sees only dim shapes beyond the bowl.
I am not at all bemoaning my fate. If this is my worst affliction in life, I am very lucky—and I am very lucky, or at least have been so far. I am not very interested in my personal eccentricities for their own sake but rather for what they may reveal of psychological patterns common to other people, maybe to everyone. The diagnostic terms of academic psychology have a limited usefulness in my case. However, what I experience, or fail to experience, fits a carefully delimited portion of what is now called “inattentive ADHD,” which replaces the old term ADD. But that definition is too broad. What has the ring of truth for me, rather, is the description of the particular aspect of inattentive ADHD that involves what is called “prospective memory,” for which the rather evocative description is “remembering to remember.” The problem is that things are not retained in short-term memory. It is especially acute in one-on-one conversation. Someone tells me something, and then the conversation moves on to many other topics. Those topics overwrite the short-term memory of what the person told me, as a computer file may be overwritten and lost. I am quite uneasy about the danger of trying to use fancy psych lingo as a way of absolving myself for what may simply be a kind of narcissistic self-absorption. Maybe I just don’t pay enough attention. And yet, descriptions by others of their own problems with “prospective memory” have startled me. If others are like this too, maybe it is more than just rationalized self-centeredness. Here is a description by Tim Beshara from a “guest blog” called “Inattentive ADHD and Me”:
You also have a rubbish working memory. Your long-term memory can be excellent, but your ability to temporarily hold two or three pieces of information in your mind at any one time is limited. If you are typing on your computer and someone asks you to remember to call someone, you will nod and say yes, you will actively try to remember but the information never lodges.
Aligned with this is a deficiency in your prospective memory. Prospective memory is all about being good at remembering to remember. The thing about tasks is that they are set to be done at a specific time. “I need to pay this bill when I get home.” “I need to pack my lunch when I leave for work.” “I need to go to the post office at lunchtime.” With inattentive ADHD you store these pieces of information as you would an answer to a trivial pursuit question, not as a note in a diary. So even if I’ve reminded myself several times I need to put my lunch in my bag before I walk out the door for work, the thought will simply not enter my mind at all.
For me, that nails it. That is exactly how I am. It is genuinely startling to realize for the first time that others may be that way too. Whether that excuses my steady stream of small failures is another matter. It is part of adult responsibility to remember things, especially things that may not be as trivial as packing a lunch. Whether we can help our failures or not, we must take responsibility for them, because they affect other people. That is especially true when we forget things that other people tell us, things that are important to them.
Memory seems at first a commonplace topic, and yet, the more we think about it, the more complex and mysterious it seems. We may start by distinguishing between objective and subjective memory. By objective memory, I simply mean information, memory as a database containing various facts. Before the advent of writing and other ways to record information permanently, there was a great deal of anxiety about forgetting. Poets were repositories of their cultures’ vital information—their myths, their histories and legends, their genealogies, their cultural values. They developed prodigious memories, but also called for backup in the form of the inspiring Muses, whose mother was Mnemosyne, a name that means “memory.” Continuity of memory in an oral culture is a fragile and precarious thing. The development of writing must have been a great reassurance. A written text is like a safety deposit box, a guarantee that the information will be preserved despite death and forgetfulness. But every technological invention comes with a price. When asked about the impact of the newfangled invention, Socrates in the Phaedrus predicted that literacy would make everyone lose their memories. And he was right, looking forward shrewdly to students who see no need to memorize because if they want to know something, they’ll look it up on their phones. But orality in fact survives alongside literacy. In both Classical Greece and the Renaissance, oratory was a vehicle of political power. Renaissance rhetoric developed the classical art of memory into a technique called the “memory theatre,” whereby a speaker could memorize the elements of a speech by associating them with the various rooms of a building. There are oral traditions of various kinds even in modern times. African American culture had the figure of the songster, who knew hundreds of songs in all the popular genres—blues, folk songs, pop tunes—and drew upon them to fit the occasion. I have always been in awe of people like actors, who memorize parts for show after show, and conductors, who direct from a score but must know all the parts of all the instruments.
Part of the answer lies in the difference between short-term and long-term memory. I lament that I have no memory of what someone told me two days ago, but at the same time the lyrics of folk songs that I memorized 50 years ago return to me readily: they have become permanent acquisitions. Details of much of the literature that I teach, especially works I have been teaching for 40 years, are likewise embedded securely. The same is true for the works of my chief mentors, Northrop Frye and C.G. Jung. I have read their work so often and so intensively that it has become part of me. A lifetime of reading and thinking has built up the reservoir of informational memory that makes writing this newsletter possible. Once I light upon a topic that carries a certain energy charge for me, relevant information floats up out of memory and builds up a webwork of associations, which I then try to analyze and interpret. When I tell friends that I have a bad memory, they may look at me askance because they know I am capable of expounding on certain subjects at length, although I very much try not to unless I am in a classroom or writing.

Even Northrop Frye, who was said to have a photographic memory, once confused the Canadian poets A.M. Klein and Eli Mandel in a closing reference, realizing it only as he returned to his office. He was already 70 and past the official age of retirement, but he shook his head as he told his secretary, “That never used to happen to me.”
Since the brothers Grimm began collecting the folktales of German oral culture, writing down stories that had been passed down orally from generation to generation, there has been an increasing effort to collect and preserve the stories, songs, and folklore of oral tradition before they die away. In the early 20th century, Cecil Sharp in England and John and Alan Lomax in the United States collected the folk music that had hitherto been dismissed as the crude entertainment of an ignorant working class, eventually moving from printed notation to recordings and creating invaluable archives at Cecil Sharp House and the Library of Congress. Whatever social problems it has created, the advent of digital culture within my lifetime has meant the ability to document and preserve almost everything. Similarly, historic newspaper comic strips, formerly available only in expensive reprint editions, are now frequently available online. YouTube videos document both cultural and political experience. Anyone can pull up a Neil Young concert from 1972 or Martin Luther King’s “I have a dream” speech. Surely nothing can be lost ever again: we can be confident that the sum total of human experience is preserved in the cloud somewhere.
Yet there is still loss of a different kind. If you preserve it, will they come? Material preserved archivally can be lost just as surely as the old photos and letters in grandma’s attic, and for the same reason: subsequent generations simply do not know it is there, and are not minded to look for things they have never heard of. It is tempting to coin a term such as “cultural inattentive ADHD” to describe the loss of material in plain sight. Over the last several decades, acoustic guitarists who when young sat at the feet of the old blues and folk masters and learned their playing styles have recorded instructional DVDs, now digital downloads, so that anyone can learn the techniques of Mississippi John Hurt, Gary Davis, or Robert Johnson. But when I taught blues and folk music, my students had never heard of such people. Most of them knew of Bob Dylan, Joni Mitchell, and a few other icons, especially if their parents were fans. But how many of them will ever even listen to the old music, let alone be inspired to learn to play acoustic fingerstyle? In the same way, newspaper comic strip creators are glum. The newspapers are disappearing, and young audiences know graphic novels but not the comic strips that preceded them. The young are not insensitive: they fail to value certain elements of the past simply because they have never been exposed to them, let alone been inspired to carry the tradition forward to new developments in the future.
Authors have always been aware that their immortality depends upon the attention of their audience. When he was younger, Milton wrote of his aspiration to write a work that future generations “will not willingly let die.” But in the work that he finally wrote, Paradise Lost, he wistfully recognized that he was likely to have “fit audience, though few.” The fantasist Harlan Ellison was openly anxious about whether his work would survive, because it was his immortality: as an atheist, he believed in no other kind. I think his work will indeed survive, but there is no guarantee. Creators may disappear from the public eye for decades, even centuries, then unexpectedly be resurrected by changes in critical or cultural fashion. Sometimes a critic of genius “remembers” a neglected writer of genius and sparks a revival. T.S. Eliot did this for John Donne after Donne had been relegated to the dustbin of “minor eccentrics” for two centuries. Northrop Frye’s Fearful Symmetry almost singlehandedly changed William Blake from a disregarded minor lyricist into one of the major Romantic poets. Alice Walker championed a forgotten book by a forgotten author, Their Eyes Were Watching God by Zora Neale Hurston, which has since become part of the canon.
If the culture of their parents’ and grandparents’ generation often remains invisible to the young, the same is even more true of their forebears’ personal lives. When I teach the Odyssey, I focus students’ attention on Penelope’s statement to her son Telemachus that, if Odysseus does return, even after 20 years, she will know it is truly him because “there are secret signs we know, we two.” Telemachus is understandably baffled by this, but that is exactly what happens. Penelope knows it is truly her husband and not an impostor when Odysseus knows the secret that their marital bed is carved from an olive tree still rooted in the ground, symbol of their marriage. I pair this episode with James Dickey’s wonderful autobiographical poem “The Celebration,” in which he tells how, as an adult, he secretly followed his parents, whom he had unaccountably spotted at a midway, realizing with shock that these middle-aged fogies were on a date. It had never occurred to him before that his parents had lives. They were his parents, and parents don’t have lives. Except that they do, and they have not told you nearly everything about their past. Old photographs can produce something of the same effect, and old letters even more so, back when people wrote letters. Who are these young people, younger than you are now, so full of passionate feeling?
However, the ability to document and preserve the past has made us more aware that both personal and cultural memories have a tendency to be revisionist. Sometimes we forget, not because of some neurological disability, but because we don’t want to remember. And what we do not forget, we may revise, unaware that we are doing so. Friends have reminded me of episodes from the past that I have conveniently forgotten. The unwelcome information may also come from things you have written in the past. One aspect of my inattentive ADHD, or so I believe, is that I can’t remember things I have written. That certainly includes past newsletters. I live in some fear of writing on a topic, only to realize that I wrote about it three years ago. Past newsletters in my files have a label identifying their topic, but even so I usually have only a dim memory of what is in them. That is one type of memory loss. But I become aware of the other, revisionist type when I go back and read undergraduate essays I wrote 50 years ago, plus a few surviving personal letters. Who is this person? I recognize the voice, and the fairly fluent prose, but who is this restless, rebellious, tactless young punk? I had a lucky upbringing, all told—what cause had I to be so prickly and difficult? I genuinely don’t remember and don’t understand this stranger, nor do I much like him. Recently, I attended a memorial service for my mentor and friend Ted Harakas, who knew me when I was 18 and he was 33. Stories were told about Ted, but to my amazement a half dozen complete strangers came up to me and asked me about stories that Ted had told about me when we were young. He apparently found the episodes amusing. I was mildly appalled. But it led me to think that perhaps there should be memorial services where all the stories are told, not just the ones that make us look good—like a public version of Catholic confession. I do not want or expect a memorial service, but any such service should speak of the real me, complete with my Jungian shadow.
The same is true of institutional memory. There are several histories of my school, Baldwin Wallace University, formerly Baldwin-Wallace College. They are all good, but they are all official—not dishonest, but selective, and maybe not even deliberately selective. I know some things that have been left out because I was there in past years, all the way back to 1969, before at least one of the authors had been born. I remember things that perhaps no one else now does, things that were part of the reality of the time. I remember curfew for women (not men) in my freshman year; I remember the drug bust in which the police went into the dorms; I remember angry meetings about the faculty firings of 1973; I remember buildings that do not exist anymore, and still-existing buildings as they were before they were remodeled and modernized; I remember the one and only BW folk festival, with legendary names like John Lee Hooker and Mississippi Fred McDowell and Mike Seeger playing in the college union; I remember brilliant faculty like Ruby Redinger; and of course I remember who slept with whom. I wrote a newsletter about “invisible buildings” that no longer exist. But is there another potential newsletter about what the buildings that still exist have witnessed in their time? I think of this in a rather uneasy way when staying in hotels, but more positively about the apartments and houses in which I have lived. What families lived in this space before me, what dramas unfolded, both tragic and comic? Not long ago, I happened to encounter the present occupants of the house in which I grew up, a house now close to a century old. When I told those young people, “I grew up here starting about 68 years ago,” I must have sounded like the Ancient Mariner to the Wedding Guest. Yet I would gladly listen to some crazy old loon who could tell me of that house before we lived in it, or about the house I live in now. Fantasy and horror stories are premised on the lingering presence of the past in certain dwellings. Sometimes the ghosts are benign, though sometimes they are demonic, as in Stephen King’s The Shining.
On a larger scale, we are always faced with the task of coping with cultural inattentive ADHD, the forgetting of the past. I did not understand this when young. As a long-haired hippie, I had no interest in or patience with the past, which seemed—the Ancient Mariner again—to be only an albatross around the neck of the present. We wanted to abolish the past, break the hold of its dead hand upon our lives. We were right, in part. The times they were a-changin’, because America was in certain ways overdue for change. Segregation still existed; homosexuality was in some places still illegal, and everywhere classed as a pathology; nice girls didn’t, and middle-class housewives often lived lives of quiet desperation; we grew up amidst Cold War paranoia, under the dark shadow of a possible World War III. Historical revisionism was just beginning to be in the air, and we began to realize that the version of American history we had been taught in school, not to mention in popular films and television shows, was doctored, cleaned up, censored, an idealized version of the American story from the point of view of a privileged white middle class. Such skepticism about historical memory soon led to skepticism about those monuments of the past that we call the classics, or the literary canon. To what extent are the Great Books either tombstones over the graves of dead ideas or repositories of destructive ideology?
When I arrived on the Baldwin-Wallace campus in 1969, you could say that I was unwittingly cutting edge, because I brought with me an impatience with any literature I did not consider “relevant.” One of the stories that Ted told a fair number of people was how my final essay in his British Literature Survey I course was titled “The Care and Feeding of Sacred Cows” and argued that much of what we had read was obsolete, so what was the use of studying it? Ah, be careful, for there really is such a thing as karma, and you may live long enough to find that the joke is on you. I have spent 40 years teaching mostly those old classics, and in a few weeks will begin teaching Survey of British Literature I. Ted is somewhere laughing his ass off, and I am laughing with him. I intend to tell my students this story, and hope they laugh too. I hope they understand why I am telling it. I now know the answer to my youthful impatience. I teach the old works precisely in order to recover the memory of what it was like to be the people of earlier centuries, to understand why they felt as they did, saw life as they did, and recorded their sense of its meaning in these ancient works. I will never share their values: the endless warfare and feuding in the Iliad and Beowulf will always repel me. But the people do not. Both of those works are pervaded by a darkly tragic sense of being trapped in their own ideology, unable to break out of it, as if it were a curse. That sense of ideological imprisonment is the other side of my guiding theme of the imagination as the power that can break people out of their ideological prisons.
Nor are the works themselves outmoded because they are to some extent in the grip of outmoded ideologies. They are not simply propaganda, but show how people struggled to live, to love, to excel and be heroes, to be human, although doing so meant somehow coming to terms with a value system that defined a hero as a killing machine and women as “war prizes.” Nor are we so removed from their situation, for we too are struggling to free ourselves from the curse of the dead values of the past, which turn out not to be so dead after all. In the past decade we have discovered that the Civil War is still being fought. Southerners resist the attempt to get rid of statues of Confederate generals and rename buildings; here in the north, in rural Ohio, there are Confederate flags flying at a few houses. Bobbie Ann Mason’s brilliant story “Shiloh” is about a young Kentucky couple in the 1980s whose marriage is failing. They married young, never had an education, and “missed the 60’s.” Now the wife is taking college classes and preparing to fly—imagery of birds runs all the way through the story. The husband knows it, but his disability from a trucking accident symbolizes a deeper mental disability that keeps him passive and trapped in a way of life that clearly needs to be over. For far too long they have been deliberately forgetting their tragic past, which included the death of their baby—another thematic image is the “dust ruffle” that hides the dust under the bed. But that only works for so long. The final breakup occurs when the couple visits the battlefield of Shiloh. Neither had thought much about history before, but both are now beginning to realize that people can become so divided that the result amounts to a civil war. The story is more than four decades old, yet it seems newly topical in this election year.
There are not one but two genealogies in the Gospels attempting to prove that Jesus as the Messiah fulfills the prophecies by coming from the line of David. The fact that the two do not agree is irrelevant, because in fact Joseph is not really the father of Jesus. The genealogies exist because people expected the Messiah to be the descendant of kings—not a carpenter’s son born in a stable. Yet Jesus’ story follows the outlines of the myth of the hero, who is regularly born in humble or even disreputable circumstances. Realistic versions of the myth abound in the entertainment world: we are not surprised, and indeed rather expect, that a famous actor, comedian, or singer was in the beginning a nondescript person with a commonplace name. In the past, the humble origins were quite deliberately “forgotten,” though the subterfuge was usually easy to guess. After all, no one is really named Rock Hudson—including Roy Harold Scherer, Jr. No one is really named John Wayne, much less “the Duke”—including Marion Robert Morrison. No one for sure is named Lady Gaga—although Stefani Germanotta has been clever in making the transformation of identity part of her myth instead of obscuring it.
“To thine own self be true” is not wisdom if you can only achieve success and respect by becoming someone different and “forgetting” your origins. I have just finished a fascinating new book, Jelly Roll Blues: Censored Songs and Hidden Histories, by Elijah Wald, an acoustic guitarist who also holds a Ph.D. in ethnomusicology and sociolinguistics and is the author of a dozen books, all of which in various ways try to “remember” parts of America’s musical past that have been forgotten, sometimes deliberately. This book takes as its starting point an extraordinary, lengthy interview with Jelly Roll Morton, complete with musical examples, that Alan Lomax conducted for the Library of Congress in 1938. By that point, Morton wanted to be known as a serious jazz musician—so desperately that he claimed to have invented jazz, a claim not accepted by scholars. But Lomax, who was always pursuing “authenticity” in popular music, kept pushing Morton to talk about and play examples of the kind of music by which he made his living when he was starting out early in the century.
Morton rather reluctantly talked about and played examples of such music, which was called “blues” although it was not the classic 12-bar blues of the Delta. It was rather semi-improvisational dance and entertainment music, often with lyrics regarded as unprintable until recently, and its audience was the prostitutes and their clients in New Orleans houses during an era when a lot of women supplemented their income by turning a few tricks and were not particularly stigmatized for it. Louis Armstrong emerged from the same milieu. It was a lowdown environment, often violent, and not for the faint of heart, or nose. A song from that era called “Buddy Bolden’s Blues” calls for someone to open the window to let the stink out. These were people who did not have the means to take baths very often, and by all reports the human stench could be literally overpowering. Without Lomax’s urging, Morton would have been happy enough to skip over his scandalous beginnings—although he retained the nickname “Jelly Roll” as a reminder. “Jelly roll” is blues slang for either the female genitals or sex itself. But then, the word “jazz” emerged from the same funky environment and originally had pretty much the same meaning, namely, sex.
Such deliberate forgetting differs from what we could call subjective inattentive ADHD of the sort with which this discussion began. Perception and its limitations and ambiguities did not really become a major theme of literature until the Romantic period, although Hamlet looks forward to it, which is why it so obsessed the Romantics. In traditional romance there are plenty of illusions, but they are usually externally generated, by evil magicians and the like. In Romanticism, the illusions may be generated from within, may even be an aspect of what we are. Romanticism was powerfully influenced by the philosophy of Kant, which showed that the mind did not just passively perceive external reality but to some degree actually constructed it according to its own a priori categories of understanding. In addition, the concept of a deeper unconscious level of the mind actually emerged long before Freud, and it was well understood that the combination of a reality-constructing mind and an unconscious that preserved or repressed memories for emotional reasons led to the conclusion that we all perceive different realities and remember—or fail to remember—experiences differently.
This led to such works as Robert Browning’s The Ring and the Book (1868), in which ten dramatic monologues by ten different characters recount contradictory versions of the same multiple murder. Akira Kurosawa’s famous film Rashomon (1950) is similar: one after another, the characters recount the events of a murder in mutually exclusive ways. It is really the technique of many mystery stories—all three of Kenneth Branagh’s film versions of Agatha Christie novels follow the pattern. Sometimes the people who testify are lying, but sometimes their understanding is limited by their point of view. The relativity of perception is explored in “literary” fiction as well, for example in Faulkner’s The Sound and the Fury. A repeated technique is to adopt a child’s view of adult behavior, the child being a detached outsider who can therefore see things that adults are oblivious of.
A tour de force using this technique is Henry James’s What Maisie Knew (1897). The entire story is told from Maisie’s point of view as she is caught in a world of egocentric, manipulative adults, but she comes to understand their deficiencies and thereby survives her dysfunctional environment. We all remember eavesdropping, deliberately or accidentally, on our parents and other adults, trying to figure out their strange ways. In his “Preface” to the novel, James says that “Small children have many more perceptions than they have terms to translate them; their vision is at any moment much richer, their apprehension even constantly stronger, than their prompt, their at all producible, vocabulary” (27). What Maisie “knows” is thus intuitive and largely prelinguistic. Northrop Frye says that “The story What Maisie Knew, being about a child, is, as the preface explains, the story of what Maisie knew but didn’t altogether know she knew. That, incidentally, is the technical reason for an omniscient narrator, to tell us what his characters know but don’t know they know, or feel but don’t feel that they feel” (353). The point of the story, however, is that Maisie still knows more than the adults around her, sleepwalking in their selfishness. Moreover, she knows more than the narrator, for all his pedantic prose.
Our discussion has moved from the remorse of having forgotten something it was vital to remember to the shock of being told what you have not known about yourself. This is a kind of anagnorisis, or recognition, Aristotle’s term for what happens at the climax of a tragedy. The feeling that “I don’t really know myself” evokes terror, the terror of madness, and such a recognition inverts the traditional hero’s quest. A traditional hero faces antagonists that are external, monsters and villains. The sudden knowledge that I may be the monster or villain can be utterly demoralizing. The potential horror here is deeper than that of even a remorseful villain like Milton’s Satan. In Paradise Lost, Satan has chosen evil willingly and knows he is guilty. But it is a deeper form of damnation to try to be good, seem good, think yourself good, and yet find in the end that you are not good at all but have only been deluding yourself. The pattern is not completely unknown in traditional tragedy. In Sophocles’ Ajax, the hero goes mad and slaughters a flock of sheep, thinking they are the Achaeans who have shamed him. Seneca’s Hercules, in Hercules Furens, or “The Mad Hercules,” does worse and slaughters his family. Both heroes awake to the horror of what they have done. But Christianity universalized the pattern, using it to define the fallen human condition. This is what Catholicism calls original sin and Calvinism innate depravity. In Judaism it is, well, “Jewish guilt,” typified by Kafka’s famous saying that “Guilt is never to be doubted”—even though Kafka’s hapless characters do not know exactly what they are guilty of. In this they resemble Job, tormented for unknown reasons as if he had sinned, though he had not.
The effect of such a view is to universalize a pattern that resembles inattentive ADHD. You can never have an easy conscience. If you do, you are guilty of pride. You must always doubt yourself, always feel that, if you are not conscious of having done wrong, you have forgotten something, overlooked something, not examined your conscience deeply enough. Modern culture has given attention to the theme of external paranoia, the idea that, as in The Matrix, for instance, the outer world may be an illusion. But there is an internal paranoia of doubting one’s motives and ultimately one’s very identity. Examination of conscience is terrifying, because it has no end. Even good actions or feelings are probably lies masking selfish intentions. You never dare think you are innocent. This is not the same as guilt: guilt is for something you’ve actually done. It is, rather, a pervasive self-doubt. In Catholicism, even to think of a sin is to commit it—whoever looks upon a woman with lust is guilty of adultery, Jesus said. Such an attitude may intensify to the point of despair, as it did with Luther, who had to invent the doctrine of “justification by faith alone” to rescue himself from the feeling that he could never fulfill the demands of God’s law and had to throw himself on God’s mercy. Calvinism stresses Paul’s notion of predestination: some of us are simply damned by being denied God’s grace. We cannot even repent, because the grace that enables repentance has been denied. Much post-structuralist and post-modernist literary and cultural theory seems to me a secularization of the feeling that guilt is never to be doubted.
This crippling feeling of self-loathing is what self-help is designed to remedy, stressing affirmations and self-esteem: “I am good enough.” The problem with self-help formulas is that they are weak and ineffectual compared to what they are up against. In his book on romance, The Secular Scripture, Frye speaks of “an involuntarily acquired self-knowledge that is more terrible than death itself. The symbol of this hostile knowledge is generally some form of scales or balance, the emblem of the law” (82). He goes on to say that “The only companion that accompanies us to the end of the descent is the demonic accuser, who takes the form of the accusing memory. The memory is demonic here because it has forgotten only one thing, the original identity of what it accompanies. It conveys to us the darkest knowledge at the bottom of the world, the vision of the absurd” (83). Memory as this kind of accusing paranoia is a definition of hell, in which people are locked into their sins, endlessly reliving them in a kind of spiritual PTSD, as Dante shows in the Inferno.
Shakespeare is adamant that our humanity depends on rejecting the whole complex of guilt and accusation, whether it is accusation of ourselves or of others. When King Lear tells Cordelia that she has just cause to hate him, she cries, “No cause, no cause.” Not because he is not right but because blaming solves nothing: it only twists the knife deeper. The happy ending of the comedies and romances often depends on forgiveness, although “forgiveness” is a misleading word. “I forgive you” may mean that you are playing the blame game and nobly allowing someone to get away with something. It can be a power play, putting yourself in a superior position. But that is part of the same toxic mindset. My own alternative catchphrase is “Solve the problem.” If I have hurt you or failed you, or you have hurt me or failed me, it only makes it worse to try to decide, like Olympic judges using a point system, who is guilty and to what exact degree. If we are hurting each other, we can only remedy it by figuring out the problem and solving it. Accepting responsibility is not the same as accepting guilt.
In The Secular Scripture, Frye summarizes what he calls “a great trumpet call from a very different imaginative world, the Hymn of the Soul in the Acts of Thomas,” a Gnostic text (103). In that wonderful parable, the Soul is sent by his father to Egypt to find a pearl guarded by a serpent. But he is tempted to eat the food of the lower world, which causes him to forget not just his mission but his very identity. However, his parents send him a “letter” that restores his memory. What we have most deeply forgotten is not the unconsciously guilty self of which the Accuser is always reminding us. That is not the real self. Yet there is a real self, and we may know it, although, like Maisie, we may not know that we know it. The letter stands for the creative power that reminds us, the imagination as a wake-up call. Frye says that “It seems that one becomes the ultimate hero of the great quest of man, not so much by virtue of what one does, as by virtue of what and how one reads” (104).
T.S. Eliot’s beautiful poem “Marina” dramatizes the moment in which we recover from our amnesia, remember who we really are, and look upon all that we thought we had lost. It is spoken by the title character of Shakespeare’s late romance Pericles, who recovers his lost daughter Marina. The poem has an epigraph chosen as a complete contrast, from Seneca’s Hercules Furens: “What place is this? What region, what quarter of the world?” These are the words of Hercules as he recovers from his madness and discovers the horror of the murders he has committed but does not yet remember. Eliot’s Pericles is also remembering, but the remembrance moves in the opposite direction:
Bowsprit cracked with ice and paint cracked with heat.
I made this, I have forgotten
And remember.
The rigging weak and the canvas rotten
Between one June and another September.
Made this unknowing, half conscious, unknown, my own.
The garboard strake leaks, the seams need caulking.
This form, this face, this life
Living to live in a world of time beyond me; let me
Resign my life for this life, my speech for that unspoken,
The awakened, lips parted, the hope, the new ships.
It seems to me that I have quoted this, from one of my favorite poems, in another newsletter, but, typically, I can’t remember. No matter. Perhaps we all could use the reminder that there is an amnesia deeper than any inattentive ADHD, deeper even than any cultural inattentive ADHD. In Plato’s Meno, Socrates claims there is an innate knowledge, never learned, that can be recovered through recollection, surfacing from the depths. Every creative act, of intellect, art, or love, reminds us of who we most deeply are, a self—or, in Jung’s terms, a Self—that is at once individual and collective, transcendent and commonplace, so completely lost that it cannot even be called an illusion and yet here and now. We have always known this Self. It’s just a matter of coming to know that we know it.
References
Frye, Northrop. “Henry James and the Comedy of the Occult.” In Northrop Frye on Twentieth-Century Literature. Edited by Glen Robert Gill. Volume 29 of The Collected Works of Northrop Frye. University of Toronto Press, 2010. 350-70. Also in The Eternal Act of Creation: Essays, 1979-1990. Edited by Robert D. Denham. Indiana University Press, 1993. 109-29.
Frye, Northrop. The Secular Scripture: A Study of the Structure of Romance. In ‘The Secular Scripture’ and Other Writings on Critical Theory, 1976-1991. Edited by Joseph Adamson and Jean Wilson. Volume 18 of The Collected Works of Northrop Frye. University of Toronto Press, 2006. 3-124. Also published by Harvard University Press, 1976.
James, Henry. What Maisie Knew. Edited by Paul Theroux. Penguin, 1985. Originally published 1897.
Wald, Elijah. Jelly Roll Blues: Censored Songs and Hidden Histories. Hachette, 2024.