Jack and Mr. Rogers vs. Widening Gyre of World History


When asked about an event from long ago in her life, my grandma sometimes shakes her head. “I can’t even remember what I had for breakfast this morning,” she’ll say (proverbially, of course, because it is a well-documented fact that the lady loves cinnamon toast). Many of her stories are lost to us now. For her, history is difficult to conceptualize. Along with most of my older relatives, she assures me that my own memory will only grow worse. And it’s true: my own sense of time actively decays, as evidenced by all of my lost pencils, my surprised delight at family photo albums, and my inability to exactly retrace the routes to my old haunts in the cities where I’ve lived.

And those are just my own experiences. Learned history is so much more difficult. I loved my 12-credit-hour, 2-semester-long Western Civilization course in my undergrad, which emphasized history and literature from the Romans to the present day. (Nerd alert.) Of course, it’s impossible to fit so grandiose a subject as “Western Civilization” into a single course, regardless of copious credit hours; it’s impossible to fit it into a single lifetime. Instead, we focused on overarching themes, influential philosophies, and traceable patterns based on events and literature considered important. The pattern that emerged for me, and for many others, wasn’t exactly optimistic. The sometimes romantic, often nationalistic, and always mystical William Butler Yeats helps me express some kind of philosophy of time in “The Second Coming,” where he famously writes,

Turning and turning in the widening gyre

The falcon cannot hear the falconer;

Things fall apart; the centre cannot hold;

Mere anarchy is loosed upon the world

. . . .

And what rough beast, its hour come round at last,

Slouches towards Bethlehem to be born?

History, to post-war Yeats, is a recurring spiral. But that spiral widens like the falcon’s flight, furthering itself from order, from the center, from the thing it is. History (not to mention the cosmos) moves toward disintegration rather than order. It does repeat itself, but in ways more mysteriously horrifying than what has come before: a rough beast instead of a holy infant in Bethlehem.

This vast image out of Spiritus Mundi was handed down to me from a very certain time period after a very certain war. This, I learned, was the mood of the twentieth century, creating the Lost Generation (and spiraling into endless other lost generations), strange artistic movements, and a general loss of faith. And who was I to blame my predecessors for the ennui, the alcoholism, all of the broken coping after World War I? Wilfred Owen assures me that if I could see for myself gas in the trenches, “My friend, you would not tell with such high zest, / To children ardent for some desperate glory, / The old Lie: Dulce et decorum est / Pro patria mori.” Cause and effect. WWI and modernism. Inevitable. It all fits nicely in the 16-leaf blue book I wrote my exam in.

This simple, entropic vision–my helplessness in Yeats’ gyre–is what I believed as I walked out of my Western Civ final. It was hard not to, there in the wake of other rocking classes, in the midst of important people leaving my life, and with a long summer of serving coffee ahead of me. I read books, when I could, with new knowledge of my place in a world I had just discovered. I quietly observed where writers wrote as expected, staying true to their place in the gyre and perpetuating the falling apart of all things. History grew darker, and predictably so. I had no right to feel happy—especially not when insignificantly good things happened—because happiness had no place. It doesn’t fit in the gyre.

Of course, that isn’t everyone’s experience with the very same material: another reason why history is difficult to conceptualize. Just opening Facebook shows me how different people interpret the same stories, arriving at opposing conclusions. Political discourse, academic essays, and even just conversing with others reinforce the same. And I like to think that courses like Western Civilization don’t simply exist to crush overly sensitive wanna-be intellectual types. There’s something to be said for the knowing of things, and not all of us walk away with quite the same feeling.

One day in the noisy heat of that summer, I picked up C.S. Lewis’ Surprised by Joy from where I’d left off after one distracted Christmas break. In the spiritual autobiography, Lewis traces the story of his conversion. For him–and probably for all of us, if we can see it–that narrative begins in childhood (probably before) and bleeds over into his adult life. So, begrudgingly, he dedicates a chapter to his own experiences in the Great War: the very same one that I had written about whirling like the lost falcon. And while Lewis’ is not the grittily realistic account I anticipated (it actually rings of repression), it stunned me. He, too, had holed up in trenches and lost a generation of peers. He had also seen the unimaginable. Participated in it, even.

But somehow there was goodness. Something good had come out of the Great War and out of the gyre. True, his works are hardly the zeitgeist. But even if it was just Jack Lewis with his myth of a Lion that had carried me through childhood, there was something. And so, I wondered: if one good thing could come out of an event so systemically and deeply horrific, could two good things, or even three good things, come out of the gyre too?

The summer before preschool began, my grandma babysat me in a hedge-enfolded blue cottage at the end of a cul-de-sac. Neither of us recalls that time very well anymore, but I do remember sitting at a tiny table with a glass of chocolate milk. As Grandma cut the crust off my peanut butter sandwiches in the kitchen, I was transported to the Land of Make Believe. Admittedly, the puppet Lady Elaine Fairchilde did scare me, but despite my fears, Mister Rogers’ Neighborhood was there in that cottage, too. Like it had been for my mom as she pretended that the quiet, cardiganed host with his kind attention was her father. Like, I imagine, it was for many of us.

Fred Rogers, emerging from the context that my Western Civ course told me was full of mistrust of authority, fear of the Other, wartime anxiety, mass media frenzies, and depleting morals, is another anomaly in the gyre. Instead of angry protests, he kindly requested funding from Congress for public television. Instead of inciting fear, he showed children how to learn from the stranger. Instead of telling us how not to feel, he taught us the power of self-control. Instead of buying into the commodification of entertainment–the way television began to strip individuality–he saw an opportunity to humanize, to educate, to express.

How Mr. Rogers could exist in the widening trajectory of civilization is, I think, more than mystical.

Jack Lewis and Fred Rogers, like I said before, are hardly the norm. They are marginally known by the masses, celebrated passionately by tiny pockets of society, and not studied nearly as much as they should be. And why should they be, when they represent the opposite of our great historical narratives?

But it’s anomalies like these that convince me that my historical constructs aren’t necessarily real. Not that these figures have no faults (I’m well aware of Lewis’ misogyny–what some, incidentally, dismiss as a product of his times), but they are without some of the common faults of those we hold up as representations of an entire era. Or at least those vices are less evident. Or at least there was some act of grace along the way.

These ought to be remembered as we attempt to contextualize, to educate, to encapsulate. Because, I pray, if the gyre is true, there will at least always be anomalies that point toward the forgotten center. Because whatever my own construct of history, whatever way I make meaning out of the chaos, the Logos is truer and realer and creates more goodness than my tendencies toward oversimplification care to notice.

Said Fred Rogers himself,

A high school student wrote to ask, “What was the greatest event in American history?” I can’t say. However, I suspect that like so many “great” events, it was something very simple and very quiet with little or no fanfare (such as someone forgiving someone else for a deep hurt that eventually changed the course of history). The really important “great” things are never center stage of life’s dramas; they’re always “in the wings.” That’s why it’s so essential for us to be mindful of the humble and the deep rather than the flashy and the superficial.

Humble. Deep. This is the way anomalies appear: a veteran who envisions a faun in the woods, a seminary student fascinated by television, a grandma who doesn’t remember her kindnesses, a child born in Bethlehem; I pray, too, a kid who read Narnia, watched Mister Rogers, and cried in her Western Civ class.


But really, what does it mean to speculate?

It is funny to me how our culture feels about the noun “speculation” and the verb “to speculate.” They share roots with words like “spectacles,” “spectacle,” “spectator,” and “spectacular,” all derived from the concept of sight, of seeing something. But if “speculation” derives from sight, and we as a culture like to talk about and uphold sight, then how did it come to mean something ungrounded–essentially something you couldn’t see? I am fairly sure this comes from modern reactions to some two thousand years of philosophical and theological history, in which much of what was practiced was called “speculation” in a positive sense–basically “sight.” True, it referred to the sight of the mind’s eye, in many ways. But we use and appeal to that same kind of sight regularly, at least in popular America. Much of American culture is made up of speculations–sights justified because they are the sights of the mind’s eye. We make mistakes about these sights, but such sight is still how we justify many beliefs and actions. Even scientific thought is a history of mistakes, hopefully being corrected, but still a history of mistakes–and so is philosophical history.

The elite, and some people who pride themselves on their scientific education, will probably object to this precisely on the grounds that only the scientific is reliable. But the grounds of the scientific method are themselves non-scientific. In fact, they are speculation. How do we know that nature obeys laws and will continue faithfully to do so? “Sight” of the speculative kind is what justifies such belief–a specific speculative kind which also happens to be rather unusual, since most of the billions of people on the planet do not share that exact speculation. This matters because people who talk about science in America commonly assume that the bases of science are self-evident–but if they are self-evident, why did we not come to them sooner? And why don’t most people hold to them today? In fact, they are speculation. They may be good speculation–a lot of medieval speculation was good speculation. But they are still speculations, to be held up or knocked down on those grounds, not as self-evident authorities. The same is true of the belief that we can consistently and coherently observe, theorize, and experiment in the natural world: this essential belief of the modern scientific method is speculation that has not been held universally, or even by the majority, in history or the present, and it is not a self-evident authority. It stands or falls on speculative grounds–on the sight of the mind’s eye, and our reasoning about such sight.

Trying (Not) to be Jesus for Others: A Remarkable Thing to Consider

‘Altenteil’ by Jaroslav Špillar (1904), Galerie Špillar, Domažlice

Like many of the Christian NF-types I know around my age, I (finally) picked up Marilynne Robinson’s Gilead. I’d heard a lot of high praise for this epistolary novel of a dying father writing to his young son. And the more I read (and reread) the final pages, the more I realized I was not disappointed by the hype. We could devote an entire series of blog posts to Robinson’s glowing coal of a novel—shining forth, Hopkinsesque, with the grandeur of God—and an undergrad course I took actually intended to spend three or four class periods on it. Sadly, that class ran out of time by the end of the year; I suppose I don’t have the time to blog all of my thoughts about it, either.

In those last pages, the ones I relish rereading, John Ames imagines his son as an old man. He cherishes the thought of his son filled with wisdom and experiences and evidence of having loved creation. He blesses the body parts that will trouble him in forty or fifty years—the very same parts that trouble Ames as he imparts his novel-length goodbye. “I wish I could help you carry the weight of many years,” he writes. “But the Lord will have that fatherly satisfaction.” The gap after this sentence in the book, marking the end of the section of text, literally underscores Ames’ impending absence.

Even though this is a passing thought (most of Gilead is, actually), I can’t help but find it a “remarkable thing to consider,” to use Ames’ oft-repeated, almost liturgical, phrase.

I’ve lived most of my life in a way that encouraged dependence. Maybe it stemmed from sibling interactions or loneliness or grade school expectations or pride. Maybe it’s just human depravity. I remember realizing in middle school that while being friends with popular kids was a lost cause, I could easily get other (obviously) lonely people to like me. It became a game in high school. A mode of survival in college. It bled into friendships where I could tell myself I was being loving: listening (a lot), offering advice, giving gifts. And it seemed to start with love.

Soon, the object was not so much loving another person—which involves, of course, another person. The object was myself. Seemingly loving behaviors became vehicles for my feeling affirmed, feeling wanted, feeling needed. None of those feelings are inherently bad, of course. But if, like I did, I preyed on other people’s loneliness for the sake of my own gratification, then I really couldn’t call that love anymore.

And the worst thing for those who squandered their dependence on me: I, like every other human, am imperfect (in case the previous paragraphs of my predatory predilection didn’t already make the point). Admitting that sets my teeth on edge almost as much as admitting my mercenary desire to be affirmed. I, even the wizened and bright and compassionate and obviously-humble individual that I am, will let down every single person I happen to come in contact with. I will disappoint. I will say the wrong thing. I will hurt.

I will not be able to provide for their every need. Even on my best day, with the best intentions and all of my resources available, I will be insufficient for even one single human soul.

If only this realization about my own sick self was just one of those cool metaphorical dream sequences in a movie. If only I woke up from it, whistling, and hastily changed my ways. But I think the realest lessons have to come from the realest part of our experience: the part where our hands are lacerated by even the dirt we fall down in, and we can’t look away from the fact that it’s our own damned fault. Over the course of a couple of months, I saw that I had let people down not just by my absences and mistakes—the ways that were immediately apparent—but because I simply and ravenously encouraged dependence on me.

The dying John Ames rightly admits that he can’t bear the burden of years for his son. I think he would have realized that even if he had lived to see his son become an old man himself: he writes only of helping carry, not carrying for him. And so he wisely does not express a desire to somehow absorb his son’s struggles, even if it would make Ames feel the essentiality that I crave. And how can he, when he will be gone?

He writes instead of God having “that fatherly satisfaction.” And in two sentences, he does what has been so difficult for me: he entrusts someone he cannot help to the care of someone who can. Someone transcendent and sufficient. Someone who provides the only rest for hearts that fundamentally seek an object upon which to heap their dependence. Someone who gives meaning to the term “fatherly” and, in fact, “satisfaction.” Someone perfect.

Along with Ames, I must also confess that I can’t bear the weight of many years. I can’t do it for myself, much less another human. I can’t fill those deepest wounds because I’m not meant to be Jesus. (And, sweet freedom, I truly thank God for that.) But because I am meant to be like him, I am not excused from caring for others. The way I imitate him is not by pointing dependent hearts in the direction of myself (the listening, sage, meek, and deeply predatory self). It is by turning my attention, and others’, toward the only Person who deserves the fatherly satisfaction of holding all of the aches in our aging bones and aging souls.

A remarkable thing to consider, indeed.

On Lemuel Haynes and ‘Late-Night Talk-Show Liturgies’

‘The Lonely Ones’ by Edvard Munch (1899)

I’m not sure America’s that much less religious than it used to be. Television is more common than it was half a century ago, and we shouldn’t underestimate the degree to which religion can function as entertainment. ‘Nominal’ religion, of course, is certainly evaporating: there are fewer ‘religious in name only’ folks these days. Our ‘liturgy fix’ is easily attainable in our media-saturated culture, and many who might have sought to quiet the pangs of utter hopelessness in a Church building a few times a week a century ago can self-medicate more economically by tuning in to The Late Show.

In the absence of other accessible forms of entertainment, previous generations were more apt to attend seasonal revivals, involve themselves in religious activities, and so on. I’m talking about the way in which things like television and other now-accessible forms of cheap entertainment have come to fill a gap that religion previously filled for large subsets of the population: the desire to be entertained.

In centuries past, when Christianity – usually, some form of Protestant Christianity – was simply assumed to be true by most of the public, and there were fewer forms of widely accessible entertainment available as competition, involving oneself in Christian activities (such as bible studies, revivals, etc.) was one of the most common ways to pass the time.

This could give the sense, then, that America was more devout than it really was. By circumstance, we were ‘more’ religious, but the roots weren’t particularly deep. Now that there are numerous alternatives by which we can entertain ourselves, religion’s hegemony in American life has predictably waned.

My question, then, is to what extent we can truly say that America is substantively less religious than it was, when previously the truth of the Christian story was merely assumed, rather than embraced, and the function of religion for a sizable portion of the population was chiefly its ability to fill a certain universal human need – that is, to pass the time in the absence of accessible competition, like television. If my suspicions are true, America is as ‘religious’ as ever; we’ve simply reallocated our devotions.

I’m not being cynical. I certainly don’t think that hope is lost. None of this means that the gospel will no longer take root on American soil, or that the Great Commission is somehow out of reach. It simply means that we no longer have some of the crutches that our great-grandparents’ generation had.

But, as Lemuel Haynes was always quick to remind his cohorts, everything – everything – that happens, happens in order to further along the Triune God’s eternal plan to reconcile the world to Himself. If we face unprecedented challenges because there are infinitely more opiates from which the average person can choose, then so be it. As Lemuel would also have pointed out, true religion – at least, the religion of Jesus – is a balm for the wounded but is hardly an opiate for the bourgeoisie. If it is anything, it is a merciful buzzkill, jostling us awake and demanding our submission to the strange moral vision of the gospel of grace.

So it may be a good thing that television has taken away our hegemony. Bad for numbers – at least for now – but good for the world. To paraphrase David Bentley Hart, the Church has only ever been half-Christian (at best) when we’ve run the world. And to paraphrase Rod Dreher, we’re well on our way to becoming a minority religion in the West.

To paraphrase Anthony Bradley, though, the majority of Christians in the United States have never actually known what it is like to be the ruling class. The historic Black church has always been a marginalized Christian group, from the colonial days to the Civil Rights movement to the modern era, in which culturally dominant Evangelical institutions send missionaries into Black neighborhoods to plant their own churches without involvement from the long-standing, theologically conservative Black churches that have been there for decades (or centuries). And that’s only one example.

We’ll survive like they’ve survived, and in season we’ll multiply like they’ve multiplied. The predominantly white churches that made up what was the most culturally influential player in America’s religious landscape will soon enough occupy a position similar to that of the ethnic churches that past generations explicitly marginalized and current generations largely ignore. We can learn from their historic witness. And, perhaps, in joining them on society’s lower rungs, we can learn to identify, explicitly and implicitly, with them as one Body, transcending our cultural divides without erasing them – and in doing so, become a common Church in America that begins again to turn the world upside down, even in the television age.

Religious Liberty, the Johnson Amendment, and the Imperial Cult


A statue of Emperor Domitian (AD 51 – 96)

The Johnson Amendment, which prohibits non-profits (especially religious groups) from officially endorsing political candidates, is apparently terminal. I’m not sure who, exactly, was asking for it – save for a few folks who miss the golden age of God and country, when the broad majority of Americans at least paid lip-service to the God of the Bible.

Religious liberties are important. So much so, in fact, that Hillary Clinton lost nearly the entire Evangelical vote by (at least) implying that certain convictions – regarding sexuality, and more – most of which are fairly mainstream among religious conservatives, should be declared anathema by the federal government. “Religious beliefs have to be changed,” she said. I don’t know what she meant, but everyone knows what it sounded like.

Which makes it interesting that anyone, anywhere is applauding the striking of the Johnson Amendment. To permit religious institutions to officially endorse political candidates is to collapse the distance between Church and State – that much is hard to dispute. As a Baptist, that troubles me – because chipping away at the separation of Church and State helps the State domesticate the Church far more than it helps the Church influence the State.

Look, for example, at what’s happening in Russia: Russian Orthodox churches, now sanctioned by the government, are increasingly becoming part of the State’s propaganda arm.

Looking further back: the Protestant Reformation was always at its worst when one branch set about establishing Protestant municipalities, where the political aims of the State would largely govern the gospel proclamation within its borders. Consider Calvin’s bizarro obfuscations regarding the connection between citizenship in Geneva and citizenship in the Kingdom, for example.

Likewise: what good came from Byzantium is largely overshadowed by the bad it created in the long run – the extremities that eventually necessitated the break from Catholicism began with good intent under Constantine and mutated, gradually, as the political necessities of the ‘Empire’ (among other things) helped to shape and distort the gospel witness of the Church through the ages.

The story never changes: you can’t Christianize a country, but you can co-opt a Church. Well nigh absolute religious liberty is the only defense any religion, Christianity included, has against being co-opted by Caesar.

Haggai is one of my favorite books, and his first sermon essentially runs: “You have become enculturated by the comforts afforded to you under a government that supports you. I’ll be taking them away now. It’s for your own good.” So we could lose religious liberties at the drop of a hat, and it’d be for our good.

But, as a policy, religious liberty for all is ideal. Jefferson understood what many today don’t – there are only two options: indiscriminate religious liberty, and theocracy. To establish religious liberty, there has to be an insurmountable wall between the Church (and Mosque, and synagogue, etc.) and the State.

In that scenario, First Baptist Church Shawnee, Emmanuel Synagogue, Grand Mosque of Oklahoma City, etc. can all work among themselves for the good of their cities (i.e. Jer. 29:7, “seek prosperity for Babylon”), but cannot be regulated by Nebuchadnezzar.

State sanctioned religion, after all, is never the religion it purports to be. State sanctioned Christianity, history has shown, is never the religion of Jesus, and so on: it always amounts, instead, to a sacralizing of whatever the State already values; State mandated religion is Jeroboam’s golden calf.

The obvious exception, of course, is the Nation of Israel throughout the OT, which, at least in theory, was a monarchy with Yahweh as its king. But this is America, today. And the slow erosion of Church-State separation is the way religious liberty dies. I’m no alarmist, of course, but this is nothing to be excited about.

Surprisingly Enough, Jesus Is What God Is Like (Happy Easter)

The image above shows the high and holy King washing the feet of His rowdy disciples. It ought to be surprising that this is what God is like.

You’ve probably noticed that Jesus as we meet Him in the four gospels is not just an amplification of whatever we already thought was right. ‘Good’ as exemplified by Christ Himself was not, it turns out, just common sense baptized in Godness. Counterintuitively, He was something else entirely – something which often grates against what we call conventional wisdom. As a man, Jesus began the divine project of turning the world upside down. As the Creator of the universe, He began the project of bringing His creation into conformity with Himself. As both God and man, He bore the weight of humanity’s sin, and His own wrath against it. Two thousand years later, we’re familiar enough with the story to miss the point.

There was no reason to expect Good Friday. No reason to anticipate Holy Saturday, Christ’s descent into Hell. And no reason to count on Resurrection Sunday. And yet they happened.

These things are beyond the parameters of human creativity. Folks who imply a parallel between Jesus and the old myths of ‘dying-and-rising-Gods’ miss the point by a few degrees, like folks who can’t see much more than a ‘tribal deity’ in the Old Testament’s image of the God who sprung Israel out of Egyptian slavery. Whatever peripheral similarities exist are eclipsed by the sheer insanity of this God character who defies expectation.

There was no reason to expect the Creator of the universe to wash the feet of His disciples. There was no reason to expect the God who commanded the slaughter of the Canaanites to preach the Sermon on the Mount. There was no reason to assume that the God who vanquished Pharaoh would submit to a cross to redeem the crowds of people screaming for His blood. 

But on this side of His resurrection, it makes perfect sense. All this time we’d read our own motivations into the Yahweh character. We’d given Him our own psychology, assumed our own values in His every move. But God is different than we thought we knew. Everything He’d ever done, somehow, is part of the project of redeeming the world. 

The hasty meet Jesus in the gospels and assume that the Old Testament got God wrong. But the truth is that the Jesus we encounter there shows that we’d read the Old Testament wrong. God has always been like Jesus. The Father, Son, and Spirit has always been, as John writes, love. And now we know, on this side of the Resurrection. 

There’s a Place At the Table for the “Faithless”

Originally published to Armchair Theologian.

I am not God. That’s an ultimate reality – but only because it contains so much in so few words. “I am not God” means that God exists, almost certainly. I could reason my way into an explanation for the universe that does not require a creator, sure, but that isn’t the point. If something like the Classical Theism we half-read about in our Western Civ classes is true, then God is present, in a way, in the statements “God does not exist” and “There is no God.” I happen to think that’s true, and so you can commune with God, live with Him, and love Him with the best of yourself even if you struggle and even fail to believe with your head and your gut that He is real.

For all of its strange incarnations, Christianity recognizes the contradictions in man. And God, if He’s this God, promises rest to people who need it, so He invites one and all to His table to join Him in His rest. He knows the contradictions in man, and it is right for the one who struggles and fails to accept even His existence – who cannot with her conscious mental faculties sign off on the proposition “there is a God” – to join in worship, in the ordinances, and in the whole life of the Church community, because despite her limitations God has accepted her.

Her place at the table is not a special place amongst other specially marked places for those whose faith is, like hers, intentional rather than intellectual. The place marked out for the one who cannot believe in God but will not let go of Christ is among all of her brothers and sisters in the faith. It is wrong to say that she has no faith. They have the same faith, and sit at the same table in fellowship with the same God whose real presence makes them one people.

That the God incarnate on earth in Jesus of Nazareth actually exists, not only in the faith of His people but in Himself, means that the sort of belief that saves is not actually an intellectual assent to the right propositions. If anything, that would constitute a “good work” that would put certain people in God’s favor by way of an arbitrary advantage. Instead, saving belief is to entrust yourself to God through Jesus Christ, even in the midst of serious limitations in your capacity to believe.

“I am not God,” means God is distinct from me. That is “good news of great joy.” He has His own being apart from me. He is not ultimately a projection of my unconscious emotional need to believe in a higher power. It is true that “in Him we live and move and have our being,” as St. Paul quoted Epimenides to the crowd of Greek philosophers in Athens, but that is not the end of the story. We exist in Him but we are not a part of Him. The quest for God is not a journey inward.

The Christian religion teaches that history is a roundabout retelling of God’s journey toward us. He has left His place to meet us, not within ourselves, but as Himself and on His own terms. It is no kind of grace if we seek God and the road brings us back round to ourselves. For the woman who needs desperately the rest of God, it is a nightmare to find that she has nowhere to run. If God is an extension of herself, then there is no help coming. She has only “the power of positive thinking” to hope in. The rest of God is an illusion.

If, however, God is His own person, then the rest of God in which He invites us to join Him is also real. He has the authority to offer it to whomever He wants. The struggle, the anxiety of existence can be overcome, and the hope to do so is real. The world is inhabited by tired people, and if God is real, and I am not Him, then He can offer rest to my tired eyes, and yours, even amidst our utter lack of faith.

The Monsters That Remind Us: Empathy, Civil Society, and the English Major

As the sister to two older brothers, I grew up watching a lot of monster movies. My brothers especially loved the old Godzilla films. And when Godzilla 2000 came out, of course my family packed up and went to see one of the first showings. Never mind that I was four years old. Godzilla 2000 has to be one of my earliest childhood memories, which consists largely of closing my eyes in the embrace of my mom and dad. I still have weird dreams about jellyfish aliens; sometimes I swear I can hear the echoing of Godzilla’s roar in the distance. It wasn’t until I watched monster movies later that I came to realize that they have valuable lessons to teach us, even aside from how to hide from three-headed dragons and radioactive moths.

Weird as it sounds, I think that empathy is one of the lessons here. And I don’t just mean for all of the parents who had to comfort frightened children in movie theaters across America.

There are, of course, myriad definitions, as well as subcategories, of what empathy means. It originally came into English from the Greek empatheia, meaning “to feel into.” Aesthetic theorists first used the term to describe “the ability to perceive the subjective experience of another person.” The term was later used in psychology by E.B. Titchener, who said that empathy stems from one’s own physical imitation of the pain that another person feels. Empathy, then, is deeper than sympathy in that it allows us to feel—or at least attempt to feel—the pain of another person. In a helpful and concise definition, autism researcher Simon Baron-Cohen defines empathy as “the drive to identify another person’s emotions and thoughts, and to respond to them with an appropriate emotion.” So this practice is two-fold: identifying the processes of the Other, and then responding appropriately.

Stephen G. Post and Ann Jurecic add extra layers to this concept of empathy. Post, first of all, calls empathy a “force” that is on a sort of spectrum. The feeling of empathy is well and good, but this is a weaker manifestation of the force. For empathy to be “strong,” it must motivate us to do something. Inherent to this strong empathy is the “reliable affirmation of the other [that] requires a conceptual act of valuation—that all human lives have equal worth.” Empathy requires the acknowledgment that my life has the same worth as yours. And when I affirm this, I am even more motivated to “identify another person’s emotions and thoughts, and to respond to them with an appropriate emotion,” as Baron-Cohen says.

Empathy is not always welcome or even helpful. Jurecic writes, “Empathy is not salvation; it’s not certainty or knowledge; it blurs the boundaries in ways that can be both generative and destructive. In the end, empathy is a practice, a process that extends in time. To make it work takes both effort and humility.” Jurecic’s definition of empathy as a practice aligns so far with what the other writers have said. And because empathy depends so much on imagination, it is indeed far from certainty. However, her talk of it as destructive comes from a misapplication of the practice. Sometimes the exercise of empathy causes us to misimagine the Other, to project our own thoughts and emotions onto him instead of accepting him on his own terms. We imagine that we fully understand people when we empathize, when really the process of empathy must be far more open-ended than that: a practice that must be ongoing.

If empathy is an ongoing process that seeks to imagine the thoughts and emotions of the Other and to respond well, then the applications of empathy to civil discourse are hopefully apparent. On a personal level, some have written that one’s degree of empathy corresponds to moral action. Daniel Goleman writes on the research of Martin Hoffman, “who argues that the roots of morality are to be found in empathy, since it is empathizing with the potential victims—someone in pain, danger, or deprivation, say—and so sharing their distress that moves people to act to help them. Beyond this immediate link between empathy and altruism in personal encounter, Hoffman proposes that the same capacity for empathy . . . for putting oneself in another’s place, leads people to follow certain moral principles.”

Hoffman’s research demonstrated a correlation between a person’s capacity for empathy and her support for moral principles like aid for the poor. Studies have not concluded that empathy always motivates people to act morally, but they do highlight empathy’s importance in living in a civil society. Acting morally, yet empathetically, better helps us to extend that hospitality in our thoughts, speaking, and listening. As former president Barack Obama once said, “I view that quality of empathy, of understanding and identifying with people’s hopes and struggles, as an essential ingredient for arriving at just decisions and outcomes.”

On a more societal level, Jeremy Rifkin writes on the “empathic civilization.” In his work, he speaks on the primary human drive to belong, which he calls an “empathic drive.” He says that empathy is grounded in our shared mortality and our flaws. For Rifkin, “When we talk about building an empathic civilization, we’re not just talking about utopia. We’re talking about the ability of human beings to show solidarity not only with each other, but with our fellow creatures who have a one and only life on this little planet.” And when we are able to show that solidarity, Rifkin argues, then we are able to truly have civilization.

To illustrate his point, he uses perspectives on human evolution: we first saw ourselves as part of one tribe, then as part of one religion, then as part of one nation state. Humans typically show empathy in order to relate to each other on these levels, showing solidarity as they associate with one another in terms of blood ties, religious affiliation, and national identity. So, Rifkin asks, why can’t we extend our empathy beyond this? Why can’t we see ourselves as part of one race sharing one biosphere? Surely if such empathy extended this far, we would stop seeing ourselves in terms of our differences. If we began discussions with the acknowledgement of our common humanity, then civil discourse could flourish. Disagreements would undoubtedly arise, as surely as they do between family members, but such strong empathy would still allow us to work together civilly for the common good.

Of course, despite some of our perhaps natural tendencies, empathy doesn’t always come easily. In fact, sometimes it is downright discouraged by the culture we find ourselves in. We are tempted to continue to define ourselves by our merely religious, ethnic, or national ties. During World War II, for example, both Japanese and American cultures encouraged the dehumanization of their respective enemies, largely to make it less difficult to kill one another during wartime. Donald Shriver writes, “In its systematic erosion of tendencies to empathize, racism is a peculiarly vicious enemy of forgiveness in politics or justice in any human relation.” Through this negative example, Shriver directly links empathy with forgiveness and justice. Racism, which refuses to extend empathy past a limited idea of “our own kind,” undermines the ability to practice forgiveness and justice well. And interestingly, the U.S. has never apologized for dropping atomic bombs on Japan, and Japan has never apologized for bombing Pearl Harbor.

Now that we have a working, nuanced definition of empathy—that is, an ongoing process by which we identify and respond actively to another’s thoughts and feelings, a practice fundamentally associated with hospitality, forgiveness, justice, and civil discourse—well, what’s a lowly undergraduate English major to do? How can my discipline help cultivate empathy, as well as civil discourse?

Critics, academics, and writers have often associated reading, especially that of fiction, with empathic readers. Martha Nussbaum, in a defense of the liberal arts education, writes on a particular type of empathy that she calls “narrative imagination,” or “the ability to think what it might be like to be in the shoes of someone different from oneself.” Reading narratives causes us to see from someone else’s point of view, even feeling what they feel. If we read well, we empathize with the protagonist and other characters. As in Jeremy Rifkin’s idea of extending our empathy to the world, some attribute “to storytelling the extension of the ‘moral circle’ to include ‘other clans, other tribes, and other races.’ . . . [B]y allowing our projection ‘into the lives of people of different times and places and races, in a way that wouldn’t spontaneously occur,’ fiction can change our perspectives,” writes Suzanne Keen.

While there are no significant studies to show that reading does actually increase empathy, the possibility exists for us to cultivate an at least “weak” empathy. Even if this practice does not result in action, by reading, we have at least begun the ongoing process of empathy.

Aside from the potential to cultivate empathy, English as a discipline requires writing. A lot of writing. Looking back at my own trajectory in my time as an undergrad, I know that writing has helped me to create orderly thoughts, connections, and arguments. Not only do I see it in myself, but I am also better able to analyze the arguments and connections of others. Admittedly, I don’t always do that well—hopefully this presentation doesn’t demonstrate that fact—but the rhetorical skills of an English major have fostered in me an extra concern for words and the arguments that they can represent.

Empathy, the English major, and civil discourse all seem to work well on paper. Empathy oozes from the cracks of what it means to study the language and literature of the Other. And empathy binds itself up with the other virtues in forming a foundation for civil discourse. But instead of simply focusing on the higher philosophies and musings, Finding Civil Discourse has taught me the value of particulars and exemplars. So for this project, I sat down with Dr. Alan Noble, professor of English, longtime editor-in-chief of the magazine Christ and Pop Culture, and recent founding member, with Michael Wear, of Public Faith. He describes the latter as an organization that’s attempting to reenvision evangelical participation in politics. Instead of the angry, “culture war” rhetoric often employed by the right, Public Faith seeks to promote pluralism and bipartisanship while remaining steadfast on key issues. Dr. Noble wishes to offer an evangelical perspective on political issues, but winsomely. The website publishes key stances in language that invites agreement, with the goal of remaining neighborly toward people who do not always agree. We can think more creatively about living together with the Other by engaging in empathic language rather than the sensationalized, enflamed accounts we often stumble across in the media.

Public Faith has released statements on topics like criminal justice, perspectives informed by empathy. However, Dr. Noble says that the real work of empathy is two-fold: we must advocate for the oppressed without abstracting them, and we must seek to understand those with whom we argue. Advocating while abstracting can hurt communities when we do not stop to consider how they might be hurt by certain policies. A civil society pursues the common good, which can only be achieved when we consider the Other. For those who do not see things in the same way, we must empathize in order to communicate well. Dr. Noble stressed that we don’t necessarily need to win people over to our side through arguments, but we ought to help them understand that it is possible to live as neighbors and work together in pursuit of the common good. By maintaining a stalwart online presence, Public Faith serves as a practical example of civil discourse and empathy.

Empathy is an ongoing process by which we identify and respond actively to another’s thoughts and feelings. It is a practice fundamentally associated with hospitality, forgiveness, justice, and civil discourse. And I say, “practice,” because in order to contribute fully to civil discourse, it must continually be put into action, extending to the whole human race. Studying English may be the starting point for cultivating empathy, but organizations like Public Faith show empathy in practice among a culture of uncivil language and thoughts.

When I look back to that fateful viewing of Godzilla 2000 as a kid, I see a terrifying monster, formidable foes, and unthinkable destruction. But I also see the potential for humanity to recognize each other as humans, to work together despite religious or ethnic ties, to put differences aside for a greater good. While I don’t think that a monster will rise up out of the sea, I do think that the monsters of the present day can unite, not divide, us, if we are properly empathetic.

Music to Make Us Believe the Gospel

Third Eye Blind is the reason I didn’t kill myself in 2010, so how’s that for the power of music? I can still remember sitting in my car, listening to Motorcycle Drive By, and deciding I wanted to live. I wasn’t teetering on the edge, ready to take the plunge, but the thought was there, and it lingered, and it looked good.

It’s a good song. Didn’t change anything about my life, but it took all the reasons – however intangible and impossible to articulate – that I wanted to live, and drug them up out of the depths and onto the beaches of my heart.

I still don’t really know what they were. I don’t know what I thought was going to justify going on another day. But that song – an angsty recollection about a doomed relationship with a girl from New York – sounded like hope. And it still does.

It sounded like resurrection, but that wasn’t even in my vocabulary when I was 16.

I think I can identify that as a turning point in my life, maybe the really big one. If I wrote an autobiography it’d be its own chapter. I sat in a car one evening and started to believe the gospel while a pagan sang about a girl.

That’s how music works. Because that’s how people work. I met Jesus months later, but it started there. I was done with something, even though I wasn’t sure what it was. Eventually my head-strings caught up with my heartstrings and I trusted Jesus in a tangible way. Because I heard a song that sounded like resurrection.

That’s why we believe the gospel more deeply when we encounter God in musical worship. Especially corporate musical worship. Music knits people together around the Conductor.

That’s why our faith doesn’t wither away amidst the onslaught of failure, depression, and hopelessness. God is a disease, and once we’ve caught Him too many songs remind us of Him. God is a disease and music is a dirty syringe.

Thank God.

I’m at a worship gathering as I write this. The crowd around me is singing Be Thou My Vision. High King of Heaven, my Treasure Thou art, they sing. They’re bloody well right, and it’s never more tangible than these moments.

Telling Better Stories


I recently spent a semester studying at the University of Oxford through the SCIO program, where I delivered a TED-esque talk very similar to this. Although previously published via the illustrious medium of Facebook, it seemed worth another share here. Thanks to Will and Ryan for allowing me to (hopefully more than occasionally) hijack the blog. 
I don’t know if you’ve seen it yet, but King Lear at Blackwell’s was an amazing experience. I know a lot of us went last Tuesday. Personally, I couldn’t recommend it more. It was well worth the £17. Not only was it interesting because it’s Shakespeare (duh), but also because there were only five actors. I don’t know how familiar you might be with King Lear, but there are far more than five characters.

Needless to say, there was a lot of actor-sharing. A single actor would use props like scarves, glasses, or jackets to let the audience know which character he was at the time. So for example, two of King Lear’s daughters are played by a single actress, but we can tell the difference by the different scarves she wears. These two characters would often appear in the same scene, so the actress would wear one scarf and hold up the other as if the scarf were the character. She and the others on stage would look at it, talk to it, touch it. We as the audience come to understand that the object is a person—at least for these few hours on this stage in the middle of a bookstore.

But then I got to thinking: how often do we take this experience out of the bookstore? We understand objects as people, people as objects. Look at me. You probably see dark hair, glasses, and a flower skirt. You’ve already come to understand me as a thing, an object. And I’m not blaming anyone—we all do it. You make assumptions about me based on the way I look, the way I talk, the way I present myself. You see my body and determine I’m a woman. You see my glasses and think I’m smart/hipster/edgy/cool/trying too hard. You see my hair and decide I probably should have just put it up today. And let’s not even talk about all the ways you’ve probably already guessed that I’m not a very practiced public speaker.

You’ve done all of this within seconds. I’m a thing to you.

Maybe, if we’ve ever had a conversation, it’s different for you. Maybe you know I’m from Hawaii, or that C.S. Lewis in Context is my primary tutorial, or that sleeping is my favorite pastime. Maybe you’ve seen me eating too many cakes at teatime. You know facts, statistics, snapshots, if you care to look or remember. But these are all still things.

You might never know my story. You’ll never know that my older brother has special needs. And you’ll never know what I went through growing up as I came to realize what this meant. You’ll never know about the cancer my mom had my sophomore year of high school and the depression that came with it. You won’t know the addictions, the late nights, the angsty journal entries that pervade almost every memory of my growing-up years.

We’ll never talk about the ways I’ve overcome these, that I still fight to overcome these. You won’t feel the hugs from my mom, meet the friends who encouraged me, or hear the words they said. You won’t read the same books in the same way I did. You won’t discover the same epiphanies, or participate in the processes that brought me here.

Because to you, I am an object, a thing easily definable and dismissed, a scarf on a stage in a bookshop.

And to me, you are that.

I don’t know your story—maybe I never will. I may never even know what your primary tutorial is or what state you call home.

And as much as I want to know your story, sometimes I have to remind myself that I can’t. Intimately knowing every person here during a single semester is just not possible, as much as we might converse or interact. And as important as our stories are in the past, what about us in the present? The way we process information, the connections we make, the inevitable influence of our memories?

I guess, maybe, we can’t really know each other. Maybe.

But that doesn’t mean we have to be objects to each other. You are no simple scarf in a play that signifies a name, a voice, a shadow of a personality. You are so much more—and so am I, and so are all of us. That’s just true.

It’s going to take a lot of conversation, and a lot of constant imagination to realize that truth; open, empty rooms in our minds where we allow other people to exist in possibility. Maybe I am more than these glasses or this hair or even this talk. Maybe you are more than a haircut or a smile as we pass by. Maybe you’re on your way to a lecture where you’ll sit with a new British friend, awkwardly attempting to talk about the weather before you break out your phone to text your mom—man, you miss her a lot—and tell her about what you cooked for your food group last night. Maybe your life is just as complex as mine.

None of these things probably happened—and that’s okay, because I humbly realize that I’m no psychic. I don’t actually know. But maybe these stories are an exercise for me in imagination, helping me to see you as more than an object. Maybe stories—even just the ones I make up—will make you and me more human.

In the final scene of King Lear, one character cries over the “body” of another. It moved me to tears. But the “body” was just a dress that the dead character had worn, draped over the stage. And this was the moment of greatest humanity: that in the telling of the story, the object had become a person.

Let’s tell better stories about each other.