Surviving Our Humanity

Bird Box, just recently released on Netflix, bears an obvious resemblance to John Krasinski’s A Quiet Place. The latter is a superior movie in almost every way, but that’s not my point. My point is that Bird Box and A Quiet Place are strikingly similar in how they ask the audience to consider how much less human we’re willing to become in order to survive. Each film is a horror-parable about our own humanity’s being weaponized against us.

“A Quiet Place”

In A Quiet Place, apocalyptic monsters have taken over and almost invariably kill whoever and whatever speaks above a whisper. In Bird Box, the same idea is turned to a different sense: Sight. Unseen monsters put whoever glimpses them, even for a second, into a lethal trance that ends in suicide. Thus, the heroes of both tales have to live without a part of their normal human functions: Sandra Bullock and her two children are blindfolded even while boating in rapids, and the family in A Quiet Place verbalizes nothing above ground. Human beings are threatened by the very things that make them human. The monsters are of course the problem, but they are quasi-omnipotent; they’re not going away. The real enemies are sight and speech.

I can’t help but wonder if these stories are connecting with audiences at a spiritual level. Might we think of many of the problems of contemporary life as a felt conflict between human flourishing and human nature? Take consumerism. Consuming is a natural human impulse, yet isn’t there a palpable sense right now that our consuming nature is at odds with our desire for meaning and transcendence? Or consider the setting of A Quiet Place, a world in which it is dangerous to speak. Ours is the age of near endless speech, amplified by mobile technologies that allow us to live intellectual and emotional lives out of our phones. Amazingly, this technology has been most efficiently leveraged to make us depressed, insecure, outraged, distracted, and lonely. Perhaps A Quiet Place resonates as a horror film because its premise is actually true for us right now—our sounds invite the monsters.

A similar idea emerges in Bird Box. I was disappointed that the movie’s screenplay didn’t explore the monsters and their power a bit more. For example, most of the people who see the monsters immediately commit (or try to commit) suicide. But there are a few who, instead of killing themselves, become quasi-evangelists for the monsters. They violently try to force blindfolded survivors into looking, chanting stuff like “It’s beautiful” and “You must see.” What’s the reason for the difference between the suicidal and the possessed? Regrettably, the movie never comes close to saying. It’s fascinating, though, to consider Bird Box’s theme of becoming what we behold through the lens of the monsters’ creating both victims and victimizers. Those who look at the monsters and live only do so because they are actually dead on the inside. They survive the monsters by becoming the monsters. That’s a pretty potent metaphor for the era of “call-out culture” and strongman politics, not to mention the modern shipwrecking of the sexual revolution that is #MeToo.

In both movies, death comes through the body itself, through the senses. This is a provocative way to think about what C.S. Lewis famously dubbed the “abolition of man.” Lewis’s essay warned that the death of binding moral transcendence and the subjugation of nature would not liberate mankind, but merely re-enslave it to itself. “Man’s conquest of Nature turns out,” Lewis wrote, “in the moment of its consummation, to be Nature’s conquest of Man.” This is the world depicted by both A Quiet Place and Bird Box, a world in which nature, especially human nature, has been weaponized against us. In both films people must find ways to live below their own full humanity, because it is the expression of their full humanity that brings violence.

To me, this is a stirring poetic summary of how divided we feel from ourselves in a secular age. The indulgence of our nature in the affluent postwar glow of the latter 20th century failed to slake our thirst for righteousness. Now, slowly awakening from nihilism, we find our own humanity turned against us, especially through technology’s power to shape the mind. To look at modern life, in its pornographic despair, kills the soul, and to speak above a whisper invites the demons of doubt and shame.

It’s interesting to me how both films center on kids. Each story’s drama mostly concerns whether the adults will be able to save their children. Why is this? Perhaps it’s because children are a common literary stand-in for renewal of innocence. But also, perhaps it’s because one of the few motivations left in a world of living beneath one’s humanity is to protect those whom we hope may not have to do so. Perhaps it’s also because such a world inevitably slouches toward new life, one of the final touchstones of grace in a disenchanted world. I sometimes wonder whether protecting children is the closest an unrepentant mind can come to true faith, as if to say, “I cannot become like a child, but I will preserve those who still can.”

 


Have I Sinned Against Unbelief?

Why Christians should take suffering that inflames unbelief far more seriously

While reading a remarkable book titled Christianity: The True Humanism, I was bowled over by this passage by J.I. Packer and Thomas Howard:

It is clear that many humanists in the West are stirred by a sense of outrage at what professed Christians, past and present, have done; and this makes them see their humanism as a kind of crusade, with the killing of Christianity as its prime goal. We cannot endorse their attitude, but we can understand it and respect it…

We, too, have experienced in our own persons damage done by bad Christianity—Christianity that lacks honesty, or intelligence, or regard for truth, or biblical depth, or courtesy, or all of these together. No doubt we have sometimes inflicted this kind of damage, as well as suffered it. (Lord, have mercy!) We cannot, however, think it wrong for anyone to expect much of Christians and then to feel hurt when they treat others in a way that discredits their Christian commitment. Since Christianity is about God transforming us through Jesus Christ, high expectations really are in order, and the credibility of the faith really is undermined by every uncaring and uncompassionate stand that Christians take. Loss of faith caused by bad experiences with Christians is thus often more a case of being sinned against than of sinning and merits compassion more than it does censure.

I instantly realized this was close to the opposite attitude I have had for many years. Instead, I’ve often been so occupied with undermining unbelief, with critiquing the spirit of the age and tearing down the intellectual and existential reasons people give for not following the Christ of the Bible, that I have utterly failed to take seriously the connection between being sinned against and unbelief. If Packer and Howard are right—and I believe they are—this is a major failure.

Why have I been failing here? I can think of two reasons.

First, there is a palpable cultural mood that reduces everything about life to the sum total of one’s experiences. This is the “my story” epistemology that I’ve written about before. Because there are no agreed-upon central, transcendent truth claims in a secularized public square, the most truth that anyone can arrive at is their truth, and their truth often consists of deeply subjective interpretations of relational and social events. This mentality is powerful, and it is destructive; it blinds people to the absolute nature of our most important questions. It empowers confirmation bias. It can make people unteachable and difficult to reason with. It’s bad news.

So I think I’ve been caught up in refuting this mood so much that I’ve lost sight of the legitimate relationship between experience and objective belief. I’ve tried to swing from the one extreme of “experiences are all that matter” to the other extreme of “You should be able to think and live wholly independent of what people do to you.” Both extremes are logically impossible, though one feels more Christian than the other at this cultural moment. But Packer and Howard get to the heart of the matter when they say that unbelievers are right to have high expectations of people who claim to be actually reborn by the Spirit of Jesus. They have those expectations not because of Christians but because of Jesus! Thus, to ignore the failures of people who say they are born again to image the One in whose name they are supposedly reborn is to ignore the moral glory of Christ himself.

The second reason I think I’ve failed here is that I have consistently underestimated the power of suffering. It’s an underestimation that comes straight from my not having suffered very much. But it also, I suspect, comes from my not having listened very closely to the testimonies of people who have suffered much. This is inexcusable, and I’m sure it’s damaged in some way my connection with others.

I’ve said before that virtues like modesty and chastity have attending practices that can help us grow in them. This is how I feel about stuff like the Billy Graham Rule, for example. But I think I’ve neglected the fact that empathy is also a virtue, and that like other virtues, it too has practices that must be picked up if the virtue is going to flourish in my life. What if one of those practices is not arguing all the time? What if another one is listening carefully to people who may not validate my assumptions?

Now here’s an important point. I don’t think the main reason to cultivate empathy is to become less decisive or more “open-minded.” The problem with open-mindedness is that it’s not a virtue. Its desirability depends entirely on what is trying to get into the mind. But empathy is a virtue that cuts across whether people are right or wrong, whether people believe or disbelieve. Rejecting the claims of Christ is wrong. Yet it is possible to compound a wrong by sinning in response to it. It is possible to drive a thorn deeper. Neglecting or minimizing the power of suffering, or lowering the bar of expectations for believers, are both sins against unbelief. To the degree that I have done so, I’m sorry, and by God’s grace, I will grow in this.

One final thought. All of this applies very much to the way we Christians talk to people about the suffering of others. If we minimize trauma or excuse a lackadaisical response to it, for the sake of making some tribal theological or political point about someone not in the room, we are broadcasting a false view of God to the world. We are propping up a graven image in people’s minds. We are, in other words, acting in the same unbelief as those we are trying to convert.

Poverty, Dreher, and Story

Rod Dreher, a writer for whom I have a lot of admiration and respect, nevertheless has a tendency to overstate things, especially when those things pertain to his lived experience. Likewise, he has an unfortunate tendency to assume the worst intentions of people who push back against the conclusions he draws from his experiences. Those two flaws—which I shamefacedly confess to sharing—were on full display in the minor kerfuffle over this post. I won’t recap the mud-throwing, but suffice to say that I think Rod’s critics are right in their substantive critiques (Jemar Tisby’s, in particular), and that this whole episode might have been avoided by pondering for a few minutes longer the wisdom of defending transparently bigoted remarks by a transparently bigoted politician.

But there’s another contour to this thing that’s worth a very brief reflection. Part of what Rod was getting at in his original piece was that political correctness often runs counter to what people actually experience. This is a familiar beat to Rod, and it would be a mistake for people to assume that Rod has a vested emotional interest in punching down on poor people. If I’m reading him correctly, I think what Rod resents is the deliberate turning away from reality in favor of sentiments that play well with people who have no (literal and figurative) skin in the game. I think there’s something to say about that, and in an era of actual “reeducation” by our culture makers, the effects of Rightspeak are worth contemplation.

But I think what I’ve come away sensing is that Rod, and plenty of others, have not given it enough contemplation. Instead, they’ve intuitively normalized their own experience of poor communities and downtrodden cultures into an argument. Rod’s desire to look for truth through experience is further confirmed when one considers the letters he’s publishing as responses, as well as some responses to the responses. I think the best course of action is not only to reconsider tropes and stereotypes about the poor, but also to ask sharp questions of our tendency to equate experiences with an argument.

A lot of people have had a negative encounter with poor people or communities. And many of them choose to reason from their negative encounters to much bigger ideas about the moral quality of those in poverty. The problem with this is that one’s experiences are not worthy of such intellectual power. Yes, our experiences matter, and they can powerfully shape us, body and soul. But it doesn’t take much imagination to see how reasoning from experience is an awfully selective and unfair enterprise. If your only experience of poor Americans is being accosted by panhandlers, you’re likely going to reason from that experience that poor people are poor because they’d rather stand on the side of a highway off-ramp than find a job. Is the problem then that you haven’t had more experiences with poor people? Perhaps! But even if additional, more positive experiences broaden your horizon, continually over-relying on your experiences to inform you about the world will simply manifest itself in some other wrong, prejudiced, or naive way.

We see this everywhere right now. People who experienced judgement in a church might start a blog in which their experience of a relatively small number of people is extrapolated into huge, sweeping ideas about Christianity or the church. People who experience unexpected illness or health might intuit such experiences to big, specious notions about what is healthy and unhealthy (how do you think the essential oils business runs?). The point is not to discount our experiences entirely. We couldn’t do that even if we tried! The point is that piecing together our experiences and coming to a true knowledge of anything requires more than just gathering as many experiential narratives as we can.

The truth about American poverty lies far beyond the possibility of my experience, because it is indelibly rooted in history and ideas. I cannot visit the south side of Chicago for a weekend and come away with authoritative knowledge about poverty or urban policy. Nor can I justly conclude that a friend, associate, or Uber driver’s testimony is warrant enough for me to be dogmatic about an issue. My experiences and the truth are not coterminous.

I’m convinced that if Christians are going to coherently carry their witness to Christ and him crucified into future generations, we have to insist on this fact. I know Rod agrees with me, because I’ve read him long enough to know he does. I hope that he’ll apply this principle as liberally to issues of poverty and race as he does to modernism and confession.

My Year in Books

Let’s get this out of the way: Year-end reading lists are usually more helpful for making us feel guilty about what we didn’t read than for making us thankful for what we did.

My own year of reading was certainly no exception; the pile of books that I read this year seems so small compared to that of others. Yet, I think it’s important to actively fight against this feeling. There is probably a place for reading to have read, but it’s a place that is often far more prominent in my ego than it needs to be. Reading at whim and for pleasure is, all variables being equal, vastly superior to reading to keep up. The former can turn, and often has turned, something in my soul. The latter usually just confirms my preexisting insecurities and arrogances.

With that prologue finished, here are the books I spent the most pleasurable time with this year. This isn’t an exhaustive list of my reading (though I won’t pretend that the exhaustive list would be much bigger), nor is it a definitive breakdown of everything I liked this year. Rather, these are the books that stayed with me the longest after I read them, the books I thought about the most, the books I marinated in the deepest. Most are from 2017, though not all.

 

-Brian Jay Jones, George Lucas: A Life. A compulsively readable biography. While it doesn’t offer quite the psychological insights I hoped for, Lucas’s eclectic, unlikely career is vividly told with lots of fascinating new anecdotes.

-Rod Dreher, The Benedict Option. If you haven’t read the book, you don’t quite know the argument.

-Tom Nichols, The Death of Expertise. An accessible and unpretentious assessment of a major cultural development. An essential read for anyone trying to understand the impact of the internet on how we think. Speaking of which…

-Alan Jacobs, How To Think. One of my most underlined books of the year. I like to think of it as a long essay about the epistemological consequences of social media. I can hardly think of a more timely work.

-John Stott, The Cross of Christ. This was my first foray into a Christian classic. Stott’s defense of penal substitutionary atonement is beautiful—so much so that it’s odd to even call it a “defense.” Of all the nonfiction I read this year, this one drove me to prayer and worship the most.

-Graham Greene, The Heart of the Matter. Greene’s psychological novels dig deep into my soul. This story about a duty-bound English police officer and his crisis of faith and marriage kept me up late into the night. The ending is one of the most spiritually moving pieces of fiction I’ve read.

-Chinua Achebe, Things Fall Apart. An exquisitely written novel about some of the most fundamental human experiences. Aspiring storytellers should know this book.

-Sarah Shin, Beyond Colorblind. This excellent work is a rare thing: An evangelical treatise on race, white privilege, and community that is both thoroughly Christian and unflaggingly level-headed.

-James K.A. Smith, You Are What You Love. Probably the second-best book I read this year. On that note,

-Joe Rigney, The Things of Earth. My #1 read of 2017. I will be re-reading this book regularly. It has given me something for which I’ve longed for a while: A theological perspective on enjoying what God gives, and why doing so doesn’t conflict with enjoying who God is.

The Death of Expertise

Remember that famous scene in Peter Weir’s “Dead Poets Society,” in which professor Keating (played by Robin Williams) orders his English literature students to tear out the introductory pages of their poetry textbook? Those pages, Keating explains, are the soulless pontifications of a scholar trying to explain verse. Nonsense, says Keating. Poetry isn’t what some expert says it is. It’s about “sucking the marrow out of life,” about spontaneous utterances of the subconscious and chasing your dreams and sticking it to your parents and headmaster. Forget the experts, boys; carpe diem!

As a misguided defense of the humanities, “Dead Poets Society” is easy enough to dismiss. The bigger problem is that Keating’s heedless disregard for truth and knowledge is a pretty accurate picture of how many Americans think and live. That’s the contention of Tom Nichols in his new book “The Death of Expertise,” a brief yet persuasive work that laments our generation’s factual free-for-all.

Americans, Nichols believes, are not just subsisting on a low amount of general knowledge. That wouldn’t exactly be a new development. Rather, Nichols is disturbed by the “emergence of a positive hostility” to established, credentialed, and professional knowledge, one that “represents the aggressive replacement of expert views or established knowledge with the insistence that every opinion on any matter is as good as every other.”

According to Nichols, what White House press secretaries might call “alternative facts” have become common cultural currency. If love means never having to say you’re sorry, the Internet means never having to say you’re wrong.

For many people, a first-person blog post is (at least) as authoritative as a peer-reviewed study, and a Facebook link offers truth too hot for professional journalists and fact checkers to handle. This ethos doesn’t just promulgate wrong information, which would be bad enough. Nichols argues that, even worse, it fosters a deep distrust and cynicism toward established knowledge and credentialed communities.

Nichols’s book puts the symptoms of the death of expertise on a spectrum. Some effects are clearly more harmful than others. It’s no revelation that “low-information voters” feel as vehement as ever about a plethora of fictitious things. More worrisome, however, is the growing public comfort with dangerous conspiracy theories. Both of these trends owe much to the “University of Google” (to borrow one celebrity’s self-proclaimed credentials for rejecting vaccinations). With so much access to so much information available to so many people, the web has seriously undermined the responsible dissemination of verified facts and blurred the distinction between truth and talking point. Nichols writes:

The internet lets a billion flowers bloom and most of them stink, including everything from the idle thoughts of random bloggers and the conspiracy theories of cranks all the way to the sophisticated campaigns of disinformation conducted by groups and governments. Some of the information on the Internet is wrong because of sloppiness, some of it is wrong because well-meaning people just don’t know any better, and some of it is wrong because it was put there out of greed or even sheer malice. The medium itself, without comment or editorial intervention, displays it all with equal speed. The internet is a vessel, not a referee.

Nichols doesn’t lay all the blame on the internet. Higher education has contributed to the death of expertise, Nichols writes, both by churning out poor thinkers from its ranks and by defining education itself down to mean little more than payment in exchange for a degree. “When everyone has attended a university,” Nichols observes, “it gets that much more difficult to sort out actual achievement and expertise among all those ‘university graduates.’” Similarly, public trust in professional journalism has been harmed by incompetence on one end and clickbait on the other. All of this, Nichols argues, combines to foster an instinctive cynicism toward expertise and established knowledge. When experts get it wrong, well, of course they did; when they get it right, there’s probably more to the story.

One issue that seems relevant here, and one that Nichols lamentably doesn’t really parse, is the triumph of subjective narrative over objective arguments. Americans have always loved a good story, but what seems unique about our time is the way that story and first person narrative have assumed an authoritative role in culture, often to the contradiction and exclusion of factual debate. Instead of trading in truth claims, many simply trade in anecdotes, and shape their worldview strictly in line with experiences and felt needs.

The privileging of story over knowledge is a glaring feature, for example, of much contemporary religion. While real theological literacy is alarmingly rare, what are far more common are self-actualizing narratives of experience. These authoritative narratives take all kinds of forms—they’re the diaries of the “spiritual but not religious” Oprah denizens, and they’re also the cottage industry of “ex-[insert denomination/church name]” watchdog bloggers. In both cases, when jolting stories about the problems of the religious expert class collide with more established doctrine or practices, the tales triumph.

What’s more, young evangelicals in particular seem more likely to get their theological and spiritual formation outside the purview of pastors, churches, and seminaries (a triad that could be considered representative of a religious “expert” class). Blogs, podcasts, and TED Talks seem to offer many American Christians a spiritual life more attractive than the one lived in institutions like the local church and seminary. Indeed, a casual disregard for formal theological education seems to be a common marker amongst many young, progressive evangelicals, several of whom enjoy publishing platforms and high website traffic despite their near-total lack of supervised training. A Master of Divinity may be nice, but a punchy blog and a healthy Twitter following are even better (you don’t have to think long or hard before you see this dynamic’s potential for heterodoxy).

Perhaps we ought to consider this the “Yelp” effect on American culture. In an economy of endless choices, “user reviews” are often the first and most important resource that many of us consult in making decisions. Putting trust in the aggregated consensus of the crowd is likely more endemic in our daily experiences than we think. It’s how we decide where to have dinner, which car to buy, what insurance company to rely on—and, increasingly, whether or not to inoculate our children, and which interpretation of the New Testament to accept. When the self-reported experiences of our peers are just a couple clicks away, and our first instinct toward expertise and credentialed wisdom is suspicion of bias and elitism, it’s not hard to see how we got here.

So what’s the solution? Unfortunately, Nichols’s book doesn’t offer many answers for the death of expertise. This is somewhat understandable; there are only so many different ways to say “epistemic humility,” after all. There is obvious need for self-awareness, both among laypeople and among the expert class. As Nichols notes, credentialed experts should “stay in their lane,” not risking credibility in order to appear omni-competent. Likewise, readers should acknowledge the inherent value in professional training and the processes of verification and review. While these things do not make expertise infallible, they do make expertise more reliable than sheer intuition.

But in order for this epistemic humility to take place, something else needs to happen first, and that is the re-cultivation of trust. Trust has fallen on hard times. Mutual trust in the public square is increasingly rare. In many cases, good faith dialogue and hearty debate have been exchanged for competing “righteous minds” that suspect the worst of ideological opponents. The “death of expertise” is, in an important sense, the death of trust—the death of trust in our public square, the death of trust in our institutions and cultural touchstones, and even the death of trust in each other.

Believing in the inherent value of experts requires us to accept fallen human nature in its limitations. It requires us to admit that individuals with a laptop and a curious mind are limited, and that “limited” does not here mean “stupid.” The value of experts—whether professors, doctors, theologians, architects, or, gasp, even government officials—is value that we see when we accept that time and training and accountability and certification are helpful precisely because they impose a control on individual passions and abilities. The fact that not everyone is an expert is a good thing, because human flourishing comes not when, as the joke goes, “everybody is above average,” but when people learn from each other in striving for the common good.

Expertise is not an infallible panacea. Nor is it a politically motivated trap. It is the natural consequence of being made in the image of a knowing God, who gives gifts and graces to each, for the good of all. Humility to sit under this kingdom economy is the key to resurrecting a culture of trust—and with it, a flourishing, mutually beneficial age of experts.

There Are No Secrets Anymore

Modern technology has made immorality easy to do, and impossible to keep secret.

Disgraced politician Anthony Weiner has been disgraced yet again…and again, it’s all about some raunchy texts. I can’t really laugh at him, because it’s obvious that he’s dealing with some life-deforming demons that I know too well. My prayer is that he would reach to the heavens for the rescue he desperately needs.

In a brief piece at National Review, Charles C.W. Cooke makes an interesting point about technology and immorality. Years ago, this kind of infidelity was hard to keep secret, because it required physical presence. Then, with technology, it got really easy to keep secret. But now, with the way that modern smartphone technology tracks and archives everything, secrecy is impossible yet again:

By the 1950s, everybody had a car, which they could use to get to the next town — or farther. Motels popped up everywhere, as did their discreet proprietors. And the analog telephone provided a means by which those who were up to no good could communicate instantly, and without leaving a substantial record. So fundamentally did this transform American life that traditionalists complained openly about the deleterious effect that modernity was having on conventional mores…

[I]s this still true? I think not, no. Now, there are cameras everywhere. Now, most people carry cell phones and drive cars that track their movement by satellite. Now, most communication is conducted via intermediate servers, and spread across multiple devices. In 1960, the average American could make a sordid phone call without there being any chance that it would be taped. Today, with a $3 app, anybody can record any conversation and send it anywhere in the world in a few seconds…Put plainly, it is now nigh on impossible for anybody to get away with infidelity, especially if one is a public figure.

Maybe we could put it like this: In the age of the iPhone, doing something lascivious while no one is watching is the easiest it’s ever been—but doing it without anyone ever knowing is virtually (pun not intended) impossible. At the very least, those naked pictures and crass text messages are being stored somewhere, on technology that someone with a name and two eyes built and maintains.

Surely, as Cooke writes of Weiner, we know this to be the case. So why is there so much explicitness on cloud servers? I can think of two answers.

First, sexual temptation is stronger, always has been stronger, and always will be stronger than logic. This is why Solomon urges his son to not even walk down the street where the adulterous woman lives.

Second, though: Is it possible that many in Western culture are actually OK with the idea of people they’ll never meet having access to their naked bodies and lewd messages? Could it be that our pornified consciousness has actually numbed us to the point where, even if we know that our texts and pictures stop belonging to us the moment we press “Send,” we don’t really care? Have we, as the prophets warned, actually become the very smut we love?

Be Better Than the Bengals

In sports, character and self-control matter every bit as much as talent. You won’t see many demonstrations of that truth more glaring than Saturday night’s NFL playoff game between the Cincinnati Bengals and Pittsburgh Steelers.

“The Force Awakens” and Getting Trapped By Nostalgia

In conversations with friends about the new Star Wars movie, I’ve noticed two trends. The first is that most of the people I’ve talked to report enjoying the movie quite a bit (and that makes sense, seeing as how the film is scoring very well on the critic aggregation site Rotten Tomatoes). The second trend is that virtually no one has criticized The Force Awakens for being too much like the original Star Wars trilogy. Indeed, the opposite seems to be true: Most people who have told me how much they like Episode VII have mentioned its similarity, both in feel and in plot, to George Lucas’s first three Star Wars films as a reason why they like it so much.

For the record, I enjoyed The Force Awakens quite a bit, and J.J. Abrams’ homage to the golden moments of the original films was, I thought, well done. But many of my conversations about it have confirmed to me what I suspected when Episode VII was announced: We’re trapped in a cultural moment of nostalgia, and we can’t get out of it.

Of course, the nostalgia-entrapment begins with the existence of movies like The Force Awakens. As I’ve said before, as much as I love Star Wars, the fact that a 40-year-old franchise is still dominating the box office, the news cycle, and our cultural attention is not something to be excited about. There comes a point when tradition becomes stagnation, and at least in American mainstream film culture, it seems that line was crossed some time ago. Case in point: Included in my screening of Star Wars were trailers for a Harry Potter spinoff, another Captain America film, an inexplicable sequel to Independence Day, and yet *another* X-Men movie. In other words, had an audience member in my theater just awakened from a 12-year coma, they would have seen virtually nothing they hadn’t seen before.

Nostalgia, if unchecked, runs opposed to creativity, freshness, and imagination. Even worse, the dominance of nostalgia in American pop culture exerts a powerful influence on marketing, making it less likely every year that new storytellers with visions of new worlds, new characters, and new adventures will get the financing they need to realize those visions. That is a particularly disheartening fact when you consider that the storytellers whose work has spawned a generation’s worth of reboots and sequels were themselves at one point the “unknowns”: George Lucas couldn’t find a studio to finance Star Wars until an executive at 20th Century Fox took a risk on a hunch; Steven Spielberg finished “Jaws” with much of Universal’s leadership wanting to dump both the movie and its director; and for much of the filming of “The Godfather,” executives at Paramount openly campaigned to fire writer/director Francis Ford Coppola. If formula and nostalgia had been such powerful cultural forces back then, there’s a good chance there’d be no Star Wars to make sequels to at all.

The trap of nostalgia is deceitful. It exaggerates the happiness of the past, then preys on our natural fear that the future will not be like that. But this illusion is easily dismantled, as anyone who has discovered the joys of a new story can attest.

There’s a freedom and a pleasure in letting stories end, in closing the book or rolling the final credits on our beloved tales. The urge to resurrect our favorite characters and places through the sequel or the reboot isn’t rooted in the deepest imaginative joys. It is good that stories end rather than live on indefinitely: endings let us treasure stories as we ought, and lose ourselves in a finite universe rather than blur the lines in our minds between the truth in our stories and the truth in our lives. If we cannot allow myths to have definite beginnings and endings, it may be that we are idolatrously looking to them not for truth or grace but for a perpetual youthfulness.

Of course, there are dangers on the other side too. An insatiable craving for the new can be a sign of the weightlessness of our own souls. A disregard for tradition can indicate a ruthless self-centeredness. And, as C.S. Lewis reminded us, novelty is not a virtue and oldness is not a vice.

But we should be careful to distinguish between a healthy regard for those who came before us and a nostalgia that (unwittingly) devalues tradition by ignoring how and why it exists. In the grand scheme of things, how many Star Wars films get made is probably not of paramount importance. But being trapped by nostalgia has its price. An irrational love of the past can signal a crippling fear of the future. Christians are called to lay aside the weight of fear and follow the gospel onward. If we’re not even willing to learn what life is like without a new Star Wars or Harry Potter film, how can we do that?

How “God’s Not Dead” fails Christian students


I took the plunge that I had been studiously avoiding and turned on God’s Not Dead, the evangelical blockbuster movie from last year that has thus far raked in cash, awards, and even designation as the “best Christian movie of the year.” I had seen beforehand its 17% rating on Rotten Tomatoes and read thoughtfully critical takes on the movie. I was more or less prepared to watch a bad film, and indeed that’s what I got.

The failures of “God’s Not Dead” are particularly frustrating when you consider how easily they could have been avoided. There’s nothing wrong with God’s Not Dead that couldn’t be fixed by handing the script to a writer who isn’t eager to portray non-Christians in the worst light possible. The film feels less like a dramatic narrative and more like a propaganda reel, highlighting The Enemy in all their inglorious abominations.

It would be one thing for the movie to caricature non-evangelicals if it had no aspirations to realism to begin with. I actually would be curious to watch a well-done diatribe against the secularist monopoly on higher education; the potential to learn something in that context seems high. But dramatic narrative is a higher medium than the lecture. It engages the imagination and moves the spirit in a more significant way. That’s why God’s Not Dead’s animosity toward its non-Christian characters is dangerous: if Christians come away thinking unbelievers in real life are like the unbelievers of God’s Not Dead (and that is clearly the message of the script), they will carry a spiteful fantasy into their relationships and evangelism that will be fatal to Gospel conversations.

Fairly representing those who disagree is not something Christians should be bad at. Telling the truth about what people believe and engaging them as honest people isn’t a spiritual gift or an acquired skill. It’s basic honesty. How can I criticize the anathematizing of people like Brendan Eich and Ryan Anderson if, after hours, I myself enjoy caricatures of those who disagree with me?

I understand why people enjoy “God’s Not Dead.” It’s a brief moment of cinematic glory for Christians who, for good reason, often feel lampooned and marginalized in pop culture. But it’s a moment that comes at the expense of a helpful, or even realistic, perspective on the dialogue between faith and unbelief. The vast majority of atheists that Christian students will meet in college are nothing like the professor in God’s Not Dead. If these students go into school expecting the contrary, the cognitive dissonance of seeing a reality that contradicts their assumptions will do more damage to their faith than a few hours of talking with an unbeliever ever could.