A Prayer For Wonder

Heavenly Father,

Thank you for being God. Thank you for being such a wise, creative, powerful, tenderhearted, patient, and sovereign King of the world. May your name forever be hallowed, in my heart, in my home, in my church, and in my world.

You are indeed a good King, and we, made in your image, are made to see and savor you. We are made to bask in the light of how beautiful you are. It is not enough to agree that you are good. You demand that we enjoy you. You demand that we wonder at you. In your wisdom, you have commanded our wonder.

You have not hidden your wonder. You have stamped it throughout creation. The bristle of the tree leaves in a summer wind says, “Wonder!” The orange lake poured over the sky in sunrise says, “Wonder!” The impenetrable depths of the ocean floor, where live creatures our best technology and brightest minds cannot fathom, say, “Wonder!” The Milky Way, the exoplanets, the distressing vastness of space, they all say “Wonder!” You have commanded our wonder, and you have given us much to wonder at.

But wonder doesn’t come easy to us. Like Bunyan’s man with the muck-rake, our eyes are weighed downwards, away from the majesty and toward the muck. Your night skies go unheeded in favor of the dull blue glow of our iPhones. We ignore the wonder of words (how amazing language is!) and focus instead on how we can use them to gain platforms from people we don’t pray for. We greet the world you have made not with wonder, delight, and worship, but with cynicism and defensiveness, so occupied with trying to show others that we belong here that we forget why we belong here.

Wonderful things are close to us, yet wonder feels far away. And we often mistake this distance for nostalgia. If we could just go back to childhood, if we could unlearn what we’ve learned, if we could lose ourselves once again in pleasure and play, we think we would wonder again. “Things were better before,” we say every year, meaning “Before this year.” We want the good old days and we want to know we’re in them. We want to wonder and yet feel ourselves wondering. We want to wonder at our wondering.

We want, so often, to wonder at ourselves.

But we can’t. No matter how hard we try, self-wonder crumbles under the distractions of life. It is exhausting to see mirrors everywhere. We are tired. We want to see you.

Father, help us to do this. Help us to see you. Help us to see you in the beauty of the summer skies and the winter frost. Help us to see you in the great stories. Help us to see you in each other. Help us to see you in the simplest of things, the things we don’t even think about because we are distracted. Help us to love where you have put us, with whom you have put us, when you have put us. Help us to wonder, not wander.

Help us to wake up every morning eager to wonder at you, who you are, and what you’ve done. Help us not to wake up already imagining ways to make strangers respect us. Help us not to see the world through social media, nor through all-consuming careerism. Help us to be productive but also to rest, and help us remember that work and wonder are not always the same. Help us to be calm in an outraged time, and help us to be quiet in a culture that demands we fill all silence with words.

Help us to wonder now, as we will for eternity.

In Jesus’ name,

Amen.

The Problem With Social Media Righteousness

If there’s anybody who is writing more incisively about social media culture right now than Freddie deBoer, I don’t know who it is. There are quite a few valuable points in his latest post, but I want to focus on one in particular.

Freddie writes:

The modern internet, particularly social media, is essentially a vast positive reinforcement machine. Note that positive in this context doesn’t mean “leading to positive outcomes,” just “an active system of reward.” We’ve built these systems into every major online platform there is, the likes and favs and retweets and reblogs and shares. And the thing is that they work. They are powerfully influential on people’s behavior. But people’s rational minds rebel at that and insist that they don’t care about such things. The problem is that you might not care, in terms of your conscious mind, but your brain cares. Check the literature on behaviorism. In the video game you jump up to get the coin even though you know it does nothing to help your life, even in the context of the game. You do it because you’re rewarded for it, in the simplest and least consequential way, and so the pleasure centers of your brain light up and you are conditioned to do it again.

Every time someone who is extremely online and yells about politics all day and all night says to me “I know social media doesn’t do anything,” I check and they’ve tweeted like 250,000 times. That’s behaviorism at work.

I’m not an expert on digital technology or neurology, and I don’t think Freddie is either, but almost everything I’ve read about the internet and the psychological dynamics underpinning social media affirms what he says here. The reason Facebook and Twitter and Instagram are some of the most financially valuable companies in the world right now is not mainly that they give us something we can’t get anywhere else; it’s that they give us the same thing over, and over, and over, and over again, in a way that embeds itself into our consciousness. Think of the last shot of the movie “The Social Network,” in which Jesse Eisenberg’s character mindlessly refreshes his Facebook page multiple times to see if his ex has accepted his friend request. That’s an accurate picture of how most of us use social media–not really to discover anything new, but to discover how others have discovered us in some way.

Why should we remind ourselves of this? Here’s one reason: Because social media has such a powerful neurological imprint, we should be extremely skeptical of our motivations while using it. We should assume, all variables being equal, that we have mixed motives at best for how we utilize the medium, how we present ourselves, what we say, and how we respond to others. We should not, in other words, assume that our social media “community” is merely a digital version of flesh and blood company, or that our posts and Tweets and “Likes” are representative of how we would think or behave in that moment if we didn’t have the technology.

Now, there are going to be some who really–and I mean really–resist what I’m saying here. Why? Because what I’m prescribing is that we consciously undermine the mentality and emotions that go into the vast majority of social media trends, attitudes, and politics. Social media righteousness, the kind of social media righteousness that chastises others for not Tweeting about something fast enough or that builds walls of moral superiority around hashtags and threads, is a righteousness that has been polluted with a uniquely strong toxin. The reward mechanism that Freddie mentions here is pervasive, and it builds platforms and people who appear thoughtful but in reality calculate who they are and what they say to climb up the social media ladder. This is just reality. It’s reality we don’t like hearing, but it’s reality nonetheless.

This is exactly why there are a handful of topics that I will almost never talk about on social media. Some of these topics are incredibly important, and I’m sure some have noticed my silence and have chalked it up to cowardice or siding with “the powerful” or just apathy. I don’t care. Silence is not apathy, and no matter how many times your tweet saying otherwise gets Retweeted and Liked, that’s a fact. Refusing to jump into a particular conversation looks like the wrong decision to people who are trapped in this mental reward mechanism.

So here’s the takeaway. I think the best way to use social media is to open up communication between people who might otherwise not engage. I think the best thing you can do is find people like you and unlike you, people whom you respect (don’t follow people you don’t respect–you’ll discover the magnitude of the time waste only after the fact), and share ideas and stories and perspectives. I also think social media works best when people realize that, yes, it is absolutely an inferior mode of communication and relationship, and it’s not just millennial bashing to say so. Save the majority of your righteous mind for longform writing, for conversations with friends in person or on the phone, for letters and emails and Skype chats. And when you need to engage in something more serious on social media, do so with self-awareness, and tell yourself that no matter how serious the topic being talked about, no matter how passionate the emotions get–it really is just the Internet after all.

The Death of Expertise

Remember that famous scene in Peter Weir’s “Dead Poets Society,” in which Professor Keating (played by Robin Williams) orders his English literature students to tear out the introductory pages of their poetry textbook? Those pages, Keating explains, are the soulless pontifications of a scholar trying to explain verse. Nonsense, says Keating. Poetry isn’t what some expert says it is. It’s about “sucking the marrow out of life,” about spontaneous utterances of the subconscious and chasing your dreams and sticking it to your parents and headmaster. Forget the experts, boys; carpe diem!

As a misguided defense of the humanities, “Dead Poets Society” is easy enough to dismiss. The bigger problem is that Keating’s heedless disregard for truth and knowledge is a pretty accurate picture of how many Americans think and live. That’s the contention of Tom Nichols in his new book “The Death of Expertise,” a brief yet persuasive work that laments our generation’s factual free-for-all.

Americans, Nichols believes, are not just subsisting on a low amount of general knowledge. That wouldn’t exactly be a new development. Rather, Nichols is disturbed by the “emergence of a positive hostility” to established, credentialed, and professional knowledge, one that “represents the aggressive replacement of expert views or established knowledge with the insistence that every opinion on any matter is as good as every other.”

According to Nichols, what White House press secretaries might call “alternative facts” have become common cultural currency. If love means never having to say you’re sorry, the Internet means never having to say you’re wrong.

For many people, a first-person blog post is (at least) as authoritative as a peer-reviewed study, and a Facebook link offers truth too hot for professional journalists and fact checkers to handle. This ethos doesn’t just promulgate wrong information, which would be bad enough. Nichols argues that, even worse, it fosters a deep distrust and cynicism toward established knowledge and credentialed communities.

Nichols’s book puts the symptoms of the death of expertise on a spectrum. Some effects are clearly more harmful than others. It’s no revelation that “low-information voters” feel as vehement as ever about a plethora of fictitious things. More worrisome, however, is the growing public comfort with dangerous conspiracy theories. Both of these trends owe much to the “University of Google” (to borrow one celebrity’s self-proclaimed credentials for rejecting vaccinations). With so much access to so much information available to so many people, the web has seriously undermined the responsible dissemination of verified facts and blurred the distinction between truth and talking point. Nichols writes:

The internet lets a billion flowers bloom and most of them stink, including everything from the idle thoughts of random bloggers and the conspiracy theories of cranks all the way to the sophisticated campaigns of disinformation conducted by groups and governments. Some of the information on the Internet is wrong because of sloppiness, some of it is wrong because well-meaning people just don’t know any better, and some of it is wrong because it was put there out of greed or even sheer malice. The medium itself, without comment or editorial intervention, displays it all with equal speed. The internet is a vessel, not a referee.

Nichols doesn’t lay all the blame on the internet. Higher education has contributed to the death of expertise, Nichols writes, both by churning out poor thinkers from its ranks and by defining education itself down to mean little more than payment in exchange for a degree. “When everyone has attended a university,” Nichols observes, “it gets that much more difficult to sort out actual achievement and expertise among all those ‘university graduates.’” Similarly, public trust in professional journalism has been harmed by incompetence on one end and clickbait on the other. All of this, Nichols argues, combines to foster an instinctive cynicism toward expertise and established knowledge. When experts get it wrong, well, of course they did; when they get it right, there’s probably more to the story.

One issue that seems relevant here, and one that Nichols lamentably doesn’t really parse, is the triumph of subjective narrative over objective arguments. Americans have always loved a good story, but what seems unique about our time is the way that story and first person narrative have assumed an authoritative role in culture, often to the contradiction and exclusion of factual debate. Instead of trading in truth claims, many simply trade in anecdotes, and shape their worldview strictly in line with experiences and felt needs.

The privileging of story over knowledge is a glaring feature, for example, of much contemporary religion. While real theological literacy is alarmingly rare, what are far more common are self-actualizing narratives of experience. These authoritative narratives take all kinds of forms—they’re the diaries of the “spiritual but not religious” Oprah denizens, and they’re also the cottage industry of “ex-[insert denomination/church name]” watchdog bloggers. In both cases, when jolting stories about the problems of the religious expert class collide with more established doctrine or practices, the tales triumph.

What’s more, young evangelicals in particular seem more likely to get their theological and spiritual formation outside the purview of pastors, churches, and seminaries (a triad that could be considered representative of a religious “expert” class). Blogs, podcasts, and TED Talks seem to offer many American Christians a spiritual life more attractive than the one lived in institutions like the local church and seminary. Indeed, a casual disregard for formal theological education seems to be a common marker amongst many young, progressive evangelicals, several of whom enjoy publishing platforms and high website traffic despite their near-total lack of supervised training. A Master of Divinity may be nice, but a punchy blog and a healthy Twitter following are even better (you don’t have to think long or hard before you see this dynamic’s potential for heterodoxy).

Perhaps we ought to consider this the “Yelp” effect on American culture. In an economy of endless choices, “user reviews” are often the first and most important resource that many of us consult in making decisions. Putting trust in the aggregated consensus of the crowd is likely more endemic in our daily experiences than we think. It’s how we decide where to have dinner, which car to buy, what insurance company to rely on–and, increasingly, whether or not to inoculate our children, and which interpretation of the New Testament to accept. When the self-reported experiences of our peers are just a couple clicks away, and our first instinct toward expertise and credentialed wisdom is suspicion of bias and elitism, it’s not hard to see how we got here.

So what’s the solution? Unfortunately, Nichols’s book doesn’t offer many answers to the death of expertise. This is somewhat understandable; there are only so many different ways to say “epistemic humility,” after all. There is an obvious need for self-awareness, both among laypeople and among the expert class. As Nichols notes, credentialed experts should “stay in their lane,” not risking credibility in order to appear omni-competent. Likewise, readers should acknowledge the inherent value in professional training and the processes of verification and review. While these things do not make expertise infallible, they do make expertise more reliable than sheer intuition.

But in order for this epistemic humility to take root, something else needs to happen first, and that is the re-cultivation of trust. Trust has fallen on hard times. Mutual trust in the public square is increasingly rare. In many cases, good faith dialogue and hearty debate have been exchanged for competing “righteous minds” that suspect the worst of ideological opponents. The “death of expertise” is, in an important sense, the death of trust—the death of trust in our public square, the death of trust in our institutions and cultural touchstones, and even the death of trust in each other.

Believing in the inherent value of experts requires us to accept fallen human nature in its limitations. It requires us to admit that individuals with a laptop and a curious mind are limited, and that “limited” does not here mean “stupid.” The value of experts—whether professors, doctors, theologians, architects, or, gasp, even government officials–is a value we see when we accept that time and training and accountability and certification are helpful precisely because they impose a control on individual passions and abilities. The fact that not everyone is an expert is a good thing, because human flourishing is not when, as the joke goes, “everybody is above average,” but when people learn from each other in striving for the common good.

Expertise is not an infallible panacea. Nor is it a politically motivated trap. It is the natural consequence of being made in the image of a knowing God, who gives gifts and graces to each, for the good of all. Humility to sit under this kingdom economy is the key to resurrecting a culture of trust—and with it, a flourishing, mutually beneficial age of experts.

Putting Down My Inner Polemicist

I’m not a pastor, but this post by Kevin DeYoung hit me where it hurts. For the sake of clarity, “polemics” in the sense DeYoung uses it here refers to a particular mode of engaging ideas critically, with a goal of correcting bad ideas. While polemics qua polemics are inherently valuable, the word is often associated with a genre of writing that is attitudinally aggressive, critical, and negative. If you find a “polemics blog” you won’t likely find much “dialogue” or even nuance; you’ll see writers naming names and naming blasphemies.

DeYoung’s exhortations convicted me because, even though I don’t think my writing is really polemical in the above sense, it’s become clear to me that my disposition and my instincts are more polemical than I want them to be. And the biggest evidence for that is in my social media use.

Not long ago I perused my own Twitter timeline, and a frightening thought occurred to me: I probably would not follow myself. Way, way too much of what I tweeted was cynical, snarky, pedantic, and more than a bit self-important. I don’t remember, but my guess is that I probably only noticed this because I was having similar impressions of someone else’s feed (remember C.S. Lewis’s point about how prideful people are extra-sensitive to the pride of others?). I was astonished, in a very bad way, at how much time I spent thinking and saying reactive, defensive things. If my wife or a friend ever told me this is how I talked to others in non-digital life, I would be embarrassed.

The allure of polemics is the thrill. There’s an actual adrenaline kick when you’re breezily dismantling (at least in your own head) other people’s wrongness. There’s a feeling of control, of power, and, especially if it comes in a Christianized form, of doing God’s work. Being given a chance to feel smarter than someone else in the name of Jesus is an offer many of us can’t refuse. That’s why self-awareness is so difficult, at least for me. To stop and think, “Is this really the best use of my time and brain?” is to interrupt the thrill and the superiority. And when nothing stands between your thoughts and your public words except a button smaller than your thumb that says “Tweet,” the incentives for delayed gratification are few and far between.

Now of course, engaging ideas is what I do. It’s why I write. I love thinking and writing and talking about important things, and you’re not going to think and write and talk about important things long before you’re going to advocate for X instead of Y. That’s part of being made in the image of a truthful God and believing in a narrative of human history that says truth is knowable and real and matters.

The problem with cynical polemics, the kind that comes so easily to me, is not that they’re unnecessarily obsessed with “truth.” It’s that they actually aren’t about truth at all; they’re about “my truth.” At the end of the day, the cranky polemicist makes everyone around him certain of only one truth: himself. A defensive posture toward everyone and everything is a posture of self-actualization. I need to ruthlessly tear down this Wrong Idea because its existence is a challenge to my existence. My opinions become my identity. And when that happens, your opinion is not just incorrect, it is incompatible with me as a person.

This is why rational discourse has become so difficult in the internet age. It’s why commenting sections degenerate so quickly into acid-throwing endurance events. And it’s why confirmation bias and declining attention spans have combined to give us a culture that equates nuance with compromise and carefulness with cowardice.

So I want to put down my inner polemicist. I want to think more and Tweet less. And I don’t want to look at the conversations of the public square as little more than a ripe opportunity to assert my own cleverness. Dear reader, for those times, either in this space or another, that I have failed this ideal, I apologize, and hope you will forgive me, and bear with me as I try to keep truth, love, and beauty in harmony.

The Cross vs Nostalgia

We call it “Good Friday” now. But no Christian should want to relive that day.

On the day we call Good Friday, Jesus’ disciples were foolish. They volunteered to die with him, demonstrating that they understood neither his mission nor their own hearts. Hours after pledging their lives, they fled at the sight of Roman soldiers. The best friends Jesus had, the men who had spent three years by his side, stood back, denying they even knew him, as he was given over to an illegal trial for his life.

Witness after witness told lie after lie, so many lies that the Scriptures tell us that the testimonies didn’t even agree. Jesus was beaten, mocked, and pronounced guilty on the authority of actors, and nobody intervened. Nothing happened to stop the farce and restore Jesus’ dignity. No one stood up to corruption and injustice. He was alone.

He was alone as Pilate cowered before the crowd and ordered him flogged until his flesh hung off his bones like wet parchment. He was alone as Pilate cowered again and ordered his crucifixion, declining to announce what crime was being punished. He was alone as the weight of a wooden cross smote him into the earth. He was alone as the nails were driven with ruthless efficiency. A man who had raised little girls up from the dead was stripped naked so that federal agents could play dice with his clothing. Nothing happened, no one stopped it.

Why do we call this day “good”? This is the kind of day that we learn about in history textbooks, with black and white photos of burned bodies stacked on top of each other. This is the kind of day where we watch Planned Parenthood surgeons sift through a petri dish of humanity, looking for the most valuable of the remains. This is the kind of day where armed guards open fire on peaceful protesters, or sic dogs on children. There’s nothing remotely sentimental about the cruel injustices of “Good Friday.” So why do we call it good?

We call it good because tradition and nostalgia aren’t synonyms. The past—the realities of the faith once for all delivered to the saints—is our life. Without Calvary there is no church, there is no heaven, and there is no hope. Christians don’t believe in the idea of the atonement—they believe in the history of it. Jesus really did die on a cross, in a real part of the Middle East, surrounded by real people who really did shout for a healer and a teacher to be murdered by a government they proclaimed to hate. This isn’t just theology. It’s history. It’s our history, our tradition, and our hope.

It’s not, however, our nostalgia. Tradition is about receiving from the past; nostalgia is about disfiguring it. Nostalgia is our cultural mood right now because it affords the comfort of the past while letting us evade its instruction. Nostalgia is superficial in essence but can be tyrannically earnest; we can try to reinvent our entire lives in the image of that which reminds us that we were once young. But for the age of nostalgia, hope is to be found in the here and now. We must be nostalgic so that we can be comforted by the past without being taught by it.

We dare not be nostalgic about Easter. Only the foolish would sentimentalize the flogging, the walk to Golgotha, and the naked, shredded flesh. To make the Passion an object of our nostalgia—to see in it only the value of our grandfather’s generation, the benefit of a “Christian nation”—is to spit upon the cross itself. It is said that there are millions of “Easter and Christmas” churchgoers in the United States, those who make time in their secular existence for two hours of hymnody a year. Oh, if only these Americans could see in their holidays the blood and the gore and the evil! If only they could see the gospel in its visceral reality, and not in its Thomas Kinkadian counterfeit.

If they could–if we could–we would not look at Good Friday with nostalgia. But we would look at it, and, if God is merciful, we might never look away.

Sympathy and Suicide

Netflix’s newest original miniseries, 13 Reasons Why, is compelling TV. It’s well-acted and hauntingly written. But while watching it, something bothered me, and it took me a while to figure out what it was. I fear that 13 Reasons Why might be the latest example of how Hollywood hitmakers tend, even unwittingly, to romanticize suicide.

The show is focused on the suicide of high-schooler Hannah. Shaken by her death, her friends and classmates discover that Hannah had recorded 13 audio tapes, discussing various people and incidents that drove her to kill herself. Each episode of the series centers on the revelations of a particular tape; as the series progresses, secrets of Hannah’s classmates are exposed, and the series ends on a sober note of justice as many classmates and even some school administrators are implicated in Hannah’s death.

Critics have almost universally praised 13 Reasons Why for its intelligent script and mature narrative. It is indeed a well-produced series, and the writers and actors deserve credit for handling such a brutal story with a measure of dignity and hope. But therein lies my concern. While it’s true that teenage bullying, depression and suicide are stories we need to be telling, I fear that 13 Reasons Why may tell a story that, even unwittingly, valorizes a teen’s death.

Hannah is clearly a smart girl. Her quick wit and observational skills clearly outpace most of her peers, especially the boys. The recordings she leaves behind are likewise clever and incisive. Without giving away too much of the plot, let me briefly explain that Hannah’s tapes serve as a crucial instrument of her revenge, a revenge that exposes criminal activity at her school and culminates in a lawsuit. By the end of the series, Hannah, though driven, as we see in flashbacks, to despair by the cruelty of her world, has achieved something very much like a vindication. Her tapes win; her bullies lose.

Poetic justice? Yes. But at what cost? Does Hannah’s posthumous vindication make her decision to kill herself more tragic, or less so?

We need to remind ourselves that art moves us first at a level beyond rational thought, ultimately because art is not about information but about desire. I have no doubt that 13 Reasons Why will be an emotionally compelling experience for many, especially teens. And that troubles me. It unnerves me to think of a teen, caught in a cycle of abuse and neglect like Hannah, watching this story unfold and desiring the self-sacrificing heroism that they see. While 13 Reasons is an engrossing story, it’s also not real life. Suicide is not heroic. Killing oneself is not a strategy for revenge. It is a monumental act of selfishness. Hannah’s friends process her death and her tapes exactly how she expected them to. The empathy that she didn’t feel in life she receives in death.

That’s a fairy tale. And it’s a fairy tale that might have lethal consequences for people struggling to value their life.

Death is not a friend. It’s not a vehicle for your self-actualization. Suicide will not give you a front row seat to watch as your friends and family understand you and love you for the first time. But that’s what happens for Hannah. The filmmakers behind 13 Reasons Why know that Hannah’s death is tragic. They do not rejoice in it. But unfortunately, by giving Hannah a godlike intelligence and an ephemeral sort of control over the unfolding events after her death, the makers of 13 Reasons have told a profoundly wrong and morally confused story. It’s a story that poses a unique threat to audiences who may be considering Hannah’s path, simply to feel the love shown to her.

Because life is immeasurably precious, displaying its worth is immeasurably precarious. The power of stories is their ability to shape our intuitions, our loves, our expectations of the world. Trying to help despairing friends see the value in their life requires more than telling a story of someone like them. It requires sifting through beguiling myths and being honest about our enemy Death. I’m afraid 13 Reasons Why makes this harder, not easier.

Brett McCracken (and Me) on Movies, Nostalgia, and Criticism

For the last few weeks I’ve been chatting via email with pastor, writer, and Christian film critic Brett McCracken. Brett is one of the most articulate and consistently helpful evangelical culture writers that I know. I was eager to get some of his perspective on a variety of movie-related topics–such as the state of the industry, Christian approaches to film, the importance of critics, etc.

The conversation will continue beyond this post, but I asked Brett if I could share some of our thoughts already. He graciously agreed.

____________

Samuel: Before we talk about issues related to Christians and movies, I’d love to get your perspective on just the film industry as a whole right now. I think a lot of people thought this year’s crop of Oscar nominees was a strong one, so in one sense there’s good reason to be excited about what Hollywood is doing. But in another sense, 2016 was, a lot like previous years, a huge year for reboots, remakes and sequels. What do you make of that trend?

Personally, I’ve not been shy about criticizing what I feel is a dearth of creative thinking and originality from the studios. It seems to me that this drought of fresh ideas may not be unprecedented, but it does feel quite entrenched right now. As long as superhero franchises top the box office, we’re going to get more of the same (how many different Spider-Man actors can we cram into one millennium?).

Is that the impression you have, or am I missing something?

Brett: I think your reading of the industry is correct. It seems like studios are indeed mostly interested in reboots, remakes and sequels, which is to say: proven formulas and guaranteed global moneymakers. One of the key words in that last sentence is global. As the theatrical, old-school movie experience in North America declines in popularity, in other parts of the world it has been growing. Thus, Hollywood has in the last decade or so really concentrated on overseas markets. The thing about movies that play well all over the world is that they have to be able to click with audiences across various cultural divides. This means more subtle, character-driven stories (as well as comedy, which doesn’t translate well across cultures) have less global appeal, while (you guessed it) familiar franchise films and big budget, CGI-heavy action films draw audiences everywhere.

I also think there is an interesting dynamic going on in the larger culture right now, which is a sort of obsession with nostalgia and an inability to truly innovate. You see this in politics, with both parties holding on to old goals, defined more by nostalgia for the past (e.g. “Make America Great Again”) than by vision for the future (the inability of the GOP to unite around a healthcare vision is a case in point). Yuval Levin’s Fractured Republic is a great book for understanding the “politics of nostalgia.”

The same thing seems to be happening in film and television. Everything is sequel/spinoff/reboot/franchise/familiar. A new Twin Peaks. Fuller House. Girl Meets World. Another Fast and the Furious movie. Another Spider-Man. Another Indiana Jones. New Star Wars and Avengers and Justice League films planned until 2025 and beyond (or so it seems). Live action versions of beloved Disney animated films. Even the “original” things are driven by nostalgia. Stranger Things is soaked in 80s/Spielbergian nostalgia. La La Land is an exercise in nostalgia for a classic Hollywood genre. When news breaks that The Matrix is being rebooted, you know things are bad.

I think in one sense, nostalgia is a proven source of commercial success at a time when the industry is jittery because the whole model is changing (streaming, etc.). On the other hand, there is a cultural anxiety at play wherein the present is simply too fragmented, too overwhelming, too unknowable (this is the “post-truth” / #alternativefacts era after all) to inspire contemporary narratives that resonate with mass audiences. And so Hollywood goes to the future (sci-fi, apocalypse, dystopia) or to the past, but doesn’t know what to do with the present. I recommend Douglas Rushkoff’s book Present Shock for understanding this phenomenon.

None of this is to say there are no places where innovation is happening. There are. Especially in new formats on streaming sites like Netflix. It will be interesting to see whether these new formats inject new life and originality into Hollywood’s storytelling rut.
____________

Samuel: Your point about nostalgia and current cultural anxiety over the present is very interesting. I suppose if you were OK with the “post-9/11” cliche you could try to understand the box office since 2001 as representing a cultural thirst for morality and heroism. 2001-2002 seems to be a watershed time frame, too; Lord of the Rings and the first Harry Potter film both debuted in 2001 and immediately made fantasy the go-to genre, and then in 2002 you had Sam Raimi’s Spider-Man really re-energize the market for superhero films. But I think it’s just as plausible to see it, as you said, as a response to an increasingly fragmented, metanarrative-less public square.

Every time I talk about this I remember A.O. Scott’s essay about the death of adulthood in pop culture. His argument has been very compelling for me and, in my opinion, helps make sense of a lot of what we see from Hollywood right now. Especially helpful was his description of this movie generation as the “unassailable ascendancy of the fan,” meaning that audiences are essentially immune to film criticism because they have franchises rather than stories, and with those franchises comes a sense of belonging, the belonging of fandom. Do you think that as movies become more openly nostalgic, formulaic, and franchise-driven, the task of the movie critic becomes harder or even more irrelevant? Should critics just embrace the reboot era and judge movies on how well they resuscitate the original material, or should there be a point where critics say, “Even if this is a well-produced retread, it’s a retread, and as such its value as art is inherently handicapped”?

Brett: I think the task of the critic today is different than it was in previous eras, but no less crucial. It’s true that some franchise and tentpole films are “critic-proof,” but the rising popularity of sites like Rotten Tomatoes indicates that audiences are at least somewhat interested in critics’ opinions, even if a correlation between critical consensus and a film’s box office success is debatable.

From my perspective, the importance of the critic today is not about a “thumbs up or down” endorsement as much as it is about adding value and depth to an experience of film/TV, at a time when the overwhelming speed and glut of media leaves us with little time and few tools for processing what we’ve seen. Whether or not audiences are interested in taking time to process and understand something, rather than quickly moving on to the next thing, is an open question. I know for myself after I see a complex film, I love reading my favorite critics as a way of extending the filmgoing experience by processing it communally. The communal aspect of film/TV is being lost, as society becomes further atomized and isolated, with no two media feeds looking the same. Fan communities fill some of this communal void, but reading critics is another way we can make an otherwise solitary experience something that connects us with others.

As to the question of how critics should approach films in the reboot/franchise era, I think the task should be less about fidelity to the franchise and the “did they get it right?” question, and more about simply evaluating the film on its own two feet. A film’s faithfulness to the “world” of the franchise is a concern for fandom. A film’s insight into the world we live in today is the concern for critics. What does a film or TV show, however nostalgic it may be for the past, say about who we are as a society today? This is a question I believe critics should be asking.

There is plenty of room for innovation and beauty within the constraints of franchise (see Nolan’s The Dark Knight, LOTR, some of the Harry Potter films, and so on), and some might argue that the limits of genre/franchise/adaptation actually can sometimes spark the most innovation. The fact that “there is nothing new under the sun” need not be a downer for critics and audiences. Great narratives, great characters and themes endure because they are great. The best artists can mine these sources in ways that are fresh and timely for today’s world.
____________

Samuel: I agree with you about the importance of good criticism. As I’m sure you have, I’ve known a lot of Christians who sort of thumbed their nose at professional criticism. I’ve been the “negative” guy in the group leaving the theater plenty of times. I think the perception many people have (and I would say this is definitely more true in conservative culture) is that talking honestly about a movie’s flaws and strengths is a kind of elitism that exaggerates the importance of movies. “It’s just a movie,” “Just enjoy it,” etc., etc.

Right now I’m reading Tom Nichols’s book “The Death of Expertise,” and the main argument of that book is that we are in a cultural moment in America where there is not only widespread ignorance (which is not really unique to our time) but active hostility toward knowledge and credentials (which, Nichols argues, IS unique). As someone who has watched more movies than many, probably most, other Christians, and has studied and practiced the art of criticism and analysis, how do you exercise a kind of authority in your criticism, without pretense or arrogance? If someone were to approach you and say that Terrence Malick, for example, was an untalented director who made gibberish, what’s your approach to correcting this idea–or do you? Can you argue taste after all?

(This is an important question to me because it gets at the heart of something I believe strongly about Christian culture–that we’ve failed to produce good art largely because our idea of good has been defined down to mean “family-friendly” and “inoffensive.”)

Brett: I think taste is, in large part, learned. It’s why people in different cultures and contexts have taste for certain types of foods and have different conceptions of beauty. They’ve been nurtured in a certain environment where they’ve developed those tastes. So when someone doesn’t share our taste exactly, we shouldn’t begrudge them for it. But I do think it’s natural and good for us to not simply shrug and say “to each their own!” but to dialogue and try to help others see what we see in something. Lewis talks about how our desire to share something we enjoy with others is not superfluous but rather integral to our enjoyment of it: “We delight to praise what we enjoy because the praise not merely expresses but completes the enjoyment; it is its appointed consummation.”

And so if someone were to say to me that Terrence Malick is untalented and makes gibberish, I could not just say “well, we see differently.” Part of my enjoyment of Malick (an integral part) is being able to share with others what I’ve discovered in his work. And so I’d do my best not to get angry or worked up about the other person’s comments, but to share with them why I think Malick is brilliant and his films are important. This is the nature of criticism. A good critic writes not out of a place of spite but a place of love. My enjoyment of Malick’s films doesn’t stop if others struggle with them. But if I can help others through their struggles and help them appreciate Malick more, that only adds to my enjoyment.

Another thing I would say is that for film critics or any expert on anything, it’s important that you show and not just tell that something is important. What I mean is, rather than simply pronouncing “x is good,” an expert’s criticism or description of “x” should prove its goodness by being so beautiful and interesting and enlightening in its own right that readers can’t help but see x as something of value. The way Paul Schrader wrote about Ozu films made me learn to love Ozu. The way my Wheaton College professor Roger Lundin so passionately talked about Emily Dickinson made me learn to love Dickinson. The way the food critics on Chef’s Table talk about the importance of certain chefs makes me desire to try their restaurants. The way my college roommate passionately loved coffee helped me develop a more refined taste for it.

It’s not just what critics or experts say but how they say it that matters. So much of what we learn as humans comes from observation of others, from models and mimesis. Good critics and experts (and teachers of any sort) model ways of thinking about and enjoying things well. And we need to value those models now more than ever, in a society where it is easier than ever to consume things poorly, cheaply, excessively. The Nichols book sounds great, and very timely!

I would hope that when people observe how much I love Malick, how seriously I take his films, and how deeply I engage them, they not only gain an appreciation for Malick but also a desire to love and think deeply about other films, engaging them even when it is challenging. This is what I hope my writing about film for Christian publications has done over the years: modeled a more complex, nuanced, and ultimately more enriching engagement with film beyond the reductive “good = inoffensive and family friendly” approach that you rightly note has left us ill-equipped to be good creators.

The Other Woman

Thinking about this passage from Brian Jay Jones’s lovely biography of George Lucas:

What wasn’t right was that George and [then-wife] Marcia’s thirteen year marriage was quickly imploding, largely because of Lucas’s own neglect. He knew he could be difficult to live with. “It’s been very hard on Marcia, living with somebody who is constantly in agony, uptight and worried, off in never-never land,” Lucas told Rolling Stone. Charles Lippincott, Lucasfilm’s master of marketing, wasn’t entirely surprised…”George would take problems to bed with him,” said Lippincott, “and [Marcia] said this caused a lot of problems.”

But it went deeper even than that; for George Lucas, movies would always be the other woman. As devoted to Marcia as he might be, there was forever one more movie, one more project, demanding the time and attention he couldn’t or wouldn’t give to his wife.

I’ve read a lot of Christian books and heard a lot of Christian speakers talk about strategies for avoiding adultery (the “Billy Graham rule” is a timely example). But I don’t believe I’ve ever read an entire chapter or listened to an entire session by a Christian on strategies for avoiding having an affair with work. Why is that? Why, for people who believe that husbands ought to sacrifice themselves for their wives as Jesus Christ sacrificed himself for his Bride, is the threat of all-consuming careerism not taken more seriously?

Right out of college I spent over a year working in the marketing department of a well-to-do mortgage firm. It wasn’t a particularly good experience, but it was at least eye-opening. I heard the way professional men and women casually talked about how little time they spent at home. If you want to know how what is supposed to sound like complaining can actually be a form of bragging, listen to professionals talk about their time away from home; it is an education. It was remarkable to me how easily a company with explicitly “family-oriented” corporate rhetoric could nonetheless cultivate this kind of attitude.

Even worse, it’s scary how easily the veneer of godliness can be applied to this “other woman.” If a man is working long nights and weekends so he can spend time away from his family and with a female coworker, he would be (rightly) rebuked. But if he’s working long nights and weekends just because he derives from his career a peace and identity and thrill that home cannot match, what do we say then? If you want to get really uncomfortable, go back to that previous sentence and replace the word “career” with the word “ministry.” The temptation for celebrity pastors to put their spouse and family on autopilot must be sore indeed when there are so many book deals and conference invitations to be gained.

It’s sobering to think how many Christians might pat themselves on the back for not being the foolish young man whom Solomon sees going by the house of the forbidden woman, when the only reason why is that they are still at the office with the “other woman.” “The safest road to hell,” said Screwtape, “is the gradual one–the gentle slope, soft underfoot, without sudden turnings, without milestones, without signposts.”