Why Facebook Won’t Just Go Away

Comparing your first profile picture to your current one is what Facebook does best.

Over the past several days I’ve seen many of my social media friends participate in what looks like a viral experiment: Post your first ever “profile picture,” no matter how old, alongside your current photo. The results are nostalgic and charming and quite fun. It’s warming to see faces, transformed (if even slightly) by time, amidst the political screeds and clickbait links. It’s a homey and encouraging way to experience social media.

It occurred to me that this is why Facebook won’t just go away, no matter how many sins it commits against privacy, our cognitive health, or politics. The one thing Facebook holds over us all is the very thing it’s good at: Keeping. Facebook has become a public repository of memory, a monument by which many of us can view and re-experience our past. Facebook keeps, and in keeping, it holds for users what many of us are too embarrassed to admit out loud that we want to keep: Memories, even of the mundane and routine.

There are, of course, other ways to build repositories of memory. But many of them have fallen out of fashion. Scrapbooking has lost to Instagram. Keeping a diary depends a lot on the desire and ability to write longhand, and few have either. Technological change has tethered the ability to capture life to the obligation to share and store it digitally. Outside of the social media platforms, how much physical record of their own past do most people really own? For millions, the only meaningful artifacts of their lives are on Facebook.

Almost everything Facebook does nowadays it does poorly. It is ad-infested, link-biased, creepily intelligent, and ugly to look at. It does, however, hold onto our posts, our photos, our statuses—our digital selves. Because of that, it holds onto a part of us that we know, trembling, can disappear forever with one emptying of a virtual trash bin. We signed up for Facebook because we thought it opened up our present and defined our future. Now that future is past, and we just want to go back, and can’t. And Facebook knows it.

The Cross vs Nostalgia

We call it “Good Friday” now. But no Christian should want to relive that day.

On the day we call Good Friday, Jesus’ disciples were foolish. They volunteered to die with him, demonstrating that they understood neither his mission nor their own hearts. Hours after pledging their lives, they fled at the sight of Roman soldiers. The best friends Jesus had, the men who had spent three years by his side, stood back, denying they even knew him, as he was given over to an illegal trial for his life.

Witness after witness told lie after lie, so many lies that the Scriptures tell us the testimonies didn’t even agree. Jesus was beaten, mocked, and pronounced guilty on the authority of actors, and nobody intervened. Nothing happened to stop the farce and restore Jesus’ dignity. No one stood up to corruption and injustice. He was alone.

He was alone as Pilate cowered before the crowd and ordered him flogged until his flesh hung off his bones like wet parchment. He was alone as Pilate cowered again and ordered his crucifixion, declining to announce what crime was being punished. He was alone as the weight of a wooden cross smote him into the earth. He was alone as the nails were driven with ruthless efficiency. A man who had raised little girls up from the dead was stripped naked so that federal agents could play dice with his clothing. Nothing happened, no one stopped it.

Why do we call this day “good”? This is the kind of day that we learn about in history textbooks, with black and white photos of burned bodies stacked on top of each other. This is the kind of day where we watch Planned Parenthood surgeons sift through a petri dish of humanity, looking for the most valuable of the remains. This is the kind of day where armed guards open fire on peaceful protesters, or sic dogs on children. There’s nothing remotely sentimental about the cruel injustices of “Good Friday.” So why do we call it good?

We call it good because tradition and nostalgia aren’t synonyms. The past—the realities of the faith once for all delivered to the saints—is our life. Without Calvary there is no church, there is no heaven, and there is no hope. Christians don’t believe in the idea of the atonement—they believe in the history of it. Jesus really did die on a cross, in a real part of the Middle East, surrounded by real people who really did shout for a healer and a teacher to be murdered by a government they professed to hate. This isn’t just theology. It’s history. It’s our history, our tradition, and our hope.

It’s not, however, our nostalgia. Tradition is about receiving from the past; nostalgia is about disfiguring it. Nostalgia is our cultural mood right now because it affords the comfort of the past while asking nothing of us. Nostalgia is superficial in essence but can be tyrannically earnest; we can try to reinvent our entire lives in the image of that which reminds us that we were once young. But for the age of nostalgia, hope is to be found in the here and now. We must be nostalgic so that we can be comforted by the past without being taught by it.

We dare not be nostalgic about Easter. Only the foolish would sentimentalize the flogging, the walk to Golgotha, and the naked, shredded flesh. To make the Passion an object of our nostalgia—to see in it only the value of our grandfather’s generation, the benefit of a “Christian nation”—is to spit upon the cross itself. It is said that in the United States there are millions of “Easter and Christmas” churchgoers, those who make time in their secular existence for two hours of hymnody a year. Oh, if only these Americans could see in their holidays the blood and the gore and the evil! If only they could see the gospel in its visceral reality, and not in its Thomas Kinkadian counterfeit.

If they could, if we could, we would not look at Good Friday with nostalgia. But we would look at it, and, if God is merciful, we might never look away.

Brett McCracken (and Me) on Movies, Nostalgia, and Criticism

For the last few weeks I’ve been chatting via email with pastor, writer, and Christian film critic Brett McCracken. Brett is one of the most articulate and consistently helpful evangelical culture writers I know. I was eager to get his perspective on a variety of movie-related topics: the state of the industry, Christian approaches to film, the importance of critics, and so on.

The conversation will continue beyond this post, but I asked Brett if I could share some of our thoughts already. He graciously agreed.

____________

Samuel: Before we talk about issues related to Christians and movies, I’d love to get your perspective on the film industry as a whole right now. I think a lot of people thought this year’s crop of Oscar nominees was a strong one, so in one sense there’s good reason to be excited about what Hollywood is doing. But in another sense, 2016 was, like previous years, a huge year for reboots, remakes, and sequels. I’m not sure what you make of that trend.

Personally, I’ve not been shy about criticizing what I feel is a dearth of creative thinking and originality from the studios. It seems to me that this drought of fresh ideas may not be unprecedented, but it does feel quite entrenched right now. As long as superhero franchises top the box office, we’re going to get more of the same (how many different Spider-Man actors can we cram into one millennium?).

Is that the impression you have, or am I missing something?

Brett: I think your reading of the industry is correct. It seems like studios are indeed mostly interested in reboots, remakes and sequels, which is to say: proven formulas and guaranteed global moneymakers. One of the key words in that last sentence is global. As the theatrical, old-school movie experience in North America declines in popularity, in other parts of the world it has been growing. Thus, Hollywood has in the last decade or so really concentrated on overseas markets. The thing about movies that play well all over the world is that they have to be able to click with audiences across various cultural divides. This means more subtle, character-driven stories (as well as comedy, which doesn’t translate well across cultures) have less global appeal, while (you guessed it) familiar franchise films and big budget, CGI-heavy action films draw audiences everywhere.

I also think there is an interesting dynamic going on in the larger culture right now, which is a sort of obsession with nostalgia and an inability to truly innovate. You see this in politics, with both parties holding on to old goals, defined more by nostalgia for the past (e.g. “Make America Great Again”) than by vision for the future (the inability of the GOP to unite around a healthcare vision is a case in point). Yuval Levin’s The Fractured Republic is a great book for understanding the “politics of nostalgia.”

The same thing seems to be happening in film and television. Everything is sequel/spinoff/reboot/franchise/familiar. A new Twin Peaks. Fuller House. Girl Meets World. Another Fast and the Furious movie. Another Spider-Man. Another Indiana Jones. New Star Wars and Avengers and Justice League films planned until 2025 and beyond (or so it seems). Live action versions of beloved Disney animated films. Even the “original” things are driven by nostalgia. Stranger Things is soaked in 80s/Spielbergian nostalgia. La La Land is an exercise in nostalgia for a classic Hollywood genre. When news breaks that The Matrix is being rebooted, you know things are bad.

I think in one sense, nostalgia is a proven source of commercial success at a time when the industry is jittery because the whole model is changing (streaming, etc.). On the other hand, there is a cultural anxiety at play wherein the present is simply too fragmented, too overwhelming, too unknowable (this is the “post-truth” / #alternativefacts era, after all) to inspire contemporary narratives that resonate with mass audiences. And so Hollywood goes to the future (sci-fi, apocalypse, dystopia) or to the past, but doesn’t know what to do with the present. I recommend Douglas Rushkoff’s book Present Shock for understanding this phenomenon.

None of this is to say there are no places where innovation is happening. There are. Especially in new formats on streaming sites like Netflix. It will be interesting to see whether these new formats inject new life and originality into Hollywood’s storytelling rut.

____________

Samuel: Your point about nostalgia and current cultural anxiety over the present is very interesting. I suppose, if you were OK with the “post-9/11” cliché, you could try to understand the box office since 2001 as representing a cultural thirst for morality and heroism. 2001-2002 seems to be a watershed time frame, too; Lord of the Rings and the first Harry Potter film both debuted in 2001 and immediately made fantasy the go-to genre, and then in 2002 Sam Raimi’s Spider-Man really re-energized the market for superhero films. But I think it’s just as plausible to see it, as you said, as a response to an increasingly fragmented, metanarrative-less public square.

Every time I talk about this I remember A.O. Scott’s essay about the death of adulthood in pop culture. His argument has been very compelling to me and, in my opinion, helps make sense of a lot of what we see from Hollywood right now. Especially helpful was his description of this movie generation as the “unassailable ascendancy of the fan,” meaning that audiences are essentially immune to film criticism because they have franchises rather than stories, and with those franchises comes a sense of belonging, the belonging of fandom. Do you think that as movies become more openly nostalgic, formulaic, and franchise-driven, the task of the movie critic becomes harder, or even irrelevant? Should critics just embrace the reboot era and judge movies on how well they resuscitate the original material, or should there be a point where critics say, “Even if this is a well-produced retread, it’s a retread, and as such its value as art is inherently handicapped”?

Brett: I think the task of the critic today is different than it was in previous eras, but no less crucial. It’s true that some franchise and tentpole films are “critic-proof,” but the rising popularity of sites like Rotten Tomatoes indicates that audiences are at least somewhat interested in critics’ opinions, even if a correlation between critical consensus and a film’s box office success is debatable.

From my perspective, the importance of the critic today is not about a “thumbs up or down” endorsement as much as it is about adding value and depth to the experience of film/TV, at a time when the overwhelming speed and glut of media leaves us with little time and few tools for processing what we’ve seen. Whether or not audiences are interested in taking time to process and understand something, rather than quickly moving on to the next thing, is an open question. I know that for myself, after I see a complex film, I love reading my favorite critics as a way of extending the filmgoing experience by processing it communally. The communal aspect of film/TV is being lost as society becomes further atomized and isolated, with no two media feeds looking the same. Fan communities fill some of this communal void, but reading critics is another way we can make an otherwise solitary experience something that connects us with others.

As to the question of how critics should approach films in the reboot/franchise era, I think the task should be less about fidelity to the franchise and the “did they get it right?” question, and more about evaluating a film on its own terms. A film’s faithfulness to the “world” of the franchise is a concern for fandom. A film’s insights into the world we live in today are the concern for critics. What does a film or TV show, however nostalgic it may be for the past, say about who we are as a society today? This is a question I believe critics should be asking.

There is plenty of room for innovation and beauty within the constraints of franchise (see Nolan’s The Dark Knight, LOTR, some of the Harry Potter films, and so on), and some might argue that the limits of genre/franchise/adaptation actually can sometimes spark the most innovation. The fact that “there is nothing new under the sun” need not be a downer for critics and audiences. Great narratives, great characters and themes endure because they are great. The best artists can mine these sources in ways that are fresh and timely for today’s world.
____________

Samuel: I agree with you about the importance of good criticism. As I’m sure you have, I’ve known a lot of Christians who sort of thumbed their noses at professional criticism. I’ve been the “negative” guy in the group leaving the theater plenty of times. I think the perception many people have (and I would say this is definitely more true in conservative culture) is that talking honestly about a movie’s flaws and strengths is a kind of elitism that exaggerates the importance of movies. “It’s just a movie,” “Just enjoy it,” etc., etc.

Right now I’m reading Tom Nichols’ book “The Death of Expertise,” whose main argument is that we are in a cultural moment in America marked not only by widespread ignorance (which is not really unique to our time) but by active hostility toward knowledge and credentials (which, Nichols argues, IS unique). As someone who has watched more movies than many, probably most, other Christians, and has studied and practiced the art of criticism and analysis, how do you exercise a kind of authority in your criticism without pretense or arrogance? If someone were to approach you and say that Terrence Malick, for example, was an untalented director who made gibberish, what would your approach be to correcting this idea, or would you? Can you argue taste, after all?

(This is an important question to me because it gets at the heart of something I believe strongly about Christian culture: that we’ve failed to produce good art largely because our idea of “good” has been defined down to mean “family-friendly” and “inoffensive.”)

Brett: I think taste is, in large part, learned. It’s why people in different cultures and contexts have taste for certain types of foods and have different conceptions of beauty. They’ve been nurtured in a certain environment where they’ve developed those tastes. So when someone doesn’t share our taste exactly, we shouldn’t begrudge them for it. But I do think it’s natural and good for us to not simply shrug and say “to each their own!” but to dialogue and try to help others see what we see in something. Lewis talks about how our desire to share something we enjoy with others is not superfluous but rather integral to our enjoyment of it: “We delight to praise what we enjoy because the praise not merely expresses but completes the enjoyment; it is its appointed consummation.”

And so if someone were to say to me that Terrence Malick is untalented and makes gibberish, I could not just say “well, we see differently.” Part of my enjoyment of Malick (an integral part) is being able to share with others what I’ve discovered in his work. And so I’d do my best not to get angry or worked up about the other person’s comments, but to share with them why I think Malick is brilliant and his films are important. This is the nature of criticism. A good critic writes not out of a place of spite but a place of love. My enjoyment of Malick’s films doesn’t stop if others struggle with them. But if I can help others through their struggles and help them appreciate Malick more, that only adds to my enjoyment.

Another thing I would say is that for film critics, or any expert on anything, it’s important to show and not just tell that something matters. What I mean is, rather than simply pronouncing “x is good,” an expert’s criticism or description of x should prove its goodness by being so beautiful and interesting and enlightening in its own right that readers can’t help but see x as something of value. The way Paul Schrader wrote about Ozu films made me learn to love Ozu. The way my Wheaton College professor Roger Lundin so passionately talked about Emily Dickinson made me learn to love Dickinson. The way the food critics on Chef’s Table talk about the importance of certain chefs makes me want to try their restaurants. The way my college roommate passionately loved coffee helped me develop a more refined taste for it.

It’s not just what critics or experts say but how they say it that matters. So much of what we learn as humans comes from observation of others, from models and mimesis. Good critics and experts (and teachers of any sort) model ways of thinking about and enjoying things well. And we need to value those models now more than ever, in a society where it is easier than ever to consume things poorly, cheaply, excessively. The Nichols book sounds great, and very timely!

I hope that when people observe how much I love Malick, how seriously I take his films, and how deeply I engage them, they not only gain an appreciation for Malick but also a desire to love and think deeply about other films, engaging them even when it is challenging. This is what I hope my writing about film for Christian publications has done over the years: modeled a more complex, nuanced, and ultimately more enriching engagement with film, beyond the reductive “good = inoffensive and family-friendly” approach that you rightly note has left us ill-equipped to be good creators.

Review: “Beauty and the Beast” (2017)

When the history of Hollywood’s current creative stagnation is written, we may well regard the new live action version of Disney’s “Beauty and the Beast” as the quintessential movie of the era. It is a remarkably efficient summation of both nostalgia culture’s strengths and its weaknesses. Like a newly illustrated edition of your favorite novel, “Beauty and the Beast” brings color and movement to a classic story, and that’s about it. I found myself enjoying it, and then became convinced afterwards that what I had been enjoying wasn’t the film itself, but the ghost that inhabited it. “Tale as old as time,” indeed.

As with many movies I see nowadays, rehashing the plot is pointless. You either know it or else decided several sentences ago to stop reading this review. Let me say instead that those who love the 1991 film will be satisfied with what they see here. Bill Condon’s version is faithful to the animated movie, almost to the point of doggedness. Entire shots are precisely recreated, and a majority of the dialogue remains unchanged. Whether you think that’s good or bad depends almost entirely on what you want from a film like this. Will seeing exactly what you’ve seen before cause you to cheer? An entire generation of film studio CEOs is banking on it.

But, as I said above, nostalgia culture has its strengths. A film as deeply embedded in our cultural memory as “Beauty and the Beast” is a prime candidate for some delightful interpretation. In this version, much of that delight comes from the casting and the visuals. All of the cast are well chosen (with one crucial exception; more on that in a second), but the great Emma Thompson and Ian McKellen stand above all others. Thompson’s rendition of the film’s title song is a perfect update of Angela Lansbury’s famous performance. McKellen has a lot of fun as the valet-cum-clock Cogsworth, and Ewan McGregor surprised me with his funny, silky (if a little obviously derivative) Lumiere. Visually, the film is breathtaking, as lush and vivid and flawless as probably any live action version of this story will ever be. Everything is in order.

Everything, that is, except for Emma Watson. Watson has been sadly and egregiously miscast as Belle. This isn’t for lack of trying, mind you; Watson is a beautiful, gifted actress and she tries hard here, but she never connects with the material, and the script demands so little of her that her talents never have a chance. The problem, I suspect, is that Watson was chosen for her physical resemblance to the animated Belle, and her role was conceived as a flesh-and-blood stand-in for a character the producers had no intention of reimagining. This is a major disappointment in the one category where the film could least afford to disappoint.

What else can I say? You know what you’re getting here. The point of fast food is that you don’t have to wonder what you’re going to get. It may not be great, but you’ve had it before, and we don’t always have time to take risks. There’s nothing wrong with some occasional fast food filmmaking. But, if the reboot era has you stressed, it’s fine dining I suggest.

Rowling In the Deep

“Fantastic Beasts” may be good entertainment, but it comes at a cost.

I have plans to see Fantastic Beasts and Where to Find Them later today. Before I do, though, I want to reiterate a version of something I’ve said several times before in this space: Regardless of how good Fantastic Beasts is, and how much I enjoy it (which, based on reviews from people I trust, may be quite a lot), I think its existence is, for the most part, a mistake, and something that sincere fans of J.K. Rowling’s work will regret in years to come.

Right now, American pop culture is absolutely trapped in hyper-nostalgia. There are plenty of reasons to be concerned that this isn’t just a fad or a phase. Rather, it looks more like a philosophical shift in how culture makers produce stories, and how we as an audience consume them. As A.O. Scott has written, so much of our film, TV, and literature appeals to childlikeness: not childlike wonder, mind you, but a childlike sense of identity. Critical conversations about meaning and narrative are being thrown aside in what Scott has called the “ascendancy of the fan,” the transformation of mainstream pop culture into a mere collection of constantly rebooted brands: Marvel vs. DC, Star Wars vs. Star Trek, Bourne vs. Bond, and so on, ad infinitum.

I’ve said all this before, and I’m not going to restate my many comments here. But I want to very briefly apply these concerns to Rowling and to the Harry Potter universe. I have two reasons. First, I love the Potter series and have an especial affection and admiration for it. Second, I think what Rowling is doing with her legacy is the most glaring example we have of the danger of the reboot nostalgia culture.

The Harry Potter series (books 1-7) will, I’m convinced, be read widely and with delight centuries from now. A few days ago I drew the wrath of Twitter when I declared that the Potter books were, taken as a whole, better than Lewis’s Narnia series. I stand by that. That’s not a dig at Narnia, either; I just believe that the Potter series is that good, and that its genius will only be more fully appreciated in the years to come.

Part of that genius is in the story’s ending. I won’t spoil it (if you haven’t read the series, I envy the joy you will take in reading it for the first time), but the best way I can put it is that Rowling ended her tale with a beautiful and poetic symmetry that brought her characters a genuinely satisfying closure. At the last turn of the page in Harry Potter and the Deathly Hallows, there is an eschatological joy in seeing good triumph over evil in a final, authoritative way.

What Rowling has done in the years since Deathly Hallows is more than marketing. She has sought to open up her mythology in a way that keeps the story going eternally. This was the point of Pottermore, a website that put users into the wizarding world through interactive content, content written by Rowling herself (as the ads for Pottermore made a point of repeating over and over again). Rowling’s involvement in Pottermore was clearly a pitch to fans that the story hadn’t ended, that the world was still being written, and that by signing up for the service, they could be part of the new stories.

Rowling’s intentions became even clearer with the publication of Harry Potter and the Cursed Child. Officially, the hardback copy sold in Barnes and Noble was simply the published script of a stage play based on the Potter series. *Unofficially* (and, again, in marketing), it was quite obviously the 8th book of the series. I never read the book, but my wife excitedly did. She was extremely disappointed, telling me that the characters of Cursed Child spoke and acted like fan fiction creations, not the heroes of books 1-7. Several reviews I saw echoed this sentiment.

The reviews for Fantastic Beasts have been much more positive, and I fully expect to enjoy it. But the pattern that Rowling has established thus far seems clear. The world of Harry Potter has been reopened, and its mythology has broken out of its original fate and is being written, and rewritten, and written again. It is, for all practical purposes, now a piece of fan fiction.

Fan fiction exists to let fans live inside their favorite stories. But one of the defining marks of all great stories is the way they live inside of us. What I fear is happening to Harry Potter is that a wonderful, beautiful piece of literature is becoming an artifact of our inability to let stories teach us about this world and this life. The lessons we can draw from Harry, Ron, and Hermione are in danger of becoming lost in the constant reinvention of their world. By not letting our favorite stories end, we turn them into tools rather than teachers: objects that authenticate our childlike desire to not let go, to not courageously follow Harry outside the safety and comfort of our magical world and into a dangerous, wild place where we have a job to do.

I want very much for succeeding generations to know the Harry Potter series as a brilliantly told, biblically haunted epic, not as another resource for Dungeons and Dragons devotees. My fear is that even in well-made films and interesting books, Harry’s lessons are lost, and we will be entertained and distracted at the cost of something precious.

Quote of the Day

A plea to J.K. Rowling.

Harry Potter and the Cursed Child is no such work. As countless other fans have pointed out, the writing of the work is mediocre at best—full of clichés and halfhearted character development, with a plot that is absolutely riddled with holes. Many of the original characters (especially Hermione) are not true to their original selves, serving as two-dimensional copycats.

So what does the book do? Well, it keeps the Harry Potter series alive and in the limelight. It serves to inspire new fans to return to the original books. And it definitively makes money—lots of it. But that’s the extent of its virtues.

I caution you, because I think there’s a point at which truly excellent authors know how to say “enough.” Their fans can content themselves with the simplicity and beauty of a finite offering (be it one book or seven). Limiting the scope of a fictional creation enables it to stay mysterious, enchanting, and delightful. Limiting the scope of Harry Potter serves to inspire and foster the imagination of its fans more than coughing up another 20 volumes ever would.

-Gracy Olmstead, in an “open letter” to Harry Potter author J.K. Rowling that doubles as a disappointed review of the published play, “Harry Potter and the Cursed Child.”

Olmstead gets to something important here: Churning out low-quality work, merely for the sake of keeping a franchise in the news, is not just bad for the franchise, it’s bad for the reader. No matter how many superfans will wait in line at Barnes and Noble for your newest offering, there is something in this kind of hyper-nostalgic, never-say-die mentality that robs future generations of the literary richness that comes from having some of the story untold.

“The Force Awakens” and Getting Trapped By Nostalgia

In conversations with friends about the new Star Wars movie, I’ve noticed two trends. The first is that most of the people I’ve talked to report enjoying the movie quite a bit (and that makes sense, seeing as how the film is scoring very well on the critic aggregation site Rotten Tomatoes). The second trend is that virtually no one has criticized The Force Awakens for being too much like the original Star Wars trilogy. Indeed, the opposite seems to be true: Most people who have told me how much they like Episode VII have mentioned its similarity, both in feel and in plot, to George Lucas’s first three Star Wars films as a reason why they like it so much.

For the record, I enjoyed The Force Awakens quite a bit, and J.J. Abrams’ homage to the golden moments of the original films was, I thought, well done. But many of my conversations about it have confirmed to me what I suspected when Episode VII was announced: We’re trapped in a cultural moment of nostalgia, and we can’t get out of it.

Of course, the nostalgia-entrapment begins with the existence of movies like The Force Awakens. As I’ve said before, as much as I love Star Wars, the fact that a 40-year-old franchise still dominates the box office, the news cycle, and our cultural attention is not something to be excited about. There comes a point when tradition becomes stagnation, and at least in American mainstream film culture, it seems like that line was crossed some time ago. Case in point: Included in my screening of Star Wars were trailers for a Harry Potter spinoff, another Captain America film, an inexplicable sequel to Independence Day, and yet *another* X-Men movie. In other words, had an audience member in my theater just awoken from a 12-year coma, they would have seen virtually nothing that they hadn’t seen before.

Nostalgia, if unchecked, runs counter to creativity, freshness, and imagination. Even worse, the dominance of nostalgia in American pop culture has a powerful influence on marketing, making it less likely every year that new storytellers with visions of new worlds, new characters, and new adventures will get the financing they need to exercise their talents. That is a particularly disheartening fact when you consider that the storytellers whose work has spawned a generation’s worth of reboots and sequels were themselves at one point the “unknowns”: George Lucas couldn’t find a studio to finance Star Wars until an executive at 20th Century Fox took a risk on a hunch; Steven Spielberg finished “Jaws” with much of Universal’s leadership wanting to dump both movie and director; and for much of the filming of “The Godfather,” executives at Paramount openly campaigned to fire writer/director Francis Ford Coppola. If formula and nostalgia had been such powerful cultural forces back then, there’s a good chance there’d be no Star Wars to make sequels for at all.

The trap of nostalgia is deceitful. It exaggerates the happiness of the past, then preys on our natural fear that the future will not be like that. But this illusion is easily dismantled, as anyone who has discovered the joys of a new story can attest.

There’s a freedom and a pleasure in letting stories end, in closing the book or rolling the final credits on our beloved tales. The need to resurrect our favorite characters and places through the sequel or the reboot isn’t a need rooted in the deepest imaginative joys. It is good that stories end rather than live on indefinitely: endings let us treasure stories as we ought, and let us lose ourselves in a finite universe rather than blur the line in our minds between the truth in our stories and the truth in our lives. If we cannot allow myths to have definite beginnings and endings, it could be that we are idolatrously looking to them not for truth or grace but for a perpetual youthfulness.

Of course, there are dangers on the other side too. An insatiable craving for the new can be a sign of the weightlessness of our own souls. A disregard for tradition can indicate a ruthless self-centeredness. And, as C.S. Lewis reminded us, novelty is not a virtue and oldness is not a vice.

But we should be careful to distinguish between a healthy regard for those who came before us and a nostalgia that (unwittingly) devalues tradition by ignoring how and why it exists. In the grand scheme of things, how many Star Wars films get made is probably not of paramount importance. But being trapped by nostalgia has its price. An irrational love of the past can signal a crippling fear of the future. Christians are called to lay aside the weight of fear and follow the gospel onward. If we’re not even willing to learn what life is like without a new Star Wars or Harry Potter, how can we do that?