Harvey Weinstein vs Billy Graham

In his weekend op-ed on the lecherous Harvey Weinstein, Ross Douthat makes, in passing, what I think is a very important point:

If liberals want to restrain the ogres in their midst, a few conservative ideas might be helpful.

First: Some modest limits on how men and women interact professionally are useful checks on predation. Many liberals were horrified by the revelation that for a time Mike Pence avoided one-on-one meetings with women not his wife. But one can find the Pence rules too sweeping and still recognize that life is easier for women if their male bosses don’t feel entitled to see them anywhere, anytime. It would not usher in the Republic of Gilead if it were understood that inviting your female subordinate to your hotel room, Weinstein-style, crosses a line in a way that a restaurant lunch does not.

Regular readers of this blog may remember the Pence controversy to which Ross refers. It was over something that has been called in evangelical life “the Billy Graham rule,” named for the evangelist’s self-imposed policy that he would not meet alone with a woman other than his wife. After it came out that Vice President Pence followed a similar practice, the internet exploded with accusations of sexism, Puritanism, and unfeeling obliviousness to the career struggles of women. Some of those accusations, in fact, came from evangelicals, who lamented such “sexualizing” of male-female relationships.

Would the Billy Graham rule have prevented exploitation like that perpetrated by Harvey Weinstein? To answer that, we might do well to hear the words of one of Weinstein’s victims:

In 2014, Mr. Weinstein invited Emily Nestor, who had worked just one day as a temporary employee, to the same hotel and made another offer: If she accepted his sexual advances, he would boost her career, according to accounts she provided to colleagues who sent them to Weinstein Company executives. The following year, once again at the Peninsula, a female assistant said Mr. Weinstein badgered her into giving him a massage while he was naked, leaving her “crying and very distraught,” wrote a colleague, Lauren O’Connor, in a searing memo asserting sexual harassment and other misconduct by their boss.

…Ms. Nestor, a law and business school student, accepted Mr. Weinstein’s breakfast invitation at the Peninsula because she did not want to miss an opportunity, she later told colleagues. After she arrived, he offered to help her career while boasting about a series of famous actresses he claimed to have slept with, according to accounts that colleagues compiled after hearing her story and then sent on to company executives.

Emily Nestor was looking for a business meeting. She found sexual harassment instead. That speaks far, far more to Weinstein’s character than anything or anyone else. But is it really “sexist” to think there’s a problem deep in the equation when a female employee feels professionally obligated to meet a powerful male executive in his hotel room? And does one unnecessarily “sexualize” male-female dynamics by suggesting that powerless females are more vulnerable to powerful males in the absence of tangible, navigable rules?

I understand where BG rule critics are coming from when they regret that male-female relationships in the church are subjected to sexual scrutiny, at the cost of authentic friendship and/or professional respect. That’s a valid concern, and we ought to be sensitive to it. Elders and male professionals don’t get to hide behind “wisdom” or “discernment” as a way of muting input from women or avoiding transparency. Nor should Christians aspire to mincemeat spirituality that is so highly gendered it results in a bifurcated church community (“Men, you do life with men, and women y’all go off and have your Bible study too”).

But the Harvey Weinsteins and Bill O’Reillys of the world are reminders that women in subordinate positions to men often feel pressured into closeness, and that this pressure almost always serves male libido and ego more than it serves women. If women often lack the professional and economic leverage to risk rare or nonexistent access to male leadership (and I think that’s often true), how much less leverage do they have to refuse a meeting or a conversation because the circumstances are uncomfortable?

The weakness of rules is that they don’t always take into account mitigating circumstances and can fail to meet the needs of the moment. But the strength of rules is that you don’t have to impugn someone’s motives in order to enforce them. Rules are there even when the people come and go. Not meeting alone with a member of the opposite sex entails not meeting them alone in a hotel room. Seems reasonable to me–and I have a feeling it would have seemed reasonable to Emily Nestor.

Brett McCracken (and Me) on Movies, Nostalgia, and Criticism

For the last few weeks I’ve been chatting via email with pastor, writer, and Christian film critic Brett McCracken. Brett is one of the most articulate and consistently helpful evangelical culture writers I know. I was eager to get his perspective on a variety of movie-related topics–the state of the industry, Christian approaches to film, the importance of critics, and so on.

The conversation will continue beyond this post, but I asked Brett if I could share some of our thoughts already. He graciously agreed.

____________

Samuel: Before we talk about issues related to Christians and movies, I’d love to get your perspective on the film industry as a whole right now. I think a lot of people thought this year’s crop of Oscar nominees was a strong one, so in one sense there’s good reason to be excited about what Hollywood is doing. But in another sense, 2016 was, a lot like previous years, a huge year for reboots, remakes, and sequels. I’m curious what you make of that trend.

Personally, I’ve not been shy about criticizing what I feel is a dearth of creative thinking and originality from the studios. It seems to me that this drought of fresh ideas may not be unprecedented, but it does feel deeply entrenched right now. As long as superhero franchises top the box office, we’re going to get more of the same (how many different Spider-Man actors can we cram into one millennium?).

Is that the impression you have, or am I missing something?

Brett: I think your reading of the industry is correct. It seems like studios are indeed mostly interested in reboots, remakes and sequels, which is to say: proven formulas and guaranteed global moneymakers. One of the key words in that last sentence is global. As the old-school theatrical movie experience declines in popularity in North America, it has been growing in other parts of the world. Thus, Hollywood has in the last decade or so really concentrated on overseas markets. The thing about movies that play well all over the world is that they have to be able to click with audiences across various cultural divides. This means more subtle, character-driven stories (as well as comedy, which doesn’t translate well across cultures) have less global appeal, while (you guessed it) familiar franchise films and big-budget, CGI-heavy action films draw audiences everywhere.

I also think there is an interesting dynamic going on in the larger culture right now, which is a sort of obsession with nostalgia and an inability to truly innovate new things. You see this in politics, with both parties clinging to old goals and defined more by nostalgia for the past (e.g. “Make America Great Again”) than by vision for the future (the inability of the GOP to unite around a healthcare vision is a case in point). Yuval Levin’s The Fractured Republic is a great book for understanding the “politics of nostalgia.”

The same thing seems to be happening in film and television. Everything is sequel/spinoff/reboot/franchise/familiar. A new Twin Peaks. Fuller House. Girl Meets World. Another Fast and the Furious movie. Another Spider-Man. Another Indiana Jones. New Star Wars and Avengers and Justice League films planned until 2025 and beyond (or so it seems). Live-action versions of beloved Disney animated films. Even the “original” things are driven by nostalgia. Stranger Things is soaked in 80s/Spielbergian nostalgia. La La Land is an exercise in nostalgia for a classic Hollywood genre. When news breaks that The Matrix is being rebooted, you know things are bad.

I think in one sense, nostalgia is a proven source of commercial success at a time when the industry is jittery because the whole model is changing (streaming, etc). On the other hand, there is a cultural anxiety at play wherein the present is simply too fragmented, too overwhelming, too unknowable (this is the “post-truth” / #alternativefacts era after all) to inspire contemporary narratives that resonate with mass audiences. And so Hollywood goes to the future (sci-fi, apocalypse, dystopia) or to the past, but doesn’t know what to do with the present. I recommend Douglas Rushkoff’s book Present Shock for understanding this phenomenon.

None of this is to say there are no places where innovation is happening. There are. Especially in new formats on streaming sites like Netflix. It will be interesting to see whether these new formats inject new life and originality into Hollywood’s storytelling rut.

____________

Samuel: Your point about nostalgia and current cultural anxiety over the present is very interesting. I suppose if you were OK with the “post-9/11” cliche you could try to understand the box office since 2001 as representing a cultural thirst for morality and heroism. 2001-2002 seems to be a watershed time frame, too; Lord of the Rings and the first Harry Potter film both debuted in 2001 and immediately made fantasy the go-to genre, and then in 2002 you had Sam Raimi’s Spider-Man really re-energize the market for superhero films. But I think it’s just as plausible to see it, as you said, as a response to an increasingly fragmented, metanarrative-less public square.

Every time I talk about this I remember A.O. Scott’s essay about the death of adulthood in pop culture. I’ve found his argument very compelling, and, in my opinion, it helps make sense of a lot of what we see from Hollywood right now. Especially helpful was his description of this movie generation as the “unassailable ascendancy of the fan,” meaning that audiences are essentially immune to film criticism because they have franchises rather than stories, and with those franchises comes a sense of belonging, the belonging of fandom. Do you think that as movies become more openly nostalgic, formulaic, and franchise-driven, the task of the movie critic becomes harder or even irrelevant? Should critics just embrace the reboot era and judge movies on how well they resuscitate the original material, or should there be a point where critics say, “Even if this is a well-produced retread, it’s a retread, and as such its value as art is inherently handicapped”?

Brett: I think the task of the critic today is different than it was in previous eras, but no less crucial. It’s true that some franchise and tentpole films are “critic-proof,” but the rising popularity of sites like Rotten Tomatoes indicates that audiences are at least somewhat interested in critics’ opinions, even if a correlation between critical consensus and a film’s box office success is debatable.

From my perspective, the importance of the critic today is not about a “thumbs up or down” endorsement as much as it is about adding value and depth to an experience of film/TV, at a time when the overwhelming speed and glut of media leaves us with little time and few tools for processing what we’ve seen. Whether or not audiences are interested in taking time to process and understand something, rather than quickly moving on to the next thing, is an open question. I know that, for myself, after I see a complex film, I love reading my favorite critics as a way of extending the filmgoing experience by processing it communally. The communal aspect of film/TV is being lost, as society becomes further atomized and isolated, with no two media feeds looking the same. Fan communities fill some of this communal void, but reading critics is another way we can make an otherwise solitary experience something that connects us with others.

As to the question of how critics should approach films in the reboot/franchise era, I think the task should be less about fidelity to the franchise and the “did they get it right?” question, and more about evaluating the film on its own two feet. A film’s faithfulness to the “world” of the franchise is a concern for fandom. A film’s insight into the world we live in today is the concern of critics. What does a film or TV show, however nostalgic it may be for the past, say about who we are as a society today? This is a question I believe critics should be asking.

There is plenty of room for innovation and beauty within the constraints of franchise (see Nolan’s The Dark Knight, LOTR, some of the Harry Potter films, and so on), and some might argue that the limits of genre/franchise/adaptation can sometimes spark the most innovation. The fact that “there is nothing new under the sun” need not be a downer for critics and audiences. Great narratives, great characters and themes endure because they are great. The best artists can mine these sources in ways that are fresh and timely for today’s world.

____________

Samuel: I agree with you about the importance of good criticism. As I’m sure you have, I’ve known a lot of Christians who sort of thumbed their noses at professional criticism. I’ve been the “negative” guy in the group leaving the theater plenty of times. I think the perception many people have (and I would say this is definitely more true in conservative culture) is that talking honestly about a movie’s flaws and strengths is a kind of elitism that exaggerates the importance of movies. “It’s just a movie,” “Just enjoy it,” etc., etc.

Right now I’m reading Tom Nichols’ book “The Death of Expertise,” and the main argument of that book is that we are in a cultural moment in America where there is not only widespread ignorance (which is not really unique to our time) but active hostility toward knowledge and credentials (which, Nichols argues, IS unique). As someone who has watched more movies than many, probably most, other Christians, and has studied and practiced the art of criticism and analysis, how do you exercise a kind of authority in your criticism, without pretense or arrogance? If someone were to approach you and say that Terrence Malick, for example, was an untalented director who made gibberish, what’s your approach to correcting this idea–or do you? Can you argue taste after all?

(This is an important question to me because it gets at the heart of something I believe strongly about Christian culture–that we’ve failed to produce good art largely because our idea of good has been defined down to mean “family-friendly” and “inoffensive.”)

Brett: I think taste is, in large part, learned. It’s why people in different cultures and contexts have taste for certain types of foods and have different conceptions of beauty. They’ve been nurtured in a certain environment where they’ve developed those tastes. So when someone doesn’t share our taste exactly, we shouldn’t fault them for it. But I do think it’s natural and good for us to not simply shrug and say “to each their own!” but to dialogue and try to help others see what we see in something. Lewis talks about how our desire to share something we enjoy with others is not superfluous but rather integral to our enjoyment of it: “We delight to praise what we enjoy because the praise not merely expresses but completes the enjoyment; it is its appointed consummation.”

And so if someone were to say to me that Terrence Malick is untalented and makes gibberish, I could not just say “well, we see differently.” Part of my enjoyment of Malick (an integral part) is being able to share with others what I’ve discovered in his work. And so I’d do my best not to get angry or worked up about the other person’s comments, but to share with them why I think Malick is brilliant and his films are important. This is the nature of criticism. A good critic writes not out of a place of spite but a place of love. My enjoyment of Malick’s films doesn’t stop if others struggle with them. But if I can help others through their struggles and help them appreciate Malick more, that only adds to my enjoyment.

Another thing I would say is that for film critics or any expert on anything, it’s important that you show, and not just tell, that something matters. What I mean is, rather than simply pronouncing “x is good,” an expert’s criticism or description of “x” should prove its goodness by being so beautiful and interesting and enlightening in its own right that readers can’t help but see x as something of value. The way Paul Schrader wrote about Ozu films made me learn to love Ozu. The way my Wheaton College professor Roger Lundin so passionately talked about Emily Dickinson made me learn to love Dickinson. The way the food critics on Chef’s Table talk about the importance of certain chefs makes me desire to try their restaurants. The way my college roommate passionately loved coffee helped me develop a more refined taste for it.

It’s not just what critics or experts say but how they say it that matters. So much of what we learn as humans comes from observation of others, from models and mimesis. Good critics and experts (and teachers of any sort) model ways of thinking about and enjoying things well. And we need to value those models now more than ever, in a society where it is easier than ever to consume things poorly, cheaply, excessively. The Nichols book sounds great, and very timely!

I would hope that when people observe how much I love Malick, how seriously I take his films, and how deeply I engage them, they not only gain an appreciation for Malick but also a desire to love and think deeply about other films, engaging them even when it is challenging. This is what I hope my writing about film for Christian publications has done over the years: modeled a more complex, nuanced, and ultimately more enriching engagement with film beyond the reductive “good = inoffensive and family friendly” approach that you rightly note has left us ill-equipped to be good creators.

Why Do Liberals Love Harry Potter?

My favorite read of the day is this article on understanding why progressives, especially millennials in the Obama to post-Obama era, are so in love with using the Harry Potter stories as metaphors for America’s current cultural moment. The author has an interesting theory, one that I (mostly) agree with: American liberals love Harry Potter because of Hogwarts. To be specific, they love the idea that schools are reliable bastions of legitimate authority.

Excerpt:

High school movies of the 80s were obsessed with the illegitimacy of schools’ authority; Matthew Broderick hacks into his high school’s computer in both Ferris Bueller and Wargames, to make a mockery of the so-called permanent record, and John Hughes’s movies in general are always focused on the improvisatory genius of children and adolescents and the dull brutish obsessions of school personnel…

This is a remarkable contrast with the Harry Potter films, which (partly due to the superfluity of British acting talent available to the various directors) often make Dumbledore and the various Hogwarts teachers far grander and more impressive than the teenage protagonists…

From an outside perspective, Harry Potter is a funny fantasy for liberals to cohere around. Going off to centuries-old boarding school where your mum and dad were Head Boy and Head Girl, where tolerance and broadmindedness consists of admitting that  lower-class Muggles can occasionally have the same genetically-mediated gifts as the gentry, where the greatest possible action for a woman is to let herself be slain so her son can grow up to revenge himself on her killer…all sounds more reactionary than progressive. But if contemporary liberalism is the ideology of imperial academia, funneled through media and non-profits and governmental agencies but responsible ultimately only to itself, the obsession with Harry Potter makes a lot more sense.

This is an interesting take, and I think the author rightly connects the romanticism of Hogwarts to the self-perception of the educated, technocratic progressive class. Hogwarts is attractive to liberals not mainly because they desire the world it depicts, but because they sincerely believe the world it depicts is the one that they (via the university) have created. The contrast the author draws between the cruel, dimwitted authority figures of the 80s high school comedies and the near-saintlike teachers at Hogwarts is perceptive. Cynicism toward established authority was once considered a liberal rejection of conservative social order. Now, reverence toward the academy–and those who work within it–is non-negotiable.

But I have another theory. In John Granger’s indispensable book How Harry Cast His Spell, Granger persuasively demonstrates how Rowling’s Harry Potter novels appropriate the most important narrative traditions of Western history. The seven books tell a unified hero story that deliberately evokes Western mythology (I’m using that word to mean both fairy tales and historical narratives, such as Scripture, that became significant literary touchstones in Western thought). The gospel, the Odyssey, Camelot–these and other myths are the narrative mold around which the Potter stories are formed.

As the author of the blog notes, much about the Harry Potter series seems conservative. Harry Potter is culturally conservative in ways that don’t seem to bother liberal presuppositions. Voldemort and his followers are enemies of diversity–that much is clear. But it’s also true that Hogwarts is not exactly a factory of self-determination. Everyone gets sorted into houses–notably, students can desire a particular house, but they do not determine it–and these houses impose a preexisting shape of life onto the students. This doesn’t seem to upset the modern progressive reader, perhaps because in the course of the story, the students who most stridently do their own thing end up consistently being the biggest heroes. What gets lost in the glorification of the Boy Who Lived is the fact that he lived because of the actions of another (his mother!!), and that his heroic journey is empowered not by self-authentication, but by the wisdom and traditional forms of his mentors (Dumbledore chiefly).

So why do liberals love Harry Potter? I think it may be because Harry Potter is a reminder, however dim, of what a world that ennobles human aspiration without the shadow of the sexual revolution would look like. The American Left is deeply mired in its own self-destructive contradictions. Its aspiration for a truly self-authenticated existence is eviscerated by its insistence on cutting the legs out from under community and tradition. Rowling’s tale is of a world where this tradeoff is unnecessary. What’s true of Hogwarts is true of Harry Potter as a whole: This is a place where people and choices matter, where you really can be a hero–just not alone.

Why We Need the ‘Elitism’ of the Oscars

Mathematically speaking, the odds are that if you A) purchased a ticket to a movie in 2015 and B) watch the upcoming Academy Awards telecast on Sunday night, you C) won’t see your favorite movies from last year win…well, anything. The New York Times observed last year that the Oscars still represent a startlingly large discontinuity between the films honored by the Academy of Motion Picture Arts and Sciences and those honored with the almighty dollar by the American public. Case in point from last year: Whereas nominee American Sniper earned over $300 million domestically but took home only a technical award at the 2015 show, Best Picture winner Birdman grossed less than a tenth of that. Put those facts together and you get a sparsely-watched telecast and Oscar elitism:

“It’s sad, but most people have to finally accept that the Oscars have become, well, elitist and not in step with anything that is actually popular,” said Philip Hallman, a film studies librarian at the University of Michigan. “No one really believes anymore that the films they chose are the ones that are going to last over time…”

It wasn’t supposed to be this way: In 2009, Academy officials increased their field of best picture nominees, from five to a maximum of 10, in a bid to embrace large, world-spanning films — “The Dark Knight,” “Inception” — that are the pinnacle of populist art. The plan was to shift the Oscars back toward relevancy, “a history where most of the winning films were also popular with the audience,” as Mr. Hallman put it on Monday.

That strategy failed, of course, because it was perfunctory. If you see your job, as Academy voters do, as rewarding the year’s very best-made and most artistically compelling films, increasing the number of nominees you *must* have is merely spreading the vegetables around on your plate before ignoring them again. There was never any reason to believe that five slots in the Best Picture category were excluding movies that ought to win; as this article says, the purpose of the change was to tell the American public, “Hey, we’re watching the same movies as you–we promise!”

But is this reassurance even a good thing?

The Oscars are indeed “elitist” and have been for a very long time, if by “elitist” you mean “Consciously choosing to not see the film industry the way most Americans see it.” But such “elitism” is actually the heart of why the Oscars still matter. For the awards to not be elitist in a meaningful way would be for them to become utterly meaningless.

Unlike the Grammys and Emmys, the Academy Awards frequently honor work that isn’t “successful” by popular industry standards. Oscar-winning films can lack both the distribution power and the rich marketing budgets that major pictures–the kind you’re likely to see a huge cardboard display for at your local mall theater–thrive on.

In other words, the Oscars don’t just reward studios with market research teams and lavish PR campaigns. They honor filmmakers and films. Call it elitism if you want, but that is exactly what every industry needs–incentive for innovation that goes beyond corporatism.

That’s not the only good thing about the Academy’s “elitism,” either. A healthy dose of film snobbery is welcome if it even slightly punctures the asphyxiating creative stagnation that characterizes Hollywood right now. For more than a decade now, the American box office has been a practical altar to the franchise, the sequel and the recycled comic book story. It’s worse than you think; since 2002, only two non-franchise, non-sequel movies have topped the yearly box office. The two films? James Cameron’s highly derivative Avatar and Disney’s Frozen, both of which have sequels currently in development. Also since 2002, the Spider-Man, Superman, and Batman franchises have each been rebooted twice, and Pirates of the Caribbean and the intolerable Transformers series have each had *four* installments, all of them major hits (Transformers: Age of Extinction topped the entire box office in 2014 despite scoring a Rob Schneider-like 18% at Rotten Tomatoes). And of course, the box office will now continue to be dominated by the Star Wars franchise, after The Force Awakens obliterated records and proved to the film industry once again the financial wisdom of repackaging twice-told tales.

The American public simply isn’t very good at going to movies right now. New York Times film critic A.O. Scott, in one of 2014’s most important essays, contemplated the infantilizing of both our entertainment and our lifestyles. Scott characterized the current generation of pop culture as the “unassailable ascendancy of the fan,” through which serious (=adult) consideration of meaning and symbolism is replaced with childlike loyalty to never-ending franchises that are essentially live-action cartoons. What’s lost in this phase is a realistic sense of what our world is like, and how to respond to it through art.

Even if you don’t pine for the years of “gritty,” existentially harsh films like Raging Bull and Midnight Cowboy, there’s something to be said for films that don’t need superhero paradigms in order to tell a rich story. This year’s list of Best Picture contenders is a particularly rich slate: human perseverance against nature in The Martian and The Revenant, and the quest for truth and justice in Spotlight and Bridge of Spies. Most Americans would never think to dedicate a Saturday to a film like Brooklyn or Room if it weren’t for a healthy critical culture that highlights great storytelling in a dim commercial context.

The Oscars serve our culture by recognizing stories and storytellers. Film critics provide the public with a small yet often effective antidote to the monotony and meaninglessness of Memorial Day weekend openings. It is good for the everyday, working-class moviegoer to know that there are alternatives to the blockbusters. Call it elitism if you want. It’s the good kind.

“The Force Awakens” and Getting Trapped By Nostalgia

In conversations with friends about the new Star Wars movie, I’ve noticed two trends. The first is that most of the people I’ve talked to report enjoying the movie quite a bit (and that makes sense, seeing as how the film is scoring very well on the critic aggregation site Rotten Tomatoes). The second trend is that virtually no one has criticized The Force Awakens for being too much like the original Star Wars trilogy. Indeed, the opposite seems to be true: Most people who have told me how much they like Episode VII have mentioned its similarity, both in feel and in plot, to George Lucas’s first three Star Wars films as a reason why they like it so much.

For the record, I enjoyed The Force Awakens quite a bit, and J.J. Abrams’ homage to the golden moments of the original films was, I thought, well done. But many of my conversations about it have confirmed to me what I suspected when Episode VII was announced: We’re trapped in a cultural moment of nostalgia, and we can’t get out of it.

Of course, the nostalgia-entrapment begins with the existence of movies like The Force Awakens. As I’ve said before, as much as I love Star Wars, the fact that a 40-year-old franchise is still dominating the box office, news cycle, and cultural attention is not something to be excited about. There comes a point when tradition becomes stagnation, and at least in American mainstream film culture, it seems like that line was crossed some time ago. Case in point: Included in my screening of Star Wars were trailers for a Harry Potter spinoff, another Captain America film, an inexplicable sequel to Independence Day, and yet *another* X-Men movie. In other words, had an audience member in my theater just awoken from a 12-year coma, they would have seen virtually nothing that they hadn’t seen before.

Nostalgia, if unchecked, runs opposed to creativity, freshness, and imagination. Even worse, the dominance of nostalgia in American pop culture has a powerful influence on marketing, making it less likely every year that new storytellers with visions of new worlds, new characters and new adventures will get the financing they need to realize those visions. That is a particularly disheartening fact when you consider that the storytellers whose work has spawned a generation’s worth of reboots and sequels were themselves at one point the “unknowns”: George Lucas couldn’t find a studio to finance Star Wars until an executive at 20th Century Fox took a risk on a hunch; Steven Spielberg finished “Jaws” with much of Universal’s leadership wanting to dump both movie and director; and for much of the filming of “The Godfather,” executives at Paramount openly campaigned to fire writer/director Francis Ford Coppola. If formula and nostalgia had been such powerful cultural forces back then, there’s a good chance there’d be no Star Wars to make sequels for at all.

The trap of nostalgia is deceitful. It exaggerates the happiness of the past, then preys on our natural fear that the future will not be like that. But this illusion is easily dismantled, as anyone who has discovered the joys of a new story can attest.

There’s a freedom and a pleasure in letting stories end, in closing the book or rolling the final credits on our beloved tales. The need to resurrect our favorite characters and places through the sequel or the reboot isn’t a need rooted in the deepest imaginative joys. It is good that stories end rather than live on indefinitely, so that we treasure them as we ought and lose ourselves in a finite universe, rather than blurring the lines in our minds between the truth in our stories and the truth in our lives. If we cannot allow myths to have definite beginnings and endings, it could be that we are idolatrously looking to them not for truth or grace but for a perpetual youthfulness.

Of course, there are dangers on the other side too. An insatiable craving for the new can be a sign of the weightlessness of our own souls. A disregard for tradition can indicate a ruthless self-centeredness. And, as C.S. Lewis reminded us, novelty is not a virtue and oldness is not a vice.

But we should be careful to distinguish between a healthy regard for those who came before us, and a nostalgia that (unwittingly) devalues tradition by ignoring how and why it exists. In the grand scheme of things, how many Star Wars films get made is probably not of paramount importance. But being trapped by nostalgia has its price. An irrational love of the past can signal a crippling fear of the future. Christians are called to lay aside the weight of fear and follow the gospel onward. If we’re not even willing to learn what life is like without a new Star Wars or Harry Potter, how can we do that?