Death By Minutia

So many things that we modern people add to our lives are utterly trivial. This is a spiritual AND political problem.

There is darkness without, and when I die there will be darkness within. There is no splendour, no vastness anywhere, only triviality for a moment, and then nothing. 

This is bleak stuff from the philosopher Bertrand Russell, who, as an atheist, rejected any transcendent meaning to life or death. The best a sentient being can hope for, Russell argued, was “triviality for a moment.” Had Professor Russell lived to see the age of cable news and social media, he probably would have been even more convinced of this. If you’re looking for a powerful argument for this kind of gloomy nihilism, you could do worse than the amount of triviality that drives our cultural consciousness. How difficult is it to hold forth that life is not meaningless when so much of what we give our attention to is?

Trivialities shape the modern, Western soul. Our weeks and years are busier than ever, and yet many report deep dissatisfaction and disillusionment. Technology has streamlined our work and curated our relationships, engineering existence for maximum efficiency, while depression, anxiety, and loneliness seem to be the most reliable fruits. Why is this? At least partially it is because a lopsided share of what we moderns add to our lives does not matter. It produces exhaustion but not meaning. Even many of the things that trigger outrage and righteous indignation are utterly insignificant. Politically, psychologically, and even spiritually, minutia is killing us.

Consider a pair of helpful illustrations from the recent news cycle. The New York Times hired a technology writer named Sarah Jeong for its editorial page. Not long afterwards, several Twitter users, including many conservative journalists, unearthed a number of Jeong’s old Tweets in which she quite plainly expressed contempt and dislike for white people, especially white men. Almost faster than you could read all the screenshots, a small library of thinkpieces was published from both ideological sides of the American blogosphere. Publications on the Left like Vox and The New Republic defended Jeong, casting her Tweets as the misrepresented targets of a racist, right-wing smear campaign. On the other hand, others wrote that Jeong’s Tweets were clearly racist and that the Left’s defense of her hire by the Times was gross hypocrisy from the social justice movement.

This type of thing is almost totally irresistible to people like me, who invest time and energy in the online world of ideas. I got sucked in. I knew it was dumb, meaningless, and a waste of time, but the neural reward patterns were too much to overcome. I found myself reading thinkpieces that enraged me, scanning Twitter accounts for something to either vindicate my opinions or further anger me, and imagining all the various evils that this episode revealed about my ideological opposites. It was a thrilling exercise. I felt alive and in the know, already planning to write something that would get ahead of the private conversations I just knew my friends must be having about this Trending Topic. I went to bed full of righteous invective, eager to meet the next morning with my weapon: my “take.”

I woke up the next morning embarrassed and frustrated that I had wasted the evening. Sarah Jeong has no influence in my life, wherever she works. I had no idea who she was until I suddenly had strong opinions about her (and if I’m being honest, I didn’t really know anything about her even afterwards). An evening’s worth of attention and angst had been spilled over some journalist’s handful of old Tweets. I had absolutely nothing to show for my absorption, except for another ride on social media’s outrage-go-round. Worst of all, I knew I had deepened my dependence on outrage to get me thinking. Awful.

Mine is a common experience. Twitter thrives on addicting its users to triviality. Its engineers and programmers know, and in some cases admit, that the platform relies on negative emotion to drive up clicks. Stories like Sarah Jeong’s are an analytics dashboard’s dream come true: a polarizing trending topic that whips up strong tribal emotions but offers little offline substance. The drama is wholly contained within the frenetic subculture of social media and blogs. Sermonizing and demonizing are fine even if nobody is talking about the issue this time next week, because the point is not meaningful discourse but per-click ad revenue. Everybody wins, except your brain.

Of course, not everything that trends on social media is trivial. Twitter at its most useful is a hub of informed conversation that offers an invaluable view into the people and places that make up the news. Consider the recent revelations of widespread abuse cover-up in the Catholic dioceses of Pennsylvania. While the bare legal facts are available in any traditional media outlet, reading the comments, prayers, and (yes) arguments of Catholics who are reckoning with these horrors gives me an insight into how real people are thinking about and responding to these stories, not to mention a fresh empathy and even a sense of Christian burden-sharing. That’s far beyond the capability of any journalistic institution.

But in order for this positive effect to be monetized, it has to be inextricably dependent on minutia. My Twitter feed must, by industrial necessity, offer me three doses of triviality for every one dose of significance. Even if I’m zeroed in on following the conversation and developments around the sex abuse scandals, Kanye West’s politics, the latest protest at Starbucks, and the inchoate rants of some Reddit men’s rights activist (along with the equally inchoate “clapbacks” to the same) are all pushed in my face. Truly meaningful words are buried like fossils in the sediment of minutia. This is the way Silicon Valley wants it, because it’s minutia, not meaning, that cheaply and efficiently captivates my attention.

A prime example of how meaning and minutia are purposefully conflated, to the benefit of tech like Twitter, is Donald Trump’s recent insult of basketball superstar LeBron James and journalist Don Lemon. The President of the United States denigrated the intelligence of both James and Lemon before saying “I like Mike” (millennials: that’s Michael Jordan). Soon enough all those hot takes on journalism and racism swapped out “Jeong” and “New York Times” for “Trump” and “LeBron James.” The most pressing question for America became what Trump “really” meant.

Whether the President of the United States said something racist is a legitimate question. But does this tweet really impart any new knowledge, shed any unseen light, or help us further clarify the stakes of our current political moment? I doubt it. Yet judging by Twitter, you would think this was the most important event since the election. Outrage has a way of creating the illusion of significance, and Trump understands this better than many of his opponents. As Ezra Klein notes, Trump is president in part because his team learned how to take advantage of the self-interested dysfunctions of the American media. Were we as a culture not so energized by meaningless nonsense, we wouldn’t need to care what a New York real estate baron thinks about an athlete. Now we are forced to care, a just punishment for our misplaced care then.

Social media is not the first technology to weaponize trivia. Neil Postman eviscerated television’s effect on Americans’ ability to process information in his 1985 book Amusing Ourselves to Death, and his critique has been both applied to social media and cited as an example of how every generation has its Luddites. But social media, especially Twitter, is different from television in important ways. It is more mobile, more personal, and its neural rewards are more alluring. Postman warned that TV makes us empty-headed and passive. But at its worst, Twitter can make us empty-headed and passive while we think we are actually being smart and courageous. Trivialities are dangerous to the degree that we cannot see them for what they are. In our age, it’s not the silly vacuity of TV that gets pride of place in our cultural imagination, but the silly vacuity of hashtags and screenshots. Television is just television. Twitter is resistance.

Confusing minutia for meaning is a surefire path toward mental and emotional burnout at best, and an existential transformation into the very things we despise at worst. Fortunately, there are off-ramps. The best way to fight this burnout is to unplug and log off, redirecting your best energies away from the ephemera of online controversies and toward analog life. Because of the neurological boost social media offers, being conscious of its effects is the first, hardest, and most important step toward resisting them. These intentional acts are likely to arouse a sense of condemnation, either from ourselves or others, for not being as “in the know” as we once felt compelled to be. But this is precisely the social media illusion: that being “in the know” about petty, trivial, insignificant trends and conversations is no different from being in the know about anything else. All it takes is a few days away from the black hole of Twitter controversies to recalibrate the mind and realize just how small and unreal they are.

This isn’t just therapeutic, either. Small, organic self-government depends on the capability of citizens to know what’s happening right in front of them. Being smothered by minutia—especially minutia that privileges the comings and goings of remote, celebrity personalities—is a good way to miss the issues and debates that really matter. Your day on Twitter is far more likely to give you a comprehensive education about an over-the-top student protest at a college you’ve only heard about once in your life than about the people and issues in your county school board. For millions of Americans coming of voting age right now, the age of distraction is the only one they know. Minutia overload is normal, maybe even desirable. Reversing this trend is integral to stopping the dangerous political and cultural tendency to conceptualize “America” as a handful of economically vogue cities and a smattering of famous rich people. How different would our own national politics be, how different would the White House be, if we weren’t so enamored with glitzy meaninglessness?

Our spirits always eventually mirror what we behold. Putting outrage-ridden triviality in front of our faces throughout the week, throughout the month, and throughout the year is not a neutral hobby. It’s a spiritual practice that makes us less able to feel the beauty of transcendent realities and less willing to make the effort to do so. If Bertrand Russell was right that existence is only “triviality for a moment, and then nothing,” let us eat, tweet, and be merry, for tomorrow we and all the people we dislike die. If he was wrong, and more specifically, if all of human history is actually heading to a particular place and a particular Person in the light of whose glory and grace the trivial things of earth will grow strangely dim, then we’ve got a lot of work to do.


The Politics of Distraction

I think Ross Douthat is exactly right about the need for some kind of positive, strategic response to the smartphone age. “Compulsions are rarely harmless,” he writes, and therein lies the key point: Digital addiction is real, and its long term consequences, though mysterious now, will not be something to receive with gladness. Some may scoff at Douthat’s idea of a “digital temperance movement,” but scoff at your own peril. If hyperconnectivity and omni-distraction are indeed what we think they are, the cultural harvest from a digitally addicted age will stun.

In any event, now is certainly no time to be underestimating the long-term shaping effects of technology. Consider how incredibly prescient Neil Postman’s Amusing Ourselves to Death seems in a post-2016-election era. Is there any doubt that television’s impact on the public square, especially its reliance on trivialization and celebrity, played a key role last year? If you were to close your eyes and imagine a United States without cable news as it exists right now, does it get easier or harder to mentally recreate the last few years of American politics? Postman warned in Amusing that television represented a watershed in mass epistemology. In other words, television changed not just how people received information, but how they processed it, and consequently, how they responded to it. Our political culture is a TV political culture, and 2016 was irrepressible proof of that.

You don’t have to venture far from this line of thinking to see why the digital age represents similar dangers. As Douthat mentions, the soft, inviting blue glow of impersonal personality and our Pavlovian responses to “Likes” and “Retweets” are enough of a rabbit hole themselves. But consider still the effect of the digital age on information. The online information economy is overwhelmingly clickbait: “content” custom designed by algorithms to get traffic and give as little as possible in return. Even more serious news and opinion writing, when subjected to the economic demands of the internet, often relies on misleading, hyperbolic, or reactionary forms of discourse.

In the digital age, the competition is not so much for people’s patronage as for their attention, and screams and alarms always get attention. This trend isn’t just annoying for readers and exasperating for writers. It represents a fundamental challenge to the discipline of thinking, and to the moral obligation to believe and speak true things. Postman warned that using lights and flashes to blend facts with entertainment would shape culture’s expectations of truth itself. When what is interesting/fun/sexy/cool/outrageous becomes indistinguishable, visually, from what is true, then what is true becomes whatever is interesting/fun/sexy/cool/outrageous. If this is true for television, it is exponentially more true for the smartphone, a pocket-sized TV with infinite channels.

Who can foresee the politics of a distracted age? What kind of power will conspiracy theorists who master the art of going viral wield in years to come? What kind of political ruling class will we end up with when a generation of would-be leaders have been Twitter-shamed out of their careers? It’s hard to say.

Can we reverse these trends? I do like much of what Douthat prescribes as antidote. But the fact is that the internet, social media, and the smartphone are not merely trendy fads. They are part of an emerging technological transformation. Facebook will wither and Twitter will fade, but the “age of ephemera” will stand. Resisting it will likely depend much more on what people value than what they fear. Loneliness, for example, is endemic in the social media generation. Does the healing of lonely souls with real physical presence disarm an important motivator in online addiction? That’s a question that every parent, and every church, should be asking right now. And of course, individuals fed up with the noise of pixels will trade in their smartphones and delete their accounts.

For those who really want to resist the age of distraction, there will be ways to do so. The hardest challenge will be for those who kinda want to resist but also want to be plugged in. These are the folks to whom the smartphone is most cruel. And perhaps the best advice that can be given for those of us in this camp is: Deactivate every now and again, go to church, walk outside frequently, and read at least one physical book per month. A distracted age is a loud age. Thankfully, the universe is, once you’re able to really listen to it, pretty quiet.

The Phone and His Boy

Reflections on one of the most important essays you’ll read this year.

Andrew Sullivan’s latest essay in New York Magazine is one of the essential pieces of reading I’ve come across so far this year. Partly, I suppose, because it is the essay that I’ve been trying and failing to write for the past year. The title according to the URL slug of the article is “How Technology Almost Killed Me,” and the headline chosen by the magazine to appear in social media shares is “My Distraction Sickness–And Yours.” But the headline I personally love is the one that appears directly on the page:

“I Used To Be a Human Being.”

This is the essence of Sullivan’s essay. What if our endlessly connected lives, empowered by mobile technology and sustained by an ever-demanding social media age, are actually making us less like the people we are created to be?

As Sullivan reminds us, he spent more than a decade professionally enmeshed in the online world. At its height, Andrew’s blog was updated at least a dozen times per day, often with nothing more than links and summaries of what he and his team found around the web. It was a lucrative business, but it came at a cost. Sullivan’s physical, mental, and emotional health eventually spiraled downward, culminating in his announcement two years ago that he was leaving the blogosphere for good.

All that to say: When a man whose online presence has earned him money and reputation tells you that digital addiction is a major threat, you should probably listen.

Here’s an excerpt, but I cannot urge you enough to read the entire piece:

…as I had discovered in my blogging years, the family that is eating together while simultaneously on their phones is not actually together. They are, in Turkle’s formulation, “alone together.” You are where your attention is. If you’re watching a football game with your son while also texting a friend, you’re not fully with your child — and he knows it. Truly being with another person means being experientially with them, picking up countless tiny signals from the eyes and voice and body language and context, and reacting, often unconsciously, to every nuance. These are our deepest social skills, which have been honed through the aeons. They are what make us distinctively human.

By rapidly substituting virtual reality for reality, we are diminishing the scope of this interaction even as we multiply the number of people with whom we interact. We remove or drastically filter all the information we might get by being with another person. We reduce them to some outlines — a Facebook “friend,” an Instagram photo, a text message — in a controlled and sequestered world that exists largely free of the sudden eruptions or encumbrances of actual human interaction. We become each other’s “contacts,” efficient shadows of ourselves.

And what a constant diet of “shadows” does is spread our emotions and attention so thin over our lives that we lose the ability to connect deeply with the biggest moments, the most fundamental truths, and the most important relationships. Everything becomes digitized so that life itself is defined down. We are never fully here because we are never fully anywhere; our thoughts are continually spliced up between the earth and the ether.

I’ve seen this play out in my own life. My iPhone offers the security and comfort of never having a bored moment. Twitter means I’m never more than 140 characters away from letting peers know I still matter (virtue-signaling, anyone?). The constant, agonizing pull to grab my phone in any moment of stillness or quietude is a daily experience. The temptation to keep checking notifications or blog stats, sometimes doing nothing more than refreshing the page or switching between tabs for an hour, is a daily experience.

And I’ve felt the consequences: Reading is harder for me because I can only go a few pages without needing something newly stimulating, and writing is even worse. I’ve found it more difficult than ever to meditate on Scripture for more than a couple of minutes, or to immerse myself in focused prayer. Several times over the past year I’ve come home and told Emily that, despite my “output,” I still feel like the day has been wasted–or rather, that the day has evaporated like steam while my back was turned for a few minutes.

Should I dismiss this struggle as an unavoidable feature of life in the information economy? Should I chalk up my hitting the wall in prayer and meditation to a lack of spiritual delight? It’s possible, of course. But I don’t think so. I think it’s more likely that while many evangelicals have been running around proclaiming that technology is morally neutral–“it’s just how you use it”–the “neutral” technology has been shaping me and many others in ways that make it harder to pursue faithfulness.

One last thought: I’ve been seeing many people respond to Sullivan’s essay with frustration that he doesn’t seem aware of how closely tied many people’s jobs are with online connectivity. Some have criticized the piece for idealizing a sort of seamless transition from online life to disconnected solitude, when an increasing number of people in Western culture pay their bills through jobs centered around the internet.

As someone who has one of those jobs, I don’t have a lot of sympathy for this critique. It’s true that many people have careers that wouldn’t tolerate a total retreat into online monkishness. I haven’t the foggiest idea how that truth is somehow incompatible with Sullivan’s warning. For every person who is online 24/7 to support themselves or their families, there are at least 50 others who are online that much and have no idea why. If you feel like you can’t make a dent in your online life without endangering yourself or loved ones, God has grace for your situation. If, on the other hand, you feel like you can’t make a dent in your online life without exposing yourself to the frictions and foibles of flesh-and-blood reality, let me encourage you: I think it’s worth it.