How it works

Ugh. I did it again.

Monday, January 28, 2019. 12:30pm.

Opens Spotify. Sees name of musician whose songs I enjoyed many years ago.

“Oh man, she’s really good. I haven’t listened to her in a long time. I should find some of those gems.”

Searches Spotify for some favorite songs. Starts listening.

“Wow, now I remember how good these songs are. I haven’t seen much of this woman lately, I wonder what she’s up to.”

Goes to official website. Looks around for 5 seconds, then clicks the link to the Twitter profile.

“Let’s see here.”

Sees artist Tweet about Covington Catholic/Nathan Phillips. I don’t agree.

“Oh, gross. She hasn’t even corrected this bad take that she RT’d. Everyone knows by now the perspective she’s offering here is WRONG and UNFAIR. Honestly she’s probably the kind of person who would slander you online and not even apologize later.”

Sees more Tweets, including an RT of another person I admire offering same Wrong Opinion.

“Oh my gosh, these people are infuriating. They’re so smug in their wrongness. Honestly those discernment bloggers are right about these folks.”

Realizes the artist’s song is still playing.

“This song’s not even that good. She’s probably just a liberal activist now. I don’t want to support that.”

Stops song.

It happened again, didn’t it?

Sigh.


Shelter in the Shame Storm

We who grew up with the internet are going to have to reckon with the spiritual powers embedded in the technology we put in our pockets.

Helen Andrews’s essay on online shaming, featured in the forthcoming January issue of First Things, is the kind of piece that can genuinely change readers. It is a stunningly powerful meditation that is simultaneously personal and sweeping. I can’t even choose a passage to excerpt without feeling like I’m under-representing the quality of the writing, so please: if you haven’t read it, stop reading this blog and go read Helen’s essay.

I’ve been trying to figure out why, beyond the exceptional literary beauty on display in the writing, this essay has left such a strong impression on me. Perhaps one reason is that more and more of my thinking and writing has been taken up with trying to understand what technology, especially social media, is doing to me and my generation. I know some friends roll their eyes whenever they read another sentence like that one, but I wonder if they roll their eyes only because they haven’t allowed themselves to really listen to what’s going on—which, ironically, is one of the most aggressive symptoms of the social media contagion. There are probably only two kinds of people whose online habits aren’t at least challenged by phenomena like online shaming: the people who stop reading essays like Helen’s because they don’t want those habits to be challenged, or the people for whom online shaming is not a problem but a bonus. Four years ago I would have said the latter group didn’t exist. Four years and too much time on Twitter later, I know for a fact it does.

This is a point Helen brings up to devastating effect. “The more online shame cycles you observe,” she writes, “the more obvious the pattern becomes: Everyone comes up with a principled-sounding pretext that serves as a barrier against admitting to themselves that, in fact, all they have really done is joined a mob. Once that barrier is erected, all rules of decency go out the window, but the pretext is almost always a lie.” In other words, people Twitter-shame not (ultimately) because they feel duty-bound to, but because they want to, because doing so is pleasurable and brings satisfaction, however fleeting.

Not long ago it was common to hear that the internet doesn’t really “form” us; it simply removes analog inhibitions and frees up the true self. There’s probably some truth there, but all it takes is a little digital presence to quickly realize just how easy it is to become something online that bears little or no resemblance to your life offscreen. Put another way: If the tech is neutral and the only problem is the preexisting moral condition of the users, online mobs should consist only of noxious people going after truly innocent targets. Alas, that’s not what happens.

At some point people like me who grew up with the internet are going to have to reckon with the spiritual powers that are embedded in the technology we put in our pocket. We’re going to have to determine to understand (a dangerous resolution!) how and why it is that the avatar-ization of our thoughts and names creates on-ramps in our hearts for delighting in the suffering of people whose only crime is disagreeing with us, or being friends with somebody who does. Why does mediating our experience of the world through screens push us toward cruelty and resentment? Is it because we’re bored? Because our dopamine receptors are so calloused by notifications and we need a bigger hit? Is it because we are created to feel the very things social media is designed to prevent us from feeling? And after all these questions: Why is it that the fear of losing “connection” or “platform” is so strong that we shrug, pray for our broken world, and then check Instagram again?

I’ll confess to living out my own anathemas. As I was reading through Helen’s piece the first time, I stopped halfway and went immediately to YouTube to look up the fateful clip she describes. It was an eminently forgivable curiosity; how many of us can read an essay about such a moment without wondering where we can access it? So I watched the clip, then resumed Helen’s essay. And then a funny thing happened. I went back to the clip and watched it again, and then another time. Even right after reading about the man who grabs his phone and unwittingly invites Helen’s now-husband to watch a moment of profound humiliation, and wagging my head at such a clueless guy, here I was, basking in someone’s lowest public moment, because I found the “cringeworthiness”… well, what did I find it? Entertaining? Funny? Educational? To be honest, I’m not sure. I don’t know why I watched that video three times. But I did all the same.

Let’s say that YouTube didn’t exist, and that the only way such footage was accessible to me was through an exhaustive combing of C-SPAN files. Would I have made the effort to watch it? Perhaps. Perhaps not. I think the better question is whether, in a world where YouTube didn’t exist, and there wasn’t a multi-million-dollar sub-industry that feasted on attention spans with “content,” there would have even been an extant clip to find. Perhaps one reason I went looking for the clip was that I knew I would find it. Perhaps another reason was that I had never stopped myself from viewing someone’s lowest professional moment before; why stop now? I don’t dislike Helen, and my guess is that we would agree on 98% of important matters. I didn’t relish her embarrassment while reading her testimony. I wasn’t piling on. I just…watched.

I’m not sure where the shelter is from the shame storm. Today it feels as if anybody who has ever written or done anything in public is liable to be ridden out of civilization on a rail (or thread). But I’m hopeful that the same offline existence that can relieve anxiety and heal relationships can also re-calibrate our desires so as not to crave the saltiness of shame. Lord, grant me serenity to accept the Tweets I cannot change, the courage to log off, and the wisdom to know which comes first.

Death By Minutia

So many things that we modern people add to our lives are utterly trivial. This is a spiritual AND political problem.

There is darkness without, and when I die there will be darkness within. There is no splendour, no vastness anywhere, only triviality for a moment, and then nothing. 

This is bleak stuff from the philosopher Bertrand Russell, who, as an atheist, rejected any transcendent meaning to life or death. The best a sentient being can hope for, Russell argued, was “triviality for a moment.” Had Professor Russell lived to see the age of cable news and social media, he probably would have been even more convinced of this. If you’re looking for a powerful argument for this kind of gloomy nihilism, you could do worse than the amount of triviality that drives our cultural consciousness. How difficult is it to hold forth that life is not meaningless when so much of what we give our attention to is?

Trivialities shape the modern, Western soul. Our weeks and years are busier than ever and yet many report deep dissatisfaction and disillusionment. Technology has streamlined our work and curated our relationships, engineering existence for maximum efficiency, while depression, anxiety, and loneliness seem to be the most reliable fruits. Why is this? At least partially it is because a lopsided share of the things that we moderns add to our lives does not matter. They produce exhaustion but not meaning. Even many of the things that trigger outrage and righteous indignation are utterly insignificant. Politically, psychologically, and even spiritually, minutia is killing us.

Consider a pair of helpful illustrations from the recent news cycle. The New York Times hired a technology writer named Sarah Jeong for their editorial page. Not long afterwards, several Twitter users, including many conservative journalists, unearthed a number of Jeong’s old Tweets in which she quite plainly expresses contempt and dislike for white people, especially white men. Almost faster than you could read all the screenshots, a small library of thinkpieces was published on both ideological sides of the American blogosphere. Left publications like Vox and The New Republic defended Jeong, arguing that her Tweets had been misrepresented as part of a racist, right-wing smear campaign. On the other hand, others wrote that Jeong’s Tweets were clearly racist and that the Left’s defense of her hire by the Times was gross hypocrisy from the social justice movement.

This type of thing is almost totally irresistible to people like me, who invest time and energy in the online world of ideas. I got sucked in. I knew it was dumb, meaningless, and a waste of time, but the neural reward patterns were too much to overcome. I found myself reading thinkpieces that enraged me, scanning Twitter accounts for something to either vindicate my opinions or further anger me, and imagining all the various evils that this episode revealed about my ideological opposites. It was a thrilling exercise. I felt alive and in the know, already planning to write something that would head off the conversation among the friends I just knew must be having tons of private conversations about this Trending Topic. I went to bed full of righteous invective and eager to meet the next morning with my weapon: my “take.”

I woke up the next morning embarrassed and frustrated that I had wasted the night. Sarah Jeong has no influence in my life, wherever she works. I had no idea who she was until I suddenly had strong opinions about her (and if I’m being honest, I didn’t really know anything about her even afterwards). An evening’s worth of attention and angst had been spilled over some journalist’s handful of 140-character sentences. I had absolutely nothing to show for my absorption, except for another ride on social media’s outrage-go-round. Worst of all, I knew I had deepened my dependence on outrage to get me thinking. Awful.

Mine is a common experience. Twitter thrives on addicting its users to triviality. Its engineers and programmers know, and in some cases admit, that the platform relies on negative emotion to drive up clicks. Stories like Sarah Jeong’s are an analytics counter’s dream come true: A polarizing trending topic that whips up strong tribal emotions but offers little offline substance. The drama is wholly contained within the frenetic subculture of social media and blogs. Sermonizing and demonizing are fine even if nobody is talking about the issue this time next week, because the point is not meaningful discourse, but per-click ad revenue. Everybody wins, except your brain.

Of course, not everything that trends on social media is trivial. Twitter at its most useful is a hub of informed conversation that offers an invaluable view into the people and places that make up the news. Consider the recent revelations of widespread abuse cover-up in the Catholic dioceses of Pennsylvania. While the bare legal facts are available in any traditional media outlet, reading the comments, prayers, and (yes) arguments of Catholics who are reckoning with these horrors gives me an insight into how real people are thinking about and responding to these stories, not to mention a fresh empathy and even a sense of Christian burden-sharing. That’s far beyond the capability of any journalistic institution.

But in order for this positive effect to be monetized, it has to be inexorably dependent on minutia. My Twitter feed must, by industrial necessity, offer me three doses of triviality for every one dose of significance. Even if I’m zeroed in on following the conversation and developments of the sex abuse scandals, Kanye West’s politics, the latest protest at Starbucks, and the inchoate rants of some Reddit men’s rights activist (along with the equally inchoate “clapbacks” to the same) are all pushed in my face. Truly meaningful words are buried like fossils in the sediment of minutia. This is the way Silicon Valley wants it, because it’s minutia, not meaning, that cheaply and efficiently captivates my attention.

A prime example of how meaning and minutia are purposefully conflated, to the benefit of tech like Twitter, is Donald Trump’s recent insult of basketball superstar LeBron James and journalist Don Lemon. The President of the United States denigrated both James’s and Lemon’s intelligence before saying “I like Mike” (millennials: that’s Michael Jordan). Soon enough all those hot takes on journalism and racism swapped out “Jeong” and “New York Times” for “Trump” and “LeBron James.” The most pressing question for America became what Trump “really” meant.

Whether the President of the United States says something racist is a very legitimate question. But does this tweet really impart any new knowledge, shed any unseen light, or help us further clarify the stakes of our current political moment? I doubt it. Yet judging by Twitter, you would think this was the most important event since the election. Outrage has a way of creating the illusion of significance, and Trump understands this better than many of his opponents. As Ezra Klein notes, Trump is president in part because his team learned how to take advantage of the self-interested dysfunctions of the American media. Were we as a culture not so energized by meaningless nonsense, we wouldn’t need to care what a New York real estate baron thinks about an athlete. Now we are forced to care, a just punishment for our misplaced care then.

Social media is not the first technology to weaponize trivia. Neil Postman eviscerated television’s effect on Americans’ ability to process information in his 1985 book Amusing Ourselves to Death, and his critique has been both applied to social media and cited as an example of how every generation has its Luddites. But social media, especially Twitter, is different from television in important ways. It is more mobile, more personal, and its neural rewards are more alluring. Postman warned that TV makes us empty-headed and passive. But at its worst, Twitter can make us empty-headed and passive while we think we are actually being smart and courageous. Trivialities are dangerous to the degree that we cannot see them for what they are. In our age, it’s not the silly vacuity of TV that gets pride of place in our cultural imagination, but the silly vacuity of hashtags and screenshots. Television is just television. Twitter is resistance.

Confusing minutia for meaning is a surefire path toward mental and emotional burnout at best, and an existential transformation into the very things we despise at worst. Fortunately, there are off-ramps. The best way to fight this burnout is to unplug and log off, redirecting your best energies away from the ephemera of online controversies and toward analog life. Because of the neurological boost social media offers, being conscious of its effects is the first, hardest, and most important step toward resisting them. These intentional acts are likely to arouse a sense of condemnation, either from ourselves or others, for not being as “in the know” as we once felt compelled to be. But this is precisely the social media illusion: that being “in the know” about petty, trivial, insignificant trends and conversations is no different than being in the know about anything else. All it takes is a few days away from the black hole of Twitter controversies to recalibrate the mind and realize just how small and unreal they are.

This isn’t just therapeutic, either. Small, organic self-government depends on the capability of citizens to know what’s happening right in front of them. Being smothered by minutia—especially minutia that privileges the comings and goings of remote, celebrity personalities—is a good way to miss the issues and debates that really matter. Your day on Twitter is far more likely to give you a comprehensive education about an over-the-top student protest at a college you’ve only heard about once in your life than about the people and issues in your county school board. For millions of Americans coming of voting age right now, the age of distraction is the only one they know. Minutia overload is normal, maybe even desirable. Reversing this trend is integral to stopping the dangerous political and cultural tendency to conceptualize “America” as the handful of economically vogue cities and a smattering of famous rich people. How different would our own national politics be, how different would the White House be, if we weren’t so enamored with glitzy meaninglessness?

Our spirits always eventually mirror what we behold. Putting outrage-ridden triviality in front of our faces throughout the week, throughout the month, and throughout the year is not a neutral hobby. It’s a spiritual practice that makes us less able to feel deeply the beauty of transcendent realities and less willing to make the effort to do so. If Bertrand Russell was right about existence’s only being “triviality for a moment, then nothing,” let us eat, tweet, and be merry, for tomorrow we and all the people we dislike die. If he was wrong, and more specifically, if all of human history is actually heading to a particular place and a particular Person in the light of whose glory and grace the trivial things of earth will grow strangely dim, then we’ve got a lot of work to do.

Tweet-less

Since deactivating my Twitter account two weeks ago, the following reflections have come to me:

  1. If there were any question before that my relationship with Twitter was addictive, now that it’s gone, I have zero doubt. It seems to me that sometimes you can’t tell how hooked you are until you’re off the hook.
  2. I had a truly comforting thought a few days ago: Right now, there’s someone on Twitter saying wrong things, and I can’t see it or respond to it. There’s a genuine peace right there.
  3. I’m emailing individuals more, since I can’t Tweet them. It feels warmer and more personal to me, especially to email someone I haven’t met just to tell them I appreciate their work or to send a note of encouragement. I know that technically speaking an email is just as ephemeral as a Tweet, but for some reason, it doesn’t feel that way.
  4. I’ve felt more compelled to write and more able to do so in Twitter’s absence. In fact, I recently wrote a piece for TGC that I consider the best thing I’ve ever sent to them–owing partly, I suspect, to mental channels that aren’t nearly as clogged with minutiae.
  5. Not being able to Tweet out my blogs is a bummer. It’s exposed the conflict in me between the writer and the publicist, the person who wants to write from the soul and the person who wants to write so other people will tell me I’m a good writer. Like I said in point #1, I don’t know how aware of this I would be if I still had Twitter. For that reason alone, I think deactivating was a good move.

Social Media Resolutions for 2017

  1. I will be less cynical. Sarcasm and withering criticism are to social media what static is to AM radio. There’s no need for one more person trying too hard to be funny.
  2. I won’t start or join a “pile-on.” If I wouldn’t publicly shame a person in real life, I shouldn’t do it online.
  3. I will tweet unto others as I would have others tweet unto me.
  4. I won’t be so preoccupied with my phone that I forget the people around me. If at any time I feel defensive when someone suggests I take a break, I should interpret my defensiveness as a sign they’re right.
  5. I will be more concerned with saying what is true and helpful than building my “brand.” If someone says something better than I said it, that’s not a problem.
  6. I won’t repost meanness or trollishness, not even to mock it.
  7. I will always feel free to not chime in.
  8. I won’t “hate-click” or “hate-share.” In the new writing economy, there is no such thing. (see resolution #6)
  9. I will actively seek out and share the wisdom of others, rather than see it as a threat.
  10. I won’t sneer at those who do the opposite of these resolutions, since I already know I myself will fail them.

The Slough of Internet Despond

The latest nominee for Tweet of the Year comes from Professor James K.A. Smith:

I am endlessly perplexed by people who say–and there are many who do–that social media and the internet “community” are the best measures of What’s Really Happening in the world today. These folks will point us to Twitter if we want to know what’s really making an impact in our culture, the things people are really talking about. There’s an entire journalism industry, in fact, being formed around the idea that the internet has a personality, and that this personality is every bit as consequential to your experience of the world as the 10PM news. Thus, you get stories in your news feed like, “Celebrity XYZ Recently Said This, and the Internet is NOT Happy About It.”

If you spend most of your day scanning social media sites and blogs, you will probably come away with a very specific idea of what American culture is like. The latest hashtags will probably convey some sense of despair or outrage; the latest viral videos will either do the same, or else distract. But here’s the thing: Because of the effect of digital media on human attention, the internet is designed to be totally absorbing and supremely now. If you’re riding the bus and two people behind you are quarreling, you probably won’t get off the bus and feel a palpable sense of depression for the rest of the day at how selfish human beings can be. On the other hand, if you’re reading Twitter hashtags and following back-and-forths between really angry users and the target of their outrage, you will almost certainly turn off your phone and feel consumed by it. That’s not because the outrage you just watched is more real (actually the opposite is probably true); it’s because your brain absorbed it in a qualitatively different way than it absorbed the bus ride (for more on this topic, I recommend this outstanding book).

This is exactly why a dive into social media will lead you to believe that the world is probably a terrible place to live right now. Everything, from the littlest of impolite slights to the most difficult issues of human justice, is magnified with unending intensity on the screen. If you turn off your phone and head down to the library or the coffee shop, though, it kinda seems the people you’re sitting next to don’t have any idea that they should be packing their bags for the bomb shelter. They talk normally, seem relatively calm, maybe even kind. It’s almost as if you’re experiencing two distinct cultures: One a perpetually moving but never anchored sea of consciousness, bent every which way by advertising and technology; and the other, a culture of place, permanence, and sunshine.

I know a lot of people, some very close to me, who are going through difficult times right now. There are thousands of people in Louisiana this second who have suffered cataclysmic loss. Yet invariably, the most miserable people I run into are not these people. The most miserable people are the ones who don’t suffer, but merely hover–attached to the world by ether, spending their emotions and their hours consuming a diet of pixels.


Defining Decency Down

When a heartbreaking catastrophe takes place, but you don’t live-Tweet it, do you still care?

If a horrific act of murder happens somewhere in the world, but you don’t blog within minutes about it, or Tweet about What It All Means…do you still care?

In the week and a half since a young man (I won’t name him. It’s a scandal that we make celebrities out of terrorists and psychopaths) brutally murdered nearly 50 people in an Orlando nightclub, I and many of those close to me have had much to think about. The nightclub was a gay nightclub. The killer obviously targeted a specific community of human beings that particularly offended him, one that he wanted to terrorize. In the era of our media-soaked, clicks-oriented identity politics, the weight of that thought can be hard to feel. Not hard to understand, mind you; hard to feel, to truly have the horror and hatred and vulnerability of such an act reverberate in the soul.

The simple fact is that true empathy is not easy and it’s not instant. That’s an inconvenient truth, but it’s truth. Entering into the sorrow of another–what the Bible calls “bearing one another’s burdens”–is a moral, emotional, intellectual, and interpersonal discipline. It must be practiced. Christians are commanded to bear one another’s burdens because the default setting of the sinful human race is apathy and extreme self-absorption. To have the margins of one’s heart expand to include those with no earthly connection to you, whose well-being or tragedy will probably never intersect (in an economic or relational sense) with your life–that is something serious, and spiritual.

But in the age of Twitter, that kind of measured thinking doesn’t sell. Federal investigators were still among the bodies of victims inside the Pulse nightclub when online pundits started to eviscerate the “silence” of Christians and other religious traditionalists. From Twitter accounts across the country poured forth not just heartache but hellfire and damnation on all those who had failed to live-Tweet their sorrow or confess that they were partially to blame. In the hours and days after we knew what had happened to those people in Florida, the empathy and grief became inextricable from the bitterness and frustration with those who hadn’t grieved the right way, or hadn’t done it fast enough, or had “hid” behind words like “thoughts and prayers” instead of calling for new laws.

Does this sound healthy to you? Does it sound like the response of those who are grieving in a centered, emotionally mature way? Or does it sound more like what we would expect of a generation that doesn’t feel anything until it’s been siphoned through an online server and processed into pixels?

The danger of the internet has always been the temptation to live life through it, one orbit short of the uncomfortable, offensive, difficult realities of real, flesh-and-blood existence. Social media offers as convincing a replication of actual community as human brains have invented thus far. Many of us carry our community in our pocket, in a smart phone whose soft blue glow has rewired neural pathways and made us anxious and listless when we’re not logged in.

We seem to be at a point in American culture where a good many people think that our online identities are crucial extensions of our moral selves–so crucial, in fact, that whether or not a person is compassionate or caring can be evaluated by a quick glance at their pages. Has this person acknowledged the story that’s on cable news right now in a timely fashion? Have they offered the kind of words that are acceptable for their online medium? If the answer to either of those questions is “No” (or “Unclear”), then they must be shamed. Those are the scales of online justice, and they are absolute and unyielding.

But the greater sadness in all this is not what happens to those who are actually praying or meditating or grief counseling, while others are Tweeting. The greatest sadness is what happens to compassion itself. Contorting social media to be an arbiter of decency doesn’t define social media up nearly as much as it defines decency down. It takes literally no authentic expression of oneself to click the particular combination of letters on a smartphone or keyboard that will garner endorsements (Retweets) or authentications (Likes). That kind of mastery of social media platforms is not moral progress; it is a marketing skill, one that can be taught and learned and memorized and utilized to make enormous amounts of advertising dollars. Using social media “correctly” is not a character virtue; it’s a technological achievement.

The outrage directed at those who don’t grieve in the way the internet wants them to grieve does not foster compassion; it fosters hot-takes and the clicks that fund hot-takes. Those who genuinely believe that a Tweet or a Facebook post can be used to measure the rightness or the wrongness of a person’s capacity for love are thinking of love exactly the way that the advertising industry wants them to. Whether we are talking about the age of the billboard or the age of the meme, this idea of love is nothing more than Impulse –> Product –> Satisfaction. It makes for great car commercials and punchy online journalism. It makes for lousy human hearts.

Instead of defining decency down, perhaps more of us should consider adopting this kind of personal rule: When something happens (in the news, in my life, in my feed, etc.) that triggers in me a tremendous desire to express myself online, the time I should spend offline, in silent contemplation, should be directly proportional to the intensity of my desire to post. If I *can’t wait* to get my Tweet out there, I should spend quite a bit of time thinking before I put it out there. If I don’t feel quite alive until my Facebook post goes up, it shouldn’t go up right now. Only when I have a palpable sense of how small and ephemeral social media is, and how foolish I would be to think of it as some immanent layer of my humanity–only then should I share my thought with the online world.

This kind of principle might, just might, help us to keep in mind the difference between social media justice and cosmic justice, between the perfectly-edited compassion of the Good Blogger, and the dirty, costly, divisive compassion of the Good Samaritan.