Content Has Consequences

Melinda Lee
Published in The Startup
7 min read · Feb 3, 2021


Let’s start choosing them

You can’t escape the buzz: in the past few weeks, a horde of Redditors organized a coordinated campaign against prominent hedge funds by purchasing shares of GameStop en masse. With billions of dollars on the line, these retail investors, dispersed across subreddits, Discord servers, and YouTube comment sections, forced the stock to skyrocket over 14,300% as establishment investors scrambled to cover their losses and contain contagion across the market.

This episode is still unfolding (trending headlines now report $GME shares crashing), but it has become painfully clear to Wall Street, the financial media, and anyone else watching that powerful institutions no longer monopolize the instruments of control. The Redditors have seized an instrument of their own, hidden in plain sight: social media’s viral message. The flurry of directives to hold the line and send $GME to the moon, peppered with emojis, converted shared content into shareholdings. Screengrabs of Reddit posts, detailing how the earnings would pay off debts, cover loved ones’ medical expenses, and treat frontline workers to lunch, were shared across other corners of the internet, recruiting even more investors into a budding movement against once-untouchable Goliaths on Wall Street. The differences between content and capital, meme and money, and attention and asset have collapsed entirely.

If it wasn’t obvious before, it is now: content has consequences.

The news moves fast these days. It’s hard to believe that, only three weeks ago, we collectively witnessed a devastating and unsettling blow to democracy, the rule of law, and the containment of far-right extremism and white supremacy. And we witnessed this implosion in real time on January 6th, when we watched an armed and determined mob storm the Capitol building live on our newsfeeds.

One of the insurrectionists in the building, Anthime Joseph Gionet, live-streamed the attack on DLive under his public persona, Baked Alaska. These recordings would become the very evidence the FBI used to identify him, along with dozens of others. A rising celebrity in the alt-right’s content economy, he built his platform by escalating confrontational pranks and political rhetoric, following the metrics from his audience as he toggled between political allegiances (he once endorsed Bernie Sanders and Andrew Yang before pivoting hard to Donald Trump). It was within the alt-right’s content filter bubble that he found his burgeoning stardom: rewarded by a steady stream of followers and engagement, this attention reinforced and incentivized the antisocial impulses in his videos (like pepper-spraying a bouncer or harassing masked essential workers).

Where did he learn and hone these instincts as a creator and content analyst? In a New York Times op-ed, my former colleague Ben Smith described Gionet’s training — both formal and informal — at our former employer, BuzzFeed: through countless creations, uploaded as experiments, and by measuring the attention they would capture.

And as the numbers rolled in and populated watchtime, subscriber, and view count graphs, he — like so many of us creators — became intoxicated by the immediate and mass feedback of the crowd.

Of course it would stimulate an insatiable appetite for more: more eyeballs on the screen, more engagements, and more clout. He’d eventually take this hunger with him to the Capitol, where he turned the camera on himself, kicked his feet up on Nancy Pelosi’s desk, and spoke directly to his enraptured audience.

BuzzFeed is a place I had long admired. It was a prime competitor for advertisers, attention, and audience back when I worked at other media companies like Hearst and Meredith. It was a place that had cracked the viral content code, amassing huge audiences and regularly spinning up new digital brands on platforms like Facebook and YouTube. And it’s the place where I would eventually take on the role of Content Chief for BuzzFeed Media’s ensemble of brands.

Before joining BuzzFeed, I remember reviewing the unimpressive numbers from our Facebook Live shows and trying to get a sense of the market. My eye caught a headline in the press: BuzzFeed had broken the internet by stretching rubber bands around a watermelon, live and in real time. Somehow, they managed to amass and transfix an audience with both the literal and figurative tension of an impending explosion.

All of us, as strategists and producers, paid attention, learning, experimenting, and competing to captivate audiences so that we could optimize a privileged metric in our industry: watchtime, the valuable seconds a user spends on the screen.

Four years later, those lessons would be taken to their logical conclusion and broadcast to the world: the time-based tension of a mob storming the Capitol as we awaited the explosion of our democracy and civil discourse. What led us to this moment? And do we have a chance to course-correct before it’s too late?

The 90s’ promises of a democratized internet (the information superhighway, mapped out over Usenet groups and ICQ and uploaded to neighborhoods on GeoCities, promising to connect all of us into a global community) seem rosy in hindsight. And in the 2000s, the promises of budding social media platforms like Facebook (whose stated mission is “to give people the power to build community and bring the world closer together”) now ring naive in the wake of the antagonistic, divisive, and fractured world we plainly see today.

Researchers and tech ethicists alike have been sounding the alarm about social media’s social dangers for years: algorithms that reproduce racist and sexist stereotypes, the rise of political extremism and the alt-right, and proliferating conspiracy theories. Nobody can deny now that digital content — and the apparatus of technologies, market strategies, and appetites that animate it — has consequences in the world. Yet despite these calls for reform, the response from big platforms has been reactive and lukewarm at best.

It now seems clear that the ways we designed our platforms — and the content we produced in turn — have had complex and unintended consequences once they were released upon the public. The metrics we used to measure engagement in the video space (like views, shares, and watchtime) captured certain dimensions of attention valuable to advertisers.

In the scramble for the audience’s eyeballs in an ever-saturated visual economy, we didn’t stop to ask about the kinds of affects or emotional states that were being generated.

As we experimented, iterated, and compiled these learnings, we assembled something like Frankenstein’s monster, pieced together from short-term goals. It simplified complex, individual people into aggregate numbers and insights; shares, views, follows, and watchtime signals turned into incentives for what we made and unleashed next. And for years, we all tinkered with suggestion algorithms and programmatic advertising tools to squeeze more watchtime out of viewers, while mass shooters uploaded their manifestos, influencers tacitly (and explicitly) promoted body dysmorphia on Instagram, and Facebook’s faceless contract content moderators around the world were traumatized on the front lines of a content economy that inadvertently rewards the emotions that “engage” people the most: pain, violence, and hatred.

If we could do it all over again, how would we create a platform from scratch that accounts for these mistakes?

Could we design a platform that promotes goodness instead of wrath, envy, or schadenfreude? Can we ever make good on those early internet promises to connect us to each other and build community — rather than pronouncing our differences, stoking antagonism, and constructing separate realities for us to embody?

Redesigning the future

If there has ever been a time to fundamentally reimagine our communities, it is now.

January concluded as the deadliest mile marker of the pandemic, with nearly 100,000 people succumbing to COVID-19 in a single calendar month. In a mix of anxiety and fear, we have watched the curve of infections, deaths, and hospitalizations rise and fall for a year, not unlike the rogue retail investors captivated by their stock tickers, or concerned denizens glued to prediction needles and polls in a historically contested election. After a decade of being measured and tracked by our most intimate technologies, we have all become both analysts and subjects, mesmerized by uncertainty and concern. As vaccines shakily roll out, schools debate reopening, and our peers circulate epidemiological and immunological misinformation, we reevaluate our own position in the faceless crowd. Can we learn to stand shoulder-to-shoulder instead?

Between big tech and the media, we built a content industry that has profoundly shaped the way people view themselves and the world around them. The competition for attention has turned audiences into numbers to crunch, viewers into doomscrolling buckets of rage, and each person into their own brand, responsible for attracting followers by being louder, more confrontational, and more divisive.

In the midst of the cascading political, cultural, and public health crises of this viral pandemic, we must now ask how viral media can also be reformed.

Live video offers tremendous potential to disintermediate creators from their audiences and to restore the agency of users. Zoom dance parties and community gatherings, Alexandria Ocasio-Cortez’s Instagram Live dispatches, and countless streams of solidarity among civil rights protestors have connected fragmented people into real-time dialogue across distance, hierarchy, and fear. And once we start remembering, we realize there have always been pockets of hope on the platforms that also troubled us: to its credit, BuzzFeed helped launch communities of self-acceptance and body positivity, while Upworthy and The Dodo leveraged virality to promote positivity. When COVID first breached emergency departments across the country, nurses and doctors turned to TikTok to dance choreographies, author memes, and, often, address us directly with profound vulnerability through their PPE, reminding us of our collective responsibility to a grander community we took for granted.

The lessons of these past few years have raised critical questions we must now all ask ourselves as creators and audiences: what alternatives are there to tension, anger, or disgust?

How do we prioritize goodwill, critical thinking, and connection as metrics worthy of optimizing? What if content moderation, currently a risk management strategy akin to shoveling buckets of water out of a sinking boat, became a core project at the heart of every platform? What kinds of design and policy choices would we need to make to restore users back into human beings? And do we dare imagine alternatives to capturing captive audiences, when we know that people, and the communities they call home, really want to be free?



stage ten network president. content junkie, food spy, wellness striver. former: buzzfeed content chief; meredith video gm; hearst int’l content/aud dev head