By Ellen McGirt
Updated: August 22, 2018 3:34 PM ET

The tragedy of the commons is a social science theory that attempts to predict the outcome when humans use a public, unregulated space. And yes, it is profoundly tragic.

It sounds simple, at first. Let’s say you own some cows and a pasture. Chances are you’d manage your cow business carefully. You wouldn’t put too many animals on your patch of land, and you’d make sure there was enough water and food for the cows you did have. But on public land, you might observe that lots of people just drag all their damn cows onto it because it’s free to use and nobody controls it. Ultimately your pasture thrives while the public pasture is quickly destroyed. It’s short-term self-interest at scale.

This was the theory put forth by the economist William Forster Lloyd in 1833 (yes, it involved cows and pastures) and later revived in a popular 1968 treatise on sustainable development by ecologist Garrett Hardin. The tragedy of the commons describes the phenomenon in which individuals within a shared system end up depleting a common resource because they use it without regard for the long-term health of the broader environment. The precepts have been applied to everything from overfishing to nuclear brinkmanship, and most recently, climate change.

I’ve been thinking a lot about this theory lately, and the tragic mess that social media, the commons of the digital age, has become.

The data points are everywhere.

Consider this just-published manifesto from Kelly Marie Tran. The Star Wars actor fled from Instagram earlier this year after being subjected to a steady stream of racist and misogynist messages. She’s back and ready to talk. “It wasn’t their words, it’s that I started to believe them,” she began. “Their words seemed to confirm what growing up as a woman and a person of color already taught me: that I belonged in margins and spaces, valid only as a minor character in their lives and stories.”

Tran is just one of many, many, many people, often women and women of color, who have been driven from the online commons due to unrepentant harassment on social platforms. Even parents who are grieving murdered children aren’t safe in the digital public square.

This can’t be the pasture that the smartest minds in tech and venture capital had in mind, right?

Online spaces have been deftly hijacked by prankster trolls and true believers alike, driving hate speech into the mainstream. Even examining the phenomenon gives it oxygen. “Hate groups ha[ve] gamed the media,” says Clive Thompson in Wired, describing the rise of alt-right memes and dangerous conspiracy theories that make their way into real-world mindsets. “They did it energetically and successfully. Now it may be time to invoke the wisdom of WarGames—where the only way to win is not to play.”

Sure, except the pasture just fills up elsewhere.

Writer Zeynep Tufekci cites as an example “dark posts,” the non-public, targeted messages the Trump campaign used to discourage African American voter turnout. Unexamined, manipulative and dangerous speech is a feature, not a bug, of digital life. “It’s important to realize that, in using these dark posts, the Trump campaign wasn’t deviantly weaponizing an innocent tool,” she says. “It was simply using Facebook exactly as it was designed to be used.”

Digital tragedies influence more than just voting behavior.

Facebook has been blamed for fomenting deadly violence against the Rohingya people and other Muslims in Myanmar. A Reuters investigation found over 1,000 recent examples of hate speech, some of which had been on the site for over six years. “The poisonous posts call the Rohingya or other Muslims dogs, maggots and rapists, suggest they be fed to pigs, and urge they be shot or exterminated,” the investigation found.

And a new study that analyzed every anti-refugee attack that occurred in Germany over a two-year period found one central variable: Facebook. From The New York Times:

Towns where Facebook use was higher than average, like Altena, reliably experienced more attacks on refugees. That held true in virtually any sort of community — big city or small town; affluent or struggling; liberal haven or far-right stronghold — suggesting that the link applies universally.

Their reams of data converged on a breathtaking statistic: Wherever per-person Facebook use rose to one standard deviation above the national average, attacks on refugees increased by about 50 percent.

The digital commons analogy is not a perfect one. The spaces are not fully public, nor are they entirely unmanaged. Though we pay a high price of admission to them (our data), we have no say in their governance. And the monstrous behavior they enable, whether by design or neglect, has complex origins. Small gestures like the sort-of banning of Infowars’s Alex Jones were nice, but they’re no substitute for the deep thinking required to revive the early promise of these online spaces.

In search of comfort, I turned to Elinor Ostrom, the late Nobel Prize-winning professor of political science at Indiana University at Bloomington. Her research proved that the tragedy of the commons is not inevitable.

Ostrom won the 2009 Nobel in Economic Sciences for her profound belief in you and me: Her work showed that ordinary people working together are capable of creating the kinds of rules that can help preserve and equitably manage shared resources.

She died in 2012, but you can get a sense of her in this charming short video from Big Think.

She describes her theory of “polycentricity” as a “complex nested system” of markets, governments, community groups and individuals working to create a workable governance system. “In many instances but not all, people have found ways of agreeing on their own rules, and extracting themselves from the problem,” she says.

It will be hard work, however.

“It ain’t pretty in the sense that it’s nice and neat,” she says. And most people tend to dismiss creative solutions that are complex. “But society is complex. People are complex,” she says.

But it’s our only hope. “Simple solutions to complex problems? Not a good idea.”

 
