What look like editorial decisions are an effort to create a "safe community," the company says.
Facebook has had to make some controversial decisions lately about what kind of content its 1.7 billion members can and cannot post and see.
Last month, the service took down the famous photo of a naked, nine-year-old Vietnamese girl who was sprayed with napalm, as it triggered an automated system that blocks nudity. But after protests that the photo was of great historical value, Facebook reinstated the posting. Then last week, it was reported that some posts by Donald Trump were going to be pulled as hate speech, until CEO Mark Zuckerberg intervened and reversed the decision.
Those kinds of tough calls—and many, many similar situations—strike some commentators as similar to the kinds of editorial decisions that media companies make every day. That’s prompted calls for Facebook to be treated as a media company.
But Facebook’s top executives pushed back on that notion on Tuesday. Facebook is a “platform for all ideas” but also needs to be a “really safe community,” Sheryl Sandberg, the company’s chief operating officer, said at the Wall Street Journal D Live conference. What may look like editorializing is more Facebook attempting to reconcile those core principles, she said.
“Those two things can come into conflict because one person’s free expression can be another person’s hate,” she explained. “You can see in lots of decisions we’re balancing these things.”
Chief product officer Chris Cox argued that Facebook fits the definition of a technology company better than it does a media company. Acknowledging that Facebook had a “huge responsibility” as the world’s largest social network, Cox maintained that the company was providing a platform for others—from individuals to groups to media publishers—to share their stories.
Facebook uses a combination of automated systems and human overseers to decide what content should not be allowed. The algorithms and employees are divided into teams based on the type of controversy they deal with, such as bullying or hate speech. Facebook consults with experts in each field to develop its procedures, Cox said.
“A media company is about the stories that it tells,” he said. “A technology company is about the tools that it builds.”
The pushback seems unlikely to settle the question any time soon, as the debate will be reignited every time Facebook decides to block, or not to block, some piece of controversial content.