A number of recent events have proven that live video of breaking news can be incredibly important for understanding those events, and in many cases that video is streamed on Facebook by the people involved, whether it's a shooting in Louisiana or the aftermath of a protest in Dallas.
Given that, Facebook's (fb) willingness not only to remove videos in some cases but also to deactivate a user's account at the request of police raises a host of important questions. What responsibility, if any, does the network have as a news outlet when it makes such decisions?
The deactivation occurred earlier this week during an incident in Baltimore County, MD. While in a standoff with police in her apartment, 23-year-old Korryn Gaines was posting video on Facebook and interacting with followers on both Facebook and Instagram.
At some point, the police asked Facebook to shut down the woman's social-media accounts, and the company complied. Gaines was later shot and killed after she pointed a gun in the direction of the officers, according to statements made at a news conference in Baltimore on Tuesday.
Police said they were trying to serve an arrest warrant on Gaines on charges stemming from a routine traffic stop in March, including disorderly conduct and resisting arrest. Her five-year-old son was wounded in the shooting.
"We did in fact reach out to social media authorities to deactivate her account, to take it offline, if you will," Baltimore County police chief James Johnson said at the press conference. "Why? In order to preserve the integrity of the negotiation process with her and for the safety of our personnel [and] her child."
Johnson said that not only was Gaines posting video of the operation as it unfolded, but her followers on both Facebook and Instagram were also "encouraging her not to comply" with police requests that she surrender peacefully. Facebook deactivated the account about an hour after being asked to do so.
According to police, none of the videos that Gaines broadcast have been deleted; instead they have been archived to serve as evidence. Still, the fact that police can have a user's account deactivated while they are live-streaming video is disturbing, as Black Lives Matter activist DeRay Mckesson noted on Twitter.
Police have suggested that Gaines was psychologically unbalanced, and that being urged not to comply by her followers on social media was not helping them negotiate a peaceful surrender, putting police officers, bystanders, and her son at risk.
But what if Facebook had received a similar request while Diamond Reynolds, who streamed under the name Lavish Reynolds, was live-streaming the aftermath of the police shooting of her boyfriend Philando Castile in Minnesota last month? Would the social network have complied with that request too? Facebook removed the video for a time, but later said this was due to a "technical glitch."
Or take the example of the police shooting of Alton Sterling in Baton Rouge. Live video shot by a bystander is the only evidence we have that the shooting may not have been justified. Would Facebook have deactivated the user accounts of those filming the shooting if asked to do so by police?
In Dallas, during the protests following the shootings of Castile and Sterling, a number of bystanders were streaming on Facebook while the police were trying to locate the shooter or shooters. Would Facebook have shut down those feeds if requested to do so, because they were "posting video of the operation as it unfolded," and therefore possibly jeopardizing the police pursuit?
A Facebook spokesperson told Fortune that the social network will remove user accounts or any other content after a request from police if it believes that doing so is necessary to prevent "imminent harm to a child or risk of death or serious physical injury to any person," which is the same standard it uses when deciding whether to disclose personal information.
There is some evidence that the social network is receptive to requests from local and federal authorities: it deleted hundreds, possibly thousands, of user pages created by prisoners after being asked to do so by the government, and never mentioned it publicly, even in the company's so-called "transparency report." That report notes when government requests for data are granted, but not when requests to remove data or suspend accounts are.
If Facebook—a place where as many as 60% of millennials say they often go for news—wants to be seen as a trustworthy platform for journalism and a place to get a reliable picture of the world, it is going to have to come clean about when and why it removes things.