U.K. Health Secretary Matt Hancock has warned that social media companies could face a ban if they fail to remove harmful content relating to self-harm and suicide.
“If we think they need to do things they are refusing to do, then we can and we must legislate,” Hancock said in an interview on the BBC’s Andrew Marr Show.
Hancock became concerned about material on platforms like Facebook and Twitter after Molly Russell, a 14-year-old in the U.K., took her own life in 2017 after seeing suicidal content on Instagram, according to her father. The Russell family's lawyer believes the app's algorithms "push negative material."
Hancock recently sent a letter to various platforms, acknowledging steps some have taken to remove inappropriate media. However, he said there is room for improvement. “It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people,” he wrote, as reported by The Telegraph.
Instagram told the BBC it works with experts who guide it on the "complex and nuanced" issues surrounding mental health and self-harm, while also saying the company doesn't "remove certain content."
"Instead (we) offer people looking at, or posting it, support messaging that directs them to groups that can help," Instagram told the BBC, adding that it will review its policies.
Facebook, which owns Instagram, has issued an apology to Molly’s family.
Hancock added in the interview that the U.K. government is developing a white paper on the issue.
“I want to make the U.K. the safest place to be online for everyone – and ensure that no other family has to endure the torment that Molly’s parents have had to go through,” he said.