TikTok takes on the mess that is misinformation
TikTok is well aware it’s no longer a service used solely by bubbly teens doing dance challenges. It’s also a place where people spread harmful misinformation—a problem that has proliferated on the platforms of its more mature rivals.
The service said on Wednesday that it’s taking new steps to crack down on the problem. It will now label posts with “unsubstantiated content” in an effort to reduce the number of times they’re shared. If users still try to share the video, they’ll receive another warning reminding them that the video has “unverified” content before they can share it.
But social media experts say TikTok’s new step raises a lot of questions.
For example, what is considered “unsubstantiated content”? Will users ultimately be responsible for flagging such posts or does TikTok have a plan in place to proactively find them? How much of this work will be done by artificial intelligence versus humans? Is its content moderation team big enough to handle the massive number of posts users publish every day? And how much does TikTok’s Chinese owner ByteDance influence those decisions?
“I think there is some intentional ambiguity in the message here,” said Yotam Ophir, an assistant professor at the University at Buffalo who studies misinformation. “It’s hard for me to understand what they’re going to do.”
TikTok had not responded to a request for comment by the time this newsletter was published.
Here’s what we know. TikTok says it already removes misinformation as it identifies it, though it hasn’t explained how that identification happens. It also says that in the U.K., it partners with Logically, a company that combines human fact-checkers with artificial intelligence to fight misinformation. When Logically determines a video contains misinformation, TikTok removes it. It’s unclear whether this arrangement extends beyond the U.K.
Meanwhile, lawmakers and regulators have been turning up the pressure on social media companies, concerned about hate speech, violent posts, and misinformation that could lead to real-world harm. As a result, Facebook and Twitter have rapidly ramped up their rules, getting tougher on posts that contain problematic content like election misinformation and fake coronavirus cures. And they’ve all seemingly been rattled by the riots at the U.S. Capitol, which were fueled by false claims of voter fraud that were bolstered on social media.
“It’s certainly a come-to-Jesus moment,” said Sarah Roberts, an associate professor at the University of California, Los Angeles who studies content moderation. “It was yet again a demonstration of the ways in which it’s not feasible to pretend like the world of social media is somehow completely divorced from the rest of our reality.”
TikTok’s decision to crack down on misinformation will likely come with a host of new challenges and a lot more scrutiny over how it executes the plan. But experts agree it’s a step in the right direction, both for society and the business. Roberts points out that if anything, TikTok can now claim it’s making a good-faith effort to control the problem and reduce its business risks.
But as Gautam Hans, director of Vanderbilt University’s Stanton Foundation First Amendment Clinic, says, TikTok is wading into the gray area of regulating speech. And ultimately: “Speech is messy.”
An executive Parlay. Parler’s CEO John Matze says he’s been ousted by the company’s board after the app was shuttered by Amazon Web Services and banned from Google and Apple’s app stores. So who will lead Parler, should it move forward? Two wild guesses: Jeffrey Wernick, Parler’s chief operating officer, could step into the leading role. Or if Parler wants to double down on its conservative slant, it may consider Dan Bongino, the right-wing political commentator who has invested in the company.
You shall not pass. Internet providers in Myanmar are temporarily blocking access to Facebook and its family of apps after the military seized power in a coup. The government claims to have made the move because it believes Facebook may lead to instability in the country (hate speech on Facebook fueled a genocide in the region in 2016 and 2017). The block is expected to remain in place until Feb. 7 and comes as Facebook reportedly designated Myanmar a “temporary high-risk location.”
The cloning continues. Instagram is working on a new feature that has a TikTok-like feel, according to TechCrunch. The social network, which is owned by Facebook, is planning to release Vertical Instagram Stories, which will give users the chance to vertically scroll through the videos and photos that disappear 24 hours after they’re posted. (By the way, Instagram’s Stories are a copycat of Snapchat, which made disappearing posts popular.) And this isn’t the first time Instagram has created a TikTok copycat feature. In August, Instagram rolled out Reels, a TikTok-like feature that allows users to share 15-second videos.
An ethical conundrum. Two Google engineers quit following the controversial ousting of Timnit Gebru, an artificial intelligence ethics researcher. David Baker, an engineering director, and Vinesh Kannan, a software developer, said they left because they felt the company mistreated Gebru. Late last year, Gebru said she was fired for calling out Google’s lack of commitment to diversity. After she was fired, thousands of people, including Google employees, signed an open letter demanding an explanation from Google.
FOOD FOR THOUGHT
Researchers at New York University’s Stern Center for Business and Human Rights have said there is no evidence to support the claim that social media companies are biased against conservative views. The theory has been trumpeted by high-profile Republicans including former President Donald Trump.
“Rather than prompting reflection and reconsideration, the violence at the Capitol and Donald Trump’s banishment from Twitter and Facebook seemed to heighten many Republicans’ determination to portray Silicon Valley as the archenemy of the political right. … [But] most conservatives aren’t likely to retreat exclusively to their own corner of the social media world and cease paying attention to Facebook, Twitter, and YouTube. Conservatives are drawn to the established platforms for the same reason liberals are: That’s where you can reach the largest audiences and enjoy the benefits of the network effect. And as much as they condemn supposed social media favoritism, conservatives appear to relish wielding the bias-claim cudgel, even though it’s based on distortions and falsehoods,” the report states.
IN CASE YOU MISSED IT
GameStop debacle brings calls for new ‘plumbing’ on Wall Street By Jeff John Roberts
A new fund wants to fix venture capital’s other diversity problem By Claire Zillman and Emma Hinchliffe
This start-up is giving away over $1 million of its seed money By Chris Morris
Why the GameStop YOLO trade has me bullish By Jen Wieczner and David Z. Morris
GameStop futures rally again. Should investors worry? By Bernhard Warner
(Some of these stories require a subscription to access. Thank you for supporting our journalism.)
BEFORE YOU GO
A new trend on YouTube might be helpful for anyone who’s feeling caged up at home. Digital artists are creating videos that give people a virtual ambience: You can sit in a virtual cozy spot while you watch and listen to the virtual rain fall on the virtual window in front of you. Or you can hang out in a Victorian-style library or a hip coffee shop, all from the convenience of your laptop. Vice explored these 3D videos that have become so popular. So grab a cup of coffee, sit back, and relax in whatever virtual environment suits you best.