By Tom Huddleston Jr.
August 17, 2017

Facebook is taking yet another step to cut back on fake and misleading content on the service. This time the target is clickbait videos.

Facebook has spent the past several months trying to curb “fake news” and other misleading posts, including adding a fact-checking feature for flagging disputed posts and launching a “Journalism Project” aimed at helping the company work more closely with legitimate news publishers.

On Thursday, Facebook introduced updates meant to reduce the number of clickbait video posts that appear on the service. Among the posts Facebook is targeting are those that feature fake video play buttons as well as actual video files that show only a static image.

“When people click on an image in their News Feed featuring a play button, they expect a video to start playing,” Facebook engineers wrote in a blog post. “Spammers often use fake play buttons to trick people into clicking links to low quality websites.”

Fake video play buttons can dupe Facebook users into making unintended visits to dubious sites, including those carrying malicious adware. Spam posts built around a video of a static image, by contrast, are typically attempts to go viral by exploiting the fact that Facebook’s News Feed algorithm heavily promotes video content. Facebook noted that this latter type of spam post can “trick people into clicking on a low quality experience.”


Facebook said it is tweaking that algorithm to “markedly decrease” the number of clickbait video posts that appear in users’ feeds. In May, the company made similar changes to reduce its promotion of news stories posted with typical clickbait headlines, especially those that withhold or exaggerate information. Earlier this month, Facebook also said it had created a software algorithm to identify and flag suspicious news stories so they can be reviewed by third-party fact-checkers.

The company’s stepped-up efforts to curb clickbait and fake news have been prompted in part by criticism from users, politicians, and shareholders. Many of those critics suggested that Facebook’s problems may have affected last year’s presidential election, a notion CEO Mark Zuckerberg initially dismissed as a “crazy idea” last fall before he began to seriously tackle the problem. Other social media giants, such as Twitter and Google’s YouTube, have also been battling the spread of misleading content.
