Facebook says it will look at signals that determine whether a post is spammy or sensationalized.
After initially denying that “fake news” was a problem on its network, Facebook has spent the past few months trying to come up with ways to stop the spread of hoaxes and false news reports. The latest step is a change to the algorithm that the company says will promote what it calls “authentic” news stories in users’ news feeds.
A post published on Wednesday by two Facebook research scientists and a member of the social network’s engineering team said that “authentic communication” is one of the key values of the Facebook news feed. That includes news stories that people “consider genuine, and not misleading, sensational or spammy,” the post added.
Facebook uses a number of signals to determine what to show you in your news feed, the researchers said—including how close you are to the person or page posting an update or sharing a piece of content, and how many likes and shares it has.
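Facebook has not published how those signals are weighted, but the description amounts to a scoring function over per-post features. The following is a purely illustrative sketch: the signal names, weights, and log-scaling are invented for this example, not Facebook's actual formula.

```python
import math

def rank_score(closeness, likes, shares):
    """Toy feed-ranking score. Weights and log-scaling are hypothetical."""
    # Log-scale engagement counts so viral posts don't swamp the closeness signal.
    return 0.6 * closeness + 0.3 * math.log1p(likes) + 0.1 * math.log1p(shares)

posts = [
    {"id": "close_friend", "closeness": 0.9, "likes": 10, "shares": 2},
    {"id": "viral_page", "closeness": 0.2, "likes": 5000, "shares": 900},
]
# Sort the feed by descending score.
ranked = sorted(
    posts,
    key=lambda p: rank_score(p["closeness"], p["likes"], p["shares"]),
    reverse=True,
)
```

In this sketch, raw engagement can still outrank closeness once the counts are large enough, which is one reason a separate authenticity signal would matter.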
With the latest update, the company says it will be adding new “universal signals” to the algorithm and its ranking process. To come up with those signals or indicators, engineers looked at various Facebook pages and tried to identify whether they were posting spam, or trying to “game the feed” by asking for likes and shares.
All of that data went into training a model that determines in real time if posts are likely to be authentic, Facebook explains: “For example, if page posts are often being hidden by people reading them, that’s a signal that it might not be authentic.”
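The post's example — posts being frequently hidden by readers — suggests a classifier that maps behavioral signals to an authenticity probability. As a minimal sketch, assuming a hide-rate feature and a like-begging flag with invented weights (Facebook's real model and features are not public):

```python
import math

def authenticity_score(hide_rate, asks_for_likes):
    """Toy authenticity classifier. Features and weights are hypothetical."""
    # A high hide rate and explicit asks for likes/shares both push the
    # logit down, i.e. toward "likely not authentic".
    z = 2.0 - 8.0 * hide_rate - (1.5 if asks_for_likes else 0.0)
    return 1.0 / (1.0 + math.exp(-z))  # logistic squash to a [0, 1] score

# A page whose posts are rarely hidden scores high; a spammy,
# like-begging page scores low.
genuine = authenticity_score(hide_rate=0.01, asks_for_likes=False)
spammy = authenticity_score(hide_rate=0.40, asks_for_likes=True)
```

In a trained model the weights would be learned from labeled pages rather than hand-set as they are here.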
Ranking things as obvious spam or attempts to game the news feed could remove some of the more egregious types of fake news, such as the pages created by Macedonian teens who are trying to generate revenue from advertising using Facebook and Google. Both companies have also taken steps to turn off that revenue tap for such sites.
The biggest problem for Facebook in trying to determine what is authentic is that plenty of users who agree with the political bent of a post may choose not to hide it or mark it as spam because it fits with their preconceptions, a psychological effect known as “confirmation bias.”
In the same way, defining what counts as a “sensational” or “misleading” news story is a very difficult judgment call. What a pro-Trump user defines as misleading might not be the same as what an anti-Trump user believes is misleading, and vice versa. So the most insidious forms of “fake” news could continue to be problematic.
Some observers don’t think Facebook should try to fix fake news because that essentially forces the social network to decide what is true and what isn’t. And the difficulty of doing that could explain why Facebook has been so reluctant to admit that it is a media company.