In a post late Friday, Facebook CEO Mark Zuckerberg outlined a bevy of new initiatives to crack down on fake or misleading news. Facebook has faced intense criticism in the wake of the November 8th election, primarily from left-leaning critics arguing that huge volumes of false news spread on the site influenced the electorate, perhaps contributing to Donald Trump’s victory.


Zuckerberg, despite previously rejecting such arguments, wrote in the new statement that “we take misinformation seriously.” He says Facebook will develop “stronger detection” through technical systems—that is, artificial intelligence and machine learning programs that can rank sources by trustworthiness. That could look something like a metric being developed by the university-based Trust Project, which would rate outlets according to their consistency and reputation, reporters’ expertise, and diversity of perspectives.

Facebook will also look to third parties for help with verifying news sources. Zuckerberg cites “respected fact checking organizations,” saying the company is already in discussions with some and hopes to work with many. Sites such as Politifact.com and Factcheck.org may become sources for warning labels attached to stories that have been deemed untrustworthy.

Zuckerberg also says that Facebook will work to make it easier for users to report fake news. That feature has been available for years, but flagging content as fake currently requires digging through three layers of menus.


Finally, Zuckerberg reiterated Facebook’s new policy of banning purveyors of misinformation from advertising on the site. That’s a small step towards a bigger goal of, in Zuckerberg’s words, “disrupting fake news economics”—many of the fake news pieces that went viral on the platform during the election were produced by financially motivated publishers, including a cadre of Macedonian teenagers recently profiled by BuzzFeed.

Given the renewed post-election scrutiny of Facebook’s persistently troubled relationship with news sharing, these moves are good PR. But they also represent steps away from the site’s insistence that it’s a platform, not a media company—a stance that protects it from regulation or prosecution for content shared by users.