Late Saturday, Facebook CEO Mark Zuckerberg posted a statement responding to criticism of Facebook’s role in the U.S. Presidential election. Both leading up to and in the wake of Donald Trump’s election victory, Facebook has been criticized for facilitating the spread of false or misleading stories and other content.

Zuckerberg first emphasized that “of all the content on Facebook, more than 99% of what people see is authentic,” and that it was “extremely unlikely hoaxes changed the outcome of the election in one direction or the other.” Immediately following the election, Zuckerberg had dismissed the notion that fake news on Facebook may have influenced the results as a “crazy idea.”


However, Zuckerberg also announced that Facebook will continue refining tools that allow users to flag hoaxes and fake news. The function, which feeds hoax reports into the site’s content algorithm, was first introduced in January of last year. But it appears to have had little effect, and Zuckerberg admits that “there is more we can do here.”

As Zuckerberg points out, crowdsourced content moderation is a huge social-engineering challenge, particularly for the sort of political content observers are most concerned about on Facebook. Many news stories, he wrote, “express an opinion that many will disagree with and flag as incorrect even when factual.”

Zuckerberg’s statement was followed by a New York Times report describing a flurry of discussions among top Facebook executives about the social network’s possible role in the election. But, despite that renewed anxiety, yesterday’s statement may strike some as affirming Facebook’s relatively hands-off approach to content curation. Zuckerberg wrote that “assuming that people understand what is important in their lives” has been core to Facebook’s success, and that Facebook “must be extremely cautious about becoming arbiters of truth ourselves.”


That’s exactly the charge the site faced earlier this year following allegations that human curators had removed conservative-leaning news stories from the Trending Topics section. The feature was automated in August in a bid to increase objectivity (or at least the appearance of objectivity).

But many experts think Facebook needs to be more proactive, with opponents of President-elect Trump particularly harsh in their assessment of Facebook’s influence. Fake news stories that reached millions of Facebook users this year included one claiming that Pope Francis had endorsed Donald Trump, and another reporting that Fox News host Megyn Kelly had been fired.