Inside a bland conference room—equipped with video screens, a whiteboard, and an American flag—a small squad of Facebook employees is trying to prevent disinformation campaigns from tilting the upcoming midterm elections.
This “election war room,” as Facebook describes it, is where about 20 employees representing the company’s legal, policy, security, and other departments huddle to spot anomalies in internal Facebook data that may indicate bad actors are spreading propaganda on the social network and its sister services.
Facebook invited reporters to briefly tour the war room at the company’s Menlo Park, Calif., headquarters on Wednesday.
The facility resembled a conventional office, with young employees working side by side, some stationed at standing desks fitted with multiple screens. One might have mistaken them for engineers coding away on typical software projects if not for the printed labels attached to some of the monitors, bearing titles like “Elections Software Engineer,” “Integrity Software Engineer,” and “Research Brazil.”
Employees gazed at constantly updating charts on several large video screens, monitoring things like potential “voter suppression” campaigns or spam operations intended to mislead U.S. users. The room felt a bit cramped, with staff members working in close quarters.
An American flag hung on one wall and a Brazilian flag on another, reflecting the team’s work monitoring the polarized elections in the South American country. Facebook executives said staff will likely be working in the room all day in the lead-up to the U.S. elections.
The war room, which opened in September, is part of Facebook’s (FB) broader effort to avoid a repeat of the 2016 U.S. presidential election, during which Russian-linked entities spread fake news. Lawmakers hammered Facebook for failing to prevent the problem, giving the company a public relations black eye and fueling calls for regulation that only grew louder after its high-profile data privacy scandals came to light in the spring.
After initially dismissing the importance of disinformation campaigns, Facebook CEO Mark Zuckerberg has promised to invest heavily in combating the problem. Facebook now points to the war room as a prime example of those efforts.
“We know that when it comes to an election, really every moment counts,” said Samidh Chakrabarti, director of elections and civic engagement for Facebook. “If there are late breaking issues that we see on the platform, we need to be able to detect them and respond to them in real time as quickly as possible.”
Some of Facebook’s other anti-disinformation endeavors include working with third-party organizations like the Associated Press to fact-check questionable news stories and photos, and creating tools to automatically identify bogus content and fake accounts. It has also introduced a public database of political ads published on Facebook, intended to reveal who paid for them.
But occasionally, even those efforts stumble.
For instance, The New York Times reported on Wednesday that a loophole in Facebook’s political ad database allowed some political ad buyers to disguise their identities, undermining the database’s effectiveness. Facebook told the newspaper that it would fix the problem.
Facebook built custom software that identifies viral content spreading on the platform so that war room staff can analyze it, Chakrabarti said. The company also uses software that monitors trends on other sites like Twitter and Reddit to identify any possible disinformation campaigns that could spread to Facebook and its related properties like photo-sharing app Instagram and messaging service WhatsApp.
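Facebook hasn’t disclosed how this tooling works internally, but the underlying idea of flagging content whose spread suddenly outpaces its recent baseline can be illustrated with a simple anomaly check. The Python sketch below is purely hypothetical; the function names, thresholds, and data shapes are assumptions for illustration, not a description of Facebook’s actual system.

```python
from statistics import mean, stdev

def flag_viral_candidates(history, current, z_threshold=4.0, min_shares=500):
    """Flag posts whose share velocity spikes far above their recent baseline.

    Hypothetical sketch, not Facebook's actual system.
    history: dict mapping post_id -> list of shares-per-hour counts (rolling window)
    current: dict mapping post_id -> shares in the most recent hour
    Returns (post_id, anomaly_score) pairs worth routing to human reviewers.
    """
    flagged = []
    for post_id, shares_now in current.items():
        window = history.get(post_id, [])
        if len(window) < 3 or shares_now < min_shares:
            continue  # not enough baseline data, or too small to matter
        baseline, spread = mean(window), stdev(window)
        if spread == 0:
            spread = 1.0  # avoid division by zero on flat baselines
        z = (shares_now - baseline) / spread
        if z >= z_threshold:
            flagged.append((post_id, z))
    # Highest anomaly scores first, so reviewers triage the fastest-spreading items
    return sorted(flagged, key=lambda item: item[1], reverse=True)

# Hypothetical usage: hourly share counts collected by some upstream pipeline
history = {"post_a": [40, 55, 60, 48], "post_b": [500, 520, 480, 510]}
current = {"post_a": 900, "post_b": 530}
print(flag_viral_candidates(history, current))  # post_a spikes; post_b does not
```

In practice, a check like this would presumably be only one early filter among many, surfacing the highest-scoring items for human review rather than acting on them automatically.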
Chakrabarti said Facebook has already used the war room to monitor the first round of the Brazilian elections earlier this month. He said the staff successfully spotted a voter suppression campaign that tried to spread a false message claiming the election had been delayed a day because of protests.
“We were able to remove it off the platform before it had a chance to go viral at all,” Chakrabarti said.
Still, Brazil’s electoral court complained about the amount of misinformation on Facebook and WhatsApp during the elections, and ordered Facebook to remove more than 30 links to fake news reports intended to smear a Brazilian politician, the Guardian reported last week.
Nevertheless, Chakrabarti insists that the war room was useful during the Brazilian elections and that it let Facebook’s staff work faster than they would have otherwise. Although they could work remotely, having everyone in the same room let the team make quick decisions and talk “face-to-face,” he said.
However, it’s unclear whether the war room will become a permanent fixture or just a temporary Band-Aid. Executives emphasized that several teams policing the problem work virtually, combating election interference from Facebook offices worldwide; the small conference room in Menlo Park is just one piece of that effort, albeit the most tangible and therefore the most PR-friendly.
The “war room is something new that we’re trying in terms of having a physical presence, and we’ll reevaluate and see how it works here after the U.S. midterms to determine if this is something that we want to continue for major elections going forward,” said Katie Harbath, Facebook’s director of global politics and government outreach.
Facebook is unlikely to invite staff from other companies grappling with fake news, such as Twitter and Reddit, to join its team in the war room. Although Facebook works with those companies, it’s easier to communicate with them “virtually” than in person, said Facebook head of cybersecurity policy Nathaniel Gleicher.
“What you don’t want to do is lose time to facilitate the face-to-face engagement and take that time away that’s needed to respond,” Gleicher said of the challenges of getting different company representatives to physically meet in the war room.
Harbath likened Facebook’s company-wide push to safeguard its platform and related services to the social networking titan’s decision to prioritize its mobile app in 2012 after falling behind competitors. The effort represents Facebook’s “new normal,” she said, meaning investors and the public should get used to the company continuing to spend money and resources on safety-related issues.
“This isn’t going to stop right after the midterms,” Harbath said, referring to multiple upcoming elections around the world. “This is really going to be a constant arms race.”