
A Ukrainian journalism professor has fought Putin’s disinformation machine for 8 years—with surprising success. Here’s why he’s convinced fake news can be defeated

June 28, 2022, 12:00 PM UTC
Russian President Vladimir Putin with Defense Minister Sergei Shoigu (left) and commander-in-chief of the Russian Ground Forces Oleg Salyukov at the Victory Day Parade at Red Square in Moscow.
Contributor/Getty Images

Shortly after Russia invaded and annexed Crimea in 2014, a team of Ukrainian journalists and researchers launched StopFake, a fact-checking service aimed at debunking the Kremlin’s well-funded disinformation machine. Operating on a shoestring budget, StopFake has grown into a vital force in Ukraine’s efforts to protect the country against a different kind of Russian invader—the parade of propagandists, deepfakes, trolls for hire, and bot armies that do their best to undermine truth, divide the population, and sow doubt in the country’s democratic institutions. Exposing those lies is nothing less than a daily battle, says StopFake’s cofounder and chief editor, Yevhen Fedchenko. 

“This war is created by disinformation,” he says. The best way to fight back, he adds, is to learn the dark art of disinformation yourself, to better understand and discredit your foe. StopFake’s work, mostly funded by grants, has been translated into 13 languages and heralded by press-freedom advocates. Nonprofit Freedom House has called it a “gold standard” for hunting down fake-news perpetrators and exposing their lies. Its model of investigating dodgy claims, and publishing detailed reports on why the public should be suspicious, has become “a model in other Central and Eastern European countries,” Freedom House notes.

Yevhen Fedchenko, co-founder and chief editor of StopFake.org.
Courtesy of Yuri Panin, Mohyla School of Journalism

Recently StopFake’s work was on display for the world to see. Its investigators debunked a viral story, pushed by Russian media, that Uber was forbidding its drivers from transporting Ukrainians fleeing the war. To add credibility to its report, StopFake tracked down an Uber spokesperson in Ukraine to confirm that no such company policy exists. A few days after StopFake reported its findings, Uber announced the launch of a special program: the ride-hailing firm teamed with the British Red Cross to provide free meals and rides into Poland for Ukrainian refugees. Thousands of Ukrainians ultimately took Uber up on the offer of safe travel.

Fedchenko, the 46-year-old director of the Kyiv Mohyla School of Journalism, spoke to Fortune about the lessons he and his team of roughly 20 have learned while fighting Russia’s efforts to weaponize the internet and the airwaves against his homeland. Sitting in an undisclosed location in western Ukraine, he shared his experience and offered tips on how democracies—individuals, businesses, and organizations—can combat a disinformation campaign. 

This interview has been edited for clarity and brevity.

Fortune: Prior to the war you were based in Kyiv. Where are you now? Can you say?

Fedchenko: Normally we are all based in Kyiv. But when the incursion started, our team needed to move to different locations inside and outside of Ukraine to ensure our work could continue. We are a small and flexible team. We had prepared for this war even before it started. We knew everything about this war before it began. We even knew when it would start, based on our analysis of the uptick in Russian disinformation.

If you’re gathering that kind of intelligence I can understand why you’d prefer not to disclose your precise location.

There is a personal security risk for myself and for the team, for sure. As for myself, I know that Russia had prepared lists of those people whom they feel should be eliminated if they are to succeed in taking over Ukraine. I am on this list. That must mean I’m doing something important. If they’re looking for me, you know, I’m not wasting my time.

You said you knew ahead of time when the war would begin. How did you determine that?

We realized early on that disinformation is part of their war effort. So, [our objective] became: Follow their disinformation ecosystem, know how it works—who are the main players, what are the main narratives—and then map the main platforms they’re using to spread their messages. I can compare it to intelligence gathering, like tracking enemy movements of military hardware. Before the incursion started, we were seeing a lot of people and projects mapping Russian military hardware around the borders of Ukraine: Thousands of troops here, hundreds of tanks over there. We’ve been doing basically the same thing, but with Russian disinformation. We’d been examining what kind of narratives they were pushing, how it can be measured in terms of quantity—the volume of hoaxes and disinformation, and how they amplify it. 

So, the lesson being: Follow the main players, their actions, and their words. And what was the result of mapping these things?

We knew everything about the enemy even before this incursion started. For example, the main narratives we had been following ultimately became Russia’s stated pretext for starting this war. When we saw the narrative that Ukraine is a fascist state—and the narratives claiming that Ukraine does not really exist, that Lenin actually created Ukraine—we could see that Putin was laying the groundwork for the war. [Editorial note: On the eve of war, Russian President Vladimir Putin justified the invasion of Ukraine in part by describing Ukraine as a fascist state and claiming that its Western allies were responsible for the “Nazification” of the country, a story line that still finds purchase inside Russia and is repeated by Putin-philes around the world.] As a journalistic project, it was our job to fact-check this. We’ve been archiving these kinds of narratives since the beginning.

For what larger aim?

We plan to use our archive [of Russian fake news] to prosecute the main Russian propagandists, those people who are implicated in this war. They are not journalists. They are basically participating in the genocide that is happening over here. We have collected all kinds of evidence. Eight years. Thousands of examples. We will combine our efforts with NGOs doing similar work to present the most powerful case against these so-called journalists who are openly calling for the genocide of Ukrainians in their broadcasts, in their printed pieces, or in the blog posts they file while embedded with the Russian forces fighting in Ukraine.

What are some other lessons you’ve learned fighting Putin’s disinformation apparatus?

This is not Soviet-era disinformation. Russia learned long ago that in order to penetrate Western media, you need to look like the Western media system. You need to sound like Western media. You need to hire Western journalists and editors to look more persuasive. To appeal to local audiences, you need to speak their language. They’re not speaking about Marxism or Leninism anymore, because almost nobody is interested in that. No, they realized they need to speak about the problems of those countries where they’ve set up. They now operate from within those societies. That was the biggest change from the Soviet era, and the most successful one. There are many Russian propagandists living and openly working in the West, masquerading as real media outlets—for RT in France, or Sputnik in Germany, or even in local media, in places where people could never trace them back to Russia or to the Kremlin’s money. With sanctions, we’ve seen some deplatforming of these Russian disinformation outlets, but it’s just the tip of the iceberg. In most cases, their [overseas disinformation] systems remain intact. 

From your analysis, what is the ultimate aim of these kinds of disinformation tactics?

To penetrate the local political system. They do so by focusing on specific local problems. They want to make the problems appear bigger, to make these cracks within society appear bigger and bigger. It’s not about promoting Russian ideologies. It’s about sowing doubts about your government, about your media, about your democratic system, about your integrity, about your local community.

Whether it be a democracy, or a business: How do you fight back against this?

If the [disinformation] system is well financed, it will keep popping up, again and again. So we respond to it in a similar way. If we see them exploiting fake news, we will debunk the fake news. We fight one fake news item at a time. Also, we look for the gaps. If you can identify the gaps, fill them. Because if you do not, [disinformation practitioners] will fill them for you. By filling the gap, I mean creating your own content that explains what you are doing about a certain issue. In this way, you are creating your own narrative about what is really happening. Also, we fight back with the same kind of content. If they are using television stories, we will debunk their messages with TV-quality stories. If they attack us with text-based articles, likewise we respond with text. If they’re on social media, we respond there, in social media. If they translate their stories into different languages, we respond in those same languages. We mirror their tactics, and respond accordingly. There are limitations. If they are using fake news, we wouldn’t respond with fake news of our own. The other thing: Understand who you’re up against—in our case, get inside the Russian brain. We found that many of the same networks and outlets Russia used to undermine the West’s response to COVID-19 were later used to promote its disinformation campaign about Ukraine.

How helpful have the tech companies been in helping you combat Russia’s disinformation attacks?

We’ve been working with [Facebook parent] Meta for the past two years, serving as one of its third-party fact-checkers. The other platforms have not been very responsive, though, to our [requests for collaboration]. Telegram, for example, is just another Russian platform, like RT America. And then there’s TikTok. A lot of people use it, and don’t realize it’s being used as a platform to spread disinformation against the Ukrainian people. And content rules on TikTok are limited. I haven’t seen many instances of TikTok fighting disinformation.

You’ve been doing this for eight years. You’re now on Putin’s public-enemy list. Your country is at war. Are you optimistic the forces of disinformation can be defeated?

Yes, absolutely. We have contributed a lot to discrediting the Russian disinformation system. Now, if someone says, ‘Oh, I represent the Russian government or Russian media,’ people will ask, ‘Can we trust you?’ That’s important. People should begin to differentiate which sources they can trust, and which they cannot, by default. And today there is a whole giant fact-checking community out there, and we’re learning a lot about the global disinformation system. The question is: How can we translate this knowledge into real action?

For more on this topic, please read: A growing army of online trolls is using dangerous lies to take down executives and companies. Now they’re coming for you.

Each week, Fortune covers the world of innovation in Breakthrough. You can read previous Breakthrough columns here.