Yandex, Russia’s biggest search engine, is an on-ramp for child sexual imagery, experts say
Experts say the spread of images of child sexual exploitation online has reached unprecedented levels, calling the situation a major social crisis. While much of this material spreads through well-hidden channels, such as the so-called dark web, there is a major culprit hiding in plain sight: Yandex, Russia’s largest technology firm.
Yandex is the most-used search engine in Russia and the fifth most popular worldwide. It has a Nasdaq-traded stock and a market cap of over $14 billion. According to cybercrime experts, though, the site has not implemented controls employed by other large search engines that have cut the accessibility of images showing child sexual abuse. As a result, they say, Yandex has become a major vector for the spread of child sexual imagery online, including by providing a user-friendly on-ramp to darker corners of the Internet. The company’s role in spreading illicit images around the globe appears to be, in large part, an unintended consequence of Russian government policy.
A number of controls block child sexual imagery from the open Internet. One crucial method used by Google and Bing consists of blacklists of illicit websites and illegal images maintained by the London-based nonprofit Internet Watch Foundation (IWF). Developed using data collated from member companies and other sources, these blacklists have been shown to be effective, but Yandex is not a member of the coalition. Neither is Baidu, the third-ranked search portal worldwide, but it does not have an English-language portal. Yahoo, the fourth-ranked portal, is powered by Bing’s technology.
According to Fred Langford, the organization’s deputy CEO, Yandex wants to join the IWF, “but they’re finding it quite difficult.”
“‘It came from above,’ is what they said to me,” Langford says of his conversations with Yandex. “And I think what they were trying to say is that it came from Putin.”
Yandex neither confirms nor denies Langford’s characterization of why it is not a member of the IWF. “We have developed proprietary technologies that help to prevent access to illegal content in general and child abuse content in particular,” the company said in a written statement to Fortune. Experts, however, contend that Yandex’s independent efforts are much less effective than the approach taken by Western companies and nonprofits.
Yandex’s isolation from collaborative efforts appears to be the result of Russian authorities attempting to maintain political control by clamping down on the Internet and the country’s Internet firms. As the Kremlin has pushed to centralize control of political communication, it has limited interaction between Russian businesses and international nonprofits such as the IWF. As a result, Yandex lacks access to the tools necessary to effectively filter out this illicit content.
According to Chad M.S. Steel, a researcher who investigates crimes against children online, children worldwide are the inevitable victims of Yandex’s weakened filtering.
“Any platform like Yandex that makes it easier for individuals to get that first access to [child sexual] content in a low-risk way is a threat to children,” he said.
Blocking images of child sexual abuse
The global spread of the Internet has been accompanied by an explosion of its use to distribute child sexual imagery, including images of child sexual abuse. Over the past decade, reports of child sexual imagery online have grown from 1 million annually to 45 million per year, the New York Times reported in September 2019. Nonprofits, including some that have spent decades fighting child exploitation, say the spread of these images has exceeded the capabilities of law enforcement, governments, and nonprofits to combat it.
Search and social media companies are in a better position to block these images, especially thanks to machine-learning tools that can filter content at a large scale. One such effort began in November 2013, when both Google and Microsoft’s Bing undertook a series of measures to block child sexual imagery in search results. These measures have come to include de-indexing sites known to host child sexual imagery from searches; blocking keywords in dozens of languages used to locate the material; presenting warnings in response to many search terms; and blocking known abuse images.
Over time, the IWF has become more central to these efforts, with U.S. search companies now contributing to the nonprofit’s blacklists of websites and images. These lists and other IWF services are used by digital media providers from Amazon to Twitter to TikTok, as well as Internet service providers in some countries. According to Langford, the IWF’s blacklist of websites hosting child sexual imagery contains from 6,000 to 7,000 unique URLs at any given time, with between 500 and 1,000 either added or removed each day.
“You whack it down in one area, it pops up somewhere else,” says Langford. “It’s all about how quickly we can find it.”
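The churn Langford describes maps onto a simple data structure: a set of normalized URLs, updated by incremental daily diffs rather than wholesale replacement. The sketch below is purely illustrative; the class, its methods, and the update format are hypothetical and do not reflect the IWF's actual distribution mechanism.

```python
# Hypothetical sketch of a churning URL blocklist of the kind the IWF
# distributes. Names and update mechanics are assumptions, not the
# IWF's real API or data format.

class UrlBlocklist:
    def __init__(self):
        self._blocked = set()

    @staticmethod
    def _normalize(url):
        # Ignore scheme and trailing slashes so "http://example.test/a/"
        # matches "https://example.test/a".
        url = url.strip().lower()
        for prefix in ("https://", "http://"):
            if url.startswith(prefix):
                url = url[len(prefix):]
        return url.rstrip("/")

    def apply_daily_update(self, added, removed):
        # The list churns constantly -- per Langford, 500 to 1,000 URLs
        # are added or removed each day -- so updates arrive as diffs.
        self._blocked |= {self._normalize(u) for u in added}
        self._blocked -= {self._normalize(u) for u in removed}

    def is_blocked(self, url):
        return self._normalize(url) in self._blocked


bl = UrlBlocklist()
bl.apply_daily_update(added=["http://example.test/bad-page"], removed=[])
print(bl.is_blocked("https://example.test/bad-page/"))  # True
bl.apply_daily_update(added=[], removed=["http://example.test/bad-page"])
print(bl.is_blocked("https://example.test/bad-page"))   # False
```

Keeping the list as a set makes each lookup constant-time, which matters when a search engine must check every result URL against it.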
The IWF also maintains a tool that allows its members to identify and filter known child sexual images. This tool uses technology called PhotoDNA, which was developed by Microsoft and researchers at Dartmouth College. PhotoDNA creates compact “hashes,” unique digital signatures that can be used to quickly identify known child sexual images even after they have been cropped, resized, or otherwise manipulated. (PhotoDNA is also used by criminal investigators to automatically process files in child sexual imagery cases, reducing the psychological stress involved in screening such files manually.)
According to a 2015 study by Steel, Bing and Google’s efforts—including the use of shared IWF data—were effective in significantly reducing searches for child sexual exploitation imagery on those platforms. Yandex did not implement the same controls, and Steel’s research found that search levels for child sexual imagery through the site stayed steady.
“We take the problem of illegal content, especially explicit materials involving children or their access to pornographic resources, very seriously,” Yandex wrote in its statement. “Yandex products and services feature technologies, such as blacklisting or pornographic image recognition, to help filter out illegal material.”
Steel is skeptical that Yandex is taking the problem seriously. He claims that the site’s reverse-image search function shows that it does have technology that could be used for more effective filtering. But, he says, “they’re not doing that, or if they are, it is so inextensive that it’s ineffective.” Langford says Yandex would find it difficult or even impossible to achieve the same level of effectiveness as the IWF blacklist, which receives data from more than 140 member organizations, members of the public, and police worldwide, and is curated by a staff of expert analysts.
And despite its claims, Yandex actively directs users toward child sexual imagery in ways that would be seemingly trivial to change. Tests by Fortune found that, far from blocking image or web searches for English terms like “jailbait,” Yandex autocompletes them for users, suggests other terms often used by users seeking child sexual imagery, and in some cases even points users toward specific websites associated with such terms.
A training ground for predators
Though precise data is scarce, Steel says that Yandex remains a significant conduit for child sexual images nearly five years after his comparative study—particularly for novice offenders.
“If I could categorize offenders as more or less sophisticated, [Yandex] fits in as something that the less sophisticated offenders use frequently because it’s very easy,” Steel says.
At the most basic level, Steel says, the Russian search engine can return child sexual imagery directly through its image search function, whose layout and functionality are similar to Google’s and Bing’s. It also often leads to Russian image-sharing and social media sites that are rife with child sexual imagery, thanks to what Steel calls “minimal enforcement” of controls on such imagery on Russian websites more broadly.
The site also opens doors to other, even darker corners of the Internet. Users of child sexual imagery often congregate on what’s known as the dark web, which is only accessible using specialized tools. There, they also often use obscure terminology to describe content being sold or shared.
Yandex can act as an on-ramp for offenders not already familiar with those intricacies, Steel says. Those tempted to seek out child sexual imagery for the first time “don’t know the terminology; they don’t know what they’re going to find.” But after a simple search, when users click on an image, they may be led to sites that host other “content of interest,” he says. That might include message boards that teach users relevant terms and where to get more content.
The Russian connection
According to Sarah Mendelson, an expert on anti–human trafficking policy at Carnegie Mellon University, Russian authorities had a fairly laissez-faire stance toward the Internet until 2011’s Arab Spring. Those mass protests toppled dictators in the Middle East and were organized in part over the Internet.
Later that year, Russia itself saw huge protests against Putin’s run for a third presidential term, also organized online. The protests were met with heavy police repression and followed by an election with many signs of fraud that continued Putin’s hegemony. The Russian President has since tightened his authoritarian grip on the country, and Russian companies’ access to international blacklists and shared machine-learning filtering tools has been squeezed out as a consequence.
“Over the last 20 years [the Kremlin] has made it nearly impossible for local and international NGOs to function in Russia,” Mendelson says. This has led to the expulsion or hamstringing of organizations including the MacArthur Foundation, Ford Foundation, Peace Corps, USAID, and now, apparently, the IWF.
Russia’s state media regulator, Roskomnadzor, and the Russian Association for Electronic Communications, an Internet policy advisory group that works closely with the Russian government, did not respond to Fortune’s request for comment.
In 2015, a new law came into force requiring that all data on Russian citizens be stored locally, potentially giving the Kremlin greater oversight of the activity of dissidents. In November 2019, a new set of laws went into effect requiring that all Russian Internet traffic be routed through state-controlled infrastructure. The first measure has been weakly enforced, and there is broad skepticism that the 2019 measure is even technically feasible.
But the Kremlin has also angled for direct control over big Internet firms such as Yandex and Mail.ru. In November 2019, Yandex agreed to a dramatic restructuring that gave the Kremlin seats on the board and, in effect, the power to remove the company’s leadership at will. Investors regarded this as preferable to the alternatives, which might have included outright nationalization of the company.
Yandex’s absence from the IWF appears to be a consequence of the Kremlin’s view of foreign NGOs as a threat to its power, combined with its growing ability to exert its will on Yandex directly.
“The Russian government is interested in making sure that Yandex is held close,” says Mendelson. “They’re going to make sure that Yandex is not going to be overly influenced by the outside.”
Children: authoritarianism’s collateral damage
While Yandex is a Russian company, neither its use nor the consequences of its misuse stop at the Russian border. Experts believe the online proliferation of images showing child sexual abuse is having effects around the globe.
“In 2001, it was a lot harder to get your hands on child exploitation material than it is now,” says Eric Oldenburg, a law enforcement liaison officer at Griffeye who spent 15 years investigating child exploitation, including as part of an FBI task force.
Oldenburg believes increased access is key to a vicious cycle increasing the consumption of child sexual imagery. “Having a site that doesn’t filter content like [Yandex] is an avenue to not only get [offenders] what they want, but the justification that, ‘Hey, it’s out there, everyone’s doing it.’”
Steel agrees. “The real danger is, the more of this material is out there, the more it becomes normalized, and the more likely individuals are to acquire it and continue using it,” he says. That in turn encourages the production of more child sexual imagery, Steel adds, because new technology—including the dark web and cryptocurrency—has made it possible to profit from the production of those images.
But the subjects of this imagery are real children, and their abuse has devastating and lifelong physical and psychological consequences. Many victims of commercial child sexual imagery come from Africa, the Philippines, and other economically troubled regions with weak governments. Russia itself is also a major hub of human trafficking, thanks to its post-Soviet economic collapse and a lack of support services.
“The population that is vulnerable to trafficking is receiving no support” in Russia, Mendelson says, adding that child sexual imagery itself is inherently a form of human trafficking, because victims cannot give legal consent.
According to the U.S. State Department, Russia “is not making significant efforts” to combat human trafficking. That lax approach extends to the possession and distribution of child sexual imagery, according to the IWF’s Langford.
“Certain names do continue to appear on Whois data of various sites [containing child sexual imagery],” Langford says. “When it comes to sharing [child sexual imagery], I don’t see the evidence that the [Russian] police are arresting people… I think there are people who could or should possibly be prosecuted, who may have influence over others.”
As with Yandex’s inability to collaborate internationally on filtering, this may be, above all, a matter of Russian government priorities. After all, if you have only so many resources, you have to decide where to put them, says Steel.
“Russia in general,” he says, “is more concerned with internal security and regulating things like political speech, than with addressing online child sexual exploitation.”
Update 2/25: During reporting for this story, Yandex did not respond to multiple requests to comment on claims that the company had not joined the IWF due to Russian government policy. After publication, a spokesperson described those claims to Fortune as “complete and utter nonsense.” Yandex also expanded on its prior statement, saying: “We have agreed to join the international non-profit organization NCMEC (National Center for Missing & Exploited Children) and we are currently also evaluating the use of PhotoDNA technology in our services. As for IWF, we have never refused to work with them and on the contrary are looking at different forms of cooperation, including sharing with IWF some of our own technology and expertise.”