Google’s ‘Filter Bubble’ Can Manipulate Your Search Results, Study Suggests
Eighty-seven individuals searched for the same terms as part of Measuring the Filter Bubble: How Google Is Influencing What You Click, a study conducted by DuckDuckGo, a search engine that aims to protect users’ privacy. Participants searched Google for “gun control,” “immigration,” and “vaccinations,” but despite the identical search terms, the results varied, even when controlling for time and location, according to the study.
DuckDuckGo also found that logging out of Google or searching in private browsing mode did not eliminate this variation in search results, suggesting that such tactics may not actually shield users from personalization.
Regardless of whether users were logged in, logged out, or using incognito mode, the variation persisted—in some cases, different news sources appeared in the results, while in others, the placement of a result differed drastically between searches, according to the study. The same variation appeared in the results shown in Google’s news and video boxes.
The results suggest that it may not be possible to get truly objective search results that are replicable across users. Google has argued that personalization improves the user experience, but too much personalization can also create “filter bubbles” that prevent users from seeing results that don’t align with their worldviews, thereby potentially increasing partisanship and polarization.
Other reasons users may see different results, Google explained, include the location of its data centers and the localization of query results, as well as ambiguous searches, which lead the search engine to rely on recent searches to provide context.