Congress Just Made It Harder for You to Shield Yourself From Internet Discrimination
Did you spend eight hours binge-watching Netflix last night? Or did you search the Internet to read about birth control options, find a new church, or research ways to get involved with your local political party? If so, there’s an important fact you should know: Information about the websites you visited or the time you spent using certain apps is not as private as you think. And that’s not good news if you want to protect yourself against discrimination.
On Tuesday, in a vote with strong Democratic opposition, Congress passed a resolution to undo Federal Communications Commission (FCC) privacy rules issued under the Obama administration late last year that would have protected this sensitive information. Internet service providers (ISPs) oppose these rules for a very simple reason: There is a lot of money to be made in selling consumers’ data. As a result, the industry has spent millions lobbying Congress to oppose the protections. The Trump administration’s support of the measure means it is almost certain to become law.
The FCC rules required that major ISPs like Verizon and Comcast receive customers’ permission before using or selling their private information, including web browsing history and app usage data. These same rules also prohibited these companies from adopting policies that leave customers no choice but to consent to sharing their personal information if they want to access the Internet. They further required companies to develop reasonable data security measures and to notify customers if their private information were breached.
These limits are essential to protecting users’ privacy. But they are also essential for another reason: They help consumers take steps to shield themselves from discrimination. Advertisers and data brokers are increasingly using data to decide what prices to show consumers, what content to steer them toward, and even what types of loans to offer them. And there is increasing evidence that this data can be used to discriminate against certain communities.
For example, until just recently, Facebook allowed advertisers to target users based on their race, their gender, or a medical condition. Before this policy change, advertisers had the option of, for instance, excluding anyone with an African American “ethnic affinity” from seeing certain housing advertisements. Such practices undermine laws designed to prohibit practices that have been used to discriminate against minorities for generations.
Similarly, evidence shows that data brokers increasingly seek to identify financially vulnerable populations to assist companies in targeting their financial products. For example, a congressional report found that data brokers advertise the ability to identify consumers with fewer traditional bank accounts than average; these include “widows,” “new legal immigrants,” and “consumers with transitory lifestyles, such as military personnel.” Given the financial industry’s history of targeting predatory loans at minority and disadvantaged communities, we should not take lightly the ability of financial institutions to target consumers in this way.
Indeed, ad targeting can even impact the job openings that someone is likely to see. A recent study found that Google’s algorithm was more likely to show prestigious, high-paying executive jobs to men than to women.
This is precisely why the FCC rules are so important. In some cases, it is in consumers’ best interest for their private data to remain just that—private. Yet even if consumers take steps to protect their private information by, for example, using a private web browsing option, their ISP can still see the websites they visit and has access to a host of other sensitive information.
Some critics of the FCC rules have said that they believe the free market will correct these types of abuses. But this is unlikely to be the case. Nearly half of Americans have access to only one high-speed ISP. In many rural areas, in particular, there is an alarming lack of high-speed Internet options. And anyone who has ever tried to negotiate a bill with these companies can tell you it is not an easy endeavor (ISPs consistently rank at the bottom of customer satisfaction surveys).
Moreover, reversal of the FCC rules using the Congressional Review Act—the procedural maneuver being employed by Congress—has other far-reaching consequences. The act prevents the FCC from issuing rules that are “substantially the same” in the future. As a result, the FCC may be unable to issue rules that directly respond to future abuses, even if technologies and public opinion on ISP behavior have shifted. What’s more, other government agencies like the Federal Trade Commission lack the authority to issue proactive rules or do not have jurisdiction over some ISPs.
All of this leaves consumers with little ability to control just how their personal information is used. It leaves government agencies tasked with protecting these consumers even less equipped to address future abuses. And it leaves the public once again let down by members of Congress who have voted to sacrifice consumers’ rights so companies can make an extra buck.
Neema Singh Guliani is a legislative counsel with the American Civil Liberties Union Washington Legislative Office.