Taxi drivers want to know what Uber’s algorithms know, and they’re willing to appeal to EU privacy law to find out


British taxi drivers who have long been fighting Uber over workers’ rights have now opened up a new front in their war: They’ve launched a data-protection complaint against the ride-hailing company.

The complaint has implications not only for Europe’s rules around the gig economy, but also for companies’ ability to maintain the secrecy of their proprietary algorithms—algorithms that in Uber’s case use information about drivers’ performance to allocate rides to them.

This means drivers’ income is partly dictated by information about things such as their driving behavior and passengers’ communications with Uber’s support department.

The newly minted App Drivers & Couriers Union (ADCU) filed its complaint against Uber in Amsterdam on Monday, the day before Uber starts trying to convince the U.K. Supreme Court that its workers are contractors, not employees, and therefore it doesn’t have an employer’s responsibilities regarding minimum wage and holiday pay. The case will be Uber’s last chance to overturn three successive rulings against it, and its main adversaries in that case, former drivers Yaseen Aslam and James Farrar, are also the president and general secretary respectively of the ADCU.

The union filed in Amsterdam because the city is home to Uber’s international headquarters. According to the complaint, Uber doesn’t give its drivers all the personal data it holds on them, nor does it fully explain to them how its algorithms use this information to manage drivers based on their performance.

If this is true, Uber might be breaking the EU’s tough data-protection law, the General Data Protection Regulation (GDPR), in multiple ways.

The law says a company must tell people what personal data it holds on them, and give them a chance to download that data in a common, machine-readable format that allows it to be ported over to a rival platform. According to the ADCU, Uber coughs up only very limited personal data in this way, or at all.

This is a particular problem for the ADCU and the affiliated International Alliance of App-Based Transport Workers (IAATW), which are trying to set up a “data trust.” The idea is to build a repository of drivers’ personal data that is outside the platforms’ control and aids collective bargaining. But unless companies such as Uber free their drivers’ data in a portable format, the idea won’t fly.

“Uber has deliberately blocked the efforts of drivers to access their data for the purposes of establishing a data trust,” Farrar said in an ADCU statement. “This is not only a violation of the law but a terrible abuse of Uber’s position of informational power over drivers. Drivers suffer not only wage theft but data theft too.”

The GDPR also says that if a company is using people’s data to make an automated decision about them or to profile them, it has to be transparent about the logic involved: Individuals must have the information they need to challenge that automated decision or profile, if they want to.

According to the union, Uber is again failing to give its drivers access to the performance-related profile data that dictates their treatment by the platform. The information here includes notes that Uber employees attach to certain drivers’ profiles, and tags for things like professionalism and navigation skills.

“We all know Uber manages by algorithm, but we don’t know exactly how, even though workers have the legal right to know,” reads the crowdfunding page for the ADCU’s Dutch complaint. “We have evidence that Uber maintains secret driver and courier profiles, which it uses to rate [workers’] performance with categories such as ‘late arrival/missed ETA,’ ‘negative attitude’ or ‘inappropriate behavior.’ For years, we have worked with hundreds of drivers to make data requests, but Uber always blocks the process and refuses to accept any collective approach.”

Uber had not responded to a request for comment at the time of writing, but a spokesperson told the Guardian that its privacy team “works hard to provide any requested personal data that individuals are entitled to.”

The GDPR allows for massive fines of up to 4% of global annual revenues for particularly egregious lawbreaking, but the union is only asking for a fine of 10,000 euros ($11,430) for each day Uber doesn’t comply with the law.

That would be a paltry amount for a company whose 2019 net revenue ran to around $14 billion. Far more momentous would be an order by the Dutch data protection authority against Uber’s algorithmic secrecy.

The part of the GDPR covering automated decision-making hasn’t been tested much in the couple of years since the law entered into force.

One notable exception came last year, when the Finnish data protection authority cracked down on Swedish financial firm Svea Ekonomi over its refusal to tell an 83-year-old man why it rejected his creditworthiness. The investigation showed that Svea Ekonomi’s automated systems would not approve credit for someone that age, which the regulator said was legally unacceptable.

Uber has already shown itself to be the kind of company that will argue a case all the way up to Europe’s highest court if necessary. It did so, unsuccessfully, when trying to claim in a French case that it was just a digital platform rather than a transportation company.

So it is quite possible that this data-protection complaint, if upheld, could end up snowballing—and companies operating in Europe might end up having to be much more transparent about how their algorithms treat people.
