UK Uber drivers are taking the algorithm to court

A group of UK Uber drivers has launched a legal challenge against the company’s subsidiary in the Netherlands. The complaints relate to access to personal data and algorithmic accountability.

Uber drivers and Uber Eats couriers are being invited to join the challenge, which targets Uber’s use of profiling and data-fuelled algorithms to manage gig workers in Europe. Platform workers involved in the case are also seeking to exercise a broader suite of data access rights baked into EU data protection law.

It looks like a fascinating test of how far existing legal protections wrap around automated decisions at a time when regional lawmakers are busy drawing up a risk-based framework for regulating applications of artificial intelligence.

Many uses of AI technology look set to remain subject only to the protections contained in the existing General Data Protection Regulation (GDPR), so determining how far those protections extend in the context of modern data-driven platforms is important.

The European Commission is also working on rebooting liability rules for platforms, with a proposal for a Digital Services Act due by the year’s end. As part of that work it’s actively consulting on related issues such as data portability and platform worker rights — so the case looks very timely.

Via the lawsuit, which has been filed in Amsterdam’s district court today, the group of Uber drivers from London, Birmingham, Nottingham and Glasgow will argue the tech giant is failing to comply with the GDPR and will ask the court to order immediate compliance, with a fine of €10,000 for each day of non-compliance.

They will also ask the court to order Uber to comply with a request to enable them to port personal data held on the platform to a data trust they want to establish, administered by a union.

For its part Uber UK said it works hard to comply with data access requests, further claiming it provides explanations when it’s unable to provide data.

Data rights to crack open an AI black box?

The GDPR gives EU citizens data access rights over personal information held on them, including a right to obtain a copy of data they have provided so that it can be reused elsewhere.

The regulation also provides some additional access rights for individuals who are subject to wholly automated decision-making processes where there is a substantial legal or similar impact — which looks relevant here because Uber’s algorithms essentially determine the earning potential of a driver or courier based on how the platform assigns (or withholds) jobs from the available pool.

As we wrote two years ago, Article 22 of the GDPR offers a potential route to put a check on the power of AI black boxes to determine the trajectory of humankind, because it requires that data controllers provide some information about the logic of the processing to affected individuals. It’s unclear how much detail they have to give, however, so the suit looks set to test the boundaries of Article 22, as well as making reference to more general transparency and data access rights baked into the regulation.

James Farrar, an Uber driver who is supporting the action — and who was also one of the lead claimants in a landmark UK tribunal action over Uber driver employment rights (which, in related news, is due to reach the UK Supreme Court tomorrow, as Uber has continued appealing the 2016 ruling) — confirmed the latest challenge is “full spectrum” as far as GDPR rights are concerned.

The drivers made subject access requests to Uber last year, asking the company for detailed data about how its algorithm profiles and performance-manages them. “Multiple drivers have been provided access to little or no data despite making a comprehensive request and providing clear detail on the data requested,” they write in a press release today.

Farrar confirmed that Uber provided him with some data last year, after what he called “multiple and continuous requests”, but he flagged a number of gaps in the information — such as GPS data only being provided for one month out of two years of work; no information on the trip rating assigned to him by passengers; and no information on his profile or the tags assigned to it.

“I know Uber maintain a profile on me but they have never revealed it,” he told TechCrunch, adding that the same is true of performance tags.

“Under GDPR Uber must explain the logic of processing, it never really has explained management algorithms and how they work to drivers. Uber has never explained to me how they process the electronic performance tags attached to my profile for instance.

“Many drivers have been deactivated with bogus claims of ‘fraudulent use’ being detected by Uber systems. This is another area of transparency required by law but which Uber does not uphold.”

The legal challenge is being supported by the App Drivers & Couriers Union (ADCU) which says it will argue Uber drivers are subject to performance monitoring at work.

It also says it will present evidence of how Uber has attached performance related electronic tags to driver profiles with categories including: Late arrival/missed ETAs; Cancelled on rider; Attitude; Inappropriate behaviour.

“This runs contrary to Uber’s insistence in many employment misclassification legal challenges across multiple jurisdictions worldwide that drivers are self-employed and not subject to management control,” the drivers further note in their press release.

Commenting in a statement, their attorney, Anton Ekker of Ekker Advocatuur, added: “With Uber BV based in the Netherlands as operator of the Uber platform, the Dutch courts now have an important role to play in ensuring Uber’s compliance with the GDPR. This is a landmark case in the gig economy with workers asserting their digital rights for the purposes of advancing their worker rights.”

The legal action is being further supported by the International Alliance of App-based Transport Workers (IAATW) in what the ADCU dubs an “unprecedented international collaboration”.

Reached for comment on the challenge, Uber emailed us the following statement:

Our privacy team works hard to provide any requested personal data that individuals are entitled to. We will give explanations when we cannot provide certain data, such as when it doesn’t exist or disclosing it would infringe on the rights of another person under GDPR. Under the law, individuals have the right to escalate their concerns by contacting Uber’s Data Protection Officer or their national data protection authority for additional review.

The company also told us it responded to the drivers’ subject access requests last year, saying it had not received any further correspondence since.

It added that it’s waiting to see the substance of the claims in court.

The unions backing the case are pushing for Uber to hand over driver data to a trust they want to administer.

Farrar’s not-for-profit, Worker Info Exchange (WIE), wants to establish a data trust for drivers for the purposes of collective bargaining.

“Our union wants to establish a data trust but we are blocked in doing so long as Uber do not disclose in a consistent way and not obstruct the process. API would be best,” he said on that, adding: “But the big issue here is that 99.99% of drivers are fobbed off with little or no proper access to data or explanation of algorithm.”

In a note about WIE on the drivers’ attorney’s website, the law firm says other Uber drivers can participate by providing their permission for the not-for-profit to put in a data request on their behalf, writing:

Worker Info Exchange aims to tilt the balance away from big platforms in favour of the people who make these companies so successful every day – the workers.

Uber drivers can participate by giving Worker Info Exchange their mandate to send a GDPR-request on their behalf.

The drivers have also launched a Crowdjustice campaign to help raise £30,000 to fund the case.

Discussing the legal challenge and its implications for Uber, Newcastle University law professor Lilian Edwards suggested the tech giant will have to show it has “suitable safeguards” in place around its algorithm, assuming the challenge focuses on Article 22.

“Article 22 normally gives you the right to demand that a decision made in a solely automated way — such as the Uber algorithm — should either not be made or made by a human. In this case Uber might claim however, with some success, that the algorithm was necessary for the Uber context with the driver,” she told us.

“However that doesn’t clear their path. They still have to provide ‘suitable safeguards’ — the biggest of which is the much-discussed right to an explanation of how the algorithm works. But no one knows how that might operate.

“Would a general statement of roughly how the algorithm operates suffice? What a worker would want instead is to know specifically how it made decisions based on his data — and maybe how it discriminated against him or disfavoured him. Uber may argue that’s simply impossible for them to do. They might also say it reveals too much about their internal trade secrets. But it’s still terrific to finally have a post-GDPR case exploring these issues.”

In its guidance on Article 22 requirements on its website, the UK’s data watchdog, the ICO, specifies that data controllers “must provide meaningful information about the logic involved in the decision-making process, as well as the significance and the envisaged consequences for the individual”.

It also notes that Article 22 requires individuals who are subject to automated decisions to be able to obtain human review of the outcome if they ask, and that the law allows them to challenge algorithmic decisions. Data controllers using automation in this way must also take steps to prevent bias and discrimination.
