
Europe – Dutch Court of Appeals rules against Uber and Ola Cabs in GDPR case

06 April 2023

An appeals court in the Netherlands ruled in favour of workers and against Uber and Ola Cabs in a series of rulings on driver data protection claims. Founded in India, Ola Cabs claims to be the world’s third-largest ride-hailing app.

Worker Info Exchange brought the cases in support of members of the App Drivers & Couriers Union in the UK and a driver based in Portugal.

TechCrunch reports that the legal challenges against the algorithmic management practices of Uber and Ola were originally lodged on behalf of drivers in the UK in July and September 2020, and were centred on digital and data access rights enshrined in the European Union’s GDPR (General Data Protection Regulation).

GDPR provides individuals with rights to the data held on them and to information about algorithmic decision making applied to them where it has a substantial or legal effect (such as employment/access to work). And while the UK is no longer an EU member, it transposed the European data protection framework into national law before leaving the EU.

The rulings followed earlier judgments by the Amsterdam District Court in March 2021, which did not accept the robo-firing charges in those instances and largely rejected the drivers’ requests for specific data.

In the appeal against judgments of the Amsterdam District Court, the drivers and union requested that the Court order Uber and Ola to explain how certain decisions are made. These include decisions by Uber to unilaterally close driver accounts due to suspected fraud, without the drivers being heard before those decisions were taken.

The drivers also wanted information about how decisions are made on which journeys are allocated to them (matching passenger to driver), how their journey prices are determined, and how certain 'scores', such as the 'fraud probability' score, are assigned to them in the companies' internal systems.

The first case involved four drivers who were found to be effectively robo-fired by Uber without recourse. The second case involved the denial of access to personal data upon requests made to Uber by six drivers. The third case involved the denial of access to personal data upon requests made to Ola Cabs by three drivers.

The drivers faced spurious allegations of ‘fraudulent activity’ by Uber and were dismissed without appeal.

The court found that the limited human intervention in Uber’s automated decisions to dismiss workers was not ‘much more than a purely symbolic act’. The decision to dismiss the drivers was taken remotely at an Uber office in Krakow, Poland, and the drivers were denied any opportunity to be heard.

The court noted that Uber had failed to make “clear what the qualifications and level of knowledge of the employees in question are. There was thus insufficient evidence of actual human intervention.” 

The court found that the drivers had been profiled and performance managed by Uber: “this example illustrates, in the court's view, that it involves automated processing of personal data of drivers whereby certain personal aspects of them are evaluated on the basis of that data, with the intention of analysing or predicting their job performance, reliability and behaviour.”

The court ordered that Uber must explain how driver personal data and profiling is used in Uber’s upfront, dynamic pay and pricing system. Similarly, the court ordered Uber to transparently disclose how automated decision making and worker profiling is used to determine how work is allocated amongst a waiting workforce.

Ola Cabs was also ordered to disclose meaningful information about how worker earnings profiles and so-called ‘fraud probability scores’ are used in automated decision making for work and fares allocation.

The court also rejected arguments by both Uber and Ola Cabs that explaining the allegations and the automated decision making negatively affecting workers would threaten their rights to protect trade secrets. The court ruled that such claims were entirely disproportionate relative to the negative effect of unexplained automated dismissal and disciplining of workers. It also rejected the argument that the requests for data and the involvement of Worker Info Exchange and the ADCU trade union amounted to an abuse of the data protection rights of the individual appellants.

James Farrar, Director of Worker Info Exchange said, “This ruling is a huge win for gig economy workers in Britain and right across Europe. The information asymmetry & trade secrets protections relied upon by gig economy employers to exploit workers and deny them even the most basic employment rights for fundamentals like pay, work allocation and unfair dismissals must now come to an end as a result of this ruling. Uber, Ola Cabs and all other platform employers cannot continue to get away with concealing the controlling hand of an employment relationship in clandestine algorithms.”

Anton Ekker of Ekker law said, “Transparency about data processing on Uber and Ola's platforms is essential for drivers to do their jobs properly and to understand how Uber makes decisions about them. The practical and legal objections raised by Uber and Ola were largely rejected by the Amsterdam Court of Appeals.”

“Of great importance, in addition, is the court's finding that several automated processes on Uber and Ola's platforms qualify as automated decision-making within the meaning of Article 22 GDPR,” Ekker added. “These include assigning rides, calculating prices, rating drivers, calculating 'fraud probability scores' and deactivating drivers' accounts in response to suspicions of fraud. The judgments clearly establish that drivers are entitled to information on the underlying logic of these decisions.”

In a statement to TechCrunch, an Uber spokesperson said, “We are disappointed that the court did not recognise the robust processes we have in place, including meaningful human review, when making a decision to deactivate a driver’s account due to suspected fraud.”

“Uber maintains the position that these decisions were based on human review and not on automated decision making, which was acknowledged earlier by the previous court. These rulings only relate to a few specific drivers from the UK that were deactivated in the period between 2018 and 2020 in relation to very specific circumstances,” the spokesperson added.