Uber Faces Legal Action Over ‘Racist’ Facial Recognition Software

By Hannah Shewan Stevens, Interim Editor 11 Oct 2021


Uber is facing legal action following revelations that its facial recognition algorithm is five times more likely to cause the termination of darker-skinned workers. 

The company uses the system – describing it as a “photo comparison” tool – to ensure couriers and drivers match the photographs attached to their user profiles. Workers can choose to have their identity verified by the software developed by Microsoft or by a human reviewer. 

The Independent Workers’ Union of Great Britain (IWGB) has filed a claim for indirect racial discrimination on behalf of a member whose account was terminated following an error with the algorithm. The App Drivers and Couriers Union (ADCU) has also launched employment tribunal claims against Uber on the same grounds.

“Hundreds of drivers and couriers who served through the pandemic have lost their jobs without any due process or evidence of wrongdoing and this reflects the larger culture at Uber, which treats its majority-BAME workers as disposable,” said Henry Chango Lopez, general secretary of the IWGB. “Uber must urgently scrap this racist algorithm and reinstate all the drivers it has unfairly terminated.”

With multiple companies, including courier service Bolt, implementing artificial intelligence-driven facial recognition software, fears that discrimination is buried in the code are increasing. Companies that fail to rectify discriminatory coding risk breaching Article 14 of the Human Rights Act, which protects against discrimination.

The system is allegedly fraught with problems: workers have had their accounts frozen, faced threats of termination or been permanently dismissed after their selfies failed to match the company’s “Real Time ID Check”.

Software similar to the version Uber uses was shown to have a failure rate of 6% for darker-skinned male faces and 20.8% for darker-skinned female faces. The same research revealed that the failure rate for white men was 0%. This has potentially profound ramifications for Uber’s workforce in London, 95% of which comes from ethnic minority backgrounds.

Concerns about discrimination built into facial recognition software have been mounting for some time. In January 2021, Amnesty International launched its Ban The Scan campaign to prevent the use of facial recognition systems. In 2020, Microsoft said it would refuse to sell such technology to police departments in the United States until federal law regulates the technology.

On 6 October, Uber drivers took strike action, boycotting the app for 24 hours and protesting outside Uber HQ in Aldgate, London. 

“This movement is gaining real momentum now because we know that we must stand up for our rights if we want them respected,” explained Nader Awaad, Uber driver and chair of the United Private Hire Drivers (UPHD) branch of the IWGB. “We deserve equality, fair pay and job security and we are prepared to fight for it.”

In the last year alone, the IWGB has represented over 200 drivers and couriers who have been unfairly terminated on a range of grounds, including false customer complaints. 

Many were terminated without investigation, proper notice, a chance to appeal or a right to representation. Although a Supreme Court ruling determined that Uber drivers are workers and should be protected from discrimination, drivers and couriers are not yet legally protected from unfair dismissal.

Last week’s strike launched a joint campaign by Black Lives Matter UK (BLM UK) and the IWGB to fight against the use of biased facial recognition algorithms and demand that unfairly terminated workers are reinstated. 

“The impact of Uber’s facial recognition algorithm reflects a complete lack of care for black people and their livelihoods,” said a representative from BLM UK. “The gig economy, which already creates immense precarity for Black key workers, is now further exacerbated by this software that prevents them from working at all, purely based on the colour of their skin. Racist practices such as these must come to an end.”

The two organisations are calling for the introduction of a fair terminations process, first demanded last year by more than 60 MPs, as well as a liveable wage for all drivers, a maximum commission of 15%, transparency over charges to customers and an end to fixed-rate trips.

About The Author

Hannah Shewan Stevens, Interim Editor

Hannah Shewan Stevens is an NCTJ-accredited freelance journalist, editor, speaker and press officer based in Birmingham. Her areas of interest are broad-ranging but the topics she is most passionate about are disability, social justice, sex and relationships and human rights. Hannah believes in using her own voice and elevating others to create meaningful change in the world. She is also a sex columnist for The Unwritten and has recently completed her first accreditation in delivering Relationships and Sex Education.
