Feature

What Could Artificial Intelligence Mean For Human Rights?

By Emma Guy, Editor 27 Apr 2023
Discrimination, Equality, Privacy
Credit: Hitesh Choudhary


If you have not heard of artificial intelligence (AI), you will most likely have heard of facial recognition and online chatbots, both of which use it. While AI has the potential to help solve societal challenges, there has been a widespread call for global regulation to ensure it is used in responsible and inclusive ways.

In May, the European Parliament will vote on the AI Act, which will set out to regulate AI technologies in order to protect human rights. In an open letter to members of the parliament’s leading committees, civic organisations including Amnesty International have called for the following to be prohibited: automated risk assessments and profiling systems; predictive analytical systems used to interdict, curtail and prevent migration; and “deception detectors” and remote biometric identification (RBI) in the context of migration and border management.

What is AI?

AI is the simulation of human intelligence by machines. It requires a foundation of specialised hardware and software for writing and training machine learning algorithms. AI programming focuses on cognitive skills that include learning, reasoning, self-correction and creativity.

AI is developing at an increasingly rapid pace. New tools like OpenAI’s ChatGPT – the fastest-growing app in history – are being released every month. While there are several levels at which AI can function, AI systems do not have a sense of self. This means that AI systems or machines do not have consciousness or self-awareness and therefore cannot understand their current state.

 

Kate Jones, director of the Diplomatic Studies Programme at Oxford University, is an expert consultant on human rights law. Speaking about resetting the relationship between AI and human rights, Jones stated: “Human rights are central to what it means to be human. They were drafted and agreed, with worldwide popular support, to define freedoms and entitlements that would allow every human being to live a life of liberty and dignity.”

What does AI mean for workers’ rights?

Workers’ rights have also been called into question by algorithms that calculate pay. Examples include app-based services that can encourage workers to undertake longer shifts. The concern for gig workers is that, in some cases, drivers or riders have worked longer hours but not necessarily been paid for that extra time. The most recent example has been Lyft drivers, who claim that destination filters have been changed and that they have been penalised for not completing those jobs.

More broadly, AI threatens to replace human workers in a variety of fields, endangering people’s right to work.

Could AI help tackle online sexual exploitation?

Technology-facilitated child sexual abuse and exploitation is not only a child protection matter but also raises questions of corporate responsibility and public policy.

The charity End Violence Against Children and Technological University Dublin have been developing ‘N-Light’, an AI tool to tackle online child sexual exploitation and abuse. The AI tool aims to advance global understanding of trends in perpetrator behaviour and to expose the strategies and tactics that are used to gain access to and coerce children into sexually exploitative acts.

Dr Susan Mckeever and Dr Christina Thorpe stated: “There is great potential to identify gaps in policy, emerging or evolving threats, technologies being misused for nefarious purpose, etc, which could better equip front-line responders, such as hotlines, helplines, and child’s rights agencies in their work.”

Identifying forced labour at sea

Fishing on the high seas is not only labour intensive but also remote, making it an attractive environment for human traffickers to employ forced labour. Under, for example, the European Convention on Human Rights and the UK Human Rights Act, people have the right to freedom from slavery and forced labour.

According to the International Labour Organisation (ILO), there are several ways in which AI could help non-governmental organisations (NGOs) and law enforcement agencies identify people who have been trafficked or forced to work on fishing boats.

Combining data from the Automatic Identification System (AIS) and existing satellite imagery with AI could help port inspectors identify victims of modern slavery.

The Automatic Identification System was designed to avoid collisions on the high seas, and Vessel Monitoring Systems were adapted for fishing vessels to support fisheries management at the national level. Meanwhile, existing satellite imaging of vessels can help identify craft that have turned off their tracking systems – or “gone dark”. However, the ILO acknowledges that the AI would need to incorporate standards set by the ILO’s Work in Fishing Convention No 188, as well as the ILO’s indicators of forced labour.
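As a simplified illustration of the “gone dark” detection described above, the sketch below flags unusually long gaps between a vessel’s AIS transmissions. The six-hour threshold, the ping timestamps and the approach itself are assumptions made for illustration only – real monitoring systems combine far richer signals, including satellite imagery.

```python
from datetime import datetime, timedelta

# Hypothetical threshold: a gap longer than this between AIS pings
# is treated as the vessel having "gone dark" (assumption, not a standard).
GAP_THRESHOLD = timedelta(hours=6)

def find_dark_periods(pings, threshold=GAP_THRESHOLD):
    """Return (start, end) pairs where consecutive AIS pings are
    further apart than the threshold."""
    pings = sorted(pings)
    gaps = []
    for earlier, later in zip(pings, pings[1:]):
        if later - earlier > threshold:
            gaps.append((earlier, later))
    return gaps

# Example: a vessel that stops transmitting for 12 hours.
pings = [
    datetime(2023, 4, 1, 0, 0),
    datetime(2023, 4, 1, 4, 0),
    datetime(2023, 4, 1, 16, 0),  # 12-hour silence before this ping
    datetime(2023, 4, 1, 20, 0),
]
print(find_dark_periods(pings))
```

In practice, such a flag would only be one input among many: inspectors would still cross-reference satellite imagery and the ILO’s forced-labour indicators before acting.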

Ongoing concerns about facial recognition 

You do not have to travel far in the UK before your face is picked up through facial recognition software. This type of system is implemented in public spaces such as train stations. However, the growing use of facial recognition combined with machine learning raises other concerns. Facial recognition software is capable of tracking individuals, as well as the movements of groups or crowds. This technological development enables an unprecedented form of mass surveillance, potentially putting at risk people’s right to freedom of assembly and association and their right to a private and family life.

Research also indicates that facial recognition can be discriminatory when it comes to race and gender. According to a study by the Massachusetts Institute of Technology (MIT) and Microsoft, facial recognition algorithms are less likely to accurately identify women and people of colour than men and white people – meaning that women of colour, in particular, face higher rates of mistaken identity, unfounded suspicion and false accusations.

Regulation is needed 

In order to safeguard human rights and for AI to contribute to the cause of social justice worldwide, regulation is needed.

Kate Jones said: “AI, its systems and its processes have the potential to alter the human experience fundamentally. But many sets of AI governance principles produced by companies, governments, civil society and international organisations do not mention human rights at all. This is an error that requires urgent correction.”

The Council of Europe has also echoed this concern, previously stating: “Ensuring that human rights are strengthened and not undermined by artificial intelligence (AI) is one of the key factors that will define the world we live in. AI-driven technology is entering more aspects of every individual’s life, from smart home appliances to social media applications, and it is increasingly being utilised by public authorities to evaluate people’s personality or skills, allocate resources, and otherwise make decisions that can have real and serious consequences for the human rights of individuals.”

From spyware and its implications for the right to privacy to self-driving vehicles and their implications for the right to life, systems which deploy AI raise many difficult questions when it comes to safeguarding our human rights. The European Parliament will vote on the AI Act in the coming weeks.

About The Author

Emma Guy Editor

Emma has a background in undercover and investigative journalism. For the last few years, she has co-created Investigation units for independent media outlets and produced investigative podcasts that lift the lid on injustices in the UK legal system. She is passionate about making investigations and human rights inclusive for audiences and works with grassroots movements and activists to do this. Outside of work, Emma is also a PhD candidate in Human Rights Law, investigating reproductive rights and trafficking in the UK and Europe.