Feature

Is The Home Office’s ‘Secret’ Visa Application Algorithm Discriminatory?

By Sophie Blake, Volunteer Writer | 8 Nov 2019
Discrimination, Immigration

Imagine arriving at Heathrow Airport to find that everyone who is not British has been divided into three queues at passport control. Individuals from rich, predominantly white countries are directed towards the fast-moving green queue, while other nationalities – mostly ethnic minorities – must pass through the backlogged red queue, where each person is subject to intense questioning and background checks.

This could be the hidden reality of our immigration system, according to rights groups the Joint Council for the Welfare of Immigrants (JCWI) and Foxglove, an advocacy organisation working to make technology accountable.

Last week, it was announced that the groups had initiated legal proceedings to force the Home Office to reveal how the “secretive technology” it uses to stream visa applications works.

The groups suspect that the tool is the digital version of a “racist” set of immigration policies, known as the “hostile environment.”

The Home Office insists that the tool is not discriminatory and that it complies with equality law.

What Is The ‘Hostile Environment’?

The hostile environment is an immigration policy, introduced by Theresa May when she was Home Secretary, which is designed to make it as difficult as possible for people with uncertain immigration status to go about their lives in the UK.

It includes controversial measures such as the attempt to collect information on the immigration status of pupils via the school census. This measure, along with a number of others, was later dropped following fierce opposition.

The hostile environment has been condemned as racist and discriminatory towards people from black and minority ethnic (BME) backgrounds, upon whom the policy has a disproportionate impact. 

Research carried out in 2017 by JCWI into the impact of the hostile environment in housing revealed that, in a mystery shopping exercise, 58 percent of landlords turned down an enquiry from a British BME tenant without a passport.

In May 2019, the UN Special Rapporteur on Racism echoed concerns over the discriminatory impact of the hostile environment immigration policies in her report on the UK.

She wrote: “Where the strategy for immigration enforcement is so overbroad, and foreseeably results in the exclusion, discrimination and subordination of groups and individuals on the basis of their race, ethnicity or related status, such a strategy violates international human rights law and undermines the Government’s stated commitments to racial equality.” 

The government has attempted to distance itself from the hostile environment following the Windrush scandal, with then-Home Secretary Sajid Javid preferring the term “compliant environment.”

Where Do Algorithms Come In?

Image Credit: Pixabay.

The Home Office uses a digital tool to stream visa applications into “Red”, “Amber” and “Green” categories.

However, it has refused requests made under the Freedom of Information Act to explain how the algorithm works, citing an immigration exemption.

JCWI and Foxglove fear the tool uses “crude characteristics”, such as nationality and age, to filter applications. They argue this has resulted in applicants from predominantly rich, white countries being streamed into the “Green” category and effectively fast-tracked through the system, while poorer people of colour are subject to increased scrutiny and checks.
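
Because the Home Office has not disclosed how the tool works, any technical description of it is necessarily speculative. Purely as an illustration of what the campaigners fear, the minimal Python sketch below shows how a rule-based system could stream applications into Red, Amber and Green categories from a crude characteristic like nationality. The nationality lists, field names and function here are invented for this example and do not reflect any known Home Office code.

```python
# Hypothetical illustration only: this does NOT depict the Home Office's
# actual tool, whose workings remain undisclosed. It sketches the kind of
# crude, nationality-based streaming that JCWI and Foxglove fear occurs.
from dataclasses import dataclass


@dataclass
class VisaApplication:
    nationality: str  # invented field for illustration
    age: int          # invented field for illustration


# Invented lists for illustration; no real Home Office data is known.
LOW_RISK_NATIONALITIES = {"Exampleland"}
HIGH_RISK_NATIONALITIES = {"Otherland"}


def stream(application: VisaApplication) -> str:
    """Assign a Red/Amber/Green category from crude characteristics."""
    if application.nationality in LOW_RISK_NATIONALITIES:
        return "Green"  # fast-tracked, minimal scrutiny
    if application.nationality in HIGH_RISK_NATIONALITIES:
        return "Red"    # flagged for intensive checks
    return "Amber"      # default: moderate scrutiny


print(stream(VisaApplication(nationality="Exampleland", age=34)))  # Green
```

If a system even loosely resembling this were in use, nationality alone could determine which queue an applicant joins before any human has looked at their case, which is precisely why the campaigners argue disclosure matters.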

The Home Office maintains that the AI system is used only to “allocate” applications and that visa decisions are ultimately taken by a human caseworker. A spokesman said: “We have always used processes that enable UK Visas and Immigration to allocate cases in an efficient way.

“The streaming tool is only used to allocate applications, not to decide them. It uses data to indicate whether an application might require more or less scrutiny and it complies fully with the relevant legislation under the Equality Act 2010.”

Nevertheless, while a person allocated to the red category may still be granted a visa, JCWI maintains that their chances are significantly reduced compared with an equivalent applicant allocated to the green category. The process is also likely to take much longer.

In a parliamentary debate on the topic in June, Chi Onwurah, the Shadow Minister for Business, Energy and Industrial Strategy, raised the issue of confirmation bias – where human decision-makers are inclined to approve computer-made decisions. 

Who Is Affected?

It is impossible to know exactly who is affected as the Home Office has refused to disclose details of the algorithm.

But Cori Crider, a director of Foxglove, told the Guardian that, if nationality is being used to stream applications, “it’s easy to guess who ends up in the green queue and who gets pushed to the back of the bus in red”. 

Delays and difficulties in obtaining a visa can have wide-reaching impacts, affecting a person’s ability to visit family, take up studies, or travel for work. Oxfam has highlighted the “self-harming” impact of the UK’s visa policy. In April, just one of the 25 African academics expected at its blog-writing training course was granted a visa in time to attend the event. 

What About Human Rights?

Image Credit: Flickr.

Article 14 of the Human Rights Act protects against discrimination in the enjoyment of the other rights the Act contains. While there are instances where discrimination or unequal treatment by the government may be permitted, these are strictly limited and there must be objectively justifiable reasons for such action.

The Home Office also maintains that it does not stream “on the basis of race” and that it complies fully with the Equality Act 2010, which includes a duty on government to consider how its decisions and policies affect people based on “protected characteristics” such as race and disability.

However, a lack of transparency over how the digital system functions means it is difficult to assess whether the government is meeting its obligations. 

Speaking to the Financial Times in June, Christina Blacklaws, president of the Law Society, said the practice of screening visa applicants by algorithm “may well disadvantage certain groups of people based on generic markers such as age, country of origin or whether they have travelled before.”

What Next?

JCWI, supported by Foxglove, is now bringing a judicial review in an attempt to force the Home Office to disclose details of the algorithm used to screen applicants.

They are seeking details of the policy and guidance documents that inform the streaming process, as well as technical information on how the system works.

A crowdfunding appeal has been launched to support the legal costs of the challenge.

Featured Image Credit: Unsplash.


About The Author

Sophie Blake, Volunteer Writer

Sophie works in the human rights sector and has worked on issues including the death penalty, criminal justice reform and public engagement in decision-making. She holds an MA in Understanding and Securing Human Rights from the School of Advanced Study.
