What’s the problem?
The Home Office is using two AI tools. The Asylum Case Summarisation (ACS) tool uses ChatGPT-4 to summarise asylum interview transcripts. The Asylum Policy Search (APS) tool summarises Country Policy and Information Notes (CPINs), guidance documents, and Country of Origin Information (COI) reports.
The Home Office’s own evaluation revealed that 9% of the ACS AI-generated summaries were so flawed they had to be removed from the pilot.
It also found that 23% of caseworkers lacked full confidence in the tools’ outputs.
Despite this, the Home Office has pushed ahead and rolled both tools out.
Even worse, individuals affected by these tools will NOT be informed that AI has been used in their cases, so applicants have no way of checking how the information they supplied has been changed by AI. A legal opinion has now found that the Home Office’s failure to tell people that AI tools are being used in their assessments is likely to be unlawful.
Bringing a legal challenge will take time – we need to stop their use now.
Governments have a history of testing novel technologies on vulnerable populations before rolling them out more widely. So it is in all of our interests to stop these tools now, before AI is deployed across government departments to inform life-changing decisions.
Write to your MP now.
