In a world increasingly reliant on technology, UK police forces’ escalating use of retrospective facial recognition (RFR) technology is causing alarm among human rights advocates and privacy campaigners. The technology, which deploys advanced algorithms to scan millions of custody images, has seen a substantial rise in use despite persistent warnings about its opacity and potential for misuse.
An investigation by i and Liberty Investigates revealed a 330% increase last year in searches using RFR technology against the Police National Database (PND). The PND holds more than 16 million images of people who have been arrested, many of whom were never charged with a crime. The technology compares these images against pictures of suspects drawn from sources such as CCTV, mobile phones, dashcams, and doorbell footage.
Campaigners have warned of the risk that innocent individuals could be wrongly identified by the system. The PND and other databases maintained by individual forces reportedly contain hundreds of thousands of images of people who were either never charged following their arrest or were later acquitted of a criminal offense.
Secrecy and lack of public awareness
The secrecy surrounding the use of RFR technology and the limited public awareness of it have both been subjects of concern. Despite denials from 13 of the UK’s 43 territorial police forces that they used RFR in 2022, Home Office figures reveal they conducted thousands of searches. The Policing Minister, Chris Philp, acknowledged that all 43 UK territorial police forces are now utilizing the technology, anticipating it will have “an enormous impact on our ability to lock up criminals.”
Potential for unlocking historic crimes
Senior officers, including the Commissioner of the Metropolitan Police, Sir Mark Rowley, have praised the technology, stating it has “immense potential” and could revolutionize crime-fighting, akin to the impact of DNA three decades ago. The technology is now being extended to “cold cases,” reviewing historical casework to identify previously unidentified individuals.
However, human rights groups and the Government’s surveillance watchdog have expressed grave concerns about the rapid deployment of RFR. They emphasize a lack of transparency and potential breaches of legal obligations by police forces failing to publish policies on the usage of facial recognition. Fraser Sampson, the Government’s biometrics and surveillance camera commissioner, stated that the level of transparency around the technology “falls far short” of what is needed.
Retention of innocent individuals’ images
A significant issue is the retention of images of individuals on the PND and other law enforcement databases even though they were never charged or were subsequently cleared of any charge. Although a Court of Appeal ruling declared the retention of these images unlawful, forces are keeping the data indefinitely. Campaigners argue that millions of such images have been wrongfully retained, leaving innocent people wrongly labeled as criminals.
Police forces’ stance
Police forces maintain that legacy computer systems require custody images to be removed individually, and that resource constraints prevent them from discarding all wrongly retained data. The National Police Chiefs’ Council (NPCC) insisted that RFR technology allows police to identify and eliminate suspects faster and more accurately. The Metropolitan Police said facial recognition provided “fantastic opportunities” for more effective policing.
The escalating use of retrospective facial recognition technology by police forces in the UK is a double-edged sword. While it holds immense potential for solving crimes, including historic ones, the lack of transparency, potential misuse, and retention of innocent individuals’ images raise serious ethical, legal, and human rights concerns. The balance between leveraging technology for public safety and upholding individual rights and freedoms remains a contentious issue, necessitating robust debates, clear policies, and stringent safeguards.