The launch of a facial recognition application by police forces in Wales, the first such deployment in the UK, has generated concerns about potential human rights breaches. The technology will enable officers to use their mobile phones to verify an individual’s identity. Its potential applications include identifying deceased or unconscious persons, as well as those who are unable or unwilling to provide their details.

Jake Hurfurt, representing the civil liberties and privacy organization Big Brother Watch, stated that the app “creates a dangerous imbalance between the public’s rights with the police’s powers”.

Known as Operator Initiated Facial Recognition (OIFR), the application has been tested by 70 officers across south Wales and is scheduled for use by South Wales Police and Gwent Police.

The forces said that using the app on unconscious or deceased individuals would allow them to be identified promptly, so that families could be contacted with care and compassion. Where someone is wanted for a criminal offense, they said it would ensure a swift arrest and detention. Police also noted that cases of mistaken identity could be resolved easily, without the need for a visit to a police station or custody suite.

Authorities confirmed that photographs captured using the app would not be retained, and that images taken in private locations such as houses, schools, medical facilities, and places of worship would be used only in circumstances involving a risk of significant harm.

However, Mr. Hurfurt asserted: “In Britain, none of us has to identify ourselves to police without very good reason but this unregulated surveillance tech threatens to take that fundamental right away.”

Charlie Whelton, from the civil liberties group Liberty, characterized the technology as a “deeply invasive breach of our privacy rights, data protection laws and equality laws.” He added: “We urgently need the government to introduce safeguards to protect us as we go about our daily lives, rather than allowing the police to continue to experiment at the expense of our civil liberties.”

The software operates by acquiring a “probe image,” typically a face captured from CCTV or a mobile phone, and then measuring the facial features, which constitute biometric data. This data is then compared with all custody images stored in the database shared by police forces.

In August 2020, the Court of Appeal ruled that South Wales Police’s use of automatic facial recognition (AFR) technology was unlawful, following a legal challenge by the civil rights group Liberty and Ed Bridges. The court nevertheless concluded that its application represented a proportionate interference with human rights, as the benefits outweighed the impact on Mr. Bridges, who had previously stated that being identified by AFR caused him distress.
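The matching step described above, measuring facial features and comparing them against a shared custody database, can be sketched as a nearest-neighbour search over feature vectors. The following is a minimal illustration only, not the actual OIFR system: the toy vectors, names, and threshold are hypothetical, and real systems derive embeddings from a trained neural network rather than hand-written lists.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_probe(probe, custody_db, threshold=0.9):
    """Compare a probe embedding against every custody embedding.

    Returns (best_id, best_score) if the best score clears the
    threshold, otherwise (None, threshold) -- i.e. no match.
    """
    best_id, best_score = None, threshold
    for person_id, embedding in custody_db.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id, best_score

# Toy embeddings standing in for measured facial features.
custody_db = {
    "record_A": [0.9, 0.1, 0.0],
    "record_B": [0.1, 0.9, 0.1],
}
probe = [0.88, 0.12, 0.01]  # hypothetical "probe image" features
print(match_probe(probe, custody_db))
```

A real deployment would differ in every practical detail (vector dimensionality, indexing for scale, calibrated thresholds), but the principle is the same: the probe is never matched by name, only by closeness of its biometric feature vector to stored ones.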
Assistant Chief Constable Trudi Meyrick of South Wales Police indicated that the new app enhances the police’s “ability to accurately confirm a person’s identity.” She further stated: “This technology doesn’t replace traditional means of identifying people and our police officers will only be using it in instances where it is both necessary and proportionate to do so, with the aim of keeping that particular individual, or the wider public, safe.” Assistant Chief Constable Nick McLain of Gwent Police described the adoption of this technology as an “integral part of effective policing and public safety.” He elaborated: “The use of this technology always involves human decision-making and oversight, ensuring that it is used lawfully, ethically, and in the public interest.” He also noted: “We have a robust scrutiny process in place to ensure accountability and testing found no evidence of racial, age or gender bias.”
