The rapid uptake of facial recognition technology by law enforcement agencies, used to help resolve crimes, speed up investigations and bring offenders to justice, has raised serious governance challenges.
To address these challenges, the World Economic Forum, in partnership with the International Criminal Police Organization (INTERPOL), the Centre for Artificial Intelligence and Robotics of the United Nations Interregional Crime and Justice Research Institute (UNICRI), and the Netherlands police have published a white paper, Responsible Limits on Facial Recognition, Use Case: Law enforcement investigation.
The initiative represents one of the most comprehensive policy responses to date to the risks associated with facial recognition technology (FRT) and aims to ensure its responsible use by law enforcement agencies. The Netherlands police will pilot the paper’s framework.
“Around the world, law enforcement agencies are rapidly adopting facial recognition technology as part of their investigations, but this comes with risks for citizens,” said Kay Firth-Butterfield, Head of Artificial Intelligence and Machine Learning, World Economic Forum. “This is the first global multistakeholder effort to mitigate these risks effectively.”
Remote biometric technologies – particularly FRT – have gained significant traction in the law enforcement sector, and FRT accuracy has improved markedly. However, implementation without due consideration of the ramifications can result in major abuses of human rights and harm to citizens, particularly those in underserved communities.
These concerns have led policy-makers to explore various options, from banning FRT for law enforcement agencies to introducing additional accountability mechanisms that limit the risk of abuses of fundamental freedoms and wrongful arrests. However, the prevention of untargeted surveillance, the assessment of the performance of authorized solutions, procurement processes for law enforcement agencies and the training of professional forensic examiners remain largely overlooked. The new framework is primarily designed to address these gaps.
The Forum partnered with key law enforcement players to identify the risks and build appropriate governance processes. It also held workshops with civil society organizations to review the various drafts and incorporated their recommendations. In practice, the framework comprises a common set of proposed principles for the use of FRT by law enforcement agencies, including provisions on the protection of fundamental human rights, along with a self-assessment questionnaire intended to help agencies comply with these principles.
As a central partner in this endeavour, the Netherlands police will begin testing the assessment questionnaire in early 2022. “Building and maintaining trust with citizens is fundamental to accomplishing our mission and we are well aware of the various concerns related to facial recognition. In this regard, being the first law enforcement agency to test the self-assessment questionnaire is a means to reaffirm our commitment to the responsible use of facial recognition for the benefit of our community,” said Marjolein Smit-Arnold Bik, Head of Special Police Operations, the Netherlands. “We also encourage other law enforcement agencies in various countries to participate in the testing phase and contribute to this global effort.”
“We have co-designed this framework to serve as a unique reference for law enforcement in our 194 member countries on the responsible and transparent use of facial recognition,” said Cyril Gout, Director of Operational Support and Analysis, INTERPOL. “We will support its implementation through our global police network to increase awareness of this important biometric technology. Almost 1,500 terrorists, criminals, fugitives, persons of interest or missing persons have been identified since the launch of INTERPOL’s facial recognition system in 2016.”
“Ensuring the human rights compliant use of FRT in a way that is strictly necessary and proportionate to meet legitimate policing aims is immensely important,” said Irakli Beridze, Head, UNICRI Centre for Artificial Intelligence and Robotics. “We are pleased to contribute to this valuable initiative to develop a robust governance framework for the use of facial recognition in the context of criminal investigations and believe that it will also be an important source for our broader joint work with INTERPOL on the responsible use of artificial intelligence by law enforcement.”
Alem Tedeneke, Public Engagement, World Economic Forum, +1 646 204 9191, email@example.com
Notes to editors
- Read more about the Responsible Limits on Facial Recognition Project and the Artificial Intelligence and Machine Learning Platform
- View Forum photos
- Read the Forum Agenda, also in French | Spanish | Mandarin | Japanese
- Check out the Forum’s Strategic Intelligence Platform and Transformation Maps
- Follow the Forum on Twitter via @wef @davos | Instagram | LinkedIn | TikTok | Weibo | Podcasts
- Become a fan of the Forum on Facebook
- Watch Forum videos
- Learn about the Forum’s impact
- Subscribe to Forum news releases and Podcasts