Implementing responsible and ethical use of crime prediction technologies

A manual for mitigating harm

Since 2010, the world has seen a sharp rise in the use of crime prediction technologies.

Although these tools have the potential to improve public safety, they are often deployed without sufficient training and oversight, adherence to procedural safeguards, or compliance with standard operating procedures. Nor have they been adequately accompanied by data sharing and governance protocols or by cybersecurity measures. Further, the scant attention paid to the social and ethical implications of rolling out these technological innovations has raised questions about their legitimacy and undermined public trust in their efficacy.

Indeed, civil rights advocates across Europe and the United States have expressed concern that crime prediction tools may not only infringe privacy rights but also reinforce negative biases about who commits crime and where it takes place, a failing that could serve to legitimize discriminatory policing practices. Some policing agencies have responded by imposing outright bans on crime prediction technologies, while others have dismissed the caveats and deployed these tools with minimal transparency and accountability, disregarding threats to human rights altogether.

This manual aims to offer an alternative for those interested in exploring the use of crime prediction technologies in a socially responsible and ethical manner. It advocates an approach that leverages the potential of emerging technologies to advance public safety while also implementing measures to identify potential risks and mitigate social harms. Finally, given the risks associated with algorithmically enabled tools, particularly in the domain of public safety and security, the manual supports their use only when a high degree of caution has been exercised prior to deployment.

Read the publication

Access the visualization tool
