Powered by machine learning, CrimeRadar is made up of three components:



- Back-end forecasting algorithm, which forecasts when and where a type of event is more likely to occur.
- Web-based dashboard, which allows users to understand how crime concentrates and where their patrolling officers are located across the territory.
- Social impact strategy, to mitigate bias.
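The source does not describe the forecasting algorithm itself. As a purely illustrative sketch (all names and the grid-cell representation are assumptions), a crude stand-in for a spatio-temporal model is to rank grid cells of the territory by historical incident counts per time window:

```python
from collections import Counter

def forecast_hotspots(incidents, top_k=3):
    """Rank grid cells by historical incident counts per weekday.

    incidents: list of (cell_id, weekday) tuples, e.g. ("grid_042", "fri").
    Returns, for each weekday, the top_k cells by frequency -- a crude
    stand-in for a real spatio-temporal forecasting model.
    """
    counts = {}
    for cell, day in incidents:
        counts.setdefault(day, Counter())[cell] += 1
    return {day: [cell for cell, _ in c.most_common(top_k)]
            for day, c in counts.items()}

# Toy incident history (hypothetical cell IDs).
history = [
    ("grid_042", "fri"), ("grid_042", "fri"), ("grid_007", "fri"),
    ("grid_042", "sat"), ("grid_113", "sat"), ("grid_113", "sat"),
]
print(forecast_hotspots(history, top_k=1))
# {'fri': ['grid_042'], 'sat': ['grid_113']}
```

A production system would replace the frequency count with a trained model, but the input/output shape (place-time cells in, ranked risk out) matches the component described above.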



The algorithm is also tested for fairness to avoid feedback loops caused by biased data.
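The source does not specify how fairness is tested. One common check (a demographic-parity gap, shown here as a hypothetical sketch with assumed names) compares the rate at which areas associated with different groups are flagged as high-risk; a large gap signals that predictions concentrate on one group's areas:

```python
def selection_rate_gap(predictions, groups):
    """Largest gap in flag rates between demographic groups.

    predictions: 1/0 "high-risk" flags per area; groups: group label
    per area. A simple demographic-parity check: a large gap suggests
    the model concentrates predictions on one group's areas.
    """
    rates = {}
    for flag, group in zip(predictions, groups):
        hits, n = rates.get(group, (0, 0))
        rates[group] = (hits + flag, n + 1)
    shares = [hits / n for hits, n in rates.values()]
    return max(shares) - min(shares)

# Toy data: group A's areas flagged 75% of the time, group B's never.
flags = [1, 1, 0, 1, 0, 0, 0, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(selection_rate_gap(flags, groups))  # 0.75
```

In an audit against feedback loops, such a gap would be tracked over retraining cycles, since flagged areas receive more patrols and therefore generate more recorded incidents.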

Research indicates that violent crimes and crimes against property concentrate in specific locations (hot spots) and predictable periods. About half of crimes in most cities around the world occur at just 5% of addresses.
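The hot-spot claim above can be checked on any incident dataset: sort addresses by volume and count how many are needed to cover half of all incidents. A minimal sketch with toy data (the function name and numbers are illustrative, not from the source):

```python
def share_of_addresses_for_half(crime_counts):
    """Fraction of addresses that account for 50% of incidents.

    crime_counts: incidents per address. Sort addresses by volume and
    count how many are needed to reach half of all incidents.
    """
    total = sum(crime_counts)
    running, needed = 0, 0
    for count in sorted(crime_counts, reverse=True):
        running += count
        needed += 1
        if running * 2 >= total:
            break
    return needed / len(crime_counts)

# Toy data: 20 addresses, one hot spot with most of the incidents.
counts = [60, 30, 10] + [1] * 17
print(share_of_addresses_for_half(counts))  # 0.05
```

Here a single address out of twenty (5%) accounts for over half the incidents, mirroring the concentration pattern the research describes.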


The first step in preventing crime is understanding the territory. Data-driven crime mapping and prediction systems can help police departments reduce fatalities by up to 10%, lower crime incidents by as much as 40%, and dramatically reduce emergency response times.

CrimeRadar allows users to enhance spatial awareness and optimize operational efficiency: adequate resource allocation improves cost-efficiency, and emergency response times drop because patrol officers are already positioned near the areas where need is most likely to arise.

CrimeRadar will be made available to other police departments upon request. To use the tool, departments must comply with minimum transparency and reporting standards. The Institute will support departments in meeting these standards.

The requirements are laid out below, along with the Primary Accountable Institution (PAI). These requirements follow the recommendations from the FAT/ML work group:


> Responsibility and Recourse – Make available externally visible avenues of redress for adverse individual or societal effects of an algorithmic prediction system, and designate an internal role for the person who is responsible for the timely remedy of such issues. (PAI: Police Department)


> Explainability – Ensure that algorithmic predictions as well as any data driving those predictions can be explained to end-users and other stakeholders in non-technical terms. (PAI: Police Department)


> Accuracy – Identify, log, and articulate sources of error and uncertainty throughout the data sources so that expected and worst case implications can be understood and inform mitigation procedures. (PAI: Police Department)


> Fairness – Ensure that algorithmic predictions do not create discriminatory or unjust impacts when comparing across different demographics. (PAI: Police Department)


> Auditability – Enable interested third parties to probe, understand, and review the behavior of the algorithm through disclosure of information that enables monitoring, checking, or criticism, including through provision of detailed documentation, technically suitable APIs, and permissive terms of use. (PAI: Igarapé Institute)


Public version

Prior to the 2016 Rio Olympics, the Igarapé Institute developed rio.CrimeRadar, a digital platform that used machine learning to predict crime rates in different neighborhoods at different times. The platform was soft-launched to the public during the Rio Olympics, in August of that year, and focused on the city’s metropolitan area. CrimeRadar drew on over five years of crime data collected by the Rio de Janeiro state police to determine relative crime risks for the upcoming week. The app was conceived and developed by the Igarapé Institute in association with Via Science and Mosaico.





Competition organized by the Civil Police of the Federal Capital District

Competition organized by the Inter-American Development Bank (IADB)


