Predictive policing

[5][6] The use of algorithms makes predictive policing faster and more effective, since a model can quickly factor many different variables into an automated outcome.
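
As an illustration of this automated weighting, the sketch below combines a few normalized input variables into a single risk score. It is a minimal sketch only: the feature names and weights are hypothetical, not those of any deployed system.

```python
# A minimal sketch of how a predictive-policing score might combine several
# variables into one automated output. Feature names and weights here are
# hypothetical and for illustration only; real systems use proprietary models.

def risk_score(features: dict[str, float], weights: dict[str, float]) -> float:
    """Return a weighted sum of input variables, clipped to [0, 1]."""
    raw = sum(weights[name] * value for name, value in features.items())
    return max(0.0, min(1.0, raw))

# Example: score one map cell from (hypothetical) normalized inputs.
cell = {"recent_burglaries": 0.6, "calls_for_service": 0.4, "prior_hotspot": 1.0}
weights = {"recent_burglaries": 0.5, "calls_for_service": 0.2, "prior_hotspot": 0.3}
print(risk_score(cell, weights))  # 0.68
```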

The city of Chicago uses data blended from population mapping and crime statistics to improve monitoring and identify patterns.
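
One way such blending can work is sketched below: population counts are joined to incident counts per map cell, and a per-capita rate is computed to surface cells whose raw counts alone would understate or overstate risk. The column names and figures are hypothetical, not Chicago's actual pipeline.

```python
# A minimal sketch of blending population and crime data by map cell to
# surface patterns. All column names and figures are hypothetical.
import pandas as pd

population = pd.DataFrame({"cell": ["A", "B", "C"], "residents": [12000, 3000, 8000]})
crimes = pd.DataFrame({"cell": ["A", "B", "C"], "incidents": [48, 30, 16]})

blended = population.merge(crimes, on="cell")
# A per-capita rate highlights cells whose raw counts alone would mislead:
# cell B has fewer incidents than A but far fewer residents.
blended["incidents_per_1k"] = 1000 * blended["incidents"] / blended["residents"]
print(blended.sort_values("incidents_per_1k", ascending=False))
```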

The "AI Ethics of Care" approach recognizes that some locations have greater crime rates as a result of negative environmental conditions.

[10] At the conclusion of intense combat operations in April 2003, Improvised Explosive Devices (IEDs) were dispersed throughout Iraq’s streets.

However, these IEDs were spread over such extensive areas that it was impractical for Iraqi forces to respond to every American presence in the region.

This challenge gave rise to the concept of Actionable Hot Spots: zones with high levels of activity that were nonetheless too vast to control effectively.

This situation made it difficult for the Iraqi military to select optimal locations for surveillance, sniper placements, and route patrols along routes covered by IEDs.
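
The hot-spot computation itself is simple to illustrate. The sketch below bins incident coordinates into a grid and flags cells whose counts exceed a threshold, which is the basic calculation behind hot-spot maps; it is not the military's actual tooling, and the coordinates and threshold are invented.

```python
# A minimal sketch of identifying "hot spots" by binning incident coordinates
# into a grid and flagging cells whose counts exceed a threshold.
# The incident coordinates and the threshold are hypothetical.
from collections import Counter

incidents = [(3.2, 7.9), (3.7, 7.1), (3.4, 7.5), (9.0, 1.2), (3.9, 7.8), (6.5, 4.4)]
CELL_SIZE = 1.0   # grid resolution, in the same units as the coordinates
THRESHOLD = 3     # minimum incidents for a cell to count as a hot spot

counts = Counter(
    (int(x // CELL_SIZE), int(y // CELL_SIZE)) for x, y in incidents
)
hot_spots = {cell: n for cell, n in counts.items() if n >= THRESHOLD}
print(hot_spots)  # {(3, 7): 4}
```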

Since the mid-1990s, PGIS (Police Geographic Information System) has been deployed across the Chinese public security sector to empower law enforcement by promoting police collaboration and resource sharing.

[28] Additionally, the Uttar Pradesh Police utilize the AI-powered mobile application 'Trinetra' for facial recognition and criminal tracking.

[33] Cities throughout the United States are enacting legislation to restrict the use of predictive policing technologies and other “invasive” intelligence-gathering techniques within their jurisdictions.

After introducing predictive policing as a crime-reduction strategy, directing patrols according to forecasts from an algorithm in the PredPol software, the city of Santa Cruz, California saw burglaries decline by almost 20% in the program's first six months.

Facial recognition technology has been criticized for its reduced accuracy on darker skin tones, which can contribute to cases of mistaken identity and, potentially, wrongful convictions.
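
This criticism rests on disaggregated evaluation: measuring error rates separately for each group rather than in aggregate. The sketch below computes a false match rate per skin-tone group from hypothetical trial records; the data are invented for illustration, not drawn from any real benchmark.

```python
# A minimal sketch of disaggregated evaluation: computing a false match rate
# separately per group. The trial records below are hypothetical.
from collections import defaultdict

# (group, predicted_match, actually_same_person) for each comparison trial
trials = [
    ("lighter", True, False), ("lighter", False, False), ("lighter", False, False),
    ("lighter", True, True),   # genuine pair: a match here is correct, not false
    ("darker", True, False), ("darker", True, False), ("darker", False, False),
]

impostor = defaultdict(lambda: [0, 0])  # group -> [false matches, impostor trials]
for group, predicted, same_person in trials:
    if not same_person:                  # impostor pair: any match is false
        impostor[group][1] += 1
        impostor[group][0] += predicted  # bool counts as 0 or 1
for group, (false_matches, total) in impostor.items():
    print(group, "false match rate:", false_matches / total)
```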

[35] In 2019, Michael Oliver of Detroit, Michigan, was wrongfully accused of larceny after DataWorks Plus software registered his face as a “match” to the suspect in a video taken by the victim of the alleged crime.

For example, the inclusion of arrest or incarceration history, neighborhood of residence, level of education, membership in gangs or organized crime groups, and 911 call records, among other features, can produce algorithms whose outputs encourage the over-policing of minority or low-income communities.
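
The mechanism can be made concrete with a small simulation, sketched below with invented numbers: two neighborhoods with identical underlying offense rates yield very different arrest counts when one is patrolled more heavily, so a model trained on arrest records alone will score it as higher-risk and direct still more patrols there.

```python
# A minimal sketch of how a feature correlated with past enforcement, rather
# than underlying offending, can skew an algorithm's output. The two
# hypothetical neighborhoods have the same true offense rate, but one is
# patrolled more heavily, so more of its offenses become recorded arrests.
import random

random.seed(0)
TRUE_OFFENSE_RATE = 0.05                      # identical in both neighborhoods
DETECTION = {"north": 0.2, "south": 0.6}      # south is more heavily patrolled

arrests = {hood: 0 for hood in DETECTION}
for hood, p_detect in DETECTION.items():
    for _ in range(10_000):                   # residents per neighborhood
        offended = random.random() < TRUE_OFFENSE_RATE
        if offended and random.random() < p_detect:
            arrests[hood] += 1

# A model trained on these arrest counts "learns" that south is riskier,
# which directs still more patrols there: a feedback loop.
print(arrests)  # roughly {'north': ~100, 'south': ~300}
```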