The software product formerly known as PredPol, since rebranded as Geolitica, correctly predicted crimes less than 1% of the time, according to an investigation of its use by the police department in Plainfield, New Jersey.
We examined 23,631 predictions generated by Geolitica between Feb. 25 and Dec. 18, 2018, for the Plainfield Police Department (PD). Each prediction we analyzed from the company’s algorithm indicated that one type of crime was likely to occur in a location not patrolled by Plainfield PD. Fewer than 100 of the predictions lined up with a crime in the predicted category that was later reported to police. In the end, the success rate was less than half a percent.
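The "less than half a percent" figure follows directly from the numbers cited above. A quick back-of-the-envelope check (the article only says "fewer than 100" accurate predictions, so 100 is used here as an upper bound, not an exact count):

```python
# Sanity-check the success rate reported in the investigation.
# Figures from the article: 23,631 predictions, fewer than 100 hits.
total_predictions = 23_631
accurate_predictions_upper_bound = 100  # "fewer than 100", so an upper bound

success_rate = accurate_predictions_upper_bound / total_predictions
print(f"{success_rate:.2%}")  # prints "0.42%", i.e. less than half a percent
```

Even taking the most generous reading of the hit count, the rate stays well under one percent.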
Captain David Guarino of the Plainfield Police Department had hoped the software would make the department’s work more accurate and efficient. But given the low success rate, the department decided to stop using the product. The article further mentions that Geolitica plans to end operations by the end of 2023.
The underlying problem with this predictive software is of course that “Geolitica’s software tended to disproportionately target low-income, Black, and Latino neighborhoods….”
The article does a nice job explaining how data bias slips into these products. The software returns what it is fed, in both quality and quantity:
The Bureau of Justice Statistics reported that less than half of violent and property crimes were reported to police in 2022. Reporting rates are also inconsistent across demographic groups. The agency found in a 2012 report that Black, Latino, and lower-income crime victims were more likely to report crimes than White victims and those from higher-income households.
A major problem with Geolitica’s system, as it was used in Plainfield, is that there were a massive number of predictions compared to a relatively small number of crimes.
Yet again the ongoing datafication of urban life ends up harming those in marginalized positions, just as the cadastre once underpinned many home evictions of the urban poor.
Link to original article on The Markup >>