On discriminatory algorithmic policing

The Intercept has a lengthy feature on the various ways in which data-driven algorithmic policing reinforces racism in police work, especially within the LAPD, which already has a mixed track record… The article is based on the rich empirical insights of a new book, “Predict and Surveil: Data, Discretion, and the Future of Policing” (reviewed here and here) by sociologist Sarah Brayne.

Brayne was given unique access to everyday police routines and was able to observe up close the intricacies of what officers’ actual work with data and algorithms looks like in practice. Brayne says this is “a manifestation of the militarization of policing” in the US.

The main point of criticism, again, is the self-fulfilling prophecy of an overpolicing feedback loop:

Such a system means, of course, that individuals in overpoliced neighborhoods can easily get caught up in a vicious cycle where they are, as Brayne writes, “more likely to be stopped, thus increasing their point value, justifying their increased surveillance, and making it more likely that they will be stopped again in the future.”

The Intercept article gives a quick overview of some of the technologies in use, and of the companies, politics, and business incentives behind them (with the usual suspects like Palantir and PredPol).

An interesting further observation is that police officers themselves were ‘squeamish’ about the very same surveillance technologies being turned on them. One would perhaps expect the golden rule of “do unto others…” to be more firmly entrenched in the working ethos of public servants.

Original article on The Intercept >>
