A survey of predictive policing: how data makes it possible, its benefits and pitfalls, and what it may portend for American law enforcement and race relations.
In an important book that goes to the heart of issues at the forefront of contemporary life, Ferguson (Law/Univ. of the District of Columbia; Why Jury Duty Matters: A Citizen’s Guide to Constitutional Action, 2012) examines how police departments are now using supposedly “objective” data-driven surveillance technologies to work more effectively in a budget-cutting era and to avoid claims of racial bias. In this engaging, well-written narrative, based on studies and a deep understanding of policing, the author describes the growing police use of shared data (the National Crime Information Center database is “reportedly accessed 12 million times a day by authorities”), its effects on how and where police work, and its use in predicting who is likely to commit crimes (much as Amazon uses data to predict repeat shoppers). Some uses of data are surprising, as in Chicago, New Orleans, and other cities, where police maintain “heat lists” of individuals likely to be involved in crimes and then write to and visit the listed suspects, warning them to avoid criminal activity. The data used in predictive policing is prone to bias and error, warns Ferguson, and it includes “black data,” which is opaque, hidden in complex algorithms deemed proprietary by the software vendors who work with police. Acting on erroneous data can lead to “aggressive police presence, surveillance, and perceived harassment” in poor communities of color. In fact, “big data policing reifies many of the systemic inequalities of traditional policing,” writes the author, who is candid in his assessment of the role of implicit bias in law enforcement. He concludes with questions he urges police departments to ask about racial bias, error, and accountability in data-driven policing.
Essential reading for anyone who wants to understand how technology is changing American policing.