Ars Technica: Is “Big Data” racist? Why policing by data isn’t necessarily objective

“Algorithmic technologies that aid law enforcement in targeting crime must compete with a host of very human questions. What data goes into the computer model? After all, the inputs determine the outputs. How much data must go into the model? The choice of sample size can alter the outcome. How do you account for cultural differences? Sometimes algorithms try to smooth out the anomalies in the data—anomalies that can correspond with minority populations. How do you address the complexity in the data or the ‘noise’ that results from imperfect results? The choices made to create an algorithm can radically impact the model’s usefulness or reliability. To examine the problem of algorithmic design, imagine that police in Cincinnati, Ohio, have a problem with the Bloods gang—a national criminal gang, originating out of Los Angeles, that signifies membership by wearing the color red.”
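The article’s point that “the inputs determine the outputs” can be made concrete with a toy simulation (not from the article; the neighborhoods, rates, and detection probabilities below are entirely hypothetical). Two areas have the same underlying incident rate, but one is patrolled more heavily, so more of its incidents are recorded. A naive hotspot model that ranks areas by recorded counts then flags the heavily patrolled area as the “high-crime” one:

```python
import random

random.seed(0)

TRUE_RATE = 0.10  # identical true incident rate per day in both areas
# Hypothetical detection probabilities: area "B" is patrolled twice as
# often as "A", so its incidents are recorded twice as frequently.
DETECTION = {"A": 0.3, "B": 0.6}

def simulate_recorded_incidents(days=1000):
    """Count recorded (not actual) incidents per area over `days` days."""
    counts = {"A": 0, "B": 0}
    for _ in range(days):
        for area in counts:
            occurred = random.random() < TRUE_RATE
            recorded = random.random() < DETECTION[area]
            if occurred and recorded:
                counts[area] += 1
    return counts

counts = simulate_recorded_incidents()
# A naive "hotspot" model ranks areas by recorded counts, so "B" looks
# like a hotspot even though the true rates are identical.
hotspot = max(counts, key=counts.get)
```

The disparity here comes entirely from the data-collection process, not from the crime itself, which is exactly the objectivity problem the article raises.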