MIT Technology Review: Inspecting Algorithms for Bias

“ProPublica, a Pulitzer Prize–winning nonprofit news organization, had analyzed risk assessment software known as COMPAS. It is being used to forecast which criminals are most likely to reoffend. Guided by such forecasts, judges in courtrooms throughout the United States make decisions about the future of defendants and convicts, determining everything from bail amounts to sentences. When ProPublica compared COMPAS’s risk assessments for more than 10,000 people arrested in one Florida county with how often those people actually went on to reoffend, it discovered that the algorithm ‘correctly predicted recidivism for black and white defendants at roughly the same rate.’ But when the algorithm was wrong, it was wrong in different ways for blacks and whites.”
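The distinction ProPublica drew is between overall accuracy and the distribution of error types across groups. A minimal sketch, using purely hypothetical confusion-matrix counts (not ProPublica’s data), shows how two groups can be scored correctly at the same rate while their false positive and false negative rates diverge:

```python
# Hypothetical illustration only: two groups with identical overall
# accuracy but opposite error profiles. Counts are invented for the
# example and do not come from the COMPAS data.

# Per-group confusion-matrix counts: (TP, FN, FP, TN), where a
# "positive" prediction means the tool flagged the person as likely
# to reoffend, and the label is whether they actually did.
groups = {
    "group_a": (40, 10, 20, 30),
    "group_b": (30, 20, 10, 40),
}

for name, (tp, fn, fp, tn) in groups.items():
    total = tp + fn + fp + tn
    accuracy = (tp + tn) / total   # how often the prediction was right
    fpr = fp / (fp + tn)           # flagged high risk but did not reoffend
    fnr = fn / (fn + tp)           # flagged low risk but did reoffend
    print(f"{name}: accuracy={accuracy:.2f}  FPR={fpr:.2f}  FNR={fnr:.2f}")

# Both groups come out at accuracy=0.70, yet group_a's false positive
# rate (0.40) is double group_b's (0.20), and the false negative rates
# are reversed: right at the same rate, wrong in different ways.
```

In ProPublica’s actual analysis, the asymmetry ran in a particular direction: black defendants were more likely to be falsely flagged as future reoffenders, while white defendants were more likely to be labeled low risk and yet go on to reoffend.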