Violent crime prediction algorithms are racially biased?

A computer algorithm that calculates the likelihood of a criminal reoffending has been accused of being racially biased against black defendants, even though race is never entered into the system.

… race is not an explicit factor in its assessment algorithm. However, some of the factors that do inform the scores can be closely tied to race, like the defendant’s education level, employment status and social circumstances such as family criminal history or whether or not their friends take illegal drugs. And the specific calculations necessary to arrive at the final score are proprietary — meaning defendants and the general public have no way to see what might be influencing a harsh sentence.
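To see how that can happen, here's a toy sketch of the dynamic the excerpt describes. This is not COMPAS's actual model (those calculations are proprietary), and every weight, feature, and group difference below is a made-up assumption purely for illustration: a score that never sees race can still end up correlated with race whenever its inputs are.

```python
# Toy illustration only -- NOT the real (proprietary) COMPAS model.
# Shows how a race-blind score can still track race via correlated inputs.

import random

random.seed(0)

def risk_score(education_years, employed, family_record):
    """Hypothetical linear score over race-blind inputs (weights invented)."""
    score = 5.0
    score -= 0.3 * education_years          # less schooling -> higher score
    score -= 1.5 * (1 if employed else 0)   # unemployment -> higher score
    score += 2.0 * (1 if family_record else 0)  # family criminal history
    return max(score, 0.0)

def sample_person(group):
    # Assumed for the demo: group B averages fewer years of schooling
    # and lower employment, so the "race-blind" inputs carry group info.
    if group == "A":
        edu = random.gauss(14, 2)
        employed = random.random() < 0.8
    else:
        edu = random.gauss(11, 2)
        employed = random.random() < 0.6
    family_record = random.random() < 0.2
    return risk_score(edu, employed, family_record)

for group in ("A", "B"):
    scores = [sample_person(group) for _ in range(10_000)]
    print(group, round(sum(scores) / len(scores), 2))
# Group B averages a higher "risk" even though race never enters risk_score().
```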

Isn’t it racist to say that these social traits determine your race? 

While algorithms like these might be well-intentioned, the system’s opacity is already seen as a problem. In Chicago, for example, police have had surprising accuracy using an algorithm to predict who will commit or be the target of gun violence, but the ACLU finds it troubling that community members can be singled out as criminals without any insight into what landed them on the CPD’s list.

Are we headed towards a “Minority Report” by machine?

via Engadget

Right-Mind