Crime-predicting AI “fired”

The COMPAS algorithm, which has worked for the American justice system since 1998, analyzed data on defendants and then, based on that information, helped decide, for example, whether to release an offender on bail or keep them in detention. In choosing a measure of restraint, the system took into account age, gender, and position in a criminal career. Over 20 years of “service” the algorithm assessed more than a million people, but it has recently been judged incompetent and was promptly “dismissed.”

Scientists from Dartmouth College set out to check how accurate the system is and whether it can be trusted. To do so, they recruited freelancers, ordinary people without legal training, to make their own predictions from short profiles, giving the subjects information about each defendant’s sex, age, criminal record, and several other parameters.

Working from these small dossiers, the freelancers achieved a forecast accuracy of almost 70 percent, while the program, relying on 137 items of biographical data, lagged five percent behind the humans. An analysis of the algorithm’s judgments also showed that black inmates fell under the program’s suspicion more often.

“Mistakes in such cases can be very costly, so it is worth considering whether this algorithm should be used in delivering judicial verdicts,” says one of the study’s authors.

Having studied how the algorithm operates, the researchers concluded that it essentially reduces to a simple rule: the younger the defendant and the more arrests he has, the higher the predicted likelihood of reoffending. As a result, experts in the field of AI have declared the technology unreliable.
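The kind of two-feature rule the researchers describe can be illustrated with a minimal sketch. Everything below is hypothetical: the function name, the cutoff values, and the scoring are illustrative assumptions, not the actual COMPAS model or anything taken from the study.

```python
# Illustrative sketch only: a toy risk rule in the spirit of the researchers'
# finding that youth and a longer arrest record drive the prediction.
# The cutoffs (25 years, 2 arrests) are made-up values, not from COMPAS.

def predict_reoffense(age: int, prior_arrests: int) -> bool:
    """Return True if this toy rule flags the defendant as high risk."""
    risk_factors = 0
    if age < 25:            # younger defendants score as riskier
        risk_factors += 1
    if prior_arrests > 2:   # more arrests score as riskier
        risk_factors += 1
    return risk_factors >= 1  # any single factor triggers the flag

# Hypothetical examples:
print(predict_reoffense(age=20, prior_arrests=0))  # young: flagged
print(predict_reoffense(age=40, prior_arrests=5))  # many arrests: flagged
print(predict_reoffense(age=40, prior_arrests=0))  # neither: not flagged
```

A rule this crude ignores everything else in a 137-item questionnaire, which is the study’s point: if a model behaves like a two-variable heuristic, its extra complexity adds little accuracy while still producing costly errors.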

Vyacheslav Larionov

