The COMPAS algorithm, used by the US criminal justice system since 1998, analyzed defendants' data and then, based on that information, helped decide, for example, whether to release a defendant on bail or keep them in custody. When recommending a measure of restraint, the system took into account age, gender, and criminal record. Over 20 years of "service" the algorithm assessed more than a million people, but it was recently judged incompetent and promptly withdrawn from use.
Scientists from Dartmouth College tested how accurate the system is and whether it can be trusted. To do this, they recruited freelancers, ordinary people with no legal training, to make their own predictions based on short profiles that listed each person's sex, age, criminal history, and several other parameters.
Despite having only these brief case files, the freelancers predicted recidivism with almost 70 percent accuracy; the program, which relies on 137 biographical data points, trailed the humans by five percentage points. An analysis of the algorithm's judgments also showed that black defendants were more likely to be flagged as high risk by the program.
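For context, the accuracy figure here is simply the share of predictions that match the actual outcome. A minimal sketch of that calculation (the data below is invented for illustration, not taken from the study):

```python
def accuracy(predicted: list[bool], actual: list[bool]) -> float:
    """Share of cases where the prediction matched the real outcome."""
    matches = sum(p == a for p, a in zip(predicted, actual))
    return matches / len(actual)

# Toy data: True = "predicted/actually reoffended". Invented for illustration.
reoffended  = [True, False, True, True,  False, False, True,  True,  False, True]
human_guess = [True, False, True, False, False, True,  True,  False, False, True]
algo_guess  = [True, True,  True, True,  False, True,  False, True,  True,  True]

print(f"human:     {accuracy(human_guess, reoffended):.0%}")  # 70% on this toy data
print(f"algorithm: {accuracy(algo_guess, reoffended):.0%}")   # 60% on this toy data
```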
"Errors in such cases can be very costly, so it is worth asking whether this algorithm should be used in court decisions at all," says one of the study's authors.
Having studied how the algorithm works, the researchers concluded that its predictions boil down to two factors: the younger the defendant and the more arrests in their record, the higher the predicted likelihood of reoffending. On that basis, AI experts deemed the technology unreliable.
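That finding suggests the system's output can be approximated by a very simple rule. Here is a purely illustrative sketch of such a two-feature rule; the weight and threshold are invented for illustration and are not the actual COMPAS model:

```python
def predicted_high_risk(age: int, prior_arrests: int) -> bool:
    """Toy two-feature rule: younger defendants with more prior arrests
    get flagged as high risk. The coefficient and cutoff are invented
    for illustration, not taken from COMPAS."""
    score = prior_arrests - 0.2 * age  # more arrests and lower age raise the score
    return score > -4.0

# A 22-year-old and a 50-year-old, each with 3 prior arrests:
print(predicted_high_risk(22, 3))  # True:  3 - 4.4  = -1.4 > -4.0
print(predicted_high_risk(50, 3))  # False: 3 - 10.0 = -7.0 <= -4.0
```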