
The Criminal Justice System Is Relying On Tech To Do Its Job And That's Just Going To Make Everything Worse


The criminal justice system appears to be outsourcing a great deal of its work. On the law enforcement side, automatic license plate readers, facial recognition tech, and predictive policing have replaced beat cops walking the streets and patrolling the roads. Over on the judicial side, analytic software is helping make sentencing decisions. This is supposed to make the system better by removing bias and freeing up government personnel for the more difficult duties algorithms can't perform.

As is the case with most things government, it works better in theory than in practice. ALPRs create massive databases of people's movements, accessible by hundreds of law enforcement agencies subject to almost zero oversight. More is known about facial recognition's failures than its successes, due to inherent limitations that churn out false positives at an alarming rate. Predictive policing is the algorithmic generation of self-fulfilling prophecies, building on historical crime data to suggest future crimes will occur in high-crime areas.
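To see why that last point holds, here's a minimal sketch in Python. The district names, incident counts, and rates are all invented for illustration: a naive hotspot model that allocates patrols wherever incidents were previously recorded keeps "confirming" its own predictions, even though every district is given the same underlying rate.

# Minimal sketch (illustrative only): a naive hotspot predictor whose own
# output decides where officers look, so the "prediction" feeds itself.
# District names and all numbers are hypothetical.
import random

random.seed(0)

# Historical recorded incidents per district -- not actual crime rates,
# only what past patrols happened to observe.
recorded = {"north": 120, "south": 80, "east": 40, "west": 40}
TRUE_RATE = {"north": 0.5, "south": 0.5, "east": 0.5, "west": 0.5}  # identical everywhere

for month in range(12):
    total = sum(recorded.values())
    # "Prediction": allocate 100 patrols in proportion to past records.
    patrols = {d: round(100 * n / total) for d, n in recorded.items()}
    # Patrols only observe incidents where they are actually sent.
    for d, p in patrols.items():
        observed = sum(random.random() < TRUE_RATE[d] for _ in range(p))
        recorded[d] += observed

# Districts that started with more records end with far more, even though
# the underlying rates were set to be identical.
print(recorded)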

While the judicial side might seem more promising, since it could prevent judges from acting on their biases when handing down sentences, the software can only offer guidance, and that guidance can easily be ignored. On top of that, the software introduces its own biases, based on the data it's fed.

The logic for using such algorithmic tools is that if you can accurately predict criminal behavior, you can allocate resources accordingly, whether for rehabilitation or for prison sentences. In theory, it also reduces any bias influencing the process, because judges are making decisions on the basis of data-driven recommendations and not their gut.

You may have already spotted the problem. Modern-day risk assessment tools are often driven by algorithms trained on historical crime data.

As we've covered before, machine-learning algorithms use statistics to find patterns in data. So if you feed one historical crime data, it will pick out the patterns associated with crime. But those patterns are statistical correlations, which are nowhere near the same thing as causation. If an algorithm found, for example, that low income was correlated with high recidivism, it would leave you none the wiser about whether low income actually caused crime. But this is precisely what risk assessment tools do: they turn correlative insights into causal scoring mechanisms.
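Here's a rough illustration of that point, using entirely synthetic data, a hypothetical low-income feature, and scikit-learn's off-the-shelf logistic regression rather than any vendor's actual risk model: when the historical labels reflect who got caught instead of who reoffended, the classifier scores the more heavily policed group as higher risk even though the true reoffense rates are identical.

# Minimal sketch (synthetic data, hypothetical feature): a model trained on
# historical records learns that low income *correlates* with recorded
# recidivism -- here, only because detection was heavier in that group --
# and then scores new defendants as if the correlation were causal.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

n = 5000
low_income = rng.integers(0, 2, n)          # 1 = low income (hypothetical feature)
reoffended = rng.random(n) < 0.2            # same true reoffense rate for everyone
# Historical labels capture *detected* reoffense; detection is twice as likely
# in the more heavily policed (low-income) group.
detected = reoffended & (rng.random(n) < np.where(low_income == 1, 0.9, 0.45))

model = LogisticRegression().fit(low_income.reshape(-1, 1), detected.astype(int))

# Identical true risk, very different "risk scores":
print(model.predict_proba([[1], [0]])[:, 1])  # low-income defendant scores roughly twice as high

The model isn't malicious; it is faithfully reproducing a skew in how the data was collected.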

Correlation is not causation. Past performance is not indicative of future results. And an individual being sentenced is not the average of 20 years of historical crime data. The software may steer judges away from personal biases, but it creates new ones to replace them. It's a lateral "improvement" that does little more than swap the inputs.

Once you've got a system brought up to speed on garbage data, the biases multiply and perpetuate themselves. Sentencing decisions based on biased data generate more bad data for the sentencing software, which then leads to successively harsher sentences for the same criminal act with each iteration. As the recursive data rolls in, the sentencing recommendations will justify themselves, because who can argue with raw data?
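A toy simulation, with sentence lengths and judicial behavior invented purely for illustration, shows how quickly that loop can drift: retrain a recommendation on its own outcomes, and each cycle comes back harsher than the last.

# Minimal sketch (all numbers hypothetical): a recommendation retrained on its
# own outcomes drifts harsher with every cycle, because each round's sentences
# become the next round's "historical data".
def retrain(history):
    """Recommend the average of past sentences for this offense."""
    return sum(history) / len(history)

history = [24, 24, 30, 30, 36]            # months handed down for the same offense
for generation in range(5):
    recommendation = retrain(history)
    # Judges mostly follow the tool, plus an occasional upward departure;
    # those decisions flow straight back into the training data.
    handed_down = [recommendation] * 8 + [recommendation * 1.5] * 2
    history.extend(handed_down)
    print(f"cycle {generation}: recommended {recommendation:.1f} months")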

This is not to say tech should not be used by the criminal justice system. It's that it needs to be subject to rigorous oversight, and those deploying it need to be made aware of its limitations, including its innate tendency to reinforce biases rather than remove them. This isn't something to be taken lightly. The lives and liberties of Americans are literally at stake. Taking a hands-off approach to tech deployment is highly irresponsible, and it indicates those in power care very little about what happens to the people they serve.

