{"id":5453,"date":"2018-02-02T17:24:54","date_gmt":"2018-02-02T21:24:54","guid":{"rendered":"http:\/\/ncjolt.org\/?p=5453"},"modified":"2020-06-04T20:52:34","modified_gmt":"2020-06-04T20:52:34","slug":"can-ai-predict-crime-well-humans-can","status":"publish","type":"post","link":"https:\/\/journals.law.unc.edu\/ncjolt\/blogs\/can-ai-predict-crime-well-humans-can\/","title":{"rendered":"Can AI Predict Crime As Well As Humans Can?"},"content":{"rendered":"<p>In a world with increasing reliance on technology, it is unsurprising that computer algorithms are now being used to predict crime. Many have seen the movie <a href=\"http:\/\/www.imdb.com\/title\/tt0181689\/\">Minority Report<\/a>, starring Tom Cruise, in which a futuristic society has abolished murder by harnessing individuals\u2019 psychic powers to predict killings before they occur. With the progression of <a href=\"https:\/\/futureoflife.org\/background\/benefits-risks-of-artificial-intelligence\/\">artificial intelligence<\/a>, we appear to be edging ever closer to the world of Minority Report. Recent research, however, suggests that the predictions this technology yields are far less reliable than commonly assumed. A <a href=\"http:\/\/advances.sciencemag.org\/content\/4\/1\/eaao5580\">study<\/a> published on January 17 put that reliability to the test. The researchers examined algorithms typically used to predict <a href=\"https:\/\/www.nij.gov\/topics\/corrections\/recidivism\/Pages\/welcome.aspx\">recidivism<\/a>, the likelihood that a convicted offender will offend again. Notably, these algorithms are already <a href=\"https:\/\/www.naturalnews.com\/2018-01-29-minority-report-ai-courts-predict-criminal-repeat-offenders-guessing.html\">used<\/a> in parole and judicial proceedings to help actors in the justice system determine whether an offender is likely to commit another crime. 
The results of the study were troubling.<\/p>\n<blockquote><p>The researchers <a href=\"https:\/\/www.nytimes.com\/2018\/01\/19\/us\/computer-software-human-decisions.html\">found<\/a> that randomly selected groups of lay people could predict whether a criminal would recidivate about two-thirds of the time, essentially the same accuracy as the <a href=\"http:\/\/www.harriscountylawlibrary.org\/ex-libris-juris\/2017\/6\/13\/court-compass-mapping-the-future-of-user-access-through-technology\">technology<\/a> that courts use to make the same determination.<\/p><\/blockquote>\n<p>This finding has potentially enormous consequences for offenders: the technology is no more accurate than untrained human beings, yet judges lend it great credence. <a href=\"http:\/\/www.thedartmouth.com\/article\/...\/crime-predicting-algorithm-versus-human-research\">Julia Dressel<\/a>, who conducted the study for her undergraduate thesis at Dartmouth College alongside <a href=\"http:\/\/www.cs.dartmouth.edu\/farid\/\">Hany Farid<\/a>, a computer science professor, <a href=\"https:\/\/www.nytimes.com\/2018\/01\/19\/us\/computer-software-human-decisions.html\">cautions<\/a> that \u201can algorithm\u2019s accuracy can\u2019t be taken for granted, and [the courts] need to test these tools to ensure that they are performing as we expect them to.\u201d The two also found that similar accuracy can be achieved with just two pieces of data: the defendant\u2019s number of past convictions and the defendant\u2019s age at the time of sentencing. That result is striking when juxtaposed against <a href=\"https:\/\/doc.wi.gov\/Pages\/AboutDOC\/COMPAS.aspx\">Compas<\/a>, or Correctional Offender Management Profiling for Alternative Sanctions, which uses six more nuanced variables to determine recidivism risk. In fact, Eric L. 
Loomis, a Wisconsin resident, was <a href=\"https:\/\/www.nytimes.com\/2016\/06\/23\/us\/backlash-in-wisconsin-against-using-data-to-foretell-defendants-futures.html\">deemed<\/a> by a judge to be a \u201chigh risk\u201d to the community at large and sentenced to six years in prison for eluding the police. The judge had relied on Compas in reaching that decision. The Supreme Court of the United States <a href=\"http:\/\/www.scotusblog.com\/case-files\/cases\/loomis-v-wisconsin\/\">denied<\/a> Loomis\u2019s petition for a writ of certiorari in 2017. Even so, it is notable that Loomis based his appeal on equal protection claims, asserting that male and female defendants are treated differently under the Compas algorithm. The organization <a href=\"https:\/\/www.propublica.org\/\">ProPublica<\/a>, which is dedicated to investigative journalism and exposing systemic inequality, <a href=\"https:\/\/www.propublica.org\/article\/machine-bias-risk-assessments-in-criminal-sentencing\">found<\/a> that the Compas software treated black defendants inequitably, which raises additional Equal Protection concerns. <a href=\"http:\/\/www.equivant.com\/\">Equivant<\/a>, the company that created Compas, <a href=\"http:\/\/www.equivant.com\/blog\/official-response-to-science-advances\">disputes<\/a> Dressel and Farid\u2019s findings, arguing that both ProPublica\u2019s study and theirs were inaccurate due to small sample sizes.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In a world with increasing reliance on technology, it is unsurprising that computer algorithms are now being used to predict crime. 
Many have seen the movie Minority Report, starring Tom Cruise, in which a futuristic society has abolished murder by harnessing individuals\u2019 psychic powers to predict killings before they <a href=\"https:\/\/journals.law.unc.edu\/ncjolt\/blogs\/can-ai-predict-crime-well-humans-can\/\" class=\"more-link\">&#8230;<\/a><\/p>\n","protected":false},"author":1,"featured_media":5454,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[51],"tags":[],"_links":{"self":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts\/5453"}],"collection":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/comments?post=5453"}],"version-history":[{"count":1,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts\/5453\/revisions"}],"predecessor-version":[{"id":6993,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/posts\/5453\/revisions\/6993"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/media\/5454"}],"wp:attachment":[{"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/media?parent=5453"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/categories?post=5453"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/journals.law.unc.edu\/ncjolt\/wp-json\/wp\/v2\/tags?post=5453"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}