Everyday Erinyes #111

 Posted by at 6:49 am  Politics
Feb 17 2018

Experts in autocracies have pointed out that it is, unfortunately, easy to slip into normalizing the tyrant, hence it is important to hang on to outrage. These incidents, which seem to call for the efforts of the Greek Furies (Erinyes) to come and deal with them, will, I hope, help with that, even though there are many more which I can’t include. As a reminder, though no one really knows how many there were supposed to be, the three names we have are Alecto, Megaera, and Tisiphone. These roughly translate as “unceasing,” “grudging,” and “vengeful destruction.”

The issue I’m looking at today is perhaps not as “sexy” as some other hot issues … but in the long run it is no less important.

We had hoped that in the age of technology, we could eradicate bias by putting computer programs in place of our old systems. With algorithm-based systems, we can make faster, less variable predictions about the likelihood of people ending up in the criminal justice system again, or recidivism. But it’s become increasingly apparent that automating the process made things worse, because now we have taken old bias and embedded it by teaching it to computers. We hoped machines could provide the fair treatment humans have failed to give criminal defendants and past offenders—but they haven’t. (Emphasis mine)

Julia Dressel is a senior at Dartmouth, whose undergraduate honors thesis is gaining national attention. With professor Hany Farid, she released a report in the journal Science Advances titled “The accuracy, fairness, and limits of predicting recidivism.”

The two evaluated the risk assessment software COMPAS—Correctional Offender Management Profiling for Alternative Sanctions—and the results were shocking. Human participants, given only seven details about each offender compared to the 137 given to COMPAS, were accurate 67 percent of the time, essentially matching the software.

ProPublica looked at the same software in 2016, and found high levels of racial bias. “Findings included heightened and false predictions on black recidivism, a false reduced risk for white recidivism and black defendants misclassified as having a higher risk for violent offenses.”

“Algorithmic tools sound impressive, so people are quick to assume that a tool’s predictions are inherently superior to human predictions. It’s dangerous if judges assume COMPAS produces accurate predictions, when in reality its accuracy is around 65 percent. Therefore, it’s important to expose when software like COMPAS isn’t performing as we expect. We cannot take any algorithm’s accuracy for granted, especially when the algorithm is being used to make decisions that can have serious consequences in someone’s life,” Dressel continued.

The justice system’s injustices end up becoming self-fulfilling prophecies at worst, and self-perpetuating at best. When blacks are more likely to be suspected, they are more likely to be prosecuted, and more likely to be convicted. Thus, they become more likely to have more crimes on their record than whites with otherwise similar profiles. The two most predictive criteria of recidivism are age and total number of prior convictions. The same bias that led to their prior convictions now leads to new bias in predicting recidivism, and all the original bias is still at work as well.
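This feedback loop can be illustrated with a toy simulation. Every number below is invented for illustration only: two groups with identical true behavior differ solely in how often an offense actually lands on the record, yet one group ends up with roughly twice the prior convictions, which a risk tool would then read as higher risk.

```python
import random

random.seed(0)

# Toy model: two groups with IDENTICAL true offending behavior,
# but group B is policed more heavily, so an offense is more likely
# to end up as a conviction on the record.
TRUE_OFFENSE_RATE = 0.3              # same for both groups (invented)
CATCH_RATE = {"A": 0.2, "B": 0.4}    # biased enforcement (invented)

def simulate_record(group, years=10):
    """Count convictions accumulated over `years` of identical behavior."""
    convictions = 0
    for _ in range(years):
        offended = random.random() < TRUE_OFFENSE_RATE
        caught = random.random() < CATCH_RATE[group]
        if offended and caught:
            convictions += 1
    return convictions

n = 10_000
avg = {g: sum(simulate_record(g) for _ in range(n)) / n for g in ("A", "B")}
print(avg)  # group B accumulates roughly twice the priors of group A
```

Nothing about the groups’ actual behavior differs; only the enforcement rate does, and the record gap follows directly from it.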

“People with criminal records face significant collateral consequences of their conviction, including barriers to voting, employment, housing, and financial public assistance. These barriers complicate the reintegration process after incarceration and likely increase the odds a person will recidivate,” [Kara] Gotsch [of The Sentencing Project] explained…. “The U.S. relies on incarceration and severe punishments to address circumstances that really result from our broken social safety net, like inadequate access to medical and mental health treatment, including drug treatment. Incarceration will not solve these problems, but community investment in services can.”

Certainly a black man with a college degree will get fewer job interviews than a white male high school graduate. But with equivalent education, a black man with no police record will also get fewer interviews than a white man with one felony conviction. I realize, as I expect most people who give the criminal justice system a moment’s thought do, that anyone caught up in the justice system will lose thereby – black, white, Hispanic, Asian, men, women. The fact that certain groups lose more than other groups, if we are willing to look at that, can actually shed light on what everyone loses, and how appropriate those losses really are.

Since the study’s publication, Northpointe, Inc. (now Equivant), which owns the COMPAS software, wrote an “Official Response” on its website that alleges Dressel’s research “actually adds to a growing number of independent studies that have confirmed that COMPAS achieves good predictability and matches the increasingly accepted AUC standard.” However, they make no mention of the racial bias and say they will review the materials from the study for accuracy.

If companies won’t take responsibility for making racial bias permanent, it’s up to us as a community to bring attention to racial disparities. Both the Sentencing Project and Dressel’s research are a step in the right direction.

Alecto, Megaera, and Tisiphone, at least this project will send you to computer labs rather than to associate with Dushes. That may or may not be a plus.

The Furies and I will be back.

Cross posted to Care2 HERE


  8 Responses to “Everyday Erinyes #111”

  1. First off, kudos to Ms. Dressel and her professor, Mr. Farid, for her undergraduate honors thesis on this subject. Unfortunately this is so true, and saddening.

    It starts early. imho. Being retired now, from the school system, this is also prevalent with young men and women (students) who are still in school(s). I knew several that were on this merry-go-round with TJC, with their probation officers, the court(s), and it seemed to me, sadly, that they (the kids), had a very hard time getting off the machine. Thus, revolving around again and again.

    In reading this post, one can see that it progresses to adulthood, with lifelong prison bias, and that “the two most predictive criteria of recidivism are age and total number of prior convictions. The same bias that led to their prior convictions now leads to new bias in predicting recidivism.” Personally, I believe that because all children, (people), have the potential for good, they deserve our best efforts in their behalf.

    I can honestly say that I don’t have the answers to this, but I sure hope that the Furies can start with the computers relating to this subject. I wish them well.

    Thanks, Joanne for a most interesting post.

  2. Good one JD!

    Businesses have often claimed that my problem was the result of their computer making a mistake.  When a computer makes a mistake, it usually dies.  This kind of mistake comes from the idiot that told the computer what to do with the data. 


  3. Computer problems, like those cited, suffer from the old “Garbage in, garbage out,” syndrome!
    “The U.S. relies on incarceration and severe punishments to address circumstances that really result from our broken social safety net, like inadequate access to medical and mental health,” does not answer the issue of our broken society, screw the net!  People of color have been the people most impacted by the failure of Reconstruction.  President Andrew Johnson hated blacks; he allowed them to be free, but left them utterly victimized by their former owners.  The North won the war, but the South won Reconstruction.  “Separate but equal” never was.  “Jim Crow” saw to that, and it is coming back, now.  The majority-run system in the U.S. has always made it more difficult for black families to take good care of their own.  Incarceration has long separated black kids from their fathers, and brought on all the problems that creates, and these kids have been shown that the bigger society does not give a flying f**k for their lives.  And so the cycle continues.  
    Am I preaching to the choir?  I guess so, but the “Family values” conservatives would not read past the 2nd sentence.  
    I have read in “Smithsonian Magazine” (recently) that had A. Johnson been removed from office by one more vote in Congress, the next in line to be president, a Mr. Wade, I believe, would have seen to it that Lincoln’s legacy was enhanced and furthered.  What a different country this might have become!  I’ve got to do some reading on why Johnson was not removed from office.

    • I understood that the Senator who cast the tie-breaking “No” vote detested Johnson, but did not feel, either that the case was proven, or else that the charges were actually impeachable offenses.  I know that’s vague, but I’m going off my memory – and I’m not at home right now.  I’d be very interested in what you find.  I’ve been in exactly that position on a Court Martial (I’ve never been on a civilian jury, but I’ve known the same situation to arise.)  It’s maddening, but in our justice system, the prosecution is required to PROVE its case in order to get a conviction.

  4. I’ve worked both as a scientist and in IT and for both it holds true that you take out what you put in and that neither is bias free. At least conscientious scientists will take all precautions to avoid their own bias from interfering with the outcome by following strict procedures and using the most bias-free tools available, but that won’t keep bias from research questions and hypotheses themselves; but for computer programs no such rules are available. At best these programs work on the basis of scientific data handed to them, but if this data isn’t bias-free the outcome, in this case prediction, isn’t bias-free either.

    There’s nothing scientifically, i.e. statistically, wrong with using two very strong predictors of recidivism, age and the total number of prior convictions, to predict recidivism. (I expect gender to be another one.) In fact these two predictors are so strong that they’re almost all that’s needed to predict with 67% accuracy. Adding 130 other variables, none of them as powerful as the first two, will only lower the accuracy, which is exactly what happens (by 2%). So inherently there’s nothing much wrong with COMPAS; it is the underlying bias in the justice system that poses the problem: racial bias which has black young people, especially men, arrested and convicted far more often and for more minor crimes than their white counterparts, and the great difficulty for any felon – again biased towards black felons, either in prison or after release – to rehabilitate and avoid recidivism. And so COMPAS will give back what it is fed: “heightened and false predictions on black recidivism, a false reduced risk for white recidivism and black defendants misclassified as having a higher risk for violent offenses.”
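    The commenter’s point, that two strong predictors carry nearly all the signal, can be sketched with synthetic data. Everything here is invented for illustration (the coefficients, the feature scaling, the sample sizes); it simply shows a hand-rolled two-feature logistic regression reaching respectable accuracy on data whose outcome depends only on age and prior count.

```python
import math
import random

random.seed(1)

# Synthetic offenders: recidivism is driven ONLY by age and prior count.
# Coefficients are hypothetical, chosen for illustration (not from the study).
def make_person():
    age = random.randint(18, 60)
    priors = random.randint(0, 10)
    logit = -0.05 * (age - 30) + 0.3 * priors - 1.2
    p = 1 / (1 + math.exp(-logit))
    return (age, priors), int(random.random() < p)

data = [make_person() for _ in range(5000)]
train, test = data[:4000], data[4000:]

def features(age, priors):
    return [(age - 30) / 10.0, priors / 5.0]   # crude feature scaling

# Two-feature logistic regression fitted with plain stochastic gradient descent.
w, b, lr = [0.0, 0.0], 0.0, 0.05
for _ in range(20):                             # a few passes over the data
    for (age, priors), y in train:
        x = features(age, priors)
        p = 1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
        g = p - y                               # gradient of the log-loss
        w[0] -= lr * g * x[0]
        w[1] -= lr * g * x[1]
        b -= lr * g

acc = sum(
    ((w[0] * x[0] + w[1] * x[1] + b) > 0) == bool(y)
    for (age, priors), y in test
    for x in [features(age, priors)]
) / len(test)
print(f"two-feature accuracy: {acc:.2f}")
```

    Exactly as the comment argues, the model never sees a “race” feature, yet if recorded priors are themselves inflated by biased enforcement, the bias rides in through that feature anyway.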

    Statistics can’t alter that, it’s not statistics that are biased but the justice system and society in general. I’m afraid there’s very little the Furies can do about that. The makers of COMPAS can’t take the bias out of their algorithms because it isn’t there. Any judge making the same risk-assessment based on just a person’s file will sadly come to the same conclusion as the program does; only a few will do significantly better.

    • If one is aware that data is bad, and specifically bad in one or more particular directions, there are ways to compensate for that.  Convincing – to be frank, Republicans – that the data is bad, and in what directions, is going to be very difficult.  I wonder if producing a competing system with appropriate tweaks but without telling anyone what they are (because, of course, patentable intellectual property is confidential in nature) would be a plan.
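    One very crude version of such a tweak, assuming (hypothetically) that the inflation in one group’s recorded prior convictions could be estimated, would be to deflate that feature before it reaches the risk model. The group labels and the 2.0 factor below are entirely invented for the sketch:

```python
# Hypothetical correction: deflate recorded priors by an estimated
# enforcement factor before scoring. The groups and factors are invented
# illustrations, not measured values.
ENFORCEMENT_FACTOR = {"A": 1.0, "B": 2.0}

def adjusted_priors(recorded_priors: int, group: str) -> float:
    """Deflate recorded priors toward an estimate of underlying offenses."""
    return recorded_priors / ENFORCEMENT_FACTOR[group]

print(adjusted_priors(4, "B"))  # → 2.0
print(adjusted_priors(4, "A"))  # → 4.0
```

    The hard part, of course, is not the arithmetic but defensibly estimating the factor in the first place.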
