The California Department of Corrections and Rehabilitation has blocked requests for public records from researchers who want to examine whether racial and ethnic bias exists in parole-suitability decisions, according to a lawsuit filed by the Electronic Frontier Foundation.
“We want to create a machine learning tool that can extract factors from parole hearing transcripts, describe the current decision-making process, and identify which decisions appear inconsistent with that process and might be worthy of reconsideration,” said Catalin Voss, a PhD student at Stanford University. “We need race data to do that.”
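The researchers' actual tool is not public, but the first step Voss describes — pulling factors out of hearing transcripts — can be sketched in miniature. The factor names and keywords below are hypothetical placeholders, not the Stanford team's feature set:

```python
# A minimal sketch of transcript factor extraction, assuming a
# simple keyword approach. The factor list is hypothetical; the
# researchers' actual features and methods are not public.
FACTOR_KEYWORDS = {
    "remorse": ["remorse", "insight"],
    "programming": ["vocational", "self-help", "education"],
    "risk_score": ["risk assessment", "compas"],
    "release_plans": ["parole plans", "release plans", "housing"],
}

def extract_factors(transcript: str) -> dict:
    """Return which hypothetical factors a hearing transcript mentions."""
    text = transcript.lower()
    return {factor: any(kw in text for kw in kws)
            for factor, kws in FACTOR_KEYWORDS.items()}

sample = ("The panel notes your completion of vocational training, "
          "but finds you lack insight into the commitment offense.")
print(extract_factors(sample))
```

A real system would need far more than keyword matching — which is precisely why the researchers say they need the underlying race data to test whether extracted factors, rather than race, actually drive outcomes.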
CDCR officials have denied the public records requests since 2018, claiming an exemption under the California Public Records Act.
Kristen Bell, an assistant professor of law at the University of Oregon, is no stranger to exposing illegitimate factors that have influenced parole-suitability decisions in the past, particularly in cases of juveniles serving lengthy sentences.
“There has been much debate about evidence-based criminal justice reform in California, but how can we know if we’re moving any closer to justice when the prison system is preventing independent researchers from accessing race data?” said Bell.
Michael Semanchik, former managing attorney of the California Innocence Project, is using an AI assistant program to identify patterns of inconsistencies in documents to help overturn sentences.
“We are spending a lot of our resources and time trying to figure out which cases deserve investigation,” Semanchik told the American Bar Association Journal. “If AI can just tell me which ones to focus on, we can focus on the investigation and litigation of getting people out of prison.”
In 2023, a Vera Institute of Justice report examined 168 hearing transcripts and found that most people denied parole had a low Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) score.
The COMPAS score comes from a risk-assessment algorithm that predicts the likelihood that an individual will commit violence or re-offend, based on factors such as education level, parole plans, age, and criminal history.
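COMPAS itself is proprietary, so its actual model and coefficients are not public. The general shape of such a tool — weighted factors mapped to a 1-to-10 risk decile — can be illustrated with invented weights that are assumptions for demonstration only:

```python
import math

# Illustrative weights only -- COMPAS is proprietary, and these
# numbers are invented for the sketch, not taken from any real tool.
WEIGHTS = {"age": -0.04, "prior_offenses": 0.3,
           "education_years": -0.1, "has_parole_plan": -0.5}
BIAS = 1.0

def risk_probability(features: dict) -> float:
    """Logistic model: weighted sum of factors mapped to a 0-1 probability."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def risk_decile(p: float) -> int:
    """COMPAS-style scores are reported as deciles, 1 (low) to 10 (high)."""
    return min(10, int(p * 10) + 1)

# A 45-year-old with one prior, a high school education, and a parole plan.
person = {"age": 45, "prior_offenses": 1,
          "education_years": 12, "has_parole_plan": 1}
p = risk_probability(person)
print(risk_decile(p))  # a low decile under these invented weights
```

The Vera finding is striking in this light: even people whose scores land in the low deciles of such a tool were, in most of the examined transcripts, still denied parole.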
Residents of San Quentin shared their thoughts on data collection, on machine-learning algorithms determining their freedom, and on the impact artificial intelligence could have on a suitability hearing.
Resident George Camarena was wary of the idea of an algorithm controlling his freedom. He said he does not know enough about AI to feel comfortable placing his freedom in its hands.
“I would say no to AI because I don’t trust it yet. At the end of the day, the person going [through the parole-suitability process] should be able to opt in or out of using AI because it’s their life,” said Camarena.
Resident Charles S., who has a release date, talked about his friend’s parole denial. The parole board denied his friend’s release but asked that he complete a specific program, which he did. When he returned for his suitability hearing 18 months later, there was a new panel of commissioners with a new set of requirements.
“People shouldn’t have to see a different parole board each time and they all want different things,” said Charles S.
Resident Maxx Robinson said he would be reluctant to use an algorithm that just gathers documents, because AI is unable to pick up on the emotional cues that other humans normally recognize.
“Remorse is relative. It has to be seen and felt by a human being. A machine can’t determine if a person does or does not have remorse because remorse is measured in many ways,” Robinson said.
A Journal of Quantitative Criminology report noted that parole boards have nearly unlimited discretionary power after a person is sentenced. Researchers have found that detecting racial discrimination in the parole decision-making process is a challenge because race is never cited as an individual factor in a person’s parole denial.
The report noted that some individuals denied parole had complete release plans, positive records of in-person education while incarcerated, and vocational programming, but were denied based on their original commitment offense.
“Racial disparities in rates of prison release are not, in and of themselves, an indication of racial discrimination, as there may be factors that can appropriately influence the release decision that also correlate with race,” said the report.
In 2016, New York passed a law requiring that whenever a parole denial departs from the COMPAS risk score, the New York State Parole Board must follow the denial with a written explanation of why the person was denied, according to the New York Codes, Rules and Regulations of 2020. In 2021, New York lawmakers proposed state Senate Bill S1415A. The bill would require the board to release “any incarcerated person appearing before the board who is eligible for release on parole, unless the parole case record demonstrates there is a current and unreasonable risk the person will violate the law if released and such risk cannot be mitigated by parole supervision.”
Saira Hussain, a staff attorney for the Electronic Frontier Foundation, said CDCR’s response is a familiar argument.
“The California Department of Corrections and Rehabilitation is making the same arguments with us that have already lost in court,” said Hussain. “Our clients want to use machine learning to identify patterns of discrimination – something you’d think prison officials might want to learn more about.”
Staff attorney Cara Gagliano said government officials should not have the final say on who gets access to certain information.
“Our clients simply want CDCR to follow the law and provide the records they need to do their work,” Gagliano said.