Criminal Justice Algorithms: Being Race-Neutral Doesn’t Mean Race-Blind

April 3, 2022 | FlaglerLive | 1 Comment

An algorithm is the centerpiece of one criminal justice reform program, but should it be race-blind? (the_burtons/Moment via Getty Images)

By Duncan Purves and Jeremy Davis

Justice is supposed to be “blind.” But is race blindness always the best way to achieve racial equality? An algorithm to predict recidivism among prison populations is underscoring that debate.

The risk-assessment tool is a centerpiece of the First Step Act, which Congress passed in 2018 with significant bipartisan support and which is meant to shorten some criminal sentences and improve conditions in prisons. Among other changes, it rewards federal inmates with early release if they participate in programs designed to reduce their risk of re-offending. Candidates for early release are identified using the Prisoner Assessment Tool Targeting Estimated Risk and Needs, called PATTERN, which estimates an inmate’s risk of committing a crime upon release.

Proponents celebrated the First Step Act as a step toward criminal justice reform that provides a clear path to reducing the prison population of low-risk nonviolent offenders while preserving public safety.

But a review of the PATTERN system published by the Department of Justice in December 2021 found that PATTERN overpredicts recidivism among minority inmates by between 2% and 8% compared with white inmates. Critics fear that PATTERN is reinforcing racial biases that have long plagued the U.S. prison system.

As ethicists who research the use of algorithms in the criminal justice system, we spend lots of time thinking about how to avoid replicating racial bias with new technologies. We seek to understand whether systems like PATTERN can be made racially equitable while continuing to serve the function for which they were designed: to reduce prison populations while maintaining public safety.

Making PATTERN equally accurate for all inmates might require the algorithm to take inmates’ race into account, which can seem counterintuitive. In other words, achieving fair outcomes across racial groups might require focusing more on race, not less: a seeming paradox that plays out in many discussions of fairness and racial justice.

How PATTERN works

The PATTERN algorithm scores individuals according to a range of variables shown to predict recidivism. These factors include criminal history, education level, disciplinary incidents while incarcerated, and whether an inmate has completed any programs aimed at reducing recidivism, among others. The algorithm predicts both general and violent recidivism, and it does not take an inmate’s race into account when producing risk scores.

Based on this score, individuals are deemed high-, medium- or low-risk. Only those in the low-risk category are eligible for early release.
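
To make the mechanics concrete, here is a minimal sketch in Python of how a tool of this kind produces a risk category. It is a toy illustration, not PATTERN itself: the factor names, weights, and cutoffs are invented. Note that race appears nowhere among the inputs.

# Toy PATTERN-style scorer. All factors, weights, and cutoffs are hypothetical.
HYPOTHETICAL_WEIGHTS = {
    "prior_convictions": 4,    # criminal history
    "infractions": 3,          # disciplinary incidents while incarcerated
    "programs_completed": -2,  # credit for recidivism-reduction programming
    "education_level": -3,     # e.g., 1 if the inmate holds a diploma or GED
}

def risk_score(inmate):
    """Sum weighted risk factors into a single score (higher = riskier)."""
    return sum(weight * inmate.get(factor, 0)
               for factor, weight in HYPOTHETICAL_WEIGHTS.items())

def risk_category(score):
    """Bucket the score; only 'low' is eligible for early-release credits."""
    if score >= 30:
        return "high"
    if score >= 15:
        return "medium"
    return "low"

inmate = {"prior_convictions": 2, "infractions": 1, "programs_completed": 3}
print(risk_category(risk_score(inmate)))  # -> "low" (score = 5)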

Then-President Donald Trump listens as Alice Marie Johnson, who was incarcerated for 21 years, speaks at the 2019 Prison Reform Summit and First Step Act Celebration at the White House. (AP Photo/Susan Walsh)

The DOJ’s latest review, which compares PATTERN predictions with actual outcomes of former inmates, shows that the algorithm’s errors tended to disadvantage nonwhite inmates.

In comparison with white inmates, PATTERN overpredicted general recidivism among Black male inmates by between 2% and 3%. According to the DOJ report, this number rose to 6% to 7% for Black women, relative to white women. PATTERN overpredicted recidivism in Hispanic individuals by 2% to 6% in comparison with white inmates, and overpredicted recidivism among Asian men by 7% to 8% in comparison with white inmates.

These disparate results will likely strike many people as unfair, with the potential to reinforce existing racial disparities in the criminal justice system. For example, Black Americans are already incarcerated at almost five times the rate of white Americans.

At the same time that the algorithm overpredicted recidivism for some racial groups, it underpredicted for others.

Native American men’s general recidivism was underpredicted by 12% to 15% in relation to white inmates, with a 2% underprediction for violent recidivism. Violent recidivism was underpredicted by 4% to 5% for Black men and 1% to 2% for Black women.
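
The arithmetic behind findings like these is straightforward: for each racial group, compare the average risk the algorithm predicted with the recidivism actually observed, then express each group’s gap relative to white inmates. Below is a minimal sketch of such an audit, using a hypothetical record layout rather than the DOJ’s actual methodology.

from collections import defaultdict

def prediction_gaps(records, baseline="white"):
    """records: (group, predicted_risk, reoffended) tuples, where
    predicted_risk is a probability and reoffended is 0 or 1."""
    totals = defaultdict(lambda: [0.0, 0.0, 0])  # [predicted, actual, count]
    for group, predicted, actual in records:
        t = totals[group]
        t[0] += predicted
        t[1] += actual
        t[2] += 1
    # A group's gap: mean predicted risk minus observed recidivism rate.
    gaps = {g: p / n - a / n for g, (p, a, n) in totals.items()}
    base = gaps[baseline]
    # Positive values mean over-prediction relative to the baseline group.
    return {g: gap - base for g, gap in gaps.items()}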

Reducing bias by including race

It is tempting to conclude that the Department of Justice should abandon the system altogether. However, computer and data scientists have developed an array of tools over the past decade designed to address concerns about algorithmic unfairness. So it is worth asking whether PATTERN’s inequalities can be remedied.

One option is to apply “debiasing techniques” of the sort described in recent work by criminal justice experts Jennifer Skeem and Christopher Lowenkamp. As computer scientists and legal scholars have observed, the predictive value of a piece of information about a person might vary depending on their other characteristics. For example, suppose that having stable housing tends to reduce the risk that a former inmate will commit another crime, but that the relationship between housing and not re-offending is stronger for white inmates than Black inmates. An algorithm could take this into account for higher accuracy.
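
In code, that adjustment amounts to an interaction term: the weight attached to a predictor is allowed to differ by group. Here is a minimal sketch of the idea, with invented numbers rather than anything drawn from Skeem and Lowenkamp’s work.

# Hypothetical predicted risk before any housing adjustment.
BASE_RISK = 0.40

# Invented effects: stable housing lowers predicted risk for everyone,
# but more strongly for white inmates, as in the example above.
HOUSING_EFFECT = {"white": -0.15, "black": -0.05}
POOLED_EFFECT = -0.10  # race-blind fallback: one averaged effect for all

def adjusted_risk(group, stable_housing, race_sensitive=True):
    """Apply the housing adjustment, group-specific or pooled."""
    if not stable_housing:
        return BASE_RISK
    if race_sensitive:
        return BASE_RISK + HOUSING_EFFECT.get(group, POOLED_EFFECT)
    return BASE_RISK + POOLED_EFFECT

print(round(adjusted_risk("white", True), 2))  # 0.25: housing highly predictive
print(round(adjusted_risk("black", True), 2))  # 0.35: same fact, weaker signal
print(round(adjusted_risk("black", True, race_sensitive=False), 2))  # 0.30

Within these made-up numbers, the race-sensitive version is more accurate for both groups; the pooled, race-blind version splits the difference and misestimates each.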

But taking this difference into account would require designers to include each inmate’s race in the algorithm, which raises legal concerns. Treating individuals differently on the basis of race in legal decision-making risks violating the 14th Amendment to the Constitution, which guarantees equal protection under the law.

Several legal scholars, including Deborah Hellman, have recently argued that this legal concern is overstated. For example, the law permits using racial classifications to describe criminal suspects and to gather demographic data on the census.

Other uses of racial classifications are more problematic. For example, racial profiling and affirmative action programs continue to be contested in court. But Hellman argues that designing algorithms that are sensitive to the way that information’s predictive value varies across racial lines is more akin to using race in suspect descriptions and the census.

In part, this is because race-sensitive algorithms, unlike racial profiling, do not rely on statistical generalizations about the prevalence of a feature, like the rate of re-offending, within a racial group. Rather, she proposes making statistical generalizations about the reliability of the algorithm’s information for members of a racial group and adjusting appropriately.

But there are also several ethical concerns to consider. Incorporating race might constitute unfair treatment. It might fail to treat inmates as individuals, since it relies upon statistical facts about the racial group to which they are assigned. And it might put some inmates in a worse position than others to earn early-release credits, merely because of their race.

Key difference

Despite these concerns, we argue there are good ethical reasons to incorporate race into the algorithm.

First, by incorporating race, the algorithm could be more accurate across all racial groups. This might allow the federal prison system to grant early release to more inmates who pose a low risk of recidivism while keeping high-risk inmates behind bars. That would promote justice without sacrificing public safety – exactly what proponents of criminal justice reform want.

Furthermore, changing the algorithm to include race can improve outcomes for Black inmates without making things worse for white inmates. This is because earning credits toward early release from prison is not a zero-sum game; one person’s eligibility for the early release program does not affect anyone else’s. This is very different from programs like affirmative action in hiring or education. In these cases, positions are limited, so making things better for one group necessarily makes things worse for the other group.

As PATTERN illustrates, racial equality is not necessarily promoted by taking race out of the equation – at least not when all participants stand to benefit.

Duncan Purves is Associate Professor of Philosophy at the University of Florida. Jeremy Davis is a Postdoctoral Associate at the University of Florida.

The Conversation arose out of deep-seated concerns for the fading quality of our public discourse and recognition of the vital role that academic experts could play in the public arena. Information has always been essential to democracy. It’s a societal good, like clean water. But many now find it difficult to put their trust in the media and experts who have spent years researching a topic. Instead, they listen to those who have the loudest voices. Those uninformed views are amplified by social media networks that reward those who spark outrage instead of insight or thoughtful discussion. The Conversation seeks to be part of the solution to this problem, to raise up the voices of true experts and to make their knowledge available to everyone. The Conversation publishes nightly at 9 p.m. on FlaglerLive.

Comments

  1. MikeM says

    April 4, 2022 at 4:30 pm

    Before anybody gets their panties in a knot over this, maybe one should learn what an algorithm is. Definition of algorithm: a procedure for solving a mathematical problem (as of finding the greatest common divisor) in a finite number of steps that frequently involves repetition of an operation; broadly, a step-by-step procedure for solving a problem or accomplishing some end.

    Someone has put out an algorithm for measuring who is likely to reoffend. Maybe there is an algorithm for who will become gay, or transgender too. It is all BS. It is pseudo science psycho babble. College professors trying to measure the human mind with a mathematical equation.
