Claim: Crime Prediction Software Is Racist


An analysis by The Markup and Gizmodo claims that crime prediction software that promised to be bias-free actually perpetuates bias.

They use Plainfield, NJ, as an example, where the software predicted 1,940 crimes in one neighborhood with no white residents and only 11 in a white area.

Dozens of other neighborhoods in other cities showed the same pattern. The authors say the disparity held across five million predictions.

As a result, more police might be assigned to these areas, The Markup states, though it describes this as targeting.

The software often recommended daily patrols in and around public and subsidized housing, targeting the poorest of the poor.

“Communities with troubled relationships with police—this is not what they need,” said Jay Stanley, a senior policy analyst at the ACLU Speech, Privacy, and Technology Project. “They need resources to fill basic social needs.”

I need to interject here. As a white person, I wouldn’t dare go near the projects in minority neighborhoods. As my Black friend in Bed-Stuy says, you can’t come here or you will be in danger.

“Overall, we found that the fewer White residents who lived in an area—and the more Black and Latino residents who lived there—the more likely PredPol would predict a crime there. The same disparity existed between richer and poorer communities,” the authors write.

PredPol recommends sending police into areas where crime predictions are heaviest. As the authors state in the article, they don’t know whether police actually follow the recommendations.

They “found that locations with lots of predictions tended to have high arrest rates in general, suggesting the software was largely recommending officers patrol areas they already frequented.”

What They Aren’t Considering

The one thing they leave out is that these areas are where the crime has actually taken place, and with no-bail laws and prisoners being released, it is on the rise.

Another factor not considered is that most people living in these areas want more police.

Look at the South Side of Chicago as a perfect example. It is almost 100% Black-on-Black crime. Residents want more police in these areas.

The crime prediction is possibly accurate, and The Markup is mistaking it for racism.

Mistakes can happen and no one wants innocent people arrested, but catching criminals is not racist. About 50 percent of the homicides in this country are committed by Black people. That isn’t racist. It’s a fact. It has been that way and continues to be that way.

We need to start providing the children in these neighborhoods with better education. They need opportunities. To do that, crime has to be lowered. Good teachers usually won’t go to dangerous neighborhoods.

More importantly, with a majority of Black households run by a single parent or a grandparent, Black society is being ravaged. The children need a responsible father. In 2020, there were about 4.25 million Black families in the United States with a single mother. That is an increase from 1990, when there were about 3.4 million.

In 2017, 49% of Hispanic families were also headed by a single parent.

It is racist to not look at these problems honestly and try to do something about the root causes.

Then there is gang violence, not gun violence. Criminals can always get guns.

Insofar as the software is concerned, it’s not telling the police anything they don’t already know.

 

