
LAPD Algorithms Target Minorities


Dark_Grey


The LAPD Has a New Surveillance Formula, Powered by Palantir

Futurism.com
Injusticetoday.com (source)

Quote

The Los Angeles Police Department was recently forced to release documents about their predictive policing and surveillance algorithms, thanks to a lawsuit from the Stop LAPD Spying Coalition (which turned the documents over to In Justice Today). And what do you think the documents have to say?

If you guessed “evidence that policing algorithms, which require officers to keep a checklist of (and keep an eye on) 12 people deemed most likely to commit a crime, are continuing to propagate a vicious cycle of disproportionately high arrests of black Angelenos, as well as other racial minorities,” you guessed correctly.

Algorithms, no matter how sophisticated, are only as good as the information that’s provided to them. So when you feed an AI data from a city where there’s a problem of demonstrably, mathematically racist over-policing of neighborhoods with concentrations of people of color, and then have it tell you who the police should be monitoring, the result will only be as great as the process. And the process? Not so great!
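
To make that concrete, here is a minimal sketch of the garbage-in, garbage-out problem the article is describing. Every number below is invented for illustration; this is not LAPD data or the LAPD's model.

```python
# "Garbage in, garbage out" in miniature. All figures are made up.

# True underlying offense rate per 1,000 residents -- identical everywhere.
true_rate = {"Area A": 20, "Area B": 20, "Area C": 20}

# Patrol intensity skews what actually gets *recorded*.
patrols = {"Area A": 3.0,   # heavily policed
           "Area B": 1.0,
           "Area C": 0.5}   # lightly policed

# The "training data" is recorded incidents, not true incidents.
recorded = {area: true_rate[area] * patrols[area] for area in true_rate}

# A naive predictor just ranks areas by their recorded history.
hotspot = max(recorded, key=recorded.get)

print(recorded)  # {'Area A': 60.0, 'Area B': 20.0, 'Area C': 10.0}
print(hotspot)   # Area A -- the most-policed area, not the most criminal one
```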

Quote

These surveillance reports identify “probable offenders” in select neighborhoods, based on an LAPD point-based predictive policing formula. Analysts find information for their reports using Palantir software, which culls data from police records, including field interview cards and arrest reports, according to an updated LAPD checklist formula, which uses broader criteria than the past risk formula the department was known to have used. These reports, known as Chronic Offender Bulletins, predate Palantir’s involvement with the LAPD, but since the LAPD began using the company’s data-mining software in September 2011, the department claims that bulletins that would have taken an hour to compile now take “about five minutes.”
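
The quote describes the formula only as point-based and built from police records such as arrest reports and field interview cards. A hypothetical sketch of what a checklist score like that could look like follows; the specific criteria and weights are assumptions for illustration, not the LAPD's actual formula.

```python
# Hypothetical checklist-style "chronic offender" score. The criteria and
# weights below are invented for illustration, not the LAPD's formula.

def chronic_offender_score(record):
    score = 0
    if record.get("violent_crime_arrest"):
        score += 5
    if record.get("handgun_arrest"):
        score += 5
    if record.get("parole_or_probation"):
        score += 5
    # One point per documented police contact (field interview card).
    score += record.get("field_interview_cards", 0)
    return score

# More stops produce more cards, which raise the score, which invites more
# monitoring -- the self-reinforcing loop the article describes.
person = {"parole_or_probation": True, "field_interview_cards": 7}
print(chronic_offender_score(person))  # 12
```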

Guilty until proven innocent? Like the article says, the algorithm is only as good as the data being fed to it. I don't see how an algorithm could be used to justify any questionable police behavior, however.


 
Just now, Dark_Grey said:

Guilty until proven innocent? Like the article says, the algorithm is only as good as the data being fed to it. I don't see how an algorithm could be used to justify any questionable police behavior, however.

Exactly. Also, let's not paint minorities with a broad brush; it's about the black minority, not Asian, not Indian, or any other. And by the statistics, blacks commit more than half of violent crimes while being only 13% of the population.


4 hours ago, aztek said:

Exactly. Also, let's not paint minorities with a broad brush; it's about the black minority, not Asian, not Indian, or any other. And by the statistics, blacks commit more than half of violent crimes while being only 13% of the population.

To which, I'm sure, we shall be told the facts are racist.


52 minutes ago, Sir Wearer of Hats said:

To which, I'm sure, we shall be told the facts are racist.

They're also sexist. Shame on you! ;)


I think the article is saying that the patterns of actually racist cops are being input into the machine, and that makes the machine spit out a map of a black neighborhood because that's where crime is most likely to occur next.

Racist cops would affect the outcome of the algorithm, but the real crime numbers by area are enough to get the same results without the racism element. In other words, the crime numbers are what they are regardless of what the computer thinks.
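
The article's counterpoint is that the recorded numbers partly reflect where police are sent to look. A toy simulation of that "vicious cycle" feedback loop, with invented probabilities and a single seeded arrest:

```python
import random

random.seed(1)

# Feedback loop in miniature: two areas with the SAME true offense
# probability. The patrol goes wherever past arrests were recorded, and an
# offense only becomes an arrest record if a patrol is there to see it.
true_offense_prob = {"Area A": 0.3, "Area B": 0.3}
arrests = {"Area A": 1, "Area B": 0}  # a single early arrest seeds the bias

for week in range(52):
    patrolled = max(arrests, key=arrests.get)  # send the patrol to the "hotspot"
    if random.random() < true_offense_prob[patrolled]:
        arrests[patrolled] += 1  # only the patrolled area generates records

print(arrests)  # e.g. {'Area A': 16, 'Area B': 0}: identical crime, lopsided data
```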


2 hours ago, Dark_Grey said:

I think the article is saying that the patterns of actually racist cops are being input into the machine, and that makes the machine spit out a map of a black neighborhood because that's where crime is most likely to occur next.

Racist cops would affect the outcome of the algorithm, but the real crime numbers by area are enough to get the same results without the racism element. In other words, the crime numbers are what they are regardless of what the computer thinks.

Or, in other words: if you can't argue with the evidence, argue with how it was collected. What a state the world's in.

