Eldorado Posted September 16, 2019 #1

"Police officers have raised concerns about using "biased" artificial-intelligence tools, a report commissioned by one of the UK government's advisory bodies reveals.

"The study warns such software may "amplify" prejudices, meaning some groups could become more likely to be stopped in the street and searched."

Full report at the BBC: https://www.bbc.co.uk/news/technology-49717378

The UK Gov Study: "The Royal United Services Institute (RUSI) has published research - commissioned by CDEI - into the use of algorithms in policing, and the potential for bias."

Full monty: https://www.gov.uk/government/publications/report-commissioned-by-cdei-calls-for-measures-to-address-bias-in-police-use-of-data-analytics
+and-then Posted September 16, 2019 #2

Garbage in - garbage out was a quaint concept when computers were just doing economic analyses. Now that we seem intent on using their wisdom to decide the fate of human beings and their liberty, it's not so quaint any longer. It's scary as hell.
Piney Posted September 16, 2019 #3

8 minutes ago, Eldorado said: "Police officers have raised concerns about using "biased" artificial-intelligence tools, a report commissioned by one of the UK government's advisory bodies reveals."

It contains "skin tone color swatches". Anything one shade above my tone or below is a criminal.
Dark_Grey Posted September 16, 2019 #4

9 minutes ago, Eldorado said: "Police officers have raised concerns about using "biased" artificial-intelligence tools, a report commissioned by one of the UK government's advisory bodies reveals."

One demographic commits more crimes than the other? It must be the data that's wrong! If you were expecting a pre-determined answer, scrap the AI and go back to filling out reports by hand. The program only spits out results based on the data fed to it. If you want to increase the efficiency of law enforcement, focus on the problem areas instead of polishing your PC image.
aztek Posted September 16, 2019 #5

Or maybe the real situation is the other way around: it's the cops who are biased, and they are afraid the AI will take away their ability to act on that bias. It also isn't good for political correctness; the one who screams "robbery" the loudest is the robber himself.
Eldorado Posted September 16, 2019 Author #6

2 minutes ago, Dark_Grey said: "The program only spits out results based on the data fed to it."

Algorithmic bias: https://en.wikipedia.org/wiki/Algorithmic_bias
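For what it's worth, the "amplify" claim in the report is about feedback loops, and the mechanism is easy to sketch: if next year's patrols go wherever this year's patrols recorded crime, an initial bias in the data reproduces itself indefinitely, even when the underlying crime levels are identical. Here is a toy simulation of that loop (every number is made up for illustration; nothing below comes from the RUSI report):

```python
# Toy feedback-loop sketch: two areas with IDENTICAL true crime levels,
# but a historically biased 60/40 patrol split. Crimes only get recorded
# where patrols are sent, and next year's patrols follow recorded crime.
TRUE_CRIMES_PER_AREA = 10          # same in both areas (hypothetical number)
patrols = {"A": 6, "B": 4}         # initial bias: 6 patrols to A, 4 to B
history = []

for year in range(5):
    history.append(dict(patrols))
    # each patrol records at most one crime: you only find what you look for
    recorded = {area: min(patrols[area], TRUE_CRIMES_PER_AREA)
                for area in patrols}
    total = recorded["A"] + recorded["B"]
    # "predictive" reallocation: send next year's 10 patrols in proportion
    # to where crime was recorded this year
    patrols["A"] = round(10 * recorded["A"] / total)
    patrols["B"] = 10 - patrols["A"]

print(history)  # the 60/40 split reproduces itself every single year
```

The data never "corrects" the bias, because area B's unrecorded crime is invisible to the model; and any extra mechanism that treats a higher recorded count as higher risk can push the split further from 50/50 rather than hold it steady. Garbage in, garbage out, as +and-then says above.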
aztek Posted September 16, 2019 #7

Japanese vehicles have a special feature in their domestic-market nav systems: they can tell you when you are entering a high-crime area. That feature was not allowed to be brought to the USA because it was deemed racist. Guess which areas of US cities would be flagged as high-crime; the answer is in the reason they banned it.
XenoFish Posted September 16, 2019 #8

Not even RoboCop can survive PC culture.
'Walt' E. Kurtz Posted September 16, 2019 #9

53 minutes ago, Piney said: "It contains "skin tone color swatches". Anything one shade above my tone or below is a criminal."

One of my friends always got checked by customs based only on his looks. And as a Swede, I'm so pale they would call in some kind of paranormal research team.