UK Police Use AI System to Stop Crime Before It Happens
Manchester, England, November 29, 2018 – It sounds like something out of Minority Report, but police in the UK want to use artificial intelligence to prevent violent crimes before they occur.
According to a report by New Scientist, the system is called the National Data Analytics Solution (NDAS) and works by using a “combination of AI and statistics” to assess the risk of someone becoming a victim of gun or knife crime, as well as of falling victim to modern slavery.
However, the measures taken once the system has identified an individual are “a matter of discussion”, says Iain Donnelly, the lead on the project. The intention, apparently, is not to make arrests but to provide support from health or social workers in the area. Examples given include offering counselling to individuals with a history of mental health issues, or having social services contact potential victims.
The team has gathered more than a terabyte of data from local and national police databases, including stop-and-search records and logs of crimes committed. The software then crunched the numbers, finding 1,400 indicators that could help predict crime, 30 of which were “particularly powerful”. These included the number of crimes someone had committed with the help of others, and the number of crimes committed by people “in that individual’s social group”.
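The report doesn’t say how NDAS actually models the data, but whittling roughly 1,400 candidate indicators down to a handful of “particularly powerful” ones is a standard feature-importance exercise. The Python sketch below is purely hypothetical – the synthetic data, the random-forest model, and the feature counts are our assumptions for illustration, not details from the project.

```python
# Hypothetical sketch: ranking predictive indicators by importance.
# Nothing here reflects the actual NDAS pipeline; the data are
# synthetic stand-ins for the ~1,400 indicators described above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

n_records, n_indicators = 5_000, 1_400
X = rng.normal(size=(n_records, n_indicators))

# Make 30 indicators genuinely predictive so the model has something
# to find, mirroring the "30 particularly powerful" figure.
informative = rng.choice(n_indicators, size=30, replace=False)
signal = X[:, informative].sum(axis=1)
y = (signal + rng.normal(scale=2.0, size=n_records)) > 0  # binary outcome

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

# Rank indicators by how much each contributes to the model's splits,
# then keep the 30 strongest.
top_30 = np.argsort(model.feature_importances_)[::-1][:30]
print("Most predictive indicator indices:", sorted(top_30))
```

In a real system the hard part is not the ranking itself but validating that the “powerful” indicators are causal rather than artefacts of how the data were collected, which is exactly the concern raised below.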
West Midlands Police is leading the project, with London’s Metropolitan Police, Greater Manchester Police, and six other forces also involved. The intention is for the system eventually to be used by every force in the UK.
If this sounds a little dystopian, your concerns are shared by the Alan Turing Institute. According to New Scientist, a report from the institute said there were “serious ethical issues” about whether such technology serves the public good. While the project’s intentions are good overall, the report says, it fails to fully address important issues such as inaccurate predictions.
As we’ve reported before, artificial intelligence can easily reinforce the biases in society rather than act objectively. Amazon’s recent AI-based recruiting tool had to be scrapped because it penalised women – a result of being trained on data gathered from a society still struggling with sexism in the workplace.
In 2016, nearly all of the 44 winners of an AI-judged beauty contest were white, because the algorithm was trained mostly on photos of white people; similarly, in 2015, Google’s Photos app mistakenly tagged two black people as gorillas.
Meanwhile, police funding in the UK has been cut significantly in recent years, and economically disadvantaged areas have suffered the largest cuts to social care. According to Andrew Ferguson at the University of the District of Columbia, arrests correlate with where police are stationed, which disproportionately affects residents of poor neighbourhoods, so it is easy to see how the relationship between crime, economically challenged neighbourhoods, and a lack of social services could lead the system to produce inaccurate results.
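To make Ferguson’s feedback-loop point concrete, here is a toy simulation – our own illustration, not anything from the report. Two neighbourhoods have identical true crime rates, but patrols are allocated in proportion to past arrests, and arrests can only happen where patrols are sent.

```python
# Toy simulation of a predictive-policing feedback loop.
# Hypothetical illustration only: both neighbourhoods offend at the
# same rate, but patrol allocation follows arrest history.
import random

random.seed(42)

true_crime_rate = [0.05, 0.05]   # identical underlying offending
arrests = [10, 5]                # neighbourhood A starts with more records
patrols_per_day = 100

for day in range(365):
    total = sum(arrests)
    shares = [arrests[h] / total for h in (0, 1)]
    for hood in (0, 1):
        # Allocate patrols in proportion to recorded arrests so far.
        patrols = round(patrols_per_day * shares[hood])
        # Arrests occur only where patrols are present.
        for _ in range(patrols):
            if random.random() < true_crime_rate[hood]:
                arrests[hood] += 1

print("Recorded arrests after one year:", arrests)
# Despite equal true crime rates, the initially over-policed
# neighbourhood ends the year with roughly twice the recorded arrests,
# making it look like the higher-crime area in the data.
```

A model trained on data like this would “learn” that neighbourhood A is more dangerous, even though the only real difference was where the patrols started out.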
The police aim to produce a prototype by the end of March next year.
PC Magazine