
They create an AI that predicts crimes a week in advance with 90% accuracy

Researchers at the University of Chicago have created an artificial intelligence capable of predicting crimes after analyzing the city's historical data, New Scientist magazine reported. The system produces predictions one week in advance with 90% accuracy. However, questions have been raised about possible misuse of the system.

“Ishanu Chattopadhyay of the University of Chicago and colleagues created an AI model that analyzed historical crime data for Chicago, Illinois, from 2014 to the end of 2016, and then predicted crime levels for the weeks following this training period,” the London magazine indicated.

LOOK: Petvation, the facial recognition device that identifies your pet to let it enter your home

The results were favorable, even with data from other cities. “The model predicted the likelihood of certain crimes occurring across the city, which was divided into blocks about 300 meters wide, one week in advance and with up to 90% accuracy. It was also trained and tested on data from seven other major US cities, with a similar level of performance,” the magazine added.
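The paragraph above describes the general setup rather than the model's internals. As a rough illustration of what “dividing the city into ~300-meter tiles and predicting one week ahead” could look like in code, here is a minimal Python sketch that uses synthetic data and an off-the-shelf logistic regression. It is not the researchers' model; the grid size of the hypothetical study area, the eight-week lookback window, and the random data are assumptions made purely for demonstration.

```python
# Illustrative sketch only: NOT the University of Chicago model.
# It mimics the setup the article describes: split a city into ~300 m tiles
# and predict, one week ahead, whether each tile will see an incident.
import numpy as np
from sklearn.linear_model import LogisticRegression

RNG = np.random.default_rng(0)
TILE_M = 300                       # tile width in meters (as in the article)
CITY_M = 9_000                     # hypothetical 9 km x 9 km study area (assumption)
N_TILES = (CITY_M // TILE_M) ** 2  # number of grid cells
N_WEEKS = 156                      # roughly three years of weekly counts
LOOKBACK = 8                       # weeks of history used as features (assumption)

# Synthetic weekly incident counts per tile, standing in for real event logs.
weekly_counts = RNG.poisson(
    lam=RNG.uniform(0.1, 2.0, N_TILES), size=(N_WEEKS, N_TILES)
)

# Build (features, label) pairs: the past LOOKBACK weeks of counts in a tile ->
# did at least one incident occur in that tile the following week?
X, y = [], []
for t in range(LOOKBACK, N_WEEKS - 1):
    for tile in range(N_TILES):
        X.append(weekly_counts[t - LOOKBACK:t, tile])
        y.append(int(weekly_counts[t + 1, tile] > 0))
X, y = np.array(X), np.array(y)

# Train on earlier weeks, evaluate on the most recent ones, mirroring the
# "train on history, predict the following weeks" setup described above.
split = int(0.8 * len(X))
model = LogisticRegression(max_iter=1000).fit(X[:split], y[:split])
print(f"one-week-ahead tile accuracy on held-out weeks: {model.score(X[split:], y[split:]):.2f}")
```

Because the data here is random, the printed accuracy is meaningless; the point is only to show the tile-by-week framing of the prediction task.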

Chattopadhyay, for his part, said that this type of system could help law enforcement. “Law enforcement resources are not infinite. So you want to use that optimally. It would be great if you could know where the homicides are going to happen,” he told the magazine.

LOOK: Ready for 2040? Sky Cruise, the nuclear-powered flying hotel that uses an AI to navigate

How could AI perpetuate racial and social prejudice?

The data held by the police can influence what this AI predicts. This is because police records include not only the actual reports, court proceedings and other case files, but also complaints made by citizens, which are often based on prejudice or falsehoods.

The Chicago Police Department, in recent years, tested an algorithm that created a list of people considered most at risk of being involved in a shooting, either as a victim or as a perpetrator, according to the Chicago Tribune. The details of the algorithm and the list were initially kept secret, but when the list was published, it turned out that 56% of African-American men between the ages of 20 and 29 were on it.

For this reason, people’s biases can influence the AI’s predictions. And because those predictions appear to be “facts” delivered by a machine, seemingly free of such prejudice, they may be taken as true, even though they are ultimately based on human reports.

Source: Elcomercio
