This US mayor joked cops should “mount .50-caliber” guns where AI predicts crime

Lancaster mayor R. Rex Parris once referred to an African-American man running for city council as a “gang candidate.”
Image: AP Photo/Reed Saxon, File

A controversial California mayor wants local police to “hurry up” with the expansion of a citywide surveillance system before anyone finds out and forces it to stop.

During a March 26 city council meeting in Lancaster, a desert community of 160,000 in Los Angeles County, city official Patti Garibay discussed an IBM Watson “dashboard” that police have been using for the past six to eight months to focus enforcement efforts.

Garibay, Lancaster’s energy manager, called the system a “hybrid policing model” and said the city is working with IBM to “automate and to use machine learning” so that the “machine will tell us, ‘This is where you need to be.’”

“With machine learning, with automation, there’s a 99% success, so that robot is—will be—99% accurate in telling us what is going to happen next, which is really interesting,” Garibay told the mayor and other local officials, citing test results from “the city of Idaho.”

“Well, why aren’t we mounting .50-calibers [out there]?” asked mayor R. Rex Parris, referring to a powerful rifle used by military snipers, quickly adding, with a broad smile, that he was “being facetious.”

Predictive policing is part of a widening trend among cities in the US and elsewhere looking to embrace technology. Companies like IBM and PredPol promise reduced crime rates and better allocation of resources when police use their proprietary algorithms. Researchers and activists warn that predictive policing can lead to abuses, including the targeting of minority populations. Critics say the use of opaque algorithms can launder biases that officials might be striving to overcome in their police forces.

The desert crime scene

Lancaster is looking to use this technology for a broad range of policing activities, including community policing and crime prevention, as well as addressing homelessness and panhandling, according to procurement documents reviewed by Quartz.

The city has been described by Vice as “an area best known for neo-Nazis and meth labs.” A nexus of gang violence and hate crimes, Lancaster experiences 83% more crimes per square mile than the rest of California. Violent crimes per capita are nearly double the national average.

Garibay went on to explain that the city is exploring additional surveillance technologies, including facial recognition software and drones.

“You guys are incredible,” Parris, a personal-injury lawyer by trade, exulted. “I mean, think about this. There’s no other city in the world doing what you guys are doing. And nobody knows it. Wow. You’d better hurry up and get it done or they’re gonna make us not do it.”

The “99% success” rate “was not based on historical accuracy…but rather on IBM technology in general,” Tess Epling from Lancaster’s administration department tells Quartz, adding that IBM has reportedly achieved accuracy rates of “more than 90% within models they’ve built including traffic, weather, and more.”

IBM makes no public reference to its predictive policing software’s accuracy, and IBM does not have any known public contracts in the state of Idaho. When contacted, IBM told Quartz it would try to find information on any 99% figure.

“From my perspective the whole thing is an example of a ‘smart’ city being pitched by people who really aren’t thinking through the consequences or even know how the stuff works,” says Dave Maass, an investigative researcher at the Electronic Frontier Foundation, who alerted Quartz to the details of the city council meeting.

Hard to foresee where predictive policing goes

Sean Goodison, a senior research criminologist with the Police Executive Research Forum (PERF), a Washington, DC nonprofit that works with US law-enforcement agencies, tells Quartz that there is “no standard way to measure” the effectiveness of predictive policing algorithms such as IBM’s.

“Often, predictive policing software compares itself to ‘random’ policing to claim superior prediction skill,” Goodison says. “While objectively true, the comparison reflects a straw-man fallacy in that no police department randomly assigns all its resources or expects to see crime distributed randomly within a jurisdiction.”
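To see why that baseline is so easy to clear, consider the toy simulation below, a hypothetical Python sketch written for illustration and not drawn from IBM’s, PredPol’s, or any vendor’s actual software. It shows that a rule as crude as sending patrols to the blocks with the most historical crime covers far more incidents than patrols placed at random, which is exactly why “better than random” says little about genuine predictive skill.

```python
# Toy simulation (illustrative only, not any vendor's method): crime in a
# 10x10 grid of city blocks is concentrated in a handful of hotspots. We
# compare two ways of choosing 10 blocks to patrol each day: uniformly at
# random, and simply picking the blocks with the most "historical" crime.
import random

random.seed(0)

N_BLOCKS = 100              # 10x10 grid, flattened into block IDs 0..99
HOTSPOTS = list(range(10))  # 10% of blocks generate most of the crime
PATROLS = 10                # blocks patrolled per day
DAYS = 365

def draw_crimes(n=50):
    """One simulated day's crimes: 80% land in hotspot blocks, 20% elsewhere."""
    return [random.choice(HOTSPOTS) if random.random() < 0.8
            else random.randrange(N_BLOCKS)
            for _ in range(n)]

# Build a year of "historical" data and rank blocks by past crime counts.
counts = {b: 0 for b in range(N_BLOCKS)}
for _ in range(DAYS):
    for c in draw_crimes():
        counts[c] += 1
top_blocks = set(sorted(counts, key=counts.get, reverse=True)[:PATROLS])

# Evaluate both patrol strategies on a fresh year of simulated crime.
random_hits = hotspot_hits = total = 0
for _ in range(DAYS):
    crimes = draw_crimes()
    total += len(crimes)
    random_patrol = set(random.sample(range(N_BLOCKS), PATROLS))
    random_hits += sum(c in random_patrol for c in crimes)
    hotspot_hits += sum(c in top_blocks for c in crimes)

print(f"random patrols covered  {random_hits / total:.0%} of crimes")
print(f"hotspot patrols covered {hotspot_hits / total:.0%} of crimes")
```

In this setup the random patrols cover roughly 10% of incidents while the naive hotspot rule covers around 80%, so outperforming the random baseline requires no sophisticated prediction at all.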

In 2015, the city of Miami, Florida received a grant to start work with predictive policing software called HunchLab, the Miami Herald reported. The project didn’t take off, according to Rob Guerette, an associate professor of criminology at Florida International University (FIU), who was involved with the assessment of the software.

“The original hope was that they could use the newest, slickest thing out on the market,” Guerette tells Quartz. “There was so much ambiguity around the ultimate usefulness of it that they decided instead to change direction and they decided to instead invest in improving their human capital.”

Predictive policing remains a new and unproven technology. A report this year in New York University’s law review found that improper data practices artificially brought down the official crime rates in cities like Chicago and New Orleans, which meant machine-learning systems were not being given the correct data. Additionally, claims about the accuracy of predictive policing have not been subject to critical examination, say experts.

Read the full text of Lancaster’s original hybrid-policing request here:


This story has been updated with comment from the City of Lancaster.

Correction: A previous version of this article incorrectly stated Miami police received a grant to work with IBM. The grant was actually for HunchLab, a predictive policing software from another company.