Most of Mayor Byron Brown’s proposed $5.4 million hike in police spending is for new patrol officers and detectives.
There is also $364,000 earmarked for a product called ShotSpotter.
ShotSpotter deploys an array of microphones in a neighborhood — 15 to 20 per square mile, attached to buildings and light posts — to detect and pinpoint the source of gunshots, then report the location to police.
The company claims the technology is 97 percent accurate, provides police with intelligence on gunfire that might otherwise go unreported, improves police response time, and helps to reduce gun-related crime. About 120 cities have contracts with ShotSpotter, according to an opinion piece by the company’s CEO, published last fall in The Buffalo News. The company charges up to $90,000 per year per square mile covered by its microphones.
Does it work?
Several studies in different cities say the answer is no — at least, not well enough to justify the cost.
Take Chicago, for example.
Chicago’s police department has spent nearly $8.5 million per year on ShotSpotter since entering a contract with the company in 2018.
Last May, the MacArthur Justice Center released a report that found ShotSpotter “inaccurate, expensive and dangerous.” The study found ShotSpotter alerts rarely helped Chicago police to prevent, solve, or even find a crime.
“Almost nine times out of ten, the police don’t turn up evidence of gun crime or any crime at all,” wrote one of the report’s authors, Jonathan Manes, a former University at Buffalo law professor and former co-chair of the Buffalo Police Advisory Board.
“This system puts police on high alert and sends them racing into communities,” he added. “It creates a powder keg situation for residents who just happen to be in the vicinity of a false alert.”
Chicago’s Office of the Inspector General also studied ShotSpotter, releasing its findings last August. The Inspector General found only about 9 percent of ShotSpotter alerts yielded anything of value to police.
“ShotSpotter alerts rarely produce documented evidence of a gun-related crime, investigatory stop, or recovery of a firearm,” the Inspector General determined.
In an email to Investigative Post, a ShotSpotter spokesperson said the Inspector General’s report “did not specifically suggest that ShotSpotter alerts are not indicative of actual gunfire,” even if those alerts do not yield evidence or police reports.
“It is important to note that traditional 911 calls for service from community members during this same time period resulted in a police report or evidence recovered is about the same rate as ShotSpotter and there is universal agreement about the value of the 911 system,” the spokesperson wrote.
The Chicago studies corroborate data collected elsewhere.
In 2020, Police Chief Magazine published an analysis of ShotSpotter’s results in St. Louis, Missouri. That analysis showed ShotSpotter led to an increase in “shots fired” calls for police to investigate and quickened dispatch times to those calls.
However, ShotSpotter did not improve resolution of those calls. It did not yield troves of evidence. Over a decade, the study found, ShotSpotter’s microphones generated more than 19,000 calls for service in St. Louis, but just 13 arrests.
The study also found that, while ShotSpotter resulted in an overall increase in the number of “shots fired” reports, the number of citizens calling 911 to report gunfire decreased in St. Louis neighborhoods where the listening devices were deployed.
Those citizen calls to 911 were “over seven times more efficient in uncovering and responding to criminal behavior” than ShotSpotter, according to the study.
In response to the St. Louis study, ShotSpotter’s spokesperson pointed Investigative Post to a competing review conducted by New York University’s Policing Project, which found assaults dropped “about 30 percent following the implementation of the technology relative to comparable areas without it.”
The ShotSpotter spokesperson did not note — though the authors of the competing analysis do — that ShotSpotter has been a funder of the Policing Project since 2018. ShotSpotter has paid the Policing Project for services. And ShotSpotter’s CEO sits on the project’s advisory board.
“The Policing Project’s pre-existing relationship with ShotSpotter and pre-existing audit played a role in initiating this report,” the authors wrote near the top of their study.
ShotSpotter’s spokesperson also noted that the lead author of the first St. Louis study published an essay last September softening his criticism of the technology. While he did not recant his findings in St. Louis, he wrote “my more recent work in Cincinnati shows strong crime reductions” that correlate with that city’s deployment of ShotSpotter microphones.
Additional studies have been conducted in other cities:
- A study produced by Johns Hopkins University of 68 large urban counties across the nation using ShotSpotter found “no significant impact on firearm-related homicides or arrest outcomes.”
- An analysis by a Dayton, Ohio, public radio station found that fewer than 2 percent of ShotSpotter alerts resulted in arrests.
- A Memphis TV station reported that 5,000 ShotSpotter alerts over six months yielded just 15 arrests.
Nonetheless, ShotSpotter continues to win new clients and renew old ones.
Despite reports from the Chicago Inspector General and MacArthur Justice Center, Chicago’s Police Department extended its contract through 2023. Pittsburgh has increased its ShotSpotter coverage from four square miles to 18, at a cost of $1.2 million a year through 2025, despite questions about the program’s efficacy.
Now Buffalo, which first flirted with the technology more than a decade ago, is queueing up.
Michael DeGeorge, spokesman for both the mayor and the police department, refused to answer questions from Investigative Post or make any other city officials available for an interview.
Last month, however, Buffalo Police Commissioner Joseph Gramaglia told the Common Council’s Police Oversight Committee that ShotSpotter could get his officers to potential crime scenes more quickly. That would improve their chances of catching criminals in the act, collecting evidence and helping victims.
“The quicker you can get to a shooting victim, the quicker you can render first aid and get them to a hospital,” Gramaglia said.
On a web page answering criticism of its products, ShotSpotter links to a study of Camden, New Jersey, indicating ShotSpotter alerts improved police response times to shooting scenes there. This, in turn, meant shooting victims got to a hospital more quickly.
However, the study also found the health outcomes for those victims were “not significantly different.” A study in Hartford, Connecticut, reached the same conclusion: ShotSpotter alerts offered “no benefit” to shooting victims.
Gramaglia told the Council the statistics he cited to support the purchase of ShotSpotter came from meetings with and presentations by the company.
The commissioner said police would benefit from learning about and responding to gunfire that isn’t reported to 911. Between 2015 and 2019, the department averaged 1,625 “shots fired” calls each year, according to an Investigative Post analysis. Gramaglia believes there’s a lot of gunfire the department is missing.
On its website, ShotSpotter cites a Brookings Institution study that concluded only 22 percent of shooting incidents generate 911 calls.
ShotSpotter claims its system is 97 percent accurate in identifying gunfire. That claim assumes every alert the system generates is correct, unless a client reports an error.
Given what analysts have found in Chicago and St. Louis, critics say, it’s likely police simply don’t bother to tell the company every time ShotSpotter sends them on what Manes, author of the Chicago study, calls a “dead-end deployment.”
“The fact that ShotSpotter still touts these ‘accuracy’ numbers reveals its contempt for science and accountability,” Manes wrote in a Buffalo News opinion piece, co-authored with Anthony O’Rourke, a UB law professor who testified to the Common Council this week about ShotSpotter’s flaws.
“It’s not based on any actual testing of the system to see how often it’s fooled by loud noises,” O’Rourke told Investigative Post.
In their essay, Manes and O’Rourke refer to reports that ShotSpotter data has contributed to wrongful prosecutions. In Rochester, for example, a man charged with attempted murder of a police officer — based in part on ShotSpotter data — is suing the city for false and malicious prosecution.
According to a Reuters investigation of the case, ShotSpotter altered its reports multiple times to match the police narrative of the shooting in question. ShotSpotter did so “per the customer’s instruction,” according to documents obtained by Reuters.
A recent NBC News investigation into ShotSpotter’s track record revealed one secret to the company’s success: effective lobbying. The company has been successful in convincing federal agencies to make grant money available to help pay for its product. Then it helps police departments win that grant money.
However, according to O’Rourke, the annual price of a ShotSpotter contract isn’t the end of the expense. “The hidden costs of it — in terms of manpower for increased overtime chasing dead-end alerts — are just not visible,” he said.
Not all law enforcement officials are sold on ShotSpotter.
Last summer, the company sent a letter to law enforcement agencies across the country, NBC News reported, asking their help to push back against media reports disparaging the company and its product. Ralph Clark, ShotSpotter’s CEO, asked the company’s clients to highlight “the positive impact it has made in your city or town, whether through interviews, bylined pieces or social media posts.”
John Johnson, a prosecutor in Little Rock, Arkansas, demurred, saying he’d never seen a homicide case file that included data from ShotSpotter.
“Congratulations to those cities that have been successful in their implementation of your product,” Johnson wrote to Clark in an email obtained by NBC News, “but in my opinion the money would be better spent installing video cameras around the city that show what happens rather than a bunch of microphones that ‘listen.’ ”