BINJ INVESTIGATIVE FEATURES

FACIAL ANIMOSITY

As Massachusetts lawmakers weigh new facial recognition rules, their own guards shop for AI-enhanced surveillance that privacy advocates call “chilling,” “unlawful,” and “racially biased”


Protests on Beacon Hill may never be the same again. Pretty soon, a linked network of dozens of sophisticated cameras capable of leveraging AI to digitally unmask demonstrators and recognize vehicles may cover the golden dome from every angle.

At the Massachusetts State House, legislators are preparing to update state laws regulating the use of facial recognition technology—placing clearer boundaries on when law enforcement can use the technology that has been found to discriminate against people of color and misidentify suspects.

Meanwhile, the controversial tech is already being used in Mass to investigate fraud and property theft. And the agency that handles security at the State House is itself fixing to order video equipment that comes bundled with facial recognition capability.

Officials vow they will “never” use the technology they are buying. But in a state where police have used facial recognition software without any guardrails before, civil liberties advocates are skeptical. They say there’s a likelihood of usage creep with any new surveillance technology, despite promises and good intentions.

“It’s understandable if members of the public might be wary because, to be honest, if they acquire this and install it, who is going to keep them to the pledge they are making?” Alex Marthews, founder of the privacy advocacy group Digital Fourth, explained how this technology can become normalized outside of public view. He continued, “At that point, the equipment is already purchased and installed and it would be difficult to dial back on it. It would be better if they got equipment that was not technically capable of being able to administer facial recognition. … Any technology that is adopted, you have to assume it is going to be exploited.”

“Whenever police have a tool there is a temptation to use it,” said Matthew Guariglia, senior policy fellow at the Electronic Frontier Foundation. “This is where you need to have transparency, the ability for the public to audit what tech police are using.”

State Sen. Jamie Eldridge, who co-chaired a commission on the use of facial recognition technology in 2021, said the current bill in play is focused on law enforcement use. He’s unsure if the language can be changed to include State House security, but said the issue needs to be addressed.

“It’s absolutely very concerning,” Eldridge said.

Boston Calling, police spying

Ten years ago, thousands of concertgoers crowded City Hall Plaza for the first Boston Calling music festival. Some cops joined the party remotely, watching via numerous cameras that secretly used so-called Face Capture technology, which took photos of attendees for further analysis by law enforcement.

The cameras were part of a surveillance network designed by IBM and secretly welcomed with open arms by vulnerable city officials in the wake of the 2013 bombing of the Boston Marathon, a subsequent DigBoston investigation revealed. A Boston Police Department spokesperson initially said the BPD “do[es] not and have not used or possess this type of technology.” But that was untrue, as reporters had already unearthed documents and memoranda detailing the capabilities, along with video from the concert—including images of officers using IBM’s technology.

The deployment in question happened during the administration of Boston Mayor Tom Menino. After prior use of Face Capture was revealed in 2014, a spokesperson for the newly-elected Mayor Marty Walsh confirmed the presence of “situational awareness” tech at the concerts, but added: “The City of Boston did not pursue long-term use of this software or enter into a contract to utilize this software on a permanent basis. … From the City’s perspective, we have not seen a clear use case for this software that held practical value for the City’s public safety needs.”

A decade ago, the “People Search” function used during Boston Calling categorized subjects according to their “baldness,” “head color,” and “skin tone,” among other factors. All these years later, Guariglia said facial recognition technology still misidentifies people of color, with the perception of efficacy often caught in a feedback loop that relies on surveillance that is heavily focused on minority neighborhoods.

“Black men and women are most likely to be misidentified, and they’re the demographic most surveilled in America,” Guariglia said. “This technology disproportionately ends up in Black neighborhoods. When you put all surveillance equipment in one neighborhood, suddenly CCTV cameras serve up mostly Black individuals. … This is precisely why EFF has been advocating for a full ban of facial recognition technology—it’s too ripe for abuse, and too invasive. Even if it worked 100% of the time, that still means more effective surveillance and tracking.”

Boston officials had similar misgivings in 2020, when City Council members passed an ordinance banning use of the technology in the city. Reports at the time noted that BPD visual analysis software had facial-recognition capabilities, but the media was told it wasn’t being used; then-Commissioner William Gross even said at a hearing, “Until this technology is 100%, I’m not interested in it.”

The measure, which was co-sponsored by future Mayor Michelle Wu, was a big deal, with Boston reportedly becoming the second major municipality to ban facial recognition tech after San Francisco in 2019. But it didn’t go away entirely … 

In 2021, Boston city councilors pushed for protocols that would give the body more oversight and approval of surveillance technology; meanwhile, city officials were quietly looking to hire consultants to maintain a linked network of more than 1,000 video cameras across the Metro Boston area, with remote access shared across nine cities. Following reporting by the Boston Institute for Nonprofit Journalism on those contradictory developments, which echo the current quandary around facial recognition at the State House, then-Acting Mayor Kim Janey hit the brakes on the surveillance move—only for BPD to find ways around the regulations, while other law enforcement agencies across the state snooped with zero oversight—for a limited time.

State restriction, fraud investigation

In 2022, Mass legislators passed a law permitting only three agencies—the Registry of Motor Vehicles, the Massachusetts State Police, and the Federal Bureau of Investigation—to use facial recognition tech. Local police would need to get a warrant and then ask state police to use it—unless they made an emergency request stating deployment would help prevent harm to others, or in the event that they needed to identify a dead body. And departments had to file those requests, including the alleged crime and number of results obtained, to be included in an annual report on the use of facial recognition software in the state.

The first report, published by the Executive Office of Public Safety and Security and covering September 2022 to October 2023, lists 14 uses of facial recognition technology. Three are emergency orders; two are for unresponsive deaths; one is reportedly related to “Sexual Trafficking, Missing/Endangered juvenile.” The others came via court orders, with one involving a rape investigation and another an assault and battery on a police officer. The remaining uses were for property and financial crimes—fraud, larceny, ATM skimming, and passing bad checks.

The Norwood Police Department accounted for three of the 14 requests—one for counterfeit checks, one for identity theft and larceny of a motor vehicle, and one for identity fraud and larceny by check. (The last one is listed as an emergency request, but the department said in response to questions for this article that it was a misfile, and that all NPD requests were done by court order.)

An NPD spokesperson said one case involved using facial recognition technology on images of a person who used a fake ID to buy a car, saying the police had footage of the known suspect and tried to identify him. Another case involved a known suspect talking to another person via social media, so police ran the photo tied to the social media account of that second person. They noted that such a use would warrant follow-up investigation, since social media accounts frequently use images of people other than the user, but praised the tech as a “good tool for law enforcement that doesn’t get abused.”

The NPD spokesperson added: “Once you ID a person, then you have to do an investigative process to determine whether or not they’re involved in a crime. You think the database pops a person up and they’re charged, that’s not the case. … It’s just a way to develop a lead. … A lot of people are under the impression that facial recognition is a fishing expedition, but it’s not.”

These tools, however, are fundamentally flawed, according to privacy experts. Guariglia of EFF said that by using the state’s RMV database, for example, facial recognition functions increase the likelihood that innocent people will be pulled into such an investigation.

“What are they comparing that grainy suspect photo to?” Guariglia said. “What they’re comparing to in many states is every single person who had a driver’s license photo taken. Suddenly, millions of innocent people find themselves constantly having their faces searched and compared to a suspect—it erodes everyone’s presumption of innocence. … Facial recognition still gets things wrong, especially in reference to Black men and women. Police around the country have demonstrated they can’t be trusted to do follow-up investigations.”

When facial recognition tech compares an input to mugshots already in the system, it targets people who were arrested for past crimes but have no reason to be suspected of current ones—especially since a match doesn’t have to be certain to be acted on.

“Even if a person was previously arrested, that is no indication they’re likely to commit a crime in the future, and that doesn’t mean they’re a suspect for the rest of their life,” Guariglia said. “What the tech spits out is the most ‘likely’ match, but that could be only 60% [likely]. They don’t say that when making the arrest. That is the larger problem using this [kind of] technology—how defense attorneys don’t get access to the tech and how it works.”

The aforementioned state commission charged with examining the use of facial recognition tech by law enforcement, co-chaired by Sen. Eldridge, made several recommendations in 2022. Among them, law enforcement should be required to turn over all records of facial recognition searches to defendants and defense attorneys. Lawmakers are currently considering a bill that would adopt that recommendation along with several others, like requiring that the annual state report on sanctioned deployments note the race and gender of suspects. The proposal also specifies that state officials would have to hold a public hearing before procuring new facial recognition technology. State police would still be allowed to use it, and to do so on behalf of local departments, but the bill would clearly make it “unlawful” for any other law enforcement agency to buy, possess, or use the tech.

The Act to implement the recommendations of the special commission on facial recognition technology was originally passed in the Mass House last year, but languished before being picked up by the Senate this term. The measure’s now before the House Committee on Ways and Means. Guariglia and Marthews said their organizations support the new restrictions, as did Kade Crockford, director of the Technology for Liberty Program at the ACLU of Massachusetts.

“The ACLU of Massachusetts strongly supports the recommendations of the special commission, and we urge the legislature to enact legislation implementing them as soon as possible,” Crockford said. “No person should have to worry that the government could use racially biased, invasive software to track their every public movement.”

As for within and around the hallowed halls on Beacon Hill … Crockford continued: “For the same reasons, we would be very concerned about the use of face surveillance at the State House. Lawmakers, staff, lobbyists, journalists, advocates, and Massachusetts residents should not have to worry that they are being secretly tracked by technology enabling the cataloging of each person’s every movement inside the State House or on State House grounds. Such surveillance is chilling of First Amendment protected activity and has no place in a democracy.”

State House surveillance

Last month, the Bureau of the State House, which is tasked with “preserving” and “maintaining” the capitol, posted a request for equipment on the state’s clearinghouse for public bids. The solicitation listed State House Security Director Shaun Deinstadt as the purchaser of a Network Video Recorder. The bid, which was removed from public view less than a week after it was initially posted, included an addendum listing the specifications that Deinstadt, a former CIA officer, was looking for. Specifically, the bureau was shopping for an AI NVR made by the Motorola-owned company Avigilon.

“This compact, yet powerful family of recorders simplifies networking and installation of security systems by eliminating the need for dedicated analytics servers,” the one-sheet boasts, describing the NVR as an “Enterprise-Hardened Recorder [that] offers a cloud-ready suite of comprehensive security solutions to customers.” One solution: “No Face Mask Detection” tech that uses AI to flag students who aren’t wearing masks. Other accouterments include: “Classified Object Detection,” “Avigilon Appearance Search™ technology,” “Facial Recognition,” and “License Plate Recognition to up to 50 third-party or Avigilon non-analytic cameras on a single AI NVR.”

In a statement, Tammy Kraus, superintendent of the Bureau of the State House, said that while the department is requesting a recorder with that function, they would never use it.

“The State House has never used facial recognition technology nor will we be using facial recognition technology in the future,” Kraus wrote. “The solicitation for quotes that you referenced is for the replacement of our existing network video recorder which is nearing the end of its useful life. … The network video recorder we solicited quotes for will include the ability to use facial recognition technology, but this is not, and would not, be activated. Activation requires the payment of an additional licensing fee over and above the cost to acquire the network video recorder.” 

But the very existence of facial recognition in the tech the State House is buying made Guariglia skeptical that security wouldn’t use it.

“The fact that companies are going to increasingly bundle these technologies together means we can’t rely anymore on police just saying, We’re not going to use it. There needs to be some policy and legal framework making sure to address this, otherwise we’re just relying on their word. That is never a comfort.”

“It is of course good if they pledge they’re not going to use the facial recognition capabilities,” Marthews of Digital Fourth said. “The difficulty is, and it isn’t necessarily the fault of folks at the State House, but there is a really long track record when it comes to promises of use restrictions in surveillance equipment—law enforcement says it is capable of surveillance but will refrain, the technology is approved and rolled out, and then you find out later it’s being used for that purpose.”

Both Marthews and Guariglia echoed Crockford’s concerns that facial recognition technology at the State House could chill free speech and the public’s right to protest without fear of retribution. Guariglia said legislators “absolutely” need to push back on the tech for their institutions.

“This would bar people concerned about privacy from civic participation, they’d be nervous to come in and talk to public officials, which is their right to do,” Guariglia said. “There is a real chilling factor, that in going to talk to public officials about unpopular opinions, lodging a protest, might get you immediately identified and opened up to reprisal.”

The EFF policy fellow continued: “The trouble is mission creep—not knowing whether or not the devices supposed to theoretically protect elected officials end up being a device primarily for identifying people who have political disagreements with those elected officials.”

Marthews agreed, saying anonymity is important for potential whistleblowers, and for anyone who attends a meeting or protest and doesn’t want to publicly identify themselves. Furthermore, he said giving State House security the potential to use facial recognition goes against the intent of the bill legislators are looking to pass.

“The essence of the proposal the state is moving forward with is to be careful and thoughtful and judicious in ways facial recognition technology will be deployed—only in the context of probable cause for certain crimes,” Marthews said. “We want there to be due process around that. … It would be inconsistent to set up due process around general law enforcement for facial recognition, but around the legislature’s space there has to be facial recognition of everyone of a particularly invasive kind.”

“If the government is so concerned about transparency coming in and out of the State House, they can consider making the legislature and governor’s office subject to the Freedom of Information Act,” Marthews added. “It doesn’t seem to me to be valid to have a world where you have to affirmatively ID the face of everyone entering the legislature but also feel it is not appropriate to have people find out what’s going on inside the government.

“It’s another instance of full transparency for you and me, and full privacy for the people making the decisions.”

This article is syndicated by the MassWire news service of the Boston Institute for Nonprofit Journalism. If you want to see more reporting like this, make a contribution at givetobinj.org
