By Felipe De La Hoz
New Yorkers walking down the street are likely to be dimly aware that they are being surveilled, in some way, by a mixture of private and public entities. It’s part of the trade-off that we’ve made as a society, a kind of persistent monitoring in exchange for a sense of security and convenience.
The shift has been accelerated by the twin engines of post-9/11 security culture and the growing primacy of social media and targeted advertising, and it has elicited relatively little pushback as the technologies have grown more sophisticated while receding out of sight. The average person might object to a cop sitting on a street corner visibly jotting down the information of every driver who passed by, for example, but few seem perturbed by, or even aware of, the fact that automatic license plate readers all over the city are logging vehicle locations and timestamps going back years, and making this data available not only to the police but, on occasion, to federal officials and private data brokers.
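To make concrete what "logging vehicle locations and timestamps going back years" means in practice, here is a minimal sketch of the kind of record a plate reader produces and the retroactive query such a database makes trivial. The field names and data below are illustrative assumptions, not the NYPD's actual schema:

```python
# Illustrative sketch only: the schema and data are assumptions, not the
# NYPD's actual system. Each camera read is one small, timestamped record.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PlateRead:
    plate: str        # normalized plate number
    camera_id: str    # which fixed or vehicle-mounted reader saw it
    lat: float
    lon: float
    seen_at: datetime

reads = [
    PlateRead("ABC1234", "cam-017", 40.7306, -73.9866, datetime(2019, 6, 1, 8, 3)),
    PlateRead("ABC1234", "cam-112", 40.6782, -73.9442, datetime(2022, 5, 9, 17, 41)),
]

# Years of a single car's movements, reconstructed with a one-line filter.
history = sorted((r for r in reads if r.plate == "ABC1234"), key=lambda r: r.seen_at)
for r in history:
    print(r.seen_at, r.camera_id, (r.lat, r.lon))
```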
The NYPD in particular has long had a small arsenal of powerful surveillance tools, tools that, until the City Council passed the Public Oversight of Surveillance Technology Act in 2020, it could purchase and deploy in total secrecy, without even informing the Council. In addition to the license plate system — which, according to budget documents, the city is now seeking to have installed on all vehicles for the new “anti-gun” unit, a revamped version of the controversial former plainclothes anti-crime units — the most potent of this tech includes the Shotspotter system, an interconnected web of microphones and AI designed to detect the sound of gunshots; Stingrays, which simulate cell towers to connect to unwitting cell phones in a geographic area and hoover up their data; an internal DNA database gathered through questionable means; and now the looming possibility of greater investment in so-called gun scanner technology and facial recognition.
Technology’s aim is to reduce crime, solve cases
Each of these technological implements was intended to make investigations and police responses more efficient and reduce crime. As is to be expected, the reality of their rollouts has been more mixed, marred by both technical problems and swirling questions about civil liberties. “There’s this false idea that people have, that a computer system, or a new piece of technology is going to be better than a human, forgetting that humans designed it, humans trained it,” said Jerome Greco, supervising attorney for the Digital Forensics Unit at the Legal Aid Society.
If you’ve been to either Jacobi Medical Center or City Hall lately, you may have noticed something that looks a bit like a very small bus stop without the roof: two little pillars facing each other with some blinking lights and the words “evolv technology” printed prominently on the sides. Walking between the pillars, a passerby might not think twice about it, unaware that they have just been the subject of an invasive scan and an AI-generated threat assessment aimed at determining whether they were carrying a gun or other dangerous weapon. Or maybe they were unlucky enough to be carrying a laptop or iPad that triggered an alert, likely prompting guards or police to pat them down. This is an example of the latest fad in police tech, the so-called weapons detection system that has come into particular vogue in the wake of recent shootings.
These false positives might be merely annoying at the entrance to City Hall, but they would be disastrous in the event of a widespread rollout in the subway system, which is Adams’ ultimate objective, for two main reasons. The first is the simple introduction of additional friction for riders in a system that largely depends on speed and ease of access, and on which the entire functioning of the city in turn heavily depends. The second is the possibility of escalation over false positives and misunderstandings. Imagine the Times Square-42nd Street station at rush hour, with hundreds of riders entering per minute, all of them now expected to walk through these scanners in view of uniformed police officers, who will inevitably flag dozens down for setting off the alarms with their computers or umbrellas or whatever else the system might mistake for a gun.
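A rough back-of-envelope calculation shows why even a small error rate breaks down at subway scale. Every number below is an assumption for the sake of argument, not a published figure for Evolv or the MTA:

```python
# Back-of-envelope estimate; all numbers are illustrative assumptions.
riders_per_minute = 300      # assumed rush-hour entries at one large station
false_positive_rate = 0.05   # assume 5% of scans flag a laptop, umbrella, etc.

false_alerts_per_hour = riders_per_minute * false_positive_rate * 60
print(f"{false_alerts_per_hour:.0f} false alerts per hour")  # 900

# Each alert means a stop, a bag check or pat-down, and a chance of escalation.
```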
Who will be scanned on the subway and why?
If not every rider will be scanned, then that opens up a whole other can of worms around who will be scanned and why. It doesn’t take much imagination to envision the issues with having the police select certain people for scanning based on visual characteristics alone, or some sort of gut check. Checks could be “randomized,” but everyone knows just how random such checks really are at, for example, the airport. This brings us to the other major issue: if the system does flag someone as potentially armed, it will invariably trigger a heightened police response, whether or not they’re actually carrying a gun.
“When it happens to me, they’ll just wave me through and say, ‘Oh, it’s fine.’ When it happens to a young Black or Latinx kid, they’ll get patted down,” said Albert Fox Cahn, founder of the Surveillance Technology Oversight Project. “Every time you wrongly tell a police officer ‘this is an armed person,’ you are creating a dangerous situation. As you’re doing that thousands of times a day, it’s not going to take long for tragedy.” The clear cautionary tale here is Shotspotter, which has been known to send police ready for firefights to the site of fireworks or backfiring motorcycles. In Chicago last year, 13-year-old Adam Toledo was fatally shot by local police after officers responding to a Shotspotter alert arrived expecting an armed suspect, despite no obvious source of gunfire.
One way to reduce the incidence of false positives would be to turn the sensitivity settings down, but that would by definition also increase the risk of false negatives, i.e. weapons that pass through undetected, which at a certain point would make the whole exercise pointless security theater, introducing friction for no real benefit. There is simply no way to make the systems flag all guns while never confusing other objects for guns. “[Data is] not really being made available. And it’s not really being provided by independent third-party testing,” said Greco. “What are the metrics? Who gets to determine them? And then who gets to determine if they’re actually meeting them?”
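The trade-off Greco is pointing at is structural. Any scanner of this kind reduces to a confidence score plus a cutoff, and moving the cutoff in either direction just exchanges one kind of error for the other. A minimal simulation makes the point; the score distributions here are invented for illustration, not drawn from any real scanner:

```python
# Minimal simulation of the detection trade-off. The score distributions
# are invented for illustration; no real scanner data is used.
import random

random.seed(0)
harmless = [random.gauss(0.3, 0.15) for _ in range(100_000)]  # laptops, umbrellas
weapons = [random.gauss(0.7, 0.15) for _ in range(1_000)]     # actual guns

def error_rates(threshold):
    false_pos = sum(s >= threshold for s in harmless) / len(harmless)
    false_neg = sum(s < threshold for s in weapons) / len(weapons)
    return false_pos, false_neg

# Raising the cutoff cuts false alarms but lets more weapons through.
for t in (0.4, 0.5, 0.6):
    fp, fn = error_rates(t)
    print(f"cutoff {t:.1f}: {fp:.1%} false alarms, {fn:.1%} missed weapons")
```

Even at the strictest cutoff in this toy model, a few percent of harmless riders still set off the alarm, while a quarter of the simulated weapons slip through; there is no setting where both numbers go to zero.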
Questions have been raised about donors
Questions have been raised over financial ties between Evolv, the company that manufactures and markets the scanners, and major donors to Mayor Eric Adams, though he has pushed back on any suggestion that this has influenced decision-making. More broadly, there is precious little data on how well these tools really work, with the companies themselves sometimes raising questions over how ready the technology is for ubiquitous use. The City Hall and Jacobi deployments, among others, were meant to be pilots to test the technology, but the city has yet to release any results or conclusions, or to provide any timeline or intention for doing so. A City Hall spokesperson acknowledged questions about the pilots but did not answer them.
Mayor Adams seems to have an abiding belief in the power of these technologies to assuage fears of crime, fears that he has stoked and benefited from politically but is now losing control of: polls show the public increasingly scared of violent crime and skeptical of his response to it, despite recent downturns.
In addition to the scanners, he specifically mentioned facial recognition at least twice in press conferences earlier in his term, seeming to allude to the hugely contentious company Clearview AI, which has amassed a database of billions of photographs scraped from social media sites and other sources with the objective of, essentially, making almost every person on Earth identifiable in a matter of seconds. The NYPD has previously rejected widespread use of Clearview, even as records showed that many officers were communicating with the company and using its tools.
NYPD officials have said, publicly and privately, that there aren’t any current plans for official deployment of Clearview’s technology, but that could easily change. The prospect worries civil libertarians for the simple reason that it could fundamentally kill the ability to be anonymous anywhere, at any time. Many might not see that as necessarily a bad thing, as it would have permitted, for example, the police to more quickly identify the suspect in a fatal subway shooting last month, or the person who killed several homeless people in New York and Washington, D.C., in March.
A double-edged sword
Yet these capabilities are always a double-edged sword. They could also allow police to identify marchers in Black Lives Matter protests, something the NYPD has already done. In an environment where local police have multifaceted agreements and task forces with each other and with federal agencies, information captured by the NYPD could end up anywhere: with state police, the FBI, even ICE. Ultimately, we New Yorkers will have to decide for ourselves what balance of risk, convenience, and liberty is tolerable, but for that, we need to be aware of what is being deployed and how.
Fox Cahn believes that one significant shift might come in the form of greater public awareness of the unexpected risks of surveillance in the aftermath of the leaked draft Supreme Court opinion overturning Roe v. Wade. Several states are already pondering laws to prohibit residents from traveling elsewhere to receive abortion care, and surveillance tech could facilitate their tracking and prosecution, such as via so-called geofence warrants that use location data to determine whether someone visited, say, an abortion clinic. “I do think, honestly, the discussion around surveillance, and abortion is going to change this debate fundamentally, in New York,” he said.
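Part of what worries advocates is how mundane the mechanics are: a geofence query is little more than a distance filter over a log of location pings. A minimal sketch, with hypothetical coordinates, device IDs, and radius:

```python
# Minimal sketch of a geofence-style query over a log of location pings.
# Coordinates, device IDs, and the 100 m radius are all hypothetical.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

target = (40.7128, -74.0060)  # hypothetical clinic location
pings = [
    ("device_a", 40.7130, -74.0062, "2022-05-03T10:14"),  # ~30 m away
    ("device_b", 40.7700, -73.9800, "2022-05-03T10:20"),  # miles away
]

# Every device seen within 100 meters of the target: just a filter.
hits = [p for p in pings if haversine_m(p[1], p[2], *target) <= 100]
print(hits)
```

The query itself is trivial; the live question is what location data exists in the first place, and who can compel access to it.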