City Council is getting ahead of the game with two new bills related to biometric surveillance. Photo: Etienne Girardet

After a circuitous City Council hearing yesterday, two bills related to biometric surveillance are poised to pass and land on Mayor Eric Adams’ desk. One would block landlords and building managers from using biometric technology like facial recognition in buildings they own or operate without users’ express consent, and would bar owners of multiple dwellings from rolling out such tech for tenants and their guests at all; the other would prevent places of public accommodation like stores and music venues from using biometrics to “verify or identify a customer,” functionally making these technologies illegal for that purpose.

The bills, to the Council’s credit, somewhat preempt a tech issue before it becomes widespread, which is the opposite of how tech regulation typically works. One of the most consistent features of the push-pull of policymaking around burgeoning technology in the past few years has been that the pace of advancement is so fast and its applications so immediately scalable that they can be rolled out and become ubiquitous long before political leaders are able to even grasp them, let alone enact rules. This is why recent examinations of the massive impacts of social media, for example, are happening essentially after companies like Meta (formerly Facebook) have become among the largest in the world and fundamentally shifted everything from global geopolitics to teen mental health.

The same holds true for surveillance tech. Security cameras, incidentally, are a good example. Some people might take issue with the concentration of security cameras in New York City, but ultimately, their rollout was incremental enough that there could at least be a robust public conversation about the extent to which they were appropriate, both in private and public hands. They required a certain visible infrastructure and couldn’t be placed quickly and quietly. Adding facial recognition to those cameras, however, could in theory be done instantaneously with a software update, and would be imperceptible to anyone who didn’t know it was happening.

Here, the Council is to some extent hoping to tackle this before all the externalities start emerging, though of course it’s not that these issues haven’t come up at all out in the wild. Most infamously, Madison Square Garden recently ejected an attorney from Radio City Music Hall, where she was chaperoning a group of kids to see the Rockettes, when its facial recognition system identified that she worked for a law firm involved in litigation against MSG. There was no allegation that she was doing anything untoward — she wasn’t personally involved in the case at all — but the company had issued a blanket prohibition on anyone from that particular law firm and an unknown number of other employers. The push in favor of these Council bills was also, of course, spurred by tenant organizers upset at landlords’ efforts to implement biometric systems.

Should you care about loss of privacy?

A natural question here, especially for busy people already inured to a certain level of privacy loss, is: who cares? If I don’t notice my face being scanned as I walk into a store or my apartment building, then what’s the harm? Won’t it just make things more convenient? An analogy I sometimes use when talking about these sorts of things is the idea of handing someone your car keys because they’ve promised to drive you somewhere and take the responsibility off your hands. Perhaps for a while they do, and you’re satisfied, but eventually they start refusing to drive you based on criteria you don’t really understand, or start recording everywhere they take you and selling the information, or requiring you to pay more for gas than someone else they drive, because they think you can afford it. Getting your car keys back is not easy, and maybe not even possible, and maybe you wish you hadn’t handed them over in the first place.

Do you know? Photo: Claudio Schwarz

Point is, you shouldn’t be thinking about these technologies’ use so much as their misuse. There are certainly defensible use cases, for example, a building tracking a known package thief. Yet the oft-repeated maxim of “well, I’m not doing anything wrong” falls apart when you consider that it’s not you who determines what’s “wrong,” unless you actively legislate it. Most reasonable people, I’d venture, wouldn’t have considered the attorney chaperoning those kids to the Rockettes to be doing anything wrong, yet she was swept up. What if a building owner decides tenant organizing is “wrong” and uses facial recognition to flag and harass organizers? What if groups of stores start sharing the biometrics of people they consider problem customers and blocking them from entry to businesses around the city, with no recourse or appeals process? That’s all within the realm of possibility unless there are specific rules in place.

In any case, these bills and the broader conversation on biometrics certainly aren’t the last word on the city’s regulation of emerging technologies, or even technologies that impact tenants. It’s becoming increasingly common for landlords to tell tenants that rents aren’t even being calculated directly by them but by algorithm, such as that of the now-notorious Texas company RealPage. A ProPublica investigation last year found that the company’s YieldStar software used data points including rents set by competing companies in an area to raise rents to their maximum possible marketable levels in cities nationwide, taking the decision-making out of managers’ hands and putting it into the hands of a machine whose exclusive purpose was to maximize profits.

Following the publication of the report, RealPage faced class-action lawsuits and now a Justice Department investigation into the contention that the practice is basically illegal price-fixing, even if it is being done via algorithm. That’s certainly welcome enforcement, but it is being undertaken after the software has become rather pervasive, basically under the radar. There are similar concerns around technologies that may, for example, allow landlords to track the location of tenants they want to accuse of illegally subletting apartments (location data is already widely available on data broker marketplaces in ways that make de-anonymizing it relatively easy), or otherwise facilitate constant monitoring of any little lease violations that can be used as fodder for evictions. These bills, if passed, might be a starting point for a much larger conversation.

Gov. Abbott shows disdain for immigrants yet again

Switching gears for a minute, let’s briefly talk about Texas Gov. Greg Abbott. The reactionary Abbott, who clearly harbors some national political ambitions, reportedly began sending chartered buses full of migrants to New York City again after a months-long hiatus, signaling that his political stunt of ferrying migrants to blue cities will continue unabated, human consequences be damned. It might seem odd to someone in NYC, surrounded by immigrants all the time and benefiting from the vibrancy of cultural interchange, but to people like Abbott, migrants seem to be fundamentally noncitizens first and people second.

This unpleasant reality was broadcast last Sunday when Abbott, responding to a mass shooting in the Texas city of Cleveland, initially referred to the shooter as having “killed five illegal immigrants,” making immigration status the victims’ primary point of identification. His office later apologized, not for the language itself, but for the statement’s inaccuracy, with a spokesman saying that they had since “learned that at least one of the victims may have been in the United States legally,” as if that were the main issue with the pronouncement. The fact that the office believed the discontent was about whether it had miscounted the undocumented immigrants among the dead is telling.
In busing migrants to NYC without any coordination with our municipal government, Abbott isn’t making the calculus that hurting these people is worth the political gain from sticking it to New York or highlighting the supposed hypocrisy of a sanctuary city. Fundamentally, he doesn’t care about hurting them, because they’re them, not us, an out-group whose value is inherently inferior.

Felipe De La Hoz is an immigration-focused journalist who has written investigative and analytic articles, explainers, essays, and columns for the New Republic, The Washington Post, New York Mag, Slate,...
