Facial recognition locked me out of my own apartment. NYS must ban it.

Racing to escape the rain, I reached my building in the Bronx and positioned my face in front of the facial recognition entrance system. Despite multiple attempts, the screen failed to identify me. As a Black woman, I had already experienced the inconvenience of the system taking longer to recognize me than it did lighter-skinned individuals. But stuck outside alone at night, this was the first time I truly felt endangered by this technology and became aware of the omnipresent surveillance in my neighborhood. This experience forced me to confront disturbing realities: how much personal biometric data I was relinquishing, who was accessing it, and how easily a system meant to secure my building could bar me from my own home.

Across New York, countless tenants with landlord-installed facial recognition share these concerns about surveillance and racial discrimination. The prevalence of facial recognition technology in residential buildings not only infringes on personal privacy and puts tenants’ biometric data at risk, but also exhibits racial biases against overpoliced Black and Brown communities. It is imperative that New York State legislators ban its use this session to ensure the safety, privacy, and dignity of all New Yorkers in their homes.

Surveillance is especially prevalent in public housing. According to the Washington Post, there is one camera in New York City public housing for every 19 residents, comparable to one for every 20 visitors to the Louvre in Paris. This level of surveillance saturation goes hand in hand with the extreme gentrification occurring in Brooklyn and Manhattan, but also beyond the city, in places like Yonkers and Buffalo. Adding facial recognition technology to their surveillance arsenal is part of landlords’ efforts to attract wealthier, predominantly white tenants to these neighborhoods and make them feel safer, while closely monitoring current, mainly Black and Latinx public housing residents. This increases the likelihood of those tenants being punished or evicted for minor rule violations, including when the technology misidentifies them.

The NYPD can access surveillance footage through its dystopian Domain Awareness System (DAS), which consolidates private and public camera feeds and intelligence data, regardless of its origin. This system includes video footage from cameras around the city, including in residential buildings, infringing upon the freedom of residents, particularly Black and Latinx communities who are already disproportionately targeted by discriminatory policing. With the NYPD’s track record of using facial recognition technology against children, who are more prone to misidentification by the technology, it comes as no surprise that residents feel threatened at the idea of the police having access to their biometric data and movements without consent. 

Tania Acabou, a single mother from New Bedford, Massachusetts, found herself evicted on the grounds that she had violated a guest policy, as determined by the facial recognition camera, when the “guest” was her ex-husband providing childcare so that she could attend school. New Yorkers, threatened by similar actions, have resisted non-consensual installation of these systems by landlords. In 2019, residents of Atlantic Plaza Towers in Brooklyn organized against the implementation of facial recognition technology that forced them to surrender their biometric information. The power these systems give to landlords is boundless, forcing tenants to grant access to their biometric data to third parties with no security assurances. 

Protests by community members have successfully led to the introduction of a bill in the New York State legislature that would prohibit the use of facial recognition by landlords on any residential premises. To protect New York residents from the threat of being tracked by landlords and losing control over their data, this bill needs to be advanced to the Assembly and Senate floors for a vote. At a time when protecting the rights and privacy of New Yorkers is paramount, legislative intervention is more crucial than ever to prevent discrimination and safeguard the homes of residents.


Renwick-Archibold is a Research Intern at the Surveillance Technology Oversight Project (S.T.O.P.) and a 2024 graduate of Washington University in St. Louis with a degree in computer science and cognitive neuroscience.