Brooklyn resident Fabian Rogers knew he had to act in 2018, when his landlord abruptly tried to install a facial-recognition camera in the entryway of the rental building he had called home for years. Under the new security system, all tenants and their loved ones would be forced to undergo a face scan to enter the premises. The owner, like many others, pitched the controversial technology as a safety enhancement, but Rogers told Gizmodo he saw it as a subtle attempt to raise rents and push people like him out.
“They were trying to find ways to speed up how people get kicked out of the building and then trying to market the newly flipped units,” Rogers told Gizmodo.
Rogers says he tried to speak out against the new security measure, but quickly realized there weren’t any laws on the books preventing a landlord from implementing the technology. Instead, he and his tenants’ association mounted an online name-and-shame campaign targeting the landlord’s reputation. Remarkably, it worked. The exhausted owner backed off. Rogers now advocates against facial recognition at the state and national levels.
Despite that success, Rogers said he’s seen increased efforts by landlords in recent years to deploy facial recognition and other biometric identifiers in apartment buildings. A first-of-its-kind bill, debated during a fiery New York City Council hearing on Wednesday, seeks to outlaw the practice. Rogers has spoken out in support of the proposed legislation, as have several city council members.

“We are here to address an invisible but urgent issue affecting all New Yorkers: the use of biometric surveillance technology,” Councilwoman Jennifer Gutiérrez said in a statement. “It is our responsibility as elected officials to carefully examine the potential benefits and risks.”
On Wednesday, council members repeatedly expressed concerns about the ability of private companies and real estate owners to misuse biometric identifiers or sell them to third parties. Councilwoman Carlina Rivera, who is sponsoring a bill restricting facial recognition in residential buildings, said she fears aggressive landlords will use the technology to cite tenants for minor lease violations, which could eventually lead to their eviction. Left unchecked, she said, the racially biased algorithms driving these systems risk further fueling gentrification, which threatens to “erode what should be a city’s diverse collective identity.”
Privacy and civil rights advocates say the bill — along with a sister bill that seeks to ban the use of facial recognition in sports stadiums and other large venues — could have broad ramifications beyond the Big Apple and serve as an example for other local legislatures.
“Facial recognition technology poses a significant threat to our civil liberties, our civil rights, and the privacy of our citizens,” Derek Perkinson, New York City field director for the National Action Network, said during a rally outside City Hall on Wednesday. “It is biased and broken… In the words of Al Sharpton, what’s right is right, and what’s wrong is wrong.”
How would the New York City bills affect facial recognition?
The two bills under consideration at this week’s council hearing approach limits on facial recognition from two different angles. On the housing side, a bill introduced last week would make it illegal for landlords who own multiple buildings to install biometric identification systems to screen tenants. Under that bill, landlords would be prohibited from collecting biometric data on any person unless they “expressly consent” in writing or through a mobile application.
The other bill, also introduced last week, would amend the city’s administrative code to prevent venues and other places of public accommodation from using biometric recognition technology. Those public places could include retail stores, movie theaters, sports stadiums, and hotels, and the bill could directly implicate Madison Square Garden, which earned national notoriety earlier this year for using facial recognition to identify lawyers and promptly eject them from its premises. New York already has a law requiring such companies to post signage letting the public know they collect biometrics, but lawmakers and advocates say it does little to prevent large swaths of face data from being vacuumed up and possibly sold to third parties.

What happened during the New York City Council hearing on facial recognition?
Wednesday’s session, hosted by the New York City Council Committees on Technology and Civil Rights, began with lawmakers grilling senior members of the city’s Office of Information Privacy (OIP), which is responsible for advising the mayor and other city agencies on privacy protection and data-sharing initiatives. OIP leaders declined to offer insight into the ways local agencies such as the New York Police Department handle biometric data. Instead, one of the city’s leading data privacy bureaucrats spent the better part of two hours dancing around questions and refusing to take any position on the bills in question.
Privacy advocates testifying at the hearing were frustrated by OIP leaders’ evasiveness, accusing one administration official of spreading “misinformation” and appearing to withhold available data. “The New York Police Department is systemically violating transparency and oversight laws,” Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project, said during the hearing. Fox Cahn said the city’s current data privacy practices amount to a “free-for-all.”
Council members warned that facial recognition used by private companies such as Madison Square Garden could lead to an “Orwellian” reality in which people of color are wrongly identified as thieves or other prohibited persons and unjustly denied entry. But not all lawmakers agreed. Councilman Robert Holden defended the technology and said he believes laws restricting the freedom of private companies to use such systems for security amount to government overreach.
Biometrics: if it’s hacked, it’s hacked for life
Advocates who spoke in favor of the bills spent much of their testimony trying to convince lawmakers of the unique threat the technology poses to the public. Fox Cahn said the “time frame of harm” associated with biometric identifiers sets them apart from other types of personal data because they stick with people throughout their lives. “If it’s hacked, it’s hacked for life,” he said.
Left unchecked, he argued, these surveillance tools don’t just affect New Yorkers — they amount to an “enormous threat to democracy.” Even improved accuracy won’t address the underlying problem. “Increasing accuracy rates will not fix fundamental flaws,” Fox Cahn told lawmakers. “They will always reflect the prejudices of those who make them.”
Rogers, the tenant advocate who successfully fended off his landlord’s attempt to install facial recognition in his building, said he’s optimistic that these and other bills across the country can gain traction. However, he acknowledged some of the difficulties inherent in resisting a tool many people find convenient.
“Convenience is what makes these tech solutions the go-to option for companies,” Rogers said. “I think that as long as advocates stay active, keep collaborating, and do the political education that makes this understandable to a fifth grader, we’ll get to a point where people understand that organizing and enforcement are necessary.”