Testimony of Lucy Block Before the New York City Council Regarding Use of Facial Recognition and Biometric Data Collection

We have three core concerns. First, facial recognition technology frequently misidentifies women, people of color, and the elderly, creating a disproportionate risk that such residents will be locked out of their homes in the name of “security.” Second, unnecessary collection of biometric data breaches the privacy of all tenants, and there are no safeguards to guarantee the security of the data collected. Third, the proposed opt-out provision, though well-intentioned, cannot adequately safeguard tenants. For these reasons, and because landlords have many other, less intrusive security measures at their disposal that allow them to ensure building safety without increasing surveillance and compromising privacy, we ask that the Council reject the proposed bills and instead consider a ban on the use of facial recognition technology in housing. Such a ban is already under consideration in Albany and at the federal level, and we believe it would better serve the low-income communities of color that we work with and represent.

Discrimination is Inherent in Both the Technology and Its Proposed Roll-Out

The use of facial recognition and biometric data collection in private spaces will disproportionately disadvantage women, the elderly, and people of color, particularly those with darker skin. A 2018 MIT study showed that facial recognition software often misidentifies people of color: the authors showed that IBM’s algorithm misidentifies light-skinned men just 0.3% of the time and misidentifies dark-skinned women 34.7% of the time.1 People of color already face significant discrimination in housing, including in new luxury buildings of the sort most likely to adopt new facial recognition technology. Imagine those same residents being denied access to their home because the software does not accurately recognize dark skin tones.

Rather than ensuring the security of all residents, facial recognition and biometric data collection will add to the over-policing of residents of color in particular, while breaching the privacy of all residents. People of color are already over-policed in public and private spaces, and artificial intelligence makes mistakes. We cannot risk merging the two and allowing a new generation of high-tech over-policing into our homes and businesses.

“Security” That is Insecure

Even if facial recognition systems were free from bias, their use would still be a serious violation of privacy for all residents, and would render all of their sensitive biometric information less secure.2 CM Richards’ Intro 1672 requires that landlords register their use of facial recognition technology with DoITT. But there are no guarantees whatsoever about the security of the biometric data collected: the bill does not place any requirements on how the data must be stored and protected, nor does it place limitations on whether and with whom landlords can share tenants’ biometric information. Even if the bill had included such guidelines, no data storage system is immune to breaches and hacking. A requirement to register this technology will not ensure that data is well protected; it will not ensure the security or privacy of residents.


Tenants Will Not Be Able to Opt Out of Surveillance

We appreciate Councilmember Landers’ effort to mitigate the negative impacts of facial recognition technology by seeking to require that landlords provide metal-key entrances as a mandatory alternative to a facial recognition entry system and give tenants the right to opt out of the system. Unfortunately, opting out will not be a realistic option for many. First, the bill does not require affirmative consent from tenants for use of the technology, nor does it require landlords to alert tenants to their right to decline. Second, tenants in a tight rental market may not feel able to exercise their right to opt out if doing so might threaten their tenancy. Low-income tenants with fewer housing options will feel this pressure most acutely. Finally, even if a tenant succeeds in opting out, facial recognition entry systems may well capture identifying information even from those who have sought to opt out. For instance, the StoneLock system proposed at APT can scan a face up to 3 feet away from the terminal, and the system takes and stores pictures of any face scanned that it does not recognize.

* * *

In conclusion, there is no security interest that outweighs the significant potential harms of the use of facial recognition technology in residential spaces. We urge the Council to reject these bills and instead adopt a ban on the use of this technology. Thank you for the opportunity to testify. I am happy to answer any questions and can be reached at lucy.b@anhd.org or 212-747-1117 x13.
