Wednesday, April 24, 2024


U.S. Bill Would Ban Biometrics In Public Housing

Legislation recently introduced in the U.S., titled ‘The No Biometric Barriers to Housing Act of 2021’, would ban biometric technologies, including face recognition, in public and assisted housing units funded by the Department of Housing and Urban Development (HUD).

“Facial recognition technology consistently misidentifies women and people of color and only exacerbates the constant surveillance and criminalization that the most marginalized already face,” sponsors of the bill said in a statement. “This much-needed bill will ban the use of facial recognition and other biometric technologies in HUD-funded properties and will help protect the civil rights and liberties of tenants throughout our country.”

Beyond banning the use of facial recognition technology in housing units that receive federal funding, the bill also requires HUD to submit a comprehensive research report to Congress. The report must include any known use of facial recognition technologies in public housing units; the tech’s impact on tenants; the purposes for installing such tech in housing units; demographic information of impacted tenants; and the impact of emerging technologies on vulnerable communities in public housing. The No Biometric Barriers to Housing Act follows the introduction of legislation in June that seeks to stop America’s federal government from using biometric technology and to withhold funding from all local agencies that deploy it.

NIST, which assesses face recognition technologies annually, has found that the false-positive and false-negative rates of the leading algorithms, tested using mugshots, application photographs from individuals applying for immigration benefits, visa photographs, and images taken of travellers entering the United States, show ‘undetectable’ differences between demographic groups.

When thresholds were set for a false-positive rate of 0.01 percent for white males (1 in 10,000), more than half of the 17 most accurate algorithms had false-positive rates of 0.03 percent or better (3 in 10,000) for black males, Asian males and white females. However, NIST’s testing did find that poorer-performing algorithms exhibited greater demographic bias, likely caused by skewed data introduced during development. For greatest accuracy, face recognition algorithms need access to equally large datasets of every possible variation.
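The calibration described above, fixing the false-positive rate for one group and then measuring other groups at the same threshold, can be sketched in a few lines. Everything below is synthetic: the scores are random numbers, and the group names and distributions are hypothetical illustrations, not NIST data.

```python
import random

random.seed(0)

# Hypothetical impostor (non-matching pair) similarity scores for two
# groups. Group B is given a higher mean to model an algorithm that is
# more error-prone for it (scores its non-matching pairs higher).
group_a = [random.gauss(0.30, 0.10) for _ in range(10_000)]
group_b = [random.gauss(0.40, 0.10) for _ in range(10_000)]

def threshold_for_fpr(scores, target_fpr):
    """Threshold at which roughly target_fpr of non-match scores still pass."""
    ranked = sorted(scores, reverse=True)
    k = max(int(target_fpr * len(ranked)), 1)  # allowed false positives
    return ranked[k - 1]  # the k-th highest score just clears the bar

def false_positive_rate(scores, threshold):
    """Fraction of non-matching pairs that wrongly clear the threshold."""
    return sum(s >= threshold for s in scores) / len(scores)

# Calibrate a 1-in-10,000 (0.01 percent) threshold on group A ...
t = threshold_for_fpr(group_a, 0.0001)
fpr_a = false_positive_rate(group_a, t)  # 0.0001 by construction (no ties)
# ... then apply the SAME threshold to group B and compare the rates.
fpr_b = false_positive_rate(group_b, t)
```

The key point the sketch makes is that a single global threshold yields different false-positive rates per group whenever the score distributions differ, which is exactly the comparison NIST's per-demographic testing performs.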




SEN News (https://sen.news)
Security & Electronics Networks: leading the security industry with news and coverage of the latest events, providing information and pre-release updates on the latest technology daily. SEN News has been in print for over 20 years and has grown into a worldwide digital media resource.

