Two members of the U.S. House of Representatives have asked the Department of Housing and Urban Development to end the use of biometric technology, including facial recognition, for surveillance purposes in public housing.
Congresswomen Maxine Waters and Ayanna Pressley say that the technology could misidentify residents of color, leading to wrongful penalties. The effect, they asserted, could be to harass or punish residents for minor rule infractions.
“These policies run directly counter to the goal of increasing housing stability and fairness through HUD-provided housing, which is all the more critical in light of the devastating housing crisis facing our nation,” the lawmakers wrote. “Your agency must act in this critical moment to ensure public housing and HUD-assisted housing residents are not targeted by these discriminatory surveillance systems.”
Multiple studies of facial recognition technology have pointed to divergent error rates across demographic groups, with women of color the least likely to be accurately identified. A 2018 Massachusetts Institute of Technology study found that three commercially released facial analysis programs had error rates of between 20% and 34% when identifying dark-skinned women, compared with 0.8% or lower for light-skinned men.