Controversy continues to swirl around biometric facial recognition, with U.S. House Committee hearings, legal challenges, and new revelations about who is using the technology and how.
The House Committee on Oversight and Reform held the first of its hearings into facial recognition, examining the impact the technology has on civil rights and liberties. The committee heard testimony from Algorithmic Justice League Founder Joy Buolamwini and Clare Garvie of the Georgetown Law Center on Privacy and Technology, both of whom have sparked debate with recent research. Dr. Cedric Alexander, the former president of the National Organization of Black Law Enforcement Executives, University of D.C. Law Professor Andrew G. Ferguson, and ACLU Senior Legislative Counsel Neema Singh Guliani also testified.
Chairman Elijah Cummings (D-MD) noted in his opening statement that the potential threat to civil rights and liberties posed by biometric facial recognition is a bipartisan issue, and that the GAO recently found the FBI has not implemented half a dozen changes to its facial recognition system recommended in 2016. Cummings called for recommendations that would recognize the technology’s potential, while safeguarding against misuse.
“The question is, do you have an all-out moratorium,” Cummings said.
In an exchange with Rep. Alexandra Ocasio-Cortez (D-NY), Buolamwini reiterated her findings that facial recognition algorithms work less well on women, people of color, and those of “different gender expressions,” while they do work well for white men, who make up the demographic most represented among algorithm engineers. Rep. Jim Jordan (R-OH) invoked the dystopian novel “1984,” and said “I think it’s time for a time out” on the technology’s use. Jordan also expressed dismay that unelected officials from the FBI and 18 states had arranged for image-sharing from driver’s license databases without consulting elected officials or notifying affected individuals.
Recommendations discussed included disclosure rules and standards for law enforcement use.
“If I’m not properly trained, if I’m not supervised, if there’s no transparency… and then something goes awry, then I — the end user, the police chief — end up being the bad guy,” Alexander told the committee.
Ferguson told the committee that case-by-case litigation is inadequate to clarify the legal limits placed on facial recognition by the Fourth Amendment, and that “only legislation can respond to the real-time threats or real-time technology.”
The committee will hear from law enforcement at its next meeting on June 4.
New legal challenges
The Electronic Privacy Information Center (EPIC) has filed a lawsuit (PDF) to compel the U.S. State Department to release information about its sharing of facial images gathered from visa and passport applicants to other federal agencies.
EPIC told a federal court in Washington, DC, that Customs and Border Protection (CBP) is using the images in a border surveillance system that EPIC considers unlawful and has previously called to be suspended.
Meanwhile in the UK, former Liberal Democrat Cardiff city councillor Ed Bridges is challenging the use of automated facial recognition by South Wales Police, after gaining support from Big Brother Watch and Liberty and successfully running a crowd-funding campaign. Computing reports that Bridges is arguing the deployments breach data protection and civil rights laws, and that they were decided on without warning to or consultation with the public.
“It’s hard to see how the police could possibly justify such a disproportionate use of such an intrusive surveillance tool like this,” Bridges said, according to Computing.
Business as usual
Amazon shareholders have rejected a pair of measures brought by activist investors that would have forced the company to conduct civil liberties reviews before selling facial recognition to law enforcement agencies and to commission an independent review of the risks associated with Rekognition sales, Reuters reports.
The company had sought to have the vote blocked by the SEC, which declined to intervene. The resolutions were defeated by a wide margin, an Amazon spokesperson told Reuters.
The Santa Clara County Sheriff’s Office uses facial recognition across the Bay from San Francisco, where the technology was recently banned, and does so without a policy governing its use, NBC Bay Area reports. A proposed draft policy was due for consideration at a meeting of the county’s Board of Supervisors this week, but a vote on its approval has been delayed.
“Here in Santa Clara County, we have very limited use of facial recognition technology,” county Supervisor Joe Simitian said. “The sheriff uses it essentially as a match for mug shots, static photos. It’s a pretty narrow use.”
Meanwhile, two think tanks allege that a North Korean restaurant in Vietnam also sells facial recognition technology from the pariah state, CNN reports.
Analysts at the Center for Advanced Defense Studies (C4ADS) and the Center for Nonproliferation Studies (CNS) say that IT services, which are not covered by UN sanctions, provide a way for the regime to raise funds for its banned nuclear program. They traced a company called Future Tech Group, which recently took down a website advertising facial biometrics, to the restaurant, as well as to the profiles of software developers touting facial recognition expertise on freelancing websites.
“Essentially, what we see is that the owners of companies registered at the address of the restaurant also appear associated with freelance profiles used to sell advanced facial recognition technology to clients around the world,” C4ADS North Korea and China specialist Jason Arterburn says.
North Korea has tended to group multiple businesses together as “overseas commercial outposts,” Arterburn says, and CNN reports there are dozens of North Korean restaurants throughout Asia, with the majority found in China.