About a week ago I was having a conversation with a group of women, all of whom happened to be women of color. We were sharing our most recent experiences of an all-too-frequent occurrence: being misidentified by people who greet you by the name of that other black, or Asian, or Latina coworker/yoga classmate/carpool driver. The old 'we all look alike' is annoying at best, infuriating at worst. Why is it so difficult to tell the difference between us, especially when body sizes, hairstyles, skin color, and facial features are distinctly different?
The human-to-human misidentification is bad enough, but the non-human kind is downright frightening. Apparently now even computers and the people who program them have the same problem. Facial recognition technology is rapidly becoming the standard in security and in the day-to-day lives of regular Americans. About a month ago, the ACLU reported its study of Amazon's technology, appropriately called Rekognition. Here's what they did: The ACLU collected 25,000 police criminal mug shots and used Rekognition to compare those with the photos of Congress members. All 535 members of Congress. Guess what? 28 of the lawmakers were positively matched with the mugshots, and most of those were Congresspersons of color. Amazon said the ACLU didn't use Rekognition correctly. But it seems all the facial recognition programs have trouble identifying people of color. That means a lot of innocent people will be erroneously ensnared, at risk for intense search and/or arrest. And what happens to the data gathered from these scans? The TSA says the data is not stored, nor will it be shared. But if you believe that, I've got a couple of bridges for sale.
Meanwhile, the Department of Homeland Security is moving ahead with installing facial recognition in the nation's airports. Orlando's airport was the first to require all passengers, including Americans, arriving and departing to have their faces scanned. Americans probably don't even know they can opt out, but if they do, they are sent through the same longer process non-Americans endure. Airports like Boston's Logan are using face scans for some departing international flights but don't yet require all international travelers to be scanned. Recently, a 26-year-old traveling on a stolen French passport was caught by facial recognition at Dulles International Airport, a first-time apprehension.
But I don't feel safe with technology I already know is likely to work against me. Several studies document consistent errors, particularly with darker-skinned women. Massachusetts Senator Ed Markey has taken the lead in calling for an investigation into the use and potential abuse of facial recognition. But right now, as its use spreads via government agencies and beyond, there is nothing to rein in a technology that literally can't identify all of us.