A seemingly indisputable fact of life is that a person's face is his or her own; you're born with it, you die with it, and during your time on planet Earth, it is how people identify you. However, as facial recognition technologies take root, machine learning improves, and massive databases emerge, there are growing questions and concerns about how organizations collect, manage, and use these images.
"Facial images are being captured, processed and used for many purposes—and not necessarily with the knowledge and consent of the rightful owners," says Robert A. Stines, a partner and member of the Emerging Industries Team at the law firm of Freeborn & Peters in Tampa, FL.
Not surprisingly, the stakes are growing. "The way that datasets are being compiled and used is starting to draw some scrutiny, and rightly so," says Tom Kulik, an intellectual property and technology partner at the law firm of Scheef & Stone in Dallas, TX.
Facebook, Flickr, and other social media sites already use facial recognition software to identify people, while technology firms such as Amazon, Google, and IBM tap the technology to train algorithms, and law enforcement agencies and other government entities dump a seemingly endless stream of images into databases. Beyond privacy issues, there are growing concerns about organizations commercially profiting from data without compensating the people whose faces they're using.
Says Michael Zimmer, an associate professor in the School of Information Studies at the University of Wisconsin-Milwaukee, "There is a monetization issue related to facial recognition, but the technology can also become an affront to the dignity and autonomy of an individual if it's used without permission."
Facial recognition technology has moved into the mainstream of society. Today, businesses use it for biometric authentication, merchants rely on it to profile shoppers in stores, tech firms feed images into machine learning systems in order to build better AI software, and public agencies deploy it to keep cities safe. The dirty little secret is that almost all the faces digitally collected are being bought, sold, and added to databases without the explicit consent of the individuals who wear those faces. Organizations typically have free rein with the images because few, if any, restrictions or regulations exist.
In the U.S., copyright laws provide only limited protection, particularly for publicly collected images. Section 107 of the U.S. Copyright Act permits the "fair use" of copyrighted works, including images, for "purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research…." Meanwhile, Creative Commons licenses don't "prevent the use of photos in a dataset," Kulik says. "Where things become murkier and more questionable is when the images are used for commercial purposes and monetary gain."
For example, what happens if an AI-trained dataset of public-domain images is used for marketing—and it creates a substantial new revenue source for a company? What, if anything, happens if a cash-strapped public agency opts to sell video or still images of faces to businesses in order to put money in the coffers? It's not an abstract concept. Research websites already offer databases of human faces for download, and IBM reportedly drew on a Flickr-sourced collection of more than 99 million photos and 750,000 videos for AI research, without the consent of the people whose facial images were used.
The sharing of biometric data, DNA profiles, and other personal data has legal and privacy experts concerned. At present, the General Data Protection Regulation (GDPR) affords some protections for European Union (EU) citizens.
In the U.S., a few states have begun to address the issue. For example, Illinois, which passed the Biometric Information Privacy Act (BIPA) in 2008, imposes strict requirements for collecting biometric identifiers such as fingerprints, face or hand scans, and retina scans. These include: informing a person in writing that this information is being stored, explaining the purpose and length of use, and obtaining express written authorization to use the information. Violations can result in damages of $1,000 per negligent violation, or $5,000 per intentional or reckless violation.
Jennifer Lynch, Surveillance Litigation Director at the Electronic Frontier Foundation (EFF), argues that issues surrounding the monetization of facial data can't be resolved by merely focusing on ownership. "A better strategy…is to focus on the privacy and security problems that will result from the broad collection and sharing of face recognition data," she says.
EFF proposes strict limits on data collection and sharing of data like face templates, Lynch adds, including "a ban, or at least a moratorium, on government collection of face recognition data," along with "strict opt-in rules for data collection and sharing by private companies."
To be sure, the question of who owns a person's face, and who controls the digital code that face becomes, is only beginning to heat up. Concludes Zimmer, "If someone is going to use my likeness in order to train an algorithm or extract some kind of value, they're essentially taking a piece of me to accomplish the task. The question is: shouldn't I have the right to say 'yes' or 'no'?"
Samuel Greengard is an author and journalist based in West Linn, OR, USA.