become mandatory for “security” settings, for instance, such as where a store might be able to identify a shoplifter during or after a crime. In the final negotiating session, he said, advocates posed the most extreme case of all: Should a company need consent from a stranger on the street before using facial recognition on them?
Szabo said NetChoice had no position on this issue. But he said in an email that requiring consent before every use of the technology would “create universal complexities that would eliminate many of the benefits of facial recognition.” He then gave examples of some of these complexities:
Would a store need to get opt-in consent from a shoplifter before using facial-recognition technology?  Should police get opt-in consent from a missing child before using this technology to find them?  And should we have to get opt-in consent from every friend and family member before we tag him or her in our own photos?
It is worth noting that some of Szabo’s hypotheticals have little to do with the consumer privacy proposal. Szabo’s first question concerns a security application of the technology—whether a shoplifter can be facially identified—that Bedoya and other advocates say was explicitly exempt from the consent requirement. His second question is about what police can do with facial-recognition technology, even though it is private individuals and companies who would be limited by any proposed consumer regulation. (And regardless, as a legal minor, the missing child’s ability to consent to facial recognition would be delegated to that child’s parents.)
First Amendment and consumer privacy experts also disagreed that a right to use facial-recognition software on someone flows naturally from “a right to take photographs in public,” as Szabo seemed to imply to Fusion.
“It is well established that an entity may be able to lawfully photograph a person on a public street, but not be able to use the photo for advertising or trade purposes without the person’s consent,” said Anita LaFrance Allen, a professor of law and philosophy at the University of Pennsylvania. “This is state law in New York, California, and most other states.”
“There is no First Amendment right to take photographs whenever and however a person might like,” said Daniel Solove, a law professor at George Washington University and the CEO of TeachPrivacy, a security-education firm. “Facial recognition is not taking photos for any expressive purpose, but to use to identify a person for many potential purposes.”
Saying facial recognition deserves First Amendment protection would mean that nearly any kind of sensor or device that captures data also deserves First Amendment protection. “What about a device that captured people’s naked images underneath their clothes?” Solove asked, adding: “The law provides extensive regulation of personal data about individuals.”
* * *
But back to the consent issue. If facial-recognition cameras can be debuted anywhere—if “mass-scale facial recognition in the wild seems inevitable”—how will companies secure consent? Will suburban malls or city thoroughfares soon be filled with bustling cam-bots, snapping pictures of passersby and then rushing after them with a legal consent form, blue pen raised high in the air?
Probably not. “There’s a million different ways you could get consent,” Bedoya told me.
Online, especially, he believes it would be easy. “When a company asks for your location, you get a little popup that says, ‘Hey can we get your location?’,” he said. “When Google and Microsoft try to opt you into facial recognition, they serve you the same kind of dialogue I believe.”
For offline contexts, Bedoya said, there would be other options. A store that wanted to scan your face when you entered could register you through a web page, he said. Even if it scanned the faces of everyone entering a retail location, the store could secure consent to identify only those who opted into its VIP customer program online. VIP customers would then be assigned a personal assistant when they walked in.
He also imagined a society-wide, opt-out program. His students at Georgetown, working with engineers at MIT, theorized a system in which consumers could sign up for a “Do Not Tag” list at the local DMV, right when they get their picture taken. The program would work something like an organ donor program, he said, except it would create a large list of “blacklisted” faces unassociated with names.
“Anyone in the state who’s using facial recognition would have to run their database against the opted-out database, and if there’s a face match, they’d have to drop that person out,” he said.
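The check Bedoya describes could, in principle, be quite simple. A minimal Python sketch of the idea, assuming faces are represented as numeric embedding vectors and using an illustrative similarity threshold (the function names, vectors, and threshold here are hypothetical, not from any actual system):

```python
# Hypothetical "Do Not Tag" check: before identifying anyone, compare each
# detected face against the opted-out database and drop any match.
# Real systems would use embeddings from a face-recognition model; these
# short vectors are stand-ins.
import math

def cosine_similarity(a, b):
    """Similarity between two face vectors, from -1 (opposite) to 1 (identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def filter_opted_out(detected_faces, do_not_tag, threshold=0.95):
    """Return only the detected faces that do NOT match the opt-out database."""
    return [
        face for face in detected_faces
        if not any(cosine_similarity(face, blocked) >= threshold
                   for blocked in do_not_tag)
    ]

# Toy data: one opted-out face, two detected faces.
do_not_tag = [[0.9, 0.1, 0.0]]
detected = [
    [0.91, 0.09, 0.01],  # near-identical to an opted-out face: dropped
    [0.1, 0.8, 0.2],     # no match: may proceed to identification
]
print(filter_opted_out(detected, do_not_tag))  # only the second face survives
```

The design choice to store faces “unassociated with names” matters here: the opt-out database never needs to know who anyone is, only what faces to discard.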
Could regulation be passed to enforce consent? Congress has not passed new consumer privacy legislation since 2009. In that time, California has passed 27 new laws to protect consumers, Bedoya said. In the past decade, too, both Illinois and Texas have passed laws requiring consent before biometric identification. (They were signed by Rod Blagojevich and Rick Perry, respectively.) And one path to nationwide regulation would be for more states to pass their own biometric laws, which would essentially constrain companies—especially those working online—in the United States.
“I don’t think people should throw their hands up,” Bedoya told me. “It is industry best practice to get opt-in consent, it’s just that industry lobbyists in D.C. have decided to take a much more hardline position.”
In an email, Szabo said that NetChoice believes transparency, not regulation, is the best way to address concerns about facial recognition.
“We are in favor of retail stores providing clear notice of facial recognition use and how that data is being collected and used,” he told me. “If a business does something that is not appealing, consumers will respond and the practice will be abandoned or the company will lose money until they fix the issue.”
He also lamented that “some privacy groups” had left the process. “We hope that these groups will return to the table.  The absence of some stakeholders from NTIA’s process won’t stop us from trying to create a workable code of conduct for facial-recognition privacy,” he said.
Microsoft had a similar refrain, though it said it would commit to more than transparency: “We believe the stakeholder process is important and that is why we are participating. Should there be a consensus that an opt-in approach be adopted, that is something that we could support.”
* * *
And what about the government? The FBI’s facial-recognition database includes 52 million faces, covering up to one-third of Americans. Despite this size, it likely lags behind commercial databases: For one, it works mostly off a single mugshot for each subject, and facial-recognition software improves with every additional image of a face it has. Instead of a single match, too, it supplies a list of 50 “top candidates,” and Techdirt has estimated that even this list has only 80 percent accuracy.
But the EFF frets about the melding of commercial and governmental resources on the issue. “Several years ago, in response to a FOIA request, we learned the FBI’s standard warrant to social media companies like Facebook seeks copies of all images you upload, along with all images you’re tagged in. In the future, we may see the FBI seeking access to the underlying face recognition data instead,” said Jennifer Lynch, an attorney for the EFF, in a statement announcing its withdrawal from the NTIA talks.
After all, traffic-camera systems that detect a speeding vehicle, scan its license plate, and mail the owner of the car a ticket already serve as this kind of robocop. They enforce the law algorithmically and without discretion. If we’re okay with legal enforcement of speed laws, would we be okay with a city installing facial-recognition systems in a “high-crime area” and automatically mailing a ticket to the home of every jaywalker or loiterer?
Which is one reason it’s so important to establish guidelines about these techniques as they apply to businesses. For, under today’s law, “those eyes, that nose, that mouth”—all the immutable features that make you look like you—are not only yours to consider, not only yours to track, and not only yours to sell.