The latest National Telecommunications and Information Administration (NTIA) meeting on drafting a voluntary code on commercial uses of facial recognition technology was short, but not for lack of contentious debate. Quite the opposite, really. As promised at the last NTIA meeting, the trade association representing the biometrics industry had prepared for consideration a draft of principles on a code (released to the public earlier this week), and there were a few people who didn’t like what it had to say—at all.

And it’s a particularly heated time to talk facial recognition: More than 30 privacy and civil liberties groups sent the U.S. Justice Department a letter this week asking that it complete a “long-promised audit of the FBI’s facial-recognition database,” while behavioral authentication company BioCatch just announced it has raised $10 million to expand its biometric authentication platform.

The draft framework was circulated publicly on June 17 by the International Biometrics & Identification Association (IBIA) and promotes the ideas of transparency and protection of data as “fundamental privacy tenets.” It calls for the protection of all facial-recognition data but says no single policy is going to work in every application and, therefore, implementers and operators should be left to decide what kind of data protection is necessary on a case-by-case basis.

The draft repeats common refrains in its assertions that privacy is vulnerable in our society because of the influx of online data, big data practices and covert surveillance matters. In fact, it asserts that biometrics are a “major privacy-enhancing technology” that can protect against fraud and identity theft, among other crimes.

But the part of the framework that had consumer groups such as the American Civil Liberties Union (ACLU) and the Center for Democracy and Technology (CDT) distraught was this line in the draft: “There is no anonymity if we choose to live in society.”

The ACLU’s Chris Calabrese was the first to take Walter Hamilton, vice chairman of the IBIA, to task, calling for clarification of the statement.

“Those are very strong words,” Calabrese said after reading the passage aloud to the room. “I’m surprised by them. I mean, anonymity is a pretty fundamental value in American society.”

But Hamilton said anonymity and privacy are, and should be, distinguishable concepts.

“I value my privacy. I don’t want to be intruded on or contacted by an organization trying to sell me a product when I’m having dinner with my family at six in the evening,” he said. “The marketing association may know who I am; they may know some of my demographics from the public data they’ve aggregated and analyzed. But that doesn’t give them the right to invade my privacy by disturbing me.”

However, Hamilton added, if we want to be productive members of society, we’re constantly required to identify ourselves to receive privileges like driving a car or opening a bank account.

When the Founding Fathers were drafting our Constitution, he argued by way of example, “everyone in the village knew everyone in the village. There was no expectation of anonymity unless they moved away from the village. This concept of anonymity and having no one know your identity is really something unrelated to facial recognition for the large part in our society.”

Looking aghast, Calabrese moved to turn on his microphone, then, with a gesture that said, in effect, “Do you want to handle this?”, yielded to the CDT’s G.S. Hans to respond.

Hans called Hamilton’s assertion a “somewhat cramped view of anonymity.”

“I don’t think the objection to the anonymity language here is that we don’t think there are reasons that people should identify themselves in certain contexts,” he said. “But to broadly throw out the notion that anonymity is not relevant or worth preserving I think is inaccurate and I think would, if wrongly construed, raise some of the issues that we identify that broad facial recognition and unchecked surveillance practices could raise.”

New York University’s Travis Hall, a researcher in biometrics and surveillance studies, supported Hans’s point.

“It’s not that as a society, to function in society, you have to give up your ID at all times,” he said. “In certain circumstances and places, you do have to produce your ID, but it also seems that in certain circumstances and certain places you shouldn’t have to.”

The crux of the matter is not the biometric data on its own, Hall said; it’s what that biometric template can be linked to that makes it potentially harmful.

Asked by The Privacy Advisor whether the fundamental difference in opinion may be a divide too great to bridge in the name of drafting a code both industry and consumer groups can support, Calabrese wasn’t sure.

“I don’t think what’s in that whitepaper is, or should be, the industry position,” he said. “But that is up to them to decide.”

The next meeting is scheduled for July 24 in Washington, DC.

Written By

Angelique Carson, CIPP/US
