
Top scholars and practitioners tackle privacy’s complex challenges


""

What happens when the world’s top privacy thinkers and practitioners get together? For those involved with the annual Privacy Law Scholars Conference (PLSC), a valuable conduit between scholarship and practice emerges.

Backed by George Washington University Law School and Berkeley Law, the PLSC is a forum for legal, information privacy, economics, philosophy, computer science and political science scholars to share and craft ideas with industry, advocacy, legal and government practitioners.

“You often hear academics say that practitioners don’t care about scholarship, or vice versa,” says PLSC co-organizer Daniel Solove. “But the practitioners really love this conference and really get along with the scholars.”

A world-renowned information privacy scholar and George Washington Law School professor, Solove says a genuine dialogue exists between academics and practitioners. “This is great news for our field. This kind of conversation to share ideas doesn’t exist in other fields,” he notes. “Practitioners are beating down the door to join this event…It’s really an exciting time.”

With rapid developments in technology, increased reliance on Big Data and proposed changes to regulatory frameworks in many countries across the globe, the time is ripe for understanding and shaping the privacy discussion. From government surveillance to self-regulatory codes of conduct, the topics ranged from the grand to the granular.

Solove said one common thread at the PLSC was a slew of empirical scholarship. “Among the big topics this year,” said Solove, “were Big Data, mobile privacy and the U.S. v. Jones Supreme Court case. There was a wide array of topics, and the paper quality was stellar.”

Other themes dealt with privacy fairness and operationalizing Privacy by Design, and in two such cases, researchers were recognized by the IAPP for their efforts.

Privacy fairness and user control

“I think one of the cross-cutting themes that came up is the intersection of privacy and fairness considerations,” said Omer Tene. “Many privacy concerns are driven by a fear of discrimination—including price discrimination—social stratification, etc.”

Tene is an associate professor at Israel’s College of Management Haim Striks School of Law. He, along with Future of Privacy Forum Co-Chair and Director Jules Polonetsky, CIPP/US, submitted a paper to this year’s conference.

Soon to be published in the Northwestern Journal of Technology & Intellectual Property, “Big Data for All: Privacy and User Control in the Age of Analytics” looks into the beneficial uses of Big Data while noting the privacy concerns that stem from the “data deluge.” Tene and Polonetsky posit that the resulting concern for privacy “could stir a regulatory backlash dampening the data economy and stifling innovation.” Policymakers, they argue, need to confront foundational principles that underpin privacy law. Defining personally identifiable information, distinguishing the role of individual control and instilling data minimization and purpose limitation are all concepts that should be addressed. They emphasize “the importance of providing individuals with access to their data in usable format,” thereby allowing individuals to “share the wealth created by their information” and give developers incentives to harness “the value of Big Data.”

Privacy fairness and discrimination are themes found in the ongoing and award-winning work of Alessandro Acquisti and Christina Fong, both from Carnegie Mellon University. Co-winners of the IAPP Privacy Law Scholars Conference Award, their research has looked into hiring discrimination patterns through the use of social networks by employers. Taking a two-pronged approach, Acquisti and Fong have used survey and field research to uncover current hiring practices in a Web 2.0 world, suggesting that regulatory frameworks are being outpaced by technological developments.

In “An Experiment in Hiring Discrimination via Online Social Networks,” Acquisti has worked on the privacy angle while Fong has worked on discrimination issues through an economic lens.

“We started research for this quite a few years ago,” said Acquisti, “when it was unclear whether or not employers had started using social media in general to screen candidates.”

The design of their experiment has been complex, he notes, as it’s difficult to define how certain shared information impacts firms’ hiring decisions. As people continue to share vast amounts of personal data on social networking sites, and as laws governing what can be asked in job interviews remain unchanged, the temptation for employers to mine prospective candidates’ online profiles—off-the-record—will most likely continue.

Putting the design in Privacy by Design

The IAPP PLSC Awards also recognized a study that delved into the practical side of Privacy by Design. New York University Information Law Institute Adjunct Prof. and Senior Fellow Ira Rubinstein and Nathan Good, principal at Good Research LLC, teamed up to flesh out what designing for privacy really means for engineers working to create software systems and services.

Taking a retrospective look at 10 major privacy incidents that have affected Facebook and Google, Rubinstein and Good describe their paper as “the first comprehensive analysis of engineering and usability principles specifically relevant for privacy.”

Rubinstein said he initially wanted to speak with companies using Privacy by Design but had difficulty finding an adequate number. “So I decided to change gears. There was sufficient information available from a number of public sources and press reports.”

Good said he had been working with companies on design solutions and Privacy by Design kept coming up. “People were asking, ‘what do we do to do it?’ I said, ‘well, I don’t know.’ If a company wanted to do Privacy by Design concretely, how would we put that together?”

Rubinstein added, “In particular, we wanted to look at the nuts-and-bolts from the engineering and design side—software engineering, visual design and human computer interaction (HCI). There have been studies conducted more broadly, but they involved governance, process and notions of accountability, not design for engineering.”

“Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents” not only looks into translating the Fair Information Practices (FIPs) “into engineering and usability principles and practices” but illustrates them through 10 highly publicized privacy incidents. For Google, incidents involved Gmail, Search, Street View, Buzz and the company’s new privacy policy; for Facebook, they involved changes to News Feed, Beacon, Facebook Apps, Photo Sharing and changes to its privacy policies and settings.

“Indeed,” Rubinstein and Good write, “despite the strong expressions of support for Privacy by Design, its meaning remains elusive.” Through their analysis, the authors conclude “all 10 privacy incidents might have been avoided by the application of these privacy engineering and usability principles.” They also contend the primary challenge to effective Privacy by Design “is that business concerns often compete with and overshadow privacy concerns.”

One thing that interested Good was figuring out how to create a design methodology. “At the end of the day, design is really hard! And privacy is a difficult concept.”

In the context of the massive amount of information shared daily by individuals—whether through online purchasing, social networking or the use of mobile apps—backend encryption and security along with notice and consent “aren’t as helpful when looking at design literature,” said Good, noting that “privacy for the consumer is secondary when sharing data online.” Good said designing for reluctant users or for users’ secondary considerations is challenging.

Rubinstein added that a lot of privacy engineering issues require design of server tools for managing websites or compliance with data retention policies, whereby systems are built to delete data after certain time periods. Yet, when someone is sharing personal information on Facebook, what kinds of settings or tools are available to call the users’ attention to their actions?
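To make the retention example concrete: a rule of the sort Rubinstein describes typically boils down to a scheduled deletion job. The sketch below is a minimal illustration, not anything drawn from the paper; the table names, retention windows and DB-API-style connection it assumes are all hypothetical.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention windows; in practice these would come from
# legal and compliance requirements, not hard-coded values.
RETENTION_POLICIES = {
    "search_logs": timedelta(days=90),
    "location_history": timedelta(days=30),
}


def purge_expired_records(conn):
    """Delete rows older than each table's retention window.

    `conn` is assumed to be a DB-API-style connection (e.g., sqlite3)
    whose tables each carry a `created_at` timestamp column.
    """
    now = datetime.now(timezone.utc)
    for table, max_age in RETENTION_POLICIES.items():
        cutoff = now - max_age
        # Table names come from our own trusted mapping above, so
        # interpolating them is safe; the cutoff is parameterized.
        conn.execute(f"DELETE FROM {table} WHERE created_at < ?", (cutoff,))
    conn.commit()
```

Run on a schedule (cron or a task queue), a job like this is what “systems built to delete data after certain time periods” usually means in practice.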

The authors analyze two design perspectives—privacy engineering and usable privacy design. The former centers on building the software required for “satisfying the abstract privacy requirements embodied in the FIPs,” and draws from technical design literature. The latter ensures “that users understand and benefit from well-engineered privacy controls,” and “finds inspiration” in the work of social psychologist Irwin Altman and privacy philosopher Helen Nissenbaum, “both of whom analyze privacy in terms of social interaction,” Rubinstein and Good write.

Good said that icons, “or elements that enable the user to understand what they’re doing as they’re doing it,” are an important concept. “Checking off a wall of boxes expressing privacy preferences” works up front, but six months later you don’t remember, Good said. “We need something that draws your attention to these issues as you’re doing it.”
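Good’s point is essentially what the usability literature calls just-in-time notice. As a rough sketch, and assuming an entirely hypothetical sharing flow (none of these names reflect any real platform’s code), the cue might be wired in like this:

```python
def share_post(post: dict, audience: str, confirm) -> bool:
    """Publish a post, surfacing a contextual warning at the moment of action.

    `confirm` is any callable that shows the user a message and returns
    True or False; in a real UI it would raise a dialog.
    """
    if audience == "public" and post.get("contains_location"):
        # The cue fires as the user acts, rather than relying on a
        # settings checkbox ticked months earlier.
        if not confirm(
            "This post includes your location and will be visible "
            "to everyone. Share anyway?"
        ):
            return False
    print(f"published to {audience}: {post['text']}")  # stand-in for a real publish call
    return True


# Example: a console prompt standing in for a dialog box.
share_post(
    {"text": "Heading out now", "contains_location": True},
    "public",
    lambda msg: input(msg + " [y/N] ").strip().lower() == "y",
)
```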

For a designer, this is no small task. “From a usability perspective,” he added, “think of the computer screen as real estate. Different parts have different value, and with limited real estate, businesses want to maximize value on the screen.” Good said that engineers need to work closely with business units for optimization.

And he emphasized that designing privacy in from the beginning is a challenging endeavor, deserving a lot of attention.

“I think getting engineers and members of the IAPP involved in discussions around privacy is important,” said Good, adding, “This community has a lot to offer.”
