
Privacy Perspectives | Make the Data Innovation Pledge


At the IAPP Privacy Academy in San Jose, I was honored and grateful to accept the Privacy Vanguard Award. In a military formation, the vanguard consists of the foremost deployed troops, often the first to encounter challenges. I have been working in privacy for 20 years now and consider myself fortunate to have had tremendous people working with me to make our way past those obstacles. Over those years I have seen the privacy profession change dramatically.

1. The Evolution of the Privacy Profession

I have been attending IAPP events since 2001, when no one needed name tags because attendees knew each other’s names. I joined the IAPP Board in 2006, when the organization had approximately 2,000 members. When I left the board in 2010 there were 7,000 members, and there are now more than 17,000. Compared to those IAPP meetings back in 2001, 17,000 is a huge number. However, compared to the way privacy touches everyone’s lives, I believe it dramatically under-represents the importance of the profession.

Over those same years the privacy profession has changed substantively. The qualitative evolution of information privacy has had at least three phases.

Phase I: Principles: Beginning in the 1970s, the US government and then the OECD articulated the Fair Information Practice Principles (the FIPPs) that have served as the foundation for our profession. Many other organizations have subsequently published modified versions of the FIPPs, but I continue to come back to the eight principles from the OECD. Those principles are both enduring and flexible. The OECD FIPPs may constantly need reinterpretation for new contexts and technologies, but they are perpetually relevant. They continue, and should continue, to provide protection for individuals while guiding the innovative use of technology and data. At Intel, we call the process of reinterpreting the FIPPs for big data, cloud computing and the internet of things “Rethink Privacy.”

Phase II: Policies: The second phase took place in the 1980s and 1990s, when the profession promoted openness and transparency by posting Internet privacy policies based on the FIPPs. The process of writing privacy policies caused privacy professionals to think about how our organizations manage personal data and how we would comply with the representations in those documents. We may have created documents that very few people read, but the increase in data processing openness was dramatic and important.

Phase III: Ethics: The privacy profession has now entered a third phase of “data ethics.” This third phase is defined by individuals demanding privacy beyond mere compliance and posted privacy policies. We all lead data-creating and data-influenced lives. The public demands to understand how individuals can exercise more control over the data that relates to them and what it means to provide reasonable opportunities to access, amend or delete some of that data. Individuals want organizations to be accountable for the impact of their use of data. We are seeing the creation of new products and services marketed to protect privacy. We regularly hear outrage about real and perceived privacy abuses. The public is saying it will not accept innovative uses of data unless it understands that our organizations can be trusted. The goal is no longer just compliance with a posted privacy policy; it is doing the “right thing” to protect individuals.

2. Data Can Drive Progress

Technology and data innovation present opportunities to enrich the lives of everyone on earth. For example, we see the potential to improve disease management, education and urban development by better using the data of which our profession is the steward. We have the ability to give individuals increased control over the data we create, while providing protections for the data others create about us. The ethical and innovative use of data can drive both individual benefits and societal gain. Every company is now a data company, and we all have a role to play in the extent to which that data will be used to accelerate progress.

The Obama Administration’s Big Data Report touched on this issue when it said, “Properly implemented, big data will become an historic driver of progress, helping our nation perpetuate the civic and economic dynamism that has long been its hallmark.” It is this opportunity for progress that makes this a critical time for the privacy profession. That progress will be impeded unless we properly implement big data, by embedding privacy into the data sets, the algorithms, the inferences and the resulting actions taken.

InBloom is one example from which we need to learn. The Gates and Carnegie Foundations spent $100 million to build an entity that would better use data to improve the efficiency and effectiveness of schools. Because of privacy concerns, that investment was lost, and even worse, so was the chance to provide new opportunities for children. The challenge before us is to not let that happen again.

Courageous organizations are now moving forward with the use of data to improve people’s lives. For example, Knewton is personalizing learning to improve student achievement. Knewton's goal is to personalize lessons for students around the world. Education companies use Knewton technology to power course materials that dynamically adapt to each student’s unique needs. By analyzing data to figure out what a student knows, Knewton recommends what to study next, helping more students master material and get ahead. Knewton-powered analytics identify knowledge gaps and predict performance to help educators, parents and administrators better support every student. Knewton has posted its guiding principles on protecting student privacy, and the first principle is that student data belongs to the student.

The American Society of Clinical Oncology has started a project called CancerLinQ to use data analytics to transform cancer patient care. Clinical oncologists see the opportunity to better analyze patient data to achieve higher quality, higher value cancer care with better outcomes for patients. CancerLinQ is an example of a learning health care system. This type of system uses data analytics to allow doctors to improve care through protected access to the data created by the care those doctors are already providing. Janet M. Marchibroda, director of the Bipartisan Policy Center’s Health Innovation Initiative, has stated, “CancerLinQ is not only addressing the needs of people with cancer. It is also providing a model for other specialties and for the medical field as a whole.”

Both of these organizations have set the objective of protecting privacy and making progress. They are not just looking to do what is legally required; they are also invested in not surprising individuals. These organizations are focused on how the data analytics will impact the individuals to whom the data relates.

3. Data Ethics

The privacy professionals working with these organizations know they must do more than just comply with the law. Legislation and regulation always trail innovation and are not sufficient to protect individuals. As one leading privacy officer told me recently, “It is not my role to define what we are legally allowed to do; it is my role to define what is right for us to do.”

In his excellent book Justice: What’s the Right Thing to Do?, Prof. Michael Sandel makes the following observation: “Justice is not only about the right way to distribute things. It is also about the right way to value things.”

It is time for the privacy profession to show we value the individual. Innovation is not a good thing for its own sake, but instead is important because of the potential it has to enrich our lives. Data innovation is most powerful when it focuses on improving individuals’ lives, such as lifting people out of poverty, reducing commutes, making food distribution networks more efficient so the hungry are fed, improving road safety, decreasing infant mortality, increasing the safety of our city streets, combating terrorism and reducing the cost of goods. Society will only allow these uses of data if the privacy profession shows it respects people’s desire for privacy. That is what it means to practice data ethics. Chief privacy officers should be integral team members on data analytics projects. Data algorithms should all be evaluated for how they will impact individuals’ privacy expectations. This will not happen as long as privacy professionals resign themselves to roles of bureaucratic paper processing.

Privacy is not about seclusion, but instead about engaging in society. Privacy protections give individuals the confidence to engage, to allow their data to be used, to participate in the digital economy. Participation in the digital economy is critical for Intel’s business model. We conduct extensive research to determine individuals’ attitudes toward the use of technology. Recently, we have focused that research on privacy. The findings demonstrate that most people have little understanding of how much of their data is exposed and what can be learned about them from the analysis of that information. However, the research also shows that people care about privacy, and when they perceive risk, they are vocal in their opposition and take action to protect themselves. One example of this trend is the large percentage of social media enthusiasts who take advantage of privacy settings and/or “self-curate” what they post to reduce their risks.

Another observation from the research was that more than half of respondents who have smartphones reported avoiding or deleting a mobile application because it requested access to personal data they viewed as inappropriate or unnecessary. However, the same research showed a substantial lack of awareness of what personal data many of these mobile applications collect. For many applications, fewer than 10 percent of the individuals who use them knew the app was accessing data from their contacts file. This gap between individuals’ concerns and their lack of awareness of how their data is being used leads to “data surprises.” These surprises can also create misimpressions and unfounded fear about how the data will be used (as many argue was the case with InBloom).

This environment of data surprises is not “privacy done right” and is not an implementation of the FIPPs. These uses of data may be compliant with the law. These applications’ data practices may be compliant with their privacy policies. However, they do not focus on what Prof. Sandel refers to as “the right way to value things.” These uses of data are not asking what the impact is on the individual. These companies do not have privacy professionals who are practicing data ethics. “People will never find out” should never be the reason to justify a use of data.

Prof. Sandel describes in his book a process to get to the “common good”: “To achieve a just society we have to reason together about the meaning of the good life, and to create a public culture hospitable to the disagreements that will invariably arise.”

It is that public conversation to “reason together” in which the privacy profession must visibly and vocally engage, if we are going to assist the data scientists in having their innovations accepted. We cannot hide underneath a cloak of privacy policies and still hope to have the necessary impact on the world.

It should be a point of pride for the privacy profession that individuals are never forced into the false choice between privacy and progress. The pursuit of both goals must be the defining value of our work. We all need to play a part in transforming the privacy profession from a focus on compliance to meeting the expectations of individuals, as we are the stewards of the data that relates to them.

4. The Data Innovation Pledge

Each year on January 28, we celebrate Data Privacy Day. It was on January 28, 1981, that the Council of Europe opened Convention 108 for signature, recognizing privacy as one of the world’s fundamental freedoms. For seven years now, organizations across the globe have recognized Data Privacy Day by holding events to raise awareness of privacy. For the past two years, Data Privacy Day has fallen just a few days after the newly formed Data Innovation Day. The close proximity of the events is a reminder that privacy and innovation should reinforce each other. For this upcoming Data Privacy Day, we need to demonstrate that the privacy profession will play a leading role in that data innovation.

To recognize Data Privacy Day this year, privacy professionals should make the Data Innovation Pledge: “I Will Promote the Ethical and Innovative Use of Data.” This pledge is a promise that we will not approach our jobs as just a checklist compliance exercise, but instead as a vehicle of social and economic progress. Doctors promise to do no harm. Lawyers commit themselves to support the Constitution. These ethical commitments help distinguish these jobs as professions. Privacy professionals should commit themselves to the ethical use of data to improve people’s lives. Privacy AND progress, not privacy OR progress.

This year by January 28, privacy professionals should reaffirm that, while compliance is important, our profession is also focused on achieving the promise of data innovation, while protecting individuals. We are focused on how we can enrich people’s lives, not on how we can fill out paperwork. All people who believe in the power of progress should make the pledge. Let’s show we recognize that data innovation that sacrifices privacy is not progress. All people who demand to have their freedoms respected should make the pledge. Everyone who believes in liberty should also have the ability to reap the benefits of data innovation. By Data Privacy Day, 100,000 people should make the Data Innovation Pledge.

Receiving the Privacy Vanguard Award is a great honor. I accept the award less as recognition of past accomplishments and more as a charge toward future leadership. Please join me in that future activity by going to #datainnovationpledge on Twitter. Let’s make today the start of our profession achieving the great accomplishments of this century.

3 Comments


  • Martin • Sep 22, 2014
    David is a forward looking leader in our community.  I join him in making the pledge.  My sense is that everyone that believes in accountability will join him in the pledge.
  • Marc • Sep 26, 2014
    We at the Network Advertising Initiative agree with David that “[p]rivacy protections give individuals the confidence to engage, to allow their data to be used, to participate in the digital economy.” We also agree that, while compliance with privacy laws is important, the role of privacy professionals and robust self-regulation like NAI is to move beyond strict compliance and “focus on achieving the promise of data innovation, while protecting individuals.”
    
    That is what NAI is all about. NAI upholds and preserves consumer trust by setting a high bar for responsible data collection and use by its members. We congratulate David Hoffman on his award and support his call to all privacy professionals. Let’s work together to promote the ethical and innovative use of data so that we can all benefit from the promise of the information age and digital economy.
  • Jennifer • Oct 6, 2014
    David, Acxiom joins you in taking the pledge. We have had a long tradition of data innovation while also considering the privacy implications. One of our long-standing mottoes is "Just because you can do it, doesn't mean you should."