Real Privacy Tools for Big Data

New information technology is at the heart of so many of the privacy challenges we face today—big data, the Internet of Things, metadata analysis—but the technology we actually use to solve privacy problems is just this side of the Stone Age.

The key tool an enterprise uses today is a set of privacy policies written out on paper. This is necessary, of course, but hardly sufficient. Forward-looking organizations have a "privacy program" following the recommendations of “accountability” advocates, with training, clear identification of who is responsible for ensuring rules are followed, regular review of procedures and perhaps even a regular manual audit. These are all valuable steps, but alone, they cannot possibly keep up with the scale, scope and velocity of data being processed by today’s large, information-intensive organizations. They amount to hoping that, with sufficient training about procedures, knowledge workers dispersed throughout an enterprise will do the right thing. Of course, training is necessary, but without actually keeping track of what happens with data, organizations can suffer major failures (remember Uber).

A new approach to privacy management is needed, one that enables organizations to handle data at scale while remaining consistent with high standards of privacy protection.

In 2008, Harold Abelson, Tim Berners-Lee, Joan Feigenbaum, James Hendler, Gerald Sussman and I published a paper titled “Information Accountability.” We showed that encryption and access control were not sufficient to protect privacy. Instead, we needed scalable, automated systems to track and analyze actual information usage against machine-readable rules. With accountable systems in place, it is possible to warn personal information users when they may be doing something wrong with data. These systems can also conduct ongoing audits of data usage, either for internal consumption or to build public trust. Our research at MIT has flourished, and many other computer scientists from around the world have advanced the field of accountable systems.
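
To make the idea concrete, here is a minimal sketch, in Python, of what checking actual data usage against machine-readable rules could look like. This is not the system described in the paper or any MIT prototype; the rule fields, event fields and warning logic are all illustrative assumptions.

```python
# Minimal sketch (illustrative only): a machine-readable data-use rule and a
# check of a logged usage event against it, producing a warning rather than
# blocking access. All names and fields here are assumptions.
from dataclasses import dataclass

@dataclass
class UsageRule:
    data_category: str          # e.g., "location_history"
    allowed_purposes: set[str]  # purposes for which use is permitted
    allowed_roles: set[str]     # roles permitted to use the data

@dataclass
class UsageEvent:
    user_role: str
    data_category: str
    purpose: str

def check_event(event: UsageEvent, rules: list[UsageRule]) -> list[str]:
    """Return human-readable warnings if the event violates an applicable rule."""
    warnings = []
    for rule in rules:
        if rule.data_category != event.data_category:
            continue
        if event.purpose not in rule.allowed_purposes:
            warnings.append(
                f"Purpose '{event.purpose}' is not permitted for {rule.data_category}."
            )
        if event.user_role not in rule.allowed_roles:
            warnings.append(
                f"Role '{event.user_role}' is not permitted to use {rule.data_category}."
            )
    return warnings

# Example: an analyst uses location history for ad targeting, which the rule forbids.
rules = [UsageRule("location_history", {"fraud_detection"}, {"security_analyst"})]
event = UsageEvent("marketing_analyst", "location_history", "ad_targeting")
for w in check_event(event, rules):
    print("WARNING:", w)
```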

The public policy world is recognizing the need for automated accountability. As part of the White House Big Data Privacy review, the President’s Council of Advisors on Science and Technology identified the need for accountable systems. They found, “Although the state of the art is still somewhat ad hoc, and auditing is often not automated, so‐called accountable systems are beginning to be deployed.” The ability to detect violations of privacy policies, particularly if the auditing is automated and continuous, can be used both to deter privacy violations and to ensure that violators are punished.

The National Research Council report on bulk collection of signals intelligence data, ordered by the president to address civil liberties issues raised in the wake of the Snowden disclosures, found: “Auditing usage of bulk data is essential to enforce privacy protection. Greater automation of auditing is an area that has been greatly neglected by government, industry and academia ...”

And in reviewing Uber’s privacy failures, privacy expert and Hogan Lovells Partner Harriet Pearson, CIPP/US, recommended that “Uber implement additional tools and written procedures that will help automate and further embed compliance with the company’s access control policies into day-to-day operations.”

From our research and product development experience, we see four key features necessary for any information accountability solution:

  1. Common and simple language to create data use rules. Data users and privacy professionals should be able to create and implement rules without the need for IT support. Changes must also be easy to make and apply automatically to all data. A change in government regulation should not cause major disruption for business line owners (see the sketch after this list).
  2. Shared repository of policies and rules that apply to data held across the organization.
  3. Automated, real-time reasoning about data usage against these rules. Manual, point-in-time procedural audits are no longer sufficient, no matter how automated the audit reporting might be.
  4. Continuous monitoring and reporting. If privacy adherence exceptions arise, real-time alerts should be accompanied by an easy-to-understand explanation of why the behavior in question is inappropriate. Privacy professionals should be able to view compliance status at any point during monitoring.
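
As a rough illustration of features 1, 3 and 4, and not of any particular product, the sketch below keeps rules in one shared, declarative structure that a privacy professional could edit without writing code, checks each usage event against them as it arrives, and emits an alert with a plain-language explanation while keeping a running compliance status. All names and fields are hypothetical.

```python
# Illustrative sketch, not a real product: declarative rules in a shared
# repository, real-time checking of usage events, alerts with explanations,
# and a running compliance summary. All names and fields are assumptions.
from collections import Counter

# Feature 1/2: shared repository of rules, in a form that needs no IT support to edit.
RULE_REPOSITORY = [
    {"data": "customer_email", "allow_purposes": ["service_notification"],
     "explanation": "Customer email addresses may only be used to notify "
                    "customers about the service they signed up for."},
    {"data": "trip_history", "allow_purposes": ["billing", "safety_review"],
     "explanation": "Trip history may only be used for billing and safety reviews."},
]

compliance = Counter()  # running compliance status for reporting

# Features 3/4: check each usage event against the shared rules as it arrives.
def monitor(event: dict) -> None:
    for rule in RULE_REPOSITORY:
        if rule["data"] != event["data"]:
            continue
        if event["purpose"] in rule["allow_purposes"]:
            compliance["compliant"] += 1
        else:
            compliance["violations"] += 1
            # Real-time alert with an easy-to-understand explanation.
            print(f"ALERT: {event['actor']} used {event['data']} for "
                  f"'{event['purpose']}'. {rule['explanation']}")

# Example stream of usage events (illustrative).
for evt in [
    {"actor": "billing_service", "data": "trip_history", "purpose": "billing"},
    {"actor": "growth_team", "data": "customer_email", "purpose": "ad_campaign"},
]:
    monitor(evt)

print("Compliance status:", dict(compliance))
```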

A system with these characteristics will enable transparency and accountability. Customers and business partners can trust that their information is being used appropriately. Organizations that employ the latest accountable systems technology can earn customer trust and avoid harmful or embarrassing misuse of data. CPOs and general counsels can sleep easier, knowing that data users are not violating laws, regulations and company policies.

1 Comment

  • Tom • Mar 5, 2015
    Interesting concept, and it runs parallel in some ways to accountable access tools that would be looking for inappropriate access behaviors in real time and alerting on them. However, those rules would be relatively simplistic compared to 'usage rules', which are highly contextual, one would think. Also, while your SOC might be capable of handling real-time alerts on inappropriate access profiles, the thought of investigations teams trying to think through privacy-related issues of legitimate use makes me wonder how effective that would be with the current skill set in those areas.