Privacy Perspectives | Can We Adapt to the Internet of Things?

The so-called “Internet of Things” is emerging, and it promises to usher in profound changes that will rival the first wave of Internet innovation. As microchips and sensors are increasingly embedded into almost all the “smart devices” that we own and come into contact with, a truly “seamless web” of connectivity will finally exist. At the same time, unmanned aircraft systems (UAS), or private drones, are about to become far more prevalent. This means many modern technologies will suddenly gain mobility as well as connectivity.

The benefits associated with these developments will be enormous.

McKinsey Global Institute estimates the potential economic impact of the Internet of Things (IoT) to be $2.7 trillion to $6.2 trillion per year by 2025. The biggest impacts will be in health care, energy, transportation and retail services. Commercial UASs will offer significant benefits as well, especially in agriculture and delivery services.

Of course, as with every major technological revolution, IoT and UAS technologies will also be hugely disruptive—for both the economy and social norms. I’ll ignore the economic ramifications and instead just focus on the social issues here because these technologies give rise to what we might consider the “perfect storm” of safety, security and privacy concerns.

In terms of safety, if you thought parents already had enough to worry about with the old Internet, just wait till they learn that the Net will be quite literally woven into the fabric of their children’s lives. When “wearable computing” means every article of clothing or other accessory comes equipped with instantaneous communication and tracking capability, you can imagine how some parents (including me!) may get a little panicky.

But the privacy and security implications of IoT are even more profound, probably explaining why the Federal Trade Commission (FTC) recently announced a November workshop to discuss such issues. Some critics are already forecasting the equivalent of a privacy apocalypse with the rise of these technologies. Meanwhile, the Federal Aviation Administration (FAA) has invited comments in a proceeding “addressing the privacy questions raised… [by] unmanned aircraft systems.” Many privacy advocates fear that commercial drones will soon darken our skies and create an omnipresent panopticon.

So, what is the sensible public policy disposition toward IoT and commercial drones?

In recent filings to the FTC and the FAA, I argued that, despite all the early fear and loathing about these new technologies, policymakers should be careful not to derail them by imposing a “privacy precautionary principle.” That approach would curtail new innovations until their creators could prove that they won’t cause any harms. I argued that the default position toward new forms of technological innovation such as these should instead be “innovation allowed.” This policy norm is better captured in the well-known Internet ideal of “permissionless innovation,” or the general freedom to experiment and learn through trial-and-error.

Data—including personal information—is going to be moving fluidly through so many devices and across so many platforms that it will be extremely difficult to effectively apply Fair Information Practice Principles in a preemptive fashion for every conceivable use and application of these technologies. Specifically, it will be challenging to achieve effective notice and consent in a world where so many devices and systems will be capturing data in real-time.

So, we’re going to need new ideas and approaches.

We should continue to consider how we might achieve “privacy by design” before new services are rolled out, but the reality is that “privacy on the fly” and “privacy by ongoing norm-shaping” may become even more essential.

And this is where the role of privacy professionals will be absolutely essential.

As Deirdre Mulligan and Kenneth Bamberger have noted, increasingly, it is what happens “on the ground”—the day-to-day management of privacy decisions through the interaction of privacy professionals, engineers, outside experts and regular users—that is really important. They stress how “governing privacy through flexible principles” is the new norm. As they noted on this blog in April, “privacy work takes many forms in the firm” today with privacy professionals responding on the fly to breaking developments, many of which could not have been foreseen. To continuously improve upon this model, they argue that the “daily work [of privacy professionals] requires trusted insider status” and “full and early access and ongoing dialogue with business units.” Success, they note, “is best accomplished by a diverse set of distributed employees with privacy training who are nonetheless viewed as part of the business team.”

That is exactly right.

Going forward, privacy professionals within firms and other organizations will need to be on the frontlines of this rapidly evolving technological landscape to solve the hard problems presented by new IoT and UAS technologies. They will need to be responsive to user concerns and continuously refine corporate practices to balance the ongoing services that the public demands against the privacy-invading aspects of these technologies. They will need to get creative about data use and deletion policies and simultaneously work to educate the public about appropriate use of these new tools.

This does not mean there is no role for law here.

For both IoT and UAS, existing privacy torts and other targeted rules (like “Peeping Tom” laws) will need to evolve to address serious harms as they develop. Again, this approach should be responsive instead of preemptive. The most important role for government, however, will be education and awareness-building. Governments are uniquely positioned to get the word out about new technologies—both the benefits and dangers—and can develop messaging—especially to youngsters still in school—about appropriate use of IoT and UAS technologies.

Finally, we should not forget the important role of social norms and conventions. New technologies can be regulated by more than just law. Social pressure and private norms of acceptable use often act as “regulators” of the uses (and misuses) of new technologies, and that will likely also be the case for IoT and UAS. For example, the most powerful “regulator” of Google Glass and other wearable technologies may end up being the scowl of disapproving faces in the crowd. Meanwhile, many organizations will develop strict codes of conduct for the use of such technologies in particular venues.

In sum, we need flexible, adaptive policies and approaches going forward. The challenges ahead will be formidable, but the payoff to society for getting this balance right will be enormous.

photo credit: Cea. via photopin cc
