
Privacy Tech | Why the Latest Pew Report Is Important for Privacy Engineers

Not long after Edward Snowden shocked the world with a series of revelations about government surveillance, I asked whether this would create a boom for the privacy-enhancing technologies (PETs) industry. In the early days of the post-Snowden era, DuckDuckGo searches went up 26 percent (six months later they had a record year), and NBC Nightly News actually did a report on differential privacy. Imagine that.

No doubt, in the two years since the leaks, there has been an onslaught of products boasting PET capabilities, from messaging services to social networks to a fully equipped PET smartphone. The Onion Router (Tor) is no longer an obscure anonymity tool, and email platforms have made it easier to use GNU Privacy Guard (GPG) encryption.

But clearly, PET services are not easy enough for the average U.S. consumer to use regularly. Last week, the Pew Research Center released a new report on “Why some Americans have not changed their privacy and security behaviors.” The results are disturbing, to say the least.

Of the 475 U.S. adults surveyed, an astounding 54 percent said they thought it would be at least “somewhat difficult,” if not “very difficult,” to find tools and strategies to protect their privacy online. Just nine percent said it would be “very easy.”

For our purposes here, however, let’s focus on the many respondents to the Pew report who said they didn’t have the time or expertise to use PETs. For example, one person said, “I do not feel expert enough to know what to do to protect myself and to know that the protection chosen is effective. Technology changes very fast.”

Just a small example: Pew’s Mary Madden noted that 31 percent didn’t know email encryption tools exist.

Plus, there is the fact that 49 percent believe it would be acceptable to monitor the communications of an individual who has “used encryption software to hide files.” This is maybe the most disturbing aspect of the survey for me personally, as it implicitly trots out the classic refrain, “I have nothing to hide.” That’s another discussion altogether, one that Dan Solove has excavated fully and convincingly. Let’s just say that the nothing-to-hide argument has more bull than a Texas cattle farm.

Case in point: In another recent Pew survey, a full 91 percent (I repeat, 91 percent) of U.S. adults said they “agree” or “strongly agree” that consumers have lost control of how their personal data is collected and used by companies.

Clearly, there’s a strong disconnect here: U.S. adults value their privacy, but they either naïvely feel they don’t need to do anything to protect it or lack the time and expertise to do so.

Enter the privacy engineers.

This is a fantastic opportunity for developers, engineers, privacy professionals and others designing technology to step into this breach and make privacy protection easy. Now, I know such a challenge is far from simple. Using basic GPG encryption is nowhere near as simple as sending an email via Gmail. It takes background knowledge, caution and a network of trust.
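To make that friction concrete, here is a minimal sketch of what even “basic” GPG encryption involves, written against the third-party python-gnupg wrapper. It assumes GnuPG itself is installed locally, and the key file name and fingerprint are placeholders, not real values. Before a single message goes out, you have to obtain the recipient’s public key, decide whether to trust it, and handle the armored output yourself.

```python
# A minimal sketch of what "basic" GPG encryption involves, using the
# third-party python-gnupg wrapper. Assumes GnuPG itself is installed;
# the key file name and fingerprint below are placeholders.
import gnupg

gpg = gnupg.GPG()  # uses the default local keyring

# Step 1: obtain the recipient's public key out of band and import it.
with open("friend_public_key.asc") as key_file:
    gpg.import_keys(key_file.read())

# Step 2: decide whether the key really belongs to your friend -- the
# "network of trust" part that no library can automate for you.
recipient_fingerprint = "0123456789ABCDEF0123456789ABCDEF01234567"  # placeholder

# Step 3: encrypt the message to that key and send the armored output.
encrypted = gpg.encrypt("Meet at noon.", recipients=[recipient_fingerprint])
if encrypted.ok:
    print(str(encrypted))  # ASCII-armored ciphertext to paste into an email
else:
    print("Encryption failed:", encrypted.status)
```

Compare that to typing an address into Gmail and hitting send; the gap between the two is exactly the usability problem privacy engineers need to close.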

Plus, I’ll admit, many of these great PET services—take Wickr, for example, or Signal, for another—are useless for me personally because none of my friends use them. Again, it’s just easier to use SMS texting or basic email. Or even a phone call!

Yet we know from the do-not-track debates that default settings are the key to user behavior: most users simply stick with whatever defaults they are given. That’s why it’s important for developers and designers to make PET services as easy for users as possible.
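To put that design principle in concrete terms, here is a small, purely hypothetical sketch; the setting names are invented for illustration and not taken from any real product. The point is that the privacy-protective choice is the default, so a user who never opens a settings screen still gets the protective behavior.

```python
# Hypothetical settings object -- the names are illustrative only.
# The privacy-protective choice is the default, so a user who never
# touches a settings screen is still covered.
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    send_do_not_track: bool = True       # honor Do Not Track by default
    encrypt_messages: bool = True        # encryption on unless turned off
    share_usage_analytics: bool = False  # analytics are opt-in, not opt-out
    history_retention_days: int = 0      # keep nothing unless asked to

# Most users will run with exactly this:
print(PrivacySettings())
```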

To their credit, Facebook, Google, Yahoo and Apple, among others, are now encrypting their services by default. Last Friday, Google also announced it’s taking steps to encrypt its ads products. Similarly, Netflix announced it will use HTTPS, hopefully prompting other popular services to implement encryption as well.

Yes, default encryption revives the Crypto Wars (call it Part Deux), pitting law enforcement’s concerns about combating crime and terrorism against security advocates’ warnings that weakening encryption undermines the Internet’s integrity (we’ll have to save that debate for another time). But the point here is that default encryption is a great example of making PETs easy. It’s designed right into the service or product, and the consumer doesn’t even have to think about it.
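As a rough illustration of what “designed right into the product” can look like, and not any particular company’s implementation, here is a sketch using the Fernet recipe from Python’s cryptography package: the storage layer encrypts on write and decrypts on read, and the user never sees a key.

```python
# Rough illustration only, not any vendor's actual design. Data is
# encrypted before it is stored and decrypted on read; key handling is
# hidden inside the storage layer, so the user never thinks about it.
from cryptography.fernet import Fernet

class EncryptedStore:
    def __init__(self):
        # In a real product the key would come from a key-management
        # service; generating it here keeps the sketch self-contained.
        self._fernet = Fernet(Fernet.generate_key())
        self._blobs = {}

    def put(self, name, text):
        self._blobs[name] = self._fernet.encrypt(text.encode("utf-8"))

    def get(self, name):
        return self._fernet.decrypt(self._blobs[name]).decode("utf-8")

store = EncryptedStore()
store.put("note", "draft message")  # stored as ciphertext, transparently
print(store.get("note"))            # prints "draft message"
```

The design choice that matters is that encryption is not a feature the user has to enable; it is simply how the product stores data.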

The company gets to brag that it protects consumer privacy and data security, while the consumer’s privacy and security are actually protected. Really, it’s a win-win. Of course, the company had better live up to such a promise, unless it wants to attract a Wall Street Journal headline, unhappy consumers and some interested regulators.

Expect new incentives for PETs to continue. In the month marking the two-year anniversary of the first Snowden leaks, key surveillance provisions of the USA PATRIOT Act, including Section 215, are due to sunset unless reauthorized by Congress. Companies like Mozilla are trying hard to drum up support for surveillance reform and an end to bulk collection. Will Congressional action or inaction once again spur consumer demand for PETs?

I think, for a time at least, yes it will.

Though this week’s Pew results don’t reflect well on U.S. consumers’ will to protect their privacy, consumers around the world are capable of taking measures to cover their tracks online. Just look at Australia: after the government passed a hugely controversial data retention law, VPN use skyrocketed, with one provider claiming a 500-percent increase in VPN subscriptions in the Australian market.

This shows consumers will go the extra mile if they need to, but why not build that protection in for them from the start?

photo credit: Maze Puzzle (Blender) via photopin (license)

1 Comment


  • Chris • Apr 22, 2015
    Great article! As an Information Security practitioner who found his way into the privacy industry, I'm happy to see the term "Privacy Engineer" slowly making its way into our field's lexicon. Privacy engineers do the same kind of work as security engineers, only with a focus on privacy controls instead of security ones. In my experience, the 49 percent who believe it's okay to monitor those who encrypt their files are thinking of OTHER people, not themselves. It's not an "I have nothing to hide" issue so much as it's a "what does THAT person have to hide" one. This is a state of mind that privacy pros are going to have to work to change in the minds of our consumers and our governments. If good privacy applies to you, then it applies to me too (until such time as one of us has been shown to have violated that trust).