But most people, despite whatever they do for a living - which increasingly involves using someone else's data in a technologically oriented fashion - assume, rather guilelessly, a degree of rationality on the part of those collecting, aggregating and then using their data.
They understand that this is a transaction: you get my personal information, I get something in return, even, or especially, if it's primarily intangible: convenience, preference, speed, access. They also believe, however contractually ignorant or indifferent they may be, that those on the other side of the trade will exercise some responsibility and prudence for the sake of reputation and repeat business.
And that, of course, is where they are wrong. Because the business of business is sales. And with impatient investors, relentless competitors and ungrateful employees, a penny earned is a penny earned. Which is to say, have assets, will sell.
What business may be about to learn is - to redeploy an old saw - that to whom much is given, much is expected. And what consumers and the politicians who court them are beginning to recognize is that there need to be rules, because 'the industry,' which, in truth, is all industry, is incapable of policing itself. So what smart enterprises will do is be proactive about establishing meaningful protections and granting consumers redress if those protections are breached. Because whether they want to or not, the potential penalties - legal, financial, legislative, regulatory - will be far more painful if they don't. JL
Derrick Harris comments in GigaOm:
Even as websites, wearable computers and, increasingly, every piece of technology we touch gathers and analyzes our data, there’s still hope that privacy will survive. Making that case, however, might mean working from a different definition of privacy than we’re used to.
One cold, hard fact about data privacy is that the data-collection ship sailed long ago, never to return. With limited exceptions, consumers can’t really stop tech companies from collecting data about them. When we log into web services, make phone calls, play our favorite apps or buy the latest in connected jewelry, we’re giving those companies the right to collect just about whatever information they please about who we are and how we use their products.
This is why the White House, as part of its new consumer privacy push unveiled on Monday morning, is talking about how student data is used and smart grid data is secured rather than what's collected. It's why Federal Trade Commission chairperson Edith Ramirez, speaking about the internet of things at last week's Consumer Electronics Show, spoke about how long companies should store user data and not whether they should collect it.
The internet of things, in fact, is a prime example of why we’ll probably never be able to put a lid on data collection: because many people actually crave it. The whole point of connected devices is that they collect our data and do something with it, presumably something that users view as beneficial. If I love my fitness tracker or my smart thermostat, I can’t really be upset that it’s sucking up my data.
What I can be upset about, however, is when the company does something unethical or negligent with my data, or something I didn’t agree to (at least constructively) in the privacy policy. It seems this is where a lot of regulatory energy is now being spent, and that’s probably a good thing. (We’ll also delve into this topic at our Structure Data conference in March, with FTC Commissioner Julie Brill.)
Even if it’s forced on them, companies selling connected devices need a framework for thinking of user data not just as a valuable resource, but also as something over which they’re the stewards. Collect the data, analyze it, make your money — the whole industry is predicated on these things. But know there will be penalties in place if you do something bad, or even just stupid.
Of course, the devil here will be in the details. What constitutes an acceptable use, security protocol or retention period could vary widely based on industry, company, product, cost or any other of a number of variables. A connected car is not a fitness tracker. A smart door lock is not a connected toothbrush.
But hopefully, the attention the internet of things is getting early on means lawmakers and regulators will be able to come up with some workable, flexible and relatively future-proof rules sooner rather than later. The last thing we want — especially when dealing with data about our physical-world activity — is a repeat of the web, where it's 25 years later and we still haven't figured out what privacy means.