Do you remember when grocery stores didn’t know you were pregnant before your parents? Or when newspapers couldn’t find naked pictures of you by looking through your phone? Boy, those were the days (When did I get this old?). Still, there’s no escaping it. Things are digitizing. Everywhere. Whether you’re registering to vote in Washington State using Facebook or banking on your mobile phone in Kenya, there are, all of a sudden, a bunch of third-party organizations involved in the most intimate parts of your life that weren’t there before. And, for the most part, that’s a good thing. Services are delivered more quickly, collective action is easier to organize, and you can do, well, almost everything, better.
So what’s the catch? There’s a great saying: “If you are not paying for it, you are not the customer; you are the product being sold.” That’s never been more true than it is right now. The digitization of interactions means that every time we carry a smartphone, send a text message, or buy something online, we’re creating value for someone. A lot of value, as it turns out. These days, information isn’t just power; it’s big money.
Telecommunications and online services companies are posting some of the world’s largest profits by doing two transformational things: collecting huge amounts of data, and using it to increase profit margins. Everyone, from Google to governments, is realizing the value of well-defined, correlated, and usable information. This is where online service providers have made so much progress: in collecting data on nearly everything they touch, and on a number of things they probably shouldn’t. More importantly, though, these organizations have developed algorithms and refining processes that yield clear insights from otherwise unmanageably large data sets, enabling them to derive value from us and from all of their interactions with us.
As it stands, we have all given up any ownership interest we may have in that value, even though it is our interactions that create it. For the most part, the decisions about the morality of data capture, ownership, and licensing practices have been buried in the corners of unread terms of service agreements. Service providers have been free to set, and, more concerning, to unilaterally change, the agreements that determine how billions of people’s most personal information is treated. And while there have been some early challenges (like the recent outcry over Instagram’s terms of service), most web platforms do so with impunity. The legal community hasn’t been particularly quick, or consistent, in recognizing the increasingly vital role played by telecommunications technologies and the digital data they create.
Over the years, there has been a lot of conversation about the Digital Divide: the ways that comparatively sophisticated digital communication technologies increase disparities in wealth and access to services. But as governments, businesses, and vital service providers move their interactions to digital platforms like the Internet, mobile and browser applications, and SMS, they’re also creating another kind of divide: a Data Divide.
By the Data Divide, I mean the marginalization of individual interests in the collection, analysis, use, and commercialization of the data generated through digital interactions, to the disproportionate benefit of institutions and service providers. Said more simply: big companies and governments exercise an enormous amount of power over us based on the data that we give them, often unknowingly or without any real choice.
The strongest advocates for us, the individuals, have come from the online security and privacy communities. Organizations like the Electronic Frontier Foundation, the Center for Democracy and Technology, and the World Wide Web Consortium have pushed for “Do Not Track” policies and tools aimed at helping people gain control over the data that the world’s largest online platforms collect. These tools and policies, though, have faced a tough road toward achieving their goals, and there’s significantly further to go.
Still, “Do Not Track” only really creates the option to prevent the collection of personal and online interaction data, which is different from being able to download and commercialize that data yourself. In other words, the options are either to let big companies collect data on you, or to let no one do it at all.
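To see how thin that mechanism is, consider what honoring it actually looks like on the other end. Browsers that enable Do Not Track send a simple “DNT: 1” request header; whether a site respects it is entirely voluntary. Here is a minimal sketch in Python, with a hypothetical should_track helper and a stand-in analytics call, neither of which comes from any particular platform:

```python
# Minimal sketch of a site honoring the "Do Not Track" browser signal.
# The DNT header itself is real; the function names and the analytics
# stand-in below are hypothetical, for illustration only.

def should_track(headers: dict) -> bool:
    """Return False if the visitor's browser sent DNT: 1."""
    return headers.get("DNT") != "1"

def record_pageview(page: str) -> None:
    # Stand-in for a real analytics pipeline.
    print(f"logged a visit to {page}")

def handle_request(headers: dict, page: str) -> None:
    if should_track(headers):
        record_pageview(page)
    # Serving the page itself would happen here, tracked or not.

# Example: one visitor opts out of tracking, one does not.
handle_request({"DNT": "1"}, "/pricing")  # nothing is logged
handle_request({}, "/pricing")            # the visit is logged
```

The whole arrangement lives in that one if statement: the visitor can’t verify it, and gets no stake in whatever is collected when the check passes.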
I can’t help feeling like that still misses the point, like it’s a bit scorched earth. What if there were a middle ground? What if you could share in the value that you create, simply by doing whatever it is you already do on these platforms?
The simple point is that technology tools and services continue to base their billion-dollar businesses on our data. That’s not to say they shouldn’t have the right to do so; they provide valuable services that many of us enjoy for free. That said, as the digital data we create and volunteer about ourselves continues to grow in volume, value, and impact, so, too, does the importance of fair, participatory, and open conversations about how that value is used and spent. After all, it is our data, and that may make it the world’s most democratic commodity. The trick now is to figure out how to make sure it creates value for all of us.
It’s time we recognized that the conversation about personal data isn’t just about security; it’s about how we share value. And values. And until we have that conversation openly, outside of rarely read contracts, it will be the divide that pulls us apart.