When terms and conditions apply
In my film Terms and Conditions May Apply, I show how companies have used contracts of adhesion to legally sweep up as much data as possible without meaningful consent. There is a notion that companies have a right to any data they can accrue, no matter how personal, no matter if the person surrendering the information is even aware it’s happening.
First, we should rethink the word data. It’s an unintentionally brilliant PR job that perfectly masks the truth. Data is us in 1’s and 0’s. It’s a digital picture of our brains and habits, cataloguing our movements, desires, intentions, and thoughts. This is powerful stuff. It’s too powerful to be signed away with an uninformed click of a mouse.
This summer, Eric Schmidt of Google came out saying that users have ‘no reasonable expectation of privacy’. He can make this argument largely thanks to terms and conditions, as well as something called the third-party doctrine. If you’re unfamiliar, this is a legal principle, established by court rulings in the late 1970s, holding that when you hand over your information to a third party, you lose control over that data. This is what allows big data to persist, and what allows the government to access it.
But is Mr Schmidt’s statement valid?
We have fought in the courts for years to prevent wiretapping, and there’s a severe penalty for opening up someone’s mail. The FBI or NSA are not simply allowed to come into your home, look at your photos, read your journal and make assessments about what kind of person you are. The problem is, users imagine that the Constitution still applies online. In reality, the Fourth Amendment is overruled by terms and conditions and by judicial precedents like the third-party doctrine, rendering Schmidt’s statement technically valid even though it’s a sneaky trick. If this is the basis for how corporations legally share big data with the government, then perhaps we must rethink big data - where there is big data there is big surveillance.
In my film I show multiple examples of what happens when big data goes wrong. When phrases or search terms are taken out of context. When a protestor can be arrested before they even pick up their activist sign. When a man making an obvious joke on Twitter is detained and deported. Turns out, the government doesn’t have a sense of humor.
Almost every other day a new story is revealed about how spy systems are being used to subvert democracy. Whether it’s building psychological profiles that can be used as leverage (like looking at porn habits), monitoring conversations that get handed over to a domestic drug authority, or preventing whistleblowers from talking to the press, the systems are being used for far more than limiting terrorist attacks. And here’s the best part: when you get arrested, or leveraged, or audited, you won’t even know it was because of the surveillance state you now live in - it’ll all happen invisibly.
The effects are chilling
The only way out is for us to rethink how ownership of data works online. We should not be ‘data mines’ for companies, but instead symbiotic partners who share information transparently and freely with each other. The data set that any company has on a user should be the property of the user. The whole data set. And the user should be able to take that data with them and leave the relationship with a company, as they see fit. We should have a right to know what information a company has stored on us, as well as a right to control that data. Because if we’re going to have digital identities, then those identities should be owned by us, regardless of any legal tactic a company might employ.
I’ll be the first to admit that so long as it’s legal for companies to accrue every piece of data they can get, it’s unreasonable to expect them to simply change their habits for purely ethical reasons. And there are plenty of things we can garner from big data that help society as a whole, things like predicting disease patterns and the flow of traffic. But while we wait for legislative fixes that may never come, it may be time to consider innovative solutions.
If you’re in the business of statistics, I’m not suggesting that we pull the carpet out from under you. Rather, I am suggesting that we rethink what data is valuable for the betterment of society, how that data can be effectively anonymized and separated from its source, how safeguards can be put in place to ensure personal data cannot be merged and sold, and how we can move away from the thoughtless collection of all data towards specialized collection that is the result of informed consent.
All those phone recordings that the man was boasting about at the beginning of this article - those are real people sharing private thoughts. If the trade for the service is that the conversation will be recorded, sold, and potentially used against the user in the future, then at the very least there needs to be meaningful consent. But ultimately, we need to rethink the entire model of paying with our data. We need to be more than data mines. Where there is big data there is big surveillance, and if the honeypot exists, it will be exploited.