Phones have already become external storage for our brains. One day they’ll be connected internally. We need to start discussing digital privacy now before we have no cybernetic privacy in the future.
I’m going to say something so controversial I’m not even sure I fully agree with it, at least not yet. This is complex, nuanced, life-and-death, future-of-our-society stuff, and the absolute last thing I’m going to do is take any of it lightly.
Instead, I’m going to take several things that happened this week, break them down, and then suggest how we as a people can move forward.
“What happens on iPhone, stays on iPhone.”
That’s the message Apple plastered across CES this year on an epic, building-sized poster. It wasn’t just a clever play on “What happens in Vegas, stays in Vegas,” or clever marketing given the lack of attention they got by not showing up in Vegas last year. It was a swift and brutal roshambo on Google, Facebook, and Amazon: companies that suck up your data primarily to operate on it in the cloud, but also to store and exploit it for their own gain. That stands in stark contrast to Apple, which has made it a point of both differentiation and pride to keep your data on device, operate on it there, and exploit it not at all.
Some loved it. Others hated it. Some found it spot on. Others found it duplicitous. Some would have preferred Apple to stay away. Others would have preferred Tim Cook show up at the show and deliver the message in person, as a full-on, privacy focused keynote, similar to the one he gave last year at the 40th International Conference of Data Protection and Privacy Commissioners.
Why is any of this even a thing?
Ringing in the New Year
Ring, now owned by Amazon, was yet again caught with its privacy pants down. Sam Biddle, writing for The Intercept:
Beginning in 2016, according to one source, Ring provided its Ukraine-based research and development team virtually unfettered access to a folder on Amazon’s S3 cloud storage service that contained every video created by every Ring camera around the world.
According to one source of The Intercept’s, that is. Another publication, The Information, reported on some of this last month as well, interviewing two dozen current and former employees and business partners, and reviewing scores of internal documents, presentations, communications, and more.
At the time the Ukrainian access was provided, the video files were left unencrypted, the source said, because of Ring leadership’s “sense that encryption would make the company less valuable,” owing to the expense of implementing encryption and lost revenue opportunities due to restricted access.
I’m not sure what “lost revenue opportunities” means here, unless Ring thought watching the video would give them new product ideas or, horrifically, intended to monetize what was coming off those feeds in some way?
The Ukraine team was also provided with a corresponding database that linked each specific video file to corresponding specific Ring customers
So, they didn’t just get to see what, they got to know who.
At the same time, the source said, Ring unnecessarily provided executives and engineers in the U.S. with highly privileged access to the company’s technical support video portal, allowing unfiltered, round-the-clock live feeds from some customer cameras, regardless of whether they needed access to this extremely sensitive data to do their jobs.
Only an email address was apparently needed to get into anyone’s home, which sounds absolutely conspiracy-theory nuts, until you remember Uber was caught doing something similar back in 2016, using a “god mode” to spy on exes, politicians… Beyoncé.
A second source, with direct knowledge of Ring’s video-tagging efforts, said that the video annotation team watches footage not only from the popular outdoor and doorbell camera models, but from household interiors.
Your location: For Sale. Cheap.
Earlier this week, Vice’s Motherboard reported that cell phone carriers had again been caught selling our location data to bounty hunters, debt-collectors, and others. Joseph Cox:
I gave a bounty hunter a phone number. He had offered to geolocate a phone for me, using a shady, overlooked service intended not for the cops, but for private individuals and businesses. Armed with just the number and a few hundred dollars, he said he could find the current location of most phones in the United States.
$300 to be exact.
The bounty hunter sent the number to his own contact, who would track the phone. The contact responded with a screenshot of Google Maps, containing a blue circle indicating the phone’s current location, approximate to a few hundred metres.
And how does this all work?
Although many users may be unaware of the practice, telecom companies in the United States sell access to their customers’ location data to other companies, called location aggregators, who then sell it to specific clients and industries. Last year, one location aggregator called LocationSmart faced harsh criticism for selling data that ultimately ended up in the hands of Securus, a company which provided phone tracking to low-level law enforcement without requiring a warrant. LocationSmart also exposed the very data it was selling through a buggy website panel, meaning anyone could geolocate nearly any phone in the United States at a click of a mouse.
It’s bad enough that access to highly sensitive phone geolocation data is already being sold to a wide range of industries and businesses. But there is also an underground market that Motherboard used to geolocate a phone—one where Microbilt customers resell their access at a profit, and with minimal oversight.
And that’s just this week. But the stories come out every week. Google and Facebook, so many times. So much that we risk becoming desensitized to it. That the horrific risks becoming accepted.
That’s what Apple is tackling with its very public, incredibly pro-active stance on privacy. It’s betting a large part of its competitiveness and credibility on it.
At the 40th International Conference of Data Protection and Privacy Commissioners, Tim Cook used his keynote to advocate for privacy regulation:
We at Apple are in full support of a comprehensive federal privacy law in the United States. There, and everywhere, it should be rooted in four essential rights: First, the right to have personal data minimized. Companies should challenge themselves to de-identify customer data—or not to collect it in the first place. Second, the right to knowledge. Users should always know what data is being collected and what it is being collected for. This is the only way to empower users to decide what collection is legitimate and what isn’t. Anything less is a sham. Third, the right to access. Companies should recognize that data belongs to users, and we should all make it easy for users to get a copy of…correct…and delete their personal data. And fourth, the right to security. Security is foundational to trust and all other privacy rights.
Now, there are those who would prefer I hadn’t said all of that. Some oppose any form of privacy legislation. Others will endorse reform in public, and then resist and undermine it behind closed doors.
They may say to you, ‘our companies will never achieve technology’s true potential if they are constrained with privacy regulation.’ But this notion isn’t just wrong, it is destructive.
Fines are good, fines are great. But so are criminal charges for the companies and employees who spy on us and steal our data, or enable violations and abuse, whether it’s peering through a window or doorbell camera, stalking, or selling location data.
But that’s the government protecting against abuse by companies. What about protecting against abuse by the government?
From the Snowden disclosures to the FBI’s attempt to force Apple to unlock iPhones beyond the scope of any existing laws, the government has proven itself not just incapable of self-regulating, but intent on mandating access that would cripple encryption and — no hyperbole, none, zero — destroy functional privacy for everyone.
I don’t have an easy answer to that. I only have a hard one — the right to remain private.
The recognition that our devices have become external storage not just for our data but for our minds — our memories, our ideas, our finances, our health records, our diaries, our sex lives, our most personal and private thoughts and dreams.
And, as technology progresses, our external storage will become internalized, and our biological minds will become readable, by some form of cybernetics.
And, if we don’t start talking about and preparing for the need to protect ourselves now, we’ll have a much harder time doing it then.
At the extreme, we should discuss not just the type of privilege extended to spouses, priests, lawyers, and doctors, but the type of rights against self-incrimination some jurisdictions, including the U.S., already hold sacred.
Yes, it will make law enforcement harder, the same way the lack of fingerprinting and DNA scanning at birth makes law enforcement harder. But the entire purpose of human and civil rights is to put the interests of the individual before the interests of the state. To make their work harder in order to keep our rights safer.
Some people contend the age of privacy is over. That we’ve lost it and we’ll never have it again. Not even the expectation of privacy. That we should just make peace with governments listening in on all our communications, service providers selling all our data, internet companies putting cameras and mics in our bedrooms, living rooms, children’s rooms.
That the cost savings and convenience are more than payment enough for stripping us effectively naked and leaving us spread-eagle across the internet.