Encryption and food for thought for future solutions
This morning I read about The NSA’s Hidden Spy Hubs in Eight U.S. Cities. We pretty much all know that this is happening, though perhaps not to that extent.
The first example of a type of surveillance that we saw on a wide scale was credit cards. Every time we make an economic transaction a record has to be made of who we did business with, what we did with them, how much money we spent, where we were physically located, and what time it took place — all of which goes into a server.
It makes it easier to buy and sell things. We participate very willingly; it’s convenient.
The same applies to the communication services and search engines that we use.
We certainly cannot count on the government to respect or help protect our privacy (the constitution itself is not clear about it). Regulation is backward-looking: the GDPR only came after some of the most high-profile data breaches had already happened. We will need to “hack” our way to privacy.
In this post, I describe how our messages, and with them our lives, travel through different platforms and why they end up in the hands of third parties. I will also describe some tangible solutions to keep in mind every time we design new products.
How are our texts transmitted over these apps?
When sending a message to a friend, we assume the message is securely traveling between phones and computers.
Every time we send a message, it travels encrypted from our device to a server.
By encrypted we mean that no one on the network can get between our device and the server and read our text. This is possible thanks to a technology called “private/public key encryption”.
For example, if Vincent wants to send a “Hi” to Mia, during the journey from his phone to the messenger’s server the text will look something like this: fkgtjefj5gd9t3jjfkj79fjejrogdjskj7fjkjjk
So if a hacker gets between you and the server, she will not be able to read it. This is great until the moment the text arrives at the server. And this is where our privacy ends. When the text arrives at the server (e.g. AT&T’s, WhatsApp’s, or Google’s servers), it is stored in plain text.
Then the text is encrypted again and travels to Mia so she can read it.
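The client–server–client flow described above can be sketched with a toy cipher. This is not real cryptography (real apps use TLS and proper key exchange); the session keys and the hash-based XOR keystream here are hypothetical stand-ins, just to show where plaintext exists along the way:

```python
import hashlib
from itertools import count

def keystream(secret: bytes, n: int) -> bytes:
    # Derive n pseudo-random bytes from a secret (toy construction,
    # NOT a real cipher -- only to illustrate "encrypted in transit").
    out = b""
    for i in count():
        out += hashlib.sha256(secret + i.to_bytes(4, "big")).digest()
        if len(out) >= n:
            return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, key))

# Vincent -> server: encrypted with the key Vincent shares with the server.
vincent_server_key = b"vincent-session-key"   # hypothetical session key
msg = b"Hi"
on_the_wire = xor(msg, keystream(vincent_server_key, len(msg)))
print(on_the_wire.hex())  # unreadable gibberish on the network

# The server decrypts it and, in this model, stores it in PLAIN TEXT.
stored_on_server = xor(on_the_wire, keystream(vincent_server_key, len(msg)))
print(stored_on_server)   # b'Hi' -- readable by the provider

# Server -> Mia: re-encrypted with the key Mia shares with the server.
mia_server_key = b"mia-session-key"           # hypothetical session key
to_mia = xor(stored_on_server, keystream(mia_server_key, len(msg)))
received = xor(to_mia, keystream(mia_server_key, len(msg)))
print(received)           # b'Hi'
```

The important line is `stored_on_server`: the message is protected on both hops of the wire, yet sits readable on the provider’s machine in between.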
This is how Facebook Messenger, Telegram and WeChat work.
To sum it up: your data themselves are not encrypted. The channel that transmits them is, but the place where they are stored is not.
Most of the email and messenger services we use today can scan through the content, extract information about your interests, habits, purchases etc., and then, for example, figure out what ads to serve you later. If a government agency knocks on their door and asks for a bunch of user data, it is up to the companies, not the users, to decide what happens with it.
End-to-end encryption and other solutions
Many anonymizer tools and proxy sites are available to mask our IP address and some of the information about our computer when we surf the web. Other strategies are also worth trying, such as using multiple browsers so we mix up our data. Unfortunately, most of these solutions are a bit too complicated for most people.
The thing is that if we want to keep things private we have to actively think about how to do it, which adds friction.
This is because security is currently of secondary importance to founders and to the people writing code; what we need is a shift in the culture of programming.
A relatively simple solution to many of these problems is to design security into products from the beginning, rather than retrofitting fixes on top of them as problems arise.
End-to-end encryption
One way to design such services is end-to-end encryption: messages are encrypted so that only the recipient can read them. Even if the data are stored on the server, no one but your friend can read them.
All messages are encrypted not only while they are transmitted but also while they are stored on the messenger’s server. Signal works like this.
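The difference can be sketched with the same kind of toy XOR cipher, not real cryptography: the shared secret below stands in for the keys that a real end-to-end app like Signal establishes with public-key cryptography, and the point is that the server only ever holds ciphertext:

```python
import hashlib

def xor_cipher(data: bytes, secret: bytes) -> bytes:
    # Toy XOR cipher keyed by a hash of the secret -- illustration only,
    # NOT real cryptography.
    key = hashlib.sha256(secret).digest()
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# A secret known only to Vincent and Mia (in a real end-to-end app this is
# established via public-key cryptography; here it is just a constant).
shared_secret = b"vincent-and-mia-only"

ciphertext = xor_cipher(b"Hi", shared_secret)

# The server stores and relays the ciphertext but never sees the secret,
# so what sits at rest on the server is unreadable to the provider.
server_storage = ciphertext

# Only Mia, holding the shared secret, can recover the plaintext.
plaintext = xor_cipher(server_storage, shared_secret)
print(plaintext)  # b'Hi'
```

Compare this with the earlier flow: there, the server held the decryption key and the plaintext; here, it holds neither.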
Anonymity or pseudonymity
Data anonymity or pseudonymity is another way to prevent data exposure.
When my team and I were running Sunshine, we tried to find a fair and equal way for people to get information, based on pseudonymity. Sunshine could collect environmental data passively from a smartphone’s sensors and actively through users’ input, without the user ever having to sign up during onboarding.
We didn’t really need their name or phone number to give them personal recommendations. Users could share information about their skin sensitivities, for example, and we provided recommendations and personalized their experience, all without violating their privacy. Users could delete the app without leaving any stored footprint behind.
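The idea can be sketched in a few lines. This is a hypothetical record shape, not Sunshine’s actual data model: the app keeps a random, device-local identifier for personalization instead of any real identity:

```python
import secrets

def new_pseudonym() -> str:
    # Random, device-local ID -- no name, email, or phone number involved.
    return secrets.token_hex(16)

# Hypothetical pseudonymous profile: enough to personalize, nothing to identify.
profile = {
    "pseudonym": new_pseudonym(),      # stable but anonymous identifier
    "skin_sensitivity": "high",        # user-supplied preference
    "uv_index_samples": [7.2, 6.8],    # hypothetical passive sensor readings
}

# Recommendations key off the pseudonym, so deleting the app (and with it
# the ID) leaves no personally identifiable footprint behind.
print(len(profile["pseudonym"]))  # 32 hex characters
```

The design choice is that the identifier is generated client-side and meaningless off the device, so "delete the app" really does mean "disappear".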
Users appreciated this freedom from identity, and it was actually a catalyst for growth.
Beyond the underlying technology, we need to make sure that the services we use or design carry a clear message about why and how they use users’ input. Trading our music preferences to get personalized recommendations for new songs is completely different from using a messenger to communicate and ending up learning that all our conversations sit with the NSA.
Privacy is about consent and self-respect, not about hiding, and in order to get it back we need to evolve.
The internet was designed for sharing information so we can all have equal access to knowledge and information no matter our location.
We need to design systems that lead in this direction, and we need to do it before the internet turns against us: both in terms of privacy and in the way we form our opinions.
I’d love to hear what you think and your proposals about it. Tweet at me here.
If you want to dive deeper:
Making Private Communication Simple by Moxie Marlinspike