People shouldn’t care about privacy

August 26, 2021 - Rand Hindi

This blog post was slightly updated on the 9th of November 2023.

A technique called homomorphic encryption is about to radically change how we guarantee privacy online.

I have to admit to committing a crime.

In 1999, when I was 14 years old, I helped a friend create a social network that became quite popular with teenagers. It had all the features of a modern social network: a profile page, photo albums, and the ability to send private messages to other members. Back then, all we cared about was having fun and being popular at school. Being amongst the few people of our age who could code, we felt we had a superpower in being able to create websites.

A couple of years into running the site, someone started to bully me at school. Not knowing what to do, I figured that if I could dig up some dirt about him, I could use it to defend myself. This is when I realized I had access to a goldmine: his private messages on our social network. So I went into the database and started reading his private conversations, eventually finding enough embarrassing things that I could walk up to him and blackmail him into leaving me alone.

While there might be some justice in bullying a bully, what I did was fundamentally wrong. Sifting through people’s private messages made me feel sick because I knew I shouldn’t have the ability to do so. Just because I could code didn’t mean I had a right to people’s private lives. At that point, I made a vow to do my best to protect people’s privacy online.

A brief history of privacy online

Back when the internet was created, nothing was encrypted. You would access a website through a specific address, which started with “HTTP”. At the time, this wasn’t much of an issue because all that people accessed online was content, which by definition was already public. But things got a little more complicated with email. Because email was unencrypted, anybody on the internet could intercept the data coming in and out of your computer and thus read your private messages.

At the time, a group of cryptographers and privacy enthusiasts, called the cypherpunks, started thinking about the future of privacy online. Their premise was to use modern cryptography to ensure that information online could not be read by anyone other than those for whom it was intended. This culminated in the publication of “A Cypherpunk’s Manifesto” in 1993, which laid the foundation for all the modern privacy technologies now employed on a daily basis. The manifesto was visionary, as we can see from the following snippets:

  • Privacy is necessary for an open society in the electronic age.
  • When I ask my electronic mail provider to send and receive messages, my provider need not know to whom I am speaking or what I am saying or what others are saying to me.
  • Privacy in an open society requires anonymous transaction systems.

And perhaps most importantly, the manifesto stated, “We are defending our privacy with cryptography, with anonymous mail forwarding systems, with digital signatures, and with electronic money.”

The cypherpunks weren’t just idealists; they were builders. So they went on to create the very tools they believed the world needed in order to safeguard its privacy. Phil Zimmermann, the creator of Pretty Good Privacy (PGP) and a pioneer of end-to-end encryption, was a cypherpunk. Julian Assange, the founder of Wikileaks, was a cypherpunk. Bram Cohen, the inventor of BitTorrent, was a cypherpunk. Many of the core Tor developers were cypherpunks. Even Satoshi Nakamoto, the pseudonymous inventor of Bitcoin, was a cypherpunk. We owe the little privacy we have online today to the cypherpunks.

While there were some historical uses of encryption in private communications, it wasn’t until the advent of the internet and the cypherpunks in the 90s that cryptography became popular amongst the general population. Governments opposed the rise of the civilian use of cryptography, which until then had mainly been used by the military. Indeed, cryptography made it hard or even impossible to engage in mass surveillance of the population. So governments tried to ban cryptography, forcing telecom companies to add a “backdoor” — a way to decrypt communications without people knowing — to their services.

The cypherpunks fought back, and for years, there was a Crypto War between researchers, who claimed backdoors would do more harm than good, and governments, who claimed that without a backdoor, criminals would have a free pass. It was finally accepted that having a backdoor for the US government meant anyone with sufficient resources (e.g., a foreign government like Russia) would eventually be able to access the same information. By engaging in mass surveillance of their populations, governments would enable their enemies to do the same, thereby weakening their sovereignty. Fortunately, the US government eventually backtracked on its anti-privacy stance, leaving everyone free to communicate privately using encryption.

From this point, it became possible to design a new internet protocol where communications would be encrypted, such that no one could see the data being sent back and forth between users and the websites they accessed. This new internet protocol was simply called HTTPS, with the “S” meaning “secure”. This is what the little lock in the address bar of your browser means.

Since its inception in the 90s, HTTPS has gained mass adoption and now represents over 80% of internet traffic. Problem solved, right? Well, not really. As we discovered in 2013 with the Snowden revelations, governments had not abandoned mass surveillance. Instead of intercepting communications, they went straight to the online service providers and asked them for their user data.

The problem is compounded by the fact that the more successful a company is, the more user data it can potentially access. This makes it more likely to become a target of mass surveillance and data theft, both of which have risen exponentially in the past decade. Today, having access to a single company’s user data means being able to retrieve the data of millions, sometimes even billions, of people.

Source: IBM Cost of a Data Breach Report (https://www.ibm.com/reports/data-breach)

Clearly, securing the transmission of data is no longer sufficient. The reason we still have mass surveillance and data breaches today is that our data isn’t encrypted while it is being processed. There seems to be a fundamental paradox: to use online services, we have to give them access to our data, which makes us vulnerable to surveillance and data breaches. Does this mean we must give up our privacy to use online services? Fortunately not, and the solution is, once again, cryptography.

Introducing fully homomorphic encryption (FHE)

Fully homomorphic encryption (FHE) is a technique that enables data to be processed blindly without having to decrypt it.

As a user of a web service, you would encrypt your data using a secret key and send the encrypted data to the web server, where blind processing would occur. The result would then be encrypted and sent back to you, and you could decrypt it using your secret key. From your perspective, nothing would change; you would still send data to a web service and get a response. But the company providing the service could now do so without ever seeing your data, and without having to secure it, as your data would be encrypted not only in transit but also during processing, with the end result also encrypted (i.e., it would be encrypted end-to-end).

With FHE, nobody would be able to see your data but you. Governments, hackers, and even the company providing the service wouldn’t be able to see your data because they wouldn’t have your key. They couldn’t break the encryption either, as the lattice-based cryptography used in FHE is believed to resist even quantum computers.

To illustrate how FHE works, let’s imagine you want to apply a filter on an image by sending it to a cloud photo editing tool. This is how you would do it with FHE:

Step 1.

Choose an image, and encrypt it using a secret key

The secret key will be used to encrypt the image, producing what effectively looks like a bunch of random pixels. Without the secret key, it is impossible to know what the original picture is.

Step 2.

Upload it to the cloud, and choose which filters to apply to it

Let’s say you now want to apply an artistic filter to it using an online cloud photo editing service. The company operating the service has no way of knowing which picture you sent because they don’t have your secret key. However, they can apply filters to the encrypted image, producing a new encrypted image that also looks like a bunch of random pixels.

Step 3.

Download the result, and decrypt it using your secret key

When you’re done, simply download the new random image, and decrypt it using the same secret key you used to encrypt the original image. You now see the result of applying the filters to your image, despite never having shown it to anyone.

You can try this demo live on Hugging Face here: Encrypted image filtering using Homomorphic Encryption.
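For readers who want to see the same three-step flow in code, here is a minimal sketch using Zama’s open-source concrete-python library. The library choice, the trivial “filter” (a single 8-bit pixel inverted instead of a whole image), and the exact calls are assumptions on my part rather than something this post prescribes, and the API may differ between library versions.

```python
from concrete import fhe

# A trivial stand-in for a real image filter: invert an 8-bit pixel value.
@fhe.compiler({"pixel": "encrypted"})
def invert(pixel):
    return 255 - pixel

# Compile the function into an FHE circuit, using sample inputs to infer bit widths.
circuit = invert.compile(range(256))

# Step 1 (client): generate keys and encrypt the input.
circuit.keygen()
encrypted_pixel = circuit.encrypt(42)

# Step 2 (server): run the filter directly on the ciphertext, never seeing the value 42.
encrypted_result = circuit.run(encrypted_pixel)

# Step 3 (client): decrypt the result with the secret key.
assert circuit.decrypt(encrypted_result) == 213  # 255 - 42
```

Real deployments keep the encryption and decryption on the client and ship only ciphertexts and evaluation keys to the server; the single-process sketch above is just to show the shape of the flow.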

FHE might feel like magic at first, but it is actually based on the well-known mathematical concept of homomorphism, which means “same structure”. In essence, the secret key used to encrypt only changes the interpretation of the data, not its structure. This enables you to apply mathematical operations to the encrypted data and reinterpret the result using the original secret key. The secret key is like a Rosetta Stone: unless you have it, you cannot understand what you are looking at.
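To make the “same structure” idea concrete, here is a tiny, purely illustrative Python example using textbook RSA, which happens to be homomorphic for multiplication: multiplying two ciphertexts yields a ciphertext of the product. This is not FHE and not secure as written (the key is tiny and unpadded RSA should never be used in practice); FHE schemes go much further by supporting both addition and multiplication on encrypted data.

```python
# Textbook RSA with a toy key (n = 61 * 53): purely illustrative, not secure.
n, e, d = 3233, 17, 2753

def encrypt(m):   # c = m^e mod n
    return pow(m, e, n)

def decrypt(c):   # m = c^d mod n
    return pow(c, d, n)

a, b = 7, 12
ca, cb = encrypt(a), encrypt(b)

# Multiplying ciphertexts mirrors multiplying plaintexts:
# (a^e * b^e) mod n = (a*b)^e mod n, so decryption recovers a * b.
assert decrypt((ca * cb) % n) == a * b  # 84
```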

Why then aren’t we using FHE everywhere today? The answer is that although FHE was imagined in the late 1970s, no concrete realization was available until 2009. Originally, FHE was too slow to be useful — about a million times slower than doing the same thing without FHE. Something that took a second unencrypted took 11 days encrypted. Not exactly user-friendly!

However, this has changed. New cryptographic breakthroughs, combined with more powerful hardware, mean that we are closing the gap between the time it takes to accomplish a task without encryption and the time it takes to accomplish the same task with FHE. In the past two years alone, we have progressed from being 1,000,000 times slower to being between 10,000 and 1,000 times slower, and we are on track to be less than 10 times slower by 2025. That will mark the moment when FHE can become ubiquitous, employed everywhere we want privacy.

Examples of applications that FHE will enable include the following:

  • Preventive Medicine: Imagine knowing in advance what you need to do to stay healthy throughout your life. This is increasingly possible with AI but requires sharing all your health data — everything from your DNA to your medical history to your lifestyle habits. With FHE, you could send all of this data in encrypted form, and the AI would respond with encrypted health recommendations that you alone could see (a small code sketch follows this list).
  • Facial Recognition: From science fiction to the palm of your hand, facial recognition is now a part of our everyday experience. We use facial recognition to enter buildings, to unlock our phones, to tag people in pictures, and soon, to log in to websites everywhere. This, however, requires your biometric fingerprint to be on file, which, in the wrong hands, can be used to impersonate you. With FHE, you could authenticate yourself securely, without anybody being able to steal this biometric data.
  • Voice Assistants: Every time you or someone in your family speaks to Siri, Alexa, or Google Assistant, personal information is sent to the companies behind them. With FHE, your voice query would be sent encrypted to your AI assistant, and they could respond without actually knowing what you asked! This means you would no longer have to worry about your family’s data being misused or stolen. It would no longer matter if you had microphones in the most sensitive places in your home because nobody would be able to listen to what you say.
  • Private Smart Contracts: By design, blockchains are public, meaning all the user data flowing into web3 applications is visible to the entire world. With FHE, we can enable private smart contracts, where the inputs and outputs are encrypted end to end, meaning you can safely build decentralized applications that use sensitive personal data.
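As a sketch of the preventive-medicine case above, the snippet below computes a hypothetical, integer-quantized “risk score” (a weighted sum of made-up health features) on encrypted inputs with concrete-python. The weights, features, and scoring function are invented for illustration, and the exact API may differ between library versions.

```python
import numpy as np
from concrete import fhe

WEIGHTS = np.array([3, 1, 2, 5])  # made-up plaintext model weights

# Hypothetical risk score: a weighted sum of integer-quantized health features.
@fhe.compiler({"features": "encrypted"})
def risk_score(features):
    return np.sum(features * WEIGHTS)

# Compile with representative inputs so the compiler can size the ciphertexts.
inputset = [np.random.randint(0, 16, size=4) for _ in range(20)]
circuit = risk_score.compile(inputset)

patient = np.array([7, 2, 9, 1])  # never leaves the client in the clear
print(circuit.encrypt_run_decrypt(patient))  # 7*3 + 2*1 + 9*2 + 1*5 = 46
```

The service sees only ciphertexts, yet the patient gets back exactly the score the model would have produced on the raw data.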

Even targeted advertising could be done homomorphically. There is no limit to what this technology could enable. At some point, the only reason for not using it would be that either a human being has to be involved in providing the service (sometimes acceptable) or the company providing the service is selling your data (never acceptable).

A new internet protocol for end-to-end encryption

Reaching FHE ubiquity is a key milestone for internet privacy, as it means everything online could become encrypted end-to-end, without compromising the user experience.

This, in turn, could give rise to a new internet protocol, HTTPZ, that would standardize end-to-end encryption and replace HTTPS as the default protocol implemented by online services. Web services that use FHE would simply expose an HTTPZ URL, while web browsers would take care of encryption and decryption on the user side. To end users, everything would happen in the background, without any action needed on their part.

When this happens, people won’t care about privacy anymore, not because it doesn’t matter but because it will be guaranteed by design at the internet protocol level.

We are a team of cryptographers and engineers passionate about privacy. We build open source FHE technologies to help companies make their products encrypted end to end. Our hope is to one day see this HTTPZ vision come to life, so that we can put an end to mass surveillance and data breaches!
