Telegram advertises itself as a private and secure communication channel, emphasizing things like encryption and protection “from hackers.” This image is often picked up by the mass media, where it is not uncommon to see Telegram described as a “highly encrypted” or “ultra secure” messaging app.

Let’s take a closer look at Telegram’s actual security model and the technical properties that define what kind of privacy it provides in practice.

But first, a little context.

It is important for me to be in control of the technology I use. To be able to decide what kind of technology mediates my interaction with information online and my relationships with other people, and in which way. And to be able to change that if I need to.

I don’t agree with Facebook’s terms of use and many of its abusive practices, so I don’t use it. For the same reason, I don’t use some of the other popular services that make money by soaking up as much info about you as they can.

I also run almost exclusively free (also known as libre or open-source) software on my devices, as it is designed to guarantee the user “the freedom to share and change all versions” of the software, as well as to make it very hard for anyone to hide spyware inside it.

Until recently, Telegram was one of the services I didn’t want to involve myself with. Well, to an extent it still is. I started using it when my coworkers asked if I would consider it as a secondary channel to email, since they all have it installed on their mobile devices. As a compromise, I decided to give it a try, albeit keeping my use of it mostly limited to that group. This prompted me to investigate Telegram a bit closer, and hence this blog post.

Positive things deserve credit

While there are many critical points in this post, I really want to note that Telegram did a few important things differently (and better) than some other “big players.”

For instance, a notable difference between Telegram and, say, Facebook’s Messenger or Skype is that Telegram releases the source code of its client apps (i.e. the part of the system that you actually install on your device) as free, a.k.a. open-source, software under the GNU GPL. There are some important issues with how they do it in practice (see below), but nevertheless this is a big plus.

Thus, in theory anyone can study the code and understand what it does, as well as modify it to remove any unwanted behavior.

They also make it possible to write independent alternative clients by publishing an open-source library as well as providing a public API for interacting with their servers.

Another interesting thing is Telegram’s business model. It doesn’t seem to be based on targeted advertising, which is nice, but on the other hand they depend on financing from Pavel Durov. I don’t know what his interests are and how they may change in the future.

Which brings me to…

Telegram’s interpretation of privacy

1. Phone numbers and contacts

It is probably no surprise to you that Telegram requires you to identify yourself with a phone number. And with the default settings (Contact Sync enabled, and provided the app is granted access to your contacts), it also uploads your entire phone book to its servers, including the phone numbers and names of people who don’t use Telegram. (There is a function to delete that data from their servers, but it is not very reassuring, since there is no way to know it was actually deleted.)
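
To make this concrete, here is roughly what contact sync amounts to at the API level. This is a hypothetical sketch using the third-party Telethon library to call Telegram’s public contacts.importContacts method; the api_id and api_hash values are placeholders you would obtain from my.telegram.org.

```python
# Sketch: what "Contact Sync" amounts to on the wire, via Telegram's
# public contacts.importContacts method (third-party Telethon library).
from telethon.sync import TelegramClient
from telethon.tl.functions.contacts import ImportContactsRequest
from telethon.tl.types import InputPhoneContact

with TelegramClient('me', api_id=12345, api_hash='0123abcdef') as client:
    # Every entry carries a name and a number to Telegram's servers,
    # whether or not that person has a Telegram account.
    phone_book = [
        InputPhoneContact(client_id=0, phone='+15550000001',
                          first_name='Alice', last_name=''),
        InputPhoneContact(client_id=1, phone='+15550000002',
                          first_name='Bob', last_name=''),
    ]
    result = client(ImportContactsRequest(phone_book))
    # result.users reveals which of the uploaded numbers have accounts.
    print([u.phone for u in result.users])
```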

Phone numbers are very interesting from a privacy perspective. They are more valuable than names, since they are globally unique. You can’t create one yourself; it must be allocated to you by a cellular company. In some countries that involves the obligation to register your phone number with the government. Phone numbers are also very physical - they (mostly) require a separate device to operate. As such they represent your offline, physical identity.

The phone number requirement makes anonymity practically impossible on Telegram. So if you are a whistle-blower who needs to protect their identity, Telegram is probably not the best option for you. Consider this incident from 2016 (quoting Telegram’s blog, emphasis mine):

Certain people checked whether some Iranian numbers were registered on Telegram and were able to confirm this for 15 million accounts.

According to Reuters, this included the mapping between phone numbers and Telegram accounts - a fact that Telegram conveniently omitted in their blog post, claiming that only “public information” was revealed. The “no big deal” style of Telegram’s response really shows that protecting users’ phone numbers wasn’t a priority for them. More recently, Telegram has admitted to having “detected evidence” that contact discovery may have been abused by the authorities in Hong Kong to reveal the phone numbers of protesters in group chats.1

This is why, when phone numbers are used for “verifying” someone’s identity as opposed to making phone calls per se, they become a mechanism of control rather than a communication tool. They are used to take away your ability to define your own digital identity. They become an ID you are required to present in order to use technology (while also being encouraged to reveal the IDs of everyone you are or were in contact with).

This is the opposite of privacy.

On top of that, phone numbers have poor security - it doesn’t take much to hijack a SIM card remotely, and your government may have a system in place to intercept the content of SMSes (this, too, happened in Iran in 2016). Either way, by taking control of your phone number, an attacker can take over your Telegram account if you haven’t enabled 2-step verification with a password.
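
You would normally enable that password in the app’s settings; for completeness, here is a sketch of doing the same programmatically. I’m assuming the third-party Telethon library and its edit_2fa helper here, with placeholder credentials.

```python
# Sketch: enable 2-step verification (a cloud password) so that hijacking
# the phone number alone is not enough to take over the account.
# Assumes the third-party Telethon library; api_id/api_hash are placeholders.
from telethon.sync import TelegramClient

with TelegramClient('me', api_id=12345, api_hash='0123abcdef') as client:
    client.edit_2fa(new_password='a long, unique passphrase',
                    hint='in my password manager')
```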

I believe that a crucial element of privacy is the ability to control your identity in different contexts. This is what we do all the time in the offline world. You don’t say your full name and phone number to every stranger on the street. You have the freedom to introduce yourself however you wish (for instance, by just saying your first name without any details) in different contexts.

This is not unusual in the digital domain either. Using a pseudonym was the norm on web forums before Facebook. And it is not terribly rare to have separate mailboxes for different communication contexts.

So while I understand it’s very convenient for Telegram to tie everything to phone numbers (as it is for Signal), it significantly degrades privacy, as it gives them a pretty direct picture of a person’s real-life connections, regardless of whether those connections happen through Telegram or not. They end up collecting a very valuable asset - the complete social graph of millions of people, many of whom don’t use Telegram and didn’t give Telegram permission to collect their phone numbers.

Edit in 2021: Telegram has added an option to prevent random people who have your phone number from discovering you on Telegram; however, there’s no way to enable it when you create an account - only afterward. The result is that right after you register, Telegram sends a notification to everyone who has your phone number in their contacts, telling them that you now have a Telegram account. It doesn’t even warn you that this is about to happen.

2. Encryption confusion

On their homepage Telegram advertises that your chats are “heavily encrypted.” Well, saying something is encrypted means nothing without saying who holds the key. In Telegram, there are two kinds of chat encryption:

  1. The normal, so-called cloud chats are encrypted between you and the server (just like any website that uses HTTPS - no big deal in 2019), and stored encrypted on the server, according to Telegram’s Privacy Policy. However, the server has access to the decryption key, as Durov has said. If they are hacked or their servers confiscated, your chats are busted. (They claim to have a “cross-jurisdiction” network of servers with the key material kept away from the content, but this does not change anything - the server can and does see the content of your messages, and where it is stored is completely irrelevant.) It’s like asking someone to store something for you in their safe, and they say they are going to store the key at their grandma’s. They control access to the key, and the safe is at their place - that’s it.

  2. Secret chats. Here the encryption is actually relevant, since in this case we are talking about end-to-end encryption. These chats are encrypted in such a way that only the recipient’s device can decrypt them. Telegram’s servers only know that a message has been sent, but can’t read its contents. For this to deliver the security it promises, though, you need to make sure the encryption keys your device uses match the ones on the other end. Otherwise there is no way to know that no one is sitting in the middle, silently decrypting and re-encrypting the messages with another key (known as a man-in-the-middle attack; see the sketch below).
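
The check itself is simple in principle. Here is a generic illustration in Python - not Telegram’s exact scheme (Telegram renders a key visualization image instead) - of why comparing key fingerprints out-of-band defeats a man-in-the-middle: both sides derive a short digest of the negotiated key material and compare it over a channel the attacker doesn’t control. An attacker who re-encrypts with their own key cannot make the two digests match.

```python
# Generic illustration of out-of-band key fingerprint comparison.
import hashlib

def fingerprint(key_material: bytes) -> str:
    digest = hashlib.sha256(key_material).hexdigest()
    # Group the digest into short chunks, the way chat apps
    # typically render "safety numbers" for humans to read aloud.
    return ' '.join(digest[i:i + 4] for i in range(0, 32, 4))

alice_key = bytes.fromhex('aa' * 32)  # stand-in for the shared secret
bob_key   = bytes.fromhex('aa' * 32)  # identical only if no one interfered

assert fingerprint(alice_key) == fingerprint(bob_key)
print(fingerprint(alice_key))  # compare this string over a trusted channel
```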

Now, secret chats are not the default on Telegram. The desktop client does not support them at all. Group secret chats are not supported. And they don’t synchronize between your devices. These features aren’t anything extraordinary - other notable projects (like the open OMEMO protocol, Signal and Matrix/Riot) support them, but for some reason Telegram and Durov think that having end-to-end encryption by default is the wrong way to go.

I don’t know what percentage of chats on Telegram are secret chats, but I suspect that the majority aren’t, for the reasons described above.

This leaves us in a situation where the content of a big chunk of chats on Telegram is accessible to Telegram’s servers and to whoever may gain access to them now or in the future. And someday someone will misuse that data, either from inside or outside of Telegram (again, I don’t see how the “cross-jurisdiction” key storage has anything to do with real security). The only reliable way to protect a piece of user data is to not have it in the first place, or to not have the decryption key for it.

What I find most disappointing is that Telegram, and Durov personally, are promoting the idea that the “average fella” doesn’t need end-to-end encryption and should sleep well while all their messages are accessible to Telegram.

3. A different kind of open-source

Edit in 2021: Telegram addressed some of the issues discussed below in 2020 by providing reproducible builds for Android and iOS. I haven’t tried to actually verify that the builds are reproducible (and there were some peculiar bugs around that). I have left the rest of this section unchanged for context.
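
For reference, verifying a reproducible build essentially means byte-comparing the package you installed from the store with one you compiled yourself from the published source. A simplified sketch follows; the file names are hypothetical, and official builds are signed, so a real comparison has to account for the signature data, which this glosses over.

```python
# Sketch: compare the store-distributed package against a self-built one.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        # Read in 1 MiB chunks so large APKs don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b''):
            h.update(chunk)
    return h.hexdigest()

official = sha256_of('Telegram-from-play-store.apk')    # hypothetical path
self_built = sha256_of('Telegram-built-from-source.apk')  # hypothetical path
print('match' if official == self_built else 'MISMATCH')
```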

While noting the positive characteristic of having the client app code released under a free (libre) software license, the GPL, I mentioned that there is an important caveat.

If you go to Telegram’s page on the Google Play Store, you won’t find any reference to the source code or the GPL. The UI of the app itself, to the best of my knowledge, doesn’t contain any mention of the GPL either. In fact, it appears that what you install from the Play Store is not actually free software under the GPL. (The same applies to the iPhone app.)

They release an often outdated version of the source code on GitHub. It is this version that you can use under the terms of the GNU GPL.

If you look at the history of the source code repository, you’ll see that some versions of the software were simply skipped, for instance the source code for any of the 1.5 or 1.8 versions was never published.

This leaves open the possibility that they could be forced or cracked - or could simply change their mind - and start including a backdoor in the App/Play Store version of the app, but not in the open-and-clean source code release. Or they could simply skip releasing the source code of a specific version that they wouldn’t want released.

It is therefore very important for a privacy-focused communication system to allow users to run a version of the software that they have independently compiled from public source code, on equal terms with the official binaries. This is not the case with Telegram, as the source code is not released in sync with the (Android) app. And even if you run a version of the app compiled from public source, your chat buddy probably runs a version from the Play / App Store that may be backdoored.

The whole idea of the GPL is that it is meant to guarantee the user certain freedoms, such as to have access to the source code of the program, study it and adapt it to their needs, redistribute it and independently audit it.

My argument here is that these freedoms do not apply at all to the app on the Play Store. You can’t study or modify this version of the app, and nothing requires Telegram to provide you with any source code for the app you use. This is radically different from most software usually called free / libre / open-source, like the Linux kernel or Mozilla Firefox. Usually, the most recent source code is available first, then the binaries (the executable form of the program), not vice versa.

The version of Telegram I’m using does not come from the Play Store. Instead, it is an independently maintained branch (a so-called fork) of the Android app’s source code, called Telegram-FOSS. In fact, I don’t even have the Play Store on my device.

In the case of the iPhone app, the published source code, as of this writing, contains neither a copy of the GPL nor any reference to it outside of third-party libraries. They declare the app as GPL on their website, but this is not sufficient under the GPL itself, as I understand it. This is a weird and questionable way to release something as free software.

Only the desktop app seems to have the most recent, pre-release source code available and is open to outside contributions - much more in the spirit of free software.

4. Closed-source server-side

Quoting Telegram’s FAQ:

Telegram is open, anyone can check our source code, protocol and API, see how everything works and make an informed decision.

This is not accurate. While Telegram releases the client code, no server software is published. No one can audit the code that handles your messages, contacts, and metadata once they leave your device and reach Telegram’s servers. No one can verify how your data is stored and how “heavily” encrypted it is.

And even if Telegram did release the server source code, there would be no way to know they actually run the same source code they release, or that there is no other malicious software running alongside it on the same server. It is technically impossible to ensure what software is running on a server you don’t own, even if the source code of such software is supposedly public. The “you can read the code” argument simply doesn’t work for remote systems such as servers.

The only real (albeit not perfect) solution for this is properly implemented end-to-end encryption, whereby the server can’t read the content of the messages.

Yet even with end-to-end encryption, the server still has all your metadata: whom you talk to and when, your list of contacts, your pattern of life (the times of day you are most active, your IP addresses, etc.).

With systems like email or Jabber, it is normal for people and communities to run their own servers, so that they can choose for themselves whom to trust with server administration. They can exchange messages with anyone, regardless of whether they are on the same server or not. The system is thus decentralized, and power (and data) is not concentrated in the hands of a single company.

This is when having access to the code of the server software could actually be useful - to run independent Telegram servers, not owned or controlled by Durov. But this doesn’t seem to be in Durov’s interest.

The proprietary and centralized nature of Telegram reduces the benefit of having their client code open-source, as at any moment Telegram may introduce a restriction on the servers, for instance by blocking access from certain IP addresses (such as Tor exit nodes) or by banning certain API keys (i.e. apps) from interacting with the servers. In fact, the requirement to provide your phone number is an example of an already enforced restriction. Even though the client is open-source, you can’t use it without giving Telegram the information it demands.

Summary

If you look at Telegram’s website, you might get the impression that it is a bastion of encrypted and secure messaging, providing you with state-of-the-art uncrackable privacy. Buzz-wordy yet empty phrases like “heavily encrypted” (things are either encrypted or not; what matters is who holds the keys) and “safe from hackers” create a false sense of security and privacy.

Nothing is ever 100% safe from “hackers” (or, more accurately, crackers). Declaring a service uncrackable is dishonest or naive. But certain things, used consciously and with their strengths and limitations taken into account, are safer than others - and that includes end-to-end-encrypted communication implemented via fully free/libre/open-source software (client and server), preferably independently security-audited and/or battle-tested.

Yet Telegram doesn’t favor end-to-end encryption and leaves it for the “paranoid.” The so-called “secret chats” (again a misnomer, since message timestamps and destinations are visible to Telegram) are limited in functionality and not supported on Telegram Desktop.

And the hard requirement to tie an account to a phone number, plus the eagerness to get the user’s full phone book, certainly doesn’t square with a privacy-above-all approach, in my view.

The promise of privacy that Telegram makes on their homepage is not backed by concrete, uncompromising technical choices. The only privacy they objectively provide with the normal, a.k.a. cloud, chats is so-called “privacy by promise” - essentially, they simply claim not to look at your content (to which they hold the decryption keys) and that no one will ever steal it from them, no matter how skilled the attacker. And with secret chats, you still have the problem of metadata and the lack of anonymity.

I will let myself conclude that the reputation of Telegram as a highly private messenger is… a myth.

  1. The attack vector seems to be the following: the attackers start with a list of phone numbers they want to test (perhaps retrieved from a cellular operator or some kind of government database) and add these numbers as contacts on a device with a regular Telegram account. If any of these numbers are in the same group chat as the attackers, Telegram will happily reveal them. The way Telegram plans to fix this is to allow users to opt out of this “feature” for their account - which most people will probably never know they need to do.
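
For illustration, the discovery step described above needs nothing beyond the public API. A hypothetical sketch using the third-party Telethon library with placeholder credentials and dummy numbers; in practice Telegram rate-limits and caps contact imports, so a real campaign would be slower and spread across accounts.

```python
# Sketch of the contact-discovery abuse: import candidate numbers as
# contacts and see which ones Telegram resolves to live accounts.
from telethon.sync import TelegramClient
from telethon.tl.functions.contacts import ImportContactsRequest
from telethon.tl.types import InputPhoneContact

# Candidate numbers to test (dummy range for illustration).
candidates = ['+1555%07d' % n for n in range(1000)]

with TelegramClient('probe', api_id=12345, api_hash='0123abcdef') as client:
    contacts = [InputPhoneContact(client_id=i, phone=p,
                                  first_name='x', last_name='')
                for i, p in enumerate(candidates)]
    result = client(ImportContactsRequest(contacts))
    for user in result.users:
        # Each hit maps a phone number to a Telegram account.
        print(user.phone, user.id)
```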