Meta’s Biggest Encrypted Messaging Mistake Was Its Promise

Since the 1990s, governments around the world have often used the welfare of children as an excuse for all kinds of internet policy overreach: encryption backdoors, centralized censorship mechanisms, and anti-anonymity measures. So when Meta, facing pressure from the government as well as NGOs, announced its decision last week to delay the rollout of end-to-end encryption for messaging systems such as Instagram DMs and Messenger—with child safety as the cited reason—privacy advocates were understandably upset and suspicious. But speaking as someone who previously worked on safety and security at Facebook, I don’t view the delay as an arbitrary political decision. The concern over the safety of young users is genuine, and the problems are pervasive, especially when it comes to social systems as complex as those at Meta.

Frustrating as it may be, the company’s delay is likely justified. Some form of end-to-end encryption should be available to all people, to preserve the right to private communication and prevent government incursions. But end-to-end encryption isn’t just one issue or technology—it’s a broad set of policy decisions and use cases with high-stakes consequences. As such, creating the proper environment for its use is a complex task. The need for end-to-end encryption, as well as the conditions required to implement it safely, varies for each platform, and apps like Facebook and Instagram still require serious changes before it can be introduced without compromising functionality or introducing safety risks. Meta’s greatest misstep isn’t this latest delay but rather the timeline, and perhaps even the outcome, it promised.

When then-Facebook first announced its timeline, in 2019, to implement interoperable end-to-end encryption across all its properties, it was immediately clear the plan was infeasible. The proposed timeline was so rapid that even producing the technology itself would be nigh impossible, with safety mechanisms barely entering the picture. WhatsApp already had end-to-end encryption along with content-oblivious mechanisms for detecting some kinds of harm, and the assumption was that this approach would readily translate to other Facebook properties.

However, apps and sites like Facebook and Instagram are wildly different from WhatsApp in architecture and dynamics. Both implement direct messaging alongside systems that actively try to connect you with new people, drawing on a combination of users’ uploaded phone books, algorithmically determined similar accounts (based on location, interests, and friends), and general online activity. In the case of Facebook, large public and private groups also expand one’s social graph, along with global search of all accounts and grouping by institutions such as schools. While apps like WhatsApp and Signal operate more like private direct messaging between known contacts, Facebook and Instagram’s growth-oriented design leads to situations where abusers can more easily find new victims, identities and relationships are accidentally exposed, and large numbers of strangers are mixed together.

These fundamental differences mean that before Meta can safely switch all of its platforms to end-to-end encryption, its apps must undergo some nontrivial changes. First off, the company must improve its existing content-oblivious harm-reduction mechanisms. This involves using social graphs to detect users who are trying to rapidly expand their networks or to target people of certain demographics (for example, people of a particular declared or inferred age), and finding other potentially problematic patterns in metadata. These mechanisms can work hand in hand with user reporting options and proactive messaging, such that users are presented with safety messaging that informs them of their options for reporting abuse, along with efficient reporting flows to allow them to escalate to the operator of the platform. While these types of features are beneficial with or without end-to-end encryption, they become significantly more important when the ability to inspect content is removed.
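To make the idea concrete, a content-oblivious signal of this kind can be computed from metadata alone, without reading any message bodies. The sketch below is purely illustrative: the field names, thresholds, and flagging logic are hypothetical assumptions for this example, not anything Meta has described.

```python
from dataclasses import dataclass

# Hypothetical, simplified metadata summary for one account over a time
# window. These field names are illustrative, not an actual schema.
@dataclass
class AccountActivity:
    account_id: str
    new_threads_started: int          # conversations initiated this window
    stranger_recipient_ratio: float   # share of recipients outside the social graph
    minor_recipient_ratio: float      # share of recipients with a declared/inferred minor age

def flag_for_review(activity: AccountActivity,
                    max_new_threads: int = 20,
                    stranger_threshold: float = 0.8,
                    minor_threshold: float = 0.5) -> bool:
    """Content-oblivious heuristic: flag accounts that rapidly start new
    conversations, mostly with strangers or with apparent minors.
    All thresholds are arbitrary placeholders."""
    rapid_outreach = activity.new_threads_started > max_new_threads
    mostly_strangers = activity.stranger_recipient_ratio >= stranger_threshold
    targeting_minors = activity.minor_recipient_ratio >= minor_threshold
    return rapid_outreach and (mostly_strangers or targeting_minors)

# Example: an account messaging 50 mostly-unknown recipients is flagged;
# an account with a few conversations among known contacts is not.
print(flag_for_review(AccountActivity("a1", 50, 0.9, 0.1)))  # True
print(flag_for_review(AccountActivity("a2", 3, 0.2, 0.0)))   # False
```

A flag like this would feed a review queue or trigger the safety messaging and reporting flows described above, rather than block anyone automatically; the point is only that such signals survive the loss of content visibility that end-to-end encryption entails.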