On the surface, the announcement Facebook slipped out this week, that a range of new safety measures is being added to Messenger, seems like business as usual. It is anything but. There are serious implications in what Facebook is now doing. The announcement is all Messenger, but the context is all WhatsApp. And so this matters for the more than 2 billion secure messaging users worldwide.
The update is crucial—both for what it does and for how it does it. According to Facebook, its new update is intended to enhance user security on Messenger—its other platform, the one without all the security bells and whistles. Once deployed, a new monitoring system will detect scams and protect minors.
But that’s not the news. Messenger isn’t encrypted, meaning that monitoring is easily achieved. The real news is that Facebook has designed a radically different approach, one that works on encrypted chats. Facebook says this is to prepare for Messenger’s imminent encryption—but in reality it has larger implications for WhatsApp, the world’s most popular encrypted messaging platform.
Facebook is leading the charge against the push by law enforcement agencies in the U.S., U.K. and elsewhere to weaken end-to-end encryption, to mandate backdoors so that investigators can access content. Before COVID-19 took over the world, this was a major news story, and it seemed inevitable that some form of mandated compromise would be forthcoming, that security would be weakened.
A year’s worth of cajoling back and forth has ultimately resulted in the EARN IT bill wending its way through the U.S. system, a bill that, if passed, would make messaging services legally responsible for the content on their platforms. While it does not mandate backdoors per se, the argument runs that without some form of probe into message content, the punitive risks become unsurvivable.
As Sophos explained, “there’s a bill tiptoeing through the U.S. Congress that could inflict the backdoor virus that law enforcement agencies have been trying to inflict on encryption for years… The choice for tech companies comes down to weakening their own encryption and endangering the privacy and security of all their users, or foregoing protections and potentially facing liability in a wave of lawsuits.”
The debate is complex, and the two sides don’t line up perfectly. This was illustrated just a few weeks ago, when the U.S. National Security Agency awarded top marks to encrypted platforms in an advisory on the safest messaging services.
The encryption debate is just one part of the wave of legislation and regulation in the last year, intended to clean up social media, to remove hatred and extremism, to make these spaces safer for young and old. And so Facebook’s latest move, to help users avoid scams, to spot imposters and, critically, to protect minors from adult abuse, can be seen in this general light. But it’s more complex than that.
In a blog post on May 21, Facebook’s Director of Product Management for Messenger Privacy and Safety, Jay Sullivan, set out these Messenger enhancements. Some have been available on Android for a couple of months; iOS users will start to see them from next week. He explained that the new safety features “will help millions of people avoid potentially harmful interactions and possible scams without compromising their privacy.”
The measures themselves all seem supremely sensible. By analyzing the metadata around chats, who is messaging whom, when and how often, the platform can warn a user when a message might be a scam, or when a sender might be impersonating someone known to the recipient. The same analysis can flag adults messaging children in patterns that look inappropriate.
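To illustrate the kind of signal involved, here is a minimal sketch of a metadata-only check. The field names and thresholds are hypothetical, invented for illustration; they are not Facebook’s actual model, which is machine-learned rather than rule-based:

```python
from dataclasses import dataclass

@dataclass
class ChatMetadata:
    """Metadata-only view of a conversation -- no message content."""
    sender_account_age_days: int
    sender_is_adult: bool
    recipient_is_minor: bool
    mutual_contacts: int
    messages_last_24h: int

def safety_flags(chat: ChatMetadata) -> list[str]:
    """Return hypothetical warning flags derived purely from metadata."""
    flags = []
    # A brand-new account sending a burst of messages fits a scam pattern.
    if chat.sender_account_age_days < 7 and chat.messages_last_24h > 20:
        flags.append("possible_scam")
    # No shared contacts suggests the sender may be an imposter.
    if chat.mutual_contacts == 0 and chat.messages_last_24h > 5:
        flags.append("possible_imposter")
    # An adult contacting a minor with no social overlap is flagged for review.
    if chat.sender_is_adult and chat.recipient_is_minor and chat.mutual_contacts == 0:
        flags.append("adult_to_minor")
    return flags
```

The point of the sketch is what is absent: at no stage does the check read a word of the conversation itself.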
Messenger is not end-to-end encrypted. While Facebook CEO Mark Zuckerberg has said that this is on the roadmap, it has been delayed. Meanwhile, law enforcement agencies and politicians have urged Facebook to withdraw those encryption plans, for fear that this will play into the hands of terrorists, criminals, traffickers and child-abusers hiding their activities from investigations and monitoring activities.
“As we move to end-to-end encryption,” Facebook says, “we are investing in privacy-preserving tools like this to keep people safe without accessing message content.” And that’s the key. Facebook isn’t saying as much, but it’s not difficult to do the math here. “We developed these safety tips with machine learning that looks at behavioral signals,” the tech giant explains. “This ensures that the new features will be available and effective when Messenger is end-to-end encrypted.”
Metadata is a powerful surveillance source in itself. Security agencies collect and mine this data to identify patterns and warning flags. Who talks to whom, when and how often? Who else do they know? Complex relationship networks are drawn from these contact points, identifying criminal, terrorist and pedophile networks, all without accessing the content itself.
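That network analysis can be sketched in a few lines. Assuming nothing more than (sender, recipient) contact records, an undirected contact graph and its connected clusters fall out directly, again with no content required:

```python
from collections import defaultdict

def build_contact_graph(records):
    """Build an undirected contact graph from (sender, recipient) metadata."""
    graph = defaultdict(set)
    for sender, recipient in records:
        graph[sender].add(recipient)
        graph[recipient].add(sender)
    return graph

def contact_networks(records):
    """Return connected components: clusters of accounts in contact."""
    graph = build_contact_graph(records)
    seen, networks = set(), []
    for start in graph:
        if start in seen:
            continue
        # Depth-first traversal to collect everyone reachable from `start`.
        stack, component = [start], set()
        while stack:
            node = stack.pop()
            if node in component:
                continue
            component.add(node)
            stack.extend(graph[node] - component)
        seen |= component
        networks.append(component)
    return networks
```

Real agency tooling is vastly more sophisticated, weighting edges by frequency and timing, but the principle is the same: the shape of who-knows-whom is itself the intelligence.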
Facebook, it seems, is doing several things here. First, it is providing a potential defense against EARN-IT, providing a monitoring system that can adhere to the proposed regulations without breaking message security. Second, it is offering a potential concession to law enforcement. If there is a new collection process for metadata, that could, in theory, be a win for law enforcement without breaking widespread security. And, third, it is sending a clear message that expanding end-to-end encryption is still the plan. The platform’s strategy has not changed.
The company emphasized this last point in its announcement. “We designed this safety feature to work with full encryption,” adding that “people should be able to communicate securely and privately with friends and loved ones without anyone listening to or monitoring their conversations.” Its intent could not be clearer.
More than two billion users exchange an unimaginable volume of end-to-end encrypted messages on WhatsApp each day. And unless an endpoint (a phone) is compromised, or those chats are backed up to accessible cloud platforms, neither owner Facebook nor law enforcement holds the keys to decrypt them.
How effective metadata monitoring will be in establishing malicious behavior—beyond just the superficial—we have yet to find out. But it’s a start. “As Messenger becomes end-to-end encrypted by default,” Facebook says, “we will continue to build innovative features that deliver on safety while leading on privacy.”
The implied message is clear: WhatsApp encryption is here to stay, and that stance will not change. Apart from anything else, encryption has become one of the messaging platform’s USPs; it cannot afford to see its differentiator watered down. As Ian Thornton-Trump CD, CISO for Cyjax, puts it, “having to back-door WhatsApp would kill the app big time, and everyone would move to Signal, Telegram and Wire in record time.”
Of course, with Facebook being Facebook, there is another, more commercial outlet for this type of metadata analysis. If the platform knows who you are, and knows what you do based on its multi-faceted internet tracking tools, then knowing who you talk to and when could be a commercial goldmine. Person A just purchased Object 1 and then chatted to Person B. Try to sell Object 1 to Person B. All of which can be done without any messaging content being accessed.
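The commercial logic is simple enough to sketch. Assuming hypothetical purchase and chat logs keyed by user ID (this is an illustration of the inference, not any actual Facebook system), a pure metadata join yields ad targets without reading a single message:

```python
def suggest_targets(purchases, chats):
    """purchases: list of (buyer, item); chats: list of (user_a, user_b) contacts.

    Suggest each purchased item to the buyer's chat contacts --
    a metadata-only inference, with no message content involved.
    """
    contacts = {}
    for a, b in chats:
        contacts.setdefault(a, set()).add(b)
        contacts.setdefault(b, set()).add(a)
    suggestions = {}
    for buyer, item in purchases:
        for contact in contacts.get(buyer, set()):
            suggestions.setdefault(contact, set()).add(item)
    return suggestions
```

Person A buys a camera and chats to Person B; Person B sees camera ads. The conversation itself stays sealed.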
There is a major difference between metadata analysis on Messenger and on WhatsApp, of course. Facebook holds far more data about its own users than it can see when you use WhatsApp. Unless, of course, there is a unique identifier that can tie the two together, like a phone number, for example.
“Facebook is in a tight spot,” Thornton-Trump says, “and it’s really important for folks to understand the truth about the security of the platform and the use of the data it collects.” It’s a fair point—users will not want to see data mining expanding across their WhatsApp metadata. But if that’s the price to maintain encryption, one can assume it will be a relatively easy sell for most users.
Putting platform commercialization to one side, the two take-aways—that Facebook has confirmed Messenger end-to-end encryption is still on the cards and that it is working on security measures that work in tandem with encryption—are both positives for the billions relying on that security.
Security agency frustration at the lack of lawful interception for encrypted messaging is understandable, but the problem with global over-the-top platforms is that once those weaknesses are inbuilt, they become potentially available to bad actors as well as good. And at these times of heightened risks, that would be worrying.
And so for security-conscious WhatsApp users out there, as well as everyone using other end-to-end encrypted platforms such as Signal and Wickr, if metadata analysis can be established as an alternative to accessing and analyzing message content, this will have been a critical move by Facebook, in a quiet sort of way.