Facebook’s plan to put ‘privacy first’ could create new problems
For years, Mark Zuckerberg preached the value of people sharing more and more information online, with the mission of building what he often called a “more open and connected world.” For years, Facebook profited from that activity by collecting data and using it to lure advertisers.
Now, after a bruising year filled with a seemingly bottomless pit of scandals over misuse of the platform and Facebook’s data-privacy practices, Zuckerberg is changing his tune somewhat. The CEO unveiled plans in a post Wednesday to reposition Facebook as a “privacy-focused” platform.
Gone are the days of focusing first and foremost on building what Zuckerberg called a digital “town square.” Instead, he says Facebook is increasingly invested in building “the digital equivalent of the living room.” To get there, Zuckerberg says Facebook will emphasize private, encrypted and ephemeral conversations across its products.
And yet, to the extent Zuckerberg actually makes good on these pledges to put “privacy first” and prove skeptics wrong — “Frankly we don’t currently have a strong reputation for building privacy protective services,” he admitted — he may also open the door to a different set of concerns about the social networks he controls.
Privacy advocates and researchers say the strategy shift Zuckerberg laid out could effectively limit the accountability of Facebook and bad actors on its platforms by making it harder to track and police troubling content. For example, content ranging from fake news to false advertising could be distributed privately rather than publicly. What’s more, a move to integrate Instagram, WhatsApp and Messenger, as outlined in the post, could effectively cement Facebook’s dominance over the messaging market for years to come.
“I don’t think it’s purely altruistic,” said Ashkan Soltani, an independent privacy researcher who’s the former chief technologist of the Federal Trade Commission. “It allows the company to get out of some of its obligations of moderating or eliminating problematic content.”
Adam Preset, an analyst with Gartner, echoed the point. “Anything that can be done in public on the Facebook platform, or even very broadly to large networks of people, is subject to criticism,” he said. If more of the activity takes place in private, “that puts a lot less pressure on Facebook.”
This means the future of Facebook may look more like WhatsApp. Rather than users spending most of their time with a public-facing News Feed where posts can potentially reach millions, they can message each other in smaller groups protected by end-to-end encryption. It may be a boon for privacy, but it’s a proven breeding ground for viral hoaxes and fake news that spread in more intimate circles — sometimes with deadly consequences.
“It’s one thing to see a random link that is blatantly false being shared on a News Feed by someone you barely know at all. But it’s another thing entirely when someone you know sends you a blatantly false story or a deep fake video,” said Woodrow Hartzog, a law and computer science professor at Northeastern University School of Law. “You might actually trust it even more.”
Tracking misinformation in such an environment is more challenging, too.
“It’s going to be much harder to detect emerging problems in clusters of closed groups, particularly if they’re encrypted,” said Renée DiResta, who researches disinformation online as the head of policy at Data For Democracy. “As it is now, Facebook currently has some moderating power and visibility into Groups, but not WhatsApp.”
A Facebook spokesperson stressed that the company is still “in early stages here” and directed CNN Business back to Zuckerberg’s post. In it, the CEO admits that implementing encryption technology broadly could raise “real safety concerns” that need to be studied first.
“Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things,” he said. “When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion.”
He added that Facebook has a “responsibility to work with law enforcement” to prevent these issues whenever it can.
But the new strategy laid out by Zuckerberg may also give Facebook more wiggle room to avoid some of the blowback when it fails to crack down on inappropriate content, according to Alex Stamos, Facebook’s former chief security officer.
In a series of tweets posted after the news, Stamos said Facebook currently gets criticized “for both invading people’s privacy and not policing communications enough.” But this privacy-focused shift, he said, “is the judo move: in a world where everything is encrypted and doesn’t last long, entire classes of scandal are invisible to the media.”