In January 2021, after pro-Trump rioters stormed the U.S. Capitol, Mark Zuckerberg announced a new priority for Meta: He wanted to reduce the amount of political content on the company’s apps, including Facebook and Instagram.
As the United States hurtles toward November’s election, Mr. Zuckerberg’s plan appears to be working.
On Facebook, Instagram and Threads, political content is less heavily featured. App settings now de-emphasize, by default, the posts users see about campaigns and candidates. And political misinformation is harder to find on the platforms after Meta removed transparency tools that journalists and researchers used to monitor the sites.
Inside Meta, Mr. Zuckerberg, 40, no longer meets weekly with the heads of election security as he once did, according to four employees. He has reduced the number of full-time employees working on the issue and disbanded the election integrity team, these employees said, though the company says the election integrity workers were integrated into other teams. He has also decided not to have a “war room,” which Meta previously used to prepare for elections.
Last month, Mr. Zuckerberg sent a letter to the House Judiciary Committee laying out how he wanted to distance himself and his company from politics. The goal, he said, was to be “neutral” and to not “even appear to be playing a role.”
“It’s quite the pendulum swing because a decade ago, everyone at Facebook was desperate to be the face of elections,” said Katie Harbath, chief executive of Anchor Change, a tech consulting firm, who previously worked at Facebook.
The result is that the near-constant barrage of headlines that Meta faced in past U.S. elections about its role in political discourse has largely abated. The company is instead recommending more content about sports, cooking, animals and celebrity gossip to its users.
Online political conversations this election cycle instead appear to be taking place more prominently on other platforms, such as TikTok and Elon Musk’s X. The campaigns of Vice President Kamala Harris and former President Donald J. Trump have turned to niche TikTok creators to humanize the candidates and reach young voters. Mr. Musk posts almost daily on X about Mr. Trump, whom he has endorsed for president. Even Zoom, the videoconferencing app, has become a major gathering ground for grass-roots political organizing.
Yet political posts, images and videos have not disappeared from Meta’s apps — they are just less visible. Anyone searching Facebook or Instagram for political groups or posts can quickly find them. Misinformation and conspiracy theories still spread in private groups on Facebook and WhatsApp, and the company continues to regularly remove disinformation campaigns from Iran, China, Russia and other countries. During a debate with Ms. Harris in September, Mr. Trump spread false stories that Haitian immigrants in Springfield, Ohio, were eating pets — a claim that started with a now-debunked post on Facebook. The posts have been shared millions of times.
Emerson Brooking, a resident senior fellow at the Atlantic Council’s Digital Forensic Research Lab, a think tank that studies disinformation online, called Meta’s shift on politics “a stunning retreat.” But while it is harder to find political posts, he said, that has not solved all of Meta’s content problems, since “‘explicit’ content — violence, sex, drugs — is available by default,” meaning that users do not have to opt in to see that type of content as they do with politics. Meta has said it continues to guard against other violations of its policies with its safety and security teams.
Meta affirmed that it was ratcheting down political posts in public feeds and said engagement on its apps remained robust because people wanted to see less — not more — political content. Its number of users has grown, and revenue continues to spike as the company improves ad targeting using artificial intelligence.
“As we’ve said for years, people have told us they want to see less politics overall while still being able to engage with political content on our platforms if they want to, and that’s exactly what we’ve been doing,” said Dani Lever, a Meta spokeswoman.
But Meta is not doing less to combat misinformation, she said. The company has 40,000 employees working on safety and security, has invested $20 billion in those areas since 2016 and does not have fewer people working on elections, she said. This month, Meta barred the Russian state media outlets Rossiya Segodnya and RT and affiliated companies from posting on the company’s apps, citing “foreign interference activity” and disinformation campaigns carried out against other countries.
“No tech company does more to protect its platforms,” Ms. Lever said, adding, “We have hundreds of people focused on elections work.”
Meta’s distancing from politics had been a long time coming. Mr. Zuckerberg spent 2017 through 2021 engaging — in fits and starts — with lawmakers after his company was blamed for spreading Russian disinformation in the 2016 election.
Mr. Zuckerberg spent years on an apology tour, including appearing before Congress. But he grew frustrated with how little progress the company made defending itself, four other current and former Meta employees with knowledge of internal deliberations said. Meta was also caught in partisan politics, with Democrats blasting Mr. Zuckerberg for not doing enough to rein in problematic speech across his apps, while Republicans insisted the company was doing too much to curtail it.
Before the 2020 election, Mr. Zuckerberg told lieutenants there was no greater priority than securing the elections, an executive close to him said. He had a weekly meeting with top executives to discuss the issue and directed hundreds of employees to work on election integrity, including quashing conspiracy theories and misinformation around the vote.
Even so, political conspiracy theories ran wild across Facebook and Instagram. Some were spread by Mr. Trump, who after the 2020 election amplified the false idea that it had been stolen from him. Hundreds of thousands of people were drawn to “Stop the Steal” Facebook groups, which spread the falsehood that Mr. Trump had won the election.
When the Jan. 6 riot broke out, Meta was blamed for spreading election misinformation. Two weeks later, Mr. Zuckerberg told investors that the company was “considering steps” to reduce political content across Facebook.
“People don’t want politics and fighting to take over their experience on our services,” he said.
By the 2022 midterm elections, Meta had restructured its election teams and reduced the number of people working on elections, four current and former employees said. It also tested reducing the amount of political content people saw on Facebook in some markets, beginning in Brazil and later in other countries.
That year and into 2023, Mr. Zuckerberg cut roughly a third of Meta’s overall work force. Members of the integrity teams were among the first to go.
Last year, the company introduced Threads, a social platform widely seen as a competitor to X. Adam Mosseri, who oversees Threads and Instagram, soon posted that both platforms would “avoid making recommendations that could be about politics or political issues.”
The apps no longer recommend political content to users from accounts they do not follow, unless the user specifically switches on a setting buried in a menu of the Instagram app. The setting is switched off by default.
Mr. Zuckerberg drove these decisions, current and former Meta executives said. His personal posts on Threads and Instagram have tilted toward featuring him talking about technology while sporting a new look, complete with gold chains and a revamped wardrobe. He also posts about newer passions, like participating in martial arts or fencing. There is little to no political content.
With Mr. Zuckerberg’s weekly meetings gone, election issues have largely been delegated to Nick Clegg, Meta’s president of global affairs, and Guy Rosen, the chief information security officer. Both have taken over the day-to-day work of overseeing election security, two executives at the company said.
Mr. Zuckerberg has focused almost all of his public speaking moments on artificial intelligence, the metaverse and open source technologies.
It is unclear whether this strategy will last. After Mr. Zuckerberg’s letter to Congress last month, Meta faced some criticism from both parties for its role in political speech. And as the Harris and Trump campaigns lean heavily on social media to attract voters, Meta may find the subject of politics unavoidable.
“I think what they’re finding is that they can run, but they can’t hide from it,” Ms. Harbath, the former Facebook employee, said. “So they need to have a plan for how they’re going to deal with it.”