When insurrectionists stormed the US Capitol on January 6, 2021, in support of former president Donald Trump, Meta had to confront the reality that world leaders—even those in democratic nations—might use its platform to call for violence against their own citizens and political rivals. The company had long kept a policy that forbade users from inciting attacks, but political figures had typically been given greater leeway, on the basis that a tech platform couldn’t be seen to interfere in political discourse.
But, facing immense pressure, Meta did, finally, suspend Trump’s Facebook account on January 7, and later announced a new policy on incitement to violence by public figures on its platforms, which threatens bans of between six months and two years for violations.
That policy was tested this January in Cambodia, when the autocratic prime minister Hun Sen went on a livestream to threaten opponents of his Cambodian People’s Party, saying he would “gather CPP people to protest and beat you up,” and send “gangsters” to their houses. Hun Sen, who has 14 million followers on Facebook, has a credible history of violence and intimidation against activists and political opponents. Though the video was reported for violating the company’s policies against hate speech and incitement, Meta left the video up, arguing that Hun Sen, as a world leader, made the video newsworthy. In June, Facebook’s independent Oversight Board, which makes judgments on select content moderation decisions, said that the platform had made a mistake, requiring it to remove the video and recommending that Hun Sen’s Facebook page and Instagram account be suspended for six months. The company took down the video, but said it would conduct a review of the board’s recommendation to suspend Hun Sen’s accounts.
Meta usually implements the Oversight Board’s recommendations. But this time, the company seems to be hesitating. It has until August 28 to review the board’s recommendation and take action—or not—on Hun Sen’s Facebook page. The company now faces a decision that is fraught with risk, with consequences for itself and for users in Cambodia, and which could set an important precedent ahead of 2024, when more than 30 countries—including the US, India, Indonesia, Mexico, and the UK—are due to hold elections. How Meta deals with Hun Sen’s account could establish the standard to which world leaders are held on its platforms, which are used by billions of people around the world, and test the legitimacy of the Oversight Board.
“The difficulty is going to be where you draw that line of when a world leader demonstrated enough that they should be kicked off the platform,” says Katie Harbath, a fellow at the Integrity Institute who oversaw Facebook’s elections operations until 2021. “No platform has taken action in the middle of an election in terms of taking down a world leader. What does this look like in the case of India? Or Indonesia, or Mexico, all of these countries with elections next year?”
Just hours after the Oversight Board’s decision was released on June 29, Hun Sen announced on Telegram, where his channel has just over 1 million followers, that he would be leaving Facebook, and took down his page. The Cambodian government issued a ban on all of the Oversight Board’s members, and Hun Sen threatened to shut off access to Facebook in the country. But the departure didn’t last long. On July 20, three days before Cambodia held national elections, the page returned, managed by one of Hun Sen’s advisers, Duong Dara.
The speed of the prime minister’s about-turn, and his unwillingness to follow through on his threats against Facebook, show how important the platform is in Cambodia.
An Oversight Board member, speaking to WIRED on condition of anonymity, says the board did take into consideration the possibility of a countrywide ban of Meta’s platforms when formulating its decision. “It was our judgment that it was more important to keep the regime from being able to use the platform to threaten the political opposition, even at the risk that Facebook might be shut down now,” they say. “The point of a decision of this sort is not just to take down one post, but to try to nudge the company in the direction of more consistent enforcement of its already existing rules having to do with abuse of the platform.”
It’s not clear that suspending Hun Sen would have a material impact on his ability to reach supporters.
An analysis of mentions of Donald Trump after his suspension from Facebook and Twitter found that conversations around him did indeed decrease, but researchers were unable to tell whether that was due in part to a simultaneous crackdown on many of the far-right groups that supported him.
Hun Sen’s supporters and other party members will almost certainly remain active on the platform. He uses Telegram, and has a popular TikTok account—although neither has as many followers as his Facebook page. Instead of an outright suspension, Access Now’s Benjamin says the company could take some half measures, such as removing the ability to share content from Hun Sen’s page or deprioritizing its reach. In its current policy, Meta says that pages that violate its community guidelines may “be removed from recommendations and have their distribution reduced.”
But Piseth Duch, a Cambodian human rights lawyer and legal analyst, says that Meta should remain consistent in its policies, no matter the risks. “I believe that they should strictly follow their principles, regardless of every country’s leader,” he says.
Even if, as some have argued, the threat of being suspended from the platform might have been enough to convince Hun Sen to stay within Facebook’s community guidelines since January, the lack of consequences for breaking the rules means that others will inevitably test the boundaries in the future. If Meta doesn’t impose consequences, “other people will continue to use Facebook to incite violence or hate speech,” Duch says.
Meta’s decision could have consequences for the Oversight Board, which is a core part of the company’s governance. The body was set up in 2018 to act as a sort of independent judiciary for Meta platforms, particularly around issues of content moderation. Meta funds it via an irrevocable trust, but does not have any say in its decisions.
The board can issue binding decisions as well as non-binding recommendations. Its first decisions were released in January 2021, and since then it has submitted 191 recommendations to Meta, some of which have forced the platform to reevaluate its internal policies. Many of them dealt with thorny topics, such as whether content from a news outlet covering Afghanistan’s Taliban government (considered a “dangerous organization”) could stay up (it could), or whether a Croatian cartoon implying ethnic Serbians were “rats” violated Meta’s hate speech rules (it did). In the case of Hun Sen’s account, the board issued a binding decision that the offending video be removed, and recommended a suspension of the prime minister’s account.
Access Now’s Benjamin says that not following the board’s recommendation could throw into question how the company will approach issues of violent and hateful speech moving forward—particularly with how it handles the context in which a post is made. “If Meta does not comply with the recommendation of the Oversight Board, it also speaks so much about its sincerity and commitment to its own policies against violent and harmful content,” she says. “If they disobey this recommendation, we are forced to go back to the drawing board of looking at freedom of expression on a per-post basis.”
The Oversight Board member says that the body isn’t issuing any one-size-fits-all recommendations for how Meta should approach future elections. “Every election is going to be unique, the timing is going to be different, the nature of the issues that come up are different,” they say. But the board has recommended that Meta should adopt a new system so that when heads of state attempt to incite violence on its platforms, there is a rapid escalation to try to limit the harm it causes.
But what seems like a question relegated to a smaller market could have massive repercussions in other countries. In India, Facebook’s largest market, the far-right, Hindu-nationalist ruling Bharatiya Janata Party (BJP) has already banned TikTok and accused Meta of censoring nationalist users, even as reporting from The Wall Street Journal found that the company routinely let hateful and violent content by BJP members stay up on Facebook. Last year, the country announced the creation of a Grievance Appellate Committee meant to oversee moderation decisions made by large tech companies. The government also requires foreign tech companies to have in-country representatives who can be held legally responsible for company decisions, and has set up a state-run fact-checking arm that can flag content it determines to be misleading about the government—determinations that companies, and even internet service providers, must comply with. After the US, India has submitted the most requests to have content taken off Meta’s platforms.
If a situation similar to Cambodia’s were to play out in India’s elections, Meta might face not only a possible ban, cutting off hundreds of millions of users, but also the arrest of its in-country staff.
And possibly nowhere else is this question more alive than in the United States, where Donald Trump, the person for whom Meta’s policy was originally written, has already begun to campaign for the presidency. His suspension from Facebook ended at the beginning of 2023, but “guardrails” remain in place: if he violates the company’s policies again, he will be suspended for between one month and two years. How that will play out in the midst of an election is anyone’s guess.
The Integrity Institute’s Harbath says that the Cambodia decision could have afforded the Oversight Board the opportunity to help prepare Meta for these upcoming elections.
“I think that the Oversight Board put Facebook in a bit of a difficult spot with this one because they didn’t really adequately address what this will look like in a place like India or a fairly free country ahead of an election,” she says. “I feel like they kind of kicked the can down the road.”