Facebook will change one of its most controversial rules, following recommendations from the Oversight Board. The social network announced that it would walk back a longstanding policy that allowed politicians to circumvent some of its rules under the guise of “newsworthiness.” Facebook announced the change at the same time it revealed that Donald Trump could be back on Facebook in 2023.
“When we assess content for newsworthiness, we will not treat content posted by politicians any differently from content posted by anyone else,” the company’s VP of Global Affairs, Nick Clegg, wrote in a blog post.
However, other rules that give elected officials special treatment, including an exemption from fact-checking, will remain in place. The changes come in response to a series of policy recommendations from the Oversight Board that accompanied its non-decision on the Trump suspension. Under the rules governing the board, Facebook must respond to its policy recommendations, but isn't required to implement the changes it suggests.
Facebook says it is changing its much-maligned “newsworthiness exemption,” which allows it to leave up content that would otherwise break its rules when it believes there is “public interest value” in doing so. Previously, Clegg had said that the company considered speech from politicians to be newsworthy and therefore something that should “be seen and heard,” even if a post might otherwise break its rules.
Now, the company says it will no longer automatically presume that posts from politicians are newsworthy, meaning that rule-breaking posts will get more scrutiny than in the past (previously, Facebook would only take down posts for breaking specific policies, like voter interference or inciting violence). With the change, reported Thursday by The New York Times, Facebook can still decide to apply a “newsworthiness allowance” to specific posts, but the company will “not presume that any person’s speech is inherently newsworthy, including by politicians.” Facebook also said it will clearly explain when it does exempt a post for newsworthiness, a policy Mark Zuckerberg has previously endorsed.
Clegg also said that Facebook would take steps to make its “strike” system clearer to users, a change first reported by The Verge. Like other social media platforms, Facebook uses “strikes” to help determine how accounts that break its rules are punished. But until now the company hasn’t notified users when they receive a strike, which means people can have their accounts suspended seemingly without prior warning. High-ranking company officials have also reportedly overruled fact-checkers in order to remove strikes from prominent conservative pages.
The changes announced today could have a much bigger impact than just Trump. The company has grappled for years with whether to fact-check world leaders or apply its content policies to their posts. The latest changes could clarify how the social network approaches these types of cases, and give it more leeway to enforce its rules. At the same time, Facebook made it clear it still intends to keep some potential carve-outs for leaders, whose posts could still be exempted for “newsworthiness.”
The Oversight Board had taken issue with Facebook’s treatment of politicians, writing in its decision that “the same rules should apply to all users” and that Facebook should “address widespread confusion about how decisions relating to influential users are made.”
As part of its response, Facebook published new details on “heightened penalties for public figures during times of civil unrest and ongoing violence.” The policy allows for suspensions of up to two years initially, and heightened penalties or permanent removal for repeat violations.
In all, the company responded to 19 separate recommendations from the Oversight Board, saying that it was “committed to fully implementing” 15 of the suggestions. In addition to explaining how it applies newsworthiness and strikes, Facebook said it would implement a “cross-check” system in order to ensure that “high-visibility content” that could potentially break its rules is reviewed multiple times.
“These additional reviews are a supplemental safeguard to ensure we’re accurately taking action on potentially violating content that more people see,” Facebook wrote. “It also helps us verify that when content violates our policies, including from public figures or popular Pages, we consistently remove it.”
But on some issues raised by the board, Facebook — once again — didn’t commit to specific changes. For example, in response to a recommendation that Facebook “resist pressure from governments to silence their political opposition,” the company wrote that it “will look for additional ways to incorporate external feedback and hold ourselves more accountable.”
Notably, Facebook only committed to “implementing in part” a recommendation that the social network investigate its own role in enabling the events of Jan. 6. It pointed to “expanded research initiatives” on its handling of the 2020 election, but said that “ultimately, though, we believe that independent researchers and our democratically elected officials are best positioned to complete an objective review” of the Jan. 6 insurrection.
“The responsibility for January 6, 2021 lies with the insurrectionists and those who encouraged them, whose words and actions have no place on Facebook,” Facebook wrote. “We will continue to cooperate with law enforcement and any US government investigations related to the events on January 6. We also believe that an objective review of these events, including contributing societal and political factors, should be led by elected officials.”
In a brief statement, the Oversight Board said it was reviewing Facebook’s decisions and would weigh in on the company’s responses after it had time to evaluate them.