Facebook revamps controversial content moderation process for VIPs | CNN Business
New York CNN — Facebook parent Meta announced a revamp of its “cross-check” moderation system on Friday, after facing criticism that the program gave VIPs special treatment by subjecting their posts to a different review process than those of regular users.
But Meta stopped short of adopting all of the recommended changes previously proposed by its own Oversight Board, including a suggestion to publicly identify which high-profile accounts qualify for the program.
The cross-check program came under fire in November 2021 after a Wall Street Journal report indicated that the system shielded some VIP users, including politicians, celebrities, journalists and Meta business partners such as advertisers, from the company’s normal content moderation process, in some cases allowing them to post rule-violating content without consequence.
As of 2020, the program had grown to include 5.8 million users, the Journal reported. Meta’s Oversight Board said following the report that Facebook had not provided it with crucial details about the system. At the time, Meta said that criticism of the system was fair, but that cross-check was created to improve the accuracy of moderation for content that “might require further understanding.”
The Oversight Board said in a December policy recommendation that the program appeared designed to “satisfy business concerns” and risked harming everyday users. The board, an entity funded by Meta but which it says operates independently, urged the company to “radically increase transparency” about the cross-check system and how it works.
On Friday, Meta said it would partially or fully implement many of the more than two dozen recommendations the Oversight Board made to improve the program.
Among the changes it has pledged to make, Meta says it will aim to distinguish between accounts included in the enhanced review program for business reasons and those included for human rights reasons, and detail those distinctions to the board and in the company’s Transparency Center. Meta will also refine its process for removing or temporarily hiding potentially harmful content pending further review. And the company said it would work to ensure cross-check content reviewers have the appropriate language and regional expertise “wherever possible.”
The company, however, declined to implement some recommendations, such as publicly labeling the pages of state actors, political candidates, business partners, media figures and other public figures included in the cross-check program. Meta said such public identifiers could make these accounts “potential targets for bad actors.”
“We are committed to maintaining transparency with the board and the public as we continue to execute on the commitments we are making,” Meta said in a policy statement about the cross-check program.
Meta’s Oversight Board did not immediately respond to a request for comment on Meta’s planned policy changes.