Meta Platforms' Oversight Board recommended on Tuesday that the company revamp its system exempting high-profile users from its rules, saying the practice privileged the powerful and allowed business interests to influence content decisions.
The arrangement, called cross-check, adds a layer of enforcement review for millions of Facebook and Instagram accounts belonging to celebrities, politicians and other influential users, allowing them extra leeway to post content that violates the company's policies.
Cross-check "prioritizes users of commercial value to Meta and as structured does not meet Meta's human rights responsibilities and company values," Oversight Board director Thomas Hughes said in a statement announcing the decision.
The board had been reviewing the cross-check program since last year, when whistleblower Frances Haugen exposed the extent of the system by leaking internal company documents to the Wall Street Journal.
Those documents revealed that the program was both larger and more forgiving of influential users than Meta had previously told the Oversight Board, which is funded by the company through a trust and operates independently.
Without controls on eligibility or governance, cross-check sprawled to include nearly anyone with a substantial online following, although even with millions of members it represents a tiny slice of Meta's 3.7 billion total users.
In 2019, the system blocked the company's moderators from removing nude photos of a woman posted by Brazilian soccer star Neymar, even though the post violated Meta's rules against "nonconsensual intimate imagery," according to the WSJ report.
At the time of the report, the board rebuked Meta for not being "fully forthcoming" in its disclosures about cross-check.
In the opinion it issued on Tuesday, the board said it agreed that Meta needed mechanisms to address enforcement mistakes, given the extraordinary volume of user-generated content the company moderates each day.
However, it added, Meta "has a responsibility to address these larger problems in ways that benefit all users and not just a select few."
It made 32 recommendations that it said would structure the program more equitably, including transparency requirements, audits of the system's impact and a more systematic approach to eligibility.
State actors, it said, should continue to be eligible for inclusion in the program, but based only on publicly available criteria, with no other special preferences.
The Oversight Board's policy recommendations are not binding, but Meta is required to respond to them, normally within 60 days.
A spokeswoman for the Oversight Board said the company had asked for and received an extension in this case, so it would have 90 days to respond.