An independent group tasked with evaluating how Facebook handles online content on Thursday reversed the social media giant's decision to delete content in four out of five test cases that it had reviewed.
The cases dealt with hate speech, COVID-19 disinformation and other content — including a post quoting Nazi propagandist Joseph Goebbels — that may have broken the tech giant's digital rules.
Facebook's so-called Oversight Board, a collection of legal and human rights experts whose decisions are binding on how the company treats potentially divisive online content on its global platform, made its rulings as tensions continue over the role social media companies play in fomenting unrest online. The Oversight Board is also currently reviewing Facebook's decision to lock former U.S. President Donald Trump's account in the wake of the Capitol Hill riots early this month. An announcement in that case will be made by April.
"We think, after careful consideration, that first of all there were difficult cases, but we don't think [Facebook] got it right," Helle Thorning-Schmidt, the former Danish prime minister and co-chair of the Oversight Board, told POLITICO. "We're saying to Facebook that they need to be better at telling users why their content is getting removed."
Among the posts Facebook deleted was one showing women’s breasts. Another post was taken down for inciting hate speech against Muslims.
Together, the five cases announced Thursday lie at the heart of the difficult choices that Facebook and the Oversight Board must make to determine what constitutes legitimate free speech and what falls into the category of hate speech, misinformation or other harmful content that breaks the company’s content policies.
In its first round of decisions made public Thursday, the Oversight Board spent almost two months reviewing a series of Facebook posts the company had initially removed for breaking its content rules.
The group has the power to determine whether such deletions were justified or unfairly restricted people's freedom of speech, but the experts are not able to review Facebook posts that remain online.
That will change in the next couple of months, Thorning Schmidt added, and the group will be given the power to adjudicate on posts that Facebook has not removed.
The Board is run separately from the company, but its $130 million budget is provided by the tech giant. Online users or the company can ask the body to review cases, and more than 150,000 referrals have been submitted since October. On Friday, the group will announce its next round of cases, and will also allow people to submit public comments concerning its ongoing investigation into whether to reinstate Trump’s Facebook account.
In one ruling, the Board said that a post from a user in Myanmar appearing to criticize Muslims, which Facebook had removed for breaching its hate speech standards, should be reinstated: while the comments could be seen as offensive, they did not meet Facebook’s own definition of hate speech.
In another, the group said that a deleted Facebook post from France which criticized local officials’ failure to use hydroxychloroquine, an antimalarial drug, to treat COVID-19 — a debunked claim that remains widely popular across the country — should also be returned to the social media platform because it did not represent an imminent harm to people’s lives.
A third decision ordered Facebook to reinstate an Instagram post from Brazil that included women's nipples as part of a breast cancer awareness campaign, which the company’s automated content moderation system had initially removed for running afoul of the photo-sharing app’s nudity policy. Facebook eventually reposted the image on Instagram, but outside experts criticized the company for failing to have sufficient human oversight of such automated decisions.
And in the case involving Goebbels, the board determined that the post had not promoted Nazi propaganda, but had in fact criticized Nazi rule and therefore had not breached Facebook’s content rules.
“Everyone can see that these are not easy cases and it has been difficult to come to a final decision,” said Thorning-Schmidt, adding that not all of the rulings were backed universally by the group’s members. She said their recent deliberations had shown that the company relied heavily on automation to remove potentially harmful content.
“It seems to us that many of these decisions were decided by an algorithm,” she said. “Our advice to Facebook is that when they have these difficult decisions, they should have human oversight.”
Facebook confirmed that it had already reinstated the four pieces of content, but cautioned that it would continue to take a hard line against posts that promote falsehoods around COVID-19.
“Our current approach in removing misinformation is based on extensive consultation with leading scientists, including from the CDC and WHO,” Monika Bickert, Facebook’s vice president of content policy said in reference to the U.S. and international health organizations. "During a global pandemic this approach will not change."
The rulings may also apply more broadly to similar content across Facebook’s global platform.
The only case in which the Oversight Board agreed with Facebook’s decision involved a Russian-language post attacking Azerbaijanis, which the experts agreed had broken the company’s hate speech standards.
Despite the outside group’s willingness to overturn how Facebook handles potentially dubious posts across its platform, not everyone has welcomed the increased oversight.
Damian Collins, a British lawmaker and co-founder of The Real Oversight Board, a campaigning group critical of its namesake, said that the body’s inability to review Facebook’s wider content moderation policies and its failure to rule on potentially harmful posts that remained on the platform made its work mostly toothless.
“These types of decisions should not be left to Facebook,” he said. “The decision to remove content or not should be in the hands of a government or politically elected figures.”