Facebook's business model promotes hate speech, damages democracy and is even "tearing our societies apart," a former product manager of the company's civic misinformation team has said.
Frances Haugen, who left Facebook in May this year, was the whistleblower who leaked internal company research - including reports that reveal Instagram's impact on teenage girls' mental health - to the Wall Street Journal last month.
Speaking to American TV network CBS on Sunday, Haugen said that her experience working for Facebook revealed a company that prioritised growth over making its product safer.
"The thing I saw over and over again was that there were conflicts of interest between what was good for the public and what was good for Facebook," Haugen told CBS's 60 Minutes.
"Facebook over and over again chose to optimise for its own interests, like making more money," she said.
Haugen joined Facebook in June 2019, working on a team tasked with tackling misinformation around elections.
In the run-up to the United States presidential election in November 2020, Facebook announced a raft of measures it said would help connect voters to accurate information and reduce "the risks of post-election confusion".
Shortly after the election took place, however, the team was dissolved.
Haugen claimed Facebook then rolled back many of the interventions it had put in place to limit the spread of misinformation.
"As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritise growth over safety," Haugen said.
"And that really feels like a betrayal of democracy to me".
In Sunday's interview, Haugen drew a connection between Facebook and the US Capitol riots on January 6, when supporters of former president Donald Trump stormed the seat of the American legislature.
She said that she did not trust the company to "invest what actually needs to be invested to keep Facebook from being dangerous".
Speaking before Haugen's interview was broadcast, Facebook's vice president of policy and global affairs Nick Clegg told CNN that the company did not accept that it was one of the main contributors to political polarisation in the US.
"[Responsibility for] the insurrection on that day lies squarely with the people who inflicted the violence and those who encouraged them, including President Trump," Clegg said, adding that it was "ludicrous" to pin the blame on social media.
"I think it gives people false confidence to assume that there must be a technological or a technical explanation for the issues of political polarisation in the United States ... It's too easy to say it's Facebook's fault".
According to Facebook documents provided to CBS by Haugen, the company is aware both of the spread of hate speech on its platforms and of how challenging a problem it poses.
"We estimate that we may action as little as 3-5 per cent of hate and ~0.6 per cent of V&I [Violence and Incitement] on Facebook despite being the best in the world at it," one internal report said.
"We have evidence from a variety of sources that hate speech, divisive political speech, and misinformation on Facebook and the family of apps are affecting societies around the world," said another.
According to Haugen, fixing Facebook's hate speech problem could take more than heavier content moderation and stricter rules.
A 2018 change to the algorithm that decides which content users are shown inadvertently ended up serving them material that was more likely to trigger an angry response, Haugen told CBS.
"[Facebook] is optimising for content that gets engagement, a reaction," she said.
"But its own research is showing that content that is hateful, that is divisive, that is polarising - it's easier to inspire people to anger than it is to other emotions".
Haugen alleged that the company was unwilling to change the algorithm as it could impact the business's bottom line.
"Facebook has realised that if they change the algorithm to be safer, people will spend less time on the site, they'll click on less ads, they'll make less money," she said.
Despite the seriousness of the allegations she levelled at the company, Haugen said she empathised with Facebook's founder and CEO Mark Zuckerberg.
"No one at Facebook is malevolent, but the incentives are misaligned," she said.
"Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction, and the more anger they get exposed to, the more they interact and the more they consume".