This is not entirely true. The company privately built a system that allowed high-profile users to be exempt from some or all of its rules, according to documents reviewed by The Wall Street Journal.
The program is known as "cross-check" or "XCheck," and it was initially intended as a quality-control measure for enforcement actions taken against high-profile accounts, including those of celebrities, politicians and journalists.
It shields millions of users from the company's normal enforcement process: some are "whitelisted" -- rendered immune from enforcement actions entirely -- while others are allowed to post rule-violating material.
The documents also showed that XCheck protected public figures whose posts contained harassment or incitement to violence -- content that would typically lead to sanctions for an average user. In 2019, Facebook allowed soccer star Neymar to post nude photos of a woman who had accused him of rape. Whitelisted accounts also shared inflammatory claims that Facebook's fact-checkers had deemed false.
Despite attempts to minimize the number of exempted individuals, XCheck grew to at least 5.8 million users in 2020. In its struggle to moderate content and avoid negative attention, Facebook created invisible elite tiers in the social network.
The system misled the public and Facebook's own Oversight Board, which the company had created to hold its enforcement systems accountable.
Facebook told the Oversight Board in writing that its system for high-profile users was used in "a small number of decisions."
Facebook spokesman Andy Stone said that the criticism of XCheck was fair. However, he added that the system was designed to create an additional layer of review so the company could accurately enforce its policies on content that might require more context to evaluate.
He also said that Facebook has been accurate in its communications to the board and that the company is working on phasing out the practice of whitelisting.
"A lot of this internal material is outdated information stitched together to create a narrative that glosses over the most important point: Facebook itself identified the issues with cross-check and has been working to address them," he said.
The documents describing XCheck are part of an extensive array of internal communications showing that Facebook knows its platforms are riddled with flaws that cause harm -- and that the company lacks the will or ability to address them.
Facebook's own researchers have identified the platform's negative effects in areas such as teen mental health, political discourse and human trafficking. Yet the company repeatedly held back from addressing them for fear of hurting its business, and some of the changes it did make backfired.
These documents offer perhaps the clearest picture yet of how widely the problems are known inside the company. When Facebook speaks publicly about these issues, however, it often provides misleading or partial answers that mask how much it knows.
In 2018, Zuckerberg estimated that Facebook gets 10 percent of its content-removal decisions wrong. Depending on the enforcement action taken, users might not be told what rule they violated or whether they will be given a chance to appeal.
Employees said that the whitelisting practice was at odds with the company's values. Kaushik Iyer, who was then a lead engineer for Facebook's civic integrity team, said in a 2020 memo that the company should separate content policy from public policy.
Read more at TechGiants.news.