
To treat users fairly, Facebook must commit to transparency


September 2021

Last week, new information emerged on Facebook’s ‘cross-check’ system, which the company uses to review content decisions relating to some high-profile users. This information came to light through the reporting of the Wall Street Journal, and we are grateful for the efforts of journalists who have shed greater light on issues that are relevant to the Board’s mission. These disclosures have drawn renewed attention to the seemingly inconsistent way that the company makes decisions, and to why greater transparency and independent oversight of Facebook matter so much for users.

In light of recent developments, we are looking into whether Facebook has been fully forthcoming in its responses in relation to cross-check, including the practice of whitelisting. The Board has reached out to Facebook to request further clarity about the information previously shared with us. We expect to receive a briefing from Facebook in the coming days and will report on what we learn as part of our first quarterly transparency reports, which we will publish in October. In addition to providing new information on the types of appeals the Board is receiving, these reports will analyze the Board’s decisions related to cross-check and Facebook’s responses on this topic.

We are also looking at how the Board can further explore policy issues related to cross-check, which may lead to further recommendations in this area.

At the Oversight Board, we have been asking questions about cross-check for some time. In our decision concerning former US President Donald Trump’s accounts, we warned that a lack of clear public information on cross-check and Facebook’s ‘newsworthiness exception’ could contribute to perceptions that Facebook is unduly influenced by political and commercial considerations.

To address this, we asked Facebook to explain how its cross-check system works and urged the company to share the criteria for adding pages and accounts to cross-check, as well as to report on the relative error rates of determinations made through cross-check compared with its ordinary enforcement procedures. In its response, Facebook provided an explanation of cross-check but did not elaborate on the criteria for adding pages and accounts to the system, and declined to provide reporting on error rates.

It is crucial that we continue to ask questions on cross-check, and publish the answers for people to see. Transparency is essential for social media platforms. The choices made by companies like Facebook have real-world consequences for the freedom of expression and human rights of billions of people across the world. By having clear rules and enforcing them consistently, platforms can give users the confidence that they’ll be treated fairly. Ultimately, that benefits everyone.

Transparency matters

Since we published our first decisions in January, we have been shining a light on Facebook’s opaque rules and steering the company towards greater transparency. In less than a year, we’ve received more than 500,000 requests from users to examine Facebook’s content moderation decisions. We’ve taken on 20 important cases so far, covering issues from hate speech to COVID-19 misinformation. And we have issued 15 decisions, overturning Facebook 11 times.

Just last week, the Oversight Board published its latest decision, reversing Facebook’s original decision to remove a post from a Facebook user in Egypt that shared news from a verified Al Jazeera page about violence in Israel and the Occupied Palestinian territories. Facebook only acknowledged its error and restored the content after the Board selected the case for review. In many other instances, Facebook has reversed its original decision upon learning that the Board was considering review.

We’ve also made around 70 recommendations to improve Facebook’s policies, with many focusing on transparency.

Many users can relate to the experience of having their content removed with little explanation of what they did wrong. The Board is deeply concerned about the impact on users and the implications for freedom of expression, which is why in three of our first five decisions, we made the same recommendation to Facebook: tell users whose content you remove not just which broad ‘Community Standard’ they violated, but the specific rule they supposedly broke.

Treating users fairly also means making your rules readily available to users around the world. In another recommendation, we called on Facebook to translate its Community Standards into Punjabi, a language spoken by around 130 million people. Facebook agreed to do so by the end of this year.

Adopting another of our recommendations, Facebook agreed to provide information on content removed for violating its Community Standards following a formal report by a government, including the number of requests it receives. This will increase transparency about how governments pressure Facebook, and how Facebook responds to that pressure. In last week’s decision, we went further and called on Facebook to formalize a transparent process on how it receives and responds to all government requests for content removal.

We know that the power of our recommendations lies not in Facebook’s initial responses, but in the action the company takes. To measure this, we recently set up a team to assess how Facebook implements our recommendations, with the goal of ensuring that Facebook delivers on its commitments over time. We will publicly report on how Facebook has implemented each of our decisions and recommendations and hold the company to account for its actions.

Steering Facebook towards greater transparency will be a collective effort. Journalists, academics and civil society all have an essential role to play in holding Facebook accountable. By providing crucial independent oversight, we are proud to be part of this. Over time, we believe that the precedents of our decisions and the cumulative impact of our recommendations will help make Facebook’s approach more consistent, more transparent, and better for users across the globe. That is a goal worth pursuing.

Catalina Botero-Marino, Jamal Greene, Michael McConnell, Helle Thorning-Schmidt

Co-Chairs of the Oversight Board
