A number of recent public reports have claimed that our moderation of content related to the war in Ukraine is biased. At the heart of these accusations is the claim that content reviewers working for Meta deliberately censor Ukrainian content while freely allowing pro-Russian propaganda.
These accusations are false, and there is no evidence to support them. Preventing bias and applying our policies accurately and consistently are fundamental to our approach and something we take seriously.
That’s why we have dedicated teams that collectively review content on Facebook and Instagram in more than 70 languages: we recognize that it often takes local expertise to understand the specific meaning of a word or the political climate in which a message is shared. For example, Ukrainian content that requires linguistic and cultural expertise is reviewed by Ukrainian reviewers, not Russian ones.
This is also why we have policies – called the Community Standards on Facebook and the Community Guidelines on Instagram – that are, by design, detailed and granular, so that two people looking at the same content reach the same decision. These policies are the only criteria review teams are allowed to use when assessing content. They are global and apply equally to everyone who uses our technologies, whether they are Ukrainian, Russian, American or anyone else. Each reviewer goes through a thorough onboarding process and regular training to ensure they properly understand our policies and how to apply them.
Finally, this is also why we assign content to reviewers at random, so they cannot choose which content they will receive for review. We also perform weekly audits across all review teams – again, on a random sample basis – which means no reviewer could escape this system. Our regular audits of the teams that review content in Ukrainian and Russian, as well as in other languages across Central and Eastern Europe, have consistently shown high levels of accuracy, in line with our results for other review teams and languages.
Of course, there will always be things we miss and things we remove by mistake. There is no “perfect” here, because both humans and technology make mistakes. Wars are also complex, fast-moving events in which the risk of errors is greater. That’s why we have teams working around the clock to correct mistakes, and channels for people to appeal decisions they disagree with.
False claims designed to undermine trust in public and private institutions are nothing new and are predictable in times of conflict. However, to suggest that our content moderation processes – and the people who review content for Meta – are deliberately biased in favor of anyone is simply wrong.
For more information on our ongoing efforts regarding Russia’s invasion of Ukraine, including how we address Russian propaganda, influence operations and disinformation, please read our Newsroom post on this topic.