
Social Media Policy Review
Localization Lab’s ongoing work on social media platform governance and accountability focuses on the quality of Facebook and YouTube’s content moderation policies in translation. Our initial study, developed in partnership with Internews, produced a first-of-its-kind quality review of those policies in Arabic, Amharic, Bengali, and Hindi. The report was co-developed with communities impacted by issues of platform accountability and demonstrates the essential role of language in content moderation.
What We’ve Done
The initial research phase found systematic errors that left the translations far below the quality an average reader would consider acceptable. The translations were at times unintelligible or misstated the policies outright, producing conflicting policies across languages and regions. The limited utility of the translated policies significantly impacts communities’ ability to report harmful content and limits content moderators’ ability to detect such content.
Each of the translations showed systematic errors that impaired readers’ ability to understand the policy without referencing the original English version. These included issues of cultural sensitivity, recurring errors in numbering, and mistranslations that carried meanings different from the original. In Arabic, for example, Facebook mistranslated the prohibition of “calls to violence” as “phone calls to violence”. Likewise, Facebook’s Amharic translation included a xenophobic slur. Similar problems were found in the YouTube policy translations. The study also found that the policies were explained using examples relevant to American audiences but lacked examples relevant to the needs of local users in the Global Majority. This contributed to significant gaps and conflicting interpretations of what the policies can and do cover in the Global Majority.
Our report presents these findings in detail and discusses how translation impacts the entire value chain of content moderation and why localization is central to effective platform governance. That chain spans from end users’ ability to report content, to moderators’ ability to detect harmful content, to regulators’ knowledge of the content that exists on the platform. The report includes recommendations for improving translation policies and processes, as well as for advancing best practices in translation across the industry.
What We’re Doing Next
We are building on this research to produce further reports on the current four languages and on others; to translate the reports into local languages; to develop advocacy toolkits that support community-led platform accountability; and to continue building partnerships across local, national, and international civil society to drive platform accountability. Our vision is to regularly audit the translation quality of platform policies with affected communities to advance platform transparency and accountability.
Localization Lab and Internews presented the report and hosted a panel discussion at Bread and Net 2022:
"Wait, Who is Timothy McVeigh?”: Why localizing tech policies has to start with communities - Part 1: Research Presentation
"Wait, Who is Timothy McVeigh?”: Why localizing tech policies has to start with communities - Part 2: Panel Discussion
The full report is available here.
INTERESTED IN LEARNING MORE ABOUT OUR WORK IN DIGITAL RIGHTS? REACH OUT TO US AT INFO@LOCALIZATIONLAB.ORG OR FOLLOW US ON X @L10NLAB TO JOIN THE CONVERSATION.