Facebook Combats Russian Disinformation with Ukrainian Content Moderators

Facebook's Ukrainian moderators focus on combating Russian disinformation on the platform. Since March 2022, Facebook has reduced distribution of Russian state-controlled media content and made it harder to find.

Rizwan Shah
Facebook's Ukraine public policy manager Kateryna Kruk has revealed that Ukrainian employees are responsible for moderating Ukrainian content on the platform, with a focus on combating Russian disinformation. Since March 2022, Facebook has been reducing the distribution of content from Russian state-controlled media and making it harder to find on their platforms worldwide.

Why this matters: The efforts to combat Russian disinformation on social media platforms have significant implications for global politics and national security. Effective moderation can help prevent the spread of false information and promote a more informed public discourse.

According to Kruk, Facebook uses a combination of artificial intelligence technology, human reviewers, and user complaints to remove content that violates its policies. Native speakers who understand the local context are part of the moderation teams. The platform also has a network of local and international partners that provide feedback on new risks, enabling quick responses.

Despite these efforts, Facebook still faces content moderation challenges, particularly since its services were banned in Russia. Meta, Facebook's parent company, has no moderators, offices, employees, or contractors in the country. "We are working hard to combat disinformation from Russia, which comes from state-controlled media," said Kruk.

Meta has around 40,000 people working on safety and security, including about 15,000 content moderators. These teams are distributed globally across all major time zones and can review content in more than 70 languages, including Ukrainian. Meanwhile, advertising targeting people in Russia is suspended, and advertisers based in Russia can no longer create or run ads anywhere in the world.

To help users understand what is happening with their accounts and how to resolve issues, Meta recently launched Account Status. The feature shows users the standing of their Facebook or Instagram account and any policy violations associated with it.

As the conflict between Russia and Ukraine continues, Facebook remains committed to combating disinformation and ensuring the safety of its users. With Ukrainian moderators at the forefront of this effort, the platform aims to provide a more secure and trustworthy environment for its global community.

Key Takeaways

  • Facebook's Ukrainian employees moderate Ukrainian content to combat Russian disinformation.
  • Facebook reduces the distribution of Russian state-controlled media content worldwide.
  • Native speakers and AI technology are used to remove violating content.
  • Meta has 40,000 people working on safety and security, including 15,000 content moderators.
  • Facebook launched "Account Status" to provide transparency into account issues.