Meta has introduced new measures to reduce the spread of “unoriginal” content on Facebook, targeting accounts that frequently repost others’ videos, photos, and text without significant modification or proper attribution. The company aims to improve content quality and ensure creators get the credit they deserve.
In a statement released on Monday, Meta revealed that it has removed approximately 10 million accounts this year for impersonating popular content creators. It also took action against 500,000 accounts for spam-like behavior or fake engagement; those accounts will see their content's reach reduced and will be barred from Facebook's monetization programs, and repeat offenders may lose distribution privileges entirely.
This new policy follows similar actions taken by YouTube, which recently clarified its stance on reused and AI-generated content. Meta’s move comes amid growing concerns about the rise of low-quality, mass-produced content, often referred to as “AI slop.” These videos typically consist of stitched-together images, clips, or computer-generated voiceovers, contributing to the flood of uninspired media across platforms.
Focus on Content Authenticity
Meta’s policy does not target users who engage creatively with others’ content, such as those making reaction videos or adding commentary. Instead, the focus is on accounts that repost others’ work without meaningful contribution. To support original creators, Facebook will demote duplicate videos in users’ feeds. The company is also testing a feature that links back to the original post when users encounter duplicate content.
While Meta’s statement does not directly address AI-generated content, it encourages creators to avoid simply combining clips or adding watermarks over others’ material. Meta emphasizes the importance of authentic storytelling and high-quality captions, suggesting a shift away from the increasing use of unedited AI-generated subtitles.
Addressing User Concerns and Fake Accounts
Meta’s new policy has sparked concerns, particularly among small business owners and content creators. Many have voiced frustrations with the platform’s content moderation policies and wrongful account takedowns. A petition advocating for improved enforcement has garnered nearly 30,000 signatures. In response, Meta announced that users can now track their content’s performance and receive notifications regarding potential penalties through the Professional Dashboard.
Additionally, Meta has tackled the issue of fake accounts, estimating that about 3% of Facebook’s global monthly active users are fake. In the first quarter of 2025, the company took action against over a billion fake profiles. Meta is also testing Community Notes in the U.S., a crowdsourced system that lets users add context to posts and flag inaccurate content, much like the feature used by X (formerly Twitter).
Meta has stated that these changes will be rolled out gradually, giving creators time to adjust to the new enforcement policies.