YouTube is introducing new guidelines to address the potential misuse of generative AI and the spread of misinformation.
Starting next month, content creators on the platform will be required to disclose when their videos include realistic-looking content that has been altered or synthesized with artificial intelligence, such as deepfakes.
Failure to disclose AI-generated content may result in demonetization, video removal, account suspension, or removal from the YouTube Partner Program.
The platform will also display a label on videos containing generative AI content, indicating that the sounds or visuals were digitally altered or generated. Users will additionally be able to request the removal of AI-generated content they consider inaccurate.