Creators Will Soon Need to Label AI-Generated Content or Face Suspension From YouTube

Credit: Budrul Chukrut / SOPA Images / LightRocket via Getty Images

YouTube will soon require creators to tag AI-generated content or face suspension from the platform, a measure aimed at preventing the spread of misinformation and viewer confusion. Creators will have new options during the upload process to disclose the use of generative AI in producing their videos. Failure to consistently disclose this information may lead to penalties, including content removal or suspension from the YouTube Partner Program.

Why Is YouTube Concerned About AI-Generated Content?

This move is part of YouTube’s broader effort to address the misleading potential of generative AI, which poses risks such as deepfakes and misinformation, especially in sensitive contexts like elections. Alongside the labeling requirement, YouTube will use generative AI in its own moderation systems, both to identify convincingly realistic fake content and to catch content that violates community guidelines.

Creators will be required to indicate during the upload process whether their content “contains realistic altered or synthetic material.” Labels will be added to the description panel to inform viewers about AI-generated or altered content, with more prominent labels for content involving sensitive topics. Even with appropriate labeling, content that violates YouTube’s community guidelines will be removed.

YouTube’s content moderation will leverage generative AI technology to identify and contextualize potential threats at scale. The platform emphasizes the need to balance the creative potential of generative AI with the responsibility to protect the YouTube community. These new rules, set to take effect next year, expand on Google’s earlier requirement for prominent warning labels on political ads using AI, extending the disclosure obligations to a broader range of AI-generated content.

In a blog post on Tuesday detailing various updates to AI-related policies, YouTube emphasized that creators failing to disclose their use of AI tools to create “altered or synthetic” videos could face penalties, including the removal of their content or suspension from the platform’s revenue-sharing program.

“Generative AI has the potential to unlock creativity on YouTube and transform the experience for viewers and creators on our platform,” stated Jennifer Flannery O’Connor and Emily Moxley, vice presidents for product management, in the blog post. “But just as important, these opportunities must be balanced with our responsibility to protect the YouTube community.”

These new restrictions build upon the rules introduced by YouTube’s parent company, Google, in September. These rules mandated prominent warning labels on political ads using artificial intelligence across YouTube and other Google platforms.

Under the forthcoming changes set to take effect by next year, YouTubers will have additional options to specify whether they’re uploading AI-generated videos that, for instance, realistically depict events that never happened or portray individuals saying or doing things they did not actually do.

Jennifer Flannery O’Connor and Emily Moxley emphasized the significance of this disclosure, particularly in content discussing sensitive topics like elections, ongoing conflicts, public health crises, or involving public officials. The goal is to provide transparency and accountability in the context of potentially impactful content.

In addition, YouTube’s privacy complaint process will be updated to allow requests for the removal of AI-generated videos that simulate an identifiable person, including their face or voice. Music partners, such as record labels or distributors, will be able to request the takedown of AI-generated music that mimics an artist’s unique singing or rapping voice. This approach reflects YouTube’s stated commitment to striking a balance between fostering creativity through generative AI and safeguarding the well-being of the YouTube community.

Read Original Article on AOL
