Enhancing Transparency in Content Creation: YouTube’s Disclosure Policy for Altered Media
Summary:
YouTube is introducing a new tool in Creator Studio to give viewers more transparency about altered or synthetic content. Creators are now required to disclose when content that viewers could mistake for real is made with generative AI or other synthetic methods.
Key Points:
- Disclosure Requirement: Creators must disclose content that viewers could mistake for real, such as digitally altered faces, synthetic voices, or realistic depictions of events or places.
- Exemptions: Content that is clearly unrealistic or animated, or that uses generative AI only for productivity tasks such as script generation, doesn't require disclosure.
- Responsible AI Innovation: This initiative aligns with YouTube’s responsible AI innovation approach, focusing on transparency, disclosure, and user trust.
- Examples Requiring Disclosure:
  - Digitally replacing a person's face or generating a synthetic voice.
  - Altering footage of real events or places so it appears authentic.
  - Creating realistic scenes that depict fictional major events.
- Exemptions from Disclosure:
  - Clearly unrealistic content, such as animation or fantastical elements.
  - Minor adjustments like color correction or visual enhancements.
- Labeling and Enforcement: Labels will appear in video descriptions, with more prominent labels on videos covering sensitive topics; creators who consistently fail to disclose may face enforcement measures.
- Industry Collaboration: YouTube collaborates with the industry for increased transparency, including participation in the Coalition for Content Provenance and Authenticity (C2PA).
- Future Plans: YouTube plans to roll out the disclosure labels gradually across all its surfaces and is working on an updated privacy process for requesting the removal of AI-generated or synthetic content.
- Empowering Creators: The goal is to empower creators while increasing transparency and understanding among viewers in the evolving landscape of generative AI and digital content.