YouTube is taking steps to make it clearer to viewers when the content they are watching has been created or modified with generative AI. In a recent announcement, the company noted, “Generative AI is transforming how creators express their ideas—ranging from storyboarding to utilizing innovative tools that enrich the creative process.” At the same time, viewers increasingly want to know whether the content they are watching is authentic.
To foster this trust, YouTube is introducing a new feature that requires creators to disclose when realistic-looking content—defined as “content a viewer could easily mistake for a real person, place, or event”—is generated or altered using synthetic media, including generative AI.
YouTube provided several examples of scenarios where creators should disclose content alterations:
- Using the likeness of a real person: This includes digitally swapping one individual's face with another or synthetically generating a voice to narrate a video.
- Altering footage of actual events or locations: For instance, editing footage to suggest that a real building is on fire or modifying a cityscape to appear different from its true form.
- Creating realistic scenes: Depicting fictional major events in a realistic manner, such as portraying a tornado approaching a real town.
When a creator marks their content as altered or synthetic, a disclosure label will be displayed. For most videos, the label will appear in the expanded description; for videos on more sensitive topics, such as health, news, elections, or finance, it will appear directly on the video itself for greater prominence. YouTube clarified that creators are not required to disclose content that is clearly unrealistic or animated, or that uses generative AI only for production assistance. Nor do they need to flag every use of generative AI in the broader creation process, such as scriptwriting, idea generation, or automatic captioning.
The new labels are scheduled to roll out across all YouTube platforms in the coming weeks. YouTube also warned creators who might consider skipping the disclosure requirement, saying it will “explore enforcement measures for those who repeatedly opt not to share this information.” In some cases where a creator fails to disclose potentially misleading altered or synthetic content, YouTube may apply a label to that content itself.