Adobe Unveils New Strategy to Shield Artists from AI Plagiarism

As a leading force for digital artists worldwide, Adobe bears a crucial responsibility in combating the growing threats of AI-generated deepfakes, misinformation, and content theft. To address these challenges, Adobe plans to release its Content Authenticity web app in beta by the first quarter of 2025. The tool will let creators apply content credentials to their work, establishing clear ownership.

However, protecting content takes more than modifying an image's metadata, which a simple screenshot can strip away. Adobe's content credentials go further, combining digital fingerprinting, invisible watermarking, and cryptographically signed metadata. Together, these techniques provide a more robust shield for images, videos, and audio files.
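
To make the last of those concrete, here is a minimal sketch of how cryptographically signed metadata works in general, assuming a simplified two-field manifest and a bare key pair; it is not Adobe's actual format, which follows the C2PA specification with certificate chains and embedded manifests. The idea is that any later edit to the asset or its manifest invalidates the signature.

```python
# Minimal sketch of cryptographically signed metadata using the
# `cryptography` package. Illustrative only: the manifest fields and
# key handling are simplified assumptions, not the C2PA format.
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Hypothetical creator key pair; a real system ties this to a
# certificate issued by a trusted authority.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

def sign_manifest(image_bytes: bytes, creator: str) -> dict:
    """Build a tiny provenance manifest for an asset and sign it."""
    manifest = {
        "creator": creator,
        "asset_sha256": hashlib.sha256(image_bytes).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    signature = private_key.sign(payload, ec.ECDSA(hashes.SHA256()))
    return {"manifest": manifest, "signature": signature.hex()}

def verify_manifest(image_bytes: bytes, signed: dict) -> bool:
    """Return True only if the manifest matches the asset and the signature is valid."""
    if hashlib.sha256(image_bytes).hexdigest() != signed["manifest"]["asset_sha256"]:
        return False  # the pixels changed after signing
    payload = json.dumps(signed["manifest"], sort_keys=True).encode()
    try:
        public_key.verify(bytes.fromhex(signed["signature"]), payload,
                          ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

image = b"...raw image bytes..."
credentials = sign_manifest(image, "Jane Artist")
assert verify_manifest(image, credentials)
assert not verify_manifest(image + b"edit", credentials)  # tampering is detected
```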

Invisible watermarking subtly alters pixels to a degree undetectable to the naked eye. The digital fingerprint, meanwhile, ties an ID to the file itself, so even if the content credentials are stripped away, the original creator can still be identified. Andy Parsons, Adobe's Senior Director of Content Authenticity, emphasized that with this technology, Adobe can confidently assert that content credentials will stay intact wherever an image, video, or audio file appears, whether online or on mobile devices.
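
As a rough illustration of how a watermark can live in the pixels themselves, below is a sketch of the classic least-significant-bit (LSB) technique: flipping the lowest bit of a color channel changes its value by at most 1 out of 255, which the eye cannot see. This is a teaching example, not Adobe's watermark, which is undisclosed and must survive screenshots and re-encoding (plain LSB embedding does not); the embed_id and extract_id helpers and the creator ID are made up for illustration.

```python
# Hypothetical LSB watermark: hides a creator ID in the lowest bit of
# each pixel channel. Fragile by design here; shown only to convey the
# idea of an invisible payload inside the pixels.
import numpy as np

def embed_id(pixels: np.ndarray, creator_id: str) -> np.ndarray:
    """Hide a creator ID in the least-significant bits of the image."""
    bits = np.unpackbits(np.frombuffer(creator_id.encode(), dtype=np.uint8))
    flat = pixels.flatten()  # flatten() copies, so the original is untouched
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite only the LSBs
    return flat.reshape(pixels.shape)

def extract_id(pixels: np.ndarray, length: int) -> str:
    """Read the hidden ID back out of the least-significant bits."""
    bits = pixels.flatten()[: length * 8] & 1
    return np.packbits(bits).tobytes().decode()

image = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
marked = embed_id(image, "artist-42")
assert extract_id(marked, len("artist-42")) == "artist-42"
# No channel moved by more than 1/255, so the change is invisible.
assert np.max(np.abs(marked.astype(int) - image.astype(int))) <= 1
```

Production watermarks typically embed the payload redundantly in frequency-domain coefficients rather than raw bits, which is what lets them survive compression and re-capture.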

The success of initiatives like this heavily relies on user adoption. Given Adobe's impressive base of 33 million software subscribers, the company is well-positioned to reach a diverse audience of digital artists and creators. Additionally, even those who do not use Adobe software will have access to the Content Authenticity web app to apply credentials to their work.

Making content credentials widely visible across the web presents another challenge. Adobe has co-founded two industry groups dedicated to preserving content authenticity and enhancing online trust and transparency: the Content Authenticity Initiative and the Coalition for Content Provenance and Authenticity (C2PA). Between them, these alliances include camera manufacturers representing 90% of the market, content creation tools from major players like Microsoft and OpenAI, and platforms such as TikTok, LinkedIn, Google, Instagram, and Facebook. While membership does not guarantee that these companies will integrate Adobe's credentials into their products, it signals a commitment to dialogue on the issue.

Nevertheless, not all social media platforms and websites currently display provenance information prominently.

To address this gap, Adobe plans to launch a Content Authenticity browser extension for Chrome, alongside an Inspect tool available on the Adobe Content Authenticity website. Parsons stated, "These tools will help users discover and display content credentials wherever they're associated with content on the web, allowing for clear attribution of credit."

As AI technology advances, the line between real and synthetic images blurs. These tools offer a dependable way to trace an image's origin, as long as credentials are present. Rather than opposing AI, Adobe seeks to clarify how AI is used within artworks and to prevent unauthorized use of artists' works in training datasets. Adobe's own generative AI tool, Firefly, is trained only on Adobe Stock images and other content the company has explicit permission to use.

"Firefly is commercially safe, and we only train it on content that Adobe explicitly has permission to use, and we never use customer content," Parsons explained. Although artists have voiced significant resistance to AI tools, Adobe's Firefly functionalities in applications like Photoshop and Lightroom have garnered positive responses. For instance, the generative fill feature in Photoshop, designed to extend images through prompts, enjoys a tenfold adoption rate compared to typical features.

Adobe also collaborates with Spawning, an initiative that helps artists monitor how their works are used online. Through the "Have I Been Trained?" website, artists can check whether their artwork appears in popular training datasets and can register their works on a Do Not Train list. The registry tells AI companies that these artworks should be excluded from training datasets, provided those companies choose to respect it. Hugging Face and Stability AI have committed to honoring the list.
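
For illustration, here is a hedged sketch of what honoring such a registry could look like on the trainer's side: each candidate work is hashed and dropped if its hash appears on the opt-out list. The load_do_not_train_registry function is a stand-in; Spawning's actual service and API are not modeled here.

```python
# Hypothetical sketch of honoring a Do Not Train registry when building
# a training dataset. The registry lookup is a stand-in, not Spawning's
# real interface.
import hashlib

def load_do_not_train_registry() -> set[str]:
    """Stand-in: return SHA-256 hashes of opted-out works."""
    return {
        hashlib.sha256(b"opted-out artwork bytes").hexdigest(),
    }

def filter_training_set(candidates: list[bytes]) -> list[bytes]:
    """Keep only works whose content hash is absent from the registry."""
    registry = load_do_not_train_registry()
    return [
        work for work in candidates
        if hashlib.sha256(work).hexdigest() not in registry
    ]

dataset = filter_training_set([
    b"opted-out artwork bytes",   # registered on the Do Not Train list
    b"freely licensed artwork",   # not registered, so it stays
])
assert dataset == [b"freely licensed artwork"]
```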

On Tuesday, Adobe will launch the beta version of the Content Authenticity Chrome extension, enabling creators to explore the capabilities first-hand. Interested individuals can also sign up for updates regarding the beta release of the full web app next year.
