The rise of out-of-context images and videos on social media has paved the way for the rapid spread of dangerous misinformation. In response, Google is enhancing its search platform to provide users with comprehensive contextual information about images, helping to curb the dissemination of false information.
The newly introduced tools allow users to view an image’s history, metadata, and the contexts in which it has been used across various websites. Announced earlier this year, these “About this image” features are now fully accessible to all English-speaking users worldwide.
With these tools, users can determine when an image was first indexed by Google Search, thus gauging its relevance and timeliness. Additionally, users can see how others have described the image on different platforms, aiding in debunking false claims.
Google highlights that, when available, users will also have access to metadata, including indicators that specify whether an image is generated by AI. The company has pledged to clearly label all images created by its AI technologies. In October, Adobe partnered with industry giants like Microsoft, Nikon, and Leica to establish a clear symbol that identifies AI-generated images.
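To give a sense of what such a metadata indicator can look like in practice, the sketch below scans a file for an embedded XMP packet and checks it for the IPTC “trainedAlgorithmicMedia” digital source type, one standard marker used to label AI-generated imagery. It is a rough illustration of the general idea rather than Google’s or Adobe’s implementation, and the helper name is invented for the example.

```python
# Rough sketch: check a file's embedded XMP metadata for the IPTC
# "trainedAlgorithmicMedia" digital-source-type marker, one signal that
# an image was produced by an AI model. Illustrative only; real tools
# parse the container format and signed credentials far more carefully.

def xmp_flags_ai_generation(path: str) -> bool:
    """Return True if the file's XMP packet declares an AI-generated
    digital source type (IPTC DigitalSourceType vocabulary)."""
    with open(path, "rb") as f:
        data = f.read()

    # XMP is stored as an XML packet; locate it by its standard
    # begin/end processing instructions. (Simplification: some files
    # embed XMP without this wrapper.)
    start = data.find(b"<?xpacket begin=")
    end = data.find(b"<?xpacket end=")
    if start == -1 or end == -1:
        return False  # no XMP packet found

    xmp = data[start:end]
    # IPTC vocabulary term for media created by a trained AI model.
    return b"trainedAlgorithmicMedia" in xmp


if __name__ == "__main__":
    import sys
    for image_path in sys.argv[1:]:
        flagged = xmp_flags_ai_generation(image_path)
        print(f"{image_path}: {'AI-generation marker found' if flagged else 'no marker found'}")
```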
To use the new image verification tools, click the three-dot menu on a Google Images result. The same information is also available under the “more about this page” option in the “About this result” menu within that dropdown. Google says it is exploring additional ways to make these tools easier to reach.
In another important development, Google announced that accredited journalists and fact-checkers will be able to upload or copy image URLs to investigate them further via its FactCheck Claim Search API. This feature complements the previously tested Fact Check Explorer tool, which lets fact-checkers examine references and additional information related to specific images.
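As a point of reference, a lookup against that API might look roughly like the sketch below. The `claims:imageSearch` endpoint name and the `imageUri` parameter are assumptions based on the article’s description and are not confirmed here; access also requires an approved API key, for which `YOUR_API_KEY` is a placeholder.

```python
# Hedged sketch: look up fact checks associated with an image URL via
# Google's FactCheck Claim Search API. The endpoint name and the
# imageUri parameter below are assumptions; consult the official
# Fact Check Tools documentation before relying on them.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder credential
ENDPOINT = "https://factchecktools.googleapis.com/v1alpha1/claims:imageSearch"  # assumed endpoint


def search_fact_checks_for_image(image_url: str) -> dict:
    """Return the raw API response for fact checks tied to image_url."""
    response = requests.get(
        ENDPOINT,
        params={
            "imageUri": image_url,  # assumed parameter name
            "languageCode": "en",
            "key": API_KEY,
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # Hypothetical image URL, used purely for illustration.
    print(search_fact_checks_for_image("https://example.com/viral-photo.jpg"))
```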
Moreover, Google is experimenting with generative AI to improve descriptions of sources, especially lesser-known sellers or niche blogs. Users enrolled in the Search Generative Experience (SGE) will see AI-generated insights about sites in the “more about this page” section. Where Wikipedia or the Google Knowledge Graph lacks sufficient information about a site, these AI-generated descriptions will draw on, and cite, other reputable sources when available.
As the capabilities of generative AI continue to grow, companies are actively working on technologies to bring more transparency to image origins. In June, Adobe released an open-source toolkit that lets apps and websites verify image credentials. In parallel, X has extended its crowdsourced Community Notes program to cover fact-checking of images and videos.