Google Photos will soon be able to tell you when an image has been AI-generated
Google Photos is reportedly adding new functionality that will allow users to check whether an image was generated or enhanced using artificial intelligence (AI). According to the report, the photo and video storage and sharing service will get new ID resource tags that reveal an image's AI provenance and digital source type. The Mountain View-based tech giant is likely working on this feature to curb the spread of deepfakes. However, it is unclear how the information will be shown to users.
Google Photos AI attribution
Deepfakes have emerged in recent years as a new form of digital manipulation. These are images, videos, audio files, or other media that are generated or altered using AI to spread misinformation or mislead people. For example, actor Amitabh Bachchan recently filed a lawsuit against a company owner for running deepfake video ads in which he appeared to promote the company's products.
According to an Android Authority report, new functionality in the Google Photos app will let users see whether an image in their gallery was created or enhanced using AI. The feature was spotted in version 7.3 of the Google Photos app, but it is not yet active, meaning those on the latest version of the app won't see it yet.
Within the app's layout files, the publication found new strings of XML code that point toward this development. These are ID resources, identifiers assigned to specific elements or resources in the app. One reportedly contains the phrase “ai_info,” which is believed to refer to information added to an image's metadata indicating whether it was generated by an AI tool that adheres to transparency protocols.
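To make the idea of ID resources concrete, here is a minimal Kotlin sketch of how views bound to such IDs might be populated in an Android app. Only the identifiers ai_info and digital_source_type come from the strings in the report; the activity, layout name, and label text are assumptions for illustration, not Google's actual code.

```kotlin
import android.os.Bundle
import android.widget.TextView
import androidx.appcompat.app.AppCompatActivity

// Hypothetical sketch: the view IDs mirror the strings found in the APK
// teardown; the surrounding activity and layout are assumed.
class PhotoDetailsActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_photo_details) // assumed layout name

        // Views bound to the reported ID resources.
        val aiInfoView = findViewById<TextView>(R.id.ai_info)
        val sourceTypeView = findViewById<TextView>(R.id.digital_source_type)

        // Placeholder values; how Google actually populates these is unknown.
        aiInfoView.text = "Made with AI"
        sourceTypeView.text = "Source: Gemini"
    }
}
```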
Additionally, the ‘digital_source_type’ tag is believed to refer to the name of the AI tool or model used to generate or enhance the image, such as Gemini, Midjourney, or another service.
However, it is currently unclear how Google plans to display this information. Ideally, it would be added to the EXIF (Exchangeable Image File Format) data embedded in the image, leaving fewer ways to tamper with it. The downside is that users would not easily see the information unless they opened the image's metadata page. Alternatively, the app could add a visible badge to AI-generated images, similar to what Meta does on Instagram.
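For illustration, here is a minimal Kotlin sketch of how an app could inspect an image's embedded metadata using AndroidX's ExifInterface. The field names ai_info and digital_source_type are assumptions that mirror the reported strings; they are not standard EXIF tags, and Google has not confirmed where or how the provenance data would be stored.

```kotlin
import androidx.exifinterface.media.ExifInterface
import java.io.File

// Minimal sketch, not Google's implementation: read possible provenance
// fields from an image's embedded metadata. "ai_info" and
// "digital_source_type" are hypothetical names taken from the APK teardown;
// getAttribute() simply returns null if a field is not present.
fun readAiProvenance(imageFile: File): Map<String, String?> {
    val exif = ExifInterface(imageFile)
    return mapOf(
        // Standard EXIF tag that some image generators already populate.
        "software" to exif.getAttribute(ExifInterface.TAG_SOFTWARE),
        // Hypothetical fields based on the reported ID resources.
        "ai_info" to exif.getAttribute("ai_info"),
        "digital_source_type" to exif.getAttribute("digital_source_type")
    )
}
```

In practice, a check like this would likely run when the photo details view is opened, with any non-null values surfaced in the metadata panel or as an on-image badge.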