The advent of AI, ML, IoT, cyber-physical systems, robotics and intelligent devices gives the multimedia community many opportunities to reach out and develop synergies with neighboring fields. Research communities across disciplines rely heavily on digital images, and to extract value from them, recent years have seen the rise of intelligent data-processing services that operate on different platforms for industry and other applications.

Further, more than 2,266 photos are uploaded to Facebook every second, and another 995 are uploaded to Instagram. However, the unauthorized use of these images is a growing concern with significant security and privacy implications.

The Digital India project aims to transform India into a digitally empowered society and knowledge economy. The project can only be realized if proper documentation, security and authentication mechanisms are enforced. Furthermore, the problem of identity theft and privacy leakages has become a major contributor to fraud in India and other countries.

Therefore, copyright protection of such images has become a critical issue that must be resolved. In recent years, these images have been protected by watermarking techniques, in which a copyright mark is invisibly embedded into the image that carries it. Watermarking AI-generated images has the following benefits: 

  • It saves bandwidth and storage 
  • It resolves copyright violations and ownership conflicts 
  • It can act as a keyword for indexing and retrieval 
  • It protects against tampering 
  • Watermarking operations use lightweight computation, leading to low energy consumption 
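As a concrete (and deliberately simplified) illustration of invisible embedding, the toy sketch below hides a bit string in the least-significant bits of pixel values. All names here are hypothetical, and the scheme is far simpler than production systems such as SynthID.

```python
# Toy illustration of invisible watermarking (hypothetical, not SynthID):
# hide a copyright bit string in the least-significant bits of 8-bit
# pixel values, where a change of 1 out of 255 levels is invisible.

def embed_bits(pixels, bits):
    """Overwrite the LSB of each leading pixel with one mark bit."""
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit
    return out

def extract_bits(pixels, n):
    """Read back the first n embedded bits."""
    return [p & 1 for p in pixels[:n]]

pixels = [200, 137, 54, 255, 18, 99]  # 8-bit grayscale samples
mark = [1, 0, 1, 1]                   # a 4-bit copyright mark

watermarked = embed_bits(pixels, mark)
print(watermarked)                    # [201, 136, 55, 255, 18, 99]
print(extract_bits(watermarked, 4))   # [1, 0, 1, 1]
```

Each pixel moves by at most one intensity level, so the mark is imperceptible; note, however, that LSB-style stamps are fragile and are easily destroyed by re-encoding.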

Google’s SynthID

DeepMind, in partnership with Google Cloud, launched a beta version of SynthID, a tool for watermarking and identifying AI-generated images. This technology embeds a digital watermark directly into the pixels of an image, making it imperceptible to the human eye but detectable for identification. 

SynthID is being released to a limited number of Vertex AI customers using Imagen, a text-to-image model that uses input text to create photorealistic images. According to a DeepMind blog, while generative AI can unlock huge creative potential, it also presents new risks, like enabling creators to spread false information — both intentionally and unintentionally. Identifying AI-generated content is critical to empowering people with knowledge of when they’re interacting with generated media and helping prevent the spread of misinformation. 

SynthID isn’t foolproof against extreme image manipulations, but it does provide a promising technical approach for empowering people and organizations to work with AI-generated content responsibly. 

New type of watermarks 

Traditional watermarks are not sufficient for identifying AI-generated images because they are often applied like a stamp on the image and can easily be edited out. This is not possible with SynthID. 

Without compromising image quality, SynthID keeps the watermark detectable even after modifications such as adding filters, changing colors, and saving with lossy compression schemes like those commonly used for JPEGs.
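One way a watermark can survive such edits is to spread it across every pixel rather than stamping it in one place. The sketch below illustrates that idea in the spirit of classic spread-spectrum watermarking; it is an illustrative assumption, not SynthID's actual algorithm. A faint key-derived pattern is added to all pixels, and detection correlates the image against the same pattern, so mild edits cannot erase the signal.

```python
import random

def pattern(key, n):
    """Key-derived pseudorandom +/-1 pattern."""
    rng = random.Random(key)
    return [rng.choice([-1, 1]) for _ in range(n)]

def embed(pixels, key, strength=6):
    """Add a faint copy of the key's pattern across every pixel."""
    return [p + strength * s for p, s in zip(pixels, pattern(key, len(pixels)))]

def detect(pixels, key):
    """Correlate the image with the key's pattern; a high score means watermarked."""
    pat = pattern(key, len(pixels))
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) * s for p, s in zip(pixels, pat)) / len(pixels)

random.seed(0)
original = [random.randint(0, 255) for _ in range(10_000)]
marked = embed(original, key=42)

# Simulate a mild lossy edit: a small random perturbation of every pixel.
edited = [p + random.randint(-2, 2) for p in marked]

print(detect(edited, key=42))    # near the embed strength: still detected
print(detect(original, key=42))  # near 0: no watermark present
```

Because the signal is averaged over thousands of pixels, the per-pixel perturbation stays invisible while the correlation score remains well separated from unmarked images.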

SynthID uses two deep learning models, one for watermarking and one for identification, that have been trained together on a diverse set of images. The combined model is optimized for a range of objectives, including correctly identifying watermarked content and improving imperceptibility by aligning the watermark with the original content. 
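That joint optimization can be pictured as a single loss that trades detection accuracy against imperceptibility. The sketch below is a hypothetical, non-neural stand-in for such a combined objective; the weight `w` and both loss terms are illustrative assumptions, not DeepMind's actual training setup.

```python
# Hypothetical sketch of a combined watermarking objective: the total
# loss rewards correct decoding of the mark while penalizing visible
# changes to the pixels. All terms and weights are illustrative.

def mse(a, b):
    """Mean squared pixel difference between two images."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def combined_loss(original, watermarked, decoded_bits, true_bits, w=0.1):
    # Detection term: fraction of embedded bits the decoder got wrong.
    detection = sum(d != t for d, t in zip(decoded_bits, true_bits)) / len(true_bits)
    # Imperceptibility term: how far embedding moved the pixels.
    imperceptibility = mse(original, watermarked)
    return detection + w * imperceptibility

# Perfect decoding, with one pixel nudged by a single level:
loss = combined_loss([10, 20, 30], [10, 21, 30], [1, 0, 1, 1], [1, 0, 1, 1])
print(loss)
```

Training both models against one such objective is what lets the embedder place the mark where the decoder can find it without making it visible.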

This tool provides three confidence levels for interpreting the results of watermark identification. If a digital watermark is detected, part of the image is likely generated by Imagen. The developers believe that SynthID could be expanded across other AI models. They will be integrating it into more Google products and making it available to third parties in the near future. 
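The three-level readout might be pictured as a simple banding of the detector's score. The band labels and thresholds below are illustrative assumptions, not SynthID's actual values.

```python
# Hypothetical sketch of a three-level confidence readout: a detector
# score in [0, 1] is mapped to one of three bands. Thresholds and
# labels are illustrative assumptions.

def interpret(score, present=0.9, possible=0.5):
    if score >= present:
        return "watermark detected"
    if score >= possible:
        return "watermark possibly detected"
    return "watermark not detected"

print(interpret(0.97))  # watermark detected
print(interpret(0.65))  # watermark possibly detected
print(interpret(0.10))  # watermark not detected
```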

Content & Image source: DeepMind Blog Post
