Nikon, Sony, and Canon vs. Deepfakes
Fake images are becoming more common and harder to spot. Top camera makers are hitting back with technology that can verify photo authenticity: Nikon, Sony Group, and Canon plan to embed digital signatures in their cameras as proof of an image's origin and integrity.
The goal is tamper-resistant digital signatures that help professionals who need to prove the credibility of their work. Nikon plans to offer the feature in its mirrorless cameras, while Sony and Canon will build it into their professional-grade mirrorless cameras.
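To make the idea concrete, here is a minimal sketch in Python (using the cryptography library) of how in-camera signing could work: a private key held in the camera signs the pixel data together with capture metadata, so any later change to either breaks the signature. The key handling, payload format, and function names are assumptions for illustration only; none of the manufacturers has published its actual scheme here.

```python
# Illustrative sketch only: a camera-held private key signs the pixel data
# plus capture metadata, so any later edit invalidates the signature.
# Key handling, payload format, and names are assumptions, not any
# manufacturer's published design.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# In a real camera the key would be provisioned at the factory and held in
# tamper-resistant hardware; it is generated here only for the example.
camera_private_key = ec.generate_private_key(ec.SECP256R1())
camera_public_key = camera_private_key.public_key()

def sign_capture(image_bytes: bytes, metadata: bytes) -> bytes:
    """Sign the image together with its capture metadata (time, camera ID, ...)."""
    return camera_private_key.sign(image_bytes + metadata,
                                   ec.ECDSA(hashes.SHA256()))
```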
The three camera makers have already agreed on a global standard for digital signatures, making them compatible with a web-based tool called “Verify.” The tool will let anyone check an image's credentials for free. If an image was created or altered by AI, Verify will flag it as having “No Content Credentials.”
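Continuing the assumptions above, the sketch below shows the kind of check a Verify-style tool could run: an image with a valid signed manifest is reported as credentialed, while a missing manifest or a failed signature check yields “No Content Credentials.” The data structures and wording are illustrative, not the tool's actual interface.

```python
# Illustrative sketch of the kind of check a Verify-style tool might run.
# The real service's internals are not described here; this only shows the
# "valid credentials" vs. "No Content Credentials" decision.
from dataclasses import dataclass
from typing import Optional

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

@dataclass
class ContentCredentials:
    camera_public_key: ec.EllipticCurvePublicKey  # from the signing camera
    metadata: bytes                                # capture time, camera model, ...
    signature: bytes                               # produced in-camera at capture

def check_image(image_bytes: bytes, credentials: Optional[ContentCredentials]) -> str:
    if credentials is None:
        # AI-generated files, or files with stripped metadata, carry no manifest.
        return "No Content Credentials"
    try:
        credentials.camera_public_key.verify(
            credentials.signature,
            image_bytes + credentials.metadata,
            ec.ECDSA(hashes.SHA256()),
        )
        return "Content Credentials verified"
    except InvalidSignature:
        # Pixels or metadata were changed after capture, e.g. by AI editing.
        return "No Content Credentials"
```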
As deepfakes of prominent figures such as U.S. presidents and other world leaders have gone viral, the public is questioning the validity of online content. There is good reason to: generative AI tools such as the latent consistency model from China's Tsinghua University can produce approximately 700,000 images a day.
The camera makers are not alone. Google, for example, has released a tool that adds invisible digital watermarks to AI-generated pictures, and Intel has developed technology that judges an image's authenticity by analyzing subtle changes in a subject's skin color caused by blood flow under the skin.
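As a loose illustration of invisible watermarking, the toy sketch below hides a fixed bit pattern in the least-significant bits of pixel values and later checks for it. Google's actual tool uses a far more robust, learned watermark; this example only conveys the general idea of embedding a machine-readable mark that a human viewer cannot see.

```python
# Toy illustration only: hides a short bit pattern in the least-significant
# bits (LSBs) of pixel values. Real watermarking tools are learned and far
# more robust; this just conveys the general idea.
import numpy as np

MARK = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)  # arbitrary 8-bit tag

def embed_mark(pixels: np.ndarray) -> np.ndarray:
    """Write MARK into the LSBs of the first len(MARK) pixel values."""
    flat = pixels.flatten()  # flatten() returns a copy
    flat[: MARK.size] = (flat[: MARK.size] & 0xFE) | MARK
    return flat.reshape(pixels.shape)

def has_mark(pixels: np.ndarray) -> bool:
    """Check whether the expected LSB pattern is present."""
    flat = pixels.flatten()
    return bool(np.array_equal(flat[: MARK.size] & 1, MARK))

# Example with an 8-bit grayscale image
image = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
marked = embed_mark(image)
print(has_mark(marked))  # True
print(has_mark(image))   # almost certainly False
```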
Sony will release its technology in the spring, and Canon will follow later in the year. Sony is also considering adding the feature to video, and Canon is developing similar technology for video. Canon has also released an image-management app that can tell whether an image was taken by a human.
The Associated Press, Thomson Reuters, and the Sterling Lab for Data Integrity are testing the technology.