
Sony to Authenticate Camera Images With Digital ‘Birth Certificates’

The technology, to be integrated into Sony cameras’ hardware, will create a digital “birth certificate” for every image they capture, helping creators verify the origin of their visual content.


At CES 2024 in Las Vegas, electronics and gaming company Sony presented its new in-camera digital signature technology, which creates a “birth certificate” for images captured with Sony devices.

The product, which is still under development, is meant to validate the origin of content captured on the firm’s devices, helping to safeguard ownership and combat misinformation spread through manipulated imagery. Visual creators can use it much like non-fungible tokens (NFTs) – as a way to prove the origin of their artwork and track their content.

“Helping creators navigate opportunities while protecting the authenticity of their work is a priority,” said Neal Manowitz, president and COO at Sony Electronics.

The feature will be integrated into Sony cameras’ hardware starting with the new Alpha 9 Mark III model. The Alpha 1 and Alpha 7S III models will also be updated with the new functionality later this year.

At the moment an image is captured, the camera will generate a unique machine-based digital signature. This ‘birth certificate’ will serve as an identifier that can be tracked and verified across various use cases, helping photographers and agencies prove that their photos are authentic and unaltered.
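In broad terms, a scheme like this pairs a signing key held inside the camera with a public key that anyone can use to check the file later. The sketch below is a minimal, hypothetical illustration of that general idea in Python using Ed25519 signatures; it is not Sony’s actual implementation, and the key handling, function names, and sample data are assumptions made purely for demonstration.

```python
# Illustrative sketch only, NOT Sony's implementation: it shows the general
# idea of signing image bytes at capture time and verifying them later.
# Assumes the third-party `cryptography` package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# In a real camera the private key would live in tamper-resistant hardware;
# here we simply generate one in memory for the example.
camera_private_key = Ed25519PrivateKey.generate()
camera_public_key = camera_private_key.public_key()


def sign_image(image_bytes: bytes) -> bytes:
    """Create the image's 'birth certificate': a signature over its raw bytes."""
    return camera_private_key.sign(image_bytes)


def verify_image(image_bytes: bytes, signature: bytes) -> bool:
    """Check that the image bytes have not been altered since capture."""
    try:
        camera_public_key.verify(signature, image_bytes)
        return True
    except InvalidSignature:
        return False


# Example: any post-capture edit invalidates the signature.
original = b"...raw image data from the sensor..."
signature = sign_image(original)
print(verify_image(original, signature))            # True
print(verify_image(original + b"edit", signature))  # False
```

Because the signature is computed over the exact bytes produced at capture, any later edit to the file causes verification to fail, which is what makes the ‘birth certificate’ analogy apt.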

This way, professionals, particularly in journalism and photography, can safeguard the authenticity of their content, adding a layer of security to a media landscape where the distribution of manipulated images is becoming more common with the rise of generative AI.

“The dissemination of false information and images has real-world social impact that brings harm not only to our photojournalist and news agency partners but to society as a whole,” believes Manowitz.

Recent data from DeepMedia shows that in 2023 the number of fake videos and audio recordings circulating online tripled compared to 2022.

AI technology providers such as OpenAI also acknowledge the potential issues posed by artificially generated imagery. OpenAI has been working for some time on a detection tool that would distinguish AI-created or AI-altered images from original content.

Nina Bobro



Nina is passionate about financial technologies and environmental issues, reporting on the industry news and the most exciting projects that build their offerings around the intersection of fintech and sustainability.