
On Provenance Of Pictures

Google’s Pixel 9 lineup, unveiled last week, has garnered impressive reviews, and Google has made a point of plastering AI all over this release. What intrigued me were some of the AI features that were announced. Take a look at a few of them in the video below:

We must pause to consider the implications of AI’s growing presence in our pockets. While these image models have existed for quite some time in experimental settings, putting them in the hands of millions of consumers is bound to unleash a whole different set of challenges. The biggest one: how do you prove the provenance of non-AI-generated images?

In our digital age, we must question the very essence of photography. Traditionally, a photograph served as evidence of reality, a frozen moment in time. However, AI features like those in the latest smartphones challenge this fundamental concept. As AI algorithms increasingly blur the line between captured and generated imagery, we face philosophical quandaries: What constitutes authentic visual representation? How will we navigate a world where the camera might no longer be a reliable witness to reality?

The CEO of Procreate has something to say about this as well:

Model providers and device manufacturers are making some efforts to curtail this. For example, Meta AI (available via WhatsApp and Instagram) watermarks its AI-generated images, as shown below.

[Image: Meta AI’s visible watermark on a generated image]

This is a largely futile effort because equally capable watermark-removal models exist. See: https://github.com/zuruoke/watermark-removal. Some other providers add metadata to an image’s EXIF data to tag it as AI-generated. This doesn’t work well either, because EXIF data can be easily stripped or overwritten.
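To illustrate how flimsy that safeguard is, here is a minimal sketch of stripping the metadata by re-saving only the pixels. It assumes Python with the Pillow library; the file names are hypothetical.

```python
# Minimal sketch: discarding EXIF metadata (including any "AI-generated" tag)
# by re-saving only the pixel data. Assumes Pillow is installed; the file
# names are hypothetical.
from PIL import Image

img = Image.open("ai_generated.jpg")
print(dict(img.getexif()))          # may include a tag marking the image as AI-generated

# Copy the pixels into a fresh image object that carries no metadata at all.
clean = Image.new(img.mode, img.size)
clean.putdata(list(img.getdata()))
clean.save("laundered.jpg")         # same picture, nothing left in EXIF to check
```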

This situation presents us with profound challenges:

  • Technological inadequacy: Our current tools for distinguishing AI-generated content from authentic imagery are proving insufficient.

  • Ethical implications: The ease of creating and manipulating convincing false imagery threatens the very foundation of visual trust in our society.

  • Societal impact: How do we maintain a shared reality when our visual evidence becomes unreliable?

We need a highly scalable, publicly verifiable chain-of-trust mechanism for non-AI-generated images: a public repository of signatures for images that are original and were never tampered with or enhanced using AI, plus a verification mechanism that tracks every change made on top of an image. Sounds like an NFT, doesn’t it? Well, yes. If we could build a system without all the scammy tokens, one that does this one thing right and is standardized by model providers and device manufacturers, who also own the root of trust, then maybe we have a chance.
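For concreteness, here is a minimal sketch of that idea (not any existing standard): the capturing device signs a hash of the image with a key whose public half the manufacturer publishes as the root of trust, and any later edit breaks verification. It assumes Python with the third-party cryptography package; the key handling and file names are illustrative only.

```python
# Minimal sketch of a device-rooted chain of trust for captured images.
# Assumes the third-party "cryptography" package; keys and file names are
# illustrative, not part of any real device or standard.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# On the device: a key pair whose public half the manufacturer would publish
# in a registry (the shared root of trust described above).
device_key = ed25519.Ed25519PrivateKey.generate()
public_key = device_key.public_key()

def sign_capture(image_bytes: bytes) -> bytes:
    """Sign the SHA-256 digest of the image as it comes off the sensor."""
    return device_key.sign(hashlib.sha256(image_bytes).digest())

def verify_capture(image_bytes: bytes, signature: bytes) -> bool:
    """Anyone holding the registered public key can check the image is untouched."""
    try:
        public_key.verify(signature, hashlib.sha256(image_bytes).digest())
        return True
    except InvalidSignature:
        return False  # the pixels were edited, or the signature isn't from this device

raw = open("capture.jpg", "rb").read()        # hypothetical original capture
sig = sign_capture(raw)
print(verify_capture(raw, sig))               # True: untouched original
print(verify_capture(raw + b"\x00", sig))     # False: any modification breaks the chain
```

Tracking subsequent edits could then amount to appending a new signed hash for each transformation, chained back to the original capture signature in the public repository.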

In the meantime, I find solace in my photographic device, a bastion of inherent trustworthiness. As you may have guessed, I’m referring to the venerable film camera, a testament to the tangible and the authentic in an increasingly digitized world.

[Image: a film camera]