It is probably evident by now, but the recent AI “generators” are very good at creating content. Let’s be clear immediately: the battle for “truth” is already lost. People and machines will be fooled repeatedly by fake content: we can’t, and more importantly we should not, trust our senses anymore.
This isn’t news, though. Nearly five centuries ago, Copernicus and Galileo forced us to accept that our eyes can trick us: even though the sun appears to move across the sky just as the moon does, it is not revolving around us. No matter how appealing the story is, and how strongly our common sense(s) seem to confirm it, the truth might be different.
It’s normal to wonder, “What can we trust?” when institutions challenge each other, but the answer is the same as in the days of Copernicus and Galileo: science and maths. It turns out that cryptography, often thought of as the science of secrets, is also critical when thinking about “proofs”.
One of the key cryptographic operations is “signing.” It is a process through which an application generates a unique, mathematically unforgeable signature for any input: an image, a text, a video, etc. Critically, another piece of software, given the signature and the content, can verify the signer’s identity. The little 🔒 in your web browser is an example of that mechanism: it tells you the browser (Chrome, Firefox…) has verified that the site you are exchanging data with really is the domain (linkedin.com, google.com, or ouvre-boite.com…) shown in the address bar.
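To make the mechanism concrete, here is a minimal sketch in Python using the `cryptography` package and Ed25519 keys. The library and algorithm are illustrative assumptions on my part; browsers and content platforms use a variety of signature schemes, but the sign-then-verify flow is the same:

```python
# A minimal sketch of digital signing with the Python "cryptography" package
# and Ed25519 keys (an illustrative choice of algorithm, not a requirement).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The author (a camera, a press outlet, a company...) holds a private key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# Signing: produce an unforgeable signature for a piece of content.
content = b"Breaking news: the Earth revolves around the Sun."
signature = private_key.sign(content)

# Verification: anyone holding the public key can check that the content
# was signed by the matching private key and has not been altered since.
try:
    public_key.verify(signature, content)
    print("Signature valid: content is authentic and unmodified.")
except InvalidSignature:
    print("Signature invalid: content was altered or signed by someone else.")
```

If even a single byte of the content changes, or if the signature was produced with a different key, verification fails; that is what makes the signature usable as a proof of provenance.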
Creating (fake) content now costs next to nothing, which means we will see virtually infinite amounts of it. Signing will soon become the only way to establish provenance, so any content not signed by its author will not be considered trustworthy.
In the not-too-distant future, every camera will be equipped with its own signature mechanism, every press outlet will sign its articles and bylines, and every company and government will be required to sign its communications… This is the only way to move from “beliefs” to “trust” in a world where we can’t trust our senses.