Blockchain can be a hero in the fight against Deepfakes
In an era of scams and fake news, so-called “deepfakes” are the latest assault on our relationship with reality. If you’re unfamiliar, deepfakes are videos generated with help from artificial intelligence that show a recognizable figure (Barack Obama or Mark Zuckerberg, for example) saying things they’ve never actually said. By putting false words in the mouths of prominent, powerful people, deepfakes pose a threat that has become a flashpoint in the ongoing culture war.
In October, a deepfake video posted on Twitter appeared to show Binance CEO Changpeng Zhao starring in a Jet Li film. Mr. Zhao is no martial arts expert, and he responded by calling the technology scary.
lol damn, this technology is scary… Video KYC and facial recognition will be out of the window soon.
— CZ Binance (@cz_binance) October 30, 2019
There are concerns about the technology’s possible effects on international diplomacy: imagine, for example, a deepfake video depicting Trump declaring war on Russia. Fears like these have prompted big names such as Facebook, DARPA, Microsoft, and Oxford University to spend heavily on finding solutions. Meanwhile, a number of blockchain companies, like Alethea AI, are using decentralized, immutable databases to help people detect fake videos.
Blockchain can be the savior
But if there’s one thing we know about blockchain, the database technology that props up popular cryptocurrencies like Bitcoin and Ethereum, it’s that it excels at verifying and confirming what is real.
Witness Media Lab issued a 72-page report that goes in-depth on the tools that stand a chance of pushing back against the threat of deepfakes, and blockchain is one of them. It’s not the first time this idea has been floated, but it’s certainly one of the more thoughtful explanations we’ve seen.
The general idea goes like this: images, videos, and audio can be cryptographically signed, geotagged, and timestamped to establish their origins. This kind of “verified capture” calls for applications to perform a number of checks, ensuring that transmitted data conforms with the source material. Media can be assigned a cryptographic hash based on the image or audio data it contains; comparing that source hash to another in search of mismatches will quickly tell you whether the media has been manipulated. In other words, blockchain can verify source media against copycats or outright manipulations the same way it verifies crypto transactions.
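The hash comparison described above can be sketched in a few lines of Python. This is a minimal illustration of the principle, not any particular company’s system: the camera (or capture app) would record the hash at capture time, and anyone receiving a copy later can re-hash it and compare.

```python
import hashlib

def media_fingerprint(data: bytes) -> str:
    """Return a SHA-256 hex digest of raw media bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical "verified capture": the hash is recorded when the media is made.
original = b"raw image or audio bytes captured by the device"
recorded_hash = media_fingerprint(original)

# Anyone can later re-hash a received copy and compare against the record.
received = original  # an unmodified copy
print(media_fingerprint(received) == recorded_hash)   # True: untouched

tampered = original + b"\xff"  # even a one-byte edit changes the digest
print(media_fingerprint(tampered) == recorded_hash)   # False: manipulated
```

Storing `recorded_hash` on a blockchain is what makes the comparison trustworthy: the reference value itself cannot be quietly rewritten after the fact.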
But this isn’t a totally bulletproof approach; it effectively calls for us to put trust in a technical system without considering its limits. The report cites media forensics expert Hany Farid saying that any finished blockchain solution for fighting deepfakes is still years away due to the complexity involved. Blockchains are still vulnerable to sophisticated attacks against their governance structures, but there’s enough promise here that people are taking note. Corin Faife, senior coordinator for Witness, offered a useful example:
“When we buy food from a supermarket, we generally expect it to be packaged in such a way that it cannot be tampered with: sealed plastic, stickers over the wrapper, and so on. It doesn’t guarantee the food will be perfect: it still might not taste good, and it can still spoil if you leave it out too long, but what you do know is that the package hasn’t been interfered with on its way to you. These authenticity measures propose something similar for video: it’s not an ultimate guarantee of truth and shouldn’t be taken as an endorsement of the content itself, but it does allow you to confirm that a media item hasn’t been tampered with on its way to you from the original point of capture.”
Blockchain companies are searching for ways to fight back against deepfakes
Alethea is using artificial intelligence tools to build a large video database that helps AI detect deepfakes in the real world. Specifically, the company is focusing on a data set of video content involving agile movement, which is harder for machines to analyze. Alethea also creates its own deepfakes, similar to the Changpeng Zhao clip. The immutability of the database ensures that it cannot be tampered with.
Alethea is also looking for ways to raise public awareness of the technology. The company says that although the naked eye cannot spot the most intricate deepfakes, education still helps the community stay wary of them.
While Alethea focuses on detection and education, Amber Video, an Ethereum-backed platform, is interested in “fingerprinting”: hashing video at the source before committing it to the blockchain. (A hash represents data as an alphanumeric code that cannot practically be forged or tampered with.)
Shamir Allibhai, CEO of Amber Video, said that fingerprinting at the source ensures that videos can be verified as original, even if they are cut and stitched together. “If the fingerprint doesn’t match, then something has been changed,” he said.
Amber Video plans to sell its technology to large companies and law enforcement agencies, environments full of internet-connected security cameras and many stakeholders who need to resolve disputes.
At a time when deepfake technology seems to only be improving, it is good to know we have tools for pushing back against it.