How blockchains can be used to authenticate video and counter deepfakes
Take, for example, police departments (where officers wear body cams) and the numerous stakeholders involved when a shooting occurs. These could include the officer involved, investigators, the person who was shot and his or her legal team, prosecutors, the judge and jury, watchdog groups, the media, and the general public. Many of these groups serve as an important part of our checks-and-balances system.
How can each of these stakeholders have confidence in the veracity of the video in a world where convincingly manipulated video exists?
Artificial intelligence technology known as deepfakes allows bad actors to create increasingly believable, maliciously altered video with less and less technical knowledge required. Swapping politicians’ faces into pornographic videos is a common stunt these days, and the same techniques can manipulate video to make it realistically appear that someone said or did something that he or she did not.
A solution that incorporates an immutable chain of custody.
One solution to the above law enforcement scenario is to fingerprint videos at the source recorder: the police body camera. These fingerprints, or cryptographic hashes, are written to an immutable ledger, or blockchain, from the recorder itself; most modern police body cameras have a wireless connection. As the video is downloaded from the device, uploaded to the cloud, clipped, and shared, each event is written to a smart contract (a transparent agreement that is part of many blockchains). This creates an audit trail for the file, and rehashing at each step confirms that the file’s integrity is preserved throughout the process.
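To make that flow concrete, here is a minimal Python sketch of the recording step, assuming a simple SHA-256 fingerprint and an in-memory list standing in for the on-chain audit trail. The file name, the record_event helper, and the device ID are illustrative assumptions, not part of any specific body-camera system or blockchain.

```python
import hashlib
import json
import time

def fingerprint(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 hash of a video file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# In-memory stand-in for the on-chain audit trail. A real deployment would
# submit each event as a transaction to a smart contract instead.
custody_log: list[dict] = []

def record_event(video_hash: str, action: str, actor: str) -> dict:
    """Append a chain-of-custody event for the given video fingerprint."""
    event = {
        "hash": video_hash,
        "action": action,        # e.g. "recorded", "uploaded", "clipped", "shared"
        "actor": actor,          # device ID or user performing the action
        "timestamp": int(time.time()),
    }
    custody_log.append(event)
    return event

# Hypothetical usage: the body camera hashes the footage and logs the recording event.
h = fingerprint("bodycam_incident.mp4")
record_event(h, "recorded", "bodycam-unit-0417")
print(json.dumps(custody_log, indent=2))
```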
When the file is shared and played by a stakeholder, the fingerprinting process is rerun, and the resulting hash is compared to the one recorded in the smart contract. Either they match or they do not: the check is binary. The video is authentic, or it has been altered.
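Continuing the sketch above (and reusing its hypothetical fingerprint function and custody_log), verification is simply rehashing and comparing. The file name is again illustrative; a real check would read the expected hash from the smart contract rather than a local list.

```python
def verify(path: str, recorded_hash: str) -> bool:
    """Rehash the shared copy and compare it to the fingerprint on the ledger."""
    return fingerprint(path) == recorded_hash

# A stakeholder checks a shared copy against the recorded fingerprint.
recorded = custody_log[0]["hash"]
if verify("shared_copy.mp4", recorded):
    print("Match: the video is authentic.")
else:
    print("Mismatch: the video has been altered.")
```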
Which would you trust more:
a) a file sitting on the cloud server of an organization with a vested interest in the outcome of a case, and within a system where numerous people have access privileges and whose security practices are opaque; or
b) a file stored in a decentralized system whose permissions are transparent, and whose fingerprints (which you can independently confirm yourself) and audit history are readily accessible?
Because stakeholders should have more confidence in the immutability of a blockchain than in the security policies and practices of a biased party, doubt about the legitimacy of the video will recede. A key premise of blockchain design is trust minimization: it seeks to create a secure environment between people and things (such as Internet-enabled devices) and allows parties who do not fully trust each other to transact with confidence.
The same holds for the organizations and departments recording the evidence: they don’t want to be tarnished by false accusations and lies. They don’t want to be on the receiving end of a faked video that purports to show their officers committing a violation that never actually occurred and is antithetical to their mission. And arbiters, as well as society as a whole, will need to be skeptical of body-cam and bystander videos that are shared without a valid chain of custody.
A frequent defense at future trials, if we haven’t implemented a solution by then, will be that ‘manipulating video is commonplace, and the court can’t trust the police or the prosecution, as they are biased parties who recorded and held the evidence on their own insecure systems; thus, the evidence should be thrown out.’ Defense lawyers may not even need to prove who supposedly altered a video: the mere fact that fake videos exist will delegitimize authentic ones. That argument, coupled with genuine questions about data integrity in centralized systems, could be enough to exclude video evidence. And without that video evidence, the case may get dismissed.
There will be bad actors on both sides of a controversial issue, each intent on skewing public sentiment in their favor. Bad actors will distribute faked videos via social media’s echo chambers, hoping they will spread, create controversy, generate outrage, and catalyze upheaval.
We must not allow this to occur. We need the right technologies, system designs, and incentives to prevent this. Evidence-based conclusions have been critical to the advancement of societies. Regressing to negative aspects of tribalism to make judgements will chip away at democratic institutions.
We can’t stop rapidly advancing technology, or the people intent on wielding it for malice, from creating fakes, but we can blunt their impact.
By securely fingerprinting video at the time of recording and tracking those hashes through to distribution, we remove a major source of doubt among stakeholders in a world where fake video is prevalent. When bad actors manipulate videos and distribute them online to sow chaos, reasonable people, guided by the pursuit of facts, will be able to confirm a video’s legitimacy and then focus on what actually transpired in the recorded scene.
And blockchain is one important technology in this process, helping to create trust minimization (rather than trust in vulnerable systems), combat video manipulation, and preserve due process.