The saying “seeing is believing” will soon be obsolete. You’re about to live in a world where video and audio of a person will seem authentic in every way, but it might not be.
From the person’s mannerisms, to the inflection in their voice, to their speech pattern — everything that makes that person who they are will seem genuine as you watch. Real and fake video will be almost indistinguishable from each other.
“Even what happened this last [presidential] election is going to be considered quaint, and almost like the Model T, when you look back at it,” says Eric Newton, Innovation Chief at Arizona State University's Walter Cronkite School of Journalism and Mass Communication.
“Fake audio and fake video, if undetected, could cause the worst thing you could imagine,” Newton says. “It could cause a war… it could cause the collapse of the stock market… it could cause any number of calamities and catastrophes because of our fight-or-flight instinct in doing things fast.”
The future is now
It may seem like we are years away from this, but we have already entered this stage of new media. Those with expertise can do all of this now, as evidenced by a fake video of Barack Obama and the rise of fake videos on Reddit — which superimpose someone’s face, usually a celebrity’s, onto the body of a pornographic film actor or actress.
What should be alarming is that the programs used to create this type of media are already being marketed to the masses. They will soon be so easy to use that, with just a little training and practice, anyone will be able to create fake video and audio.
In 2016, Adobe announced the development of a new program called Voco. Dubbed the “Photoshop for voice,” the software can create audio of someone’s voice that sounds almost exactly like the actual person.
Then there is FakeApp, the video maker that uses artificial intelligence software to swap faces onto bodies. It’s what many of the fake pornographic videos are being created with now.
Not believing anything isn’t wise either
In politics, this will work both ways.
While someone can create a nefarious video of a politician saying something, this technology can also act as a cover for the politician who gets caught actually doing something awful. The new defense will always be that the video was fake.
This becomes even more dangerous with social media. An MIT study just released a few weeks ago found that on Twitter “false news stories are 70 percent more likely to be retweeted than true stories are. It also takes true stories about six times as long to reach 1,500 people as it does for false stories to reach the same number of people.”
Which means these manipulated videos will spread much faster and farther than the real ones.
Are we ready for this media jungle?
“The average person is driving on the information-superhighway never having taken a driver’s education class, never having to take a driver’s test… it’s a recipe for problems,” Newton says.
Most people are not equipped to navigate the new media ecosystem into which they are being thrust.
Of course, there is a savior in this information-crisis world: journalism.
The news media can function as a referee. Journalists are the best prepared, with the resources, training, and credibility to separate the authentic from the inauthentic.
Too bad there is high public distrust in the media. If the public does not support the media’s work, the referees will essentially have no whistles to blow to call foul. And those who try to discredit the work of journalists are creating a space of information disarray in which manipulators can deceive amid the chaos.
We can control this, if we want to
“Ultimately, the best free society solution is that the good speech drives out the bad speech,” Newton says.
He believes that a market can be created where digital tools can check and detect fake videos. Once we have mass access to these products that can verify videos, then it will be up to the individual to challenge social media connections when they share or promote manipulated content.
Seeing may soon not be believing, but it will still be worth finding out what to believe.
Newton says there used to be an old journalism adage that went “when in doubt, leave it out.” He believes we all need to follow the new rule “when in doubt, find out.”
Adam Chiara is an assistant professor of communication at the University of Hartford. He has worked as a legislative aide in the Connecticut General Assembly, as a journalist and as a public relations practitioner. You can find him on Twitter: @AdamChiara.