Mellman: The next war


It’s natural for people to fight the last war.

And sometimes it’s absolutely necessary.

Understanding how to limit the penetration and impact of fake news is critical. Preventing hacking into American political assets by Russia or other nefarious actors is vital.


But we can’t let the last war blind us to the new and different weapons that will be deployed in the next war.


As Shane Greer, co-owner of Campaigns & Elections magazine, detailed in a recent speech to the European Association of Political Consultants, these new armaments, though often developed for wholesome purposes, are scary as hell.

Start with Voco, a software product from Adobe that’s been dubbed “Photoshop for audio.”

After recording about 20 minutes of a person’s voice, users can create new audio, indistinguishable from the target’s own voice, saying whatever the user types into the computer.

In short, Voco enables almost anyone to produce entirely new audio, in a candidate’s own voice, of something he or she never actually said. It will sound exactly as if the candidate had said it.

Last cycle, numerous false statements were attributed to Hillary Clinton. For example, a quote superimposed on a picture of Clinton, sourced to a book by Dick Morris, reading, “Look, the average Democratic voter is just plain stupid. They’re easy to manipulate,” was circulated on social media.

In fact, Clinton never uttered such words, nor does Morris’s screed even make such a claim.

But with Voco, you could actually hear Clinton “say” it, even though she never did.

It won’t be fake news, merely reporting something a candidate didn’t say. It will be fake audio of a candidate appearing to actually say something they never did.

Professor Matthias Niessner, of the Technical University of Munich, has developed a different toy. Using a simple web cam, his software allows someone to literally change another person’s facial expressions in a target video in real time. The altered video can’t be readily distinguished from the real thing.

Let’s say a candidate is giving their convention speech or participating in a debate. Someone could show it on the web in real time, altering the candidate’s expressions — substituting a smirk for a smile, fear for a frown, haughtiness for happiness or contempt for concern. And a video “recording” of the re-engineered event could be circulating on the web in no time.

Since our interpretation of others’ emotions rests so heavily on their facial expressions, such a weapon could completely change voters’ impressions of candidates.  

Like many technical innovations, Deepfake apparently got its start in the unwholesome world of porn. Sick innovators superimposed the heads of celebrities on the bodies of porn stars engaged, well … in their profession.

This is nothing new. People have long been cutting out pictures of heads and pasting them on the bodies of others to create comedic (or pornographic) scenes. But the cutouts were easily identifiable as fake.

Deepfake uses artificial intelligence to make such composites look convincingly real.

Add Voco and voters could be awash in videos portraying candidates saying and doing things they have never said nor done. And keeping track of what’s real and what’s fake could be a full-time job.

The Russians could do all this, but it doesn’t take their technical sophistication. Trump’s proverbial basement-dwelling couch potato could use these tools, off the shelf, with terrifying consequences.

In the wrong hands — and there are far too many of those — these insidious pieces of software could make 2016 look quaint, wreaking vastly more havoc on our elections than the Russians did last cycle.

We’re used to believing the evidence of our senses. We assume what we see and hear is real, and those sights and sounds are difficult to ignore.

In the next war, what our ears hear and our eyes see can be complete fiction.

And what are we doing to prepare for these challenges?

Almost nothing, as far as I can tell. We haven’t even figured out how to fight the last war, let alone the next one.

Don’t say no one warned you.

Mellman is president of The Mellman Group and has helped elect 30 U.S. senators, 12 governors and dozens of House members. Mellman served as pollster to Senate Democratic leaders for more than 20 years and as president of the American Association of Political Consultants.