Is there a difference between good and bad online election targeting?

Setting personal goals, whether saving for retirement, running a marathon, or learning to meditate, has never been easier. Four thousand years ago, the Babylonians, the first people to make New Year's resolutions, had only willpower to push them forward. Today, anyone with a smartphone has apps to guide them toward their aspirations.

Underpinning those apps is behavioral microtargeting, the union of behavioral science and machine learning, which uses a person's data to predict, and refine through experimentation, the message most likely to persuade that person to take some action. Microtargeting can be a powerful force for good. Imagine if everyone saved for retirement. Like any tool, however, it can also be used unethically, to manipulate.

With the elections heating up and news feeds brimming with ads for this candidate and that cause, voters need to be adept at distinguishing persuasive microtargeting from manipulative microtargeting. What is the difference? Persuasion involves convincing your audience that your position advances their agenda. Manipulation involves convincing your audience that your position advances their agenda when, in reality, it does not; it advances your own. Persuasion, in short, relies on integrity, whereas manipulation relies on deception.

Instances from the 2012 and 2016 elections illustrate how microtargeting can be used ethically to persuade and unethically to manipulate. In 2012, the Obama campaign created an app for supporters to donate money and find houses to canvass. By asking the people who downloaded the app for permission to scan their Facebook news feeds and friends lists, the campaign also collected data on the friends of supporters, which it used to determine who might be persuadable. The campaign then encouraged supporters to contact their most persuadable friends.

Importantly, the campaign complied with Facebook's terms of service and federal election law. The campaign had the consent of its supporters to access their data, and the supporters knew the campaign would use their data for political purposes. The campaign also directly messaged only those who downloaded the app. The transgression, albeit legal, was that although its supporters gave consent, their friends did not, and so the friends were unaware that a political campaign had obtained and used their data.

In 2016, the deceptions by Cambridge Analytica on behalf of candidate Donald Trump were numerous and more egregious. Cambridge Analytica was the American commercial subsidiary of a British company. It purchased Facebook data from a developer who duped people into relinquishing their data and friends lists under the auspices of a personality quiz purportedly for academic research. Cambridge Analytica then sent targeted ads to those people and to anyone with a similar profile.

These activities violated not only Facebook’s terms of service, which bans developers from selling its data to businesses, but also federal election law, which bans foreign nationals from participating in decisions that affect American elections. Worse, none of the people targeted by Cambridge Analytica, not the people who took the personality quiz nor their friends, knew that a political campaign had their data.

Probably most disturbing, however, was the content of Cambridge Analytica's microtargeting. According to former employee turned whistleblower Christopher Wylie, Cambridge Analytica "sought to identify mental vulnerabilities in voters and worked to exploit them by targeting information designed to activate some of the worst characteristics in people such as neuroticism, paranoia, and racial biases" that were "making them believe things that are not necessarily true."

How do voters avoid becoming victims of manipulative microtargeting? Congress has two bipartisan bills that would significantly raise online transparency standards. Introduced by Senators John Kennedy (R-La.) and Amy Klobuchar (D-Minn.), the Social Media Privacy Protection and Consumer Rights Act of 2018 would give people the right to opt out of microtargeting and keep their information private. The Honest Ads Act, also introduced by Klobuchar, with Senators Mark Warner (D-Va.) and John McCain (R-Ariz.), would ensure that online ads are subject to the same rules that apply to television, radio, and print ads.

In the meantime, technology companies have started introducing features to make it easier for people to ascertain the identities and agendas behind the ads they see. Facebook introduced an online archive of the political ads on its platform, showing who paid for them and the demographics of those targeted. Twitter launched a similar policy to help users identify political ads and who paid for them. These features should boost online transparency, but they do not give users enough control. Until privacy and election laws catch up with microtargeting, voters will have to judge for themselves the integrity, or deception, of the ads they see. My advice going into this election: ask yourself whether an ad appeals to your "inner demons" or to your aspirations.

Charlotte Stanton is a fellow in the technology and international affairs program and the director of the Silicon Valley office of the Carnegie Endowment for International Peace. She is on Twitter @CharlotteStant.