The Cambridge Analytica whistleblower on how American voters are “primed to be exploited”

Christopher Wylie says his former firm easily manipulated democracy with Facebook data. Could it happen again?

By Hope Reese

Updated Nov 4, 2019, 8:50am EST


When Christopher Wylie leaves his London flat these days, he carries a panic button. The 30-year-old data consultant, best known as the whistleblower who exposed Cambridge Analytica’s role in election interference — fueling Brexit, teaching Russians how to use propaganda to sway voters, and arguably helping elect Donald Trump — has disconnected all of the devices in his apartment, including his smart TV. He needs to ensure that no one is listening in.

In his case, the paranoia is justified: After revealing how his former London-based data firm worked with Facebook (with the social media platform’s permission), using an app that harvested data from 87 million Facebook profiles, Wylie has been stalked and threatened. He was also banned by the social media giant, a punishment Facebook hasn’t even meted out to some neo-Nazis.

But it hasn’t stopped Wylie from speaking up about what he saw at Cambridge Analytica. Mindf*ck: Cambridge Analytica and the Plot to Break America is Wylie’s new account of the rise and influence of the data company, which was created with Steve Bannon and kickstarted by a $15 million investment from Republican billionaire donor Robert Mercer. It began in 2013 when Wylie, a liberal Canadian who had helped construct the Democratic-leaning Voter Activation Network and later worked with Canada’s Liberal Party, ironically found himself designing the data architecture to support an alt-right conspiracy aimed at stoking fear and hate among Americans.

Cambridge Analytica cannot be singularly credited for the Trump victory in 2016, Wylie says now, but by studying American culture — researching how Americans responded to Fox News, for instance — the company tapped into already existing beliefs and insecurities. “America,” Wylie tells me, “was primed to be exploited.”

So the data company began feeding Americans fake news — inciting fear of immigrants, encouraging the idea that Hillary Clinton should be locked up, and dissuading African Americans from voting.

Disgusted by the actions of the company’s brass and far-right investors such as Bannon, Wylie left the company in 2014, because, he writes, “otherwise I risked catching the same disease of mind and spirit.” After leaving Cambridge Analytica, and before the 2016 election, Wylie says he tried to warn Facebook and the White House about the manipulation of American voters. But at that point, no one imagined a Trump victory. “They didn’t care,” he says. But in March 2017, two months after Trump’s inauguration, Wylie was contacted by Guardian journalist Carole Cadwalladr — and later the New York Times — and Cambridge Analytica’s work was exposed in stories published a year later.

Facebook has since been hit with a $5 billion penalty by the FTC. Cambridge Analytica dissolved. As for what happened to the scraped data? No one is quite sure.

I spoke to Wylie about how the propaganda spread by Cambridge Analytica and the Trump campaign influenced American voters and why he’s worried about the 2020 election, among other subjects.

Our conversation has been condensed and edited for clarity.

Hope Reese

The political strategy of inciting fear is not new; Nixon used it, for instance. How did Cambridge Analytica take it to the next level? And could you see the propaganda having an impact?

Christopher Wylie

This is really important for people to understand: Data sets are connected to each other. When you subscribed to a magazine two years ago, that feels disconnected from your liking something on Facebook today. But if I acquire both of those data sets, I can put them together, along with when you registered to vote, whom you voted for in a primary, and whether you’ve responded to a poll before. The people who would be targeted are called the “targeting universe.” Imagine it as a list of specific people.

So an algorithm goes through a bunch of data and makes a list of individuals. Those people would be put into a campaign, like, “the immigrants are coming,” or “Obama’s going to take your guns,” or whatever. And the people from that list who keep engaging over and over again would receive an invite. So if you know that 30 percent of a particular invite group went to an event, you know there’s roughly a one-in-three chance that any given person in that group attended.
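To make the data-joining step concrete, here is a minimal sketch of how separate data sets keyed on a shared identifier could be merged into a single “targeting universe.” Every field name, record, and identifier below is hypothetical, invented purely for illustration; the real operation combined far more sources.

```python
# Hypothetical sketch: merging unrelated data sets into one "targeting
# universe" by joining on a shared identifier (here, an email address).
# All names, fields, and records are invented for illustration.

magazine_subs = {"alice@example.com": {"magazine_sub": "2017"}}
facebook_likes = {"alice@example.com": {"likes": ["border security news"]}}
voter_file = {"alice@example.com": {"registered": True, "primary_2012": "R"}}

def build_targeting_universe(*datasets):
    """Fold per-person attributes from every data set into one record."""
    universe = {}
    for dataset in datasets:
        for person, attributes in dataset.items():
            universe.setdefault(person, {}).update(attributes)
    return universe

universe = build_targeting_universe(magazine_subs, facebook_likes, voter_file)
print(universe["alice@example.com"])
# {'magazine_sub': '2017', 'likes': ['border security news'],
#  'registered': True, 'primary_2012': 'R'}
```

The only point of the sketch is that once records share a key, a magazine subscription from years ago and a Facebook like from today collapse into a single profile of one person.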

Everything is tracked — when you click on stuff, when you share stuff. Imagine you are a target. You’re sitting in your living room, and you see ads for a group, and you click on it, and you join that group, and you start having conversations. A couple of days later, you get a share from somebody in the group about some kind of weird thing that Obama is doing. And you’re slightly outraged by it. And then you keep clicking on stuff, and a week later you get a phone call: a poll asking your opinion about something.

When you’re talking on the phone to some random polling company, you’re not thinking that it’s connected to the things you saw last week, the chats you had last week. And if you respond in a particular way, you get put into a new target group where they try to push more content. If you engage at a certain rate, somebody might send you a message or an email saying, “Hey, do you want to come to this event?” You don’t suspect that you’re in a target universe, because you don’t even know what that is — you’ve never even heard of it.

What you were doing in your living room two weeks ago, that phone call or email, or a knock on the door from a canvasser — you don’t see how they’re connected, but they are.
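The funnel Wylie describes can be read as a simple feedback loop: log every action, compute an engagement rate, and escalate the next contact accordingly. Below is a hypothetical sketch of that loop; the thresholds, labels, and actions are all invented for illustration, not drawn from the company’s actual systems.

```python
# Hypothetical sketch of the escalation loop: log each tracked action,
# then choose the next contact based on the person's engagement rate.
# Thresholds and labels are invented for illustration.

from collections import defaultdict

engagement_log = defaultdict(list)  # person -> list of tracked actions

def track(person, action):
    engagement_log[person].append(action)

def next_touch(person, impressions=10):
    """Pick the next contact from how often this person engaged."""
    rate = len(engagement_log[person]) / impressions
    if rate >= 0.5:
        return "event_invite"   # heaviest engagers get the invite or knock
    if rate >= 0.2:
        return "poll_call"      # mid-level engagers get the 'random' poll
    return "more_content"       # everyone else just sees more ads

track("alice@example.com", "clicked_ad")
track("alice@example.com", "joined_group")
track("alice@example.com", "shared_post")
print(next_touch("alice@example.com"))  # -> poll_call
```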

Hope Reese

You were becoming uncomfortable with what was happening at Cambridge Analytica, but it felt abstract for a while — until a video you saw made it feel more real to you.

Christopher Wylie

Some people in the target universe would get invited to focus groups or events, and those would often be filmed. It becomes a lot more real when you go from looking at a record ID number to actually seeing a video of somebody filled with rage about something that’s completely made up. They don’t understand that what they’re angry about was specifically crafted and curated to make them feel that way, about something that may or may not be real.

I looked at that and thought: This is not just a game of math. It’s not increased numbers here and decreased numbers there, attached to database ID numbers. All of a sudden, there’s an actual person who looks like they’re about to break a chair because they’re so angry about something that you know, but they don’t know, was made up. They’re there because they’ve been clicking on stuff, and they’ve been manipulated to feel this way.

Hope Reese

Obama’s political campaign also created and spread targeted ads on Facebook. How was what happened in the Trump campaign different?

Christopher Wylie

The Obama campaign didn’t rely on scaled disinformation. Cambridge Analytica was trying to identify people who were prone to conspiratorial thinking or paranoid ideation and to exacerbate those latent characteristics in those people. The Obama campaign focused on identifying people who typically didn’t vote or voted infrequently. For people of color or single women with children, there are structural obstacles to voting, so motivating them to vote was a big focus.

I don’t think there’s an innate problem with targeting in campaigns. If you care about the environment, I should be talking to you about the environment. Where the line gets crossed is where you start to effectively stalk a person, going beyond the issue and looking at: How does this person make decisions? What are their emotional vulnerabilities, and how can I exploit them? And in terms of transparency, when the Obama campaign did advertising, you were aware that you were seeing an ad.

Hope Reese

At Cambridge Analytica, Russian businessmen visited the office frequently — but at the time, Russia was not on anyone’s radar. How much of Russian involvement in the US election happened via Cambridge Analytica?

Christopher Wylie

At the time, it was weird — but there were a lot of weird things all happening at the same time. Steve Bannon was weird. Everything that the company did was weird. But when the Russian involvement started to come into the public consciousness, I thought: wait a second. This company was advising Donald Trump, and Russian businessmen were coming in left, right, and center.

I’m not saying there was a conspiracy. But there was so much frequent contact, where we explained over and over and over again, to people connected to Russian intelligence services, “Hey, we have all this data. We have this AI. This is how we’ve done it.” Our presentation in St. Petersburg was literally about the efficacy of voter targeting in the US using social media data. There were a lot of opportunities for exploitation.

Hope Reese

How can we begin to regulate the use of private data from Facebook or other social media platforms?

Christopher Wylie

I am not a policy expert — I’m a dude who works with tech. But I have noticed a couple of things that I find concerning and irritating about how policymakers talk about tech. There’s this notion that “the law can’t keep up with technology,” that technology moves so fast we can never create rules to keep up with it. I’ve heard that so many times from members of Congress. But I point out that we have all kinds of safety regulations for aerospace, nutrition, power plants, cancer medicine, and pharmaceuticals — for the types of fertilizers and pesticides that are allowed or not.

These are all products of technology. The difference is that we have technically competent regulators who are empowered by the law to make decisions on the public’s behalf, without a debate in Congress. So Silicon Valley is like, “Well, you don’t understand the algorithms, so how are you going to debate this in Congress?” But members of Congress also don’t understand how a nuclear power plant works. So the debate in Congress is: Should we empower people who know how this works to make rules about the safety of these nuclear power plants? Yes. Cool. Let’s create the Department of Energy. And they make regulations. So the first thing is that we need to get over this idea that just because it’s software, somehow the law can’t keep up.

Hope Reese

How much of a problem will this be in America’s 2020 election?

Christopher Wylie

A relatively small company in London, within a couple of years, built up a sophisticated capacity to target and deliberately manipulate a subset of American voters, enough to push certain candidates over the line. Even though Cambridge Analytica has dissolved, the same people are working on the Trump campaign. And there is no way to confirm that the data sets they amassed are actually gone.

If a company like Cambridge Analytica can do it, what happens when China becomes the next Cambridge Analytica? What happens when North Korea or Iran becomes the next Cambridge Analytica? Cambridge Analytica was first. But these are countries that have more than enough capacity to replicate the work that Cambridge Analytica was doing. And probably go further. This is why I was so upset with Facebook: It’s about the fact that we have unsafe platforms that are causing a huge risk to the integrity of democracy in the United States and around the Western world.

Look at Mark Zuckerberg’s speech, where he said, essentially, “Well, disinformation — you’re just going to have to deal with that.” Why is it that he, unilaterally, gets to decide how much or how little disinformation is part of our electoral process?

The most egregious thing is what Cambridge Analytica has exposed: that we have relegated the security and integrity of our democracy to a private company that doesn’t really want to do anything about it.

Hope Reese is a writer based in Louisville, Kentucky, currently living in Budapest. Her work has appeared in the Atlantic, the Boston Globe, and Vice.

Source: vox.com
