
Beware the ‘Weaponized’ Web, Says Guy Who Helped Elect Trump

Cambridge Analytica whistleblower Christopher Wylie on Facebook, democracy and hope.

Zoë Ducklow, 19 Apr 2019, TheTyee.ca

Zoë Ducklow is an independent journalist and photographer covering current affairs.

Christopher Wylie, of highlighter hair and Cambridge Analytica fame, was in Vancouver this week to talk about “Confronting the Disinformation Age.”

In 2018, Wylie blew the whistle on Facebook and the consultancy firm Cambridge Analytica, co-founded by Trump campaign strategist Steve Bannon, for collecting data without users’ consent.

The extensive and personal data sets, Wylie said, were used to target people with political messaging that appealed to their specific psychologies.

“Fashion data was used to build AI models to help Steve Bannon build his insurgency and build the alt-right. And the alt-right is an insurgency,” he said in a presentation last November. “We used weaponized algorithms, we used weaponized cultural narratives to undermine people and undermine their perception of reality.”

He said the success of the targeting led to Donald Trump’s election win and the Brexit vote.

The Democracy Project wanted to know what the Victoria native had to say about democracy in Canada, and shared this interview with The Tyee. It has been edited for clarity and length.

Zoë Ducklow: Who does democracy work for, and who is left out of the democratic process?

Christopher Wylie: Historically, democracy has worked for people who hold the power of information rather than for people who are merely recipients of it. Whether under elections or monarchs, control of what is allowed to be said or not said in the press has been the central aspect of power.

Now, information has been decentralized, so it’s no longer just people who can influence the media. It’s tech platforms that have literally created the informational ecosystem themselves. There’s a new concentration of power, big tech companies, who hold the ecosystem where we get our information. That’s problematic, because there are no rules requiring these companies to act in good faith.

For example, with me as a whistleblower, after I came forward with information and reported Facebook to the authorities, they immediately banned me from their platforms with no due process, no right of appeal. It’s technically a private space, so they can do that.

If you can, without any due process, ban a whistleblower who’s reporting you to the authorities, what happens when these platforms feel threatened and decide to start influencing the political discourse even more?

Are you still banned from Facebook?

Girl, I am banned from Facebook, Instagram — apparently I’m no longer banned from WhatsApp, but I don’t use it. Nor should anybody.

Ooh, it used to be the safe one.

Um, it’s owned by Facebook. That’s all I’ll say.

You’ve spoken about how Cambridge Analytica used weaponized targeting to identify people who, based on their fashion choices, were perhaps susceptible to becoming alt-right. How does fashion intersect with democracy?

Cambridge Analytica was looking for proxies that could be measured and that were interrelated with people’s identity and personality traits. They weren’t literally targeting on the basis of whether you wore Levi’s jeans or not; they used that in conjunction with lots of other data sets to build a holistic profile. Nonetheless, fashion played a more significant role than I think people realize.

When you look at the history of politics, movements so often have a ‘look.’ Whether it’s fascists like the Nazis, authoritarian regimes like the Maoists or the Stalinists, or the alt-right here, you have an image of them, because one of the first things they do is develop an aesthetic. So often, totalitarianism is about the aesthetics of society rather than any kind of policy prescription. It’s about what things should look like and feel like.

Even during the feminist revolution when people were burning bras, or wearing lower-cuts or things like that, they were using fashion to redefine themselves and redefine their position in society. So it works both ways. 

Fashion plays a role... because if you can enforce a new kind of identity and a new kind of conformity, you change how people see themselves. So in that sense, fashion is actually quite influential in politics. It’s subtle, but it’s influential.

What is the fashion look of the alt-right?

Basic bitches.

When you think about what a person in the alt-right looks like versus a hippie-dippie liberal, you can immediately picture people and what they look like. All those angry white dudes with the tiki torches were all wearing beige khakis and polo shirts. That’s not to say that if somebody wears beige khakis they’re all of a sudden a proto-Nazi, but there is something to be said about the fact that if you are trying to whitewash society — quite literally, racially whitewash society — part of that conformity comes with what people look like and what they wear.

In terms of the digital ecosystem that would enable targeting like that to happen in Canada, we’re wide open, right?

Oh completely. I mean, this is what people really have to understand: most people socialize and get their information now on a set of American-owned tech platforms that have no rules.

Facebook, for example, did nothing to stop the rampant disinformation and hate propaganda happening in Myanmar, where the United Nations said it contributed to ethnic cleansing. Even the threshold of ethnic cleansing, apparently, is not enough to take action on your own platform. I do not have faith that these platforms will do their due diligence and their duty to protect free and fair elections around the world, and to make sure that the discourse happening on these platforms is safe for people.

What can regular people do to be aware of their own digital influences?

Tech platforms are not services. We talk about Facebook, or Instagram, or whatever, as services where you opt in by clicking to accept the terms and conditions. But what they actually are is architectures. The top job title at Facebook, for example, is engineer and data architect. They’re building and engineering informational architectures. Look at how we treat architectures in other domains. In physical architecture, for example, we have building codes; electrical and water infrastructure is regulated as utilities. These all carry a higher standard of regulatory obligation, because you as the consumer do not actually have bargaining power.

It is unreasonable to expect a person to negotiate a special contract with an energy company, or to let an architect build a building without emergency exits and say, ‘Because of my user experience concept, I’m not going to have fire exits. But I’ll slap some terms and conditions on the door that you’ll accept when you walk into the building.’ Right?

Unfortunately, there isn’t really anything an individual can do, per se, about the informational architecture they live in. But what they can do is talk to people in positions of power who can create a set of rules, a building code so to speak, for the internet, and create technically competent regulators to enforce safety standards.

Has democracy changed because of artificial intelligence?

Oh yes, completely. Undeniably. That’s not to say that we’ve got an evil team of robots somewhere controlling the masses. But the news feed that you see on Facebook is substantially influenced by AI. The things that you see online are not random. This is how targeted adverts work; this is how user experience works. You are put in an environment which is constantly observing you and constantly thinking about you.

The problem is that a lot of these big tech companies have created AI widgets and tools and whatnot and sort of released them onto the population without really knowing how they work.

If you’ve created a piece of AI that’s simply looking for engagement, people will click and share things that are shocking. It’s just human behaviour. So if you’ve got something as cold and calculating as a piece of AI driving up engagement rates, it’s going to promote things it probably shouldn’t be promoting. Even on YouTube: you go and watch the livestream of Notre Dame burning, and underneath it is a pop-up about 9/11, because [the AI] looks for relationships between big disasters and what other people are talking about. You’re watching a cathedral burn, and all of a sudden you’re starting to think, was it Muslim terrorists? All because of a piece of AI trying to get engagement and click-through.

If the information environment we’re living in is filtered through these pieces of AI, that innately changes what we focus on and what our dialogue is, which in turn affects elections. AI is absolutely having profound effects on how democracy works.

How do we need to evolve to avoid another Cambridge Analytica situation?

Other professions — engineers, architects in the physical space, lawyers, doctors, nurses — have ethical standards. You not only need the academic and technical expertise to do the job; you are also held to a higher set of ethical standards, because you are in a position of unique power.

It really frustrates me that software engineers and data scientists — people who are building the technologies you touch almost every minute of your life — have no obligation to consider the ethical impact of what they’re building: whether it is safe for you, whether it’s deceptive or coercive. Anything is okay so long as it doesn’t meet the threshold of technical criminality. That’s a real problem, and that’s where we need to start.

But most engineers aren’t bad people. They want to build something that’s cool. The problem is that when they feel uncomfortable about what they’re working on, they don’t have the legal backing to say no to the company. As a nurse, if you were told to do something unethical, you would not only have an obligation to say no, but the hospital couldn’t retaliate against you. It’s not about creating moral and ethical standards for software engineers because they’re all trying to do evil things; it’s about empowering engineers who see something problematic to say no without threat of retaliation.

You’ve talked about tech companies being the new colonizers. Can you explain what you mean by that? Are certain groups of people maybe more vulnerable to being exploited?

Look at Europeans coming into North or South America, or Australia, or Asia, where this narrative of ‘terra nova’ (new land, empty land, barren land) was used to justify claiming the land and starting to rule over it. The Indigenous people were just considered flora and fauna; they weren’t considered people. When you look at how a lot of these big tech platforms look at the internet, it’s a terra nova mindset: ‘We can do whatever we want because it’s the internet.’

What colonizers wanted to do, and still do in a lot of the global south, was exploit resources and exploit people to make money. Gold, rubber, sugar cane, whatever.

But now you are the resource: your data and your identity.

The scary thing for me is that when you look at the history of what happens when we start treating people as property, you’ve got some really scary industries: the slave trade, the sex trade, organ trade. Now we’re on the precipice of a global data trade, where your identity and who you are and your behaviour becomes a product.

It’s scary when you look at the development of technology over the next five to 10 years, with the implementation of 5G networks enabling the Internet of Things, so that your toothbrush can make a dentist appointment and talk to your fridge, and your fridge can talk to Google or Amazon or whatever. All of a sudden every object in your house becomes a tool of surveillance. Not only a tool of surveillance but, because these objects can influence your physical space, potentially a tool of control and influence.

For the first time in history, the environment itself thinks about you and seeks to influence you. Your car talks to the road and the road decides whether or not you should be on time for work depending on the subscription package you’ve selected for your self-driving car. If we allow this kind of influence to be unchecked, once you’re inside of it, it’s very difficult to leave or challenge it.

Is there hope?

I think there is. The fact that in under a year this has become such a prevalent conversation is really hopeful. A year and a half ago, it was a niche, abstract issue talked about in the ivory towers of universities. Now it’s in mainstream political discourse.

The first step to any kind of social change is awareness. We can’t move anywhere until people are aware that there’s a problem. I am hopeful because only a couple of years ago these giant tech companies were seen as sort of the saviours of mankind, but now they look like just another douchey oil company trying to hide what it does, trying to whitewash and greenwash, or in this case machine-wash, what it’s doing. But actually it’s all bullshit, and it’s just another big corporation. I think people are starting to wake up to that, and that is what makes me hopeful. Regular people who don’t have tech backgrounds are starting to talk about it. [Tyee]
