Mindf*ck: Cambridge Analytica and the Plot to Break America
Christopher Wylie
Random House
Copyright © 2019 by Verbena Limited
I suspect most Americans didn’t read Robert Mueller’s convincing but legalistic report — so many were faked out by the disingenuous William Barr and the drumbeat of Fox News lies. Even after Mueller testified, with extreme hesitancy, they didn’t appreciate the extraordinary extent of the Russian Internet Research Agency’s phony Facebook postings and disinformation, the impact of the GRU’s hacking of the DNC servers, or the effective release by WikiLeaks of the stolen emails.
And so it falls to the flamboyant, charismatic, brilliant Christopher Wylie, he of the dyed hair and the younger generation, to explain the oh-so-successful mindf*ck pulled off by the Trump team and the Russians — never by the Ukrainians.
And because he was at the very heart of the operation, Wylie’s task — his obsession, really — is to explain what really happened, the most chilling and comprehensive hijack of American democracy: “The largest data crime … in history.”
Thankfully Christopher Wylie is blowing the whistle on Strategic Communication Laboratories (SCL), parent of Cambridge Analytica, and on Steve Bannon and the Mercers and the Trump campaign and the con men and con women of Facebook, our social media gone amok. And because he’s both a very good writer and a digital visionary, he forces us to see what we’re so very reluctant to see: our willing surrender of our most personal, private data to money-hungry, anti-democratic manipulators who, every day, pervert privacy, enable tyrants and help elect the unworthy.

Christopher Wylie thought he was helping develop tools to combat the online spread of extreme ideologies. Instead, he helped build the machinery necessary to empower a growing alt-right conspiratorial movement.
But because he learned and grew along the way, “Mindf*ck” is the redemptive clarion call of someone who knows better than anyone that we are all canaries in their coalmines, that we’re running out of air.
Wylie introduces us to the British dandy Alexander Nix of SCL and Cambridge Analytica; the computer scientists and researchers like Dr. Aleksandr Kogan; the eccentric billionaires Robert and Rebekah Mercer; and Steve Bannon, the right-wing former editor of Breitbart News and former strategist for the Trump White House, a man the Mercers bankroll no matter what.
While Mercer saw a money-making gold mine, Bannon appreciated the political potential of aggregating and manipulating the vast mega-mountains of personal data SCL, Analytica and Kogan were sucking up. And don’t forget the requisite Russians, the Putin-friendly oil executives of Lukoil, and Konstantin Kilimnik, who helped Paul Manafort install a pro-Russian Ukrainian government. Not surprisingly, they were more than thrilled to learn about the enormous data now available.
Wylie was hired by Strategic Communication Laboratories, led by Nigel Oakes and Alexander Nix, “a British military contractor, working on big ideas, with a growing team of mostly gay and mostly liberal data scientists … breaking into a new field of researching societies. If we could put society into a computer, we could start to quantify everything and encapsulate problems like poverty and ethnic violence in a computer; we could simulate how to fix them … I did not yet see the contradiction in what I was doing.”
Thanks to computers and social media, “suddenly military and security agencies had direct access to the minds and lives of guards, clerks, girlfriends, and runners of criminal and terrorist organizations all around the world … a trail of detailed personal information that previously would have taken months of careful observation to gather …”
I spoke of good writing: Wylie tells us Nix “was born into the British upper classes and schooled at Eton, an institution where the royals send their children and whose uniform still includes collars and tails … His accent was as rich as they come. He wore black-rimmed glasses, and his floppy strawberry-blond hair had a deliberately casual little flip to it …

“Nix was the ringleader, the grinning, soulless salesman who didn’t understand anything we were doing but wasted no time selling it … The directors of SCL assigned him to lead the firm’s side business of rigging elections in forgotten countries of Africa, the Caribbean, and South Asia. … [giving] these politicians access to anything they wanted in the old imperial capital of London … invitations to exclusive parties, or, if desired, the private company of elegant and open-minded women …[to] broker deals between ministers who were looking for validation and women, and businessmen who were looking to exploit corrupt business opportunities and travel unnoticed … Born too late to play colonial master in the old British Empire, he treated SCL as the modern equivalent. As Nix put it in one of our meetings, he got to ‘play the white man.’ ‘They [are] just niggers,’ he once said to a colleague in an email, referring to black politicians in Barbados.”
Case in point, Trinidad: “The Trinidad Ministry of National Security wanted to know whether it was possible to use data to identify Trinidadians who were more likely to commit crimes—and, beyond that, whether it was possible to predict when and how they might do it … But the truth was, the Trinidadian government wasn’t interested only in reducing crime. They knew that if we built a tool to forecast behavior, they could use it in elections …
“Trinidad government contacts were offering SCL access to the unredacted, de-anonymized census … Essentially the Trinidadian government was violating the privacy of all its citizens in one swoop …” Next SCL wanted to combine raw census data with information about what the citizens of Trinidad were doing and thinking, “using the Internet to collect relevant data … Working with a set of contractors, SCL was able to tap into the telecom firehose, pick an IP address, and then sit and watch what a person in Trinidad was browsing on the Internet at that very moment.
“We were spying, pure and simple, with cover from Trinidadian leaders. It felt bizarre—unreal—to be observing what people were watching on a tiny, faraway island, somehow more like we were playing a video game than intruding on the private lives of actual people. Even today, thinking back on it, Trinidad seems more like a dream than something we actually did. But we did do it … As I watched those livestreams, I didn’t allow myself to actually picture the human prey, people who had no idea that their private behavior was delighting sinister audiences half a world away. The Trinidad project was my first taste of this new wave of digital colonialism. We arrived unannounced with our superior technology and moral disregard, no better than the king’s armies. Except this time, unlike the conquerors of old, we were completely invisible.” (Emphasis added.)
Wylie explains that the key is getting as much useful data as possible from as many people as possible. The process began with Amazon’s Mechanical Turk (MTurk) program, where a small army of people agree “to do micro-tasks—such as typing in scans of receipts or identifying photographs—for small amounts of money.” SCL then got these MTurk workers to take personality tests and to comment on products, concepts and ideas, paying them a small fee to do so.
Enter the Mercers, Robert and Rebekah, who, like Facebook, appreciated the link between knowing what people did and what they liked and the ability to make money. They invested between $15 million and $20 million in SCL. “To put it crudely, if we could copy everyone’s data profiles and replicate society in a computer … we could simulate and forecast what would happen in society and the market. This seemed to be Mercer’s goal. If we created this artificial society, we thought we would be on the threshold of creating one of the most powerful market intelligence tools in the world … cultural finance and trend forecasting for hedge funds.”

“Mercer, the computer engineer turned social engineer, wanted to re-factor society and optimize its people. One of his hobbies is building model train sets, and I got the feeling that he thought he could, in effect, get us to build him a model society for him to tinker with until it was perfect. By taking a leap at quantifying many of the intrinsic aspects of human behavior and cultural interaction, Mercer eventually realized that he could have at his disposal the Uber of information warfare …”
Enter Dr. Aleksandr Kogan of the University of Cambridge, a Soviet-born American expert who exploits a 2015 study revealing that a computer model using data from Facebook likes can predict human behavior: “With ten likes, the model predicted a person’s behavior more accurately than one of their co-workers. With 150 likes, better than a family member. And with 300 likes, the model knew the person better than their own spouse … Facebook knows more about you than any other person in your life …” Kogan shows them how to maximize and harvest that data through the Facebook apps he and his colleagues have developed. (Emphasis added.)
Much like MTurk: “A person would agree to take a test in exchange for a small payment. But in order to get paid, they would have to download Kogan’s app on Facebook … The app, in turn, would take all the responses from the survey and put those into one table. It would then pull all of the user’s Facebook data and put it into a second table. And then it would pull all the data for all the person’s Facebook friends and put that into another table …
“It always started with a … personality measure called the IPIP NEO-PI, which presented hundreds of items, like ‘I keep others at a distance,’ ‘I enjoy hearing new ideas,’ and ‘I act without thinking.’ When these responses were combined with Facebook likes, reliable inferences could then be made … [and] when those likes were combined with hundreds of other likes, as well as other voter and consumer data, then powerful predictions could be made. Once the profiling algorithm was trained and validated, it would then be turned onto the database of Facebook friends … [because] we had access to their likes page, which meant that the algorithm could ingest the data and infer how they likely would have responded to each question if they had taken a survey …
“Kogan’s suggestions began to match exactly what Bannon wanted … that we should begin examining people’s life satisfaction, fair-mindedness (fair or suspicious of others), and a construct called ‘sensational and extreme interests,’ which has been used increasingly in forensic psychology to understand deviant behavior. This included ‘militarism’ (guns and shooting, martial arts, crossbows, knives), ‘violent occultism’ (drugs, black magic, paganism), ‘intellectual activities’ (singing and making music, foreign travel, the environment), ‘occult credulousness’ (the paranormal, flying saucers), and ‘wholesome interests’ (camping, gardening, hiking) …
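The pipeline Wylie describes, one app install feeding three tables at once, can be sketched as a minimal data model. This is only an illustration of the structure as the book describes it; the class, field and function names are invented, not Kogan’s actual schema:

```python
# Minimal sketch of the three-table harvest Wylie describes.
# All names here are illustrative, not Kogan's actual schema.
from dataclasses import dataclass, field

@dataclass
class Harvest:
    survey_responses: list = field(default_factory=list)  # table 1: the paid survey answers
    user_profiles: list = field(default_factory=list)     # table 2: the app user's Facebook data
    friend_profiles: list = field(default_factory=list)   # table 3: data on every friend

def ingest(harvest, user_id, answers, profile, friends):
    """One app install populates all three tables at once."""
    harvest.survey_responses.append({"user": user_id, "answers": answers})
    harvest.user_profiles.append({"user": user_id, **profile})
    # The friends never consented; their data rides in with the user's.
    harvest.friend_profiles.extend({"friend_of": user_id, **f} for f in friends)

h = Harvest()
ingest(h, "u1", {"q1": 4}, {"likes": ["Katy Perry"]},
       [{"id": "f1", "likes": ["camping"]}, {"id": "f2", "likes": ["guns"]}])
```

The crucial design point is the third table: each consenting user silently delivers a few hundred non-consenting friends, which is what let the dataset scale far beyond the people who actually took the survey.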

Why did Facebook allow developers to access your data? “The more it learns about its users, the more it can monetize them … Facebook did not require express consent for apps to collect data from an app user’s friends … even if the friends had no idea the app was harvesting their private data. The average Facebook user has somewhere between 150 and 300 friends. My mind turned to Bannon and Mercer, as I knew they would love this idea—and Nix would simply love that they loved it … [Because] If I create a Facebook app, and a thousand people use it, I’ll get like 150,000 profiles …” (Emphasis added.)
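The multiplication behind that last quote, roughly 150 to 300 friend profiles per installer, works out as follows. The figures come from the passage; the function name is an illustration, not anything from the book:

```python
# Rough sketch of the friend-harvest arithmetic Wylie describes.
# The 150-300 friends-per-user range is from the passage.

def harvested_profiles(app_installers: int, avg_friends: int) -> int:
    """Each installer yields their own profile plus their friends' profiles."""
    return app_installers * (1 + avg_friends)

# "If I create a Facebook app, and a thousand people use it,
# I'll get like 150,000 profiles" -- the low end of the range:
low = harvested_profiles(1_000, 150)   # 151,000 profiles
high = harvested_profiles(1_000, 300)  # 301,000 profiles
print(low, high)
```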
Wylie explains how valuable this data was, and is: “friends, colleagues, spouses, and parents typically see only part of your life … Your parents may never see how wild you can get at a 3 A.M. rave after dropping two hits of MDMA, and your friends may never see how reserved and deferential you are in the office with your boss … But Facebook peers into your relationships, follows you around in your phone, and tracks what you click and buy on the Internet. This is how data from the site becomes more reflective of who you ‘really are’ than the judgments of friends or family …”
Surprise, surprise: Kogan also worked at the University of St. Petersburg, and theoretically all this data was easily accessible to the same Russians who hacked the DNC. How could this data translate into political influence and votes? Here is the moment when Steve Bannon knew:
“‘Give me a name.’ Bannon looked bemused and gave a name. ‘Okay. Now give me a state.’ ‘I don’t know,’ he said. ‘Nebraska.’ Jucikas typed in a query, and a list of links popped up. He clicked on one of the many people who went by that name in Nebraska—and there was everything about her, right up on the screen. Here’s her photo, here’s where she works, here’s her house. Here are her kids, this is where they go to school, this is the car she drives. She voted for Mitt Romney in 2012, she loves Katy Perry, she drives an Audi, she’s a bit basic, and on and on and on. We knew everything about her—and for many records, the information was updated in real time, so if she posted to Facebook, we could see it happening.
“And not only did we have all her Facebook data, but we were merging it with all the commercial and state bureau data we’d bought as well. And imputations made from the U.S. Census. We had data about her mortgage applications, we knew how much money she made, whether she owned a gun. We had information from her airline mileage programs, so we knew how often she flew. We could see if she was married (she wasn’t). We had a sense of her physical health. And we had a satellite photo of her house, easily obtained from Google Earth. We had re-created her life in our computer. She had no idea …
Data about tens of millions. Possibly 200 million by the end of the year. “And we know literally everything about these people?” asked Nix. “Yes,” I told him. “That’s the whole point.” … “Do we have their phone numbers?” Nix asked. I told him we did … [then] he reached for the speakerphone and asked for the number …
“We heard a woman say ‘Hello?’ and Nix, in his most posh accent, said, ‘Hello, ma’am. I’m terribly sorry to bother you, but I’m calling from the University of Cambridge. We are conducting a survey … ‘Ms. Smith, I’d like to know, what is your opinion of the television show Game of Thrones?’ Jenny raved about it—just as she had on Facebook. ‘Did you vote for Mitt Romney in the last election?’ Jenny confirmed that she had. Nix asked whether her kids went to such-and-such elementary school, and Jenny confirmed that, too. When I looked over at Bannon, he had a huge grin on his face …
“Bannon said, ‘Let me do one!’ We went around the room, all of us taking a turn. It was surreal to think that these people were sitting in their kitchen in Iowa or Oklahoma or Indiana, talking to a bunch of guys in London who were looking at satellite pictures of where they lived, family photos, all of their personal information. Looking back, it’s crazy to think that Bannon—who then was a total unknown, still more than a year away from gaining infamy as an adviser to Donald Trump—sat in our office calling random Americans to ask them personal questions. And people were more than happy to answer him.
“We had done it … This was an epic moment. I was proud that we had created something so powerful. I felt sure it was something that people would be talking about for decades.” (Emphasis added.)
One dot led to another: “Bannon’s goal was to change politics by changing culture; Facebook data, algorithms, and narratives were his weapons. First we used focus groups … [to] learn what people cared about—term limits, the deep state, draining the swamp, guns, and the concept of walls to keep out immigrants were all explored in 2014, several years before the Trump campaign. We then came up with hypotheses for how to sway opinions. CA tested these hypotheses with target segments in online panels or experiments to see whether they performed as the team expected, based on the data …
Now here’s a critical part of the process: “the team was able to identify people … who were more prone to impulsive anger or conspiratorial thinking than average citizens. Cambridge Analytica would target them, introducing narratives via Facebook groups, ads, or articles that the firm knew from internal testing were likely to inflame the very narrow segments of people with these traits. CA wanted to provoke people, to get them to engage.”
In the summer of 2014, Cambridge Analytica “began developing fake pages on Facebook and other platforms that looked like real forums, groups, and news sources … Because of the way Facebook’s recommendation algorithm worked, these pages would pop up in the feeds of people who had already liked similar content. When users joined CA’s fake groups, it would post videos and articles that would further provoke and inflame them …”
And here we have it. Christopher Wylie explains Donald Trump’s success in 2016. “Now CA had users who (1) self-identified as part of an extreme group, (2) were a captive audience, and (3) could be manipulated with data. Lots of reporting on Cambridge Analytica gave the impression that everyone was targeted. In fact, not that many people were targeted at all. CA didn’t need to create a big target universe, because most elections are zero-sum games: If you get one more vote than the other guy or girl, you win the election. Cambridge Analytica needed to infect only a narrow sliver of the population, and then it could watch the narrative spread.” (Emphasis added.)
Then Cambridge Analytica took it a step further, moving to political agitation: “Once a group reached a certain number of members, CA would set up a physical event … People would show up and find a fellowship of anger and paranoia … The meetings took place in counties all across the United States, starting with the early Republican primary states, and people would get more and more fired up at what they saw as ‘us vs. them.’ …
“CA estimated that if only 25 percent of the infrequent voters who began clicking on this new CA content eventually turned out to vote, they could increase statewide turnout for the Republicans in several key states by around 1 percent, which is often the margin of victory in tight races. Steve Bannon loved this … He urged us to include what were in effect racially biased questions in our research, to see just how far we could push people. The firm started testing questions about black people—whether they were capable of succeeding in America without the help of whites, for example, or whether they were genetically predetermined to fail …

“Cambridge Analytica’s findings confirmed his suspicion: America is filled with racists who remain silent for fear of social shunning … an undercurrent populated by millions of intense and angry young men … [and] Bannon realized he could harness them and their anonymous swarms of resentment and harassment … Over the course of many experiments, we concocted an arsenal of psychological tools that could be deployed systematically via social media, blogs, groups, and forums …
“What CA observed was that when respondents were angry, their need for complete and rational explanations was also significantly reduced. In particular, anger put people in a frame of mind in which they were more indiscriminately punitive, particularly to out-groups. They would also underestimate the risk of negative outcomes. This led CA to discover that even if a hypothetical trade war with China or Mexico meant the loss of American jobs and profits, people primed with anger would tolerate that domestic economic damage if it meant they could use a trade war to punish immigrant groups and urban liberals …
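The turnout arithmetic quoted above, 25 percent of targeted infrequent voters converting into roughly a 1 percent statewide lift, can be made concrete with a back-of-the-envelope sketch. Only the 25 percent conversion rate and the roughly 1 percent lift come from the book; the electorate and target-pool sizes below are invented for illustration:

```python
# Back-of-the-envelope version of CA's turnout estimate from the passage.
# Only the 25% conversion rate and ~1% lift are from the book; the
# electorate and target-pool sizes are hypothetical.

def turnout_lift(targeted_infrequent_voters, conversion_rate, statewide_voters):
    """New votes from mobilized infrequent voters, as a share of the electorate."""
    new_votes = targeted_infrequent_voters * conversion_rate
    return new_votes / statewide_voters

# e.g. reaching 120,000 infrequent voters in a state of 3 million voters:
lift = turnout_lift(120_000, 0.25, 3_000_000)
print(f"{lift:.1%}")  # 1.0% -- often the margin of victory in a tight race
```

The point of the sketch is how small the target pool can be: a sliver of 4 percent of the electorate, converting at a quarter, is enough to move a close state.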
Meanwhile “CA’s client list eventually grew into a who’s who of the American right wing. The Trump and Cruz campaigns paid more than $5 million apiece to the firm … The super PAC of future national security adviser John Bolton paid Cambridge Analytica more than $1 million to explore how to increase militarism in American youth. Bolton was worried that millennials were a ‘morally weak’ generation that would not want to go to war with Iran or other ‘evil’ countries …
“One project was described in CA correspondence as a ‘voter disengagement’ (i.e., voter suppression) initiative targeting African Americans. Republican clients were worried about the growing minority vote, especially in relation to their aging white base, and were looking for ways to confuse, demotivate, and disempower people of color … In the end, we were creating a machine to contaminate America with hate and cultish paranoia, and I could no longer ignore the immorality and illegality of it all. I did not want to be a collaborator.”
“As I saw Trump rise to power and watched as he banned citizens of Muslim states from entering the United States and gave justifications for white supremacist movements, I couldn’t help feeling that I had laid the seeds for this to happen. I had played with fire, and now I watched as the world was burning.” (Emphasis added.)
There’s this nugget from his testimony before the House Intelligence Committee, where Rep. Adam Schiff asked: “Did you work with Steve Bannon?” Yes. “Did Cambridge Analytica have any contacts with potential Russian agents?” Yes. “Do you believe that this data was used to sway the American electorate to elect the president of the United States?” Yes.
Wylie provides some insight into some of what the Ukraine phone-call whistleblower might face if he or she is outed: “Reporters followed me everywhere. I started to receive threats. Fearing for my safety, I had to hire bodyguards to protect me at public events. My parents, both physicians, had to temporarily close their medical clinic due to a frenzy of journalists asking questions and scaring patients. In the months that followed, my life became almost unmanageable, but I knew I had to keep sounding the alarm.”
By the summer of 2018, Facebook’s lawyers “threatened to report me to the FBI for an unspecified cybercrime … My life now looks like that of a paranoid man, but after being assaulted in the street, receiving threats from rogue private security firms, having my hotel room broken into late at night as I was sleeping, and experiencing two hacking attempts on my email in the past twelve months, it is only sensible to be cautious … [Because] the alt-right’s digital rage machine turned its sights to me. In London, enraged Brexiteers pushed me into oncoming traffic. I was followed around by alt-right stalkers and had photos of me at clubs with my friends published on alt-right websites with information about where to find me … Later, it emerged that Facebook, in a panic about its PR crisis, had hired the secret communications firm Definers Public Affairs, which subsequently leaked out fake narratives filled with anti-Semitic tropes about its critics being part of a George Soros–funded conspiracy. Rumors were seeded on the Internet and, as I discovered personally, its targets took it as a cue to take matters into their own hands …
“Facebook … did not need to hack me; they could simply track me everywhere because of the apps on my phone—where I was, who my contacts were, who I was meeting … My mom, dad, and sisters all had to remove Facebook, Instagram, and WhatsApp from their phones for the same reason.”
Why? Wylie explains something I never knew: “The terms and conditions of Facebook’s mobile app asked for microphone and camera access. Although the company is at pains to deny pulling user audio data for targeted advertising, there is nonetheless a technical permission sitting on our phones that allows access to audio capabilities.” (Emphasis added.)
And something most people don’t really think about: “Facebook also knew who all my friends were, they knew where we liked to go out, what we wrote about in messages, and they knew where we all lived. Even hanging out with my friends became a risk, as Facebook had access to their phones. If a friend took a photo, Facebook could access it, and its facial recognition algorithms could, at least in theory, detect my face in the photos sitting on other people’s phones, even if they were strangers to me.”
Finally, it’s important to acknowledge the connection Wylie believes exists between his personal life and the political choices he’s made. As a youngster he was targeted by bullies: “I was diagnosed with two relatively rare conditions, whose symptoms included severe neuropathic pain, muscle weakness, and vision and hearing impairment. By twelve I was in a wheelchair—just in time for the onset of adolescence—and I used that chair for the rest of my school days.”
“Not long after I discovered the computer lab, it became the one room at school where I didn’t feel alienated. Outside, there were either bullies or patronizing staff … Growing up queer, you learn early in life that your existence is outside the norm. We incubate ourselves inside a closet, remaining unknown, and hide our truth until it becomes unbearable … Queers understand systems of power intimately, and coming out is our transformative act of truth telling. In coming out, we realize the power of speaking our truth to those who may not want to hear it. We reject their comfort and make them listen. Why do so many gays blow whistles at Pride? To get your attention. To announce that we will no longer hide ourselves. To defy hegemonies of the powerful. And, like so many queers who came before me, I had to accept my own truth and come to terms with my inevitable failure to ever become society’s notion of a perfect man …
“But like other out queers, I am a truth teller, and I chose to be indiscreet with those uncomfortable truths, to stop hiding, to stop being their secret, to face the consequences before me, and to shout out to the world what I know …
Let me end where I began. I’ve read many, and reviewed some, of the books recently written about the Trump presidency. I’ve read Mueller’s indictments and the redacted Mueller report and, like some of you, watched too many hours of political pundits, former federal prosecutors, former FBI and CIA agents and too many TV talking heads. Of all those words, Christopher Wylie’s best explain where we are today and how we got here. Many of you have read George Orwell’s “1984” and Aldous Huxley’s “Brave New World” — brilliant glimpses into frightening futures.
But Wylie offers us a chilling, inside-out look at his and our present, his and our betrayal. For Wylie “we were about to break new ground for the cyber defenses of Britain, America, and their allies and confront bubbling insurgencies of radical extremism with data, algorithms, and targeted narratives online … in 2014, a billionaire acquired our project in order to build his own radicalized insurgency in America.”
Wylie knows how we moved in the shortest of time from Obamaland, from change you believe in, to Trumpovia and Build the Wall. In the shortest of time, we moved from our first African-American president to a president who equated neo-Nazis and civil rights advocates. He knows how this happened because he helped make it happen.
And he very ably walks us through his nightmare made real, and convincingly insists his nightmare is ours. It’s time to wake up, and confront the continuing mindf*ck.
_____________________________________________________________________
Notes
The New York Times just published an interview with a Turker and provides readers the opportunity to take similar Turk tests.
Andy Newman, “I Found Work on an Amazon Website. I Made 97 Cents an Hour.” The New York Times, Nov. 15, 2019. https://www.nytimes.com/interactive/2019/11/15/nyregion/amazon-mechanical-turk.html