Why Zuckerberg’s 14-Year Apology Tour Hasn’t Fixed Facebook

In 2003, one year before Facebook was founded, a website called Facemash began nonconsensually scraping pictures of students at Harvard from the school’s intranet and asking users to rate their hotness. Obviously, it caused an outcry. The website’s developer quickly proffered an apology. “I hope you understand, this is not how I meant for things to go, and I apologize for any harm done as a result of my neglect to consider how quickly the site would spread and its consequences thereafter,” wrote a young Mark Zuckerberg. “I definitely see how my intentions could be seen in the wrong light.”

In 2004 Zuckerberg cofounded Facebook, which rapidly spread from Harvard to other universities. And in 2006 the young company blindsided its users with the launch of News Feed, which collated and presented in one place information that people had previously had to search for piecemeal. Many users were shocked and alarmed that there was no warning and that there were no privacy controls. Zuckerberg apologized. “This was a big mistake on our part, and I’m sorry for it,” he wrote on Facebook’s blog. “We really messed this one up,” he said. “We did a bad job of explaining what the new features were and an even worse job of giving you control of them.”

Zeynep Tufekci (@zeynep) is an associate professor at the University of North Carolina and an opinion writer for The New York Times. She recently wrote about the (democracy-poisoning) golden age of free speech.

Then in 2007, Facebook’s Beacon advertising system, which was launched without proper controls or consent, ended up compromising user privacy by making people’s purchases public. Fifty thousand Facebook users signed an e-petition titled “Facebook: Stop invading my privacy.” Zuckerberg responded with an apology: “We simply did a bad job with this release and I apologize for it.” He promised to improve. “I’m not proud of the way we’ve handled this situation and I know we can do better,” he wrote.

By 2008, Zuckerberg had written only four posts on Facebook’s blog: Every single one of them was an apology or an attempt to explain a decision that had upset users.

In 2010, after Facebook violated users' privacy by making key types of information public without proper consent or warning, Zuckerberg again responded with an apology—this time published in an op-ed in The Washington Post. “We just missed the mark,” he said. “We heard the feedback,” he added. “There needs to be a simpler way to control your information.” “In the coming weeks, we will add privacy controls that are much simpler to use,” he promised.

I’m going to run out of space here, so let’s jump to 2018 and skip over all the other mishaps and apologies and promises to do better—oh yeah, and the consent decree that the Federal Trade Commission made Facebook sign in 2011, charging that the company had deceptively promised privacy to its users and then repeatedly broken that promise—in the intervening years.

Last month, Facebook once again garnered widespread attention with a privacy-related backlash when it became widely known that, between 2008 and 2015, it had allowed hundreds, maybe thousands, of apps to scrape voluminous data from Facebook users—not just from the users who had downloaded the apps, but detailed information from all their friends as well. One such app was run by a Cambridge University academic named Aleksandr Kogan, who apparently siphoned up detailed data on up to 87 million users in the United States and then surreptitiously forwarded the loot to the political data firm Cambridge Analytica. The incident caused a lot of turmoil because it connects to the rolling story of distortions in the 2016 US presidential election. But in reality, Kogan’s app was just one among many, many apps that amassed a huge amount of information in a way most Facebook users were completely unaware of.

At first Facebook indignantly defended itself, claiming that people had consented to these terms; after all, the disclosures were buried somewhere in the dense language surrounding obscure user privacy controls. People were asking for it, in other words.

But the backlash wouldn’t die down. Attempting to respond to the growing outrage, Facebook announced changes. “It’s Time to Make Our Privacy Tools Easier to Find,” the company announced without a hint of irony—or any other kind of hint—that Zuckerberg had promised to do just that in the “coming few weeks”—eight full years ago. On the company blog, Facebook’s chief privacy officer wrote that instead of being “spread across nearly 20 different screens” (why were they ever spread all over the place?), the controls would now finally be in one place.

Zuckerberg again went on an apology tour, giving interviews to The New York Times, CNN, Recode, WIRED, and Vox (but not to the Guardian and Observer reporters who broke the story). In each interview he apologized. “I’m really sorry that this happened,” he told CNN. “This was certainly a breach of trust.”

But Zuckerberg didn’t stop at an apology this time. He also defended Facebook as an “idealistic company” that cares about its users and spoke disparagingly about rival companies that charge users money for their products while maintaining a strong record in protecting user privacy. In his interview with Vox’s Ezra Klein, Zuckerberg said that anyone who believes Apple cares more about users than Facebook does has “Stockholm syndrome”—the phenomenon whereby hostages start sympathizing and identifying with their captors.

This is an interesting argument coming from the CEO of Facebook, a company that essentially holds its users' data hostage. Yes, Apple charges handsomely for its products, but it also includes advanced encryption hardware on all its phones, delivers timely security updates to its whole user base, and has largely locked itself out of user data—to the chagrin of many governments, including that of the United States, and of Facebook itself.

Most Android phones, by contrast, gravely lag behind in receiving security updates, have no specialized encryption hardware, and often handle privacy controls in a way that is detrimental to user interests. Few governments or companies complain about Android phones. After the Cambridge Analytica scandal, it came to light that Facebook had been downloading and keeping all the text messages of its users on the Android platform—their content as well as their metadata. “The users consented!” Facebook again cried out. But people were soon posting screenshots that showed how difficult it was for a mere mortal to discern that’s what was going on, let alone figure out how to opt out, on the vague permission screen that flashed before users.

On Apple phones, however, Facebook couldn’t harvest people’s text messages because the permissions wouldn’t allow it.

In the same interview, Zuckerberg took wide aim at the oft-repeated notion that, if an online service is free, you—the user—are the product. He said that he found the argument that “if you’re not paying that somehow we can’t care about you, to be extremely glib and not at all aligned with the truth.” His rebuttal to that accusation, however, was itself glib; and as for whether it was aligned with the truth—well, we just have to take his word for it. “To the dissatisfaction of our sales team here,” he said, “I make all of our decisions based on what’s going to matter to our community and focus much less on the advertising side of the business.”

As far as I can tell, not once in his apology tour was Zuckerberg asked what on earth he means when he refers to Facebook’s 2 billion-plus users as “a community” or “the Facebook community.” A community is a set of people with reciprocal rights, powers, and responsibilities. If Facebook really were a community, Zuckerberg would not be able to make so many statements about unilateral decisions he has made—often, as he boasts in many interviews, in defiance of Facebook’s shareholders and various factions of the company’s workforce. Zuckerberg’s decisions are final, since he controls all the voting stock in Facebook, and always will until he decides not to—it’s just the way he has structured the company.

This isn’t a community; this is a regime of one-sided, highly profitable surveillance, carried out on a scale that has made Facebook one of the largest companies in the world by market capitalization.

Facebook’s 2 billion users are not Facebook’s “community.” They are its user base, and they have been repeatedly carried along by the decisions of the one person who controls the platform. These users have invested time and money in building their social networks on Facebook, yet they have no means to port the connectivity elsewhere. Whenever a serious competitor to Facebook has arisen, the company has quickly copied it (Snapchat) or purchased it (WhatsApp, Instagram), often at a mind-boggling price that only a behemoth with massive cash reserves could afford. Nor do people have any means to completely stop being tracked by Facebook. The surveillance follows them not just on the platform, but elsewhere on the internet—some of them apparently can’t even text their friends without Facebook trying to snoop in on the conversation. Facebook doesn’t just collect data itself; it has purchased external data from data brokers; it creates “shadow profiles” of nonusers and is now attempting to match offline data to its online profiles.

Again, this isn’t a community; this is a regime of one-sided, highly profitable surveillance, carried out on a scale that has made Facebook one of the largest companies in the world by market capitalization.

There is no way to interpret Facebook’s privacy-invading moves over the years—even if it’s time to simplify! finally!—as anything other than decisions driven by a combination of self-serving impulses: namely, profit motives, the structural incentives inherent to the company’s business model, and the one-sided ideology of its founders and some executives. All these are forces over which the users themselves have little input, aside from the regular opportunity to grouse through repeated scandals. And even the ideology—a vague philosophy that purports to prize openness and connectivity with little to say about privacy and other values—is one that does not seem to apply to people who run Facebook or work for it. Zuckerberg buys houses surrounding his and tapes over his computer’s camera to preserve his own privacy, and company employees went up in arms when a controversial internal memo that made an argument for growth at all costs was recently leaked to the press—a nonconsensual, surprising, and uncomfortable disclosure of the kind that Facebook has routinely imposed upon its billions of users over the years.

This isn’t to say Facebook doesn’t provide real value to its users, even as it locks them in through network effects and by crushing, buying, and copying its competition. I wrote a whole book in which I document, among other things, how useful Facebook has been to anticensorship efforts around the world. It doesn’t even mean that Facebook executives make all decisions merely to increase the company valuation or profit, or that they don’t care about users. But multiple things can be true at the same time; all of this is quite complicated. And fundamentally, Facebook’s business model and reckless mode of operating are a giant dagger threatening the health and well-being of the public sphere and the privacy of its users in many countries.

So, here’s the thing. There is indeed a case of Stockholm syndrome here. There are very few other contexts in which a person would be allowed to make a series of decisions that have obviously enriched them while eroding the privacy and well-being of billions of people; to make basically the same apology for those decisions countless times over the space of just 14 years; and then to profess innocence, idealism, and complete independence from the obvious structural incentives that have shaped the whole process. This should ordinarily cause all the other educated, literate, and smart people in the room to break into howls of protest or laughter. Or maybe tears.

Facebook has tens of thousands of employees, and reportedly an open culture with strong internal forums. Insiders often talk of how free employees feel to speak up, and indeed I’ve repeatedly been told how they are encouraged to disagree and discuss all the key issues. Facebook has an educated workforce.

By now, it ought to be plain to them, and to everyone, that Facebook’s 2 billion-plus users are surveilled and profiled, that their attention is then sold to advertisers and, it seems, practically anyone else who will pay Facebook—including unsavory dictators like the Philippines’ Rodrigo Duterte. That is Facebook’s business model. That is why the company has an almost half-a-trillion-dollar market capitalization, along with billions in spare cash to buy competitors.

These are such readily apparent facts that any denial of them is quite astounding.

And yet, it appears that nobody around Facebook’s sovereign and singular ruler has managed to convince their leader that these are blindingly obvious truths whose acceptance may well provide us with some hints of a healthier way forward. That the repeated use of the word “community” to refer to Facebook’s users is not appropriate and is, in fact, misleading. That the constant repetition of “sorry” and “we meant well” and “we will fix it this time!” to refer to what is basically the same betrayal over 14 years should no longer be accepted as a promise to do better, but should instead be seen as but one symptom of a profound crisis of accountability. When a large chorus of people outside the company raises alarms on a regular basis, it’s not a sufficient explanation to say, “Oh we were blindsided (again).”

Maybe, just maybe, that is the case of Stockholm syndrome we should be focusing on.

Zuckerberg’s outright denial that Facebook’s business interests play a powerful role in shaping its behavior doesn’t bode well for Facebook’s chances of doing better in the future. I don’t doubt that the company has, on occasion, held itself back from bad behavior. That doesn’t make Facebook that exceptional, nor does it excuse its existing choices, nor does it alter the fact that its business model is fundamentally driving its actions.

At a minimum, Facebook has long needed an ombudsman’s office with real teeth and power: an institution within the company that can act as a check on its worst impulses and protect its users. And it needs a lot more employees whose task is to keep the platform healthier. But what would truly be disruptive and innovative would be for Facebook to alter its business model. Such a change could come from within, or it could be driven by regulations on data retention and opaque, surveillance-based targeting—regulations that would make such practices less profitable or even forbidden.

Facebook will respond to the latest crisis by keeping more of its data within its own walls (of course, that fits well with the business of charging third parties for access to users based on extensive profiling with data held by Facebook, so this is no sacrifice). Sure, it’s good that Facebook is now promising not to leak user data to unscrupulous third parties; but it should finally allow truly independent researchers better (and secure, not reckless) access to the company’s data in order to investigate the true effects of the platform. Thus far, Facebook has not cooperated with independent researchers who want to study it. Such investigation would be essential to informing the kind of political discussion we need to have about the trade-offs inherent in how Facebook, and indeed all of social media, operate.

Even without that independent investigation, one thing is clear: Facebook’s sole sovereign is neither equipped to, nor should he be in a position to, make all these decisions by himself, and Facebook’s long reign of unaccountability should end.


Facebook in Crisis

  • Initially, Facebook said that Cambridge Analytica got unauthorized access to some 50 million users' data. The social network has now raised that number to 87 million.
  • Next week, Mark Zuckerberg will testify before Congress. The question on our minds: How can Facebook prevent the next crisis if its guiding principle is and always has been connection at all cost?
  • Facebook has a long history of privacy gaffes. Here are just a few.


Read more: https://www.wired.com/story/why-zuckerberg-15-year-apology-tour-hasnt-fixed-facebook/

I made Steve Bannon’s psychological warfare tool: meet the data war whistleblower

Christopher Wylie goes on the record to discuss his role in hijacking the profiles of millions of Facebook users in order to target the US electorate

The first time I met Christopher Wylie, he didn’t yet have pink hair. That comes later. As does his mission to rewind time. To put the genie back in the bottle.

By the time I met him in person, I’d already been talking to him on a daily basis for hours at a time. On the phone, he was clever, funny, bitchy, profound, intellectually ravenous, compelling. A master storyteller. A politicker. A data science nerd.

Cambridge Analytica whistleblower: ‘We spent $1m harvesting millions of Facebook profiles’ video

Two months later, when he arrived in London from Canada, he was all those things in the flesh. And yet the flesh was impossibly young. He was 27 then (he’s 28 now), a fact that has always seemed glaringly at odds with what he has done. He may have played a pivotal role in the momentous political upheavals of 2016. At the very least, he played a consequential role. At 24, he came up with an idea that led to the foundation of a company called Cambridge Analytica, a data analytics firm that went on to claim a major role in the Leave campaign for Britain’s EU membership referendum, and later became a key figure in digital operations during Donald Trump’s election campaign.

Or, as Wylie describes it, he was “the gay Canadian vegan who somehow ended up creating Steve Bannon’s psychological warfare mindfuck tool”.

In 2014, Steve Bannon—then executive chairman of the alt-right news network Breitbart—was Wylie’s boss. And Robert Mercer, the secretive US hedge-fund billionaire and Republican donor, was Cambridge Analytica’s investor. And the idea they bought into was to bring big data and social media to an established military methodology—“information operations”—then turn it on the US electorate.

It was Wylie who came up with that idea and oversaw its realisation. And it was Wylie who, last spring, became my source. In May 2017, I wrote an article headlined “The great British Brexit robbery”, which set out a skein of threads that linked Brexit to Trump to Russia. Wylie was one of a handful of individuals who provided the evidence behind it. I found him, via another Cambridge Analytica ex-employee, lying low in Canada: guilty, brooding, indignant, confused. “I haven’t talked about this to anyone,” he said at the time. And then he couldn’t stop talking.


By that time, Steve Bannon had become Trump’s chief strategist. Cambridge Analytica’s parent company, SCL, had won contracts with the US State Department and was pitching to the Pentagon, and Wylie was genuinely freaked out. “It’s insane,” he told me one night. “The company has created psychological profiles of 230 million Americans. And now they want to work with the Pentagon? It’s like Nixon on steroids.”

He ended up showing me a tranche of documents that laid out the secret workings behind Cambridge Analytica. And in the months following publication of my article in May, it was revealed that the company had reached out to WikiLeaks to help distribute Hillary Clinton’s stolen emails in 2016. And then we watched as it became a subject of special counsel Robert Mueller’s investigation into possible Russian collusion in the US election.

The Observer also received the first of three letters from Cambridge Analytica threatening to sue Guardian News and Media for defamation. We are still only just starting to understand the maelstrom of forces that came together to create the conditions for what Mueller confirmed last month was “information warfare”. But Wylie offers a unique, worm’s-eye view of the events of 2016. Of how Facebook was hijacked, repurposed to become a theatre of war: how it became a launchpad for what seems to be an extraordinary attack on the US’s democratic process.

Wylie oversaw what may have been the first critical breach. Aged 24, while studying for a PhD in fashion trend forecasting, he came up with a plan to harvest the Facebook profiles of millions of people in the US, and to use their private and personal information to create sophisticated psychological and political profiles. And then target them with political ads designed to work on their particular psychological makeup.

“We broke Facebook,” he says.

And he did it on behalf of his new boss, Steve Bannon.

“Is it fair to say you hacked Facebook?” I ask him one night.

He hesitates. “I’ll point out that I assumed it was entirely legal and above board.”

Last month, Facebook’s UK director of policy, Simon Milner, told British MPs on a select committee inquiry into fake news, chaired by Conservative MP Damian Collins, that Cambridge Analytica did not have Facebook data. The official Hansard extract reads:

Christian Matheson (MP for Chester): Have you ever passed any user information over to Cambridge Analytica or any of its associated companies?

Simon Milner: No.

Matheson: But they do hold a large chunk of Facebook’s user data, don’t they?

Milner: No. They may have lots of data, but it will not be Facebook user data. It may be data about people who are on Facebook that they have gathered themselves, but it is not data that we have provided.

Alexander Nix, Cambridge Analytica CEO. Photograph: The Washington Post/Getty Images

Two weeks later, on 27 February, as part of the same parliamentary inquiry, Rebecca Pow, MP for Taunton Deane, asked Cambridge Analytica’s CEO, Alexander Nix: “Does any of the data come from Facebook?” Nix replied: “We do not work with Facebook data and we do not have Facebook data.”

And through it all, Wylie and I, plus a handful of editors and a small, international group of academics and researchers, have known that—at least in 2014—that certainly wasn’t the case, because Wylie has the paper trail. In our first phone call, he told me he had the receipts, invoices, emails, legal letters—records that showed how, between June and August 2014, the profiles of more than 50 million Facebook users had been harvested. Most damning of all, he had a letter from Facebook’s own lawyers admitting that Cambridge Analytica had acquired the data illegitimately.

Going public involves an enormous amount of risk. Wylie is breaking a non-disclosure agreement and risks being sued. He is breaking the confidence of Steve Bannon and Robert Mercer.

It’s taken a rollercoaster of a year to help get Wylie to a place where it’s possible for him to finally come forward. A year in which Cambridge Analytica has been the subject of investigations on both sides of the Atlantic—Robert Mueller’s in the US, and separate inquiries by the Electoral Commission and the Information Commissioner’s Office in the UK, both triggered in February 2017, after the Observer’s first article in this investigation.

It has been a year, too, in which Wylie has been trying his best to rewind—to undo events that he set in motion. Earlier this month, he submitted a dossier of evidence to the Information Commissioner’s Office and the National Crime Agency’s cybercrime unit. He is now in a position to go on the record: the data nerd who came in from the cold.

There are many points where this story could begin. One is in 2012, when Wylie was 21 and working for the Liberal Democrats in the UK, then in government as junior coalition partners. His career trajectory has been, like most aspects of his life so far, extraordinary, preposterous, implausible.


Cambridge Analytica: the key players

Alexander Nix, CEO

An Old Etonian with a degree from Manchester University, Nix, 42, worked as a financial analyst in Mexico and the UK before joining SCL, a strategic communications firm, in 2003. From 2007 he took over the company’s elections division, and claims to have worked on 260 campaigns globally. He set up Cambridge Analytica to work in America, with investment from Robert Mercer.

Aleksandr Kogan, data miner

Aleksandr Kogan was born in Moldova and lived in Moscow until the age of seven, then moved with his family to the US, where he became a naturalised citizen. He studied at the University of California, Berkeley, and got his PhD at the University of Hong Kong before joining Cambridge as a lecturer in psychology and expert in social media psychometrics. He set up Global Science Research (GSR) to carry out CA’s data research. While at Cambridge he accepted a position at St Petersburg State University, and also took Russian government grants for research. He changed his name to Spectre when he married, but later reverted to Kogan.

Steve Bannon, former board member

A former investment banker turned alt-right media svengali, Steve Bannon was boss at website Breitbart when he met Christopher Wylie and Nix and advised Robert Mercer to invest in political data research by setting up CA. In August 2016 he became Donald Trump’s campaign CEO. Bannon encouraged the reality TV star to embrace the populist, economic nationalist agenda that would carry him into the White House. That earned Bannon the post of chief strategist to the president and for a while he was arguably the second most powerful man in America. By August 2017 his relationship with Trump had soured and he was out.

Robert Mercer, investor

Robert Mercer, 71, is a computer scientist and hedge fund billionaire, who used his fortune to become one of the most influential men in US politics as a top Republican donor. An AI expert, he made a fortune with quantitative trading pioneers Renaissance Technologies, then built a $60m war chest to back conservative causes by using an offshore investment vehicle to avoid US tax.

Rebekah Mercer, investor

Rebekah Mercer has a maths degree from Stanford, and worked as a trader, but her influence comes primarily from her father’s billions. The fortysomething, the second of Mercer’s three daughters, heads up the family foundation, which channels money to rightwing groups. The conservative megadonors backed Breitbart, Bannon and, most influentially, poured millions into Trump’s presidential campaign.

Wylie grew up in British Columbia and as a teenager he was diagnosed with ADHD and dyslexia. He left school at 16 without a single qualification. Yet at 17, he was working in the office of the leader of the Canadian opposition; at 18, he went to learn all things data from Obama’s national director of targeting, which he then introduced to Canada for the Liberal party. At 19, he taught himself to code, and in 2010, aged 20, he came to London to study law at the London School of Economics.

“Politics is like the mob, though,” he says. “You never really leave. I got a call from the Lib Dems. They wanted to upgrade their databases and voter targeting. So, I combined working for them with studying for my degree.”

Politics is also where he feels most comfortable. He hated school, but as an intern in the Canadian parliament he discovered a world where he could talk to adults and they would listen. He was the kid who did the internet stuff and within a year he was working for the leader of the opposition.

“He’s one of the brightest people you will ever meet,” a senior politician who’s known Wylie since he was 20 told me. “Sometimes that’s a blessing and sometimes a curse.”

Meanwhile, at Cambridge University’s Psychometrics Centre, two psychologists, Michal Kosinski and David Stillwell, were experimenting with a way of studying personality by quantifying it.

Starting in 2007, Stillwell, while a student, had devised various apps for Facebook, one of which, a personality quiz called myPersonality, had gone viral. Users were scored on “big five” personality traits—Openness, Conscientiousness, Extroversion, Agreeableness and Neuroticism—and in exchange, 40% of them consented to give him access to their Facebook profiles. Suddenly, there was a way of measuring personality traits across the population and correlating scores against Facebook likes across millions of people.

Examples, above and below, of the visual messages trialled by GSRs online profiling test. Respondents were asked: How important should this message be to all Americans?

The research was original, groundbreaking and had obvious possibilities. “They had a lot of approaches from the security services,” a member of the centre told me. “There was one called You Are What You Like and it was demonstrated to the intelligence services. And it showed these odd patterns; that, for example, people who liked ‘I hate Israel’ on Facebook also tended to like Nike shoes and KitKats.”

“There are agencies that fund research on behalf of the intelligence services. And they were all over this research. That one was nicknamed Operation KitKat.”

The defence and military establishment were the first to see the potential of the research. Boeing, a major US defence contractor, funded Kosinski’s PhD, and Darpa, the US government’s secretive Defense Advanced Research Projects Agency, is cited in at least two academic papers supporting Kosinski’s work.

But when, in 2013, the first major paper was published, others saw this potential too, including Wylie. He had finished his degree and had started his PhD in fashion forecasting, and was thinking about the Lib Dems. It is fair to say that he didn’t have a clue what he was walking into.


“I wanted to know why the Lib Dems sucked at winning elections when they used to run the country up to the end of the 19th century,” Wylie explains. “And I began looking at consumer and demographic data to see what united Lib Dem voters, because apart from bits of Wales and the Shetlands it’s weird, disparate regions. And what I found is there were no strong correlations. There was no signal in the data.”

“And then I came across a paper about how personality traits could be a precursor to political behaviour, and it suddenly made sense. Liberalism is correlated with high openness and low conscientiousness, and when you think of Lib Dems they’re absent-minded professors and hippies. They’re the early adopters—they’re highly open to new ideas. And it just clicked all of a sudden.”

Here was a way for the party to identify potential new voters. The only problem was that the Lib Dems werent interested.

“I did this presentation at which I told them they would lose half their 57 seats, and they were like: ‘Why are you so pessimistic?’ They actually lost all but eight of their seats, FYI.”

Another Lib Dem connection introduced Wylie to a company called SCL Group, one of whose subsidiaries, SCL Elections, would go on to create Cambridge Analytica (an incorporated venture between SCL Elections and Robert Mercer, funded by the latter). For all intents and purposes, SCL/Cambridge Analytica are one and the same.

Alexander Nix, then CEO of SCL Elections, made Wylie an offer he couldn’t resist. “He said: ‘We’ll give you total freedom. Experiment. Come and test out all your crazy ideas.’”

Another example of the visual messages trialled by GSRs online profiling test.

In the history of bad ideas, this turned out to be one of the worst. The job was research director across the SCL group, a private contractor that has both defence and elections operations. Its defence arm was a contractor to the UK’s Ministry of Defence and the US’s Department of Defense, among others. Its expertise was in “psychological operations”—or psyops—changing people’s minds not through persuasion but through “informational dominance”, a set of techniques that includes rumour, disinformation and fake news.

SCL Elections had used a similar suite of tools in more than 200 elections around the world, mostly in undeveloped democracies that Wylie would come to realise were unequipped to defend themselves.

Wylie holds a British Tier 1 Exceptional Talent visa, a UK work visa given to just 200 people a year. He was working inside government (with the Lib Dems) as a political strategist with advanced data science skills. But no one, least of all him, could have predicted what came next. When he turned up at SCL's offices in Mayfair, he had no clue that he was walking into the middle of a nexus of defence and intelligence projects, private contractors and cutting-edge cyberweaponry.

"The thing I think about all the time is, what if I'd taken a job at Deloitte instead? They offered me one. I just think if I'd taken literally any other job, Cambridge Analytica wouldn't exist. You have no idea how much I brood on this."

A few months later, in autumn 2013, Wylie met Steve Bannon. At the time, Bannon was editor-in-chief of Breitbart, which he had brought to Britain to support his friend Nigel Farage in his mission to take Britain out of the European Union.

What was he like?

"Smart," says Wylie. "Interesting. Really interested in ideas. He's the only straight man I've ever talked to about intersectional feminist theory. He saw its relevance straightaway to the oppressions that conservative, young white men feel."

Wylie meeting Bannon was the moment petrol was poured on a flickering flame. Wylie lives for ideas. He speaks 19 to the dozen for hours at a time. He had a theory to prove. And at the time, this was a purely intellectual problem. Politics was like fashion, he told Bannon.

"[Bannon] got it immediately. He believes in the whole Andrew Breitbart doctrine that politics is downstream from culture, so to change politics you need to change culture. And fashion trends are a useful proxy for that. Trump is like a pair of Uggs, or Crocs, basically. So how do you get from people thinking 'Ugh. Totally ugly' to the moment when everyone is wearing them? That was the inflection point he was looking for."

But Wylie wasn't just talking about fashion. He had recently been exposed to a new discipline: "information operations", which ranks alongside land, sea, air and space in the US military's doctrine of the five-dimensional battle space. His brief ranged across the SCL group: the British government has paid SCL to conduct counter-extremism operations in the Middle East, and the US Department of Defense has contracted it to work in Afghanistan.

I tell him that another former employee described the firm as "MI6 for hire", and that I'd never quite understood it.

"It's like dirty MI6 because you're not constrained. There's no having to go to a judge to apply for permission. It's normal for a market research company to amass data on domestic populations. And if you're working in some country and there's an auxiliary benefit to a current client with aligned interests, well, that's just a bonus."

When I ask how Bannon even found SCL, Wylie tells me what sounds like a tall tale, though it's one he can back up with an email: Mark Block, a veteran Republican strategist, happened to sit next to a cyberwarfare expert for the US air force on a plane. "And the cyberwarfare guy is like, 'Oh, you should meet SCL. They do cyberwarfare for elections.'"

Steve Bannon: "He loved the gays," says Wylie. "He saw us as early adopters." Photograph: Tony Gentile/Reuters

It was Bannon who took this idea to the Mercers: Robert Mercer, the co-CEO of the hedge fund Renaissance Technologies, who used his billions to pursue a rightwing agenda, donating to Republican causes and supporting Republican candidates; and his daughter Rebekah.

Nix and Wylie flew to New York to meet the Mercers in Rebekahs Manhattan apartment.

"She loved me. She was like, 'Oh, we need more of your type on our side!'"

Your type?

"The gays. She loved the gays. So did Steve [Bannon]. He saw us as early adopters. He figured, if you can get the gays on board, everyone else will follow. It's why he was so into the whole Milo [Yiannopoulos] thing."

Robert Mercer was a pioneer in AI and machine translation. He helped invent algorithmic trading, which replaced hedge fund managers with computer programs, and he listened to Wylie's pitch. It was for a new kind of political message-targeting based on an influential and groundbreaking 2014 paper researched at Cambridge's Psychometrics Centre, called "Computer-based personality judgments are more accurate than those made by humans".

"In politics, the money man is usually the dumbest person in the room. Whereas it's the opposite way around with Mercer," says Wylie. "He said very little, but he really listened. He wanted to understand the science. And he wanted proof that it worked."

And to do that, Wylie needed data.

How Cambridge Analytica acquired the data has been the subject of internal reviews at Cambridge University, of many news articles and much speculation and rumour.

When Nix was interviewed by MPs last month, Damian Collins asked him:

Does any of your data come from Global Science Research company?

Nix: GSR?

Collins: Yes.

Nix: We had a relationship with GSR. They did some research for us back in 2014. That research proved to be fruitless and so the answer is no.

Collins: They have not supplied you with data or information?

Nix: No.

Collins: Your datasets are not based on information you have received from them?

Nix: No.

Collins: At all?

Nix: At all.

The problem with Nix's response to Collins is that Wylie has a copy of an executed contract, dated 4 June 2014, which confirms that SCL, the parent company of Cambridge Analytica, entered into a commercial arrangement with a company called Global Science Research (GSR), owned by Cambridge-based academic Aleksandr Kogan, specifically premised on the harvesting and processing of Facebook data, so that it could be matched to personality traits and voter rolls.

He has receipts showing that Cambridge Analytica spent $7m to amass this data, about $1m of it with GSR. He has the bank records and wire transfers. Emails reveal that Wylie first negotiated with Michal Kosinski, one of the co-authors of the original myPersonality research paper, to use the myPersonality database. But when negotiations broke down, another psychologist, Aleksandr Kogan, offered a solution that many of his colleagues considered unethical: he offered to replicate Kosinski and Stillwell's research and cut them out of the deal. For Wylie it seemed a perfect solution. Kosinski was asking for $500,000 for the IP, but Kogan said he could replicate it and simply harvest his own set of data. (Kosinski says the fee was to fund further research.)

An unethical solution? Dr Aleksandr Kogan. Photograph: Alex Kogan

Kogan then set up GSR to do the work, and proposed to Wylie that they use the data to set up an interdisciplinary institute working across the social sciences. What happened to that idea, I ask Wylie. "It never happened. I don't know why. That's one of the things that upsets me the most."

It was Bannon's interest in culture as war that ignited Wylie's intellectual concept. But it was Robert Mercer's millions that created a firestorm. Kogan was able to throw money at the hard problem of acquiring personal data: he advertised for people who were willing to be paid to take a personality quiz, on Amazon's Mechanical Turk and Qualtrics. At the end of the quiz, Kogan's app, called thisisyourdigitallife, gave him permission to access their Facebook profiles. And not just theirs, but their friends' too. On average, each "seeder" (the people who had taken the personality test, around 320,000 in total) unwittingly gave access to at least 160 other people's profiles, none of whom would have known or had reason to suspect.

What the email correspondence between Cambridge Analytica employees and Kogan shows is that Kogan had collected millions of profiles in a matter of weeks. But neither Wylie nor anyone else at Cambridge Analytica had checked that it was legal. It certainly wasn't authorised. Kogan did have permission to pull Facebook data, but for academic purposes only. What's more, under British data protection laws, it's illegal for personal data to be sold to a third party without consent.

"Facebook could see it was happening," says Wylie. "Their security protocols were triggered because Kogan's apps were pulling this enormous amount of data, but apparently Kogan told them it was for academic use. So they were like, 'Fine.'"

Kogan maintains that everything he did was legal and he had a close working relationship with Facebook, which had granted him permission for his apps.

Cambridge Analytica had its data. This was the foundation of everything it did next: how it extracted psychological insights from the seeders and then built an algorithm to profile millions more.

For more than a year, the reporting around what Cambridge Analytica did or didn't do for Trump has revolved around the question of "psychographics", but Wylie points out: "Everything was built on the back of that data. The models, the algorithm. Everything. Why wouldn't you use it in your biggest campaign ever?"

In December 2015, the Guardian's Harry Davies published the first report about Cambridge Analytica acquiring Facebook data and using it to support Ted Cruz in his campaign to be the US Republican candidate. But it wasn't until many months later that Facebook took action. And then, all it did was write a letter. In August 2016, shortly before the US election, and two years after the breach took place, Facebook's lawyers wrote to Wylie, who had left Cambridge Analytica in 2014, and told him the data had been illicitly obtained and that GSR was not authorised to share or sell it. They said it must be deleted immediately.

Christopher Wylie: "It's like Nixon on steroids."

"I already had. But literally all I had to do was tick a box and sign it and send it back, and that was it," says Wylie. "Facebook made zero effort to get the data back."

There were multiple copies of it. It had been emailed in unencrypted files.

Cambridge Analytica rejected all allegations the Observer put to them.

Dr Kogan (who later changed his name to Dr Spectre, but has subsequently changed it back to Dr Kogan) is still a faculty member at Cambridge University, a senior research associate. But what his fellow academics didn't know until Kogan revealed it in emails to the Observer (although Cambridge University says that Kogan told the head of the psychology department) is that he is also an associate professor at St Petersburg University. Further research revealed that he has received grants from the Russian government to research "Stress, health and psychological wellbeing in social networks". The opportunity came about on a trip to the city to visit friends and family, he said.

There are other dramatic documents in Wylie's stash, including a pitch made by Cambridge Analytica to Lukoil, Russia's second biggest oil producer. In an email dated 17 July 2014, about the US presidential primaries, Nix wrote to Wylie: "We have been asked to write a memo to Lukoil (the Russian oil and gas company) to explain to them how our services are going to apply to the petroleum business." Nix said that "they understand behavioural microtargeting in the context of elections" but that they were "failing to make the connection between voters and their consumers". The work, he said, would be shared with the CEO of the business, a former Soviet oil minister and associate of Putin, Vagit Alekperov.

"It didn't make any sense to me," says Wylie. "I didn't understand either the email or the pitch presentation we did. Why would a Russian oil company want to target information on American voters?"

Mueller's investigation traces the first stages of the Russian operation to disrupt the 2016 US election back to 2014, when the Russian state made what appear to be its first concerted efforts to harness the power of America's social media platforms, including Facebook. And it was in late summer of the same year that Cambridge Analytica presented the Russian oil company with an outline of its datasets, capabilities and methodology. The presentation had little to do with consumers. Instead, documents show it focused on election disruption techniques. The first slide illustrates how a "rumour campaign" spread fear in the 2007 Nigerian election, in which the company worked, by spreading the idea that the election would be rigged. The final slide, branded with Lukoil's logo and that of SCL Group and SCL Elections, headlines its deliverables: "psychographic messaging".

https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump